Initial commit

St. Nebula
2026-04-23 23:58:59 -05:00
commit 47b9e3c159
257 changed files with 18913 additions and 0 deletions
@@ -0,0 +1,57 @@
---
name: developing-genkit-dart
description: Generates code and provides documentation for the Genkit Dart SDK. Use when the user asks to build AI agents in Dart, use Genkit flows, or integrate LLMs into Dart/Flutter applications.
metadata:
genkit-managed: true
---
# Genkit Dart
Genkit Dart is an AI SDK for Dart that provides a unified interface for code generation, structured outputs, tools, flows, and AI agents.
## Core Features and Usage
If you need help with initializing Genkit (`Genkit()`), Generation (`ai.generate`), Tooling (`ai.defineTool`), Flows (`ai.defineFlow`), Embeddings (`ai.embedMany`), streaming, or calling remote flow endpoints, please load the core framework reference:
[references/genkit.md](references/genkit.md)
## Genkit CLI (recommended)
The Genkit CLI provides a local development UI for running flows, tracing executions, experimenting with models, and evaluating outputs.
Check whether the user has it installed: `genkit --version`
**Installation:**
```bash
curl -sL cli.genkit.dev | bash # Native CLI
# OR
npm install -g genkit-cli # Via npm
```
**Usage:**
Wrap your run command with `genkit start` to attach the Genkit developer UI and tracing:
```bash
genkit start -- dart run main.dart
```
## Plugin Ecosystem
Genkit relies on a large suite of plugins to perform generative AI actions, interface with external LLMs, or host web servers.
When asked to use any given plugin, always verify usage by referring to its corresponding reference below. You should load the reference when you need to know the specific initialization arguments, tools, models, and usage patterns for the plugin:
| Plugin Name | Reference Link | Description |
| ---- | ---- | ---- |
| `genkit_google_genai` | [references/genkit_google_genai.md](references/genkit_google_genai.md) | Load for Google Gemini plugin interface usage. |
| `genkit_anthropic` | [references/genkit_anthropic.md](references/genkit_anthropic.md) | Load for Anthropic plugin interface for Claude models. |
| `genkit_openai` | [references/genkit_openai.md](references/genkit_openai.md) | Load for OpenAI plugin interface for GPT models, Groq, and custom compatible endpoints. |
| `genkit_middleware` | [references/genkit_middleware.md](references/genkit_middleware.md) | Load for tooling for specific agentic behavior: `filesystem`, `skills`, and `toolApproval` interrupts. |
| `genkit_mcp` | [references/genkit_mcp.md](references/genkit_mcp.md) | Load for Model Context Protocol integration (Server, Host, and Client capabilities). |
| `genkit_chrome` | [references/genkit_chrome.md](references/genkit_chrome.md) | Load for running Gemini Nano locally inside the Chrome browser using the Prompt API. |
| `genkit_shelf` | [references/genkit_shelf.md](references/genkit_shelf.md) | Load for integrating Genkit Flow actions over HTTP using Dart Shelf. |
| `genkit_firebase_ai` | [references/genkit_firebase_ai.md](references/genkit_firebase_ai.md) | Load for Firebase AI plugin interface (Gemini API via Vertex AI). |
## External Dependencies
Whenever you define schema mappings inside Tools, Flows, and Prompts, you must use the [schemantic](https://pub.dev/packages/schemantic) library.
To learn how to use schemantic, read [references/schemantic.md](references/schemantic.md), which explains how to implement type-safe generated Dart code. This is particularly relevant when you encounter symbols like `@Schema()`, `SchemanticType`, or classes with the `$` prefix. Genkit Dart uses schemantic for all of its data models, so it is a CRITICAL skill for using Genkit Dart.
## Best Practices
- Always check that code cleanly compiles using `dart analyze` before generating the final response.
- Always use the Genkit CLI for local development and debugging.
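A typical local loop, using the commands already shown in this document:

```shell
# Verify the project compiles cleanly
dart analyze

# Run the app with the Genkit developer UI and tracing attached
genkit start -- dart run main.dart
```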
@@ -0,0 +1,380 @@
# Genkit Core Framework
Genkit Dart is an AI SDK for Dart that provides a unified interface for text generation, structured output, tool calling, and agentic workflows.
## Initialization
```dart
import 'package:genkit/genkit.dart';
import 'package:genkit_google_genai/genkit_google_genai.dart'; // Or any other plugin
void main() async {
// Pass plugins to use into the Genkit constructor
final ai = Genkit(plugins: [googleAI()]);
}
```
## Generate Text
```dart
final response = await ai.generate(
model: googleAI.gemini('gemini-2.5-flash'), // Needs a model reference from a plugin
prompt: 'Explain quantum computing in simple terms.',
);
print(response.text);
```
## Stream Responses
```dart
final stream = ai.generateStream(
model: googleAI.gemini('gemini-2.5-flash'),
prompt: 'Write a short story about a robot learning to paint.',
);
await for (final chunk in stream) {
print(chunk.text);
}
```
## Embed Text
```dart
final embeddings = await ai.embedMany(
documents: [
DocumentData(content: [TextPart(text: 'Hello world')]),
],
embedder: googleAI.textEmbedding('text-embedding-004'),
);
print(embeddings.first.embedding);
```
## Define Tools
Models can perform actions and access external data via custom-defined tools.
Requires the `schemantic` library for schema definitions.
```dart
import 'package:schemantic/schemantic.dart';
@Schema()
abstract class $WeatherInput {
String get location;
}
final weatherTool = ai.defineTool(
name: 'getWeather',
description: 'Gets the current weather for a location',
inputSchema: WeatherInput.$schema,
fn: (input, _) async {
// Call your weather API here
return 'Weather in ${input.location}: 72°F and sunny';
},
);
final response = await ai.generate(
model: googleAI.gemini('gemini-2.5-flash'),
prompt: 'What\'s the weather like in San Francisco?',
toolNames: ['getWeather'], // Use the tools
);
```
## Structured Output
You can ensure the generative model returns a typed JSON object by providing an `outputSchema`.
```dart
@Schema()
abstract class $Person {
String get name;
int get age;
}
// ... inside main ...
final response = await ai.generate(
model: googleAI.gemini('gemini-2.5-flash'),
prompt: 'Generate a person named John Doe, age 30',
outputSchema: Person.$schema, // Force the model to return this schema
);
final person = response.output; // Typed Person object
print('Name: ${person.name}, Age: ${person.age}');
```
## Define Flows
Wrap your AI logic in flows for better observability, testing, and deployment:
```dart
final jokeFlow = ai.defineFlow(
name: 'tellJoke',
inputSchema: .string(),
outputSchema: .string(),
fn: (topic, _) async {
final response = await ai.generate(
model: googleAI.gemini('gemini-2.5-flash'),
prompt: 'Tell me a joke about $topic',
);
return response.text; // Value return
},
);
final joke = await jokeFlow('programming');
print(joke);
```
### Streaming Flows
Stream data from your flows by calling `context.sendChunk(...)` and returning the final value:
```dart
final streamStory = ai.defineFlow(
name: 'streamStory',
inputSchema: .string(),
outputSchema: .string(),
streamSchema: .string(),
fn: (topic, context) async {
final stream = ai.generateStream(
model: googleAI.gemini('gemini-2.5-flash'),
prompt: 'Write a story about $topic',
);
await for (final chunk in stream) {
context.sendChunk(chunk.text); // Stream the chunks
}
return 'Story complete'; // Value return
},
);
```
## Calling Remote Flows from a Dart Client
The `genkit` package provides `package:genkit/client.dart`, which lets you define remote Genkit actions that can be invoked or streamed using type-safe definitions.
1. Define a remote action
```dart
import 'package:genkit/client.dart';
final stringAction = defineRemoteAction(
url: 'http://localhost:3400/my-flow',
inputSchema: .string(),
outputSchema: .string(),
);
```
2. Call the Remote Action (Non-streaming)
```dart
final response = await stringAction(input: 'Hello from Dart!');
print('Flow Response: $response');
```
3. Call the Remote Action (Streaming)
Use the `.stream()` method on the action flow, and access `stream.onResult` to wait on the async return value.
```dart
final streamAction = defineRemoteAction(
url: 'http://localhost:3400/stream-story',
inputSchema: .string(),
outputSchema: .string(),
streamSchema: .string(),
);
final stream = streamAction.stream(
input: 'Tell me a short story about a Dart developer.',
);
await for (final chunk in stream) {
print('Chunk: $chunk');
}
final finalResult = await stream.onResult;
print('\nFinal Response: $finalResult');
```
## Calling Remote Flows from a JavaScript Client
Install the `genkit` npm package:
```bash
npm install genkit
```
1. Call a remote flow (non-streaming)
```ts
import { runFlow } from 'genkit/beta/client';
async function callHelloFlow() {
try {
const result = await runFlow({
url: 'http://127.0.0.1:3400/helloFlow', // Replace with your deployed flow's URL
input: { name: 'Genkit User' },
});
console.log('Non-streaming result:', result.greeting);
} catch (error) {
console.error('Error calling helloFlow:', error);
}
}
callHelloFlow();
```
2. Call a remote flow (streaming)
```ts
import { streamFlow } from 'genkit/beta/client';
async function streamHelloFlow() {
try {
const result = streamFlow({
url: 'http://127.0.0.1:3400/helloFlow', // Replace with your deployed flow's URL
input: { name: 'Streaming User' },
});
// Process the stream chunks as they arrive
for await (const chunk of result.stream) {
console.log('Stream chunk:', chunk);
}
// Get the final complete response
const finalOutput = await result.output;
console.log('Final streaming output:', finalOutput.greeting);
} catch (error) {
console.error('Error streaming helloFlow:', error);
}
}
streamHelloFlow();
```
## Data Models
Genkit uses standard data models for representing prompts (messages & parts) and responses. These classes are implemented using the schemantic library.
```dart
import 'package:genkit/genkit.dart';
import 'package:schemantic/schemantic.dart';
@Schema()
abstract class $MyDataModel {
// uses Genkit's Message schema (not schemantic's Message)
List<$Message> get messages;
List<$Part> get parts;
}
void example() {
// --- Parts ---
// A Text part
final textPart = TextPart(text: 'some text', metadata: {'foo': 'bar'});
// A Media/Image part
final mediaPart = MediaPart(
media: Media(url: 'https://...', contentType: 'image/png'),
metadata: {'foo': 'bar'},
);
// A Tool Request initiated by the model
final toolRequestPart = ToolRequestPart(
toolRequest: ToolRequest(
name: 'get_weather',
ref: 'abc',
input: {'location': 'Paris, France'},
),
metadata: {'foo': 'bar'},
);
// The resulting data from a Tool execution
final toolResponsePart = ToolResponsePart(
toolResponse: ToolResponse(
name: 'get_weather',
ref: 'abc',
output: {'temperature': '20C'},
),
metadata: {'foo': 'bar'},
);
// Model reasoning (e.g. for Claude's "thinking" models)
final reasoningPart = ReasoningPart(
reasoning: 'thinking...',
metadata: {'foo': 'bar'},
);
// A custom fallback part
final customPart = CustomPart(
custom: {'provider': {'specific': 'data'}},
metadata: {'foo': 'bar'},
);
// --- Messages ---
final systemMessage = Message(
role: Role.system,
content: [textPart, mediaPart],
metadata: {'foo': 'bar'},
);
final userMessage = Message(
role: Role.user,
content: [textPart, mediaPart], // Can contain media (multimodal)
);
final modelMessage = Message(
role: Role.model,
// Models can emit text, tool requests, reasoning, or custom parts
content: [textPart, toolRequestPart, reasoningPart, customPart],
);
// --- Ergonomic Data Access (schema_extensions.dart) ---
// The Genkit SDK provides extensions on `Message` and `Part` to easily access fields
// without needing to cast them manually.
// Get concatenated text from all TextParts in a Message
print(modelMessage.text);
// Get the first Media object from a Message
print(modelMessage.media?.url);
// Iterate over tool requests in a Message
for (final toolReq in modelMessage.toolRequests) {
print(toolReq.name);
}
// Inspect individual parts
for (final part in modelMessage.content) {
if (part.isText) print(part.text);
if (part.isMedia) print(part.media?.url);
if (part.isToolRequest) print(part.toolRequest?.name);
if (part.isToolResponse) print(part.toolResponse?.name);
if (part.isReasoning) print(part.reasoning);
if (part.isCustom) print(part.custom);
}
// --- Streaming Chunks ---
// Data emitted by ai.generateStream() calls
final generateResponseChunk = ModelResponseChunk(
content: [textPart],
index: 0, // Index of the message this chunk belongs to
aggregated: false,
);
// Chunks also have text and media accessors
print(generateResponseChunk.text);
// --- Advanced: Schemas ---
// Use Genkit type schemas directly in Schemantic validations
final messageSchema = Message.$schema;
final partSchema = Part.$schema;
final mySchema = SchemanticType.map(
.string(),
.list(Message.$schema), // Requires a list of Messages
);
// --- Generate Response ---
// ai.generate() returns a GenerateResponseHelper which provides ergonomic getters
// over the underlying ModelResponse:
final response = await ai.generate(...);
print(response.text); // Concatenated text
print(response.media?.url); // First media part
print(response.toolRequests); // All tool requests
print(response.interrupts); // Tool requests that triggered an interrupt
print(response.messages); // Full history of the conversation, including the request and response
print(response.output); // Structured typed output (if outputSchema was used)
}
```
@@ -0,0 +1,41 @@
# Genkit Anthropic Plugin (`genkit_anthropic`)
The Anthropic plugin for Genkit Dart, used for interacting with the Claude models.
## Usage
Requires `ANTHROPIC_API_KEY` to be passed to the init block.
```dart
import 'dart:io';
import 'package:genkit/genkit.dart';
import 'package:genkit_anthropic/genkit_anthropic.dart';
void main() async {
final ai = Genkit(
plugins: [anthropic(apiKey: Platform.environment['ANTHROPIC_API_KEY']!)],
);
final response = await ai.generate(
model: anthropic.model('claude-sonnet-4-5'),
prompt: 'Tell me a joke about a developer.',
);
print(response.text);
}
```
## Claude Thinking Configurations
Provides specific configurations for utilizing Claude 3.7+ "thinking" model capabilities.
```dart
final response = await ai.generate(
model: anthropic.model('claude-sonnet-4-5'),
prompt: 'Solve this 24 game: 2, 3, 10, 10',
config: AnthropicOptions(thinking: ThinkingConfig(budgetTokens: 2048)),
);
// The thinking content is available in the message parts
print(response.message?.content);
```
@@ -0,0 +1,23 @@
# Genkit Chrome AI Plugin (`genkit_chrome`)
Chrome Built-in AI (Gemini Nano) plugin for Genkit Dart, allowing local offline execution within a Chrome application.
## Usage
```dart
import 'package:genkit/genkit.dart';
import 'package:genkit_chrome/genkit_chrome.dart';
void main() async {
final ai = Genkit(plugins: [ChromeAIPlugin()]);
final stream = ai.generateStream(
model: modelRef('chrome/gemini-nano'),
prompt: 'Write a story about a robot.',
);
await for (final chunk in stream) {
print(chunk.text);
}
}
```
@@ -0,0 +1,23 @@
# Genkit Firebase AI Plugin (`genkit_firebase_ai`)
The Firebase AI plugin for Genkit Dart, used for interacting with Gemini APIs through Firebase AI Logic.
## Usage
```dart
import 'package:genkit/genkit.dart';
import 'package:genkit_firebase_ai/genkit_firebase_ai.dart';
void main() async {
// Initialize Genkit with the Firebase AI plugin
final ai = Genkit(plugins: [firebaseAI()]);
// Generate text
final response = await ai.generate(
model: firebaseAI.gemini('gemini-2.5-flash'),
prompt: 'Tell me a joke about a developer.',
);
print(response.text);
}
```
@@ -0,0 +1,95 @@
# Genkit Google GenAI Plugin (`genkit_google_genai`)
The Google AI plugin provides an interface against the official Google AI Gemini API.
## Usage
```dart
import 'package:genkit/genkit.dart';
import 'package:genkit_google_genai/genkit_google_genai.dart';
void main() async {
// Initialize Genkit with the Google AI plugin
final ai = Genkit(plugins: [googleAI()]);
// Generate text
final response = await ai.generate(
model: googleAI.gemini('gemini-2.5-flash'),
prompt: 'Tell me a joke about a developer.',
);
print(response.text);
}
```
## Embeddings
```dart
final embeddings = await ai.embedMany(
embedder: googleAI.textEmbedding('text-embedding-004'),
documents: [
DocumentData(content: [TextPart(text: 'Hello world')]),
],
);
```
## Image Generation
The plugin also supports image generation models such as `gemini-2.5-flash-image`.
### Example (Nano Banana)
```dart
// Define an image generation flow
ai.defineFlow(
name: 'imageGenerator',
inputSchema: .string(defaultValue: 'A banana riding a bike'),
outputSchema: Media.$schema,
fn: (input, context) async {
final response = await ai.generate(
model: googleAI.gemini('gemini-2.5-flash-image'),
prompt: input,
);
if (response.media == null) {
throw Exception('No media generated');
}
return response.media!;
},
);
```
The media `url` field contains a base64-encoded data URI. You can decode it and save it as a file.
## Text-to-Speech (TTS)
You can use text-to-speech models to generate audio from text. The generated `Media` object will contain base64 encoded PCM audio in its data URI.
```dart
// Define a TTS flow
ai.defineFlow(
name: 'textToSpeech',
inputSchema: .string(defaultValue: 'Genkit is an amazing AI framework!'),
outputSchema: Media.$schema,
fn: (prompt, _) async {
final response = await ai.generate(
model: googleAI.gemini('gemini-2.5-flash-preview-tts'),
prompt: prompt,
config: GeminiTtsOptions(
responseModalities: ['AUDIO'],
speechConfig: SpeechConfig(
voiceConfig: VoiceConfig(
prebuiltVoiceConfig: PrebuiltVoiceConfig(voiceName: 'Puck'),
),
),
),
);
if (response.media != null) {
return response.media!;
}
throw Exception('No audio generated');
},
);
```
Google AI also supports multi-speaker TTS by configuring a `MultiSpeakerVoiceConfig` inside `SpeechConfig`.
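A hypothetical configuration fragment for multi-speaker output; the class and field names below (`MultiSpeakerVoiceConfig`, `SpeakerVoiceConfig`, `speaker`) are assumptions modeled on the Gemini API and should be verified against this plugin's reference before use:

```dart
// Assumed shape: mirrors the Gemini API's multiSpeakerVoiceConfig.
// Config fragment to pass as `config:` in ai.generate(...).
config: GeminiTtsOptions(
  responseModalities: ['AUDIO'],
  speechConfig: SpeechConfig(
    multiSpeakerVoiceConfig: MultiSpeakerVoiceConfig(
      speakerVoiceConfigs: [
        SpeakerVoiceConfig(
          speaker: 'Alice',
          voiceConfig: VoiceConfig(
            prebuiltVoiceConfig: PrebuiltVoiceConfig(voiceName: 'Puck'),
          ),
        ),
        SpeakerVoiceConfig(
          speaker: 'Bob',
          voiceConfig: VoiceConfig(
            prebuiltVoiceConfig: PrebuiltVoiceConfig(voiceName: 'Charon'),
          ),
        ),
      ],
    ),
  ),
),
```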
@@ -0,0 +1,115 @@
# Genkit MCP (`genkit_mcp`)
MCP (Model Context Protocol) integration for Genkit Dart.
## MCP Host (Recommended)
Connect to one or more MCP servers and aggregate their capabilities into the Genkit registry automatically.
```dart
import 'package:genkit/genkit.dart';
import 'package:genkit_mcp/genkit_mcp.dart';
void main() async {
final ai = Genkit();
final host = defineMcpHost(
ai,
McpHostOptionsWithCache(
name: 'my-host',
mcpServers: {
'fs': McpServerConfig(
command: 'npx',
args: ['-y', '@modelcontextprotocol/server-filesystem', '.'],
),
},
),
);
// Tools can be discovered and executed dynamically using a wildcard...
final response = await ai.generate(
model: 'gemini-2.5-flash',
prompt: 'Summarize the contents of README.md',
toolNames: ['my-host:tool/fs/*'],
);
// ...or by specifying the exact tool name
final exactResponse = await ai.generate(
model: 'gemini-2.5-flash',
prompt: 'Read README.md',
toolNames: ['my-host:tool/fs/read_file'],
);
}
```
## MCP Client (Advanced / Single Server)
Connecting to a single MCP server with a client object is an advanced use case for when you need manual control over the client lifecycle. Standalone clients do not automatically register tools into the registry, so they must be passed into `generate` or `defineDynamicActionProvider` manually.
```dart
import 'package:genkit/genkit.dart';
import 'package:genkit_mcp/genkit_mcp.dart';
void main() async {
final ai = Genkit();
final client = createMcpClient(
McpClientOptions(
name: 'my-client',
mcpServer: McpServerConfig(
command: 'npx',
args: ['-y', '@modelcontextprotocol/server-filesystem', '.'],
),
),
);
await client.ready();
// Retrieve the tools from the connected client
final tools = await client.getActiveTools(ai);
final response = await ai.generate(
model: 'gemini-2.5-flash',
prompt: 'Read the contents of README.md',
tools: tools,
);
}
```
## MCP Server
Expose Genkit actions (tools, prompts, resources) over MCP.
```dart
import 'package:genkit/genkit.dart';
import 'package:genkit_mcp/genkit_mcp.dart';
void main() async {
final ai = Genkit();
ai.defineTool(
name: 'add',
description: 'Add two numbers together',
inputSchema: .map(.string(), .dynamicSchema()),
fn: (input, _) async => (input['a'] + input['b']).toString(),
);
ai.defineResource(
name: 'my-resource',
uri: 'my://resource',
fn: (_, _) async => ResourceOutput(content: [TextPart(text: 'my resource')]),
);
// Stdio transport by default
final server = createMcpServer(ai, McpServerOptions(name: 'my-server'));
await server.start();
}
```
### Streamable HTTP Transport
```dart
import 'dart:io';
final transport = await StreamableHttpServerTransport.bind(
address: InternetAddress.loopbackIPv4,
port: 3000,
);
await server.start(transport);
```
@@ -0,0 +1,84 @@
# Genkit Middleware (`genkit_middleware`)
A collection of useful middleware for Genkit Dart to enhance your agent's capabilities. Register plugins when initializing Genkit:
```dart
import 'package:genkit/genkit.dart';
import 'package:genkit_middleware/genkit_middleware.dart';
void main() {
final ai = Genkit(
plugins: [
FilesystemPlugin(),
SkillsPlugin(),
ToolApprovalPlugin(),
],
);
}
```
## Filesystem Middleware
Allows the agent to list, read, write, and search/replace files within a restricted root directory.
```dart
final response = await ai.generate(
prompt: 'Check the logs in the current directory.',
use: [
filesystem(rootDirectory: '/path/to/secure/workspace'),
],
);
```
**Tools Provided:**
- `list_files`, `read_file`, `write_file`, `search_and_replace`
## Skills Middleware
Injects specialized instructions (skills) into the system prompt from `SKILL.md` files located in specified directories.
```dart
final response = await ai.generate(
prompt: 'Help me debug this issue.',
use: [
skills(skillPaths: ['/path/to/skills']),
],
);
```
**Tools Provided:**
- `use_skill`: Retrieve the full content of a skill by name.
## Tool Approval Middleware
Intercepts tool execution for specified tools and requires explicit approval. Returns `FinishReason.interrupted`.
```dart
final response = await ai.generate(
prompt: 'Delete the database.',
use: [
// Require approval for all tools EXCEPT those below
toolApproval(approved: ['read_file', 'list_files']),
],
);
if (response.finishReason == FinishReason.interrupted) {
final interrupt = response.interrupts.first;
// Ask user for approval
final isApproved = await askUser();
if (isApproved) {
final resumeResponse = await ai.generate(
messages: response.messages, // Pass history
toolChoice: ToolChoice.none, // Prevent immediate re-call
interruptRestart: [
ToolRequestPart(
toolRequest: interrupt.toolRequest,
metadata: {
...?interrupt.metadata,
'tool-approved': true
},
),
],
);
}
}
```
@@ -0,0 +1,54 @@
# Genkit OpenAI Plugin (`genkit_openai`)
OpenAI-compatible API plugin for Genkit Dart. Supports OpenAI models and other compatible APIs (xAI, DeepSeek, Together AI, Groq, etc.).
## Basic Usage
```dart
import 'dart:io';
import 'package:genkit/genkit.dart';
import 'package:genkit_openai/genkit_openai.dart';
void main() async {
final ai = Genkit(plugins: [
openAI(apiKey: Platform.environment['OPENAI_API_KEY']),
]);
final response = await ai.generate(
model: openAI.model('gpt-4o'),
prompt: 'Tell me a joke.',
);
}
```
## Options
`OpenAIOptions` allows configuring sampling temperature, nucleus sampling (top-p), maximum output tokens, seed, and more:
`config: OpenAIOptions(temperature: 0.7, maxTokens: 100)`
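For example (a minimal sketch following the basic-usage snippet above; the parameter values are illustrative):

```dart
// Pass OpenAIOptions via the `config` parameter of ai.generate
final response = await ai.generate(
  model: openAI.model('gpt-4o'),
  prompt: 'Write a haiku about Dart.',
  config: OpenAIOptions(temperature: 0.7, maxTokens: 100),
);
print(response.text);
```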
## Groq API Override
Specify custom `baseUrl` and custom models to integrate with third-party providers.
```dart
final ai = Genkit(plugins: [
openAI(
apiKey: Platform.environment['GROQ_API_KEY'],
baseUrl: 'https://api.groq.com/openai/v1',
models: [
CustomModelDefinition(
name: 'llama-3.3-70b-versatile',
info: ModelInfo(
label: 'Llama 3.3 70B',
supports: {'multiturn': true, 'tools': true, 'systemRole': true},
),
),
],
),
]);
final response = await ai.generate(
model: openAI.model('llama-3.3-70b-versatile'),
prompt: 'Hello!',
);
```
@@ -0,0 +1,59 @@
# Genkit Shelf Plugin (`genkit_shelf`)
Shelf integration for Genkit Dart, used to serve Genkit Flows.
## Standalone Server
Serve Genkit Flows easily on an isolated HTTP server using `startFlowServer`.
```dart
import 'package:genkit/genkit.dart';
import 'package:genkit_shelf/genkit_shelf.dart';
void main() async {
final ai = Genkit();
final flow = ai.defineFlow(
name: 'myFlow',
inputSchema: .string(),
outputSchema: .string(),
fn: (String input, _) async => 'Hello $input',
);
await startFlowServer(
flows: [flow],
port: 8080,
);
}
```
## Existing Shelf Application
Mount Genkit Flow endpoints directly to an existing Shelf `Router` using `shelfHandler`.
```dart
import 'package:genkit/genkit.dart';
import 'package:genkit_shelf/genkit_shelf.dart';
import 'package:shelf/shelf.dart';
import 'package:shelf/shelf_io.dart' as io;
import 'package:shelf_router/shelf_router.dart';
void main() async {
final ai = Genkit();
final flow = ai.defineFlow(
name: 'myFlow',
inputSchema: .string(),
outputSchema: .string(),
fn: (String input, _) async => 'Hello $input',
);
final router = Router();
// Mount the flow handler at a specific path
router.post('/myFlow', shelfHandler(flow));
// Start the server
await io.serve(router.call, 'localhost', 8080);
}
```
Access deployed flows using genkit client libraries (from Dart or JS).
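For instance, the `myFlow` endpoint above can be invoked with the remote-action client from `package:genkit/client.dart` (a sketch assuming the server from the previous snippet is running locally on port 8080):

```dart
import 'package:genkit/client.dart';

void main() async {
  final myFlow = defineRemoteAction(
    url: 'http://localhost:8080/myFlow',
    inputSchema: .string(),
    outputSchema: .string(),
  );
  // The flow above returns 'Hello $input'
  final result = await myFlow(input: 'World');
  print(result);
}
```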
@@ -0,0 +1,137 @@
# Schemantic
Schemantic is a general-purpose Dart library used for defining strongly typed data classes that automatically bind to reusable runtime JSON schemas. It is standard for the `genkit-dart` framework but works independently as well.
## Core Concepts
Always use `schemantic` when strongly typed JSON parsing or programmatic schema validation is required.
- Annotate your abstract classes with `@Schema()`.
- Use the `$` prefix for abstract schema class names (e.g., `abstract class $User`).
- Always run `dart run build_runner build` to generate the `.g.dart` schema files.
## Installation
Add dependencies:
```bash
dart pub add schemantic
```
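Because schemantic relies on code generation (see `dart run build_runner build` above), also add `build_runner` as a dev dependency and run the builder:

```shell
# Code-generation dev dependency
dart pub add dev:build_runner

# Generate the .g.dart schema files
dart run build_runner build
```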
## Basic Usage
1. **Defining a schema:**
```dart
import 'package:schemantic/schemantic.dart';
part 'my_file.g.dart'; // Must match the filename
@Schema()
abstract class $MyObj {
String get name;
$MySubObj get subObj;
}
@Schema()
abstract class $MySubObj {
String get foo;
}
```
2. **Using the Generated Class:**
The builder creates a concrete class `MyObj` (no `$`) with a factory constructor (`MyObj.fromJson`) and a regular constructor.
```dart
// Creating an instance
final obj = MyObj(name: 'test', subObj: MySubObj(foo: 'bar'));
// Serializing to JSON
print(obj.toJson());
// Parsing from JSON
final parsed = MyObj.fromJson({'name': 'test', 'subObj': {'foo': 'bar'}});
```
3. **Accessing Schemas at Runtime:**
The generated data classes have a static `$schema` field (of type `SchemanticType<T>`) which can be used to pass the definition into functions or to extract the raw JSON schema.
```dart
// Access JSON schema
final schema = MyObj.$schema.jsonSchema;
print(schema.toJson());
// Validate arbitrary JSON at runtime
final validationErrors = await schema.validate({'invalid': 'data'});
```
## Primitive Schemas
When a full data class is not required, Schemantic provides functions to create schemas dynamically.
```dart
final ageSchema = SchemanticType.integer(description: 'Age in years', minimum: 0);
final nameSchema = SchemanticType.string(minLength: 2);
final nothingSchema = SchemanticType.voidSchema();
final anySchema = SchemanticType.dynamicSchema();
final userSchema = SchemanticType.map(.string(), .integer()); // Map<String, int>
final tagsSchema = SchemanticType.list(.string()); // List<String>
```
## Union Types (AnyOf)
To allow a field to accept multiple types, use `@AnyOf`.
```dart
@Schema()
abstract class $Poly {
@AnyOf([int, String, $MyObj])
Object? get id;
}
```
Schemantic generates a specific helper class (e.g., `PolyId`) to handle the values:
```dart
final poly1 = Poly(id: PolyId.int(123));
final poly2 = Poly(id: PolyId.string('abc'));
```
## Field Annotations
You can use specialized field annotations for finer-grained validation:
```dart
@Schema()
abstract class $User {
@IntegerField(
name: 'years_old', // Change JSON key
description: 'Age of the user',
minimum: 0,
defaultValue: 18,
)
int? get age;
@StringField(
minLength: 2,
enumValues: ['user', 'admin'],
)
String get role;
}
```
## Recursive Schemas
For recursive structures (like trees), you must pass `useRefs: true` when generating the JSON schema. Define the class normally:
```dart
@Schema()
abstract class $Node {
String get id;
List<$Node>? get children;
}
```
*Note*: `Node.$schema.jsonSchema(useRefs: true)` generates schemas with JSON Schema `$ref`.
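For example:

```dart
// Generate a JSON schema that uses $ref entries for the recursive
// 'children' field instead of trying to inline it infinitely
final schema = Node.$schema.jsonSchema(useRefs: true);
print(schema.toJson());
```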