Event-driven, long-running AI Agent framework with enterprise-grade persistence and multi-agent collaboration.
- Event-Driven Architecture - Three-channel system (Progress/Control/Monitor) for clean separation of concerns
- Long-Running & Resumable - Seven-stage checkpoints with Safe-Fork-Point for crash recovery
- Multi-Agent Collaboration - AgentPool, Room messaging, and task delegation
- Enterprise Persistence - SQLite/PostgreSQL support with unified WAL
- Extensible Ecosystem - MCP tools, custom Providers, Skills system
One-liner setup (install dependencies and build):

```bash
./quickstart.sh
```

Or install as a dependency:

```bash
npm install @shareai-lab/kode-sdk
```

Set environment variables (bash):

```bash
export ANTHROPIC_API_KEY=sk-...
export ANTHROPIC_MODEL_ID=claude-sonnet-4-20250514    # optional, default: claude-sonnet-4-20250514
export ANTHROPIC_BASE_URL=https://api.anthropic.com   # optional, default: https://api.anthropic.com
```

Or (PowerShell):

```powershell
$env:ANTHROPIC_API_KEY="sk-..."
$env:ANTHROPIC_MODEL_ID="claude-sonnet-4-20250514"    # optional, default: claude-sonnet-4-20250514
$env:ANTHROPIC_BASE_URL="https://api.anthropic.com"   # optional, default: https://api.anthropic.com
```

Minimal example:
```ts
import { Agent, AnthropicProvider, JSONStore } from '@shareai-lab/kode-sdk';

const provider = new AnthropicProvider(
  process.env.ANTHROPIC_API_KEY!,
  process.env.ANTHROPIC_MODEL_ID
);

const agent = await Agent.create({
  provider,
  store: new JSONStore('./.kode'),
  systemPrompt: 'You are a helpful assistant.'
});

// Start the conversation, then stream progress events until the run completes
const run = agent.send('Hello!');

for await (const envelope of agent.subscribe(['progress'])) {
  if (envelope.event.type === 'text_chunk') {
    process.stdout.write(envelope.event.delta);
  }
  if (envelope.event.type === 'done') break;
}

await run;
```
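Because the agent's state lives in the store, a restarted process can reattach to an earlier session instead of starting over. The sketch below is illustrative only: `Agent.resume(...)` and the `'agent-123'` id are assumptions about the API surface, not confirmed signatures; see the Resume & Fork guide for the real flow.

```ts
import { Agent, AnthropicProvider, JSONStore } from '@shareai-lab/kode-sdk';

// Hypothetical crash-recovery sketch: reattach to a persisted agent by id and
// continue from its latest checkpoint. Verify the exact API in the Resume & Fork guide.
const store = new JSONStore('./.kode');
const provider = new AnthropicProvider(process.env.ANTHROPIC_API_KEY!);

const agent = await Agent.resume('agent-123', { provider, store });
```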
Run examples:

```bash
npm run example:getting-started   # Minimal chat
npm run example:agent-inbox       # Event-driven inbox
npm run example:approval          # Tool approval workflow
npm run example:room              # Multi-agent collaboration
```

For production deployments serving many users, we recommend the Worker Microservice Pattern:
```
                      +------------------+
                      |     Frontend     |   Next.js / SvelteKit (Vercel OK)
                      +--------+---------+
                               |
                      +--------v---------+
                      |   API Gateway    |   Auth, routing, queue push
                      +--------+---------+
                               |
                      +--------v---------+
                      |  Message Queue   |   Redis / SQS / NATS
                      +--------+---------+
                               |
         +---------------------+---------------------+
         |                     |                     |
+--------v---------+  +--------v---------+  +--------v---------+
|     Worker 1     |  |     Worker 2     |  |     Worker N     |
|    (KODE SDK)    |  |    (KODE SDK)    |  |    (KODE SDK)    |
|   Long-running   |  |   Long-running   |  |   Long-running   |
+--------+---------+  +--------+---------+  +--------+---------+
         |                     |                     |
         +---------------------+---------------------+
                               |
                      +--------v---------+
                      | Distributed Store|   PostgreSQL / Redis
                      +------------------+
```
Key Principles:
- API layer is stateless - Can run on serverless
- Workers are stateful - Run the KODE SDK and need long-running processes (see the worker sketch below)
- Store is shared - Single source of truth for agent state
- Queue decouples - Separates request handling from agent execution
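As a rough sketch of the worker side of this pattern (the queue client, job shape, and store choice below are placeholders rather than KODE SDK APIs; the agent calls mirror the minimal example above):

```ts
import { Agent, AnthropicProvider, JSONStore } from '@shareai-lab/kode-sdk';

// Placeholder job shape pushed by the API gateway (not part of the SDK).
interface AgentJob {
  prompt: string;
}

// Placeholder queue interface -- swap in your Redis / SQS / NATS client.
interface Queue {
  pop(): Promise<AgentJob>;
}

export async function runWorker(queue: Queue) {
  const provider = new AnthropicProvider(process.env.ANTHROPIC_API_KEY!);
  // JSONStore keeps the sketch self-contained; production workers would point
  // at the shared SQLite/PostgreSQL store (see the Database guide).
  const store = new JSONStore('./.kode');

  // Long-running loop: each job runs an agent whose state flows through the store.
  while (true) {
    const job = await queue.pop();

    const agent = await Agent.create({
      provider,
      store,
      systemPrompt: 'You are a helpful assistant.',
    });

    const run = agent.send(job.prompt);
    for await (const envelope of agent.subscribe(['progress'])) {
      if (envelope.event.type === 'done') break;
    }
    await run;
  }
}
```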
See docs/en/guides/architecture.md for detailed deployment guides.
| Provider | Streaming | Tools | Reasoning | Files |
|---|---|---|---|---|
| Anthropic | ✅ | ✅ | ✅ Extended Thinking | ✅ |
| OpenAI | ✅ | ✅ | ✅ | ✅ |
| Gemini | ✅ | ✅ | ✅ | ✅ |
Note: OpenAI-compatible services (DeepSeek, GLM, Qwen, Minimax, OpenRouter, etc.) can be used via `OpenAIProvider` with a custom `baseURL` configuration. See the Providers Guide for details.
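For example, a DeepSeek setup might look like the sketch below; the exact constructor shape (in particular how `baseURL` is passed) is an assumption here, so check the Providers Guide for the real signature.

```ts
import { OpenAIProvider } from '@shareai-lab/kode-sdk';

// Assumed constructor shape (apiKey, modelId, options) -- verify against the Providers Guide.
const provider = new OpenAIProvider(
  process.env.DEEPSEEK_API_KEY!,          // key for any OpenAI-compatible service
  'deepseek-chat',                        // model id exposed by that service
  { baseURL: 'https://api.deepseek.com' } // point the OpenAI-style client at the service
);
```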
| Section | Description |
|---|---|
| Getting Started | |
| Installation | Setup and configuration |
| Quickstart | Build your first Agent |
| Concepts | Core concepts explained |
| Guides | |
| Events | Three-channel event system |
| Tools | Built-in tools & custom tools |
| Providers | Model provider configuration |
| Database | SQLite/PostgreSQL persistence |
| Resume & Fork | Crash recovery & branching |
| Reference | |
| API Reference | Complete API documentation |
| Examples | All examples explained |
MIT