Stop wiring AI agents by hand. With
@orka-js/nestjs, your agents live in the DI container, respond to events, protect routes semantically, and run as microservices — all with NestJS idioms you already know.
You've spent months building a clean NestJS architecture.
DI everywhere. CQRS for complex domains. Event-driven with @nestjs/event-emitter. Microservices on Redis. Guards for access control.
Then your AI agent arrives — and none of that applies to it.
The agent lives in a service file, instantiated by hand, injected as a raw class, called directly. It doesn't participate in your architecture. It's a guest that doesn't respect the house rules.
@orka-js/nestjs fixes that. Your agents become actual NestJS citizens — DI-injectable, event-driven, guarded, pipeable, CQRS-aware.
Install
npm install @orka-js/nestjs @orka-js/agent @orka-js/openai @nestjs/common @nestjs/core reflect-metadata rxjs
1. OrkaModule — Register Agents in the DI Container
import { Module } from '@nestjs/common'
import { OrkaModule } from '@orka-js/nestjs'
import { StreamingToolAgent } from '@orka-js/agent'
import { OpenAIAdapter } from '@orka-js/openai'
const llm = new OpenAIAdapter({ apiKey: process.env.OPENAI_API_KEY! })
@Module({
imports: [
OrkaModule.forRoot({
agents: {
assistant: new StreamingToolAgent({ goal: 'Helpful assistant', tools: [] }, llm),
analyst: new StreamingToolAgent({ goal: 'Data analyst', tools: [] }, llm),
},
path: 'ai', // mounts HTTP routes under /ai
}),
],
})
export class AppModule {}
This single forRoot() call gives you:
GET /ai → list registered agents
GET /ai/:agent → agent info
POST /ai/:agent → run agent (blocking)
POST /ai/:agent/stream → run agent (SSE streaming)
No controllers to write. No routing logic to maintain. Same API contract as @orka-js/express and @orka-js/hono.
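To make that contract concrete, here's a hypothetical client-side helper that composes those four routes. Note that buildAgentUrl is an illustrative name of my own, not a library export:

```typescript
// Illustrative helper mirroring the route contract above.
// buildAgentUrl is NOT part of @orka-js/nestjs; it just shows
// how the mount path, agent name, and /stream suffix compose.
function buildAgentUrl(
  base: string,
  mount: string,
  agent?: string,
  stream = false,
): string {
  const parts = [base.replace(/\/+$/, ''), mount]
  if (agent) parts.push(agent)
  if (stream) parts.push('stream')
  return parts.join('/')
}

// buildAgentUrl('http://localhost:3000', 'ai')
//   → 'http://localhost:3000/ai'                      (list agents)
// buildAgentUrl('http://localhost:3000', 'ai', 'assistant', true)
//   → 'http://localhost:3000/ai/assistant/stream'     (SSE run)
```

The same helper works against @orka-js/express and @orka-js/hono, since the contract is shared.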
2. forRootAsync — Config from ConfigService
Production apps read API keys from the environment. Here's how to do it the NestJS way:
import { ConfigModule, ConfigService } from '@nestjs/config'
import { OrkaModule } from '@orka-js/nestjs'
@Module({
imports: [
ConfigModule.forRoot(),
OrkaModule.forRootAsync({
imports: [ConfigModule],
path: 'ai',
useFactory: (config: ConfigService) => ({
agents: {
assistant: new StreamingToolAgent(
{ goal: 'Helpful assistant', tools: [] },
new OpenAIAdapter({ apiKey: config.get('OPENAI_API_KEY')! })
),
},
}),
inject: [ConfigService],
}),
],
})
export class AppModule {}
The factory resolves asynchronously — your agents are configured after all environment variables and secrets are loaded.
3. @InjectAgent — Use Agents Anywhere
Once registered, agents are DI-injectable throughout your application:
import { Injectable } from '@nestjs/common'
import { InjectAgent } from '@orka-js/nestjs'
import type { BaseAgent } from '@orka-js/agent'
@Injectable()
export class OrderService {
constructor(
@InjectAgent('assistant') private agent: BaseAgent,
private readonly db: OrderRepository,
) {}
async summarize(orderId: string): Promise<string> {
const order = await this.db.findById(orderId)
const result = await this.agent.run(JSON.stringify(order))
return result.output
}
}
The token is deterministic — @InjectAgent('assistant') is equivalent to @Inject('ORKA_AGENT:assistant'). You can inspect it, override it in tests, and swap implementations without touching consumers.
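Assuming the string token shape stated above, the convention can be captured in a one-line helper; agentToken is an illustrative name, not a library export:

```typescript
// Sketch of the deterministic token convention described above.
// agentToken is NOT exported by @orka-js/nestjs; it just mirrors
// the 'ORKA_AGENT:<name>' string shape.
function agentToken(name: string): string {
  return `ORKA_AGENT:${name}`
}

// In a TestingModule you could then override the provider by token:
// .overrideProvider(agentToken('assistant')).useValue(mockAgent)
```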
4. @AgentReact — Event-Driven Agents
The most underused pattern in NestJS AI apps: reacting to domain events with an agent.
Without @AgentReact, you write this boilerplate on every event handler:
@OnEvent('order.created')
async onOrderCreated(payload: OrderCreatedEvent) {
const agent = this.fulfillmentAgent
if (!agent?.run) throw new Error('...')
await agent.run(JSON.stringify(payload))
}
With @AgentReact, the method body is entirely replaced at decoration time:
import { Injectable } from '@nestjs/common'
import { OnEvent } from '@nestjs/event-emitter'
import { InjectAgent, AgentReact } from '@orka-js/nestjs'
import type { BaseAgent } from '@orka-js/agent'
@Injectable()
export class OrderEventHandler {
constructor(
@InjectAgent('fulfillment') private agent: BaseAgent,
@InjectAgent('churn') private churnAgent: BaseAgent,
) {}
// Awaited — the event emitter waits for the agent to finish
@OnEvent('order.created')
@AgentReact()
async onOrderCreated(payload: OrderCreatedEvent) {}
// Fire-and-forget — returns immediately, agent runs in background
@OnEvent('customer.churned')
@AgentReact({ agent: 'churnAgent', async: true })
onChurnDetected(payload: ChurnEvent): void {}
}
The async: true option is critical for high-frequency events: the event emitter doesn't block, and the agent processes in the background. Use it for analytics, notifications, logging — anything that doesn't need a synchronous response.
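The difference between the two modes can be sketched in plain TypeScript. These two dispatch functions are a stand-in for what the decorator plausibly generates, not the library's actual implementation:

```typescript
// Sketch of the two dispatch modes, assuming @AgentReact reduces to
// "await the run" vs "start the run and return immediately".
type Run = (input: string) => Promise<unknown>

async function awaitedDispatch(run: Run, payload: unknown): Promise<void> {
  // Default mode: the event emitter waits for the agent to finish
  await run(JSON.stringify(payload))
}

function fireAndForgetDispatch(
  run: Run,
  payload: unknown,
  onError: (err: unknown) => void = console.error,
): void {
  // async: true mode — the promise is never awaited; the handler
  // returns before the agent does any work, and errors go to a logger
  run(JSON.stringify(payload)).catch(onError)
}
```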
5. OrkaSemanticGuard — LLM as a Security Layer
Traditional guards check JWT claims, roles, IP ranges. OrkaSemanticGuard checks intent.
import { Module } from '@nestjs/common'
import { APP_GUARD } from '@nestjs/core'
import { OrkaSemanticGuard } from '@orka-js/nestjs'
// Applied to a specific controller
@UseGuards(new OrkaSemanticGuard(llm, 'Only allow requests from authenticated admin users managing their own data'))
@Controller('admin')
export class AdminController {
@Get('users')
listUsers() { /* ... */ }
}
// Or globally via module
@Module({
providers: [
{
provide: APP_GUARD,
useValue: new OrkaSemanticGuard(
llm,
'Block any request that attempts to access data belonging to other users, perform bulk operations, or export raw data'
),
},
],
})
export class AppModule {}
The guard sends method, URL, body (truncated), and authorization headers to the LLM. It asks for ALLOW or DENY. If the LLM throws or is unavailable, it fails closed — denies access by default. Security > availability.
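A minimal sketch of that fail-closed decision, with a generic callLLM function standing in for the real adapter (the prompt wording and helper names here are assumptions, not the library's internals):

```typescript
// Sketch of the ALLOW/DENY check described above, fail-closed.
// callLLM is a stand-in for the real LLM adapter call.
type Request = { method: string; url: string; body: string }

async function semanticCheck(
  callLLM: (prompt: string) => Promise<string>,
  policy: string,
  req: Request,
): Promise<boolean> {
  try {
    const answer = await callLLM(
      `Policy: ${policy}\n` +
      `Request: ${req.method} ${req.url}\n` +
      `Body: ${req.body.slice(0, 500)}\n` + // body truncated
      `Answer ALLOW or DENY.`,
    )
    return answer.trim().toUpperCase() === 'ALLOW'
  } catch {
    // LLM threw or is unavailable → fail closed: deny by default
    return false
  }
}
```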
Use it for:
- Rate-limit evasion detection ("this looks like a scraping attempt")
- Prompt injection prevention ("this input is trying to jailbreak the API")
- Business rule enforcement ("admins can't delete other admins")
6. AgentValidationPipe — NLP → DTO
Your API consumers don't always send clean structured data. Sometimes they send natural language:
POST /products/search
{ "input": "show me red sneakers under 100 euros size 42" }
AgentValidationPipe transforms this into a typed DTO using your LLM:
import { z } from 'zod'
import { Controller, Post, Body } from '@nestjs/common'
import { AgentValidationPipe } from '@orka-js/nestjs'
const ProductSearchSchema = z.object({
color: z.string().optional(),
maxPrice: z.number().optional(),
size: z.number().optional(),
category: z.string().optional(),
})
type ProductSearch = z.infer<typeof ProductSearchSchema>
@Controller('products')
export class ProductController {
@Post('search')
async search(
@Body(new AgentValidationPipe(ProductSearchSchema, llm))
filters: ProductSearch,
) {
// filters = { color: 'red', maxPrice: 100, size: 42, category: 'sneakers' }
return this.productService.search(filters)
}
}
The pipe handles three input shapes:
- { input: "natural language description" } → the LLM extracts the DTO
- Structured object → validated directly with schema.safeParse(); the LLM is only called as a fallback
- Validation failure → BadRequestException with a clear error message
Your API accepts both human-readable and machine-readable inputs. Same endpoint. No if/else branching.
7. CQRS Integration
If your app uses @nestjs/cqrs, agents plug directly into the query/command bus:
npm install @nestjs/cqrs
import { IQuery } from '@nestjs/cqrs'
import { AgentQueryHandler, OrkaQueryHandler, InjectAgent } from '@orka-js/nestjs/cqrs'
import type { BaseAgent } from '@orka-js/agent'
// Define your query
export class SearchProductsQuery implements IQuery {
constructor(public readonly filters: ProductSearch) {}
}
// Handler — extends OrkaQueryHandler, registered via @AgentQueryHandler
@AgentQueryHandler(SearchProductsQuery)
export class SearchProductsHandler extends OrkaQueryHandler<SearchProductsQuery> {
constructor(@InjectAgent('search') protected agent: BaseAgent) {
super()
}
}
OrkaQueryHandler serializes the query to JSON and passes it to the agent. The handler registers itself on the query bus. You call it like any other CQRS handler:
@Controller('products')
export class ProductController {
constructor(private readonly queryBus: QueryBus) {}
@Post('search')
async search(@Body() filters: ProductSearch) {
return this.queryBus.execute(new SearchProductsQuery(filters))
}
}
Your agent is now part of the command/query pipeline. Observable, testable, replaceable.
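Based on that description, the handler's execute step plausibly reduces to the following sketch. AgentLike is a minimal stand-in for BaseAgent, and this is not the library's actual code:

```typescript
// Sketch of the serialize-and-run step described above.
// AgentLike is a minimal stand-in for BaseAgent.
interface AgentLike {
  run(input: string): Promise<{ output: string }>
}

async function executeQuery(agent: AgentLike, query: object): Promise<string> {
  // The query object is serialized to JSON and handed to the agent
  const result = await agent.run(JSON.stringify(query))
  return result.output
}
```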
8. Agent as Microservice
When your AI workload gets heavy, isolate it behind a NestJS microservice transport:
npm install @nestjs/microservices
The agent microservice (separate process):
import { NestFactory } from '@nestjs/core'
import { Transport } from '@nestjs/microservices'
import { OrkaModule } from '@orka-js/nestjs'
async function bootstrap() {
const module = await OrkaModule.forMicroservice({
agents: { assistant: assistantAgent, analyst: analystAgent },
})
const app = await NestFactory.createMicroservice(
{ module: class AppModule {}, imports: [module] },
{ transport: Transport.REDIS, options: { host: 'localhost', port: 6379 } }
)
await app.listen()
console.log('Agent microservice listening on Redis')
}
bootstrap()
The consumer (your main API):
import { OrkaClientModule, InjectAgentClient } from '@orka-js/nestjs/microservice'
import type { AgentClient } from '@orka-js/nestjs/microservice'
@Module({
imports: [
OrkaClientModule.forRoot({
clients: [{
name: 'ai',
transport: Transport.REDIS,
options: { host: 'localhost', port: 6379 },
}],
}),
],
})
export class AppModule {}
@Injectable()
export class OrderService {
constructor(@InjectAgentClient('ai') private client: AgentClient) {}
async process(order: Order): Promise<AgentResult> {
return this.client.run('assistant', JSON.stringify(order))
}
}
Your main API stays thin. The AI workload scales independently. Add more microservice instances as load grows.
Putting It Together: A Real Feature
Here's a complete order processing flow — event-driven, validated, and CQRS-aware:
// 1. Order submitted via HTTP — NLP input normalized to DTO
@Post('orders')
async createOrder(
@Body(new AgentValidationPipe(OrderSchema, llm))
order: OrderDTO,
) {
return this.commandBus.execute(new CreateOrderCommand(order))
}
// 2. Order created — fulfillment agent reacts asynchronously
@OnEvent('order.created')
@AgentReact({ agent: 'fulfillmentAgent', async: true })
onOrderCreated(payload: OrderCreatedEvent): void {}
// 3. Order queried — search agent handles the query bus
@AgentQueryHandler(FindSimilarOrdersQuery)
class FindSimilarOrdersHandler extends OrkaQueryHandler<FindSimilarOrdersQuery> {
constructor(@InjectAgent('search') protected agent: BaseAgent) { super() }
}
// 4. Admin routes protected by semantic guard
@UseGuards(new OrkaSemanticGuard(llm, 'Only authenticated admins can access order management'))
@Controller('admin/orders')
class AdminOrderController { /* ... */ }
Each piece uses a NestJS primitive you already know. The agents slot in where domain logic lives — not alongside it.
The Complete Feature Set
| Feature | NestJS Primitive | OrkaJS Addition |
|---|---|---|
| HTTP routes | @Controller | OrkaModule.forRoot({ path }) |
| DI injection | @Inject | @InjectAgent('name') |
| Event handling | @OnEvent | @AgentReact({ async }) |
| Access control | CanActivate | OrkaSemanticGuard |
| Input validation | PipeTransform | AgentValidationPipe |
| CQRS | IQueryHandler | OrkaQueryHandler |
| Transport | @MessagePattern | OrkaModule.forMicroservice |
Every feature is opt-in. You pick what you need. The rest of your architecture stays unchanged.
Testing
In tests, replace agents with mocks at the DI level:
const moduleRef = await Test.createTestingModule({
imports: [
OrkaModule.forRoot({
agents: {
assistant: {
run: vi.fn().mockResolvedValue({ output: 'mocked response', steps: 1 })
} as unknown as BaseAgent,
},
}),
],
providers: [OrderService],
}).compile()
No module-level mocking or monkey-patching needed: the DI container swaps the real agent for your stub. Your service tests stay fast and deterministic.
OrkaJS — Build production AI agents in TypeScript.
- Docs: orkajs.com
- npm: npm install @orka-js/nestjs
- GitHub: github.com/orka-ai/orkajs
What NestJS architecture pattern do you wish AI frameworks supported natively? Drop it in the comments — the next OrkaJS feature might be yours.