Running AI workloads in serverless functions is a trap. Claude processes a 50-page document in 90 seconds. Vercel's default function timeout is 10 seconds. You either pay for Pro (60s limit) or you hit the wall mid-generation.
Trigger.dev v3 solves this with background tasks that run outside your request lifecycle — no queue infrastructure, no Redis, no worker processes to manage.
## What Changed in v3
v2 was a self-hosted platform with a complex local dev setup. v3 rewrote the runtime:
- Tasks run in your own code with a decorator pattern
- Local dev with hot reload
- No Docker required
- Retries, delays, and concurrency built into the task definition
- Cloud-hosted workers with 1-hour timeout on free tier
## Setup

```bash
npm install @trigger.dev/sdk@v3
npx trigger.dev@latest init
```

This adds a `trigger.config.ts` file and a `src/trigger/` directory.
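The generated config is minimal. Roughly, it looks like this; the project ref below is a placeholder (yours comes from the Trigger.dev dashboard), and your generated file may include more options:

```typescript
// trigger.config.ts: a minimal sketch, not the exact generated output
import { defineConfig } from '@trigger.dev/sdk/v3'

export default defineConfig({
  project: 'proj_xxxxxxxxxxxx', // placeholder: your project ref from the dashboard
  dirs: ['./src/trigger'],      // where your task files live
  maxDuration: 300              // default max duration (seconds) for tasks
})
```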
## Your First AI Task

```typescript
// src/trigger/process-document.ts
import { task, logger } from '@trigger.dev/sdk/v3'
import Anthropic from '@anthropic-ai/sdk'

export const processDocument = task({
  id: 'process-document',
  maxDuration: 300, // 5 minutes
  retry: {
    maxAttempts: 3,
    factor: 2,
    minTimeoutInMs: 1000
  },
  run: async (payload: { documentId: string; userId: string }) => {
    const { documentId, userId } = payload
    logger.info('Processing document', { documentId })

    // `db` is your app's own data layer
    const doc = await db.documents.findById(documentId)
    if (!doc) throw new Error(`Document ${documentId} not found`)

    const client = new Anthropic() // reads ANTHROPIC_API_KEY from env
    const response = await client.messages.create({
      model: 'claude-opus-4-20250514',
      max_tokens: 4096,
      messages: [{
        role: 'user',
        content: `Analyze this document and extract key insights:\n\n${doc.content}`
      }]
    })

    const analysis = response.content[0].type === 'text'
      ? response.content[0].text
      : ''

    await db.documents.update(documentId, {
      analysis,
      processedAt: new Date()
    })

    logger.info('Document processed', { documentId, analysisLength: analysis.length })
    return { documentId, success: true }
  }
})
```
## Triggering From Your API Route

```typescript
// app/api/documents/process/route.ts
import { processDocument } from '@/trigger/process-document'
import { getCurrentUser } from '@/lib/auth' // your app's own auth helper

export async function POST(request: Request) {
  const { documentId } = await request.json()
  const { id: userId } = await getCurrentUser()

  // Returns immediately; the task runs in the background
  const handle = await processDocument.trigger({
    documentId,
    userId
  })

  return Response.json({
    success: true,
    runId: handle.id // Poll this for status
  })
}
```
The API route returns in milliseconds. The task runs for as long as it needs.
## Checking Task Status

```typescript
// app/api/documents/status/[runId]/route.ts
import { runs } from '@trigger.dev/sdk/v3'

export async function GET(
  _req: Request,
  { params }: { params: { runId: string } }
) {
  const run = await runs.retrieve(params.runId)

  return Response.json({
    status: run.status, // QUEUED | EXECUTING | COMPLETED | FAILED
    completedAt: run.completedAt,
    output: run.output
  })
}
```
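On the client, you poll that status route until the run settles. Here is a minimal sketch; `pollRun` is a hypothetical helper (not part of the Trigger.dev SDK), and the fetcher is injected as a parameter so it can be stubbed out in tests:

```typescript
// Hypothetical polling helper; not part of the Trigger.dev SDK
type RunSnapshot = { status: string; output?: unknown }

async function pollRun(
  runId: string,
  fetchJson: (url: string) => Promise<RunSnapshot>, // injected so tests can stub it
  opts: { intervalMs?: number; maxAttempts?: number } = {}
): Promise<RunSnapshot> {
  const { intervalMs = 2000, maxAttempts = 150 } = opts
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const run = await fetchJson(`/api/documents/status/${runId}`)
    // Stop polling once the run reaches a terminal state
    if (run.status === 'COMPLETED' || run.status === 'FAILED') return run
    await new Promise(resolve => setTimeout(resolve, intervalMs))
  }
  throw new Error(`Run ${runId} did not finish within the polling window`)
}
```

For long document runs, a webhook or Trigger.dev's realtime features would beat polling, but a loop like this is the simplest starting point.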
## Scheduled Tasks (Cron)

```typescript
// src/trigger/daily-digest.ts
import { schedules } from '@trigger.dev/sdk/v3'

export const dailyDigest = schedules.task({
  id: 'daily-digest',
  cron: '0 9 * * *', // 9 AM every day (UTC)
  run: async (payload) => {
    // `db` and `sendDigestEmail` are your app's own data layer and mailer
    const users = await db.users.findAll({ notifications: true })

    for (const user of users) {
      const recentActivity = await db.activity.getForUser(user.id, {
        since: new Date(Date.now() - 86400000) // last 24 hours
      })
      await sendDigestEmail(user.email, recentActivity)
    }

    return { sent: users.length }
  }
})
```
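That `86400000` in the `since` filter is just 24 hours in milliseconds. A tiny helper (hypothetical, not part of the schedules API) makes the window explicit and easy to test:

```typescript
// Hypothetical helper: the 24-hour lookback window the digest task queries
const DAY_MS = 24 * 60 * 60 * 1000 // 86,400,000 ms

function digestWindow(now: Date = new Date()): { since: Date; until: Date } {
  return { since: new Date(now.getTime() - DAY_MS), until: now }
}
```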
## Batch Processing With Wait

```typescript
// src/trigger/process-user-batch.ts
import { task, batch } from '@trigger.dev/sdk/v3'
import { analyzeUserActivity } from './analyze-user-activity' // sub-task defined elsewhere

export const processUserBatch = task({
  id: 'process-user-batch',
  run: async (payload: { userIds: string[] }) => {
    // Trigger sub-tasks for each user and wait for all of them
    const { runs } = await batch.triggerAndWait(
      payload.userIds.map(userId => ({
        task: analyzeUserActivity,
        payload: { userId }
      }))
    )

    const results = runs.map(r => r.output)
    return { processed: results.length }
  }
})
```
batch.triggerAndWait fans out work across parallel workers and waits for all to complete — without managing thread pools or Promise.all error handling.
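Batch endpoints cap how many items you can submit per call (check Trigger.dev's current limits for the exact number), so very large user lists need to be split first. A small chunking helper does the job; this is plain TypeScript, not an SDK feature:

```typescript
// Hypothetical helper: split a large list into fixed-size batches
function chunk<T>(items: T[], size: number): T[][] {
  if (size < 1) throw new Error('chunk size must be at least 1')
  const out: T[][] = []
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size))
  }
  return out
}
```

You would then call `batch.triggerAndWait` once per chunk instead of once for the whole list.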
## Idempotency Keys

```typescript
await processDocument.trigger(
  { documentId, userId },
  {
    idempotencyKey: `document-${documentId}-v1`
  }
)
```
Same idempotency key = same run — safe to call from retry logic without double-processing.
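If the document content itself can change between calls, a content-derived key avoids reusing a stale run. Here is a sketch using Node's built-in crypto module; the naming scheme is an assumption, not a Trigger.dev convention:

```typescript
import { createHash } from 'node:crypto'

// Hypothetical key builder: ties the run to the exact content submitted,
// so edited documents get a fresh run while retries of the same content do not
function idempotencyKeyFor(documentId: string, content: string, version = 'v1'): string {
  const digest = createHash('sha256').update(content).digest('hex').slice(0, 12)
  return `document-${documentId}-${digest}-${version}`
}
```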
## Local Development

```bash
npx trigger.dev@latest dev
```
Runs a local worker that connects to Trigger.dev's cloud. Tasks execute in your local process with hot reload — you can set breakpoints and see logs in real time.
## Pricing Reality Check
- Free: 5,000 task runs/month, 1-hour max duration
- $20/mo: 100,000 runs, higher concurrency
- Self-host: Available, but the cloud setup is clean enough that most teams don't bother
For AI SaaS at early scale, the free tier handles substantial volume.
The AI SaaS Starter Kit includes Trigger.dev v3 wired for Claude document processing and scheduled content jobs — background AI tasks out of the box without the infrastructure setup.