Trigger.dev runs long-running background jobs with zero infrastructure — and the free tier includes 50K runs per month.
Define a Task
// trigger/tasks.ts
import { task } from '@trigger.dev/sdk/v3'

// db, stripe, and resend are assumed to be initialized clients
export const processOrder = task({
  id: 'process-order',
  run: async (payload: { orderId: string }) => {
    // Step 1: Validate order
    const order = await db.order.findUnique({ where: { id: payload.orderId } })
    if (!order) throw new Error(`Order ${payload.orderId} not found`)

    // Step 2: Charge payment (can take 30+ seconds)
    const payment = await stripe.charges.create({
      amount: order.total,
      currency: 'usd',
      customer: order.customerId,
    })

    // Step 3: Send confirmation
    await resend.emails.send({
      to: order.email,
      subject: 'Order Confirmed',
      html: `Your order ${order.id} is confirmed!`,
    })

    return { paymentId: payment.id }
  },
})
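Tasks can also declare a retry policy in their configuration, so transient failures (a Stripe timeout, say) are retried with backoff instead of failing the run. A configuration sketch, assuming the v3 SDK's `retry` options; check the current docs for exact option names:

```typescript
import { task } from '@trigger.dev/sdk/v3'

// Sketch: per-task retry policy (option names assumed from the v3 SDK)
export const processOrderWithRetries = task({
  id: 'process-order-retries',
  retry: {
    maxAttempts: 3,         // give up after 3 attempts
    minTimeoutInMs: 1_000,  // first backoff delay
    maxTimeoutInMs: 30_000, // cap the backoff
    factor: 2,              // exponential backoff multiplier
  },
  run: async (payload: { orderId: string }) => {
    // ... same steps as processOrder above
  },
})
```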
Trigger from Your App
// In your API route (e.g. app/api/orders/route.ts)
import { processOrder } from '@/trigger/tasks'

export async function POST(request: Request) {
  const { orderId } = await request.json()

  // Trigger the background job
  const handle = await processOrder.trigger({ orderId })

  // Return immediately — job runs in background
  return Response.json({ jobId: handle.id, status: 'processing' })
}
Scheduled Tasks (Cron)
import { schedules } from '@trigger.dev/sdk/v3'

export const dailyReport = schedules.task({
  id: 'daily-report',
  cron: '0 9 * * *', // Every day at 9 AM
  run: async () => {
    const stats = await computeDailyStats()
    await sendSlackMessage('#reports', formatStats(stats))
  },
})
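The five cron fields are `minute hour day-of-month month day-of-week`, so `0 9 * * *` fires whenever the minute is 0 and the hour is 9. A toy matcher that handles only plain numbers and `*` (for illustration, not a real cron parser):

```typescript
// Toy cron matcher: supports only numeric values and '*' per field
function cronMatches(expr: string, d: Date): boolean {
  const [min, hour, dom, mon, dow] = expr.split(' ')
  const fieldMatches = (field: string, value: number) =>
    field === '*' || Number(field) === value
  return (
    fieldMatches(min, d.getMinutes()) &&
    fieldMatches(hour, d.getHours()) &&
    fieldMatches(dom, d.getDate()) &&
    fieldMatches(mon, d.getMonth() + 1) && // JS months are 0-based
    fieldMatches(dow, d.getDay())
  )
}
```

Real cron syntax also supports ranges (`1-5`), steps (`*/15`), and lists (`1,15`), which this sketch ignores.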
Fan-out / Batch Processing
// CSVRow is whatever row shape your parser returns, e.g.:
type CSVRow = { id: string; [key: string]: string }

export const processCSV = task({
  id: 'process-csv',
  run: async (payload: { fileUrl: string }) => {
    const rows = await parseCSV(payload.fileUrl)

    // Fan out: process all rows in parallel and wait for completion
    const results = await processRow.batchTriggerAndWait(
      rows.map((row) => ({ payload: row }))
    )

    return { processed: results.runs.length }
  },
})

export const processRow = task({
  id: 'process-row',
  run: async (row: CSVRow) => {
    await db.record.upsert({
      where: { externalId: row.id },
      create: row,
      update: row,
    })
  },
})
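Batch endpoints typically cap how many items you can submit per call (check Trigger.dev's current limit before relying on a specific number). A generic chunking helper lets you split a large CSV into several batch calls:

```typescript
// Split an array into fixed-size chunks for batched submission
function chunk<T>(items: T[], size: number): T[][] {
  if (size < 1) throw new Error('chunk size must be >= 1')
  const out: T[][] = []
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size))
  }
  return out
}
```

Then call `batchTriggerAndWait` once per chunk and combine the results.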
Real-World Use Case
A team was using BullMQ + Redis for background jobs: order processing, email sending, PDF generation. Each job needed its own error handling, retries, and logging. They moved to Trigger.dev: each task is a TypeScript function, retries are automatic, and monitoring is built-in. They deleted their Redis instance and 500 lines of queue management code.
Trigger.dev is background jobs without the infrastructure tax.
Build Smarter Data Pipelines
Need to scrape websites, extract APIs, or automate data collection? Check out my ready-to-use scrapers on Apify — no coding required.
Custom scraping solution? Email me at spinov001@gmail.com — fast turnaround, fair prices.