Alex Spinov

Trigger.dev Has a Free API That Makes Background Jobs in TypeScript Actually Reliable

Trigger.dev is the open-source background job framework for TypeScript. Define tasks with full type safety, automatic retries, and built-in observability.

Define a Task

import { task } from "@trigger.dev/sdk/v3";

export const scrapeProduct = task({
  id: "scrape-product",
  retry: { maxAttempts: 3, factor: 2, minTimeoutInMs: 1000 },
  run: async (payload: { url: string; category: string }) => {
    const res = await fetch(payload.url);
    if (!res.ok) throw new Error(`Fetch failed: ${res.status}`); // throw so retries kick in
    const html = await res.text();
    const product = parseProduct(html);

    await db.product.create({
      data: { ...product, category: payload.category, url: payload.url },
    });

    return { title: product.title, price: product.price };
  },
});
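The retry options above imply an exponential backoff schedule. As a rough sketch of what they mean (plain TypeScript, independent of the SDK; the real scheduler may also apply jitter and a max-timeout cap):

```typescript
// Approximate retry delays for { maxAttempts, factor, minTimeoutInMs }.
// maxAttempts counts the first run, so there are maxAttempts - 1 retries.
function backoffDelays(maxAttempts: number, factor: number, minTimeoutInMs: number): number[] {
  return Array.from({ length: maxAttempts - 1 }, (_, i) => minTimeoutInMs * factor ** i);
}

// { maxAttempts: 3, factor: 2, minTimeoutInMs: 1000 } → waits of 1000 ms, then 2000 ms
```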

Trigger From Anywhere

import { scrapeProduct } from "./trigger/scrape";

// From API route
app.post("/api/scrape", async (req, res) => {
  const handle = await scrapeProduct.trigger({
    url: req.body.url,
    category: req.body.category,
  });

  res.json({ jobId: handle.id });
});

// Batch trigger
const handles = await scrapeProduct.batchTrigger(
  urls.map(url => ({ payload: { url, category: "electronics" } }))
);

Scheduled Tasks (Cron)

import { schedules } from "@trigger.dev/sdk/v3";

export const dailyScrape = schedules.task({
  id: "daily-scrape",
  cron: "0 9 * * *", // every day at 9:00 UTC
  run: async () => {
    const urls = await db.watchlist.findMany();
    for (const item of urls) {
      await scrapeProduct.trigger({ url: item.url, category: item.category });
    }
    return { triggered: urls.length };
  },
});
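For intuition on when that cron expression fires, here's a hypothetical helper (not part of the SDK) that computes the next `0 9 * * *` occurrence in UTC, which I believe is the default timezone for schedules unless you configure one:

```typescript
// Next daily run at a fixed UTC hour — a sketch of what "0 9 * * *" means.
function nextDailyRun(from: Date, hourUtc: number): Date {
  const next = new Date(Date.UTC(
    from.getUTCFullYear(), from.getUTCMonth(), from.getUTCDate(), hourUtc, 0, 0,
  ));
  if (next <= from) next.setUTCDate(next.getUTCDate() + 1); // today's slot already passed
  return next;
}

nextDailyRun(new Date("2024-03-01T10:30:00Z"), 9); // → 2024-03-02T09:00:00.000Z
```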

Wait and Resume

import { task, wait } from "@trigger.dev/sdk/v3";

export const processOrder = task({
  id: "process-order",
  run: async (payload: { orderId: string }) => {
    await chargePayment(payload.orderId);

    // Wait 24 hours, then follow up
    await wait.for({ hours: 24 });

    const order = await db.order.findUnique({ where: { id: payload.orderId } });
    if (order && !order.shipped) { // guard: the order may have been deleted meanwhile
      await sendReminderEmail(order.customerEmail);
    }
  },
});

Sub-Tasks: Compose Complex Workflows

export const fullScrape = task({
  id: "full-scrape-pipeline",
  run: async (payload: { urls: string[] }) => {
    // Fan out: scrape all URLs in parallel
    const results = await scrapeProduct.batchTriggerAndWait(
      payload.urls.map(url => ({ payload: { url, category: "auto" } }))
    );

    // Aggregate results
    const successful = results.runs.filter(r => r.ok).map(r => r.output);
    const failed = results.runs.filter(r => !r.ok);

    // Generate report
    await generateReport.trigger({
      totalUrls: payload.urls.length,
      successful: successful.length,
      failed: failed.length,
      products: successful,
    });

    return { processed: successful.length, errors: failed.length };
  },
});
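The aggregate step above depends only on the shape of each batch result (`ok` plus an `output`). Here's that partition logic isolated as a plain, SDK-free sketch; the `RunResult` type is my simplification of what `batchTriggerAndWait` returns, not the SDK's exact type:

```typescript
// Simplified shape of one run result from a batch (assumption, not the SDK's exact type).
type RunResult<T> = { ok: true; output: T } | { ok: false; error?: unknown };

// Split a batch into successful outputs and a failure count.
function summarize<T>(runs: RunResult<T>[]): { outputs: T[]; failed: number } {
  const outputs = runs
    .filter((r): r is { ok: true; output: T } => r.ok)
    .map(r => r.output);
  return { outputs, failed: runs.length - outputs.length };
}
```

Keeping the aggregation pure like this makes the pipeline easy to unit-test without spinning up any workers.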

Building reliable scraping pipelines? My Apify tools plus Trigger.dev make for production-grade data extraction.

Custom pipeline? Email spinov001@gmail.com
