
Hazim J


Creating scheduled tasks in Node.js with ZeroQueue

Often in a project you'll find a need to run certain tasks on a schedule, whether that's every few minutes, hours, days, weeks, or even months. This is a perfect use case for a job queue, which will often let you do just that by specifying the schedule through a crontab.

Here we'll see how you can easily do this with minimal code using ZeroQueue. As a simple example, let's say you want to aggregate article links from several news sites you follow on a daily basis and save them to an Airtable spreadsheet for easy access.

ZeroQueue Setup

Since a built Docker image is hosted on Docker Hub, the app should run on many different platforms such as Kubernetes or Heroku. For this example we'll keep things simple and run it locally with Docker.

Backing services

To run ZeroQueue we'll first need a SQL database and Redis instance. A SQL database is used to track the different queues and users on the system while Redis is used to manage the jobs on a queue and distribute them to workers accordingly.

First, spin up a PostgreSQL database with the following command:

docker run --name zeroqueue-db -e POSTGRES_USER=admin -e POSTGRES_PASSWORD=password -e POSTGRES_DB=zeroqueue_db -p 5432:5432 -d postgres

Next, spin up a Redis instance:

docker run --name zeroqueue-redis -p 6379:6379 -d redis
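If you'd rather manage both services together, the two docker run commands above translate into a docker-compose.yml along these lines (a sketch; the service names are my own, and the images and ports match the commands above):

```
version: "3"
services:
  zeroqueue-db:
    image: postgres
    environment:
      POSTGRES_USER: admin
      POSTGRES_PASSWORD: password
      POSTGRES_DB: zeroqueue_db
    ports:
      - "5432:5432"
  zeroqueue-redis:
    image: redis
    ports:
      - "6379:6379"
```

Running docker-compose up -d would then start both containers in one step.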

Starting a ZeroQueue web server

To start the web server we first need to run a migration on the database with the following command:

docker run --rm -e DATABASE_URL=postgres://admin:password@host.docker.internal:5432/zeroqueue_db zeroqueue/zeroqueue npm run db:sync

Once that is completed we can spin up the web server with:

docker run --rm -d -e DATABASE_URL=postgres://admin:password@host.docker.internal:5432/zeroqueue_db -e REDIS_URL=redis://host.docker.internal:6379 -e SESSION_SECRET=this-is-a-secret-value-with-at-least-32-characters -p 9376:9376 --name zeroqueue zeroqueue/zeroqueue

Once that's all done, you should be able to access ZeroQueue via http://localhost:9376 with the default credentials:

  • username: admin
  • password: password

Creating a queue

After logging into the dashboard click the "New Queue" button. We'll call our queue "Fetch Articles" and set the cron as 0 0 * * * to run daily.
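For reference, the five fields of the 0 0 * * * expression read as follows:

```
0 0 * * *
│ │ │ │ └─ day of week (0-6, Sunday-Saturday)
│ │ │ └─── month (1-12)
│ │ └───── day of month (1-31)
│ └─────── hour (0-23)
└───────── minute (0-59)
```

In other words: at minute 0 of hour 0, every day — midnight daily.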


Adding jobs

For this example, let's aggregate articles from the RSS feeds of two popular sources: Dev.to and HackerNews.

First create the following jobs.json file:

[
  {
    "name": "Dev.to",
    "data": {
      "source": "Dev.to",
      "url": "https://dev.to/feed/"
    }
  },
  {
    "name": "HackerNews",
    "data": {
      "source": "HackerNews",
      "url": "https://news.ycombinator.com/rss"
    }
  }
]

Next we'll add it to the queue by clicking into "Fetch Articles" on the dashboard and uploading the JSON file via the "New Jobs" button.

Add job to queue

Jobs status

Adding workers

The final step is to now write some code for the workers. For each job, we want to fetch the RSS feed and push each item to an Airtable spreadsheet.

Start by installing the necessary dependencies:

npm install airtable bull rss-parser

Bull is the main library we need for the worker to be able to start processing jobs.

We can then create a file called worker.js with the following code:

const Queue = require("bull");
const Parser = require("rss-parser");
const Airtable = require("airtable");

// Connect to the queue by name, using the same Redis instance ZeroQueue manages
const queue = new Queue("Fetch Articles", process.env.REDIS_URL);
const parser = new Parser();

// Replace API_KEY and BASE_ID with your own Airtable credentials
const base = new Airtable({ apiKey: "API_KEY" }).base("BASE_ID");

// Process every job type ("*") on the queue
queue.process("*", async (job, done) => {
  const { source, url } = job.data;

  try {
    const feed = await parser.parseURL(url);
    // Create one Airtable record per feed item
    await Promise.all(
      feed.items.map((item) =>
        base("Articles").create(
          [
            {
              fields: {
                Name: item.title,
                Source: source,
                Link: item.link,
                Timestamp: new Date(item.pubDate).toISOString(),
              },
            },
          ],
          { typecast: true }
        )
      )
    );
  } catch (error) {
    // Record the error against the job rather than failing it
    job.log(error);
  }

  job.progress(100);
  done(null, null);
});

By running node worker.js, the above script will execute according to the schedule we previously set. Every day the worker will process the jobs in the "Fetch Articles" queue. For each job it will pull the RSS feed from the given URL and dump the article links into an Airtable spreadsheet.
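The field-mapping inside the processor can also be factored out into a small pure helper, which makes it easy to unit-test without touching Redis or Airtable (a sketch; toAirtableRecord is a hypothetical name, not part of ZeroQueue, Bull, or the Airtable SDK):

```javascript
// Hypothetical helper: maps one RSS feed item to the Airtable record
// shape used by the worker above.
function toAirtableRecord(source, item) {
  return {
    fields: {
      Name: item.title,
      Source: source,
      Link: item.link,
      Timestamp: new Date(item.pubDate).toISOString(),
    },
  };
}
```

The worker could then call base("Articles").create([toAirtableRecord(source, item)], { typecast: true }) for each item.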


Summary

In summary, we've managed to:

  • Set up a queue management system
  • Schedule jobs
  • Add a worker

We've also managed to do this with less than 50 lines of code 🎉. You can also check out the example repo at https://github.com/thezeroqueue/zeroqueue-rss-example and the end result at https://airtable.com/shrcr2ihRSNWRBsPC.

If you're interested in using ZeroQueue for your own projects, consider starring it on GitHub at https://github.com/thezeroqueue/zeroqueue.

Note: although the worker is quite basic and won't cover many edge cases such as de-duplication, the main point is that scheduling tasks like this can be very easy when needed.
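As a rough sketch of what de-duplication could look like, the worker might filter out items whose links have already been seen, for example against links fetched from Airtable in a previous run (the dedupe helper below is hypothetical, not part of any library used here):

```javascript
// Hypothetical de-duplication helper: drops feed items whose link is
// already in seenLinks, and also removes duplicates within the batch.
function dedupe(items, seenLinks) {
  const seen = new Set(seenLinks);
  return items.filter((item) => {
    if (seen.has(item.link)) return false;
    seen.add(item.link);
    return true;
  });
}
```

The processor would then only create Airtable records for dedupe(feed.items, seenLinks) instead of every item in the feed.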
