Supercharging Go with Asynq: Scalable Background Jobs Made Easy

Hello, I'm Maneshwar. I'm currently building FreeDevTools, **one place for all dev tools, cheat codes, and TLDRs**: a free, open-source hub where developers can quickly find and use tools without the hassle of searching all over the internet.

In backend systems, background jobs play a crucial role in improving performance and scalability.

Instead of blocking API requests with long-running tasks, background jobs help offload work to separate workers, making applications more responsive.

Common Use Cases

1. LLM-Powered Applications (AI & Chatbots)

  • Use Case: Generating AI responses asynchronously to handle high traffic efficiently.
  • Example: A customer support chatbot using an LLM (e.g., GPT-4) to generate responses. Instead of making users wait while the model processes queries, the request is added to a background job queue.
  • How Asynq Helps:
    • The API receives a user query and enqueues it as a job in a task queue.
    • The backend instantly returns a job ID, allowing the client to continue without delay (see the sketch after this list).
    • A worker fetches the job, processes it using an LLM API (e.g., OpenAI or a self-hosted Llama model), and stores the response in a cache or database.
    • The frontend polls for updates (status/logs) or uses WebSockets for real-time response delivery.
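
A rough sketch of that enqueue-and-return step, assuming a hypothetical llm:generate task type and a plain net/http handler (fmt, io, net/http, and github.com/hibiken/asynq imported; the Asynq client setup itself is shown later in this post):

// Hypothetical handler: enqueue the prompt as a background job and return
// the task ID right away instead of blocking on the LLM call.
func handleChat(client *asynq.Client) http.HandlerFunc {
    return func(w http.ResponseWriter, r *http.Request) {
        prompt, err := io.ReadAll(r.Body) // e.g. {"prompt": "..."}
        if err != nil {
            http.Error(w, "bad request", http.StatusBadRequest)
            return
        }
        info, err := client.Enqueue(asynq.NewTask("llm:generate", prompt))
        if err != nil {
            http.Error(w, "could not enqueue job", http.StatusInternalServerError)
            return
        }
        fmt.Fprintf(w, `{"job_id": %q}`, info.ID)
    }
}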

2. Video Processing & Transcoding (Media Platforms like YouTube, TikTok)

  • Use Case: Efficiently handling large-scale video uploads without blocking user interactions.
  • Example: A user uploads a 4K video to a platform. Instead of making them wait for the video to be processed, the backend queues a job for transcoding it into multiple resolutions (1080p, 720p, 480p) asynchronously.
  • How Asynq Helps:
    • Priority queues ensure faster processing for high-demand tasks.
    • Horizontal scaling allows multiple workers to process jobs in parallel.
    • Automatic retries ensure that failed jobs (e.g., due to network issues) are reattempted without user intervention.

1. Why Asynq?

Asynq is a Redis-backed task queue that simplifies async task processing in Go.

It helps handle background jobs efficiently while providing features like job scheduling, retries, and monitoring.

2. Why Redis-backed Task Queues?

The Need for Background Processing in Go Applications

Some tasks take too long to execute within a typical API request cycle.

Examples include sending emails or notifications, resizing images, and processing payments.

Running these tasks in the background improves user experience and system performance.

Why Redis?

  • Low-latency and high-throughput capabilities.
  • Reliable with built-in persistence options.
  • Supports job queues with powerful data structures.

Comparison: Asynq vs. Other Task Queues

  • Sidekiq (Ruby): Asynq brings a similar Redis-backed model to Go.
  • Celery (Python): supports multiple brokers but requires more setup; Asynq is simpler and Redis-only.
  • BullMQ (Node.js): Asynq fills the same niche in the Go ecosystem.

3. Getting Started with Asynq

Installing Asynq

To install Asynq, run:

 go get github.com/hibiken/asynq

Setting Up Redis for Asynq

Ensure Redis is running:

docker run --name redis -d -p 6379:6379 redis

Creating a Simple Go Application

First, initialize a Go project:

mkdir asynq-demo && cd asynq-demo
go mod init asynq-demo

4. Defining and Enqueuing Jobs

Creating a Task Type and Payload

Define a task struct for sending emails:

type EmailTask struct {
    To      string `json:"to"`
    Subject string `json:"subject"`
    Body    string `json:"body"`
}

Enqueueing a Job in Redis

Create and enqueue a task:

package main

import (
    "log"

    "github.com/hibiken/asynq"
)

func main() {
    client := asynq.NewClient(asynq.RedisClientOpt{Addr: "localhost:6379"})
    defer client.Close()

    task := asynq.NewTask("email:send", []byte(`{"to":"lovestaco@gmail.to","subject":"Greet","body":"Hi Mom!"}`))
    info, err := client.Enqueue(task)
    if err != nil {
        log.Fatalf("could not enqueue task: %v", err)
    }
    log.Printf("[Enqueued] Task ID: %s", info.ID)
}
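
Instead of a hand-written JSON string, the payload can also be built from the EmailTask struct defined earlier; a small sketch, assuming "encoding/json" is added to the imports:

payload, err := json.Marshal(EmailTask{To: "lovestaco@gmail.to", Subject: "Greet", Body: "Hi Mom!"})
if err != nil {
    log.Fatalf("could not marshal payload: %v", err)
}
task := asynq.NewTask("email:send", payload)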

5. Processing Jobs with Workers

Setting Up an Asynq Worker

Create a worker to process the email task:

package main

import (
    "context"
    "log"
    "github.com/hibiken/asynq"
)

func emailHandler(ctx context.Context, t *asynq.Task) error {
    log.Printf("Processing email task: %s", t.Payload())
    return nil // Simulate success
}

func main() {
    srv := asynq.NewServer(asynq.RedisClientOpt{Addr: "localhost:6379"}, asynq.Config{Concurrency: 10})
    mux := asynq.NewServeMux()
    mux.HandleFunc("email:send", emailHandler)
    if err := srv.Run(mux); err != nil {
        log.Fatal(err)
    }
}
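
In a real worker the handler would decode the payload back into EmailTask before acting on it. A minimal sketch, assuming "encoding/json" and "fmt" are imported and sendEmail is a hypothetical delivery helper:

func emailHandler(ctx context.Context, t *asynq.Task) error {
    var p EmailTask
    if err := json.Unmarshal(t.Payload(), &p); err != nil {
        // A malformed payload will never succeed, so skip retries for it.
        return fmt.Errorf("invalid payload: %v: %w", err, asynq.SkipRetry)
    }
    log.Printf("Sending email to %s: %s", p.To, p.Subject)
    return sendEmail(p) // hypothetical helper that actually delivers the email
}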

Handling Job Failures and Retries

Asynq retries failed jobs automatically. You can configure the maximum number of retries when enqueuing a task:

info, err := client.Enqueue(task, asynq.MaxRetry(5))
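
Enqueue options can be combined on a single call; a few of the built-in ones, assuming "time" is imported:

info, err := client.Enqueue(task,
    asynq.MaxRetry(5),             // retry up to 5 times on failure
    asynq.Timeout(30*time.Second), // fail the attempt if the handler runs longer than this
    asynq.Unique(10*time.Minute),  // reject duplicate enqueues of the same task within 10 minutes
)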

6. Scheduling and Periodic Jobs

Delayed Tasks with asynq.ProcessIn()

Schedule a task to run after a delay (use asynq.ProcessAt to run at a specific time instead):

client.Enqueue(task, asynq.ProcessIn(10*time.Minute))

Recurring Jobs Using asynq.Scheduler

Use Asynq's built-in Scheduler to enqueue tasks on a cron or interval schedule:

scheduler := asynq.NewScheduler(asynq.RedisClientOpt{Addr: "localhost:6379"}, nil)
scheduler.Register("@every 1h", task) // standard cron specs like "0 8 * * *" also work
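
A fuller, runnable sketch of a periodic enqueuer; the email:digest task type and the schedule are assumptions. Note that the scheduler only enqueues tasks on schedule, and a worker server like the one above still processes them:

package main

import (
    "log"

    "github.com/hibiken/asynq"
)

func main() {
    scheduler := asynq.NewScheduler(
        asynq.RedisClientOpt{Addr: "localhost:6379"},
        nil, // default scheduler options
    )

    // Enqueue a digest task every day at 08:00.
    task := asynq.NewTask("email:digest", nil)
    if _, err := scheduler.Register("0 8 * * *", task); err != nil {
        log.Fatalf("could not register periodic task: %v", err)
    }

    if err := scheduler.Run(); err != nil {
        log.Fatal(err)
    }
}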

7. Advanced Features

Retry Policies

  • Exponential backoff by default; set a custom RetryDelayFunc in the server config to override it (see the sketch after this list).
  • Configure the maximum number of retries per task with asynq.MaxRetry.
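
A sketch of a custom retry delay, replacing the default exponential backoff with a simple linear one; RetryDelayFunc is part of asynq.Config (assumes "time" is imported):

srv := asynq.NewServer(
    asynq.RedisClientOpt{Addr: "localhost:6379"},
    asynq.Config{
        Concurrency: 10,
        // Wait n minutes before the n-th retry instead of the default backoff.
        RetryDelayFunc: func(n int, err error, t *asynq.Task) time.Duration {
            return time.Duration(n) * time.Minute
        },
    },
)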

Middleware

  • Logging, tracing, and monitoring (e.g., with OpenTelemetry); middleware wraps handlers via mux.Use (see the sketch below).
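
A minimal logging middleware sketch; asynq's ServeMux supports Use for wrapping handlers, much like net/http middleware (assumes "time" is imported):

func loggingMiddleware(next asynq.Handler) asynq.Handler {
    return asynq.HandlerFunc(func(ctx context.Context, t *asynq.Task) error {
        start := time.Now()
        log.Printf("started task: type=%s", t.Type())
        err := next.ProcessTask(ctx, t)
        log.Printf("finished task: type=%s elapsed=%v err=%v", t.Type(), time.Since(start), err)
        return err
    })
}

// Register it on the mux before srv.Run(mux):
// mux.Use(loggingMiddleware)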

Prioritization

  • Route jobs to different queues for high- and low-priority work (see the sketch below).
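
A sketch of weighted priority queues in the server config; with these weights, tasks in "critical" are picked up roughly six times as often as tasks in "low":

srv := asynq.NewServer(
    asynq.RedisClientOpt{Addr: "localhost:6379"},
    asynq.Config{
        Concurrency: 10,
        Queues: map[string]int{
            "critical": 6,
            "default":  3,
            "low":      1,
        },
    },
)

// Enqueue into a specific queue:
// client.Enqueue(task, asynq.Queue("critical"))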

Distributed Processing

  • Run multiple workers to scale horizontally.

8. Monitoring and Management

Using AsynqMon for Job Monitoring

Start AsynqMon:

docker run -p 8080:8080 hibiken/asynqmon --redis-addr=redis:6379

Then open http://localhost:8080 to view the dashboard. Note that redis:6379 must be resolvable from the asynqmon container, so run both containers on the same Docker network (or point --redis-addr at wherever your Redis is reachable).

Debugging Failed Tasks

  • Check logs for errors.
  • Use AsynqMon to retry or inspect tasks.

Performance Optimizations

  • Tune concurrency settings.
  • Optimize Redis performance.
  • Use worker pools efficiently.

Conclusion

Asynq makes background job processing in Go seamless and scalable.

Whether you're building AI-powered applications, media processing pipelines, or handling transactional emails, Asynq provides a robust framework for handling async tasks efficiently.

FreeDevTools

I’ve been building FreeDevTools.

A collection of UI/UX-focused tools crafted to simplify workflows, save time, and reduce the friction of hunting for tools and materials.

Feedback and contributors are welcome!

It’s online, open-source, and ready for anyone to use.

👉 Check it out: FreeDevTools
⭐ Star it on GitHub: freedevtools

Let’s make it even better together.
