At Fotify, we manage thousands of real-time photo uploads and live event interactions every week. Behind the scenes, all that speed and reliability comes down to how we handle our background jobs — and Redis + BullMQ running on Hetzner Cloud plays a big role in that.
In this post, I’ll share how we set up our queue system, the lessons learned, and a few tips for keeping it scalable and efficient.
🧠 Why We Needed Queues
When guests upload photos during an event, each upload triggers multiple asynchronous tasks:
- Image optimization and metadata extraction
- Upload to Cloudflare R2 for storage
- AI-based content filtering
- Push notifications for real-time photo walls
Doing all that directly in the request cycle would slow down the user experience. So, we moved those tasks into background queues.
⚙️ The Stack
Our setup is simple and battle-tested:
- Redis (single node on Hetzner Cloud CX21)
- BullMQ for queue management
- NestJS workers that process the jobs
- Cloudflare R2 for image storage
- PostgreSQL for core app data
We use Docker Compose to deploy the entire stack on Hetzner. Redis runs in a dedicated container, while the workers and API run separately for better isolation.
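To make that layout concrete, here is a stripped-down sketch of what such a compose file might look like. The service names, image tag, worker entrypoint, and volume name are illustrative placeholders, not our exact configuration:

```yaml
services:
  redis:
    image: redis:7-alpine                  # dedicated Redis container
    command: ["redis-server", "--appendonly", "yes"]
    volumes:
      - redis-data:/data

  api:
    build: .
    environment:
      - REDIS_HOST=redis                   # API reaches Redis over the compose network
    depends_on:
      - redis

  worker:
    build: .
    command: ["node", "dist/worker.js"]    # hypothetical worker entrypoint
    environment:
      - REDIS_HOST=redis
    depends_on:
      - redis

volumes:
  redis-data:
```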
🚀 How BullMQ Fits In
BullMQ gives us a clean interface to create, manage, and monitor queues.
Each feature has its own queue — for example:
- `photo-processing`
- `push-notifications`
- `analytics`
In our NestJS service, we inject a queue like this:
```ts
import { Injectable } from '@nestjs/common';
import { InjectQueue } from '@nestjs/bullmq';
import { Queue } from 'bullmq';

@Injectable()
export class PhotoService {
  constructor(@InjectQueue('photo-processing') private queue: Queue) {}

  async handleUpload(photo: Photo) {
    // Enqueue the heavy work and return to the caller immediately.
    await this.queue.add('optimize', { photoId: photo.id });
  }
}
```
This decouples our real-time uploads from heavier tasks that can safely run in the background.
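On the consuming side, the queue has to be registered in a module and picked up by a worker. Here is a minimal sketch of both, assuming the standard `@nestjs/bullmq` setup; the `PhotoProcessor` and `PhotoModule` class names and the `REDIS_HOST` environment variable are illustrative, not our exact code:

```ts
import { Module } from '@nestjs/common';
import { BullModule, Processor, WorkerHost } from '@nestjs/bullmq';
import { Job } from 'bullmq';

// Worker: BullMQ calls process() for every job added to 'photo-processing'.
@Processor('photo-processing')
export class PhotoProcessor extends WorkerHost {
  async process(job: Job<{ photoId: string }>): Promise<void> {
    if (job.name === 'optimize') {
      // Placeholder for the real pipeline: resize, extract metadata, upload to R2.
    }
  }
}

@Module({
  imports: [
    // Shared Redis connection config for every queue registered in this app.
    BullModule.forRoot({
      connection: { host: process.env.REDIS_HOST ?? 'localhost', port: 6379 },
    }),
    BullModule.registerQueue({ name: 'photo-processing' }),
  ],
  providers: [PhotoProcessor],
})
export class PhotoModule {}
```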
⚡ Performance Tips
A few lessons learned after running this setup for months:
- Reuse Redis connections — avoid creating a new Redis connection for every queue instance; share one connection across queues instead (see the sketch after this list).
- Add retry & backoff strategies — jobs can fail on transient network issues, so let BullMQ retry them with backoff (also shown below).
- Enable metrics — we use Bull Board to monitor queue performance.
- Deploy Redis close to your API — latency between the app and Redis affects throughput.
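For example, a shared ioredis connection plus per-job retry options might look like the following sketch. The queue variables, the `enqueueOptimize` helper, the `REDIS_HOST` variable, and the retry numbers are illustrative:

```ts
import { Queue } from 'bullmq';
import IORedis from 'ioredis';

// One shared ioredis connection, reused by every queue instead of one per queue.
const connection = new IORedis({
  host: process.env.REDIS_HOST ?? 'localhost',
  maxRetriesPerRequest: null, // BullMQ requires this when the connection is also used by workers
});

// Both queues reuse the same connection.
const photoQueue = new Queue('photo-processing', { connection });
const notificationQueue = new Queue('push-notifications', { connection });

// Retry transient failures with exponential backoff.
async function enqueueOptimize(photoId: string) {
  await photoQueue.add(
    'optimize',
    { photoId },
    { attempts: 5, backoff: { type: 'exponential', delay: 1000 } },
  );
}
```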
🧩 Why Hetzner
Hetzner’s servers give us excellent performance for the price. A small Redis instance on Hetzner easily handles tens of thousands of jobs per hour.
We use snapshots and weekly backups to ensure nothing gets lost.
🌐 Try Fotify
If you’re into event tech or photo-sharing platforms, check out Fotify — our real-time photo-sharing app for events. It’s built entirely with Next.js, NestJS, Redis, and BullMQ, all deployed on Hetzner.
Thanks for reading!
If you’ve built something similar or have tips for scaling Redis queues, I’d love to hear from you in the comments.