You have an AI agent that finds prospects. Another that builds spec websites. A third that sends outreach. How do they talk to each other?
I tried a database. I tried flat files. Then I tried Redis, and everything got simpler.
The Problem: AI Tasks Need a Handoff Layer
When you are building autonomous pipelines — not toy demos, but systems that actually run unsupervised — the hardest part is not the AI. It is the plumbing between tasks.
My setup looks like this:
- Prospect discovery — finds local businesses with outdated websites
- Site builder — generates a modern redesign preview on GitHub Pages
- Outreach — sends SMS or email with the preview link
- Follow-up — checks tracking pixels, re-engages warm leads
Each step produces output the next step needs. A database works, but it is heavy. Flat files work, but they do not signal. What I needed was a queue — something that says "here is work, go do it."
Why Redis Fits
Redis gives you three things that matter for AI agent coordination:
Lists as queues. LPUSH to enqueue, RPOP to dequeue. Dead simple. Your prospect finder pushes business objects onto a list. Your site builder pops them off. No polling a database, no file watchers.
# Prospect finder adds work
redis-cli LPUSH prospect:queue '{"name": "Acme Auto", "url": "acmeauto.com", "phone": "555-0123"}'
# Site builder grabs next job
redis-cli RPOP prospect:queue
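The same enqueue/dequeue flow can be sketched in a few lines of Python. This is an illustration, not my production setup: the client object `r` is assumed to expose redis-py-style `lpush`/`rpop` methods (for example `redis.Redis(host="redis", port=6379, decode_responses=True)`), and the job shape matches the JSON above.

```python
import json

def enqueue_prospect(r, name: str, url: str, phone: str) -> str:
    """LPUSH a prospect job onto the queue; returns the JSON payload."""
    job = json.dumps({"name": name, "url": url, "phone": phone})
    r.lpush("prospect:queue", job)
    return job

def next_prospect(r):
    """RPOP the oldest job, or return None when the queue is empty."""
    raw = r.rpop("prospect:queue")
    return json.loads(raw) if raw is not None else None
```

In a long-running worker you would use `BRPOP` instead, which blocks until a job arrives and avoids busy-polling an empty queue.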
Sets for deduplication. When your prospect finder runs every few hours, it will rediscover the same businesses. Before enqueuing, check the set:
redis-cli SISMEMBER prospect:seen "acmeauto.com"
# Returns 0 (new) or 1 (already processed)
This is how I track saturation. When SISMEMBER returns 1 more often than 0, it is time to expand your search radius or switch categories.
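One refinement worth noting: `SADD` returns 1 only when the member is new, so the check and the mark can collapse into a single atomic call instead of `SISMEMBER` followed by `SADD`. A minimal Python sketch, again assuming a redis-py-style client:

```python
def mark_if_new(r, url: str) -> bool:
    """SADD returns 1 for a new member, 0 for a duplicate.
    True means this business has not been seen before: safe to enqueue."""
    return r.sadd("prospect:seen", url) == 1
```

Because the check-and-mark is one command, two finder runs racing on the same business cannot both enqueue it.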
TTL for temporary state. Some data should not live forever. Tracking pixel hits, rate limit counters, session tokens — give them a TTL and forget about cleanup:
redis-cli SET outreach:ratelimit:sms 15 EX 3600
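The counter flavor of this pattern increments on each send and lets the TTL reset the window. A sketch (key name and limits are illustrative, client is redis-py-style):

```python
def allow_sms(r, limit: int = 15, window: int = 3600) -> bool:
    """INCR the rate-limit counter; start the TTL on first use.
    Returns False once the window's budget is spent."""
    key = "outreach:ratelimit:sms"
    count = r.incr(key)
    if count == 1:
        r.expire(key, window)  # window starts at the first send
    return count <= limit
```

When the key expires, the next `INCR` recreates it at 1 and the budget resets. No cron job, no cleanup pass.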
My Actual Setup
I run Redis in a Docker container alongside everything else on a single VPS. The config is minimal:
redis:
  image: redis:7-alpine
  volumes:
    - redis-data:/data
  command: redis-server --appendonly yes --maxmemory 256mb --maxmemory-policy allkeys-lru
Append-only file (AOF) for persistence, 256MB cap with LRU eviction. For a solo dev pipeline processing maybe 50-100 prospects a day, this is massive overkill — and that is exactly the point. You never think about it.
The AI agent accesses Redis through simple CLI calls or a lightweight Node script. No ORM, no client library drama. redis-cli is installed on the host, and from inside containers, Redis is just redis:6379.
Patterns That Emerged
The staging pattern. Instead of going straight from discovery to outreach, I queue into prospect:staged. A separate review step (sometimes human, sometimes AI) moves approved prospects to prospect:ready. This gives me a kill switch without touching code.
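On Redis 6.2+, the approval step maps onto `LMOVE`, which pops from one list and pushes onto another atomically, so a job can never be lost in between. A sketch of what the review step might look like, assuming a redis-py-style client:

```python
def approve_next(r):
    """Atomically move the oldest staged prospect to the ready queue.
    Returns the moved job, or None when nothing is staged."""
    # src="RIGHT", dest="LEFT": pop the oldest, push it as the newest
    return r.lmove("prospect:staged", "prospect:ready", "RIGHT", "LEFT")
```

Because the move is atomic, killing the review process mid-run leaves every job in exactly one of the two lists.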
The dead letter queue. When outreach fails — bad phone number, bounced email — the prospect goes to prospect:failed instead of disappearing. I review these weekly. Sometimes the data just needs cleaning.
The counter pattern. I track daily stats with expiring keys:
redis-cli INCR stats:prospects:2026-03-19
redis-cli EXPIRE stats:prospects:2026-03-19 604800 # 7 days
Seven days of rolling stats, zero maintenance.
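The same pattern as a small helper (a sketch: the key scheme mirrors the one above, and the 7-day TTL is set only when the day's counter is first created, so repeated increments do not keep pushing the expiry out):

```python
from datetime import date

def bump_daily(r, metric: str = "prospects") -> int:
    """INCR today's counter for a metric; EXPIRE it on first increment."""
    key = f"stats:{metric}:{date.today().isoformat()}"
    count = r.incr(key)
    if count == 1:
        r.expire(key, 7 * 24 * 3600)  # 604800 seconds, 7 days
    return count
```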
What I Would Not Use Redis For
Anything that needs complex queries. If you want "show me all prospects in Miami that were contacted more than 7 days ago and have not responded" — use Postgres. Redis is for flow control, not analytics.
Anything that needs transactions across multiple data types. Redis transactions (MULTI/EXEC) exist, but they only queue commands for atomic execution with no rollback; they are not what you want for business logic.
Long-term storage of records you might need for compliance. Redis is a buffer, not an archive.
The Takeaway
If you are building AI automations that have more than one step, you need a coordination layer. Redis is the lowest-friction option I have found. It took 10 minutes to add to my Docker Compose, and it immediately simplified three different pipelines.
The AI is the flashy part. The queue is what makes it work while you sleep.