I wanted to carve out a niche in backend engineering, and I was drawn to real-time systems. I wanted to understand how real-time systems actually work under the hood: not just use them, but build one myself. So I built a stock price tracker that fetches live prices every 60 seconds, calculates SMAs, detects crossover alerts, and pushes everything to connected clients over WebSocket.
Here's what I learned.
What it does
Every 60 seconds:
- Fetches live prices for 15 stocks from Finnhub API
- Saves them to the database (SQLite for now)
- Caches the last 5 prices per stock in Redis
- Calculates a 5-period SMA from the cache
- Detects bullish/bearish crossover alerts
- Broadcasts everything to connected WebSocket clients in one message
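The 60-second cadence comes from Celery Beat. Here's a minimal sketch of what the schedule entry could look like, assuming a hypothetical task path `stocks.tasks.fetch_prices` (the real project's module names may differ):

```python
from datetime import timedelta

# settings.py (sketch). The task path is a placeholder; adjust to your layout.
CELERY_BEAT_SCHEDULE = {
    "fetch-stock-prices": {
        "task": "stocks.tasks.fetch_prices",  # fetches, caches, and broadcasts
        "schedule": timedelta(seconds=60),    # run every 60 seconds
    },
}
```

Celery Beat reads this schedule and enqueues the task; a separate worker process picks it up and does the actual fetching.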
The stack
- Django + DRF
- Celery + Celery Beat (task scheduling)
- Redis (caching + Channels backend)
- Django Channels (WebSocket)
- Uvicorn (ASGI server)
- Finnhub API
- SQLite for the database, since I was having issues with PostgreSQL on my Mac
The part that clicked for me: Redis can do much more than one job, and implementing the Redis List cache showed me why data structures actually matter in practice.
I had used Redis quite a few times in other projects, mostly as the message broker for Celery background tasks, but this project exposed me to its other capabilities. I had watched videos and heard that Redis is great for caching, but I had never implemented a cache myself.
In this project Redis is doing three different jobs:
1. Celery broker — passes tasks between Celery Beat and the worker
2. Price cache — stores the last 5 prices per stock as a Redis List
3. Channel Layer backend — lets Celery talk to Django Channels to broadcast WebSocket messages
Same Redis instance, three completely different purposes. That was a lightbulb moment.
How the caching works
Each stock has a Redis List that holds its last 5 prices. On every update I use two commands:
RPUSH stock:AAPL:prices 255.78 → add new price to the end
LTRIM stock:AAPL:prices -5 -1 → keep only the last 5 prices
That's it. The list always has a maximum of 5 items and the oldest price falls off automatically.
I also used Redis pipelines to batch these commands together. Instead of making separate round-trips to Redis for each stock, I queue all the commands and execute them in one go. With 15 stocks that's the difference between 60 round-trips and 2.
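Here's a pure-Python sketch of the rolling-window semantics, with a hypothetical key name for illustration. It mimics what RPUSH followed by LTRIM -5 -1 does to the list, without needing a live Redis instance:

```python
def push_and_trim(cache, key, price, keep=5):
    """Mimic RPUSH + LTRIM -keep -1: append the new price, keep the last `keep`."""
    window = cache.setdefault(key, [])
    window.append(price)   # RPUSH: add to the end
    del window[:-keep]     # LTRIM -keep -1: drop everything but the last `keep`
    return window

cache = {}
for price in [100.0, 101.0, 102.0, 103.0, 104.0, 105.0, 106.0]:
    push_and_trim(cache, "stock:AAPL:prices", price)

print(cache["stock:AAPL:prices"])  # [102.0, 103.0, 104.0, 105.0, 106.0]
```

With redis-py, the real version queues the same two commands per stock on `r.pipeline()` and calls `execute()` once, which is how all 15 stocks fit into a single round-trip.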
How the SMA and alerts work
Once 5 prices are cached, calculating SMA is just:
sma = sum(last_5_prices) / 5
For alerts I compare the previous price against the previous SMA, and the current price against the current SMA. If the price crossed the line between those two readings, that's a crossover:
# Bullish: price was below SMA, now above
if previous_price < previous_sma and current_price > current_sma:
    alert = "bullish"
# Bearish: price was above SMA, now below
if previous_price > previous_sma and current_price < current_sma:
    alert = "bearish"
You need the previous values to detect a crossing. That's why caching the previous price and SMA matters: you have something to compare against on the next update.
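Putting the two snippets together, here's a self-contained sketch of the SMA and crossover check. The prices are made-up numbers for illustration:

```python
def sma(prices):
    """Simple moving average over the cached price window."""
    return sum(prices) / len(prices)

def detect_crossover(prev_price, prev_sma, curr_price, curr_sma):
    """Return 'bullish', 'bearish', or None depending on whether the
    price crossed the SMA between the previous and current readings."""
    if prev_price < prev_sma and curr_price > curr_sma:
        return "bullish"  # price was below the SMA, now above
    if prev_price > prev_sma and curr_price < curr_sma:
        return "bearish"  # price was above the SMA, now below
    return None

# Hypothetical readings: price climbs from below the SMA to above it
print(detect_crossover(254.00, 254.50, 255.78, 254.32))   # bullish
print(round(sma([255.78, 254.10, 253.90, 254.60, 253.22]), 2))  # 254.32
```

Note that a price sitting exactly on the SMA triggers neither branch, which is a reasonable default for a simple alert.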
How real-time broadcasting works
This was the trickiest part to understand. Celery and Django are separate processes. How does a background task push a message to a connected WebSocket client?
The answer is the Channel Layer.
Celery task finishes processing
↓
Publishes message to Channel Layer (Redis)
↓
Django Channels picks it up
↓
Pushes to all connected WebSocket clients
Redis is the bridge between the two processes. Celery writes to it, Channels reads from it.
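Here's a toy model of that bridge using a plain `queue.Queue` standing in for Redis. It's only meant to make the producer/channel/consumer flow concrete, not to show how Channels is actually wired; in the real task you'd call `async_to_sync(get_channel_layer().group_send)(...)` instead:

```python
import queue

channel_layer = queue.Queue()  # stand-in for the Redis channel layer

def celery_task_finished(payload):
    """Producer side: the Celery task publishes its result to the channel."""
    channel_layer.put({"type": "stock_update", **payload})

def channels_consumer():
    """Consumer side: Django Channels reads the message and would push it
    out to every connected WebSocket client."""
    return channel_layer.get()

celery_task_finished({"stocks": [{"ticker": "AAPL", "price": 255.78}]})
msg = channels_consumer()
print(msg["type"])  # stock_update
```

The real channel layer adds groups, serialization, and async plumbing, but the shape of the flow is the same: one process puts a message in, another takes it out.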
The WebSocket message looks like this:
{
  "type": "stock_update",
  "timestamp": "2026-02-14T21:38:22+00:00",
  "stocks": [
    { "ticker": "AAPL", "price": 255.78, "sma": 254.32, "alert": null },
    { "ticker": "MSFT", "price": 401.32, "sma": 399.80, "alert": "bullish" },
    { "ticker": "TSLA", "price": 417.44, "sma": 419.10, "alert": "bearish" }
  ]
}
All 15 stocks in one message, every 60 seconds, automatically.
Running two servers in development
One thing that tripped me up, and it was a struggle to figure out: Django's runserver doesn't support WebSockets. It's a WSGI server, which is request/response only. WebSockets need a persistent connection, so you need an ASGI server.
I run both during development:
# For the DRF browsable API
python manage.py runserver # port 8000
# For WebSocket connections
uvicorn core.asgi:application --port 8001
REST endpoints on 8000, WebSocket on 8001.
What I actually learned
Going in I knew Django and had used Redis a little. Coming out I understand:
- How background task scheduling works with Celery Beat
- How Redis Lists work and why they're perfect for rolling windows of data
- What Redis pipelines are and why batching matters
- The difference between WSGI and ASGI
- How Django Channels uses a Channel Layer to bridge async and sync code
- How to structure a real-time data pipeline end to end
Building this made real-time systems a lot less mysterious. It's not magic — it's just a producer, a channel, and a consumer.
Source code
GitHub: Stock-Price-Tracker-API
Future plans
- Simple frontend to show how it works
- Automatically skipping API calls when the market is closed
- Migrate database to PostgreSQL