# Real-Time Is a Spectrum
Every real-time feature has different requirements. A collaborative document editor and a live sports scoreboard have almost nothing in common, except that both need updates without a page refresh. Choosing the wrong transport layer wastes engineering time and creates reliability nightmares.
## The Three Options
### 1. Long Polling
The oldest trick: the client makes a request, the server holds it open until data arrives (or a timeout fires), and the client immediately re-requests.
```typescript
// Simple promise-based delay used for error backoff
const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function longPoll(lastEventId: string) {
  try {
    const response = await fetch(`/api/events?after=${lastEventId}&timeout=30`);
    const data = await response.json();
    processEvents(data.events);
    longPoll(data.lastEventId); // immediately reconnect
  } catch (error) {
    await sleep(1000); // back off on error
    longPoll(lastEventId);
  }
}
```
Server:
```typescript
app.get('/api/events', async (req, res) => {
  const { after, timeout = 30 } = req.query;
  const events = await waitForEvents(after, parseInt(String(timeout), 10) * 1000);
  res.json({ events, lastEventId: events.at(-1)?.id ?? after });
});
```
**Pros:** Works everywhere. No special infrastructure. Firewall-friendly.

**Cons:** High latency (one round trip per event). Server holds connections open. Inefficient for high-frequency updates.

**Use when:** Legacy infrastructure. Updates are infrequent. Maximum compatibility required.
### 2. Server-Sent Events (SSE)
HTTP streaming: one long-lived HTTP connection where the server pushes events.
```typescript
// Client
const eventSource = new EventSource('/api/stream');

eventSource.addEventListener('price-update', (event) => {
  const data = JSON.parse(event.data);
  updateUI(data);
});

eventSource.addEventListener('error', () => {
  // Browser auto-reconnects after network errors
});

// Cleanup
eventSource.close();
```
Server (Node.js/Express):
```typescript
app.get('/api/stream', (req, res) => {
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');

  // Send event
  const sendEvent = (type: string, data: unknown) => {
    res.write(`event: ${type}\n`);
    res.write(`data: ${JSON.stringify(data)}\n\n`);
  };

  // Keep-alive ping (comment lines starting with ':' are ignored by EventSource)
  const ping = setInterval(() => {
    res.write(': ping\n\n');
  }, 30000);

  // Subscribe to events; keep a reference to the handler so we can unsubscribe
  // (EventEmitter#on returns the emitter, not an unsubscribe function)
  const onPriceUpdate = (data: unknown) => sendEvent('price-update', data);
  eventEmitter.on('price-update', onPriceUpdate);

  req.on('close', () => {
    clearInterval(ping);
    eventEmitter.off('price-update', onPriceUpdate);
  });
});
```
**Pros:** Simple. Native browser support. Auto-reconnect. Works well over HTTP/2. Stateless server possible.

**Cons:** Unidirectional (server → client only). Browsers cap HTTP/1.1 at about six connections per domain, and each SSE stream counts against that limit (HTTP/2 multiplexing removes the problem).

**Use when:** Dashboards, live feeds, notifications. Anything that only needs server → client.
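One SSE nicety worth using: if the server writes `id:` lines, the browser's auto-reconnect sends the last delivered id back in a `Last-Event-ID` request header, so the stream can resume without gaps. A small framing helper (the function name is illustrative) makes the wire format explicit:

```typescript
// Serialize one SSE event. The optional `id:` line is what enables
// resume-after-reconnect via the Last-Event-ID request header.
function formatSSE(type: string, data: unknown, id?: string): string {
  const lines: string[] = [];
  if (id !== undefined) lines.push(`id: ${id}`);
  lines.push(`event: ${type}`);
  // Multi-line payloads need one `data:` field per line
  for (const line of JSON.stringify(data).split('\n')) {
    lines.push(`data: ${line}`);
  }
  return lines.join('\n') + '\n\n';
}
```

On reconnect, the handler can read `req.headers['last-event-id']` and replay anything newer before streaming live events.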
### 3. WebSockets
Full-duplex TCP connection. Both sides can send at any time.
```typescript
// Client
const ws = new WebSocket('wss://api.example.com/ws');

ws.onopen = () => {
  ws.send(JSON.stringify({ type: 'subscribe', channel: 'btc-usd' }));
};

ws.onmessage = (event) => {
  const message = JSON.parse(event.data);
  handleMessage(message);
};

// Client can also send (guard against sending before the socket is open)
document.addEventListener('keydown', (e) => {
  if (ws.readyState === WebSocket.OPEN) {
    ws.send(JSON.stringify({ type: 'keystroke', key: e.key }));
  }
});
```
Server (Node.js with ws library):
```typescript
import { WebSocketServer, WebSocket } from 'ws';

const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', (ws) => {
  ws.on('message', (data) => {
    const message = JSON.parse(data.toString());
    if (message.type === 'subscribe') {
      subscribeToChannel(ws, message.channel);
    }
  });

  ws.on('close', () => {
    cleanupSubscriptions(ws);
  });
});

// Broadcast to all subscribers
function broadcastToChannel(channel: string, data: unknown) {
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN && isSubscribed(client, channel)) {
      client.send(JSON.stringify(data));
    }
  });
}
```
**Pros:** True bidirectional. Low latency. Efficient for high-frequency updates.

**Cons:** Stateful connections. Harder to scale horizontally. More complex error handling.

**Use when:** Chat, collaborative editing, multiplayer games, trading platforms.
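Part of that "more complex error handling": unlike `EventSource`, a raw WebSocket never reconnects on its own, so production clients wrap it with retry logic. A minimal sketch of capped exponential backoff (the constants and the `SocketLike` shape are illustrative; in a browser, `makeSocket` would just be `() => new WebSocket(url)`):

```typescript
// Capped exponential backoff: 1s, 2s, 4s, ... topping out at 30s
const BASE_MS = 1_000;
const MAX_MS = 30_000;

function backoffDelay(attempt: number): number {
  return Math.min(BASE_MS * 2 ** attempt, MAX_MS);
}

// Minimal structural type so the sketch doesn't depend on browser globals
type SocketLike = { onopen: (() => void) | null; onclose: (() => void) | null };

function connectWithRetry(makeSocket: () => SocketLike, attempt = 0) {
  const ws = makeSocket();
  ws.onopen = () => { attempt = 0; }; // reset backoff after a good connection
  ws.onclose = () =>
    setTimeout(() => connectWithRetry(makeSocket, attempt + 1), backoffDelay(attempt));
}
```

Real implementations usually also add jitter so a fleet of clients doesn't reconnect in lockstep after a server restart.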
## Decision Matrix
| Scenario | Recommendation |
|---|---|
| Live dashboard / feed | SSE |
| Collaborative document | WebSocket |
| Notification system | SSE |
| Chat application | WebSocket |
| Live auction | WebSocket |
| Stock ticker (read-only) | SSE |
| Multiplayer game | WebSocket |
| Progress updates | SSE |
| Old enterprise proxy | Long Polling |
## Scaling Considerations
### SSE Scaling
```typescript
// Use Redis pub/sub to fan out across instances
import { createClient } from 'redis';

const subscriber = createClient();
await subscriber.connect(); // required before subscribing (node-redis v4+)
await subscriber.subscribe('events', (message) => {
  // Broadcast to all SSE connections on this instance
  sseClients.forEach((client) => client.write(`data: ${message}\n\n`));
});
```
### WebSocket Scaling
```typescript
// Socket.io with Redis adapter
// (pubClient and subClient are two connected redis clients)
import { createAdapter } from '@socket.io/redis-adapter';

io.adapter(createAdapter(pubClient, subClient));
// Now messages route across all instances
```
## The Boring Truth
For most applications:
- Start with SSE—it's simpler, stateless, and handles 90% of use cases
- Upgrade to WebSockets only when you need client → server real-time communication
- Fall back to long polling only when you're stuck with 2010-era infrastructure
The flashiest choice is rarely the right one.
Want pre-built real-time infrastructure for your SaaS? Whoff Agents AI SaaS Starter Kit includes WebSocket and SSE patterns ready to deploy.