Stop Defaulting to WebSockets: A Practical Guide to SSE, Polling, and Knowing When You Actually Need Them
You've been there. A ticket lands in your sprint: "Make the dashboard real-time." Your brain immediately jumps to WebSockets. Open a persistent connection, push data to the client, done — right?
Not so fast.
In my experience (and the experience of plenty of devs who've debugged WebSocket auth issues at 2 AM), WebSockets are overkill for most "real-time" features. Notifications, live feeds, progress bars — these are one-way pushes from server to client. You don't need a two-way persistent connection for that.
This article gives you a practical decision framework and working code for the three main approaches — WebSockets, Server-Sent Events (SSE), and polling — so you can stop reaching for the heaviest tool and start picking the right one.
The One Question That Decides Everything
Before you write a single line of code, answer this:
Does the client need to send data back to the server through the same connection?
- No → You probably don't need WebSockets. SSE or polling will do.
- Yes, and it's frequent/continuous → WebSockets are the right call.
That's it. That's the framework. Let me show you why, with code.
Approach 1: Polling — The Underrated Workhorse
Polling gets a bad reputation. Developers treat it like the "amateur" option you eventually graduate away from. But for a lot of features, it's genuinely the best choice.
When to Use Polling
- Updates every 30 seconds or more are acceptable
- You have few concurrent users checking the same resource
- You want maximum simplicity and debuggability
- Your data changes infrequently (e.g., a job status that updates every minute)
The Code
Client-side, it's laughably simple:
```javascript
async function checkJobStatus(jobId) {
  const res = await fetch(`/api/jobs/${jobId}/status`);
  const { status, progress } = await res.json();
  updateProgressBar(progress);
  return status;
}

// Poll every 15 seconds until the job is done
const intervalId = setInterval(async () => {
  const status = await checkJobStatus('job_123');
  if (status === 'completed' || status === 'failed') {
    clearInterval(intervalId);
  }
}, 15_000);
```
Server-side is just a regular endpoint:
```javascript
// Express.js
app.get('/api/jobs/:id/status', async (req, res) => {
  const job = await JobRepository.findById(req.params.id);
  res.json({ status: job.status, progress: job.progress });
});
```
Why It Works
- Zero persistent connections — the server doesn't hold state between requests
- Auth is trivial — regular HTTP, cookies and headers just work
- Debugging is easy — open the Network tab, see the requests, done
- Works everywhere — no browser compatibility concerns, no proxy issues
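One refinement worth making to the `setInterval` loop above: intervals fire on schedule even when a request is slow, so responses can overlap. A recursive `setTimeout` only schedules the next poll after the previous response lands, and makes backoff on errors easy to add. A sketch (the endpoint and 15-second base interval are carried over from the example above; the 60-second cap is an arbitrary choice):

```javascript
// Double the wait after each consecutive failure, capped at 60s.
function nextDelay(baseMs, failures) {
  return Math.min(baseMs * 2 ** failures, 60_000);
}

// Recursive setTimeout: the next request is scheduled only after the
// previous one finishes, so slow responses never pile up.
function pollJob(jobId, baseMs = 15_000) {
  let failures = 0;

  async function tick() {
    try {
      const res = await fetch(`/api/jobs/${jobId}/status`);
      const { status, progress } = await res.json();
      updateProgressBar(progress);
      failures = 0;
      if (status === 'completed' || status === 'failed') return; // stop polling
    } catch {
      failures++; // network error: back off before retrying
    }
    setTimeout(tick, nextDelay(baseMs, failures));
  }

  tick();
}
```

The backoff also means a flaky network or a struggling server sees progressively fewer requests instead of a steady hammering.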
The Drawback
If you need sub-second updates or have thousands of clients hammering the same endpoint, polling wastes resources. That's where SSE shines.
Approach 2: Server-Sent Events — The Sweet Spot
SSE is the tool most developers overlook. It gives you server-push without the complexity of WebSockets, and it's built into the browser — no libraries needed.
When to Use SSE
- The server needs to push updates to the client
- The client does NOT need to send data back through the same channel
- You want automatic reconnection handled for you
- You're building: notifications, live feeds, progress updates, stock tickers
The Code
Client-side takes only a few lines:
```javascript
const stream = new EventSource('/api/notifications/stream');

stream.onmessage = (event) => {
  const data = JSON.parse(event.data);
  updateNotificationBadge(data.unreadCount);
};

// The browser auto-reconnects on disconnect. That's part of the spec.
```
Server-side with Express:
```javascript
app.get('/api/notifications/stream', (req, res) => {
  // Set SSE headers
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');

  // Send initial data
  res.write(`data: ${JSON.stringify({ unreadCount: 5 })}\n\n`);

  // Push updates when they happen
  const onNewNotification = (notification) => {
    res.write(`data: ${JSON.stringify(notification)}\n\n`);
  };
  notificationEmitter.on('new', onNewNotification);

  // Clean up on disconnect
  req.on('close', () => {
    notificationEmitter.off('new', onNewNotification);
  });
});
```
Named Events for Structured Data
SSE supports event types, which lets you handle different update kinds without parsing:
```javascript
// Server
res.write(`event: message\ndata: ${JSON.stringify({ text: 'Hello' })}\n\n`);
res.write(`event: badge\ndata: ${JSON.stringify({ count: 3 })}\n\n`);
```

```javascript
// Client
const stream = new EventSource('/api/stream');

stream.addEventListener('message', (e) => {
  // Handle chat messages
});

stream.addEventListener('badge', (e) => {
  // Handle badge updates
});
```
SSE with Node.js Streams (Production Grade)
For production, use proper stream handling and heartbeats to keep connections alive through proxies:
```javascript
import { PassThrough } from 'stream';

app.get('/api/events', (req, res) => {
  const stream = new PassThrough();
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');
  stream.pipe(res);

  // Send a heartbeat every 30s to keep the connection alive through proxies
  const heartbeat = setInterval(() => {
    stream.write(':heartbeat\n\n'); // Comment lines are ignored by EventSource
  }, 30_000);

  // Push real events
  const onEvent = (data) => {
    stream.write(`data: ${JSON.stringify(data)}\n\n`);
  };
  eventBus.on('update', onEvent);

  // Clean up on disconnect
  req.on('close', () => {
    clearInterval(heartbeat);
    eventBus.off('update', onEvent);
    stream.end();
  });
});
```
Why SSE Beats WebSockets for Server-Push
- Auth mostly just works — it's plain HTTP, so cookies and server middleware compose naturally (one caveat: the native `EventSource` can't set custom request headers, so `Authorization`-header auth needs cookies, a query token, or a fetch-based SSE client)
- Auto-reconnection — the browser's `EventSource` reconnects on its own, including sending the `Last-Event-ID` header so the server can resume from where it left off
- Simpler server — each connection is a regular HTTP response, not a stateful socket you have to manage
- Proxy-friendly — corporate proxies and load balancers handle long-lived HTTP responses better than WebSocket upgrade requests
- No framing protocol — you write text, the browser parses it. No binary framing, no opcode handling
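The auto-resume behavior depends on the server tagging each event with an `id:` field; a `retry:` field tunes the browser's reconnect delay. The field names come from the SSE spec; the serializer helper below is illustrative:

```javascript
// Serialize one SSE event. "id:" enables resume via Last-Event-ID,
// "retry:" tells the browser how many ms to wait before reconnecting.
// Each field is its own line; a blank line terminates the event.
function formatSseEvent({ id, event, retry, data }) {
  let out = '';
  if (id !== undefined) out += `id: ${id}\n`;
  if (event) out += `event: ${event}\n`;
  if (retry) out += `retry: ${retry}\n`;
  out += `data: ${JSON.stringify(data)}\n\n`;
  return out;
}

// Example: res.write(formatSseEvent({ id: 42, retry: 5000, data: { unreadCount: 6 } }));
```

On reconnect, the browser automatically sends a `Last-Event-ID: 42` header, so the server can replay anything newer than that id instead of starting from scratch.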
The Limitation
It's one-way. The client can only receive. When the user takes an action, you send a regular fetch() POST. For notifications and feeds, that's all you need.
Approach 3: WebSockets — When You Actually Need Them
WebSockets are the right tool when both of these are true:
- The client sends data frequently (not just occasional clicks)
- Low-latency, continuous communication is required in both directions
Real Use Cases Where WebSockets Win
| Use Case | Why WebSocket |
|---|---|
| Chat application | Sending and receiving messages constantly |
| Collaborative editing | Multiple users typing simultaneously, real-time cursor sync |
| Multiplayer games | Continuous state sync in both directions |
| Live trading dashboard | High-frequency price streaming + user orders going out |
The Code (with ws library)
Server:
```javascript
import { WebSocketServer, WebSocket } from 'ws';

const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', (ws, req) => {
  // ⚠️ Auth is NOT automatic — you have to handle it yourself
  const token = new URL(req.url, 'http://localhost').searchParams.get('token');
  if (!verifyToken(token)) {
    ws.close(4001, 'Unauthorized');
    return;
  }

  ws.on('message', (data) => {
    const message = JSON.parse(data.toString()); // ws delivers a Buffer
    // Broadcast to all connected clients
    wss.clients.forEach((client) => {
      if (client.readyState === WebSocket.OPEN) {
        client.send(JSON.stringify(message));
      }
    });
  });
});
```
Client:
```javascript
let ws;
let reconnectAttempts = 0;

function connect() {
  ws = new WebSocket('ws://localhost:8080?token=your-auth-token');

  ws.onopen = () => {
    reconnectAttempts = 0;
    console.log('Connected');
  };

  ws.onmessage = (event) => {
    const data = JSON.parse(event.data);
    handleMessage(data);
  };

  // You must implement your own reconnection logic
  ws.onclose = () => {
    const delay = Math.min(1000 * 2 ** reconnectAttempts, 30_000);
    setTimeout(connect, delay);
    reconnectAttempts++;
  };
}

connect();
```
What You're Signing Up For
- Manual reconnection — the browser doesn't auto-reconnect WebSocket. You write that code yourself.
- Manual auth — the browser's `WebSocket` API can't set custom headers on the upgrade request (cookies are sent, but `Authorization` headers aren't), so you typically pass tokens in query params or authenticate after connecting.
- Connection state management — you need to track whether the socket is `CONNECTING`, `OPEN`, `CLOSING`, or `CLOSED`, and show appropriate UI.
- Memory cost — every connected client holds a persistent connection in server memory. 10,000 idle connections = 10,000 open sockets consuming RAM.
- Proxy/CDN issues — some corporate proxies and CDNs don't support WebSocket upgrades. You'll need fallback transports.
None of this is unmanageable, but it's significant overhead if you're building a notification badge.
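Connection state tracking usually ends in some variant of a guarded send: check `readyState` before every `send()`, and queue anything that can't go out yet. A minimal sketch (the helper names are made up for illustration; `OPEN` is the standard `WebSocket.OPEN` value):

```javascript
const OPEN = 1; // value of WebSocket.OPEN

// Queue messages while the socket isn't OPEN; flush them in order once it is.
function createSafeSender(socket) {
  const queue = [];

  function flush() {
    while (queue.length && socket.readyState === OPEN) {
      socket.send(queue.shift());
    }
  }

  function send(msg) {
    if (socket.readyState === OPEN) {
      flush(); // drain anything queued earlier so ordering is preserved
      socket.send(msg);
    } else {
      queue.push(msg);
    }
  }

  return { send, flush, queued: () => queue.length };
}
```

Call `flush()` from your `onopen` handler so queued messages go out as soon as the connection (re)establishes. This is exactly the kind of bookkeeping SSE and polling never make you write.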
Decision Framework: Pick in 60 Seconds
```
Is the communication bidirectional and high-frequency?
├── YES → Use WebSockets
│         Examples: chat, collaborative editing, games
│
└── NO → Are updates needed faster than every 30 seconds?
    ├── YES → Use Server-Sent Events
    │         Examples: notifications, live feeds, progress bars
    │
    └── NO → Use Polling
              Examples: job status, background task progress
```
Here's a more detailed comparison:
| Factor | Polling | SSE | WebSockets |
|---|---|---|---|
| Direction | Client → Server | Server → Client | Bidirectional |
| Auth | Automatic (HTTP) | Automatic (HTTP) | Manual |
| Reconnection | N/A (stateless) | Automatic (browser spec) | Manual |
| Complexity | Minimal | Low | High |
| Latency | 15-60s typical | Near-instant | Instant |
| Server memory | None | Low (HTTP response) | High (persistent socket) |
| Browser support | Universal | All modern browsers (no IE; polyfill needed) | Universal (IE10+) |
| Proxy/CDN friendly | Yes | Yes | Sometimes |
Real-World Scenario: Building a Notification System
Let's walk through a concrete example. You're building a notification center for a SaaS app:
Requirements:
- Show unread count badge in the header
- Push new notifications as they arrive
- User can mark as read (occasional action)
The Wrong Approach: WebSocket
```javascript
// Setting up WebSocket for... a notification badge
const ws = new WebSocket('wss://api.example.com/notifications');

ws.onmessage = (event) => {
  const data = JSON.parse(event.data);
  updateBadge(data.unreadCount);
};

// When user marks as read
document.getElementById('mark-read').addEventListener('click', () => {
  ws.send(JSON.stringify({ action: 'mark_read', id: notificationId }));
});
```
This works, but you've now committed to: managing reconnection, handling auth separately, tracking connection state, and consuming server memory for every idle user with a tab open.
The Right Approach: SSE + fetch
```javascript
// Notifications come in via SSE (server pushes to client)
const stream = new EventSource('/api/notifications/stream?token=xxx');

stream.onmessage = (event) => {
  const data = JSON.parse(event.data);
  updateBadge(data.unreadCount);
  addNotificationToFeed(data.notification);
};

// Marking as read is a simple POST (client sends to server)
document.getElementById('mark-read').addEventListener('click', () => {
  fetch(`/api/notifications/${notificationId}/read`, { method: 'POST' });
});
```
Same UX. Half the complexity. Auto-reconnection for free. Auth rides on plain HTTP. Your load balancer understands it.
FAQ
Isn't polling wasteful?
For high-frequency updates, yes. But if you're checking job status every 30 seconds and you have 100 concurrent users, that's ~3 requests/second total. Your server can handle that in its sleep. The waste is relative to your actual requirements.
Can SSE send binary data?
No. SSE is text-only (UTF-8). If you need native binary frames, use WebSockets.
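That said, if the binary payload is small, a common workaround is base64-encoding it into the text stream and decoding on the client; the ~33% encoding overhead is what eventually argues for WebSockets. A sketch (Node on the server side; `atob` on the client, which is also a global in Node 16+):

```javascript
// Server (Node): wrap binary bytes as base64 inside the JSON payload.
function encodeBinaryEvent(bytes) {
  const b64 = Buffer.from(bytes).toString('base64');
  return `data: ${JSON.stringify({ payload: b64 })}\n\n`;
}

// Client: decode the JSON from event.data back into bytes.
function decodeBinaryEvent(jsonText) {
  const { payload } = JSON.parse(jsonText);
  return Uint8Array.from(atob(payload), (c) => c.charCodeAt(0));
}
```

Reasonable for thumbnails or small audio chunks; not for anything where throughput matters.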
What about HTTP/2 Server Push?
HTTP/2 Server Push was about pushing assets (JS, CSS) alongside the initial response, not real-time data streams, and major browsers have since dropped support for it. Don't confuse it with SSE.
Do SSE connections survive behind nginx/Cloudflare?
Yes, but you need to configure nginx to not buffer the response:
```nginx
location /api/events {
    proxy_pass http://backend;
    proxy_http_version 1.1;
    proxy_set_header Connection '';
    proxy_buffering off;
    proxy_cache off;
    chunked_transfer_encoding off;
}
```
Without `proxy_buffering off`, nginx will hold the SSE stream and deliver it all at once when the connection closes — defeating the purpose entirely.
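If you can't touch the nginx config, nginx also honors an `X-Accel-Buffering: no` response header, which disables buffering for just that response. Set it from the app alongside the other SSE headers (the helper function is illustrative):

```javascript
// Build the SSE response headers, including the per-response
// nginx buffering opt-out.
function sseHeaders() {
  return {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
    'X-Accel-Buffering': 'no', // tells nginx not to buffer this response
  };
}

// Example: res.writeHead(200, sseHeaders());
```

This keeps the streaming behavior under the application's control instead of depending on ops changes.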
What if I need both server-push and occasional client sends?
That's the SSE + fetch pattern. Use SSE for the push channel and regular fetch() for client actions. It sounds like two connections, but the fetch is ephemeral — it opens, sends, closes. No persistent state to manage.
Can I use SSE with React?
Absolutely:
```jsx
import { useState, useEffect } from 'react';

function useSSE(url) {
  const [data, setData] = useState(null);

  useEffect(() => {
    const source = new EventSource(url);
    source.onmessage = (event) => setData(JSON.parse(event.data));
    return () => source.close(); // Clean up on unmount
  }, [url]);

  return data;
}

// Usage
function NotificationBadge() {
  const data = useSSE('/api/notifications/stream');
  return <span className="badge">{data?.unreadCount ?? 0}</span>;
}
```
Conclusion
WebSockets are powerful, but they're a power tool. You wouldn't use a sledgehammer to hang a picture frame. Most "real-time" features are one-directional — the server pushes, the client receives. For those cases:
- Polling if updates every 15-60 seconds are fine. It's simple, debuggable, and works everywhere.
- SSE if you need near-instant server-push. It's HTTP, auth works, reconnection is automatic, and the code is a fraction of what WebSocket requires.
- WebSockets when you have genuine bidirectional, high-frequency communication: chat, collaborative editing, games.
The next time someone says "we need real-time," don't default to WebSockets. Ask: does the client need to send data back through the same connection? If not, reach for SSE. Your 2 AM debugging self will thank you.