
楊東霖

Posted on • Originally published at devtoolkit.cc

WebSockets vs Server-Sent Events vs Long Polling: Choosing Real-Time Communication

Real-time communication is at the heart of modern web applications — chat apps, live dashboards, collaborative editors, stock tickers, and multiplayer games all need data pushed from server to client without waiting for the user to refresh the page. There are three main ways to implement this: WebSockets, Server-Sent Events (SSE), and long polling. Each makes different trade-offs in complexity, browser support, infrastructure requirements, and suitability for particular use cases.

This guide covers all three techniques in depth — how they work under the hood, their strengths and limitations, implementation patterns in Node.js, and a decision framework for picking the right one for your application.

The Problem: HTTP Was Designed for Request-Response

Standard HTTP follows a strict request-response model: the client sends a request, the server replies, and the connection closes. This works perfectly for loading pages and fetching data, but it's fundamentally incompatible with "push" semantics — you can't have the server send new information to the client without the client asking first.

Three workarounds emerged over the years, each making different compromises:

  • Long polling: Client asks "anything new?" and the server holds the connection open until there is something to report (or a timeout occurs), then the client immediately asks again.
  • Server-Sent Events: Client opens a persistent HTTP connection and the server streams events down it one-way using a standardized text format.
  • WebSockets: An HTTP connection is "upgraded" to a full-duplex TCP connection — both sides can send messages at any time.

Long Polling: The Old Reliable

How It Works

Long polling simulates server push using repeated HTTP requests. The client sends a request, the server holds it open (typically up to 30–60 seconds), and either:

  • Responds immediately when new data is available, or
  • Times out after the hold period and sends an empty response

After receiving a response (whether data or timeout), the client immediately sends another request. From the user's perspective, updates appear in near-real-time.

Implementation in Node.js

// Server: Express long-polling endpoint
const pendingClients = [];

app.get('/poll', (req, res) => {
  // Add this client to waiting list
  const client = { res, timeout: null };
  pendingClients.push(client);

  // Auto-timeout after 30 seconds
  client.timeout = setTimeout(() => {
    const idx = pendingClients.indexOf(client);
    if (idx !== -1) pendingClients.splice(idx, 1);
    res.json({ data: null, timeout: true });
  }, 30000);

  // Clean up if client disconnects
  req.on('close', () => {
    clearTimeout(client.timeout);
    const idx = pendingClients.indexOf(client);
    if (idx !== -1) pendingClients.splice(idx, 1);
  });
});

// When new data arrives, push to all waiting clients
function pushUpdate(data) {
  while (pendingClients.length > 0) {
    const client = pendingClients.pop();
    clearTimeout(client.timeout);
    client.res.json({ data });
  }
}

// Client: JavaScript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function poll() {
  try {
    const res = await fetch('/poll');
    const { data, timeout } = await res.json();
    if (data) handleUpdate(data);
  } catch (e) {
    await sleep(1000); // Back off on error
  }
  poll(); // Immediately poll again
}

Pros and Cons

Pros:

  • Works everywhere — any HTTP client, any proxy, any firewall
  • Stateless per-request: easy to load balance across multiple servers
  • Simple to implement and debug (use your browser's Network tab)
  • Compatible with standard HTTP infrastructure (CDNs, auth middleware, etc.)

Cons:

  • High overhead: each "poll" creates a new HTTP connection with full headers
  • Latency gaps: between one response and the next poll request, new data has no open connection to be delivered on
  • Server resource usage: many open connections holding threads/memory
  • Not suitable for very high frequency updates (thousands of messages/second)

Best for: Applications where updates are infrequent, infrastructure simplicity is paramount, or you need to work behind restrictive corporate proxies.

Server-Sent Events: One-Way Streaming Made Easy

How It Works

Server-Sent Events (SSE) is a web standard (originally W3C, now maintained in the WHATWG HTML specification) for streaming text events from server to client over a persistent HTTP connection. The client opens the connection once using the EventSource API, and the server sends newline-delimited text events in a specific format:

data: {"price": 42350.00, "symbol": "BTC"}

event: alert
data: Price crossed $42,000

id: 1234
data: This event has an ID for reconnection

: this is a comment, ignored by the client

The browser's EventSource API handles automatic reconnection (using the Last-Event-ID header to resume from where it left off), making SSE resilient to dropped connections without any client-side code.

Implementation in Node.js

// Server: Express SSE endpoint
app.get('/events', (req, res) => {
  // Set SSE headers
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive',
    'X-Accel-Buffering': 'no', // Disable nginx buffering
  });

  // Send initial comment to establish connection
  res.write(': connected\n\n');

  let eventId = 0;

  // Subscribe to your event source (an EventEmitter here; could be Redis pub/sub, etc.)
  const onUpdate = (data) => {
    eventId++;
    res.write(`id: ${eventId}\n`);
    res.write(`data: ${JSON.stringify(data)}\n\n`);
  };
  eventBus.on('update', onUpdate);

  // Send heartbeat every 30s to prevent proxy timeouts
  const heartbeat = setInterval(() => {
    res.write(': heartbeat\n\n');
  }, 30000);

  // Clean up on disconnect
  req.on('close', () => {
    eventBus.off('update', onUpdate);
    clearInterval(heartbeat);
  });
});

// Client: JavaScript EventSource API
const evtSource = new EventSource('/events');

evtSource.onmessage = (event) => {
  const data = JSON.parse(event.data);
  updateUI(data);
};

evtSource.addEventListener('alert', (event) => {
  showAlert(event.data);
});

evtSource.onerror = (err) => {
  console.error('SSE error:', err);
  // EventSource auto-reconnects — no manual retry needed
};
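The automatic reconnection described above works best when the server can replay missed events. On reconnect, the browser sends the ID of the last event it received in a `Last-Event-ID` header. A minimal sketch of the replay logic, assuming you keep a small buffer of recent events (`eventsSince` and `recentEvents` are illustrative names, not part of any API):

```javascript
// Given a buffer of recent events and the Last-Event-ID header value,
// return the events the client missed while disconnected.
function eventsSince(buffer, lastEventId) {
  const id = Number(lastEventId);
  if (!Number.isFinite(id)) return []; // no/invalid header → nothing to replay
  return buffer.filter((evt) => evt.id > id);
}

// In the /events handler, replay before streaming live events:
// const missed = eventsSince(recentEvents, req.headers['last-event-id']);
// for (const evt of missed) {
//   res.write(`id: ${evt.id}\ndata: ${JSON.stringify(evt.data)}\n\n`);
// }
```

How much history to buffer (and where — memory, Redis, a database) depends on how long a disconnect you want to survive.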

Multiplexing with Named Events

SSE supports named events, letting you multiplex different types of updates over a single connection:

// Server sends different event types
res.write(`event: price-update\ndata: ${JSON.stringify(priceData)}\n\n`);
res.write(`event: trade-executed\ndata: ${JSON.stringify(tradeData)}\n\n`);
res.write(`event: system-alert\ndata: ${JSON.stringify(alertData)}\n\n`);

// Client subscribes selectively
evtSource.addEventListener('price-update', handlePrice);
evtSource.addEventListener('trade-executed', handleTrade);
evtSource.addEventListener('system-alert', handleAlert);

Pros and Cons

Pros:

  • Built into browsers — no library needed for the client side
  • Automatic reconnection with Last-Event-ID for resuming streams
  • Works over standard HTTP/1.1 and HTTP/2 (multiplexed over H2)
  • Text-based, easy to debug in browser DevTools
  • Much lower overhead than long polling — single persistent connection

Cons:

  • Unidirectional — server to client only. Client can't send messages over SSE
  • Text-only by spec (though you can base64-encode binary data)
  • Browser limit: 6 connections per domain in HTTP/1.1 (not an issue with HTTP/2)
  • Some older proxies/load balancers buffer the response, breaking streaming

Best for: Live dashboards, news feeds, notifications, log streaming, any scenario where you only need server-to-client data flow.

WebSockets: Full-Duplex Bidirectional Communication

How It Works

WebSockets start as an HTTP request with an Upgrade: websocket header. The server responds with 101 Switching Protocols, and the connection is "upgraded" from HTTP to the WebSocket protocol — a lightweight framing protocol on top of TCP. From this point on, both client and server can send frames to each other at any time, with minimal overhead (2–14 bytes per frame vs hundreds of bytes for HTTP headers).

// Upgrade handshake (handled automatically by libraries)
// Client sends:
GET /chat HTTP/1.1
Host: example.com
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
Sec-WebSocket-Version: 13

// Server responds:
HTTP/1.1 101 Switching Protocols
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Accept: s3pPLMBiTxaQ9kYGzzhZRbK+xOo=
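The Sec-WebSocket-Accept value is not arbitrary: per RFC 6455, the server appends a fixed GUID to the client's key, SHA-1 hashes the result, and base64-encodes it. Libraries do this for you, but it's easy to verify against the handshake above:

```javascript
import { createHash } from 'crypto';

// RFC 6455: accept = base64(sha1(clientKey + fixed GUID))
const WS_GUID = '258EAFA5-E914-47DA-95CA-C5AB0DC85B11';

function acceptKey(clientKey) {
  return createHash('sha1').update(clientKey + WS_GUID).digest('base64');
}

console.log(acceptKey('dGhlIHNhbXBsZSBub25jZQ=='));
// → s3pPLMBiTxaQ9kYGzzhZRbK+xOo= (the value in the handshake above)
```

This proves to the client that the server actually speaks WebSocket rather than being a confused HTTP server echoing headers.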

Implementation with ws Library (Node.js)

import { WebSocketServer, WebSocket } from 'ws';
import { createServer } from 'http';

const server = createServer(app);
const wss = new WebSocketServer({ server });

// Track connected clients
const clients = new Map();

wss.on('connection', (ws, req) => {
  const userId = getUserId(req); // Extract from auth cookie/token
  clients.set(userId, ws);

  console.log(`Client connected: ${userId}`);

  ws.on('message', (rawMessage) => {
    const message = JSON.parse(rawMessage.toString());

    // Route messages by type
    switch (message.type) {
      case 'chat':
        broadcastToRoom(message.roomId, message, userId);
        break;
      case 'typing':
        broadcastTypingIndicator(message.roomId, userId);
        break;
      case 'ping':
        ws.send(JSON.stringify({ type: 'pong' }));
        break;
    }
  });

  ws.on('close', () => {
    clients.delete(userId);
    console.log(`Client disconnected: ${userId}`);
  });

  ws.on('error', (err) => {
    console.error(`WebSocket error for ${userId}:`, err);
  });

  // Send initial state
  ws.send(JSON.stringify({ type: 'init', data: getInitialState() }));
});

function broadcastToRoom(roomId, message, senderId) {
  const roomMembers = getRoomMembers(roomId);
  roomMembers.forEach((memberId) => {
    const ws = clients.get(memberId);
    if (ws && ws.readyState === WebSocket.OPEN && memberId !== senderId) {
      ws.send(JSON.stringify(message));
    }
  });
}

// Client: Browser WebSocket API
let retries = 0; // for exponential backoff on reconnect
const ws = new WebSocket('wss://example.com/ws');

ws.onopen = () => {
  console.log('Connected');
  ws.send(JSON.stringify({ type: 'join', roomId: 'general' }));
};

ws.onmessage = (event) => {
  const message = JSON.parse(event.data);
  handleMessage(message);
};

ws.onclose = (event) => {
  if (!event.wasClean) {
    // Reconnect with exponential backoff
    setTimeout(reconnect, Math.min(1000 * 2 ** retries++, 30000));
  }
};

Scaling WebSockets Horizontally

WebSockets maintain persistent connections to a specific server instance, which creates a challenge when you have multiple servers. If User A is connected to Server 1 and sends a message to User B (connected to Server 2), Server 1 doesn't know about Server 2's connection.

The standard solution is a pub/sub broker — Redis is most common:

import { createClient } from 'redis';

const publisher = createClient();
const subscriber = createClient();

await publisher.connect();
await subscriber.connect();

// When this server receives a message, publish to Redis
function onMessageReceived(roomId, message) {
  publisher.publish(`room:${roomId}`, JSON.stringify(message));
}

// All servers pattern-subscribe and forward to their local clients
// (glob patterns require pSubscribe in node-redis v4, not subscribe)
await subscriber.pSubscribe('room:*', (message, channel) => {
  const roomId = channel.replace('room:', '');
  localClients.forEach((ws, userId) => {
    if (isUserInRoom(userId, roomId)) {
      ws.send(message);
    }
  });
});

Pros and Cons

Pros:

  • True bidirectional communication — both sides can send at any time
  • Very low latency and overhead for frequent messages
  • Binary frame support for efficient data transfer
  • Excellent for interactive, high-frequency use cases (games, collaborative editing)

Cons:

  • Stateful connections complicate horizontal scaling
  • Harder to load balance (sticky sessions or pub/sub required)
  • More complex error handling and reconnection logic needed
  • Some corporate firewalls/proxies block WebSocket upgrades
  • No built-in reconnection — must implement yourself

Best for: Chat applications, multiplayer games, collaborative tools, trading platforms — anything needing true two-way communication with low latency.

HTTP/2 Server Push: The Honorable Mention

HTTP/2 introduced Server Push, which lets the server proactively send resources the client will need. However, Server Push was designed for sending HTML/CSS/JS assets before the browser asks for them — not for real-time event streaming. Most browsers have removed or restricted Server Push support. Don't use it for real-time applications.

Decision Framework: Which Should You Use?

Use this decision tree to choose:

1. Do you need the client to send data to the server (beyond initial HTTP requests)?

  • Yes → WebSockets
  • No → Continue to question 2

2. How frequent are your updates?

  • Very frequent (multiple times per second, e.g., live prices, game state) → WebSockets
  • Moderate (seconds to minutes, e.g., notifications, dashboards) → SSE
  • Infrequent (minutes or less critical) → Long polling or SSE

3. What are your infrastructure constraints?

  • Must work behind corporate proxies/firewalls → Long polling (most compatible)
  • Standard cloud infrastructure → SSE or WebSockets
  • Serverless (Lambda, Vercel edge functions) → SSE (WebSockets are harder in serverless)

4. Are you sending binary data?

  • Yes → WebSockets (native binary frame support)
  • No → SSE (text-based, simpler)

Quick Comparison Table

Here's a summary of the key differences:

| | WebSockets | SSE | Long Polling |
| --- | --- | --- | --- |
| Direction | Full-duplex | Server → client only | Simulated push |
| Data | Binary and text | Text | Text |
| Transport | Custom protocol over TCP | Standard HTTP | Standard HTTP |
| Scaling | Complex (sticky sessions / pub/sub) | Easy | Trivial |
| Latency | Very low | Low | Slightly higher |

Testing Your Real-Time Endpoints

Use our API Tester to test HTTP-based endpoints including long polling and SSE. For WebSocket debugging, browser DevTools (Network tab → WS filter) shows frames in real time. Tools like JSON Formatter help inspect the payloads you're receiving.

Common Pitfalls to Avoid

1. Not Handling Reconnection

SSE handles reconnection automatically. WebSockets and long polling don't — you must implement exponential backoff and reconnection logic manually. A common pattern:

let retries = 0;
const MAX_RETRIES = 10;

function connect() {
  const ws = new WebSocket(WS_URL);

  ws.onopen = () => { retries = 0; }; // Reset on success

  ws.onclose = () => {
    if (retries < MAX_RETRIES) {
      const delay = Math.min(1000 * Math.pow(2, retries), 30000);
      retries++;
      setTimeout(connect, delay);
    }
  };
}

2. Forgetting to Clean Up Server-Side Resources

Always listen for the close event and unsubscribe from events/timers. Memory leaks from lingering subscriptions are a common production issue with long-running connections.

3. Not Sending Heartbeats

HTTP proxies and load balancers often close idle connections after 60–90 seconds. Send a ping/comment every 30 seconds to keep connections alive.
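The SSE implementation above already sends comment heartbeats. For WebSockets, the ws library exposes protocol-level ping/pong frames, and its documented liveness pattern marks each connection alive on pong and terminates ones that stop responding. A sketch of the sweep logic (the `wss` wiring in the comments assumes the server from earlier):

```javascript
// Heartbeat sweep: terminate clients that missed the last ping, ping the rest.
// Works on any collection of ws-like objects exposing ping()/terminate().
function heartbeatSweep(clients) {
  clients.forEach((ws) => {
    if (!ws.isAlive) return ws.terminate();
    ws.isAlive = false; // the 'pong' handler sets this back to true
    ws.ping();
  });
}

// Wiring with the ws library (sketch):
// wss.on('connection', (ws) => {
//   ws.isAlive = true;
//   ws.on('pong', () => { ws.isAlive = true; });
// });
// const interval = setInterval(() => heartbeatSweep(wss.clients), 30000);
// wss.on('close', () => clearInterval(interval));
```

Besides keeping proxies happy, this also detects half-open connections that TCP alone won't surface for minutes.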

4. Ignoring Backpressure

If your server generates events faster than the client can consume them, messages pile up. For SSE, check res.writableEnded before writing. For WebSockets, monitor ws.bufferedAmount.
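Both checks can be wrapped in small send helpers. A sketch (the 1 MB threshold is an arbitrary choice — tune it for your payload sizes):

```javascript
// SSE: skip writes once the response stream has ended
function safeSseWrite(res, payload) {
  if (res.writableEnded) return false;
  return res.write(`data: ${JSON.stringify(payload)}\n\n`);
}

// WebSocket: refuse sends when the kernel/socket buffer backs up
const MAX_BUFFERED = 1024 * 1024; // 1 MB — arbitrary, tune per app

function safeWsSend(ws, payload) {
  if (ws.bufferedAmount > MAX_BUFFERED) {
    return false; // client can't keep up: drop, coalesce, or disconnect
  }
  ws.send(JSON.stringify(payload));
  return true;
}
```

What to do when the helper returns false is application-specific: for live prices you can drop stale ticks, while for chat you'd queue or disconnect.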

Conclusion

There's no universal winner — each technique excels in different scenarios:

  • Use long polling when you need maximum compatibility, updates are infrequent, and simplicity matters more than efficiency.
  • Use SSE when you need one-way server push, want to leverage standard HTTP infrastructure, and work with serverless or edge environments.
  • Use WebSockets when you need bidirectional communication, low latency for frequent updates, or binary data transfer.

For most modern applications, SSE covers the majority of "push notification" use cases with far less complexity than WebSockets. Only reach for WebSockets when you genuinely need the client to send real-time data back to the server — or when your update frequency demands the absolute lowest overhead.

For more on building modern APIs and testing them, see our guides on REST API testing and securing Node.js APIs.

Free Developer Tools

If you found this article helpful, check out DevToolkit — 40+ free browser-based developer tools with no signup required.

Popular tools: JSON Formatter · Regex Tester · JWT Decoder · Base64 Encoder

🛒 Get the DevToolkit Starter Kit on Gumroad — source code, deployment guide, and customization templates.
