If you’ve ever tried to build dashboards or analytics that update in real time, you know the pain. Polling APIs? They lag and waste resources. WebSockets? Great, until you need to push live database results, not just events. The first time I needed to expose live SQL query results over HTTP—so dashboards could show streaming changes—I realized this was a totally different beast than classic REST APIs.
I thought: "How hard can it be? Just pipe the DB results into a stream!" Turns out, there are dragons. But also some cool wins. Here’s what building a streaming SQL API in Node.js taught me—warts, surprises, and all.
Why Stream SQL Over HTTP?
Most APIs return a static snapshot. But teams want to see numbers change as data changes. Think: live order counts, running totals, or alerts that update instantly. The old flow—fetch data, poll every few seconds—is clunky and can hammer the database.
Streaming SQL APIs solve this by keeping the query open and pushing new results as soon as the data changes. Imagine running SELECT COUNT(*) FROM orders WHERE status = 'pending' and getting the count pushed every time a new order arrives.
I started down this road because our product team needed to see real-time metrics, and our polling-based dashboards either lagged or overwhelmed Postgres. I wanted a single endpoint: the client connects, sends a SQL query (with some safety), and gets a live stream of results.
The Core: Streaming Data With Node.js
Node.js shines at handling streams. But most database drivers (like pg for Postgres) return all the rows at once, not as a stream. Even when using cursors, you’re just paginating big result sets—not getting notified when data changes.
So, the trick is to combine two ideas:
- Listen to DB changes (triggered by inserts/updates).
- Stream those changes to the client over HTTP.
Here’s how I pieced it together.
Example 1: Listening for Changes With PostgreSQL
Postgres has a neat feature: LISTEN/NOTIFY. You can set up triggers to send notifications when tables change. Here’s a minimal Node.js snippet using the pg library:
// npm install pg
const { Client } = require('pg');

async function listenForChanges() {
  const client = new Client();
  await client.connect();

  // Listen for notifications on the 'table_update' channel
  await client.query('LISTEN table_update');

  client.on('notification', (msg) => {
    // msg.payload contains any data sent with NOTIFY
    console.log('Table change detected:', msg.payload);
    // Here, you'd push updates to connected clients
  });

  // For demo: keep process alive
  setInterval(() => {}, 1000);
}

listenForChanges().catch(console.error);
Key lines:
- LISTEN table_update subscribes to change notifications on that channel.
- Whenever someone runs NOTIFY table_update, 'something changed' in SQL, your Node app gets an event.
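Before wiring up triggers, it helps to confirm the listener works by firing a NOTIFY by hand. Here's a small sketch (buildNotify is a hypothetical helper, not part of pg) that builds a statement with a safely quoted payload, which you can pass to client.query on a second connection or paste straight into psql:

```javascript
// Hypothetical helper (not part of pg): build a NOTIFY statement
// with a safely quoted payload for hand-testing the listener above.
function buildNotify(channel, payload) {
  // Double any single quotes, per SQL string-literal rules.
  const quoted = String(payload).replace(/'/g, "''");
  return `NOTIFY ${channel}, '${quoted}'`;
}

console.log(buildNotify('table_update', 'orders changed'));
// Prints: NOTIFY table_update, 'orders changed'
```

Run the printed statement in psql and the 'notification' handler should fire with that payload.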
But, Postgres doesn’t send these automatically on row changes. You need a trigger:
-- This trigger runs after insert/update/delete on 'orders'
CREATE OR REPLACE FUNCTION notify_orders_change()
RETURNS trigger AS $$
BEGIN
  PERFORM pg_notify('table_update', 'orders changed');
  RETURN NULL;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER orders_update
AFTER INSERT OR UPDATE OR DELETE ON orders
FOR EACH STATEMENT EXECUTE PROCEDURE notify_orders_change();
Now, any change in orders fires a notification. You can get fancier and send row IDs, etc.
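For instance, if you switch to a row-level trigger that sends JSON, e.g. pg_notify('table_update', json_build_object('op', TG_OP, 'id', NEW.id)::text), the listener can decode the payload instead of treating it as opaque text. A minimal sketch (the { raw: ... } fallback shape is just a convention for this example, not a pg feature):

```javascript
// Sketch: decode trigger payloads that may be JSON (a row-level
// trigger sending json_build_object) or plain text (the basic
// statement-level trigger above).
function parseChangePayload(payload) {
  try {
    return JSON.parse(payload);
  } catch {
    // Fall back to the plain-text payloads the basic trigger sends.
    return { raw: payload };
  }
}
```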
Example 2: Streaming Over HTTP (Server-Sent Events)
How do you actually stream updates to the browser or a client? WebSockets work, but for many dashboards, Server-Sent Events (SSE) are simpler. The client just opens an HTTP connection and receives events.
Here’s a barebones SSE server in Node.js:
// npm install express
const express = require('express');
const app = express();

app.get('/stream', (req, res) => {
  // Set headers for SSE
  res.set({
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });
  res.flushHeaders();

  // Send a test event every 2 seconds
  const interval = setInterval(() => {
    res.write(`data: The time is ${new Date().toISOString()}\n\n`);
  }, 2000);

  // Clean up when client disconnects
  req.on('close', () => clearInterval(interval));
});

app.listen(3000, () => console.log('SSE server running on :3000'));
Key lines:
- The text/event-stream content type tells the browser to keep the connection open.
- Each res.write pushes a new event.
- When the client closes the connection, we clear the interval.
You can replace the interval with DB notifications from Example 1.
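On the other end, the browser's built-in EventSource API consumes this endpoint in a few lines. A sketch (handleMessage is a hypothetical helper, and the window guard just keeps the snippet from running outside a browser):

```javascript
// Hypothetical browser-side consumer for the /stream endpoint.
// handleMessage is split out so the parsing logic is easy to test:
// the interval demo sends plain text, real payloads would be JSON.
function handleMessage(data) {
  try {
    return JSON.parse(data);
  } catch {
    return data;
  }
}

// Only run in a browser context.
if (typeof window !== 'undefined' && typeof EventSource !== 'undefined') {
  const source = new EventSource('/stream');
  source.onmessage = (event) => console.log('Received:', handleMessage(event.data));
  source.onerror = () => console.log('Connection dropped; the browser will retry');
}
```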
Example 3: Wiring It Together—Reactive SQL Over HTTP
Putting it all together: connect to Postgres, listen for changes, and stream to HTTP clients.
Here’s a simplified version that streams a live count of pending orders:
const express = require('express');
const { Client } = require('pg');

const app = express();
const db = new Client();

// Keep track of connected clients
const clients = new Set();

app.get('/orders/pending/stream', async (req, res) => {
  res.set({
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });
  res.flushHeaders();
  clients.add(res);

  // Send the initial count immediately
  const { rows } = await db.query("SELECT COUNT(*) FROM orders WHERE status = 'pending'");
  res.write(`data: ${JSON.stringify({ count: rows[0].count })}\n\n`);

  req.on('close', () => clients.delete(res));
});

// Connect, listen for table changes, and broadcast the updated count
async function listenAndBroadcast() {
  await db.connect();
  await db.query('LISTEN table_update');
  db.on('notification', async () => {
    const { rows } = await db.query("SELECT COUNT(*) FROM orders WHERE status = 'pending'");
    for (const client of clients) {
      client.write(`data: ${JSON.stringify({ count: rows[0].count })}\n\n`);
    }
  });
}

listenAndBroadcast().catch(console.error);

app.listen(3000, () => console.log('Streaming orders count on :3000'));
What’s happening:
- Whenever /orders/pending/stream is hit, the client’s response is added to the clients set.
- On any change to the orders table (from the Postgres trigger), the API fetches the new count and streams it to all connected clients.
- You can open this in multiple browser tabs, and all get live updates as soon as new orders come in.
This is, honestly, the foundation of a streaming SQL API. Of course, in production, you’ll add authentication, validation, backpressure handling, and maybe use a library like pg-listen, but this skeleton works.
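One refinement worth pulling out of that skeleton is the broadcast step. Here's a sketch (sseFrame and broadcast are hypothetical names) that formats one SSE frame per update and prunes any response that has already ended, as a cheap guard against leaked clients:

```javascript
// Hypothetical helpers: format one SSE frame and push it to every
// connected response, pruning any that have already ended.
function sseFrame(payload) {
  return `data: ${JSON.stringify(payload)}\n\n`;
}

function broadcast(clients, payload) {
  const frame = sseFrame(payload);
  for (const res of clients) {
    if (res.writableEnded) {
      // Belt-and-braces cleanup alongside the 'close' handler.
      clients.delete(res);
    } else {
      res.write(frame);
    }
  }
}
```

Deleting from a Set while iterating it is safe in JavaScript, which keeps this loop simple.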
Where Things Get Messy (And What I Learned)
- Keeping Connections Alive: Browsers and proxies love to close idle connections. I had to add periodic heartbeat messages (data: ping\n\n) every 30 seconds, or clients would silently disconnect.
- Query Security: Letting users send arbitrary SQL is a footgun. Always validate incoming queries, or better, allow only a safe set of queries or parameters. I once spent a weekend rolling back the damage from a test user who dropped half a staging table with a typo.
- Scaling: LISTEN/NOTIFY is per-connection. If you run multiple Node.js servers (say, behind a load balancer), each needs its own DB connection to the notification channel. Also, Postgres NOTIFY payloads are limited to 8000 bytes, so if you want to send actual row data you need a different approach (logical replication slots, or an event bus like Redis).
- Backpressure: If clients are slow to read, your Node process can back up and crash. SSE helps (the browser closes the connection if it can’t keep up), but with WebSockets or custom TCP streams you need to handle flow control yourself.
- Client Handling: Browsers will reconnect SSE streams if the connection drops, but you need to handle deduplication and possibly replay missed events. I wrote a custom event ID system to help clients resume from where they left off.
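The event-ID idea can be sketched in a few lines. Browsers resend the last id they saw in a Last-Event-ID header on reconnect, so the server can replay anything newer. The names sseEvent and heartbeat below are my own; the leading-colon line is SSE comment syntax, an alternative heartbeat that never reaches the client's onmessage handler:

```javascript
// Hypothetical sketch of per-message event IDs for SSE resume.
let nextId = 0;

function sseEvent(data) {
  // The id: field is what the browser echoes back in Last-Event-ID.
  nextId += 1;
  return `id: ${nextId}\ndata: ${data}\n\n`;
}

// A comment frame (leading colon) keeps proxies from closing the
// connection without firing onmessage in the browser.
function heartbeat() {
  return ': ping\n\n';
}
```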
Common Mistakes
1. Ignoring Database Notifications in the App Layer
A lot of devs (including me, early on) just poll the database every few seconds, even when the database supports change notifications. This hammers the DB and doesn’t scale. Use LISTEN/NOTIFY or your DB’s equivalent whenever possible. Polling should be your last resort.
2. Not Handling Client Disconnects Properly
If you forget to remove disconnected clients from your subscribers list, you’ll quickly run out of memory or crash your process. I’ve seen this happen in production—always clean up listeners on close or error.
3. Exposing Arbitrary SQL Queries
It’s tempting to let clients send any SQL, but this is a disaster waiting to happen. Even read-only queries can be abused (think SELECT pg_sleep(100) or huge result sets). Always whitelist queries or use parameterized statements.
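A minimal allow-list can look like this (QUERIES and resolveQuery are hypothetical names): clients send a query name plus parameters, and raw SQL never crosses the wire.

```javascript
// Hypothetical allow-list: named, parameterized queries only.
const QUERIES = {
  pendingOrders: "SELECT COUNT(*) FROM orders WHERE status = $1",
};

function resolveQuery(name, params) {
  const text = QUERIES[name];
  if (!text) throw new Error(`Unknown query: ${name}`);
  // The { text, values } shape matches what pg's client.query accepts.
  return { text, values: params };
}
```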
Key Takeaways
- Streaming SQL APIs let you build genuinely live dashboards and analytics, but require careful wiring between the database and HTTP layer.
- Use your database’s native notification system (LISTEN/NOTIFY in Postgres) to avoid wasteful polling and improve performance.
- Server-Sent Events are a simple, robust way to stream data to browsers, often a better fit than WebSockets for one-way data flows.
- Always sanitize incoming queries and clean up client connections to avoid security holes and resource leaks.
- Be aware of scaling limitations: what works on one server may need redesign for a multi-instance or high-throughput setup.
Building a streaming SQL API changed how I think about real-time systems. It’s less about fancy frameworks, more about understanding how events flow from your DB to the browser. If you’re tired of polling and want a more responsive experience, give this a try—you’ll learn a ton along the way.