In the early days of a startup, functional is the goal. You write a query, you get the data, and you send it to the frontend. But as we scaled past 50,000 users, that functional code became a massive liability. We realized that frontend performance means nothing if your backend is bloated. When every millisecond counts, your API response strategy is the difference between a smooth user experience and a hanging loading spinner.
The Invisible Cost of SELECT *
The most common mistake Node.js developers make is over-fetching. We often use SELECT * or return entire Mongoose documents because it’s easier. However, at scale, this is devastating. Every extra field you send adds to the payload size, which increases network latency. More importantly, it increases the time the Node.js process spends on JSON serialization.
Converting a massive object into a JSON string is a CPU-intensive, synchronous operation. While the CPU is busy stringifying a 2MB response for User A, User B's request sits in the event loop queue, waiting. We solved this by implementing strict Data Transfer Objects (DTOs). If the mobile app only needs the username and profile_pic, that is all the SQL query should fetch. This cut our internal network traffic and slashed response times by 30%.
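To see why over-fetching hurts beyond bandwidth, you can time the serialization itself. This is a minimal sketch, and the payload shape is invented purely for illustration, but it makes the blocking visible:

```javascript
// Build a deliberately large object (hypothetical data, for demonstration only)
const big = {
  rows: Array.from({ length: 50000 }, (_, i) => ({ id: i, data: 'x'.repeat(40) }))
};

const start = process.hrtime.bigint();
const body = JSON.stringify(big); // synchronous: nothing else runs until this returns
const blockedMs = Number(process.hrtime.bigint() - start) / 1e6;

console.log(`${(body.length / 1e6).toFixed(1)} MB serialized, loop blocked ~${blockedMs.toFixed(1)} ms`);
```

Every millisecond reported here is a millisecond during which no other request was served, which is exactly why trimming the object before serialization pays off.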
Master the Event Loop
Node.js runs your JavaScript on a single thread. This is its greatest strength and its primary weakness. To scale to 50K users, you must treat the Event Loop as a sacred resource. Any synchronous work, such as heavy data transformation, complex regex on large strings, or blocking calls like fs.readFileSync, will block the loop.
When the loop is blocked, the server is effectively dead to all other users. We audited our code to ensure that all heavy lifting was either moved to a background worker (like BullMQ) or handled using asynchronous patterns. We also utilized setImmediate() to break up long-running loops into smaller chunks, allowing the event loop to breathe and process incoming I/O between iterations.
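The chunking pattern looks roughly like this. It's a sketch, and processInChunks is a hypothetical helper rather than code from our codebase, but it shows the core idea of yielding between batches:

```javascript
// Process a large array in fixed-size batches, calling setImmediate between
// batches so the event loop can handle pending I/O before the next chunk runs.
function processInChunks(items, handle, chunkSize = 1000) {
  return new Promise((resolve) => {
    let i = 0;
    const runChunk = () => {
      const end = Math.min(i + chunkSize, items.length);
      for (; i < end; i++) handle(items[i]);
      if (i < items.length) {
        setImmediate(runChunk); // yield to the event loop before the next batch
      } else {
        resolve();
      }
    };
    runChunk();
  });
}

// Usage: sum 10,000 numbers without ever blocking for the whole array at once
const nums = Array.from({ length: 10000 }, (_, i) => i);
let sum = 0;
processInChunks(nums, (n) => { sum += n; })
  .then(() => console.log(sum)); // 49995000
```

The trade-off is slightly higher total latency for the batch job in exchange for keeping the server responsive to everyone else, which is almost always the right call under load.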
Compression: Gzip vs. Brotli
Finally, never send raw text. We implemented the compression middleware in Express, but we took it a step further by preferring Brotli over Gzip for clients that advertise support for it. Brotli offers better compression ratios for text-based payloads like JSON.
By compressing our responses, we reduced our bandwidth costs significantly. However, remember that compression itself costs CPU cycles. The sweet spot we found was setting a threshold (e.g., only compress responses larger than 1KB) to ensure we weren't wasting CPU power on tiny payloads where the overhead of compression outweighed the savings.
The sample below ties these ideas together: Brotli/Gzip compression, DTOs to prevent over-fetching, and deferred work to keep the Event Loop clear.
```javascript
const express = require('express');
const compression = require('compression');
// Query client (e.g. a Prisma instance) exposing findUnique with field selection
const db = require('./db');

const app = express();

// 1. Implement compression (prioritizing Brotli for efficiency)
app.use(compression({
  level: 6,        // Balance between CPU usage and compression ratio
  threshold: 1024, // Only compress responses larger than 1KB
  filter: (req, res) => {
    if (req.headers['x-no-compression']) return false;
    return compression.filter(req, res);
  }
}));

// 2. Data Transfer Object (DTO) logic
// Avoid SELECT *; only return what the client needs.
const formatUserResponse = (user) => ({
  id: user.id,
  username: user.username,
  avatar: user.profile_image_url,
  memberSince: user.created_at
});

app.get('/api/users/:id', async (req, res) => {
  try {
    const userId = req.params.id;
    // Select only the required fields at the query level
    const user = await db.user.findUnique({
      where: { id: userId },
      select: { id: true, username: true, profile_image_url: true, created_at: true }
    });

    if (!user) return res.status(404).json({ error: 'User not found' });

    // 3. Non-blocking processing:
    // defer non-critical work with setImmediate so it never delays the response
    setImmediate(() => {
      console.log(`Log: Profile accessed for user ${userId}`);
    });

    res.json(formatUserResponse(user));
  } catch (err) {
    res.status(500).send('Server Error');
  }
});

app.listen(3000);
```
