Most Node.js apps run directly on port 3000 during development. Production systems almost never work that way. Nginx sits in front of your app and handles the heavy HTTP work so Node can focus on application logic.
Here are six Nginx patterns that dramatically improve performance, security, and scalability for JavaScript applications.
1. Put Nginx in Front of Node.js as a Reverse Proxy
Running Node directly on the public internet works for development but breaks down in production. A reverse proxy sits between users and your application and handles HTTP traffic efficiently.
Before
Node exposed directly to the internet.
import express from "express"
const app = express()
app.get("/", (req, res) => {
res.send("Hello from Node")
})
app.listen(3000)
Clients connect directly to port 3000.
https://myapp.com:3000
Node now handles TLS, static files, and connection management. That is not what it is optimized for.
After
Nginx receives traffic and forwards it to Node.
server {
    listen 80;
    server_name myapp.com;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
Users connect to Nginx. Nginx forwards requests to Node. This adds security, hides your application server, and lets Nginx manage raw HTTP traffic much more efficiently.
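One app-side consequence of this setup: the TCP peer Node sees is now Nginx, not the user, so the real client address must come from the forwarded headers. A minimal sketch using Node's built-in http module (clientIp is an illustrative helper, not part of the configuration above):

```javascript
import http from "node:http";

// Behind the proxy, req.socket.remoteAddress is always 127.0.0.1.
// The real client address arrives in the X-Forwarded-For header set
// by the proxy_set_header directives above. The header may contain a
// comma-separated chain of proxies; the left-most entry is the client.
function clientIp(req) {
  const forwarded = req.headers["x-forwarded-for"];
  if (forwarded) return forwarded.split(",")[0].trim();
  return req.socket.remoteAddress;
}

const server = http.createServer((req, res) => {
  res.end(`Hello from Node, seen as ${clientIp(req)}`);
});

// Bind to localhost only: Nginx is the sole public entry point.
server.listen(3000, "127.0.0.1");
```

Binding to 127.0.0.1 instead of all interfaces also means nobody can bypass Nginx by hitting port 3000 directly.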
2. Let Nginx Serve Static Files Instead of Node
Many Node apps serve images, JavaScript bundles, and CSS from the same server process. That wastes the event loop.
Before
Express serving static assets.
import express from "express"
const app = express()
app.use("/static", express.static("public"))
app.listen(3000)
Every static file request now passes through Node.
After
Nginx serves static files directly from disk.
server {
    listen 80;
    server_name myapp.com;

    location /static/ {
        alias /var/www/myapp/public/;
        expires 1y;
        add_header Cache-Control "public, immutable";
        access_log off;
    }

    location / {
        proxy_pass http://127.0.0.1:3000;
    }
}
Nginx serves static files straight from disk using kernel-level optimizations such as sendfile. Node now only handles API and SSR requests. This can increase throughput several times under load.
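One caveat worth knowing: the `immutable` cache hint is only safe when a file's URL changes whenever its content does. Bundlers handle this by embedding a content hash in the filename; a minimal sketch of that scheme (hashedName is a hypothetical helper, not a real build-tool API):

```javascript
import { createHash } from "node:crypto";

// Rename "app.js" to "app.<hash>.js" based on its content, so every
// new build produces a new URL and the year-long cache never serves
// a stale file.
function hashedName(name, content) {
  const digest = createHash("sha256")
    .update(content)
    .digest("hex")
    .slice(0, 8);
  return name.replace(/(\.\w+)$/, `.${digest}$1`);
}
```

With hashed filenames in place, the aggressive `expires 1y` above is harmless: old URLs simply stop being referenced.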
3. Enable HTTPS Automatically with Let’s Encrypt
Browsers treat HTTP sites as insecure. Every production application needs HTTPS.
Before
Manual SSL configuration or none at all.
server {
    listen 80;
    server_name myapp.com;

    location / {
        proxy_pass http://127.0.0.1:3000;
    }
}
Traffic is unencrypted.
After
Use Certbot to automatically configure TLS.
sudo apt install certbot python3-certbot-nginx
sudo certbot --nginx -d myapp.com
Resulting configuration.
server {
    listen 80;
    server_name myapp.com;
    return 301 https://$server_name$request_uri;
}

server {
    listen 443 ssl http2;
    server_name myapp.com;

    ssl_certificate /etc/letsencrypt/live/myapp.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/myapp.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:3000;
    }
}
Let’s Encrypt certificates are valid for 90 days, and Certbot installs a timer that renews them automatically before they expire. You get HTTPS and HTTP/2 with almost no manual work.
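Because Nginx now terminates TLS, every request reaches Node as plain HTTP. When the app needs to know the original scheme, for secure cookies or absolute URLs, it has to read the X-Forwarded-Proto header set in pattern 1. A small sketch (isSecure and sessionCookie are illustrative helpers):

```javascript
// With TLS terminated at Nginx, the connection to Node is never
// encrypted from Node's point of view. X-Forwarded-Proto carries the
// scheme the client actually used.
function isSecure(req) {
  return req.headers["x-forwarded-proto"] === "https";
}

// Example use: only mark session cookies Secure on HTTPS requests,
// so they are never sent over plain HTTP.
function sessionCookie(req, value) {
  const flags = isSecure(req) ? "; HttpOnly; Secure" : "; HttpOnly";
  return `session=${value}${flags}`;
}
```

Frameworks have equivalents of this check (Express, for instance, honors the header once its trust proxy setting is enabled), but the underlying mechanism is the same.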
4. Compress Responses with Gzip and Brotli
JavaScript bundles are large. Compression dramatically reduces transfer size.
Before
No compression enabled.
server {
    listen 80;
    server_name myapp.com;
}
A 500KB bundle reaches the browser as 500KB.
After
Enable compression in Nginx.
gzip on;
gzip_vary on;
gzip_min_length 1000;
gzip_types
    text/plain
    text/css
    application/javascript
    application/json
    image/svg+xml;

brotli on;
brotli_comp_level 6;
brotli_types
    text/plain
    text/css
    application/javascript
    application/json;
Gzip typically reduces JavaScript bundles by around 70 percent, and Brotli compresses them even further. This has a direct impact on page load time. Note that the brotli directives require the ngx_brotli module, which does not ship with stock Nginx builds.
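The size difference is easy to verify locally with Node's built-in zlib bindings, which expose the same gzip and Brotli codecs Nginx uses (the repetitive payload here is a stand-in for a real bundle):

```javascript
import { gzipSync, brotliCompressSync } from "node:zlib";

// A repetitive "bundle-like" payload, roughly 90KB uncompressed.
const bundle = "function render(){return '<div>hello</div>';}\n".repeat(2000);

const gzipped = gzipSync(bundle);
const brotlied = brotliCompressSync(bundle);

console.log(`original: ${bundle.length} bytes`);
console.log(`gzip:     ${gzipped.length} bytes`);
console.log(`brotli:   ${brotlied.length} bytes`);
```

Real bundles compress less dramatically than this synthetic example, but minified JavaScript is still highly compressible.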
For teams optimizing frontend performance, these infrastructure details compound with the architectural decisions discussed in the Next.js production scaling guide. Server configuration and framework choices work together.
5. Load Balance Multiple Node.js Instances
A Node.js process runs your JavaScript on a single thread, so one process can only drive one CPU core with application work. If your server has four cores, three sit mostly idle unless you run multiple instances.
Before
Single Node process.
node server.js
Nginx forwards all traffic to one instance.
proxy_pass http://127.0.0.1:3000;
Only one CPU core handles every request.
After
Run several Node processes and let Nginx distribute traffic.
upstream nodejs_app {
    least_conn;
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
    server 127.0.0.1:3002;
    server 127.0.0.1:3003;
}
server {
    listen 80;
    server_name myapp.com;

    location / {
        proxy_pass http://nodejs_app;
    }
}
Each request goes to the server with the fewest active connections. This fully utilizes CPU resources and increases throughput dramatically.
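This assumes the four Node processes are already running, for example via Node's built-in cluster module or a process manager such as PM2. The least_conn strategy itself is simple enough to sketch as a toy model (pickLeastConn is an illustrative function, not Nginx code):

```javascript
// Toy model of Nginx's least_conn balancing: track in-flight request
// counts per upstream and always route to the least busy one.
const upstreams = [
  { addr: "127.0.0.1:3000", active: 0 },
  { addr: "127.0.0.1:3001", active: 0 },
  { addr: "127.0.0.1:3002", active: 0 },
  { addr: "127.0.0.1:3003", active: 0 },
];

function pickLeastConn(list) {
  // On ties, the earliest entry wins, matching a stable scan.
  return list.reduce((best, u) => (u.active < best.active ? u : best));
}

// Two requests arrive before either completes: each lands on a
// different upstream instead of piling onto the first one.
const first = pickLeastConn(upstreams);
first.active += 1;
const second = pickLeastConn(upstreams);
second.active += 1;
```

Compared with plain round-robin, least_conn adapts when some requests (slow SSR pages, big uploads) hold a worker longer than others.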
6. Protect Your API with Nginx Rate Limiting
Rate limiting in application code still consumes Node resources. Nginx can block abusive traffic before it reaches your app.
Before
Node handles all requests including malicious bursts.
app.post("/api/login", loginHandler)
An attacker can hammer the endpoint and tie up the event loop.
After
Limit requests at the proxy level.
# In the http {} context:
limit_req_zone $binary_remote_addr zone=login:10m rate=3r/s;
server {
    listen 80;
    server_name myapp.com;

    location /api/login {
        limit_req zone=login burst=5 nodelay;
        limit_req_status 429;
        proxy_pass http://nodejs_app;
    }

    location / {
        proxy_pass http://nodejs_app;
    }
}
Each IP is limited to three login requests per second, with short bursts of up to five extra absorbed by the burst parameter. Anything beyond that is rejected with a 429 instantly, without touching Node.
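A rough model of what limit_req does internally, a leaky bucket drained at the configured rate with burst as the bucket depth, makes the numbers concrete. This is an illustrative approximation, not Nginx's exact algorithm:

```javascript
// Approximate model of limit_req with rate=3r/s, burst=5, nodelay:
// the bucket drains at 3 requests per second; a request is allowed if
// it fits in the bucket, otherwise it is rejected (Nginx sends 429).
function makeLimiter(ratePerSec, burst) {
  let level = 0; // current bucket fill
  let last = 0;  // time of the previous request, in ms
  return function allow(nowMs) {
    // Drain the bucket for the time elapsed since the last request.
    level = Math.max(0, level - ((nowMs - last) / 1000) * ratePerSec);
    last = nowMs;
    if (level < burst) {
      level += 1;
      return true;
    }
    return false;
  };
}

const allow = makeLimiter(3, 5);
```

The key property: a short burst is absorbed, but a sustained flood is capped at the steady rate no matter how fast it arrives.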
What This Means for JavaScript Developers
Most developers learn React, Node, and databases. Far fewer understand how their application actually runs in production.
Nginx sits directly in that gap.
Learning these six patterns takes a weekend and immediately improves the reliability and performance of any Node deployment. More importantly, it signals something hiring managers notice quickly. You are not just someone who writes JavaScript. You are someone who can run software in production.