Deploying a Node.js application to production can feel overwhelming, but AWS EC2 provides a robust, scalable foundation. This guide walks you through launching an EC2 instance, deploying your Node.js app, and implementing production-grade practices—all while exploring how Node.js Streams enhance memory efficiency in server-side applications.
Why AWS EC2 for Node.js?
EC2 gives you complete control over your server environment, perfect for custom Node.js deployments. Unlike managed platforms, you configure the OS, security, and runtime exactly as needed for production workloads.
Step 1: Launching Your EC2 Instance
Instance Configuration
- Choose AMI: Select Ubuntu Server 22.04 LTS (free tier eligible)
- Instance Type: Start with t2.micro for testing (1 vCPU, 1 GB RAM)
- Key Pair: Create a new .pem key pair and download it; this is your SSH access credential
- Security Group: Configure these rules:
  - SSH (Port 22): Your IP only
  - HTTP (Port 80): Open to 0.0.0.0/0
  - HTTPS (Port 443): Open to 0.0.0.0/0
Launch the instance and note its Public IPv4 address.
Step 2: Deploying Your Node.js Application
Connect via SSH
chmod 400 your-key.pem
ssh -i your-key.pem ubuntu@your-ec2-ip
Install Node.js with NVM
NVM allows version management without root permissions—critical for EC2 production setup.
# Install NVM
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.0/install.sh | bash
source ~/.bashrc
# Install Node.js LTS
nvm install --lts
node -v # Verify installation
Deploy Your Application
# Clone your repository
git clone https://github.com/yourusername/your-app.git
cd your-app
# Install dependencies
npm install
# Test the app
npm start
Step 3: Production Best Practices
PM2 Process Manager
PM2 keeps your app running, auto-restarts on crashes, and manages logs—essential for AWS EC2 deployment stability.
# Install PM2 globally
npm install -g pm2
# Start your app
pm2 start server.js --name "my-app"
# Auto-restart on reboot
pm2 startup
pm2 save
NGINX Reverse Proxy
NGINX handles SSL termination, load balancing, and serves static assets efficiently.
# Install NGINX
sudo apt update
sudo apt install nginx -y
# Configure reverse proxy
sudo nano /etc/nginx/sites-available/default
Add this configuration:
server {
    listen 80;
    server_name your-domain.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
Validate the configuration, then restart NGINX:
sudo nginx -t
sudo systemctl restart nginx
Node.js Streams: Memory Efficiency in Production
Node.js Streams are crucial for EC2 production setup when handling large data transfers, file uploads, or API responses. They process data in chunks rather than loading everything into memory.
Why Streams Matter for Memory Efficiency
Traditional file reading loads the entire file into RAM before processing:
// Bad: Loads entire file into memory
const data = fs.readFileSync('large-file.csv');
res.send(data); // Memory spike!
Streams process data incrementally, keeping memory usage constant regardless of file size:
// Good: Constant memory usage
const readStream = fs.createReadStream('large-file.csv');
readStream.pipe(res); // Streams chunks to client
Practical Stream Example
Here's a complete example demonstrating file upload handling with Streams for AWS EC2 deployment:
import { createReadStream, createWriteStream } from 'fs';
import { pipeline } from 'stream/promises';
import express from 'express';
const app = express();
// File upload endpoint using Streams
app.post('/upload', async (req, res) => {
  try {
    // Stream the raw request body straight to disk (assumes ./uploads already exists)
    const writeStream = createWriteStream('./uploads/file.zip');
    await pipeline(req, writeStream);
    res.json({ message: 'Upload complete' });
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});
// Stream large file download
app.get('/download', (req, res) => {
  const readStream = createReadStream('./large-data.json');
  readStream.on('error', () => res.status(500).end()); // handle a missing or unreadable file
  res.setHeader('Content-Type', 'application/json');
  readStream.pipe(res);
});
app.listen(3000, () => console.log('Server running on port 3000'));
Stream Types and Benefits
- Readable Streams: Read data from sources (files, HTTP requests)
- Writable Streams: Write data to destinations (files, HTTP responses)
- Benefits: Multi-gigabyte files can be processed with roughly constant memory, since only the in-flight chunks are buffered, instead of holding the entire file in RAM
Security Hardening for EC2 Production Setup
- Restrict SSH Access: Update the Security Group to whitelist only your IP for port 22
- Enable UFW Firewall:
  sudo ufw allow 22
  sudo ufw allow 80
  sudo ufw allow 443
  sudo ufw enable
- Environment Variables: Use .env files with the dotenv package; never commit secrets
- Regular Updates: Schedule weekly security patches with sudo apt update && sudo apt upgrade
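The weekly patching in the last item can be automated with cron. A hypothetical entry for root's crontab (added via sudo crontab -e, so no sudo prefix is needed); the schedule is illustrative:

```
# Run security updates every Sunday at 03:00
0 3 * * 0 apt update && apt upgrade -y
```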
Monitoring and Maintenance
# PM2 monitoring
pm2 monit
# View logs
pm2 logs my-app
# NGINX logs
sudo tail -f /var/log/nginx/access.log
Conclusion
Your Node.js app is now production-ready on AWS EC2. You've configured PM2 for process management, NGINX for reverse proxying, and leveraged Node.js Streams for memory-efficient data handling. This EC2 production setup scales with your application's growth while maintaining low memory overhead.
Next Steps: Implement CI/CD with GitHub Actions, configure SSL with Let's Encrypt, and set up CloudWatch for advanced monitoring.