Starting a new Node.js project often involves tedious boilerplate work. Setting up folder structures, configuring Express, establishing database connections, managing migrations, configuring Redis, writing Dockerfiles, and setting up CI/CD pipelines can eat up hours of your time before you even write your first line of business logic.
Today, I'll walk you through a step-by-step guide to set up a Node.js service from scratch to a production-ready state. We will use the MVC architecture, TypeScript, MySQL with Flyway for database migrations, Redis Caching, run everything seamlessly on Docker Compose, and automate the CI/CD pipeline with GitHub Actions.
Let's dive in!
🎯 TL;DR - Plug-and-Play Source Code:
If you prefer to skip the typing and see the results immediately, I've packaged the entire source code from this article into a production-ready template on GitHub.
🔗 Repo: paudang/nodejs-service
(Just git clone, run docker-compose up -d, and enjoy! Don't forget to drop a Star ⭐ if you find it helpful!)
Step 1: Project Initialization & Dependencies
First, let's create a new directory for our project:
mkdir nodejs-service
cd nodejs-service
npm init -y
Install the essential packages for Express, Database, Caching, and Security:
npm install express cors helmet hpp express-rate-limit dotenv morgan swagger-ui-express pug sequelize mysql2 ioredis
Install the development dependencies:
npm install -D typescript @types/node @types/express @types/cors @types/morgan @types/swagger-ui-express ts-node tsconfig-paths tsc-alias
Initialize your tsconfig.json for TypeScript and configure optimal Path Aliases (@/*):
{
  "compilerOptions": {
    "target": "es2020",
    "module": "commonjs",
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "baseUrl": ".",
    "paths": {
      "@/*": ["src/*"]
    }
  },
  "include": ["src/**/*"]
}
Next, open your package.json and add these crucial Scripts to run and build your project:
"scripts": {
  "start": "node dist/index.js",
  "dev": "ts-node -r tsconfig-paths/register src/index.ts",
  "build": "tsc && tsc-alias"
}
Step 2: Architecture Setup (MVC) & Migration Management (Flyway)
1. Source Code Folder Structure
Organizing your code efficiently is vital. Using the MVC (Model - View - Controller) pattern, let's structure our src directory cleanly:
mkdir src
cd src
mkdir config controllers models routes utils views
- config: Configurations for DB (MySQL), Redis, and Swagger.
- controllers: Handles HTTP requests & responses and coordinates logic.
- models: Defines database schemas/entities in code.
- routes: Registers API endpoint routing.
- utils: Shared utility functions (logger, formatters).
- views: Contains .pug files for server-side rendering (optional).
2. The Power of Flyway in Production 🚀
When multiple developers collaborate or when you deploy to Production, synchronizing Database Schemas (tables, columns, seed data) becomes a massive headache. You cannot ssh into every server to run manual SQL queries.
Enter Flyway! Flyway is the premier Database Version Control tool utilizing pure SQL files.
- How it works: You predefine SQL migration scripts using a strict versioning convention (e.g., V1__Initial_Setup.sql, V2__Add_Users_Table.sql).
- In practice: On startup, Flyway checks the DB state, identifies missing migrations, and executes them sequentially. This guarantees your database schema stays accurate and consistent across every environment.
Create a directory for Flyway SQL files at the project root:
mkdir -p flyway/sql
Inside, you can create a V1__Initial_Setup.sql file containing your first table creation queries.
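For example, here is a minimal V1__Initial_Setup.sql that would create the users table we define in Step 5. The created_at/updated_at columns are my assumption to match Sequelize's default timestamps with underscored: true; adjust to your own schema:

```sql
-- flyway/sql/V1__Initial_Setup.sql
CREATE TABLE IF NOT EXISTS users (
  id INT AUTO_INCREMENT PRIMARY KEY,
  name VARCHAR(255) NOT NULL,
  email VARCHAR(255) NOT NULL UNIQUE,
  created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
  updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
);
```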
Step 3: Connection Configuration (Database & Redis)
1. Environment Variables (.env)
Create a .env file in the root directory:
PORT=3000
DB_HOST=localhost
DB_PORT=3306
DB_USER=root
DB_PASSWORD=root
DB_NAME=demo
REDIS_HOST=localhost
REDIS_PORT=6379
2. MySQL Connection (Sequelize)
In src/config/database.ts, configure your Sequelize instance:
import { Sequelize } from 'sequelize';
import dotenv from 'dotenv';

dotenv.config();

const sequelize = new Sequelize(
  process.env.DB_NAME || 'demo',
  process.env.DB_USER || 'root',
  process.env.DB_PASSWORD || 'root',
  {
    host: process.env.DB_HOST || '127.0.0.1',
    dialect: 'mysql',
    logging: false,
    port: parseInt(process.env.DB_PORT || '3306', 10)
  }
);

export default sequelize;
3. Redis Caching Setup
In src/config/redisClient.ts, initialize the connection and create a Singleton Service (cacheService) for common caching operations (get, set, delete):
import Redis from 'ioredis';
import dotenv from 'dotenv';

dotenv.config();

class RedisService {
  private client: Redis;
  private static instance: RedisService;

  private constructor() {
    this.client = new Redis({
      host: process.env.REDIS_HOST || 'localhost',
      port: Number(process.env.REDIS_PORT) || 6379,
    });
    this.client.on('connect', () => console.log('Redis connected'));
    // Without an error listener, ioredis connection errors crash the process
    this.client.on('error', (err) => console.error('Redis error:', err));
  }

  public static getInstance(): RedisService {
    if (!RedisService.instance) {
      RedisService.instance = new RedisService();
    }
    return RedisService.instance;
  }

  // Utility: check the cache; on a MISS, fetch from the DB and store the result
  public async getOrSet<T>(key: string, fetcher: () => Promise<T>, ttl: number = 3600): Promise<T> {
    const cached = await this.client.get(key);
    if (cached) return JSON.parse(cached);
    const data = await fetcher();
    if (data) await this.client.set(key, JSON.stringify(data), 'EX', ttl);
    return data;
  }

  public async del(key: string): Promise<void> {
    await this.client.del(key);
  }
}

// Export the singleton instance (alias: cacheService)
export default RedisService.getInstance();
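The cache-aside flow behind getOrSet is the key idea here. To make it concrete without a running Redis server, here is the same pattern sketched against a plain in-memory Map standing in for Redis (an illustration only, not the ioredis-backed service above):

```typescript
// Cache-aside sketch: a Map stands in for Redis (illustration only).
type Fetcher<T> = () => Promise<T>;

class MapCache {
  private store = new Map<string, string>();

  async getOrSet<T>(key: string, fetcher: Fetcher<T>): Promise<T> {
    const cached = this.store.get(key);
    if (cached !== undefined) return JSON.parse(cached); // cache HIT: skip the "DB"
    const data = await fetcher();                        // cache MISS: fetch from the "DB"
    this.store.set(key, JSON.stringify(data));
    return data;
  }

  del(key: string): void {
    this.store.delete(key); // invalidate, like after createUser in Step 5
  }
}

async function demo(): Promise<number> {
  const cache = new MapCache();
  let dbCalls = 0;
  const fetchUsers: Fetcher<string[]> = async () => {
    dbCalls += 1; // count how often the "DB" is actually hit
    return ['alice', 'bob'];
  };
  await cache.getOrSet('users:all', fetchUsers); // MISS -> fetch
  await cache.getOrSet('users:all', fetchUsers); // HIT  -> no fetch
  cache.del('users:all');                        // invalidate
  await cache.getOrSet('users:all', fetchUsers); // MISS again -> fetch
  return dbCalls;
}

demo().then(calls => console.log(calls)); // logs 2: the fetcher only ran on misses
```

The same hit/miss/invalidate cycle is exactly what the users endpoints in Step 5 perform, just with Redis and MySQL in place of the Map and the stub fetcher.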
Step 4: The Entry Point (src/index.ts)
index.ts is the heart of your application, integrating Express, Security Middlewares, triggering DB sync, and booting the server:
import express, { Request, Response } from 'express';
import cors from 'cors';
import helmet from 'helmet';
import rateLimit from 'express-rate-limit';
import dotenv from 'dotenv';
import apiRoutes from '@/routes/api'; // 🔥 Imported from Step 5

dotenv.config();

const app = express();
const port = process.env.PORT || 3000;

app.use(helmet());
app.use(cors({ origin: '*' }));
app.use(rateLimit({ windowMs: 10 * 60 * 1000, max: 100 }));
app.use(express.json());

// Register all API routes under the /api prefix
app.use('/api', apiRoutes);

app.get('/health', (req: Request, res: Response) => {
  res.json({ status: 'UP', message: 'System is running normally' });
});

// Runs once the HTTP server is listening
const startServer = async () => {
  console.log(`Server running on port ${port}`);
};

// Retry the DB connection in case the DB container starts slowly in the Docker network
const syncDatabase = async () => {
  let retries = 30;
  while (retries) {
    try {
      const sequelize = (await import('@/config/database')).default;
      await sequelize.sync();
      console.log('Database synced & connected!');
      app.listen(port, startServer);
      break;
    } catch (error) {
      console.error('Error syncing database:', error);
      retries -= 1;
      await new Promise(res => setTimeout(res, 5000));
    }
  }
};

syncDatabase();
Step 5: Building a REST API (Users Practical Example)
Let's write a basic CRUD API flow for a User entity to see our architecture in action.
1. Define Model (Table users)
In src/models/User.ts, we use Sequelize to map our code to MySQL:
import { DataTypes, Model } from 'sequelize';
import sequelize from '@/config/database';

class User extends Model {
  public id!: number;
  public name!: string;
  public email!: string;
}

User.init(
  {
    id: { type: DataTypes.INTEGER, autoIncrement: true, primaryKey: true },
    name: { type: DataTypes.STRING, allowNull: false },
    email: { type: DataTypes.STRING, allowNull: false, unique: true },
  },
  { sequelize, tableName: 'users', underscored: true }
);

export default User;
2. Controller Logic (Powered by Redis Cache)
In src/controllers/userController.ts, we handle fetching data (with caching) and creating new users:
import { Request, Response } from 'express';
import User from '@/models/User';
import cacheService from '@/config/redisClient';

export class UserController {
  async getUsers(req: Request, res: Response) {
    try {
      // Fetch from cache; on a MISS, query the DB and cache for 60s
      const users = await cacheService.getOrSet('users:all', async () => {
        return await User.findAll();
      }, 60);
      res.json(users);
    } catch (error) {
      res.status(500).json({ error: 'Internal Server Error' });
    }
  }

  async createUser(req: Request, res: Response) {
    try {
      const { name, email } = req.body;
      const user = await User.create({ name, email });
      // Remember to invalidate the cache when new data is added!
      await cacheService.del('users:all');
      res.status(201).json(user);
    } catch (error) {
      res.status(500).json({ error: 'Internal Server Error' });
    }
  }
}
3. Routing & Exposing the API
In src/routes/api.ts, connect the controller to our endpoints:
import { Router, Request, Response } from 'express';
import { UserController } from '@/controllers/userController';
const router = Router();
const userController = new UserController();
// Test by calling HTTP GET /api/users or POST /api/users
router.get('/users', (req: Request, res: Response) => userController.getUsers(req, res));
router.post('/users', (req: Request, res: Response) => userController.createUser(req, res));
export default router;
Step 6: "All In One" Packaging with Docker Compose
Virtually every production deployment today runs in containers, so let's package the whole stack with Docker.
1. Dockerfile (Optimized Multi-stage build)
# Stage 1: Builder
FROM node:18-alpine AS builder
WORKDIR /app
COPY package*.json ./
COPY tsconfig*.json ./
RUN npm ci
COPY . .
RUN npm run build
# Stage 2: Production
FROM node:18-alpine AS production
WORKDIR /app
ENV NODE_ENV=production
COPY package*.json ./
RUN npm ci --omit=dev --ignore-scripts
COPY --from=builder /app/dist ./dist
COPY --from=builder /app/src/views ./dist/views
# COPY --from=builder /app/public ./public   # uncomment if you add a public/ directory for static assets
EXPOSE 3000
RUN mkdir -p logs && chown -R node:node logs
USER node
CMD ["npm", "start"]
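One easy omission: because the builder stage runs COPY . ., a local node_modules or dist folder gets dragged into the build context. A minimal .dockerignore (my suggestion, not part of the original listing) keeps builds fast and clean:

```
node_modules
dist
.git
.env
logs
```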
2. docker-compose.yml (App + MySQL + Flyway + Redis)
This is the ultimate combo. Upon execution, the db container initializes MySQL. Immediately after, flyway jumps in to create the Tables. Finally, the app connects while redis stands ready for Caching:
services:
  app:
    build: .
    ports:
      - "${PORT:-3000}:3000"
    depends_on:
      - db
      - redis
    environment:
      - PORT=3000
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      - DB_HOST=db
      - DB_USER=root
      - DB_PASSWORD=root
      - DB_NAME=demo
  db:
    image: mysql:8.0
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: root
      MYSQL_DATABASE: demo
    ports:
      - "${DB_PORT:-3306}:3306"
    volumes:
      - mysql_data:/var/lib/mysql
  flyway:
    image: flyway/flyway
    command: -connectRetries=60 migrate
    volumes:
      - ./flyway/sql:/flyway/sql
    environment:
      FLYWAY_URL: jdbc:mysql://db:3306/demo
      FLYWAY_USER: root
      FLYWAY_PASSWORD: root
    depends_on:
      - db
  redis:
    image: redis:alpine
    restart: always
    ports:
      - "${REDIS_PORT:-6379}:6379"

volumes:
  mysql_data:
Step 7: CI/CD Automation with GitHub Actions
File .github/workflows/ci.yml:
name: Node.js CI

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: [18.x, 20.x]
    steps:
      - uses: actions/checkout@v3
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v3
        with:
          node-version: ${{ matrix.node-version }}
          cache: 'npm'
      - name: Install Dependencies
        run: npm ci
      - name: Lint Code
        run: npm run lint
      - name: Run Tests
        run: npm test
      - name: Build
        run: npm run build --if-present
This CI pipeline ensures that every Pull Request must pass Linting and Unit Testing before it can be merged.
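Note that the workflow calls npm run lint and npm test, but Step 1 never defines those scripts, so the pipeline would fail as-is. Add something like the following to package.json (ESLint and Jest here are my assumptions; swap in whatever linter and test runner you actually use):

```json
"scripts": {
  "lint": "eslint src --ext .ts",
  "test": "jest --passWithNoTests"
}
```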
The Biggest Twist at the End... 🤫
Did you think you were just reading a standard repository setup tutorial?
👉 The Truth Is: The entire base code behind this repository—the complex web of configurations linking Express, Sequelize, Docker, Redis, Flyway (Migration files), Eslint, and GitHub Actions workflows—was GENERATED BY AN AUTOMATION TOOL IN UNDER 1 MINUTE!
In software development, speed is gold. If you want to end the vicious cycle of manually wiring up identical base code for every new project, you should leverage the exact automation tool I built and optimized for this purpose.
If you want to auto-generate a high-quality boilerplate like this (or freely swap to Clean Architecture, integrate Kafka, etc.), check out my tool with a detailed guide right here:
🔗 NPM: nodejs-quickstart-structure
I hope this architectural breakdown and code-generation tool will make your software product launches much smoother and significantly faster. Don't forget to upvote to share this with the community! 🔥