Pau Dang
15 Minutes to "Ship It": From Zero to Production with Node.js (Clean Architecture + REST API + Kafka + Docker & CI/CD)

Starting a new Node.js project often involves tedious, repetitive tasks: scaffolding directory structures, setting up Express, configuring database connections, managing migrations, and integrating messaging systems like Kafka. This "boilerplate phase" can eat up hours of your initial development time.

Today, I’ll show you how to go from zero to a production-ready environment in minutes. We will build a high-performance Node.js service using Clean Architecture, TypeScript, MySQL, Flyway for database migrations, Kafka for real-time event-driven messaging, Docker Compose for orchestration, and GitHub Actions for CI/CD.

Let’s dive in!

🎯 "Ready-to-Run" Source Code for You:
Instead of manually copying snippets, I’ve packaged the entire source code for this article into a production-grade template on GitHub. This project has already reached 3,000+ downloads and is being used by developers for real-world services.

🔗 Repo: paudang/nodejs-clean-rest-kafka

(Just git clone, run docker-compose up -d, and you're live!)


Step 1: Initialize Project & Install Dependencies

First, let's create the project directory:

mkdir nodejs-clean-rest-kafka
cd nodejs-clean-rest-kafka
npm init -y

Install the essential production libraries:

npm install express cors helmet hpp express-rate-limit dotenv morgan kafkajs sequelize mysql2 winston

Install development dependencies:

npm install -D typescript @types/node @types/express @types/cors @types/morgan ts-node tsconfig-paths tsc-alias jest ts-jest @types/jest

Initialize tsconfig.json with Path Aliases (@/*):

{
  "compilerOptions": {
    "target": "es2020",
    "module": "commonjs",
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "baseUrl": ".",
    "paths": {
      "@/*": ["src/*"]
    }
  },
  "include": ["src/**/*"]
}
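The compiler options above only make the `@/*` alias type-check; at runtime, `ts-node` needs `tsconfig-paths` and the compiled JavaScript needs `tsc-alias` to rewrite the aliases (both were installed as dev dependencies). Here's one way to wire them into your npm scripts — the script names are my choice, though `build` and `start` match what the Dockerfile in Step 6 expects:

```json
{
  "scripts": {
    "dev": "ts-node -r tsconfig-paths/register src/index.ts",
    "build": "tsc && tsc-alias",
    "start": "node dist/index.js",
    "test": "jest"
  }
}
```

`tsc-alias` runs after `tsc` and replaces `@/...` imports in `dist/` with relative paths, so plain `node` can run the output.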

Step 2: Architecting for Scale (Clean Architecture) 🏗️

Using Clean Architecture ensures your codebase remains decoupled and highly testable. We divide the source code into distinct layers:

mkdir -p src/domain src/usecases src/interfaces src/infrastructure src/utils
  • src/domain: Core business entities and logic (framework-independent).
  • src/usecases: Application-specific business rules (The "Interactors").
  • src/interfaces: Adapters such as Controllers and HTTP Routes.
  • src/infrastructure: Technical details like Database connections, Kafka clients, and Loggers.
  • src/utils: Shared utility functions.

This separation allows you to swap your database or framework without touching the core business logic.
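To make the dependency direction concrete, here is a minimal sketch of the pattern: the domain layer defines an entity and a repository "port", and the infrastructure layer supplies an adapter. The file paths and the in-memory adapter are illustrative assumptions, not code from the template:

```typescript
// src/domain/user.ts — core entity, no framework imports
export interface User {
  id?: number;
  name: string;
  email: string;
}

// src/domain/userRepository.ts — a "port" the infrastructure must satisfy
export interface UserRepository {
  save(user: User): Promise<User>;
  findByEmail(email: string): Promise<User | null>;
}

// src/infrastructure/db/inMemoryUserRepository.ts — one possible adapter
// (hypothetical; in the real project this would wrap Sequelize/MySQL)
export class InMemoryUserRepository implements UserRepository {
  private users: User[] = [];
  private nextId = 1;

  async save(user: User): Promise<User> {
    const saved = { ...user, id: this.nextId++ };
    this.users.push(saved);
    return saved;
  }

  async findByEmail(email: string): Promise<User | null> {
    return this.users.find((u) => u.email === email) ?? null;
  }
}
```

Because use cases depend only on the `UserRepository` interface, swapping MySQL for Postgres (or a test double) never touches `src/domain` or `src/usecases`.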


Step 3: Event-Driven Messaging with Kafka 🚀

In a microservices architecture, Kafka acts as the "heartbeat" for asynchronous communication. We’ll build a KafkaService using the Connection Promise pattern to ensure the Producer is fully connected before sending messages, preventing data loss during startup.

Inside src/infrastructure/messaging/kafkaClient.ts:

import { Kafka, Producer } from 'kafkajs';

export class KafkaService {
    private producer: Producer;
    private isConnected = false;
    private connectionPromise: Promise<void> | null = null;

    constructor() {
        const kafka = new Kafka({
            clientId: 'user-service',
            // Read the broker from the environment (e.g. kafka:29092 under
            // Docker Compose), falling back to localhost for local development.
            brokers: [process.env.KAFKA_BROKER || 'localhost:9092'],
        });
        this.producer = kafka.producer();
    }

    // "Connection Promise" ensures we only connect once, even across concurrent callers
    async connect() {
        if (this.connectionPromise) return this.connectionPromise;

        this.connectionPromise = (async () => {
            await this.producer.connect();
            this.isConnected = true;
            console.log('[Kafka] Producer connected successfully');
        })().catch((err) => {
            // Reset so a later call can retry if this connection attempt fails
            this.connectionPromise = null;
            throw err;
        });
        return this.connectionPromise;
    }

    async sendEvent(topic: string, action: string, payload: any) {
        await this.connect(); // Always wait for readiness
        await this.producer.send({
            topic,
            messages: [{ value: JSON.stringify({ action, payload, ts: new Date() }) }],
        });
        console.log(`[Kafka] Triggered ${action} to ${topic}`);
    }
}

export const kafkaService = new KafkaService();

Step 3.5: Scaling Consumers with Clean Interfaces 🛠️

Managing dozens of event types can quickly become messy. We use Abstract Base Classes and Schema Validation to keep consumers organized:

1. BaseConsumer (The Blueprint)

At src/interfaces/messaging/baseConsumer.ts, we define a template for all consumers. It handles JSON parsing and error logging, so subclasses can focus solely on business logic:

import { EachMessagePayload } from 'kafkajs';

export abstract class BaseConsumer {
  abstract topic: string;
  abstract handle(data: unknown): Promise<void>;

  async onMessage({ message }: EachMessagePayload) {
      const rawValue = message.value?.toString();
      if (!rawValue) return;
      try {
          const data = JSON.parse(rawValue);
          await this.handle(data);
      } catch (err) {
          // Malformed JSON or a handler failure shouldn't crash the consumer loop
          console.error(`[Kafka] Failed to process message on ${this.topic}:`, err);
      }
  }
}

2. Schema Validation (The Contract)

Using Zod at src/interfaces/messaging/schemas/userEventSchema.ts, we define the contract between Producer and Consumer. This ensures type safety across the wire.

3. WelcomeEmailConsumer (The Implementation)

Logic that triggers an email when a USER_CREATED event is received:

import { BaseConsumer } from '@/interfaces/messaging/baseConsumer';
import { UserEventSchema } from '@/interfaces/messaging/schemas/userEventSchema';

export class WelcomeEmailConsumer extends BaseConsumer {
  // Must match the topic the producer publishes to in createUserUseCase
  topic = 'user-events';

  async handle(data: unknown) {
    const result = UserEventSchema.safeParse(data);
    if (result.success && result.data.action === 'USER_CREATED') {
      console.log(`[Kafka] 📧 Sending welcome email to ${result.data.payload.email}...`);
    }
  }
}
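One piece the article doesn't show is how these classes get attached to a kafkajs consumer group: you subscribe to each consumer's topic, then dispatch incoming messages by topic. A sketch of that dispatch logic — the payload type below is a structural stand-in for kafkajs's `EachMessagePayload` so the snippet stays self-contained, and `buildDispatcher` is a name I made up:

```typescript
// Structural stand-in for kafkajs's EachMessagePayload; in the real project,
// import the type from 'kafkajs' instead.
interface MessagePayload {
  topic: string;
  message: { value: Buffer | null };
}

interface TopicConsumer {
  topic: string;
  onMessage(payload: MessagePayload): Promise<void>;
}

// Route each message to the consumer registered for its topic;
// messages on unregistered topics are ignored.
export function buildDispatcher(consumers: TopicConsumer[]) {
  const byTopic = new Map(consumers.map((c) => [c.topic, c]));
  return async (payload: MessagePayload) => {
    await byTopic.get(payload.topic)?.onMessage(payload);
  };
}
```

In practice you'd pass the returned function as `eachMessage` to kafkajs's `consumer.run({ eachMessage })` after calling `consumer.subscribe({ topic })` for each registered consumer.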

Step 4: Database Version Control with Flyway

Manual database changes are a nightmare in production. Flyway manages your schema versioning through simple SQL files.

Example V1__Create_Users_Table.sql:

CREATE TABLE users (
    id INT AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(255) NOT NULL,
    email VARCHAR(255) NOT NULL UNIQUE,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
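Be aware that mounting `./flyway/sql` into MySQL's `/docker-entrypoint-initdb.d` (as the Compose file in Step 6 does) only runs the scripts on the very first container start. To apply new versioned migrations on every deploy, you can run Flyway itself as a one-shot service. A sketch — the database name and credentials here are placeholders to match your own MySQL settings:

```yaml
  flyway:
    image: flyway/flyway:10
    # -connectRetries waits for MySQL to accept connections before migrating
    command: -url=jdbc:mysql://db:3306/app_db -user=root -password=root -connectRetries=10 migrate
    volumes:
      - ./flyway/sql:/flyway/sql
    depends_on:
      - db
```

Flyway records each applied version in its `flyway_schema_history` table, so re-running `migrate` is idempotent.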

Step 5: Clean UseCases & Controllers

1. UseCase (The Interactor)

Decoupled logic at src/usecases/createUser.ts:

// Paths are illustrative — point them at your repository and Kafka client modules
import { userRepository } from '@/infrastructure/db/userRepository';
import { kafkaService } from '@/infrastructure/messaging/kafkaClient';

export const createUserUseCase = async (name: string, email: string) => {
    const user = await userRepository.save({ name, email }); // Repository abstraction
    await kafkaService.sendEvent('user-events', 'USER_CREATED', { id: user.id, email: user.email });
    return user;
};

2. Controller (Interface Layer)

The Controller's sole task is to receive requests, call use cases, and return responses. Very clean!

import { Request, Response } from 'express';
import { createUserUseCase } from '@/usecases/createUser';

export const createUser = async (req: Request, res: Response) => {
    try {
        const { name, email } = req.body;
        const user = await createUserUseCase(name, email);
        res.status(201).json(user);
    } catch (err) {
        res.status(500).json({ error: 'Internal Server Error' });
    }
};

Step 6: Docker for Production Excellence

1. Dockerfile (Multi-stage Build)

We separate the build environment from the runtime to minimize image size and attack surface:

FROM node:22-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

FROM node:22-alpine AS production
WORKDIR /app
ENV NODE_ENV=production
ENV NPM_CONFIG_UPDATE_NOTIFIER=false
COPY package*.json ./
RUN npm ci --omit=dev
COPY --from=builder /app/dist ./dist
EXPOSE 3000
CMD ["npm", "start"]
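Since the builder stage runs `COPY . .`, it copies the entire build context. A `.dockerignore` keeps secrets and local artifacts out of the image and makes builds faster; a typical starting point:

```
node_modules
dist
coverage
.git
.env
*.md
```

Excluding `node_modules` matters most: the image should only ever contain dependencies resolved by `npm ci` inside the container, never your host's copy.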

2. docker-compose.yml (The Full Stack)

One command to rule them all:

services:
  app:
    build: .
    ports: ["3000:3000"]
    depends_on: [db, kafka]
    environment:
      - KAFKA_BROKER=kafka:29092
      - DB_HOST=db
  db:
    image: mysql:8.0
    ports: ["3306:3306"]
    volumes: ["./flyway/sql:/docker-entrypoint-initdb.d"]
    environment:
      MYSQL_ROOT_PASSWORD: root   # example credentials — change for real use
      MYSQL_DATABASE: app_db
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.4.0
    depends_on: [zookeeper]
    ports: ["9092:9092"]
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Two listeners: kafka:29092 for containers, localhost:9092 for the host
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: "1"
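For the CI/CD piece promised in the intro, a minimal GitHub Actions workflow sketch at `.github/workflows/ci.yml` — it assumes `build` and `test` scripts exist in `package.json`:

```yaml
name: CI
on:
  push:
    branches: [main]
  pull_request:

jobs:
  build-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 22
          cache: npm
      - run: npm ci
      - run: npm run build
      - run: npm test
```

Pinning `node-version: 22` keeps CI aligned with the `node:22-alpine` base image used in the Dockerfile.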

One Last Surprise... 🤫

Think this took hours to set up?

👉 The Truth Is: The entire project structure—Clean Architecture, Kafka Producers/Consumers, Flyway migrations, Docker configs, and CI/CD pipelines—was generated in under 60 seconds using an automation tool I built.

Time-to-market is everything. Stop reinventing the wheel and start shipping business value.

Want to generate a "perfect" Node.js repo like this yourself? Try my CLI tool:

npx nodejs-quickstart-structure init

Read the full document here: Nodejs Quickstart Structure

Building production-ready software shouldn't be a chore. I hope this helps you ship your next big idea faster than ever. If you found this useful, don't forget to give a Star ⭐ on GitHub! 🔥
