This article provides a detailed technical explanation of a CI/CD pipeline built for a JavaScript API using:
- GitLab CI/CD
- Node.js and Express
- Docker
- Docker Compose
- Jest for testing
- npm audit for dependency security checks
Instead of only showing the pipeline, this guide explains the role of each file and how every part contributes to the delivery workflow.
1. Project Architecture Overview
The project follows this structure:
```
project/
├── index.js
├── package.json
├── package-lock.json
├── tests/
│   └── app.test.js
├── Dockerfile
├── compose.yml
└── .gitlab-ci.yml
```
Each file plays a specific role in development, validation, security, and deployment.
2. Application Layer — index.js
The application is a small Express API.
```javascript
const express = require('express');

const app = express();
const PORT = process.env.PORT || 3000;

app.get('/', (req, res) => {
  res.json({ message: 'UrbanHub API', status: 'ok' });
});

app.get('/health', (req, res) => {
  res.status(200).json({ service: 'up' });
});

app.get('/config', (req, res) => {
  res.json({ environment: process.env.APP_ENV || 'dev' });
});

// Start listening only when run directly (node index.js), not when imported by tests.
if (require.main === module) {
  app.listen(PORT, '0.0.0.0', () => {
    console.log(`Server is running on port ${PORT}`);
  });
}

module.exports = app;
```
Explanation
- `express()` creates the web application instance.
- `PORT` is taken from an environment variable or defaults to `3000`.
- `/` returns a basic JSON response.
- `/health` is a health check endpoint.
- `/config` exposes the current runtime environment.
- `0.0.0.0` makes the service reachable from outside the container.
- `module.exports = app` allows the application to be imported in test files.
This structure is common in Node.js APIs because it separates the app definition from the server execution logic.
3. Dependencies — package.json
This file describes the project metadata, scripts, and dependencies.
```json
{
  "name": "urbanhub-api",
  "version": "1.0.0",
  "description": "Simple Express API with CI/CD",
  "main": "index.js",
  "scripts": {
    "test": "jest"
  },
  "dependencies": {
    "express": "^5.0.0"
  },
  "devDependencies": {
    "jest": "^29.0.0",
    "supertest": "^7.0.0"
  }
}
```
Explanation
- `main` defines the application entry point.
- `scripts.test` standardizes how tests are run.
- `express` is used for the API.
- `jest` is the test framework.
- `supertest` is used to test HTTP endpoints without starting a real external server.
This file is central to both local development and CI execution.
4. Test Layer — tests/app.test.js
The tests validate the behavior of the API.
```javascript
const request = require('supertest');
const app = require('../index');

describe('API tests', () => {
  test('GET / should return 200', async () => {
    const res = await request(app).get('/');
    expect(res.statusCode).toBe(200);
    expect(res.body.status).toBe('ok');
  });

  test('GET /health should return 200', async () => {
    const res = await request(app).get('/health');
    expect(res.statusCode).toBe(200);
  });
});
```
Explanation
- `supertest` sends requests directly to the Express app.
- The first test validates the root endpoint.
- The second test validates the health endpoint.
- These tests ensure that the service responds correctly before deployment.
In CI/CD, this stage prevents a broken application from reaching deployment.
5. Dockerfile — Containerization Explained
The Dockerfile defines how the application image is built.
```dockerfile
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
ENV APP_ENV=production
EXPOSE 3000
CMD ["node", "index.js"]
```
Line-by-line explanation
FROM node:20-alpine
Uses a lightweight Node.js image.
- `node:20` provides the runtime.
- `alpine` keeps the image small and efficient.
WORKDIR /app
Defines /app as the working directory inside the container.
COPY package*.json ./
Copies package.json and package-lock.json first.
This improves Docker layer caching because dependencies do not need to be reinstalled on every code change.
RUN npm ci
Installs dependencies in a reproducible way.
npm ci is preferred in CI/CD because it strictly follows the lock file and is more deterministic than npm install.
COPY . .
Copies the source code into the image.
ENV APP_ENV=production
Defines the runtime environment variable.
EXPOSE 3000
Documents that the container listens on port 3000.
CMD ["node", "index.js"]
Starts the application when the container runs.
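One detail worth pairing with `COPY . .` is a `.dockerignore` file. It is not shown in the project tree, so treat this as a suggested addition: without it, a local `node_modules` folder, the `.git` history, and any local `.env` file would all be copied into the build context. A minimal version could look like:

```
# .dockerignore (suggested, not part of the original project)
node_modules
.git
.npm
*.log
.env
```

Excluding `node_modules` keeps the `npm ci` layer authoritative, and excluding `.env` keeps host configuration and secrets out of the image.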
6. Docker Compose — compose.yml
Docker Compose is used to run the API in a structured way.
```yaml
services:
  api:
    image: urbanhub-api:latest
    env_file:
      - .env
    ports:
      - "3000:3000"
    restart: unless-stopped
```
Explanation
services
Defines the list of services managed by Compose.
api
The service name for the application container.
image: urbanhub-api:latest
Uses the Docker image produced during the build phase.
env_file
Loads environment variables from a dedicated file.
This helps separate configuration from source code.
ports
Maps port 3000 on the host to port 3000 in the container.
restart: unless-stopped
Restarts the service automatically unless it is manually stopped.
For a simple API, this is often enough. If the app used a database, more services could be added here.
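As a sketch of that extension, here is one way a database could be attached. The `db` service, image choice, credentials, and volume name are hypothetical, not part of the project:

```yaml
services:
  api:
    image: urbanhub-api:latest
    env_file:
      - .env
    ports:
      - "3000:3000"
    restart: unless-stopped
    depends_on:
      - db          # start the database before the API container

  db:
    image: postgres:16-alpine
    environment:
      POSTGRES_PASSWORD: example   # placeholder; load from .env in practice
    volumes:
      - db_data:/var/lib/postgresql/data   # persist data across restarts

volumes:
  db_data:
```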
7. CI/CD Pipeline — .gitlab-ci.yml
The pipeline is organized into four stages:
```yaml
stages:
  - build
  - test
  - security
  - deploy
```
The execution flow is:
Build → Test → Security → Deploy
This gives a clear delivery order:
- prepare the app
- validate behavior
- check security
- deploy only if previous stages succeed
8. Global Variables
```yaml
variables:
  APP_DIR: "/home/gitlab-runner/node_api"
  NPM_CONFIG_CACHE: "$CI_PROJECT_DIR/.npm"
```
Explanation
- `APP_DIR` defines the target deployment directory on the runner host.
- `NPM_CONFIG_CACHE` speeds up package installation by caching dependencies.
These variables make the pipeline cleaner and easier to maintain.
9. Build Stage
```yaml
build:
  stage: build
  script:
    - npm ci
    - node -c index.js
  rules:
    - if: '$CI_COMMIT_BRANCH == "develop"'
    - if: '$CI_COMMIT_BRANCH == "main"'
```
Explanation
- `npm ci` installs dependencies.
- `node -c index.js` checks the JavaScript syntax without executing the file (`-c` is shorthand for `--check`).
- The stage runs on both `develop` and `main`.
This stage catches basic syntax and installation issues early.
10. Test Stage
```yaml
unit_tests:
  stage: test
  script:
    - npm ci
    - npm test
  rules:
    - if: '$CI_COMMIT_BRANCH == "develop"'
    - if: '$CI_COMMIT_BRANCH == "main"'
```
Explanation
- `npm ci` ensures the environment is clean and reproducible.
- `npm test` runs the Jest suite.
- The pipeline stops here if the tests fail.
This is the most important validation gate before deployment.
11. Security Stage
```yaml
security_scan:
  stage: security
  script:
    - npm ci
    - npm audit --audit-level=high
  rules:
    - if: '$CI_COMMIT_BRANCH == "develop"'
    - if: '$CI_COMMIT_BRANCH == "main"'
```
Explanation
- `npm audit` checks project dependencies for known vulnerabilities.
- `--audit-level=high` makes the job fail only when vulnerabilities of high or critical severity are found.
- This introduces a DevSecOps mindset directly into the pipeline.
For a more advanced pipeline, this stage could be complemented by container scanning tools such as Trivy.
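As an illustration, container scanning could be added as an extra job in the security stage. This is a sketch only: it assumes Trivy is installed on the runner and that the image has already been built locally under the `urbanhub-api:latest` tag:

```yaml
container_scan:
  stage: security
  script:
    # Fail the job when HIGH or CRITICAL vulnerabilities exist in the image.
    - trivy image --exit-code 1 --severity HIGH,CRITICAL urbanhub-api:latest
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
```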
12. Deployment Stage
```yaml
deploy:
  stage: deploy
  script:
    - echo "Deploying into $APP_DIR"
    - |
      if [ ! -d "$APP_DIR/.git" ]; then
        git clone "$CI_REPOSITORY_URL" "$APP_DIR"
      fi
    - cd "$APP_DIR"
    - git checkout main
    - git pull origin main
    - |
      if [ ! -f .env ]; then
        cp .env.example .env
      fi
    - docker compose down || true
    - docker build --network=host -t urbanhub-api:latest .
    - docker compose up -d
    - sleep 5
    - docker ps
    - docker logs $(docker ps -q --filter "name=api") || true
    - curl --fail http://localhost:3000/ || (echo "App not reachable" && exit 1)
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
```
Explanation
This stage performs several actions:
Clone the project if missing
If the deployment directory does not already contain the Git repository, it is cloned.
Pull the latest code
The deployment always runs against the latest main branch state.
Ensure environment configuration exists
If .env does not exist, it is generated from .env.example.
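The article does not show `.env.example`, but based on the variables the app actually reads (`APP_ENV` and `PORT` in `index.js`), a plausible version would be:

```
# .env.example — hypothetical contents, inferred from index.js
APP_ENV=production
PORT=3000
```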
Stop old containers
docker compose down || true ensures the previous version is stopped cleanly.
Rebuild the image
The latest source code is rebuilt into a fresh Docker image.
Start the new container
docker compose up -d launches the service in detached mode.
Verify deployment
The pipeline checks:
- running containers
- container logs
- HTTP accessibility via `curl`
This makes deployment not only automatic, but also self-validated.
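The `sleep 5` followed by `curl` could also be pushed into Compose itself as a container healthcheck. The fragment below is a sketch; it assumes `wget` is available in the image, which holds for the busybox-based `node:20-alpine`:

```yaml
services:
  api:
    image: urbanhub-api:latest
    healthcheck:
      # Probe the dedicated /health endpoint; a non-zero exit marks the
      # container unhealthy after the configured retries.
      test: ["CMD", "wget", "-qO-", "http://localhost:3000/health"]
      interval: 30s
      timeout: 3s
      retries: 3
```

With this in place, `docker ps` reports the health status directly, instead of the pipeline relying on a fixed sleep.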
13. Branch-Based Execution Strategy
The pipeline uses branch rules.
Build, test, and security
Run on:
- `develop`
- `main`
This allows validation both in integration and stable branches.
Deploy
Runs only on:
- `main`
This protects deployment from accidental pushes on development branches.
It creates a simple but effective promotion model:
- `develop` for validation
- `main` for delivery
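Since the build, test, and security jobs all repeat the same two rules, they can share them through a standard YAML anchor, which GitLab CI resolves within the same file. A sketch of one way to deduplicate:

```yaml
# Hidden job holding the shared branch rules.
.branch_rules: &branch_rules
  rules:
    - if: '$CI_COMMIT_BRANCH == "develop"'
    - if: '$CI_COMMIT_BRANCH == "main"'

build:
  stage: build
  <<: *branch_rules   # merge the shared rules into this job
  script:
    - npm ci
    - node -c index.js
```

Changing the promotion model (for example, adding a `staging` branch) then means editing one block instead of three.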
14. DevSecOps Integration
Security is not treated as an afterthought.
This pipeline integrates security by:
- testing the application before deployment
- auditing dependencies for known vulnerabilities
- isolating the runtime inside containers
- verifying the deployed service automatically
This approach reduces the chance of shipping broken or unsafe code.
15. Strengths of This Pipeline
This CI/CD implementation provides:
- clear stage separation
- reproducible dependency installation with `npm ci`
- automated syntax validation
- automated API testing
- built-in dependency security audit
- controlled deployment on the main branch
- runtime verification after deployment
It is simple, but complete enough to represent a real DevOps workflow.
16. Limitations
Like any basic pipeline, this one has some limitations:
- no staging environment
- no rollback mechanism
- no container vulnerability scan
- no performance testing
- single-node deployment model
These are acceptable for a lightweight project, but could be extended in a production context.
17. Possible Improvements
This pipeline could be improved with:
- ESLint integration for code quality
- SonarQube or CodeQL for advanced static analysis
- Trivy for Docker image scanning
- staging and production environments
- monitoring and alerting
- rollback support in case of failed deployment
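For instance, rollback support could start as a manual job that re-points the service at a previously kept image. This is a sketch only; it assumes the deploy stage is extended to tag the outgoing image as `urbanhub-api:previous` before rebuilding:

```yaml
rollback:
  stage: deploy
  when: manual          # triggered by hand from the GitLab UI
  script:
    - cd "$APP_DIR"
    # Re-point :latest at the previously kept image and restart the service.
    - docker tag urbanhub-api:previous urbanhub-api:latest
    - docker compose up -d
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
```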
Conclusion
This JavaScript CI/CD pipeline demonstrates a complete delivery chain for a Node.js API.
It combines:
- source validation
- automated testing
- dependency security checks
- automated deployment
The key strength of this setup is not complexity, but reliability.
A good pipeline is not the one with the most tools.
It is the one that delivers software in a predictable, controlled, and trustworthy way.