Understanding DevOps: Bridging Development and Operations
If you have ever built an app and thought:
- How do I make sure it runs the same everywhere?
- How can I test changes automatically?
- How do I go from my laptop to production safely?
…then this guide is for you.
In this hands-on tutorial, we will walk step-by-step through setting up a complete DevOps pipeline, from writing simple Node.js code, all the way to automated CI/CD deployments with Docker and Kubernetes.
By the end, you will know how to:
- Set up Git, Docker, and GitHub Actions
- Write and run automated tests
- Build lightweight, secure Docker images
- Use CI/CD pipelines to test, build, and deploy automatically
- Deploy to staging (for testing) and production (for real users) with confidence
Note: No prior DevOps experience required, just a willingness to code, learn, and experiment.
Hands-On Practice: Preparing Your DevOps Workspace
Before You Begin: Install Required Tools
Before starting, let's prepare your machine with the right tools:
1. Node.js (v18 or higher; LTS v20 recommended as of 2025)
- Download: nodejs.org
- Choose the LTS version (20.x)
- Verify installation:
node --version
npm --version
2. Git (latest stable version)
- Download: git-scm.com/downloads
- Pick your operating system
- Verify installation:
git --version
3. Docker Desktop (latest version)
- Download: docker.com/products/docker-desktop
- Install and start Docker Desktop
- Verify installation:
docker --version
docker-compose --version
4. GitHub Account
- Sign up: github.com
- Needed for hosting your code and setting up CI/CD pipelines
5. Code Editor (optional but recommended)
- Suggested: Visual Studio Code
- Or use any editor you prefer (Sublime Text, Atom, etc.)
Final Check: Make Sure Everything Works
Run these commands in your terminal:
node --version # Should show v18.x+ or v20.x+
npm --version # Should show 9.x+ or 10.x+
git --version # Should show 2.34+
docker --version # Should show 24.x+
Step 1: Set Up Git for Version Control
What this step does:
This tells Git who you are (your name and email) so it can label your changes. It also creates a folder for your project and starts tracking it with Git.
One-time Git Setup
Run these commands (replace with your name and email):
git config --global user.name "Your Name"
git config --global user.email "you@example.com"
git config --global init.defaultBranch main
Create and Initialize Your Project
Now, let's create a new folder and start Git inside it by running the following commands:
mkdir devops-project
cd devops-project
git init
Step 2: Build a Node.js Web App
What this step does:
We will create a simple Node.js app that serves web pages and API endpoints. This will be the foundation for our DevOps journey.
1. Initialize Node.js Project
What it does:
Creates a package.json file that describes your project and manages dependencies.
# Create package.json with default settings
npm init -y
After running this, you will see a new package.json file in your project folder.
2. Update package.json
What it does:
Adds scripts, metadata, and development tools to your project setup.
Since npm init -y already created package.json for you, simply open it in your editor. (If the file did not exist, you could create it first with touch package.json.)
Replace its contents with the following:
{
  "name": "my-devops-project",
  "version": "1.0.0",
  "description": "DevOps learning project with Node.js",
  "main": "app.js",
  "scripts": {
    "start": "node app.js",
    "test": "jest",
    "dev": "node app.js",
    "lint": "eslint ."
  },
  "keywords": ["devops", "nodejs", "docker"],
  "author": "Your Name",
  "license": "MIT",
  "engines": {
    "node": ">=18.0.0"
  },
  "devDependencies": {
    "jest": "^29.7.0",
    "eslint": "^8.57.0",
    "supertest": "^7.1.4"
  }
}
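Note that the engines field only documents the requirement; Node will not enforce it on its own. As a quick illustration, the major-version comparison behind a range like ">=18.0.0" can be sketched in a few lines (the helper name is ours, not part of npm):

```javascript
// Simplified semver check: does this Node version satisfy a ">=N" major bound?
function satisfiesMinMajor(version, minMajor) {
  return Number(version.split('.')[0]) >= minMajor;
}

// Check the currently running Node against the "engines" minimum above.
console.log(satisfiesMinMajor(process.versions.node, 18));
console.log(satisfiesMinMajor('16.20.0', 18)); // false
```

Real tooling (npm, nvm) does the full semver comparison for you; this is just the idea behind it.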
3. Create the Main Application File
What it does:
Creates the main web server (app.js) that:
- Listens on port 3000
- Serves routes: /, /health, /info, /metrics
- Adds security headers
- Supports graceful shutdown
- Exports the server for testing
Create the main application file:
touch app.js
Copy this into app.js:
const http = require('http');
const url = require('url');

const port = process.env.PORT || 3000;
const environment = process.env.NODE_ENV || 'development';

let requestCount = 0;
const startTime = Date.now();

const server = http.createServer((req, res) => {
  requestCount++;
  const timestamp = new Date().toISOString();
  const { pathname } = url.parse(req.url, true);
  console.log(`${timestamp} - ${req.method} ${pathname} - ${req.headers['user-agent'] || 'Unknown'}`);

  // CORS headers
  res.setHeader('Access-Control-Allow-Origin', '*');
  res.setHeader('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE');
  res.setHeader('Access-Control-Allow-Headers', 'Content-Type');

  // Security headers
  res.setHeader('X-Content-Type-Options', 'nosniff');
  res.setHeader('X-Frame-Options', 'DENY');
  res.setHeader('X-XSS-Protection', '1; mode=block');

  switch (pathname) {
    case '/':
      res.statusCode = 200;
      res.setHeader('Content-Type', 'text/html');
      res.end(`<h1>I'm Getting Better at DevOps, Yay!</h1>`);
      break;
    case '/health':
      res.statusCode = 200;
      res.setHeader('Content-Type', 'application/json');
      res.end(JSON.stringify({ status: 'healthy', uptime: process.uptime() }, null, 2));
      break;
    case '/info':
      res.statusCode = 200;
      res.setHeader('Content-Type', 'application/json');
      res.end(JSON.stringify({ platform: process.platform, pid: process.pid }, null, 2));
      break;
    case '/metrics':
      res.statusCode = 200;
      res.setHeader('Content-Type', 'text/plain');
      res.end(`# HELP http_requests_total Total HTTP requests
# TYPE http_requests_total counter
http_requests_total ${requestCount}`);
      break;
    default:
      res.statusCode = 404;
      res.setHeader('Content-Type', 'application/json');
      res.end(JSON.stringify({ error: 'Not Found' }, null, 2));
  }
});

// Graceful shutdown: close the server before exiting
process.on('SIGTERM', () => server.close(() => process.exit(0)));
process.on('SIGINT', () => server.close(() => process.exit(0)));

server.listen(port, () => {
  console.log(`Server running at http://localhost:${port}/`);
});

module.exports = server;
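Before moving on, it helps to see the routing logic in isolation. The switch in app.js is just a mapping from pathname to response metadata; condensed into a standalone function (illustrative only, not part of the app), it looks like this:

```javascript
// Condensed sketch of the dispatch logic in app.js:
// each known pathname maps to a status code and content type.
function route(pathname) {
  switch (pathname) {
    case '/':        return { status: 200, contentType: 'text/html' };
    case '/health':  return { status: 200, contentType: 'application/json' };
    case '/info':    return { status: 200, contentType: 'application/json' };
    case '/metrics': return { status: 200, contentType: 'text/plain' };
    default:         return { status: 404, contentType: 'application/json' };
  }
}

console.log(route('/health')); // { status: 200, contentType: 'application/json' }
console.log(route('/nope'));   // { status: 404, contentType: 'application/json' }
```

Keeping the mapping this explicit is what makes the endpoints easy to test one by one in the next step.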
4. Install Dependencies
What it does:
Installs the testing and linting tools your app will use.
# Install testing and dev tools
npm install --save-dev jest eslint supertest

# Install all dependencies
npm install
You will now see:
- A node_modules/ folder with installed packages
- A package-lock.json file locking dependency versions
Step 3: Create Proper Tests
Now that your Node.js app is ready, it's important to make sure it works every time you make changes. Automated testing lets you check your app's endpoints without doing it manually.
1. Create a tests folder to store all your test files in one place.
2. Write test cases: each test checks that a specific route (/, /health, /info, /metrics) works as expected.
3. Configure Jest: tell Jest how to run your tests and where to store coverage reports.
By the end of this step, you will have an automated system that validates your app every time you make a change.
Creating Proper Tests
What this step does:
Sets up automated testing so your application is checked every time you make changes.
1. Create Tests Directory and File
# Create a folder for your tests
mkdir tests

# Create the main test file
touch tests/app.test.js
2. Write Test Cases
What it does:
Checks if your web server endpoints are working correctly.
Copy this code into tests/app.test.js:
const request = require('supertest');
const server = require('../app');

describe('App Endpoints', () => {
  afterAll(() => {
    server.close();
  });

  test('GET / should return welcome page', async () => {
    const response = await request(server).get('/');
    expect(response.status).toBe(200);
    expect(response.text).toContain('DevOps');
  });

  test('GET /health should return health status', async () => {
    const response = await request(server).get('/health');
    expect(response.status).toBe(200);
    expect(response.body.status).toBe('healthy');
    expect(typeof response.body.uptime).toBe('number');
  });

  test('GET /info should return system info', async () => {
    const response = await request(server).get('/info');
    expect(response.status).toBe(200);
    expect(response.body.platform).toBeDefined();
    expect(response.body.pid).toBeDefined();
  });

  test('GET /metrics should return prometheus metrics', async () => {
    const response = await request(server).get('/metrics');
    expect(response.status).toBe(200);
    expect(response.text).toContain('http_requests_total');
  });

  test('GET /nonexistent should return 404', async () => {
    const response = await request(server).get('/nonexistent');
    expect(response.status).toBe(404);
    expect(response.body.error).toBe('Not Found');
  });
});
3. Configure Jest
What it does:
Tells Jest how to run your tests and generate coverage reports.
# Create Jest configuration file
touch jest.config.js
Copy this into jest.config.js:
module.exports = {
  testEnvironment: 'node',
  collectCoverage: true,
  coverageDirectory: 'coverage',
  testMatch: ['**/tests/**/*.test.js'],
  verbose: true
};
Step 4: Add GitHub Actions for CI/CD
Instead of manually running tests and building Docker images, you can set up GitHub Actions. This creates a pipeline that automatically runs every time you push code to GitHub.
Here's what happens:
- Tests run automatically: your app is linted, tested, and security-checked.
- Docker image builds automatically: when changes are pushed to main, GitHub builds and pushes your Docker image to GitHub's Container Registry (GHCR).
This ensures your app is always tested and ready to deploy without extra manual steps.
GitHub Actions CI/CD
What this step does:
Sets up an automated workflow that runs tests and builds Docker images every time you push to GitHub.
1. Create Workflow Directory
# Create the GitHub Actions directory structure
mkdir -p .github/workflows
2. Create CI/CD Pipeline File
# Create the workflow file
touch .github/workflows/ci.yml
Copy this into .github/workflows/ci.yml:
name: CI/CD Pipeline

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main ]

env:
  NODE_VERSION: '20'
  REGISTRY: ghcr.io
  IMAGE_NAME: ${{ github.repository }}

jobs:
  test:
    name: Run Tests
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: [18, 20]
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Setup Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v4
        with:
          node-version: ${{ matrix.node-version }}
          cache: 'npm'
      - name: Install dependencies
        run: npm ci
      - name: Run linting
        run: npx eslint . --ext .js --ignore-pattern node_modules/
        continue-on-error: true
      - name: Run tests
        run: npm test
      - name: Run security audit
        run: npm audit --audit-level=moderate || true

  build:
    name: Build Docker Image
    runs-on: ubuntu-latest
    needs: test
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    permissions:
      contents: read
      packages: write
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
      - name: Log in to Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Extract metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          tags: |
            type=ref,event=branch
            type=sha,prefix={{branch}}-
            type=raw,value=latest,enable={{is_default_branch}}
      - name: Build and push Docker image
        uses: docker/build-push-action@v5
        with:
          context: .
          platforms: linux/amd64,linux/arm64
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=gha
          cache-to: type=gha,mode=max
Step 5: Create a Dockerfile
A Dockerfile is like a recipe that tells Docker how to build a container image for your app. With this, you can run your app anywhere on your laptop, a server, or in the cloud.
Here's what this Dockerfile does:
- Uses a multi-stage build so the final image is small and efficient
- Installs curl for health checks
- Runs the app as a non-root user for better security
- Configures health checks to make sure the app is running
- Ensures proper file permissions and clean setup
Creating a Dockerfile
What this step does:
Defines instructions for Docker to build a portable container image of your Node.js application.
1. Create Dockerfile
# Create the Dockerfile (no extension needed)
touch Dockerfile
2. Add Dockerfile Content
Copy the code below into your Dockerfile:
# Multi-stage build for an optimized image
FROM node:20-alpine AS dependencies
# Update packages for security
RUN apk update && apk upgrade --no-cache
WORKDIR /app
# Copy package files first for better caching
COPY package*.json ./
# Install only production dependencies
RUN npm ci --omit=dev && npm cache clean --force
# Production stage
FROM node:20-alpine AS production
# Update packages and install necessary tools
RUN apk update && apk upgrade --no-cache && \
apk add --no-cache curl dumb-init && \
rm -rf /var/cache/apk/*
# Create non-root user with proper permissions
RUN addgroup -g 1001 -S nodejs && \
adduser -S nodeuser -u 1001 -G nodejs
WORKDIR /app
# Copy dependencies from previous stage with proper ownership
COPY --from=dependencies --chown=nodeuser:nodejs /app/node_modules ./node_modules
# Copy application code with proper ownership
COPY --chown=nodeuser:nodejs package*.json ./
COPY --chown=nodeuser:nodejs app.js ./
# Switch to non-root user
USER nodeuser
# Expose port
EXPOSE 3000
# Health check with proper timing for Node.js startup
HEALTHCHECK --interval=30s --timeout=10s --start-period=15s --retries=3 \
CMD curl -f http://localhost:3000/health || exit 1
# Use dumb-init for proper signal handling in containers
ENTRYPOINT ["dumb-init", "--"]
# Start application
CMD ["npm", "start"]
Step 6: Essential Configuration Files
Configuration files are like "rules" for your project. They tell tools what to ignore, how to check your code, and what settings to use. Adding these makes your project easier to maintain and safer to share with others.
Here's what we will add:
- .dockerignore: tells Docker which files to skip when building the container.
- .gitignore: tells Git which files not to track.
- .env.example: shows what environment variables the project needs (without secrets).
- .eslintrc.js: sets up ESLint to keep your code clean and consistent.
Essential Configuration Files
What this step does:
Adds important project config files to ignore unnecessary stuff, define environment settings, and enforce clean coding standards.
1. Create .dockerignore
Tells Docker which files/folders to ignore when building images.
touch .dockerignore
Copy this inside .dockerignore:
node_modules
npm-debug.log*
.git
.github
.env
.env.local
.env.*.local
logs
*.log
coverage
.nyc_output
.vscode
.idea
*.swp
*.swo
.DS_Store
Thumbs.db
README.md
tests/
jest.config.js
.eslintrc*
2. Create .gitignore
Tells Git which files not to track in version control. Create it:
touch .gitignore
Copy this inside .gitignore:
# Dependencies
node_modules/
npm-debug.log*
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Coverage
coverage/
.nyc_output
# Environment variables
.env
.env.local
.env.*.local
# Logs
logs
*.log
# IDE
.vscode/
.idea/
*.swp
*.swo
# OS
.DS_Store
Thumbs.db
3. Create .env.example
Shows other developers what environment variables your application needs, without exposing actual secrets.
touch .env.example
Copy this inside:
# Server Configuration
PORT=3000
NODE_ENV=production
# Logging
LOG_LEVEL=info
4. Create .eslintrc.js
Configures ESLint to check code quality and style.
touch .eslintrc.js
Copy the code inside:
module.exports = {
  env: {
    node: true,
    es2021: true,
    jest: true
  },
  extends: ['eslint:recommended'],
  parserOptions: {
    ecmaVersion: 12,
    sourceType: 'module'
  },
  rules: {
    'no-console': 'off',
    'no-unused-vars': ['error', { 'argsIgnorePattern': '^_' }]
  }
};
Step 7: Docker Compose for Development
Docker Compose is like a shortcut remote control for your app. Instead of typing long docker commands, you define everything in one file, and then just run:
docker-compose up
This makes development faster and keeps things consistent for every developer on the project.
Docker Compose for Development
What this step does:
Creates a docker-compose.yml file that makes it easy to run your app (and later, databases) with just one command.
1. Create Docker Compose file
touch docker-compose.yml
2. Copy and paste the following code:
version: '3.8'

services:
  app:
    build: .
    ports:
      - "3000:3000"
    environment:
      - NODE_ENV=development
      - PORT=3000
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:3000/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 10s
- Defines your app as a service
- Maps port 3000 so you can visit http://localhost:3000 in your browser
- Sets environment variables, telling the app to run in development mode
- Adds a health check to ensure the app is running properly
- Auto-restarts: if the app crashes, Docker brings it back up
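One detail worth understanding: Docker only flips a container to unhealthy after `retries` consecutive failed probes. A rough upper bound on how long that takes can be computed from the settings above (this is an approximation for intuition, not Docker's exact accounting):

```javascript
// Rough worst-case seconds before Docker reports "unhealthy",
// from the healthcheck settings in docker-compose.yml above.
// Approximation: probes start every `interval` seconds, each failing probe
// may take up to `timeout` seconds, and `startPeriod` failures don't count.
function unhealthyAfterSeconds({ interval, timeout, retries, startPeriod }) {
  return startPeriod + retries * (interval + timeout);
}

console.log(unhealthyAfterSeconds({ interval: 30, timeout: 10, retries: 3, startPeriod: 10 })); // 130
```

So with these settings, expect up to about two minutes before a crashed health endpoint is reported, which is why short intervals are common in development.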
Step 8: Commands to Run Everything
Now that your app, tests, Dockerfile, and CI/CD are ready, it's time to run everything locally and make sure it works:
- Install and test with npm: install dependencies, run your automated tests, and start the app.
- Run with Docker: build a Docker image and run it in an isolated container.
- Run with Docker Compose: start the app (and other services) together easily.
After this step, your app should be fully functional at http://localhost:3000.
Commands to Run Everything
What this step does:
Shows how to run and test your Node.js application locally before deploying it.
1. Install and Test Locally
# Install dependencies
npm install

# Run tests
npm test

# Start the application
npm start

# In a new terminal, test the endpoints
curl http://localhost:3000/          # Homepage
curl http://localhost:3000/health    # Health check JSON
curl http://localhost:3000/info     # System info JSON
curl http://localhost:3000/metrics  # Prometheus metrics
Expected results:
- Tests pass (green checkmarks)
- Server logs: Server running at http://localhost:3000/
- Health endpoint returns JSON: {"status":"healthy","uptime":...}
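The /metrics output uses the Prometheus text exposition format: lines starting with # are HELP/TYPE comments, and each remaining line is a metric name followed by its value. A tiny parser makes the format concrete (a sketch for illustration; real scrapers like Prometheus also handle labels, timestamps, and more):

```javascript
// Parse simple Prometheus text output into { metricName: numericValue }.
function parseMetrics(text) {
  const metrics = {};
  for (const line of text.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith('#')) continue; // skip HELP/TYPE comments
    const [name, value] = trimmed.split(/\s+/);
    metrics[name] = Number(value);
  }
  return metrics;
}

const sample = `# HELP http_requests_total Total HTTP requests
# TYPE http_requests_total counter
http_requests_total 42`;

console.log(parseMetrics(sample)); // { http_requests_total: 42 }
```

This is the same text your curl against /metrics returns, which is what a monitoring system would scrape periodically.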
2. Docker Commands
What these commands do: Build your application into a Docker container and run it in an isolated environment.
How to run Docker commands:
# Build the Docker image
docker build -t my-devops-app:latest .

# Run the container
docker run -d \
  --name my-devops-container \
  -p 3000:3000 \
  --restart unless-stopped \
  my-devops-app:latest

# Check container status
docker ps
docker logs my-devops-container

# Test the health check
curl http://localhost:3000/health

# Stop and remove the container
docker stop my-devops-container
docker rm my-devops-container

# Start all services in docker-compose.yml
docker-compose up -d

# View logs from all services
docker-compose logs -f

# Stop all services and clean up
docker-compose down
- Containers build and start automatically
- Logs show your application initializing
- Application accessible at http://localhost:3000
Step 9: CI/CD with Deployment
What this step does:
Pushes your local project to GitHub so the CI/CD pipeline (from Step 4) can automatically run tests, build Docker images, and deploy.
1. Initial Commit
Take a snapshot of your full project and save it in Git history:
# Add all files to the Git staging area
git add .

# Create your first commit with a descriptive message
git commit -m "Initial commit: Complete DevOps setup with working CI/CD"
What this does: takes a snapshot of all your files, saves it in Git history, and gets your project ready to push.
2. Connect to GitHub
First, create a new repository on GitHub.com.
Then link your local repo to GitHub:
# Set main as the default branch
git branch -M main

# Connect to your GitHub repository (replace yourusername with your GitHub username)
git remote add origin https://github.com/yourusername/my-devops-project.git

# Push your code to GitHub for the first time
git push -u origin main
Things to know before this step:
- Create a new repo on GitHub before running these commands
- Copy the repo URL from GitHub (HTTPS or SSH)
- Replace yourusername with your actual GitHub username
What You will See:
- Your project files appear in your GitHub repository
- GitHub Actions automatically triggers the CI/CD pipeline you configured earlier
- Tests, linting, security checks, and Docker builds all run in the cloud
This is where everything comes alive:
Code → GitHub → Automated CI/CD → Ready for deployment
Step 10: Continuous Deployment (CD) Setup
What this step does:
Extends your CI pipeline to automatically deploy your application to staging and production environments when code is pushed.
Enhanced CI/CD Pipeline with Deployment
This enhanced pipeline will:
- Run tests on multiple Node.js versions
- Build Docker images for multiple platforms (AMD64 + ARM64)
- Deploy to staging when pushing to the develop branch
- Deploy to production when pushing to the main branch
- Run security scans on production deployments
Update Your GitHub Actions Workflow
Open .github/workflows/ci.yml and replace it with this:
name: CI/CD Pipeline

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main ]

env:
  NODE_VERSION: '20'
  REGISTRY: ghcr.io
  IMAGE_NAME: ${{ github.repository }}

jobs:
  test:
    name: Run Tests
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: [18, 20]
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Setup Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v4
        with:
          node-version: ${{ matrix.node-version }}
          cache: 'npm'
      - name: Install dependencies
        run: npm ci
      - name: Run linting
        run: npx eslint . --ext .js --ignore-pattern node_modules/
        continue-on-error: true
      - name: Run tests
        run: npm test
      - name: Run security audit
        run: npm audit --audit-level=moderate || true

  build:
    name: Build Docker Image
    runs-on: ubuntu-latest
    needs: test
    if: github.event_name == 'push'
    permissions:
      contents: read
      packages: write
    outputs:
      image-digest: ${{ steps.build.outputs.digest }}
      image-tag: ${{ steps.meta.outputs.tags }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
      - name: Log in to Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Extract metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          tags: |
            type=ref,event=branch
            type=sha,prefix={{branch}}-
            type=raw,value=latest,enable={{is_default_branch}}
      - name: Build and push Docker image
        id: build
        uses: docker/build-push-action@v5
        with:
          context: .
          platforms: linux/amd64,linux/arm64
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

  deploy-staging:
    name: Deploy to Staging
    runs-on: ubuntu-latest
    needs: build
    if: github.ref == 'refs/heads/develop'
    environment: staging
    steps:
      - name: Deploy to staging
        run: |
          echo "Deploying to staging environment..."
          echo "Image: ${{ needs.build.outputs.image-tag }}"
          echo "Would deploy to staging server here"
          # In a real scenario, you'd use:
          # - kubectl apply -f k8s/staging/
          # - docker-compose -f docker-compose.staging.yml up -d
          # - ssh staging-server "docker pull $IMAGE && docker-compose up -d"

  deploy-production:
    name: Deploy to Production
    runs-on: ubuntu-latest
    needs: build
    if: github.ref == 'refs/heads/main'
    environment: production
    steps:
      - name: Deploy to production
        run: |
          echo "Deploying to production environment..."
          echo "Image: ${{ needs.build.outputs.image-tag }}"
          echo "Image digest: ${{ needs.build.outputs.image-digest }}"
          echo "Would deploy to production server here"
          # In a real scenario, you'd use:
          # - kubectl apply -f k8s/production/
          # - terraform apply
          # - ansible-playbook deploy.yml

  security-scan:
    name: Security Scan
    runs-on: ubuntu-latest
    needs: build
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    steps:
      - name: Run Trivy vulnerability scanner
        uses: aquasecurity/trivy-action@master
        with:
          image-ref: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:latest
          format: 'sarif'
          output: 'trivy-results.sarif'
        env:
          TRIVY_USERNAME: ${{ github.actor }}
          TRIVY_PASSWORD: ${{ secrets.GITHUB_TOKEN }}
      # Optional: upload scan results to the GitHub Security tab
      # - name: Upload Trivy scan results
      #   uses: github/codeql-action/upload-sarif@v3
      #   with:
      #     sarif_file: 'trivy-results.sarif'
Commit and push your workflow:
git add .github/workflows/ci.yml
git commit -m "Enhance CI/CD pipeline with staging and production deployment"
git push origin main
Kubernetes Deployment Configurations
We will now define how the app should run in staging and production.
Staging Deployment
mkdir -p k8s/staging
touch k8s/staging/deployment.yml
Copy this content into k8s/staging/deployment.yml:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: devops-app-staging
  namespace: staging
spec:
  replicas: 2
  selector:
    matchLabels:
      app: devops-app
      environment: staging
  template:
    metadata:
      labels:
        app: devops-app
        environment: staging
    spec:
      containers:
        - name: app
          image: ghcr.io/yourusername/my-devops-project:develop
          ports:
            - containerPort: 3000
          env:
            - name: NODE_ENV
              value: "staging"
            - name: PORT
              value: "3000"
          livenessProbe:
            httpGet:
              path: /health
              port: 3000
            initialDelaySeconds: 30
            periodSeconds: 10
          readinessProbe:
            httpGet:
              path: /health
              port: 3000
            initialDelaySeconds: 5
            periodSeconds: 5
---
apiVersion: v1
kind: Service
metadata:
  name: devops-app-service-staging
  namespace: staging
spec:
  selector:
    app: devops-app
    environment: staging
  ports:
    - protocol: TCP
      port: 80
      targetPort: 3000
  type: LoadBalancer
Production Deployment
mkdir -p k8s/production
touch k8s/production/deployment.yml
Copy this content into k8s/production/deployment.yml:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: devops-app-production
  namespace: production
spec:
  replicas: 3
  selector:
    matchLabels:
      app: devops-app
      environment: production
  template:
    metadata:
      labels:
        app: devops-app
        environment: production
    spec:
      containers:
        - name: app
          image: ghcr.io/yourusername/my-devops-project:latest
          ports:
            - containerPort: 3000
          env:
            - name: NODE_ENV
              value: "production"
            - name: PORT
              value: "3000"
          resources:
            requests:
              memory: "128Mi"
              cpu: "100m"
            limits:
              memory: "256Mi"
              cpu: "200m"
          livenessProbe:
            httpGet:
              path: /health
              port: 3000
            initialDelaySeconds: 30
            periodSeconds: 10
          readinessProbe:
            httpGet:
              path: /health
              port: 3000
            initialDelaySeconds: 5
            periodSeconds: 5
---
apiVersion: v1
kind: Service
metadata:
  name: devops-app-service-production
  namespace: production
spec:
  selector:
    app: devops-app
    environment: production
  ports:
    - protocol: TCP
      port: 80
      targetPort: 3000
  type: LoadBalancer
- Push to develop → staging deployed
- Push to main → production deployed
- Automated Docker builds, tests, and security scans
- Kubernetes-powered scalability with replicas, probes, and resource limits
At this point, we have a full DevOps pipeline: code → CI → Docker → Kubernetes → CD.
Step 11: Complete Deployment Workflow
What this step does:
Shows you how to use the full CI/CD pipeline with a proper branching strategy for staging and production deployments.
Branch-based Deployment Strategy
Here's how the flow works:
- develop branch: automatically deploys to staging
- main branch: automatically deploys to production
- Pull requests: run tests only (no deployment)
This ensures safe testing in staging before releasing to production.
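This mapping is exactly what the if: conditions in the workflow encode. Restated as a plain function (illustrative only; GitHub Actions evaluates these expressions itself):

```javascript
// Restates the workflow's `if:` conditions: which environment does an event target?
function deployTarget(ref, eventName) {
  if (eventName !== 'push') return null;             // pull requests: tests only
  if (ref === 'refs/heads/main') return 'production';
  if (ref === 'refs/heads/develop') return 'staging';
  return null;                                        // other branches: no deploy
}

console.log(deployTarget('refs/heads/develop', 'push'));       // staging
console.log(deployTarget('refs/heads/main', 'push'));          // production
console.log(deployTarget('refs/heads/main', 'pull_request'));  // null
```

Seeing the rules as one function makes it easy to reason about what a given push or pull request will trigger.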
Deploy Changes
Deploy to Staging

# Create and switch to the develop branch
git checkout -b develop

# Make your changes, then commit and push
git add .
git commit -m "Add new feature"
git push origin develop
- GitHub Actions runs tests
- If successful, your app is deployed to staging automatically
Deploy to Production

# Switch to the main branch
git checkout main

# Merge changes from develop
git merge develop

# Push to trigger the production deployment
git push origin main
- GitHub Actions runs tests + build + security scan
- If successful, your app is deployed to production
Monitor Deployments
Check whether everything is working as expected:
- GitHub Actions status: visit https://github.com/yourusername/my-devops-project/actions
- Container registry: visit https://github.com/yourusername/my-devops-project/pkgs/container/my-devops-project
Test live endpoints:

# Staging health check
curl https://your-staging-url.com/health

# Production health check
curl https://your-production-url.com/health

Expected response:
{ "status": "healthy", "uptime": ... }
At this point, we have set up a complete DevOps environment:
- Local development
- Containerization
- CI/CD pipelines
- Branch-based deployments
- Kubernetes for staging & production
This is a production-grade workflow you can scale for real projects.
Conclusion & Best Practices
Congratulations! You have just built a complete DevOps pipeline, from coding locally all the way to automated staging and production deployments.
This setup covers:
- Local development with Node.js + Docker
- Automated testing (Jest + Supertest)
- GitHub Actions CI/CD pipeline
- Secure Dockerfile and configs
- Kubernetes staging & production deployments
- Branch-based workflow (develop → staging, main → production)
Best Practices to Keep in Mind
1. Keep CI/CD Fast
- Cache dependencies
- Run tests in parallel
- Fail fast on errors
2. Automate Security
- Use npm audit, Trivy, or Dependabot
- Regularly update dependencies
3. Protect Secrets
- Store secrets in GitHub Actions Secrets
- Never commit .env files to Git
4. Monitor Everything
- Use /health and /metrics endpoints
- Connect to monitoring tools (Prometheus, Grafana, etc.)
5. Use Branching Wisely
- develop: a safe place for testing
- main: only stable, production-ready code
Final Thoughts
This workflow gives you a real-world DevOps setup that's:
- Scalable: ready for growth
- Secure: avoids common pitfalls
- Automated: less manual work, more focus on coding
From here, you can expand further by:
- Adding Terraform for infrastructure-as-code
- Integrating Helm charts for Kubernetes
- Setting up CDNs & load balancers for global scalability
You now have all the building blocks to manage modern applications like a DevOps pro!