"Automation is the key to scaling software delivery."
Table of Contents
- Introduction
- Project Overview
- Architecture & Deployment Flow
- Setup Bitbucket Pipelines
- Interesting Facts & Statistics
- Frequently Asked Questions (FAQs)
- Key Takeaways
- Conclusion
Introduction
In modern software development, automation, reliability, and speed are critical for successful application delivery. Continuous Integration and Continuous Deployment (CI/CD) pipelines help developers deploy applications faster with minimal manual intervention.
This document explains a Proof of Concept (POC) where a Node.js login application is deployed automatically to an AWS EC2 instance using Bitbucket Pipelines, managed by PM2, and served securely via Nginx as a reverse proxy.
Project Overview
This project demonstrates an end-to-end CI/CD workflow:
- A Node.js login application hosted in Bitbucket
- Automated deployment triggered on code push to the prd branch
- Secure server access via SSH
- Application process management using PM2
- Public access via Nginx reverse proxy
The deployed application is accessible through a public URL, validating the successful CI/CD pipeline execution.
Architecture & Deployment Flow
This CI/CD pipeline follows a practical, step-by-step deployment flow using Bitbucket Pipelines and AWS EC2.
High-level flow:
- Developer pushes code to the prd branch
- Bitbucket Pipelines is triggered automatically
- Pipeline connects to AWS EC2 using SSH
- Latest code is pulled from Bitbucket
- Dependencies are installed
- Application is restarted using PM2
- Nginx routes HTTP traffic to the Node.js app
This ensures zero manual deployment effort and consistent releases.
EC2 server setup (one-time)
1.1 Install Node.js + Git + Nginx
sudo apt update
sudo apt install -y git nginx curl
Install Node.js (NodeSource is recommended; this example installs Node 20, while the deployment script later switches to Node 24 via nvm):
- curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
- sudo apt install -y nodejs
- node -v
npm -v
1.2 Install PM2 globally
sudo npm i -g pm2
pm2 -v
1.3 Create project directory
Example:
sudo mkdir -p /var/www/myapp
sudo chown -R deploy:deploy /var/www/myapp
2) Clone the web application repo once (manual first time)
cd /var/www/myapp
git clone <your-repo-url> .
2.1 Install and start using PM2
Example: if your app entry is server.js and runs on port 3000:
- npm ci
- pm2 start server.js --name myapp
- pm2 save
Enable PM2 auto-start on reboot:
- pm2 startup
- pm2 save
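Instead of starting the app ad hoc with pm2 start server.js, the same process can be described in a PM2 ecosystem file. This is a sketch mirroring the example above (name myapp, entry server.js, port 3000); it assumes server.js reads process.env.PORT:

```javascript
// ecosystem.config.js - PM2 process file (sketch for the example app)
module.exports = {
  apps: [
    {
      name: "myapp",        // appears in `pm2 list` / `pm2 restart myapp`
      script: "server.js",  // application entry point
      instances: 1,         // single process; "max" would enable cluster mode
      autorestart: true,    // restart the app if it crashes
      env: {
        NODE_ENV: "production",
        PORT: 3000,         // matches the Nginx proxy_pass target below
      },
    },
  ],
};
```

With this file in place, `pm2 start ecosystem.config.js` followed by `pm2 save` replaces the two manual commands and keeps the process definition in version control.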
Nginx reverse proxy (one-time)
3.1 Create Nginx config
- sudo nano /etc/nginx/sites-available/myapp
server {
    listen 80;
    server_name yourdomain.com www.yourdomain.com;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
Enable site:
- sudo ln -s /etc/nginx/sites-available/myapp /etc/nginx/sites-enabled/
- sudo nginx -t
sudo systemctl restart nginx
(Optional) SSL with Let’s Encrypt:
sudo apt install -y certbot python3-certbot-nginx
sudo certbot --nginx -d yourdomain.com -d www.yourdomain.com
Setup Bitbucket Pipelines
This section explains the practical CI/CD implementation step by step, exactly as executed in the Bitbucket Pipeline.
SSH key for Bitbucket Pipelines → EC2 (required)
4.1 Create a deploy key pair (local machine)
On your local PC:
ssh-keygen -t ed25519 -C "bitbucket-pipeline" -f bb_pipeline_key
You will get:
- bb_pipeline_key (private)
- bb_pipeline_key.pub (public)
4.2 Add public key to EC2
On EC2 as deploy user:
- mkdir -p ~/.ssh
- chmod 700 ~/.ssh
- nano ~/.ssh/authorized_keys
- Paste the content of bb_pipeline_key.pub, then:
- chmod 600 ~/.ssh/authorized_keys
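Before wiring the private key into Bitbucket, it is worth confirming that the public key you pasted on the server really matches the private key you are about to store. A quick local sanity check of the whole key-pair workflow (safe to run in a scratch directory, since it generates its own pair the same way as above):

```shell
# Run in a throwaway directory; generates a pair exactly as in 4.1
tmp=$(mktemp -d) && cd "$tmp"
ssh-keygen -t ed25519 -C "bitbucket-pipeline" -N '' -f bb_pipeline_key >/dev/null
# ssh-keygen -y re-derives the public key from a private key file
ssh-keygen -y -f bb_pipeline_key > derived.pub
# Compare key type and key material (the third field is only a comment)
awk '{print $1, $2}' derived.pub > a.txt
awk '{print $1, $2}' bb_pipeline_key.pub > b.txt
diff a.txt b.txt && echo "key pair matches"
```

Running `ssh-keygen -y -f bb_pipeline_key` against your real key and comparing it with the authorized_keys entry on EC2 catches copy-paste truncation before the pipeline ever runs.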
4.3 Add private key to Bitbucket repository variables
In Bitbucket:
- Go to Repo → Repository settings → Pipelines → Repository variables, and add:
- SSH_PRIVATE_KEY = (paste the full content of bb_pipeline_key)
- SSH_USER = deploy (the EC2 user created earlier)
- SSH_HOST = your EC2 public IP / domain
- APP_DIR = /var/www/myapp
Also add:
- KNOWN_HOST (optional but recommended), or rely on ssh-keyscan in the pipeline (this POC uses ssh-keyscan)
Bitbucket Pipeline YAML Configuration
1. vim bitbucket-pipelines.yml
image: node:24

pipelines:
  branches:
    prd:
      - step:
          name: Deploy to Production
          deployment: Production
          script:
            - apt-get update && apt-get install -y openssh-client
            - mkdir -p ~/.ssh
            - ssh-keyscan $SSH_HOST >> ~/.ssh/known_hosts
            # Write the deploy key from repository variables so ssh can authenticate
            - (umask 077; echo "$SSH_PRIVATE_KEY" > ~/.ssh/id_ed25519)
            - |
              ssh -i ~/.ssh/id_ed25519 $SSH_USER@$SSH_HOST << EOF
              set -e
              set +x
              echo "Load NVM"
              export NVM_DIR="\$HOME/.nvm"
              [ -s "\$NVM_DIR/nvm.sh" ] && . "\$NVM_DIR/nvm.sh"
              echo "----Use Node 24-----"
              nvm use 24
              echo "-----Change directory-----"
              cd $APP_DIR
              echo "----Pull latest code----"
              git pull origin prd
              echo "--npm install----"
              npm install
              echo "----Restart PM2-----"
              pm2 restart all
              echo "-----Deployment completed Successfully------"
              EOF
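One subtle point in the deploy script is quoting inside the heredoc: unescaped variables like $APP_DIR are expanded by the pipeline container before the text is sent to EC2, while backslash-escaped ones like \$HOME survive the trip and are expanded on the server. A minimal local sketch of that behavior (APP_DIR stands in for the repository variable):

```shell
# Unescaped $APP_DIR is expanded here, in the "pipeline" shell;
# escaped \$HOME stays literal and would be expanded on EC2 instead.
APP_DIR=/var/www/myapp
script=$(cat << EOF
cd $APP_DIR
export NVM_DIR="\$HOME/.nvm"
EOF
)
echo "$script"
```

The server therefore receives a literal `cd /var/www/myapp` but a literal `$HOME`, which is exactly what the nvm lines need in order to resolve the deploy user's home directory remotely.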
Practical Demonstration (Images Explained)
"If it hurts, do it more often and automate it." - DevOps Principle
Step 1. BEFORE: Original Login Page
- This shows the original application running at nodeapp.13.127.87.71.nip.io/login with the title "Login Page". This is the state before making any code changes.
Step 2. Making Changes & Pushing Code
- This terminal screenshot shows the developer workflow: editing the code, committing, and pushing it to the prd branch.
Step 3. Bitbucket Pipelines Dashboard
- This shows the Pipelines page in Bitbucket with successful deployments:
- The highlighted row #30 is the most recent deployment that was triggered automatically when code was pushed to the prd branch.
Step 4. AFTER: Updated Login Page
This shows the application after successful deployment. The title has changed from "Login Page" to "Addweb Login Page" - confirming the CI/CD pipeline worked correctly!
Pipeline Failure Scenario
Step 1. Intentionally Introduce an Error
To test pipeline failure behavior, we intentionally modified the package.json file by adding an invalid dependency:
Example change in package.json:
"this-package-does-not-exist-123": "1.0.0"
This package does not exist in the npm registry. The purpose was to simulate a real-world mistake, such as:
- Typo in package name
- Incorrect dependency version
- Invalid module added by mistake
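For context, the dependencies block in package.json might have looked like this after the change (the surrounding entry and its version are hypothetical; only the bogus package matters):

```json
{
  "dependencies": {
    "express": "^4.18.2",
    "this-package-does-not-exist-123": "1.0.0"
  }
}
```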
Step 2. Commit and Push the Wrong Code
After modifying package.json, the changes were committed and pushed to the prd branch:
git commit -m "We mentioned the wrong package name in package.json."
git push origin prd
Since our Bitbucket Pipeline is configured to run automatically on the prd branch, this push immediately triggered a new pipeline execution.
Step 3. Pipeline Triggered Automatically
As expected, Bitbucket Pipelines started running automatically as soon as the code was pushed.
In the Pipelines dashboard we can see:
- New pipeline execution created
- Status initially shown as “In Progress”
This confirms that the CI/CD automation is working correctly.
Step 4. Pipeline Execution Failed
During pipeline execution, the following command was executed on the server:
- npm install
Because we added a non-existent package, the installation failed with this error:
npm error code E404
npm error 404 Not Found - GET https://registry.npmjs.org/this-package-does-not-exist-123
npm error 404 'this-package-does-not-exist-123@1.0.0' is not in this registry.
As a result:
- The pipeline step stopped
- Deployment process was aborted
- Application was NOT restarted
- Previous working version remained intact
If any step in the CI/CD pipeline fails, the deployment automatically stops. This protects production from broken or unstable code.
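This fail-fast behavior comes from the `set -e` line at the top of the remote script: the first command that exits non-zero aborts everything after it. A local sketch, using `false` to stand in for the failing `npm install`:

```shell
# `set -e` aborts the script at the first failure, so the PM2
# restart line is never reached; `false` simulates the npm E404.
result=$(sh -c '
  set -e
  false                    # simulated "npm install" failure
  echo "pm2 restart all"   # never reached
' 2>/dev/null || echo "deploy aborted")
echo "$result"   # prints: deploy aborted
```

Without `set -e`, the script would continue past the failed install and restart PM2 against a broken node_modules tree, which is exactly the outcome this pipeline design avoids.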
Interesting Facts & Statistics
- CI/CD can reduce deployment time by up to 70%
- PM2 keeps Node.js application uptime close to 99.9%
- Automated pipelines reduce deployment errors by 60-80%
"CI/CD is not a tool, it’s a culture"
Frequently Asked Questions (FAQs)
Q1. Why use Bitbucket Pipelines?
Bitbucket Pipelines provides native CI/CD integration with repositories, reducing setup complexity.
Q2. Why PM2 instead of node directly?
PM2 ensures application stability, auto-restarts on failure, and better process control.
Q3. Why use Nginx as a reverse proxy?
Nginx improves security, handles traffic efficiently, and allows SSL termination.
Key Takeaways
- CI/CD automates deployments and saves time
- Bitbucket Pipelines integrates seamlessly with repositories
- AWS EC2 provides flexible and scalable hosting
- PM2 ensures high availability of Node.js apps
- Nginx enhances security and request handling
Conclusion
This POC successfully demonstrates a real-world CI/CD pipeline for a Node.js application using Bitbucket Pipelines and AWS EC2. The setup ensures reliable, automated, and scalable deployments with minimal manual intervention. By implementing this approach, teams can improve deployment confidence, reduce errors, and accelerate delivery, making it a strong foundation for production-grade applications.
About the Author: Narendra is a DevOps Engineer at AddWebSolution, specializing in automating infrastructure to improve efficiency and reliability.







