Introduction
In the modern software development landscape, Continuous Integration and Continuous Deployment (CI/CD) pipelines are essential for ensuring that code changes are automatically built, tested, and deployed to production environments in a consistent and reliable manner. This document provides a comprehensive guide to setting up a robust CI/CD pipeline using various tools hosted on AWS EC2 instances.
The process will cover everything from setting up the necessary infrastructure to deploying an application on an Amazon EKS (Elastic Kubernetes Service) cluster, assigning a custom domain, and monitoring the application to ensure its stability and performance.
The pipeline will incorporate several industry-standard tools:
- AWS EC2: Virtual machines to host the pipeline tooling.
- Jenkins: Automating the build, test, and deployment processes.
- SonarQube: Static code analysis to ensure code quality.
- Trivy: Filesystem and Docker image vulnerability scanning.
- Nexus Repository Manager: Managing artifacts.
- Terraform: Infrastructure as code to create the EKS Cluster.
- Docker: Containerization for consistency and portability.
- Kubernetes: Container orchestration for deployment.
- Prometheus and Grafana: Monitoring the pipeline and application performance.
By following this guide, you'll be able to set up a fully functional CI/CD pipeline that supports continuous delivery and helps maintain high standards for code quality and application performance.
Step 1️⃣: Set Up a GitHub Repository
Repository Link
- Repository URL: GitHub Repository
Initialize Your Repository
Create a New Repository:
- Navigate to GitHub.
- Create a new repository.
- Clone the repository to your local machine.
- Add your project files.
Push Local Code to GitHub
git init
git add .
git commit -m "Initial commit"
git branch -M main
git remote add origin https://github.com/your-username/your-repo.git
git push -u origin main
Step 2️⃣: Set Up Jenkins on AWS EC2
Launch an EC2 Instance
Log in to the AWS Management Console:
- Navigate to the EC2 Dashboard.
- Launch a new instance.
- Choose Ubuntu as the AMI.
- Select an instance type (e.g., t2.micro for a quick test; a larger type such as t2.large is advisable if Jenkins, SonarQube, and Nexus will share the instance).
- Configure security groups to allow SSH (22), HTTP (80), and port 8080 for Jenkins.
Install Jenkins
#!/bin/bash
# Update packages
sudo apt-get update
# Install Java (recent Jenkins LTS releases require Java 17)
sudo apt-get install -y openjdk-17-jdk
# Add the Jenkins repository signing key (the older apt-key method is deprecated)
wget -qO- https://pkg.jenkins.io/debian-stable/jenkins.io-2023.key | sudo tee /usr/share/keyrings/jenkins-keyring.asc > /dev/null
# Add the Jenkins repository to your system's sources
echo "deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] https://pkg.jenkins.io/debian-stable binary/" | sudo tee /etc/apt/sources.list.d/jenkins.list > /dev/null
# Update package list
sudo apt-get update
# Install Jenkins
sudo apt-get install -y jenkins
# Start Jenkins now and enable it at boot
sudo systemctl enable --now jenkins
Access Jenkins
Open Jenkins:
- Navigate to http://<your-server-ip>:8080 in a web browser.
- Unlock Jenkins using the initial admin password stored at /var/lib/jenkins/secrets/initialAdminPassword.
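To read that password straight from the instance over SSH:
# Print the Jenkins initial admin password (run on the Jenkins EC2 instance)
sudo cat /var/lib/jenkins/secrets/initialAdminPassword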
Step 3️⃣: Install Nexus
Launch Nexus on Docker
docker run -d -p 8081:8081 --name nexus sonatype/nexus3
Access Nexus
Open Nexus:
- Navigate to http://<your-server-ip>:8081.
- Use the default credentials:
  - Username: admin
  - Password: located in the admin.password file inside the container.
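In the sonatype/nexus3 image that file normally lives under /nexus-data, so you can read it from the host like this:
# Print the generated Nexus admin password from the running container
docker exec nexus cat /nexus-data/admin.password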
Step 4️⃣: Set Up Docker on Jenkins EC2 Instance
Install Docker
#!/bin/bash
# Update package list
sudo apt-get update
# Install Docker
sudo apt-get install -y docker.io
# Add Jenkins user to Docker group
sudo usermod -aG docker jenkins
# Restart Jenkins to apply group changes
sudo systemctl restart jenkins
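After the restart, a quick sanity check that Docker is installed and reachable by the jenkins user:
# Verify the Docker installation
docker --version
# Confirm the jenkins user can talk to the Docker daemon
sudo -u jenkins docker ps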
Step 5️⃣: Set Up SonarQube
Launch SonarQube on Docker
docker run -d -p 9000:9000 --name sonarqube sonarqube
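SonarQube embeds Elasticsearch, which usually refuses to start unless the host's vm.max_map_count limit is raised. If the container exits shortly after starting, apply this host-level setting and restart it:
# Raise the mmap count limit required by SonarQube's embedded Elasticsearch
sudo sysctl -w vm.max_map_count=262144
# Persist the setting across reboots
echo "vm.max_map_count=262144" | sudo tee -a /etc/sysctl.conf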
Get SonarQube Initial Password
Default login credentials:
- Username: admin
- Password: admin
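The pipeline in Step 9 does not include a code-analysis stage. As a rough sketch — assuming the SonarQube Scanner plugin is installed in Jenkins, the server is registered under the name SonarQube, and the project builds with Maven (none of which is part of the original setup) — such a stage could look like this:
stage('SonarQube Analysis') {
    steps {
        // 'SonarQube' must match the server name configured under Manage Jenkins
        withSonarQubeEnv('SonarQube') {
            // Hypothetical Maven project; use sonar-scanner for other build tools
            sh 'mvn clean verify sonar:sonar -Dsonar.projectKey=my-app'
        }
    }
}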
Step 6️⃣: Install Trivy
#!/bin/bash
# Download and install Trivy
sudo apt-get install wget apt-transport-https gnupg lsb-release -y
wget -qO - https://aquasecurity.github.io/trivy-repo/deb/public.key | sudo apt-key add -
echo deb https://aquasecurity.github.io/trivy-repo/deb $(lsb_release -sc) main | sudo tee -a /etc/apt/sources.list.d/trivy.list
sudo apt-get update
sudo apt-get install -y trivy
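A quick check that the installation worked — print the version and run a filesystem scan against the current directory (for example, your cloned repository):
# Verify the Trivy installation
trivy --version
# Scan the current directory for known vulnerabilities
trivy fs .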
Step 7️⃣: Set Up Terraform EKS
EKS Requirements
- Create a terraform directory in your repository.
- Include the following files:
  - main.tf
  - variables.tf
  - outputs.tf
  - provider.tf
  - eks-cluster.tf
  - vpc.tf
Example main.tf File
provider "aws" {
region = var.region
}
module "vpc" {
source = "terraform-aws-modules/vpc/aws"
version = "3.5.0"
name = var.vpc_name
cidr = var.vpc_cidr
azs = var.availability_zones
public_subnets = var.public_subnets
private_subnets = var.private_subnets
enable_nat_gateway = true
single_nat_gateway = true
public_subnet_tags = {
"kubernetes.io/cluster/${var.cluster_name}" = "shared"
"kubernetes.io/role/elb" = "1"
}
private_subnet_tags = {
"kubernetes.io/cluster/${var.cluster_name}" = "shared"
"kubernetes.io/role/internal-elb" = "1"
}
}
module "eks" {
source = "terraform-aws-modules/eks/aws"
version = "17.1.0"
cluster_name = var.cluster_name
cluster_version = var.kubernetes_version
subnets = module.vpc.private_subnets
vpc_id = module.vpc.vpc_id
node_groups = {
eks_nodes = {
desired_capacity = var.desired_capacity
max_capacity = var.max_capacity
min_capacity = var.min_capacity
instance_type = var.instance_type
key_name = var.key_name
}
}
}
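For reference, a minimal variables.tf covering the variables used above might look like the sketch below; every default is an illustrative placeholder, so adjust the values to your environment:
variable "region"             { default = "us-west-2" }
variable "vpc_name"           { default = "eks-vpc" }
variable "vpc_cidr"           { default = "10.0.0.0/16" }
variable "availability_zones" { default = ["us-west-2a", "us-west-2b"] }
variable "public_subnets"     { default = ["10.0.1.0/24", "10.0.2.0/24"] }
variable "private_subnets"    { default = ["10.0.101.0/24", "10.0.102.0/24"] }
variable "cluster_name"       { default = "my-eks-cluster" }
variable "kubernetes_version" { default = "1.21" }
variable "desired_capacity"   { default = 2 }
variable "max_capacity"       { default = 3 }
variable "min_capacity"       { default = 1 }
variable "instance_type"      { default = "t3.medium" }
variable "key_name"           { default = "my-key-pair" }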
Set Up AWS CLI for Terraform
#!/bin/bash
# Install AWS CLI v2
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
# Verify installation
aws --version
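Terraform itself also needs to be installed on the instance. A minimal sketch using HashiCorp's apt repository, followed by provisioning the cluster from the terraform directory (review the plan output before applying):
# Install Terraform from HashiCorp's apt repository
wget -O- https://apt.releases.hashicorp.com/gpg | sudo gpg --dearmor -o /usr/share/keyrings/hashicorp-archive-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/hashicorp-archive-keyring.gpg] https://apt.releases.hashicorp.com $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/hashicorp.list
sudo apt-get update && sudo apt-get install -y terraform
# Provision the EKS cluster
cd terraform
terraform init
terraform plan
terraform apply -auto-approve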
Step 8️⃣: Create an IAM Role for Jenkins
- Login to AWS Management Console.
- Navigate to the IAM Service.
- Create a New Role:
  - Select EC2 as the trusted entity.
  - Attach the AmazonEKSClusterPolicy and AmazonEC2ContainerRegistryFullAccess policies.
  - Name your role (e.g., Jenkins_EKS_Role).
- Attach the Role to the Jenkins EC2 Instance.
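If you prefer the CLI over the console, the EC2 trust policy behind the "trusted entity" selection is the standard one below. Save it as trust-policy.json (the file name is just a placeholder), create the role with aws iam create-role, and attach the policies with aws iam attach-role-policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}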
Step 9️⃣: Set Up Jenkins Pipeline
Pipeline Script
pipeline {
    agent any
    environment {
        AWS_DEFAULT_REGION = 'us-west-2'
        // Placeholders: replace with your AWS account ID and EKS cluster name
        AWS_ACCOUNT_ID     = '<your-account-id>'
        CLUSTER_NAME       = 'my-eks-cluster'
        ECR_REPO_NAME      = 'my-ecr-repo'
        IMAGE_TAG          = "my-ecr-repo:${env.BUILD_ID}"
    }
    stages {
        stage('Checkout') {
            steps {
                git branch: 'main', url: 'https://github.com/your-repo.git'
            }
        }
        stage('Build Docker Image') {
            steps {
                script {
                    dockerImage = docker.build("${env.ECR_REPO_NAME}:${env.BUILD_ID}")
                }
            }
        }
        stage('Scan with Trivy') {
            steps {
                script {
                    // Fail the build on HIGH/CRITICAL findings in the freshly built image
                    sh "trivy image --exit-code 1 --severity HIGH,CRITICAL ${env.ECR_REPO_NAME}:${env.BUILD_ID}"
                }
            }
        }
        stage('Push Docker Image') {
            steps {
                script {
                    // Assumes the Amazon ECR plugin and a stored AWS credential with ID 'aws-credentials'
                    docker.withRegistry("https://${env.AWS_ACCOUNT_ID}.dkr.ecr.${env.AWS_DEFAULT_REGION}.amazonaws.com", "ecr:${env.AWS_DEFAULT_REGION}:aws-credentials") {
                        dockerImage.push()
                    }
                }
            }
        }
        stage('Deploy to EKS') {
            steps {
                script {
                    sh '''
                        aws eks --region $AWS_DEFAULT_REGION update-kubeconfig --name $CLUSTER_NAME
                        kubectl apply -f k8s/deployment.yaml
                    '''
                }
            }
        }
    }
}
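The push stage assumes the ECR repository already exists. If it does not, it can be created once from the CLI (the name matches ECR_REPO_NAME above):
# One-time setup: create the ECR repository the pipeline pushes to
aws ecr create-repository --repository-name my-ecr-repo --region us-west-2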
Step 🔟: Deploy Application to EKS
Update k8s/deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
  labels:
    app: my-app
spec:
  replicas: 3
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: my-app-container
          # In practice, use the full ECR image URI pushed by the pipeline, e.g.
          # <account-id>.dkr.ecr.us-west-2.amazonaws.com/my-ecr-repo:<build-id>
          image: my-ecr-repo:latest
          ports:
            - containerPort: 80
Apply the Deployment
kubectl apply -f k8s/deployment.yaml
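The ingress in the next step routes traffic to a Service named my-app-service, which is not defined anywhere else in this guide. A minimal k8s/service.yaml matching the deployment above could look like this (apply it with kubectl apply -f k8s/service.yaml before creating the ingress):
apiVersion: v1
kind: Service
metadata:
  name: my-app-service
spec:
  selector:
    app: my-app
  ports:
    - port: 80
      targetPort: 80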
Step 1️⃣1️⃣: Assign Custom Domain to EKS
Update k8s/ingress.yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: my-app-ingress
  annotations:
    kubernetes.io/ingress.class: nginx
spec:
  rules:
    - host: my-custom-domain.com
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: my-app-service
                port:
                  number: 80
Apply the Ingress
kubectl apply -f k8s/ingress.yaml
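For the custom domain to resolve, its DNS record must point at the address exposed by the ingress controller. Assuming the NGINX ingress controller is installed (in the conventional ingress-nginx namespace) and fronted by a load balancer, find that address and then create a CNAME or Route 53 alias record for my-custom-domain.com:
# Show the external address assigned to the ingress
kubectl get ingress my-app-ingress
# Or inspect the ingress controller's LoadBalancer service directly
kubectl get svc -n ingress-nginx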
Step 1️⃣2️⃣: Monitor Application with Prometheus and Grafana
Install Prometheus
kubectl apply -f https://raw.githubusercontent.com/prometheus-operator/prometheus-operator/main/bundle.yaml
Install Grafana
kubectl apply -f https://raw.githubusercontent.com/grafana-operator/grafana-operator/main/bundle.yaml
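These bundles install only the Prometheus and Grafana operators; to actually scrape and visualise the application you would still create the corresponding custom resources (Prometheus, ServiceMonitor, Grafana, and dashboards). A quick check that the operator pods came up:
# Confirm the operator pods are running
kubectl get pods --all-namespaces | grep -Ei 'prometheus|grafana'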
Conclusion
This CI/CD pipeline project demonstrates the end-to-end process of automating the software development lifecycle, from code integration to deployment and monitoring. By following the steps outlined in this guide, you've successfully:
- Set Up a Repository: Established a version-controlled environment where code can be collaboratively managed and tracked.
- Configured Necessary Infrastructure: Provisioned AWS EC2 instances and set up essential tools like Jenkins, SonarQube, Nexus, Prometheus, and Grafana to facilitate continuous integration, deployment, and monitoring.
- Pushed Local Code to GitHub: Centralized your codebase in a GitHub repository, enabling seamless collaboration and integration with other tools in the pipeline.
- Built and Deployed the Application: Leveraged Jenkins to automate the build, test, and deployment processes, ensuring that your application is consistently deployed to a Kubernetes cluster on Amazon EKS.
- Monitored Application Performance: Used Prometheus and Grafana to set up a robust monitoring system that tracks the performance and health of your application in real-time.
- Assigned a Custom Domain: Mapped your application to a custom domain, making it accessible to users in a production-ready environment.
By integrating these tools and processes, you've created a powerful CI/CD pipeline that enhances code quality, reduces deployment time, and ensures application reliability. This setup not only accelerates the development cycle but also fosters a culture of continuous improvement, where code is regularly integrated, tested, and deployed with minimal manual intervention.
Moving forward, you can extend this pipeline by adding more advanced features, such as automated security testing, blue-green deployments, or canary releases, to further improve the robustness and scalability of your CI/CD processes.
Thank you for reading my blog … :)
© Copyright: ProDevOpsGuy