H A R S H H A A for ProDevOpsGuy Tech Community


DevOps Project: Production Level CI/CD Pipeline Project

๐Ÿ“ Introduction

In the modern software development landscape, Continuous Integration and Continuous Deployment (CI/CD) pipelines are essential for ensuring that code changes are automatically built, tested, and deployed to production environments in a consistent and reliable manner. This document provides a comprehensive guide to setting up a robust CI/CD pipeline using various tools hosted on AWS EC2 instances.

The process will cover everything from setting up the necessary infrastructure to deploying an application on an Amazon EKS (Elastic Kubernetes Service) cluster, assigning a custom domain, and monitoring the application to ensure its stability and performance.

The pipeline will incorporate several industry-standard tools:

  • AWS EC2: Virtual machines that host the pipeline tooling.
  • Jenkins: Automating the build, test, and deployment processes.
  • SonarQube: Static code analysis to ensure code quality.
  • Trivy: Filesystem and Docker image vulnerability scanning.
  • Nexus Repository Manager: Managing build artifacts.
  • Terraform: Infrastructure as code to create the EKS cluster.
  • Docker: Containerization for consistency and portability.
  • Kubernetes: Container orchestration for deployment.
  • Prometheus and Grafana: Monitoring the pipeline and application performance.

By following this guide, you'll be able to set up a fully functional CI/CD pipeline that supports continuous delivery and helps maintain high standards for code quality and application performance.



Step 1️⃣: Set Up a GitHub Repository

Repository Link

Initialize Your Repository

  1. Create a New Repository:
    • Navigate to GitHub.
    • Create a new repository.
    • Clone the repository to your local machine.
    • Add your project files.

Push Local Code to GitHub

git init
git add .
git commit -m "Initial commit"
git branch -M main
git remote add origin https://github.com/your-username/your-repo.git
git push -u origin main

Step 2️⃣: Set Up Jenkins on AWS EC2

Launch an EC2 Instance

  1. Login to AWS Management Console:
    • Navigate to the EC2 Dashboard.
    • Launch a new instance.
    • Choose Ubuntu as the AMI.
    • Select an instance type (e.g., t2.micro).
    • Configure security groups to allow SSH, HTTP, and port 8080.

Install Jenkins

#!/bin/bash
# Update packages
sudo apt-get update
# Install Java (current Jenkins LTS requires Java 17 or newer)
sudo apt-get install -y openjdk-17-jdk
# Add the Jenkins repository signing key
sudo wget -O /usr/share/keyrings/jenkins-keyring.asc https://pkg.jenkins.io/debian-stable/jenkins.io-2023.key
# Add the Jenkins stable repository to your system's sources
echo "deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] https://pkg.jenkins.io/debian-stable binary/" | sudo tee /etc/apt/sources.list.d/jenkins.list > /dev/null
# Update package list
sudo apt-get update
# Install Jenkins
sudo apt-get install -y jenkins
# Start Jenkins service
sudo systemctl start jenkins

Access Jenkins

  1. Open Jenkins:
    • Navigate to http://<your-server-ip>:8080 in a web browser.
    • Unlock Jenkins using the password stored at /var/lib/jenkins/secrets/initialAdminPassword.
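
To print that password, run the following on the Jenkins EC2 instance:

# Print the initial admin password generated during installation
sudo cat /var/lib/jenkins/secrets/initialAdminPassword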

Step 3️⃣: Install Nexus

Launch Nexus on Docker

docker run -d -p 8081:8081 --name nexus sonatype/nexus3

Access Nexus

  1. Open Nexus:
    • Navigate to http://<your-server-ip>:8081.
    • Use the default credentials:
      • Username: admin
      • Password: Located in the admin.password file inside the container.
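
To read that password without opening a shell in the container, you can run the following (assuming the container is named nexus as above and uses the default /nexus-data data directory):

# Print the generated Nexus admin password
docker exec nexus cat /nexus-data/admin.password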

Step 4️⃣: Set Up Docker on Jenkins EC2 Instance

Install Docker

#!/bin/bash
# Update package list
sudo apt-get update
# Install Docker
sudo apt-get install -y docker.io
# Add Jenkins user to Docker group
sudo usermod -aG docker jenkins
# Restart Jenkins to apply group changes
sudo systemctl restart jenkins
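
As an optional sanity check, confirm that Docker is running and that the jenkins user can reach the Docker daemon after the group change:

# Docker service should be active
sudo systemctl status docker --no-pager
# The jenkins user should be able to talk to the Docker daemon
sudo -u jenkins docker ps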

Step 5️⃣: Set Up SonarQube

Launch SonarQube on Docker

docker run -d -p 9000:9000 --name sonarqube sonarqube

Default SonarQube Credentials

  1. Login Credentials:
    • Username: admin
    • Password: admin (you will be prompted to change this on first login)
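
The pipeline in Step 9 focuses on the Docker and EKS flow; if you also want to run an analysis against this SonarQube server, a minimal sketch with the standalone sonar-scanner CLI looks like this (the project key, server IP, and token are placeholders, not values defined elsewhere in this guide):

# Run a SonarQube analysis from the project root (requires sonar-scanner on the PATH)
sonar-scanner \
  -Dsonar.projectKey=my-app \
  -Dsonar.sources=. \
  -Dsonar.host.url=http://<your-server-ip>:9000 \
  -Dsonar.login=<your-sonarqube-token>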

Step 6️⃣: Install Trivy

#!/bin/bash
# Install prerequisites and add the Trivy repository
sudo apt-get install -y wget apt-transport-https gnupg lsb-release
wget -qO - https://aquasecurity.github.io/trivy-repo/deb/public.key | sudo apt-key add -
echo "deb https://aquasecurity.github.io/trivy-repo/deb $(lsb_release -sc) main" | sudo tee /etc/apt/sources.list.d/trivy.list
sudo apt-get update
# Install Trivy
sudo apt-get install -y trivy
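
Once installed, Trivy can scan both the project filesystem and the Docker images built later in the pipeline; the image name below reuses the placeholder repository from Step 9:

# Verify the installation
trivy --version
# Scan the local project directory
trivy fs .
# Scan a Docker image and return a non-zero exit code on HIGH/CRITICAL findings
trivy image --exit-code 1 --severity HIGH,CRITICAL my-ecr-repo:latest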

Step 7️⃣: Set Up Terraform EKS

EKS Requirements

  1. Create a terraform directory in your repository.
  2. Include the following files:
    • main.tf
    • variables.tf
    • outputs.tf
    • provider.tf
    • eks-cluster.tf
    • vpc.tf

Example main.tf File

provider "aws" {
  region = var.region
}

module "vpc" {
  source = "terraform-aws-modules/vpc/aws"
  version = "3.5.0"

  name = var.vpc_name
  cidr = var.vpc_cidr
  azs = var.availability_zones
  public_subnets = var.public_subnets
  private_subnets = var.private_subnets

  enable_nat_gateway = true
  single_nat_gateway = true
  public_subnet_tags = {
    "kubernetes.io/cluster/${var.cluster_name}" = "shared"
    "kubernetes.io/role/elb" = "1"
  }
  private_subnet_tags = {
    "kubernetes.io/cluster/${var.cluster_name}" = "shared"
    "kubernetes.io/role/internal-elb" = "1"
  }
}

module "eks" {
  source = "terraform-aws-modules/eks/aws"
  version = "17.1.0"

  cluster_name = var.cluster_name
  cluster_version = var.kubernetes_version
  subnets = module.vpc.private_subnets
  vpc_id = module.vpc.vpc_id

  node_groups = {
    eks_nodes = {
      desired_capacity = var.desired_capacity
      max_capacity = var.max_capacity
      min_capacity = var.min_capacity

      instance_type = var.instance_type
      key_name = var.key_name
    }
  }
}

Set Up AWS CLI for Terraform

#!/bin/bash
# Install AWS CLI v2
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install

# Verify installation
aws --version
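
With credentials in place (either via aws configure or an IAM role attached to the instance), the cluster is provisioned with the standard Terraform workflow. This is a sketch; the region and cluster name must match what you define in variables.tf:

# Provide credentials for Terraform (skip if an IAM role is attached to the instance)
aws configure
# From the terraform directory, provision the VPC and EKS cluster
cd terraform
terraform init
terraform plan -out=tfplan
terraform apply tfplan
# Point kubectl at the new cluster
aws eks update-kubeconfig --region us-west-2 --name <your-cluster-name>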

Step 8️⃣: Create an IAM Role for Jenkins

  1. Login to AWS Management Console.
  2. Navigate to the IAM Service.
  3. Create a New Role:
    • Select EC2 as the trusted entity.
    • Attach the AmazonEKSClusterPolicy and AmazonEC2ContainerRegistryFullAccess policies.
    • Name your role (e.g., Jenkins_EKS_Role).
  4. Attach the Role to the Jenkins EC2 Instance (an equivalent AWS CLI sketch follows below).
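
The same role can also be scripted with the AWS CLI. The sketch below is an optional alternative to the console steps; the role name, instance profile name, trust-policy file, and instance ID are placeholders to substitute with your own values:

# trust-policy.json should allow EC2 to assume the role, e.g.:
# {"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"Service":"ec2.amazonaws.com"},"Action":"sts:AssumeRole"}]}
aws iam create-role --role-name Jenkins_EKS_Role --assume-role-policy-document file://trust-policy.json
aws iam attach-role-policy --role-name Jenkins_EKS_Role --policy-arn arn:aws:iam::aws:policy/AmazonEKSClusterPolicy
aws iam attach-role-policy --role-name Jenkins_EKS_Role --policy-arn arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryFullAccess
# EC2 instances use the role through an instance profile
aws iam create-instance-profile --instance-profile-name Jenkins_EKS_Profile
aws iam add-role-to-instance-profile --instance-profile-name Jenkins_EKS_Profile --role-name Jenkins_EKS_Role
# Attach the instance profile to the Jenkins EC2 instance (replace the instance ID)
aws ec2 associate-iam-instance-profile --instance-id <jenkins-instance-id> --iam-instance-profile Name=Jenkins_EKS_Profile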

Step 9️⃣: Set Up Jenkins Pipeline

Pipeline Script

pipeline {
  agent any
  environment {
    AWS_DEFAULT_REGION = 'us-west-2'
    AWS_ACCOUNT_ID = '<your-aws-account-id>'   // replace with your AWS account ID
    CLUSTER_NAME = '<your-cluster-name>'       // replace with your EKS cluster name
    ECR_REPO_NAME = 'my-ecr-repo'
    IMAGE_TAG = "my-ecr-repo:${env.BUILD_ID}"
  }
  stages {
    stage('Checkout') {
      steps {
        git branch: 'main', url: 'https://github.com/your-username/your-repo.git'
      }
    }
    stage('Build Docker Image') {
      steps {
        script {
          dockerImage = docker.build("${env.ECR_REPO_NAME}:${env.BUILD_ID}")
        }
      }
    }
    stage('Scan with Trivy') {
      steps {
        script {
          // Fail the build if the image contains HIGH or CRITICAL vulnerabilities
          sh "trivy image --exit-code 1 --severity HIGH,CRITICAL ${env.ECR_REPO_NAME}:${env.BUILD_ID}"
        }
      }
    }
    stage('Push Docker Image') {
      steps {
        script {
          // 'aws-credentials' is the ID of the AWS credentials stored in Jenkins (Amazon ECR plugin)
          docker.withRegistry("https://${env.AWS_ACCOUNT_ID}.dkr.ecr.${env.AWS_DEFAULT_REGION}.amazonaws.com", "ecr:${env.AWS_DEFAULT_REGION}:aws-credentials") {
            dockerImage.push()
          }
        }
      }
    }
    stage('Deploy to EKS') {
      steps {
        script {
          sh '''
          aws eks --region $AWS_DEFAULT_REGION update-kubeconfig --name $CLUSTER_NAME
          kubectl apply -f k8s/deployment.yaml
          '''
        }
      }
    }
  }
}
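
The Push Docker Image stage assumes the ECR repository already exists and that AWS credentials with the ID aws-credentials are configured in Jenkins. Creating the repository is a one-time step:

# Create the ECR repository the pipeline pushes to (region must match AWS_DEFAULT_REGION)
aws ecr create-repository --repository-name my-ecr-repo --region us-west-2
# The resulting registry URL has the form:
# <account-id>.dkr.ecr.us-west-2.amazonaws.com/my-ecr-repo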

Step 🔟: Deploy Application to EKS

Update k8s/deployment.yaml

apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
  labels:
    app: my-app
spec:
  replicas: 3
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
      - name: my-app-container
        image: my-ecr-repo:latest   # replace with your full ECR image URI, e.g. <account-id>.dkr.ecr.<region>.amazonaws.com/my-ecr-repo:<tag>
        ports:
        - containerPort: 80

Apply the Deployment

kubectl apply -f k8s/deployment.yaml
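
The Ingress in the next step routes traffic to a Service named my-app-service, which this guide does not define as a separate manifest. One quick way to create it (an assumption, not a file from the original repository) is to expose the Deployment directly:

# Create a ClusterIP Service named my-app-service in front of the Deployment
kubectl expose deployment my-app --name=my-app-service --port=80 --target-port=80
# Confirm the rollout and the new Service
kubectl rollout status deployment/my-app
kubectl get svc my-app-service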

Step 1️⃣1️⃣: Assign Custom Domain to EKS

Update k8s/ingress.yaml

apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: my-app-ingress
  annotations:
    kubernetes.io/ingress.class: nginx
spec:
  rules:
  - host: my-custom-domain.com
    http:
      paths:
      - path: /
        pathType: Prefix
        backend:
          service:
            name: my-app-service
            port:
              number: 80

Apply the Ingress

kubectl apply -f k8s/ingress.yaml
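
For the host rule to work, an NGINX ingress controller must already be installed in the cluster and your DNS must point at its load balancer. Assuming the controller runs in the ingress-nginx namespace (a common default), look up the address to use for the DNS record:

# External address of the ingress controller's AWS load balancer
kubectl get svc -n ingress-nginx
# The Ingress should report the same address once it is admitted
kubectl get ingress my-app-ingress
# Create a CNAME/ALIAS record for my-custom-domain.com pointing at that load balancer,
# for example in Route 53 or your DNS provider of choice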

Step 1️⃣2️⃣: Monitor Application with Prometheus and Grafana

Install Prometheus

kubectl apply -f https://raw.githubusercontent.com/prometheus-operator/prometheus-operator/main/bundle.yaml

Install Grafana

kubectl apply -f https://raw.githubusercontent.com/grafana-operator/grafana-operator/main/bundle.yaml
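
Both manifests install operator CRDs and controllers. Note that the Prometheus Operator bundle contains very large CRDs, so if kubectl apply fails with an annotation-size error, kubectl apply --server-side (or kubectl create) is the usual workaround. A quick, hedged verification follows; the service names are typical defaults and depend on how you configure your Prometheus and Grafana instances afterwards:

# Check that the operator pods are running
kubectl get pods -A | grep -Ei 'prometheus|grafana'
# Example port-forwards to reach the UIs locally once instances exist
kubectl port-forward svc/prometheus-operated 9090:9090
kubectl port-forward svc/grafana-service 3000:3000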

Conclusion

This CI/CD pipeline project demonstrates the end-to-end process of automating the software development lifecycle, from code integration to deployment and monitoring. By following the steps outlined in this guide, you've successfully:

  • Set Up a Repository: Established a version-controlled environment where code can be collaboratively managed and tracked.
  • Configured Necessary Infrastructure: Provisioned AWS EC2 instances and set up essential tools like Jenkins, SonarQube, Nexus, Prometheus, and Grafana to facilitate continuous integration, deployment, and monitoring.
  • Pushed Local Code to GitHub: Centralized your codebase in a GitHub repository, enabling seamless collaboration and integration with other tools in the pipeline.
  • Built and Deployed the Application: Leveraged Jenkins to automate the build, test, and deployment processes, ensuring that your application is consistently deployed to a Kubernetes cluster on Amazon EKS.
  • Monitored Application Performance: Used Prometheus and Grafana to set up a robust monitoring system that tracks the performance and health of your application in real-time.
  • Assigned a Custom Domain: Mapped your application to a custom domain, making it accessible to users in a production-ready environment.

By integrating these tools and processes, you've created a powerful CI/CD pipeline that enhances code quality, reduces deployment time, and ensures application reliability. This setup not only accelerates the development cycle but also fosters a culture of continuous improvement, where code is regularly integrated, tested, and deployed with minimal manual intervention.

Moving forward, you can extend this pipeline by adding more advanced features, such as automated security testing, blue-green deployments, or canary releases, to further improve the robustness and scalability of your CI/CD processes.


Thank you for reading my blog … :)

© Copyright: ProDevOpsGuy


Join Our Telegram Community || Follow me for more DevOps Content.
