Amarachi Iheanacho
Automate testing and Docker image deployment to Amazon ECR with CI.

The frequency and speed of releases in modern software development require robust CI/CD pipelines that ensure code quality, security, and reliable deployments. These pipelines eliminate the errors that creep in with manual integration and deployment, some caused by inefficient or insufficient testing and others by the simple human mistakes that come from repeating the same steps over and over.

In this article, you'll build a comprehensive continuous integration (CI) pipeline that automatically tests your application, performs security scans, and pushes Docker images to Amazon ECR (Elastic Container Registry). It is the second part of this four-part DevSecOps series.

This pipeline implements DevSecOps best practices by integrating security at every stage, from code quality analysis with SonarCloud to vulnerability scanning with Snyk and Trivy. By the end of this guide, you'll have a production-ready pipeline that automatically validates your application before containerizing and storing it securely in AWS.

What this series contains

This four-part series walks you through building a modern DevSecOps pipeline for a containerized quiz application:

  1. Provision a secure EC2 jumphost using Terraform and GitHub Actions (previous article)
  2. Build a CI pipeline that tests your application and pushes Docker images to Amazon ECR (this article)
  3. Set up an Amazon EKS cluster and deploy the application with ArgoCD
  4. Add monitoring and observability using Prometheus and Grafana

Prerequisites

Ensure you have the following before proceeding with this article:

  1. Completed the first part of the series. You should have a secure EC2 jumphost provisioned using Terraform and GitHub Actions
  2. A basic understanding of Docker and containerization.

Project structure overview

This project builds on the EC2 jumphost project from the first part of the series.

Create a ci.yaml file in your .github/workflows folder. This file will hold the CI pipeline responsible for testing, validating, and pushing your application images to the Amazon ECR repositories. Your complete file structure should look like this:

.
├── backend/
├── frontend/
├── .github/
│   └── workflows/
│       ├── terraform.yaml
│       └── ci.yaml (new)
├── terraform/
└── scripts/

Pre-project setup checklist

Before getting right into building your CI pipeline, you need to do the following:

  • Set up your AWS ECR repositories
  • Set up SonarCloud for your repository
  • Set up Snyk

Follow the rest of this section to complete the steps above.

Set up your AWS ECR repositories

The Amazon ECR repositories will hold the frontend and backend Docker images your CI pipeline will create.

You need your AWS account ID and your ECR repositories to push your Docker images to the right place.

Get your account ID:

  1. Sign in to your AWS Console.
  2. Click your account name at the top right corner of the navigation bar
  3. Copy your account ID from the dropdown menu.
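
Alternatively, if you already have the AWS CLI configured locally, you can print the account ID directly; here is a quick sketch, assuming your credentials are already set up:

# Prints the 12-digit account ID of the currently configured credentials
aws sts get-caller-identity --query Account --output text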

Create your ECR repositories:

  1. Search and navigate to Elastic Container Registry (ECR) in your console.
  2. Create repositories for both frontend and backend by doing the following:

     a. Click Create repository.
     b. Set Repository name to frontend.
     c. Leave the remaining settings as default and click the Create button.
     d. Repeat the steps above to create a second repository named backend.
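
If you prefer the CLI over the console, the same two repositories can be created with the AWS CLI; this is a sketch, assuming your CLI is configured and using us-east-1 as an example region:

# Create the two ECR repositories the pipeline will push to
aws ecr create-repository --repository-name frontend --region us-east-1
aws ecr create-repository --repository-name backend --region us-east-1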

Set up SonarCloud

SonarCloud provides code quality analysis, helping you identify bugs and vulnerabilities in the code you plan to package. To enable SonarCloud, you'll need four things: a SonarCloud account and project, a SonarCloud token, an organization key, and a SonarCloud project key.

Do the following to get your SonarCloud all set up:

  • Create a SonarCloud account and project:

    1. Sign up for a free SonarCloud account using your GitHub account
    2. Click on Analyze new project to import your organization
    3. Select the quiz-application repository that will hold your CI pipeline (this is the same GitHub repository from part 1)
    4. Choose a new code definition to tell SonarCloud what it should treat as new code.
    5. Click the Create project button.
  • Generate a SonarCloud token:

    1. Select your account’s icon at the top right of the page
    2. Go to My Account -> Security
    3. Enter the name of your token
    4. Select the Generate Token button to create the token
    5. Copy the token

  • Get your organization key:

    1. Select your account’s icon at the top right of the page
    2. Select your organization (the same one that holds the quiz application GitHub repository)
    3. Copy your organization key from the top right corner of the webpage.

  • Get a SonarCloud project key:

    1. Select your organization
    2. Select the project that holds your quiz-application repository.
    3. Navigate to Administration -> Update Key.
    4. Copy your Project Key.

Important: After setting up, make sure to disable automatic analysis to avoid conflicts with your CI configuration.

Disable automatic analysis
SonarCloud recommends using only one analysis method (either CI-based or automatic) to avoid duplicate results and conflicts.

Since you’re configuring analysis through a CI pipeline, you must disable Automatic Analysis for your project.

Here is how to do it:

  1. Select your project
  2. Navigate to Administration -> Analysis method

  3. Toggle the Automatic Analysis button to disable automatic analysis.

Set up Snyk

Snyk scans for security vulnerabilities in your dependencies and Docker images.

Do the following to get a Snyk Auth Token for authenticating your CI pipeline:

  1. Create a Snyk account.
  2. Select the Choose integration option and connect Snyk to your GitHub repository.
  3. Select your account at the bottom of the sidebar.
  4. Select Account settings.

  5. Copy your authentication token from the Auth Token section.

Configure your GitHub secrets

Now that you have all your credentials, add them to your GitHub secrets so that your CI pipeline can pull them into the workflow:

  1. Select your project’s GitHub repository.
  2. Navigate to Settings -> Secrets and variables -> Actions.

  3. Click New repository secret and add the following secrets, replacing the placeholders with your actual values:

    • AWS_ACCOUNT_ID: your-account-id
    • SONAR_TOKEN: your-sonarcloud-token
    • SONAR_ORGANIZATION_KEY: your-sonarcloud-org-key
    • SONAR_URL: https://sonarcloud.io
    • SONAR_PROJECT_KEY: your-project-key
    • SNYK_TOKEN: your-snyk-api-token
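
Note that the workflow you'll add later also reads AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_REGION; you should already have these configured as repository secrets from Part 1 of the series. If you prefer the terminal, the GitHub CLI can set the same secrets; here is a sketch, assuming gh is installed and authenticated against your repository:

# Each command stores one repository secret; --body passes the value inline
gh secret set AWS_ACCOUNT_ID --body "123456789012"
gh secret set SONAR_TOKEN --body "<your-sonarcloud-token>"
gh secret set SONAR_ORGANIZATION_KEY --body "<your-sonarcloud-org-key>"
gh secret set SONAR_URL --body "https://sonarcloud.io"
gh secret set SONAR_PROJECT_KEY --body "<your-project-key>"
gh secret set SNYK_TOKEN --body "<your-snyk-api-token>"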

Understanding the CI/CD pipeline architecture

Now that you have the file structure in place and your credentials set up, it's important to understand what this CI pipeline does.

The pipeline implements a comprehensive DevSecOps workflow with the following stages:

  • Code Testing: Runs unit tests, linting, and formatting checks.
  • Quality Analysis: Performs code quality scanning using SonarCloud.
  • Security Scanning: Assesses source code vulnerabilities with Snyk.
  • Container Building: Builds a Docker image and pushes it to Amazon ECR.
  • Image Security: Scans the Docker image for vulnerabilities using Trivy and Snyk.

The pipeline is triggered on both pull requests and pushes to the main branch, ensuring code quality and security throughout the development process.
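
In the workflow file, that trigger is just a few lines at the top; a minimal sketch (the gist in the next section is the authoritative version) looks like this:

name: CI
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]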

Building the CI/CD workflow

With your project structure, credentials, and a basic understanding of the CI pipeline all set up, copy and paste this code into your .github/workflows/ci.yaml file.

https://gist.github.com/Iheanacho-ai/2ee426b821ddc2058c76956fafeb399e

After completing this section, your pipeline will be fully set up. When it runs, your applications will be tested, built into Docker images, tested again, and then pushed to AWS ECR.

But now, let's understand exactly what you just created.

Pipeline breakdown and analysis

Here is the breakdown of the pipeline in stages:

Stage 1: Application testing

The pipeline begins by thoroughly testing both the frontend and backend applications using the frontend-test and backend-test jobs.

Frontend testing through the frontend-test job

 frontend-test:
   runs-on: ubuntu-latest
   defaults:
     run:
       working-directory: ./frontend
   strategy:
     matrix:
       node-version: [20.x]
       architecture: [x64]
   steps:
     - name: Check-out git repository
       uses: actions/checkout@v4

     - name: USE NODEJS ${{ matrix.node-version }} - ${{ matrix.architecture }}
       uses: actions/setup-node@v4
       with:
         node-version: ${{ matrix.node-version }}

     - name: Install project dependencies
       working-directory: ./frontend
       run: |
         npm i
         npm run lint
         npm install --save-dev --save-exact prettier
         npm run prettier
         npm test
       env:
         CI: true

     - name: Build
       run: npm run build
       working-directory: ./frontend

     - name: Analyze with SonarCloud
       uses: sonarsource/sonarcloud-github-action@v5.0.0
       env:
         SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
       with:
         projectBaseDir: frontend
         args: >
           -Dsonar.organization=${{ secrets.SONAR_ORGANIZATION_KEY }}
           -Dsonar.projectKey=${{ secrets.SONAR_PROJECT_KEY }}
           -Dsonar.host.url=${{ secrets.SONAR_URL }}
           -Dsonar.login=${{ secrets.SONAR_TOKEN }}
           -Dsonar.sources=src/
           -Dsonar.verbose=true


In the code block above, the frontend-test job runs tests on the frontend application with the following steps:

  • Check-out git repository: Uses the actions/checkout@v4 action to check out the frontend application code into the GitHub Actions runner.
  • USE NODEJS ${{ matrix.node-version }} - ${{ matrix.architecture }}: Sets the environment to use Node.js 20.x on Ubuntu.
  • Install project dependencies: This step does the following:
    • Sets the working-directory to ./frontend
    • Installs all the npm dependencies required to run your application
    • Runs ESLint for code linting and Prettier for formatting
    • Executes the test suite with npm test
  • Build: Compiles the frontend application using the npm run build command.
  • Analyze with SonarCloud: Uses the sonarsource/sonarcloud-github-action@v5.0.0 action to perform static code analysis, identifying bugs, vulnerabilities, and code smells. Refer to the SonarCloud GitHub Action documentation for more information on using SonarCloud in your GitHub Actions workflow.

Backend testing through the backend-test job

 backend-test:
   runs-on: ubuntu-latest
   defaults:
     run:
       working-directory: ./backend
   strategy:
     matrix:
       node-version: [20.x]
       architecture: [x64]
   steps:
     - name: Check-out git repository
       uses: actions/checkout@v4

     - name: USE NODEJS ${{ matrix.node-version }} - ${{ matrix.architecture }}
       uses: actions/setup-node@v4
       with:
         node-version: ${{ matrix.node-version }}

     - name: Install project dependencies
       working-directory: ./backend
       run: |
         npm i
         npm run lint
         npm install --save-dev --save-exact prettier
         npm run prettier
         npm test
       env:
         CI: true

     # Setup sonar-scanner
     - name: Setup SonarQube
       uses: warchant/setup-sonar-scanner@v8

     - name: Analyze with SonarCloud
       uses: sonarsource/sonarcloud-github-action@v5.0.0
       env:
         SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
       with:
         projectBaseDir: backend
         args: >
           -Dsonar.organization=${{ secrets.SONAR_ORGANIZATION_KEY }}
           -Dsonar.projectKey=${{ secrets.SONAR_PROJECT_KEY }}
           -Dsonar.host.url=${{ secrets.SONAR_URL }}
           -Dsonar.login=${{ secrets.SONAR_TOKEN }}
           -Dsonar.sources=.
           -Dsonar.verbose=true


The backend-test job above mirrors the frontend-test job, running linting and the full test suite for the backend application before analyzing the code with SonarCloud.

Stage 2: Security scanning

After successfully testing the application code, the pipeline proceeds to scan both the frontend and backend applications for security vulnerabilities using the frontend-security and backend-security jobs, respectively.

Frontend security with frontend-security job

 frontend-security:
   needs: frontend-test
   runs-on: ubuntu-latest
   defaults:
     run:
       working-directory: ./frontend
   steps:
     - name: Checkout frontend code
       uses: actions/checkout@master
     - name: Run Snyk to check for vulnerabilities
       uses: snyk/actions/node@master
       continue-on-error: true
       env:
         SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}

     - name: Install Snyk CLI
       uses: snyk/actions/setup@master
       with:
         version: latest
       env:
         SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}

     - name: Snyk Authenticate
       run: snyk auth ${{ secrets.SNYK_TOKEN }}

     - name: Snyk Code Test
       run: snyk code test --all-projects
       continue-on-error: true


The frontend-security job runs on Ubuntu and starts only after the frontend-test job has completed.
It performs a security vulnerability scan on the frontend application using the following steps:

  • Checkout frontend code: Checks out the frontend application code into the GitHub Actions runner.
  • Run Snyk to check for vulnerabilities: Uses the snyk/actions/node@master action to scan for security issues. The continue-on-error: true setting ensures that the job won’t fail even if vulnerabilities are detected.
  • Install Snyk CLI: Installs the latest version of the Snyk Command Line Interface tool.
  • Snyk Authenticate: Authenticates the Snyk CLI with the provided token from GitHub Secrets.
  • Snyk Code Test: Runs static code analysis on all projects in the frontend directory to detect vulnerabilities.
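
If you want to reproduce roughly the same checks on your machine before pushing, the Snyk CLI supports the same commands locally; this is a sketch, assuming the CLI is installed and SNYK_TOKEN is exported in your shell:

cd frontend
snyk auth "$SNYK_TOKEN"   # authenticate the CLI, as the workflow does
snyk test                 # scan open-source dependencies for known vulnerabilities
snyk code test            # static analysis of your own source code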

Backend security with backend-security job

 backend-security:
   needs: backend-test
   runs-on: ubuntu-latest
   defaults:
     run:
       working-directory: ./backend
   steps:
     - name: Checkout backend code
       uses: actions/checkout@master
     - name: Run Snyk to check for vulnerabilities
       uses: snyk/actions/node@master
       continue-on-error: true # To make sure that SARIF upload gets called
       env:
         SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}

     - name: Install Snyk CLI
       uses: snyk/actions/setup@master
       with:
         version: latest
       env:
         SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}

     - name: Snyk Authenticate
       run: snyk auth ${{ secrets.SNYK_TOKEN }}

     - name: Snyk Code Test
       run: snyk code test --all-projects
       continue-on-error: true



The backend-security job mirrors the frontend-security job and runs vulnerability scans on the backend application. The only difference is its working directory:

   defaults:
     run:
       working-directory: ./backend

Stage 3: Container image creation and security

This stage builds the frontend and backend applications into Docker images, pushes them to the AWS ECR repository, and then scans the images for security vulnerabilities using Trivy and Snyk.

Build, validate, and push your frontend Docker image with the frontend-image job

https://gist.github.com/Iheanacho-ai/e8672ae025e5d4ec5be3646f42534027

The frontend-image job consists of a job definition and a series of steps that build the Dockerfile located in your frontend directory and push the resulting image to your frontend AWS ECR repository.

Below is a breakdown of the job definition and its steps.

Frontend-image job definition:

 frontend-image:
   needs: frontend-security
   runs-on: ubuntu-latest
   permissions:
     contents: read
     security-events: write
     actions: read
     id-token: write


The code block above specifies the following:

  • needs: frontend-security: Specifies that the job will only run after the frontend-security job completes successfully
  • permissions: Specifies the permissions required for this job, including:
    • contents: read: Allows the job to read the repository content
    • security-events: write: Enables uploading of vulnerability scan results.
    • actions: read: Grants read access to GitHub Actions metadata.
    • id-token: write: Allows the use of OpenID Connect (OIDC) tokens.

Once the environment is set up, the next step is to define the actions the frontend-image job will take, specifically, building and pushing the Docker image to a container registry and validating its security.

Frontend-image job steps

Here are the steps the job takes to build, push and scan the Docker images:

   steps:
     - name: Checkout the application code
       uses: actions/checkout@v4

     - name: Configure AWS credentials
       uses: aws-actions/configure-aws-credentials@v4
       with:
         aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
         aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
         aws-region: ${{ secrets.AWS_REGION }}

     - name: Build and push frontend Docker image to ECR
       working-directory: ./frontend
       run: |
         aws ecr get-login-password --region ${{ secrets.AWS_REGION }} | docker login --username AWS --password-stdin ${{ secrets.AWS_ACCOUNT_ID }}.dkr.ecr.${{ secrets.AWS_REGION }}.amazonaws.com
         IMAGE_URI=${{ secrets.AWS_ACCOUNT_ID }}.dkr.ecr.${{ secrets.AWS_REGION }}.amazonaws.com/frontend
         docker build -t ${IMAGE_URI}:latest .
         docker push ${IMAGE_URI}:latest

     - name: Run Trivy vulnerability scanner
       uses: aquasecurity/trivy-action@master
       with:
         image-ref: "${{ secrets.AWS_ACCOUNT_ID }}.dkr.ecr.${{ secrets.AWS_REGION }}.amazonaws.com/frontend:latest"
         format: "sarif"
         output: "trivy-results.sarif"
         severity: "CRITICAL,HIGH"

     - name: Install Snyk CLI
       uses: snyk/actions/setup@master
       with:
         snyk-token: ${{ secrets.SNYK_TOKEN }}

     - name: Snyk Authenticate
       run: snyk auth ${{ secrets.SNYK_TOKEN }}

     - name: Snyk Container monitor
       run: snyk container monitor ${{ secrets.AWS_ACCOUNT_ID }}.dkr.ecr.${{ secrets.AWS_REGION }}.amazonaws.com/frontend:latest --file=Dockerfile
       working-directory: ./frontend

     - name: Run Snyk to check for vulnerabilities in the Docker image
       uses: snyk/actions/docker@master
       with:
         image: ${{ secrets.AWS_ACCOUNT_ID }}.dkr.ecr.${{ secrets.AWS_REGION }}.amazonaws.com/frontend:latest
         args: --file=frontend/Dockerfile --severity-threshold=high
       env:
         SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}
       continue-on-error: true



The frontend-image job does the following:

  • Configure AWS credentials: Sets up AWS credentials from GitHub Secrets so the job can interact with ECR.
  • Build and push frontend Docker image to ECR: This step authenticates Docker with AWS ECR, builds the Docker image, and pushes it to your ECR repository. It includes:
    • aws ecr get-login-password ... | docker login ...: Logs into Amazon ECR using a token generated by AWS.
    • IMAGE_URI…: Defines the full Docker image URI, pointing to your ECR repo.
    • docker build -t ${IMAGE_URI}:latest .: Builds the Docker image from the ./frontend directory and tags it as <your-ecr-repo>:latest.
    • docker push ${IMAGE_URI}:latest: Uploads the latest version of your image to the frontend repository in ECR
  • Run Trivy vulnerability scanner: Scans the pushed Docker image for known vulnerabilities using Trivy and outputs the results in SARIF format.
  • Install Snyk CLI: Installs the Snyk CLI tool to perform additional security checks.
  • Snyk Authenticate: Logs into Snyk using your Snyk token from GitHub Secrets.
  • Snyk Container monitor: Uploads the Docker image to Snyk for continuous monitoring and alerting about new vulnerabilities as they are discovered.
  • Run Snyk to check for vulnerabilities in the Docker image: Performs a vulnerability scan of the frontend:latest Docker image in ECR. The workflow will continue even if high-severity issues are detected.
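
Note that the Trivy step writes its findings to trivy-results.sarif but, as written, the workflow doesn't upload that file anywhere. If you'd like the results to appear in your repository's Security tab, you could append an upload step like the sketch below; the security-events: write permission the job already requests is exactly what this needs:

     - name: Upload Trivy scan results to GitHub Security tab
       uses: github/codeql-action/upload-sarif@v3
       with:
         sarif_file: "trivy-results.sarif"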

Build, validate, and push your backend Docker image with the backend-image job

https://gist.github.com/Iheanacho-ai/dabc139fe35d496fa45eb2e6bca01278

The backend-image job mirrors the frontend-image job, but operates on the backend service. It builds, scans, and pushes the Docker image from the ./backend directory to the backend repository in AWS ECR.
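
As a condensed sketch of that job (the gist above is the full version, including the Trivy and Snyk scanning steps shown for the frontend), the parts that differ are the needs dependency, the working directory, and the repository name:

 backend-image:
   needs: backend-security
   runs-on: ubuntu-latest
   steps:
     - name: Checkout the application code
       uses: actions/checkout@v4

     - name: Configure AWS credentials
       uses: aws-actions/configure-aws-credentials@v4
       with:
         aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
         aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
         aws-region: ${{ secrets.AWS_REGION }}

     - name: Build and push backend Docker image to ECR
       working-directory: ./backend
       run: |
         aws ecr get-login-password --region ${{ secrets.AWS_REGION }} | docker login --username AWS --password-stdin ${{ secrets.AWS_ACCOUNT_ID }}.dkr.ecr.${{ secrets.AWS_REGION }}.amazonaws.com
         IMAGE_URI=${{ secrets.AWS_ACCOUNT_ID }}.dkr.ecr.${{ secrets.AWS_REGION }}.amazonaws.com/backend
         docker build -t ${IMAGE_URI}:latest .
         docker push ${IMAGE_URI}:latest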

Running your pipeline

Once you’ve configured your pipeline, the next step is to trigger it to build, test, and push your Docker images to AWS ECR. To do this, simply push your code to GitHub. GitHub will automatically detect the push, read the workflow file in .github/workflows, and run the pipeline.

Refer to the GitHub documentation if you are unsure how to push your local code to GitHub.
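
For example, assuming your repository's remote is already set up and you're working on the main branch:

git add .github/workflows/ci.yaml
git commit -m "Add CI pipeline for testing and ECR image pushes"
git push origin main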

After pushing your code, navigate to your project repository on GitHub and click the Actions tab to monitor your workflow.

Note: If you don’t see any workflows under the Actions tab, double-check that the ci.yaml file is placed correctly in the .github/workflows directory. Make sure there are no typos in folder or file names.

Once the workflow runs successfully, your Docker images will be available in the AWS ECR service.
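
You can also confirm the pushed images from the terminal; this sketch assumes the AWS CLI is configured with the same account and region:

# List the image tags currently stored in each repository
aws ecr describe-images --repository-name frontend --query 'imageDetails[].imageTags'
aws ecr describe-images --repository-name backend --query 'imageDetails[].imageTags'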

What's next

With your CI/CD pipeline now successfully building and pushing secure Docker images to Amazon ECR, you're ready to move on to the next phase of the series.

In Part 3, you will:

  • Set up an Amazon EKS cluster using the jumphost from Part 1
  • Deploy your applications using the Docker images built in this pipeline
  • Implement GitOps practices with ArgoCD for automated deployments

This pipeline lays the foundation for your DevSecOps workflow, ensuring that only tested, secure, and validated code is deployed to production.

Final thoughts

Building a comprehensive CI/CD pipeline requires balancing speed, security, and reliability. This pipeline demonstrates how to integrate multiple security tools and best practices while maintaining development velocity. The automated testing, security scanning, and containerization process ensures that your applications are production-ready and secure.

Remember to regularly update your dependencies, review security scan results, and continuously improve your pipeline based on your team's needs and security requirements. The DevSecOps approach implemented here provides a solid foundation for scalable, secure application delivery.

In the next article, you’ll leverage these Docker images to deploy your application to a fully managed Kubernetes cluster with ArgoCD, completing the deployment automation loop.
