Part 2 – Automating the Pipeline
In Part 1 of this series, I explained how I wanted my personal portfolio to be more than just a few HTML files sitting on GitHub Pages.
I wanted it to behave like a production-grade application: built, scanned, and deployed automatically with security woven into the process.
This is the part where things got interesting — the pipeline.
🏗️ The App Itself
Let’s keep it real: my app is not a full-stack system.
It’s a portfolio website — plain and simple.
- HTML for structure
- CSS for styling
- JavaScript for a little interactivity
That’s it.
But what made it special was how I treated it:
- I Dockerized it using an NGINX base image to serve static files.
- I pushed that image to AWS Elastic Container Registry (ECR).
- Then, I deployed it to Azure Container Apps using Terraform.
Even though the app was static, the pipeline was dynamic.
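For reference, the Dockerfile for this kind of static-site setup can be tiny. Here's a minimal sketch (assuming the site files live at the repo root — your paths may vary):

```dockerfile
# NGINX serves anything placed in its default web root
FROM nginx:alpine
COPY . /usr/share/nginx/html
```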
🔄 The Pipeline Workflow
Here’s how the GitHub Actions workflow was designed (a skeleton sketch follows the list):
- Checkout code from GitHub
- Build Docker image (NGINX serving my HTML/CSS/JS)
- Tag and push the image to AWS ECR
- Run security scans:
- Trivy → scan Docker image for vulnerabilities
- TFSEC → scan Terraform code
- SonarCloud → check code quality
- Deploy with Terraform to Azure Container Apps
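To make that sequence concrete, here is a stripped-down skeleton of how such a workflow can be laid out. Job and step names are illustrative, not my exact file; the push, scan, and deploy steps are sketched in more detail later in this post:

```yaml
name: portfolio-pipeline
on:
  push:
    branches: [main]
jobs:
  build-scan-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build Docker image
        run: docker build -t my-portfolio:${{ github.sha }} .
      # ... authenticate to AWS and push to ECR (see below) ...
      # ... run Trivy, TFSEC, and SonarCloud scans (see below) ...
      - name: Deploy with Terraform
        run: terraform init && terraform apply -auto-approve
```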
⚠️ The Roadblocks
It wasn’t smooth sailing at first.
Here are some of the biggest challenges I hit:
- SonarCloud Failures
- Initially it was a configuration issue, until I understood it clearly and passed the project key, token, and other required values into the repo's Actions secrets and variables.
- Even after that, SonarCloud kept failing the Quality Gate.
- Why? Because my project had no backend logic — just HTML, CSS, and JS.
- That meant 0% test coverage (and SonarCloud hates that).
👉 My fix: I kept SonarCloud in the pipeline, but marked it so failures didn’t block the build.
That way, I still got visibility on quality checks, but my deployments weren’t stopped.
SonarCloud failed due to 0% test coverage, since my app doesn't have any actual complex logic. Nonetheless, that doesn't change the fact that the SonarCloud scanner was integrated, and it still showcases my potential.
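In GitHub Actions, one simple way to do this is the `continue-on-error` flag on the scan step. A minimal sketch:

```yaml
- name: SonarCloud Scan
  uses: sonarsource/sonarcloud-github-action@master
  continue-on-error: true # record the result, but don't block the build
```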
Here is a simple SonarCloud workflow you can get your hands dirty with, to get started on your next project.
name: "sonar_cloud_scan_github_actions"
on:
workflow_dispatch:
jobs:
DemoSonarCloudSCan:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
with:
fetch-depth: 0
- name: SonarCloud Scan
uses: sonarsource/sonarcloud-github-action@master
env:
GITHUB_TOKEN: ${{ secrets.GIT_TOKEN }}
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
with:
args: >
-Dsonar.organization=rekhugopal
-Dsonar.projectKey=SonarCloudCodeAnalyisis
-Dsonar.python.coverage.reportPaths=coverage.xml
It is a manual-trigger workflow. To make it automatic, modify the trigger to run on both a push and when a PR is opened:
name: "sonar_cloud_scan_github_actions"
on:
push:
branches:
- main
pull_request:
branches:
- main
Also, make sure to pass in the appropriate keys, variables, and tokens as GitHub Actions secrets or environment variables.
- Docker “latest” Tag Problem
- At first, I tagged my image as latest.
- The issue? If nothing in the image changed, ECR wouldn’t show a new version.
- It felt like my push was “skipped,” even though the workflow ran.
👉 My fix: I switched to unique tags using Git commit SHA.
Here is the code block for that from my Actions workflow. One note: for the push to actually land in ECR, the tag needs the full registry path, which I've shortened here to $ECR_REGISTRY:

```bash
# $ECR_REGISTRY stands in for the full ECR URI,
# e.g. <account-id>.dkr.ecr.<region>.amazonaws.com
docker build -t $ECR_REGISTRY/my-portfolio:${{ github.sha }} .
docker push $ECR_REGISTRY/my-portfolio:${{ github.sha }}
```
This way, every commit created a brand new image version in ECR.
No confusion, easy rollback.
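For completeness, here's a sketch of how the ECR login and SHA-tagged push can be wired together using the official AWS actions (the region and repo name are placeholders, not my exact config):

```yaml
- name: Configure AWS credentials
  uses: aws-actions/configure-aws-credentials@v4
  with:
    aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
    aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    aws-region: us-east-1

- name: Login to Amazon ECR
  id: ecr-login
  uses: aws-actions/amazon-ecr-login@v2

- name: Build, tag, and push
  run: |
    IMAGE="${{ steps.ecr-login.outputs.registry }}/my-portfolio:${{ github.sha }}"
    docker build -t "$IMAGE" .
    docker push "$IMAGE"
```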
- Trivy Misconfiguration
- My first Trivy scan failed with an invalid image reference error.
- I had forgotten to include the repository name before the image hash.
👉 My fix: I updated the scan step to reference the full ECR image path.
Once fixed, Trivy scanned the image and reported vulnerabilities clearly.
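If you want to reproduce that, here's a sketch of a Trivy step using the official trivy-action. Note that image-ref uses the full registry path from the ECR login step above — forgetting that was exactly my bug:

```yaml
- name: Scan image with Trivy
  uses: aquasecurity/trivy-action@master
  with:
    # full ECR image path, not just the bare repo name
    image-ref: ${{ steps.ecr-login.outputs.registry }}/my-portfolio:${{ github.sha }}
    format: table
    severity: CRITICAL,HIGH # only gate on serious findings
    exit-code: '1'          # fail the step if any are found
```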
It was at this point that I had to make each workflow modular and separate, so I could isolate problems without messing up the other workflows or pipelines. Segmenting it made me focus like a laser! And it worked like a charm!
- Terraform Secrets
- Terraform needed my ECR token and Azure credentials.
- Storing them in plain text was not an option.
👉 My fix: I stored them securely in Terraform Cloud as sensitive variables. That way, I avoided exposing secrets in GitHub Actions or in my repo.
The rest of the fixes came down to properly passing in the right ENV values without human errors such as stray spaces, wrong casing, or extra characters, so the pipeline could pick up the right secret ENVs — I did not want to hardcode any value or secret.
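As a preview of Part 3: on the Terraform side, those values are declared as sensitive variables, with the actual values set in Terraform Cloud as sensitive workspace variables rather than committed anywhere. A minimal sketch (variable names are illustrative):

```hcl
# Values are supplied by Terraform Cloud as sensitive workspace
# variables; nothing is hardcoded in the repo.
variable "azure_client_secret" {
  type      = string
  sensitive = true
}

variable "ecr_token" {
  type      = string
  sensitive = true
}
```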
✅ The Working Pipeline
After fixing those issues, my pipeline looked like this:
```
Commit to GitHub
   ↓
GitHub Actions kicks off
   ↓
Build Docker image (NGINX + portfolio files)
   ↓
Push image to AWS ECR (tagged with commit SHA)
   ↓
Run scans (SonarCloud, TFSEC, Trivy)
   ↓
Deploy with Terraform to Azure Container Apps
```
And the best part? Every push = automatic build + scan + deploy.
In my ECR repo, each change, commit, and push produces an image that is built, tagged, and scanned before being deployed.
💡 Why This Matters
Even though my app is just static HTML/CSS/JS, the pipeline is enterprise-grade.
It shows that:
- I can build CI/CD pipelines from scratch
- I can integrate security tools (DevSecOps mindset)
- I can work across AWS + Azure
- I can solve real-world problems like failing quality gates, versioning issues, and secret management.
This is the stuff hiring managers love to see — not just “I can write code,” but “I can run a secure DevOps workflow.”
🚀 Next Up
In Part 3, I’ll walk through the Terraform setup and how I provisioned Azure Container Apps with IaC.
Spoiler: Terraform Cloud made my life much easier, but it came with its own surprises.
Stay tuned.
Click here to view part 3 ➡️