Table of Contents
1. Introduction
2. The Deployments
  2.1. Infrastructure CI/CD Pipeline
  2.2. Frontend CI/CD Pipeline
  2.3. Backend CI/CD Pipeline
3. What's Next
4. Conclusion
1. Introduction
In part one, we covered the basics of my Cloud Resume Challenge journey. Now, let's dive into the heart of the challenge: deployment pipelines. These are the backbone of our cloud setup, making deployment smooth and efficient. Join me as we explore how these pipelines work and how they make our deployment process easier.
2. The Deployments
2.1. Infrastructure CI/CD Pipeline
To streamline our infrastructure management, I've implemented a pipeline using GitHub Actions. Here's the breakdown of our workflow:
- Initialization: terraform init sets up the working directory, downloading the required providers and configuring the backend.
- Validation: terraform validate checks the configuration for syntax errors and internal consistency, without touching any real infrastructure.
- Planning: terraform plan compares the configuration against the current state and outlines what will be added, changed, or destroyed.
- Applying Changes: terraform apply executes the planned changes, updating our infrastructure to reflect the desired state.
Here is the workflow file:
```yaml
name: CI

on:
  push:
    branches: [main]

permissions:
  contents: 'read'
  id-token: 'write'

env:
  ARM_CLIENT_ID: '${{ secrets.AZURE_CLIENT_ID }}'
  ARM_CLIENT_SECRET: '${{ secrets.AZURE_CLIENT_SECRET }}'
  ARM_SUBSCRIPTION_ID: '${{ secrets.AZURE_SUBSCRIPTION_ID }}'
  ARM_TENANT_ID: '${{ secrets.AZURE_TENANT_ID }}'

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # Checks out your repository under $GITHUB_WORKSPACE, so your job can access it
      - uses: actions/checkout@v3
      - uses: hashicorp/setup-terraform@v3
      - name: Initialize terraform configuration
        run: terraform init
      - name: Validate terraform configuration
        run: terraform validate
      - name: Plan terraform configuration
        run: terraform plan -refresh=false
      - name: Apply terraform configuration
        run: terraform apply -refresh=false -auto-approve
```
Note: I used a service principal's credentials, exposed as environment variables, to authenticate Terraform to Azure. Read more about that HERE.
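As a hedged example of that one-time setup (the service principal name and scope below are placeholders, not my actual values):

```shell
# Create a service principal with Contributor rights on the subscription
az ad sp create-for-rbac --name "terraform-sp" --role Contributor \
  --scopes "/subscriptions/<subscription-id>"

# The command's output maps onto the environment variables in the workflow:
#   appId    -> ARM_CLIENT_ID
#   password -> ARM_CLIENT_SECRET
#   tenant   -> ARM_TENANT_ID
# ARM_SUBSCRIPTION_ID is the subscription you scoped the role assignment to.
```

Store these values as GitHub repository secrets rather than committing them anywhere.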
2.2. Frontend CI/CD Pipeline
For the frontend deployment pipeline, I have also used GitHub Actions workflows. The workflow includes the following steps:
- Azure Login: Authenticates access to the Azure environment.
- Delete Previous Content: Clears out existing website content to ensure a clean deployment.
- Upload New Content: Transfers the updated website content to the Azure platform.
- Purge Azure CDN: Refreshes the Azure CDN to reflect the latest changes.
- Azure Logout: Safely terminates the Azure session.
Here is the workflow file:
```yaml
name: Cloud Resume Deployment

on:
  push:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      - name: Delete previous content
        uses: azure/CLI@v1
        with:
          inlineScript: |
            az storage blob delete-batch --account-name ${{ vars.STORAGE_ACCOUNT_NAME }} --auth-mode key --source '$web'
      - name: Upload to blob storage
        uses: azure/CLI@v1
        with:
          inlineScript: |
            az storage blob upload-batch --account-name ${{ vars.STORAGE_ACCOUNT_NAME }} --auth-mode key -d '$web' -s .
      - name: Purge CDN endpoint
        uses: azure/CLI@v1
        with:
          inlineScript: |
            az cdn endpoint purge --content-paths "/*" --profile-name ${{ vars.CDN_PROFILE_NAME }} --name ${{ vars.CDN_ENDPOINT_NAME }} --resource-group ${{ vars.RESOURCE_GROUP }}
      # Azure logout
      - name: logout
        run: |
          az logout
        if: always()
```
2.3. Backend CI/CD Pipeline
I decided to switch it up a bit and use Azure DevOps to deploy the Azure Functions backend. The pipeline's build steps:
- Installing Dependencies: Ensuring all necessary dependencies are set up and ready to go.
- Publishing Deployment Artifact: Packaging everything up neatly for deployment to Azure Functions.
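As a sketch of what that Azure DevOps build stage could look like (the Python version, paths, and task versions here are my assumptions, not the exact file):

```yaml
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '3.11'

  # Install dependencies into the folder layout Azure Functions expects
  - script: pip install --target=".python_packages/lib/site-packages" -r requirements.txt
    displayName: Install dependencies

  # Package the function app and publish it as a deployment artifact
  - task: ArchiveFiles@2
    inputs:
      rootFolderOrFile: $(System.DefaultWorkingDirectory)
      includeRootFolder: false
      archiveFile: $(Build.ArtifactStagingDirectory)/functionapp.zip

  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: $(Build.ArtifactStagingDirectory)/functionapp.zip
      ArtifactName: drop
```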
The remaining deployment steps are configured in Azure DevOps by creating a release pipeline.
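For comparison, the deploy step of such a release could also be written in YAML using the AzureFunctionApp task (the service connection and app name below are placeholders):

```yaml
steps:
  - task: AzureFunctionApp@1
    inputs:
      azureSubscription: my-azure-service-connection   # placeholder service connection
      appType: functionAppLinux
      appName: my-resume-function-app                  # placeholder function app name
      package: $(Pipeline.Workspace)/drop/functionapp.zip
```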
3. What's Next
While the project is mostly complete, there's always room for improvement. Here are some areas I'll be focusing on to enhance my knowledge and experience:
- Implementing unit testing for Python code.
- Conducting code scans with SonarQube or Snyk for security and quality assurance.
- Modularizing infrastructure to make it more reusable and scalable.
- Privatizing Azure infrastructure behind a VNet (Virtual Network) for enhanced security.
- Setting up a DNS Zone and managing my custom domain for better branding and accessibility.
- Implementing monitoring solutions to ensure system health and performance optimization.
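For the first of those items, a unit-test step could eventually be slotted into the backend pipeline before packaging (a sketch; the tests/ path is a placeholder):

```yaml
# Hypothetical addition to the Azure DevOps build: run pytest before archiving
- script: |
    pip install -r requirements.txt pytest
    pytest tests/
  displayName: Run unit tests
```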
4. Conclusion
That wraps up part two of our Cloud Resume Challenge journey! We've dived deep into the deployment pipelines, seeing how they make our cloud setup smooth and efficient. From initializing Terraform to deploying our frontend and backend, each step has been crucial in streamlining our deployment process.
But our journey doesn't end here. We've outlined some areas for improvement, like implementing unit testing, conducting code scans, and making our infrastructure more modular and secure. These steps will help us further enhance our project and deepen our understanding of cloud technologies.