π°οΈ Hello Cloud Sentinels!!
May all your logs be clean and your dashboards insightful! πβοΈ
π All code, docs, and resources are available in my GitHub repository:
madhurimarawat / Cloud-Computing
This repository focuses on cloud computing and demonstrates how to set up virtual machines, S3, and other services using LocalStack. It provides a comprehensive guide to simulating AWS services locally for development and testing purposes.
Tools and Technologies βοΈπ»
1. AWS CLI
AWS Command Line Interface (CLI) is a powerful tool that allows users to interact with AWS services directly from the terminal. It simplifies managing cloud resources by providing commands for a wide range of AWS services, enabling tasks such as provisioning, managing, and automating workflows with ease.
2. LocalStack
LocalStack is a fully functional, local testing environment for AWS services. It enables developers to simulate AWS services on their local machines, facilitating the development and testing of cloud-based applications without needing access to an actual AWS account.
3. Docker
Docker is a containerization platform that allows developers to build, share, and run applications in isolated environments called containers.
In the last post,

π Implementing Cloud Monitoring and Logging π οΈ
Madhurima Rawat γ» May 19
we explored monitoring and logging and how logs are managed within the cloud.
Today, weβre diving into something equally exciting and essential:
π Cloud-based CI/CD Pipelines!
First, Iβll walk you through what CI/CD really is, why itβs a game-changer in modern development, and then weβll move on to its hands-on implementation β from seamless code integration to fully automated deployment.
In this experiment, weβll set up a Continuous Integration/Continuous Deployment (CI/CD) pipeline using GitHub Actions, Docker, and LocalStack. You'll gain practical insights into automating cloud deployments with AWS services, particularly leveraging the AWS CLI and S3 for efficient resource management.
Letβs build pipelines that flow effortlessly β like water through perfectly engineered pipes! βοΈπππ οΈ
π How It Works
1οΈβ£ GitHub Actions:
- Automates build, test, and deployment processes directly from GitHub.
- Triggers workflows on code commits, pull requests, or scheduled intervals.
2οΈβ£ Docker:
- Creates containerized environments for running applications.
- Ensures that the pipeline runs consistently across different systems.
3οΈβ£ LocalStack:
- Simulates AWS cloud services locally (S3, Lambda, DynamoDB, etc.).
- Allows developers to test AWS-related workflows without real AWS costs.
4οΈβ£ AWS CLI & S3:
- AWS CLI automates interactions with AWS services.
- S3 (Simple Storage Service) acts as a storage bucket for deployment artifacts.
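To make the picture concrete, here is a minimal sketch of how these four pieces can meet in a single workflow. It assumes LocalStack is started as a Docker service container beside the job so the AWS CLI on the runner can reach it; the workflow name and bucket name are illustrative, not from the repository:

```yaml
name: CI with LocalStack

on: [push]   # GitHub Actions: run the pipeline on every commit

jobs:
  test:
    runs-on: ubuntu-latest
    services:
      localstack:                      # Docker: LocalStack as a side container
        image: localstack/localstack
        ports:
          - 4566:4566
    steps:
      - uses: actions/checkout@v4
      - name: Create an S3 bucket in the simulated AWS
        env:
          AWS_ACCESS_KEY_ID: test      # dummy credentials; LocalStack accepts any
          AWS_SECRET_ACCESS_KEY: test
          AWS_DEFAULT_REGION: us-east-1
        run: aws --endpoint-url=http://localhost:4566 s3 mb s3://ci-test-bucket
```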
πΉ Use Cases
β Automated Deployment Pipelines
- Code is automatically tested, built, and deployed to cloud environments.
- Reduces manual intervention, ensuring faster release cycles.
β Simulating AWS Services Locally
- Developers can test AWS-dependent applications without incurring AWS costs.
- Ideal for offline development or local testing of cloud-native applications.
β Cloud-Based Workflow Testing
- Ensures infrastructure as code (IaC) principles by defining cloud setups in version control.
- Useful for DevOps teams deploying applications on AWS.
β Disaster Recovery & Backup Automation
- CI/CD can automate the creation of S3 backups and deployment rollbacks.
- Helps maintain data integrity and business continuity.
β Microservices & Serverless Development
- Supports Lambda function deployment, API Gateway integration, and event-driven applications.
- Helps teams working on serverless computing streamline their workflow.
π Real-Life Examples
π E-commerce Platforms
- Deploy new features to AWS-hosted websites seamlessly without downtime.
- Test changes in a LocalStack AWS simulation before pushing them live.
π Financial Services
- Automate deployment of fraud detection algorithms in a secure pipeline.
- Ensure compliance by testing AWS interactions locally before deploying.
π Mobile App Backend Development
- Automatically deploy backend APIs (hosted on AWS Lambda) after each successful commit.
- Use LocalStack to test S3 storage operations without using real AWS resources.
π AI/ML Model Deployment
- Automate pushing trained ML models to S3 for cloud inference.
- Use GitHub Actions to validate the model before deployment.
π― Key Benefits
πΉ Faster Development Cycles β Reduces manual deployment efforts.
πΉ Cost-Efficient Testing β Simulates AWS without incurring costs.
πΉ Reliable Cloud Automation β Ensures seamless integration & deployment.
πΉ Enhanced Security β Controlled CI/CD workflow reduces human errors.
πΉ Scalability β Easily extendable for various AWS services.
π· Visual Representation
π Version Control in CI/CD
π CI/CD Pipeline Execution
πΌοΈ About the Cover Image:
The visual begins with a cloud icon surrounded by a π CI/CD loop, symbolizing continuous integration and deployment within the cloud environment.
From the cloud, the flow branches out into three development lines, each representing different GitHub branches. These branches visually merge into one, signifying the integration into the main branch β a core concept of CI/CD pipelines.
Next, the image shows three connected user icons, representing seamless team collaboration through GitHub and CI/CD processes. It highlights how developers work together, push changes, and build confidently in sync.
π¨ The entire theme follows a black and grey palette, inspired by the GitHub aesthetic.
✅ The icons appear in green, reinforcing the success-driven nature of CI/CD pipelines, where green indicates passing builds, clean merges, and successful deployments.
CI/CD Pipelines with GitHub Actions, Docker, and LocalStack
1. Creating an S3 Bucket
Command:
aws --endpoint-url=http://localhost:4566 s3 mb s3://my-ci-cd-artifacts
Explanation:
- `aws s3 mb` → Creates a new S3 bucket.
- `s3://my-ci-cd-artifacts` → The name of the bucket being created.
- `--endpoint-url=http://localhost:4566` → Uses LocalStack to simulate AWS services.
Output:
make_bucket: my-ci-cd-artifacts
2. Attempting to Create a CodeCommit Repository
Command:
aws --endpoint-url=http://localhost:4566 codecommit create-repository --repository-name my-repo
Explanation:
- `aws codecommit create-repository` → Creates a new AWS CodeCommit repository.
- `--repository-name my-repo` → Assigns the repository name as `my-repo`.
- `--endpoint-url=http://localhost:4566` → Uses LocalStack.
Error Output:
An error occurred (InternalFailure) when calling the
CreateRepository operation: API for service 'codecommit'
not yet implemented or pro feature - please check
https://docs.localstack.cloud/references/coverage/ for further information
3. Initializing a Git Repository
Command:
git init
Explanation:
- `git init` → Initializes a new Git repository in the current directory.
Output:
Initialized empty Git repository in C:/Users/rawat/Documents/8
SEMESTER/Cloud Computing/Lab/Experiment 10/Codes/.git/
4. Staging and Committing Files
Commands:
git add .
git commit -m "Initial commit"
Explanation:
- `git add .` → Stages all files for commit.
- `git commit -m "Initial commit"` → Commits the staged files with a message.
Output:
[master (root-commit) 2dfb5b6] Initial commit
3 files changed, 1153 insertions(+)
create mode 100644 Command Prompt Input and Output Explanation.md
create mode 100644 Command Prompt Input and Output Explanation.pdf
create mode 100644 Command Prompt Input and Output.txt
5. Uploading a ZIP File to S3
Command:
aws --endpoint-url=http://localhost:4566 s3 cp my-code.zip s3://my-ci-cd-artifacts/
Explanation:
- `aws s3 cp` → Copies a file to S3.
- `my-code.zip` → The file being uploaded.
- `s3://my-ci-cd-artifacts/` → Destination bucket in S3.
- `--endpoint-url=http://localhost:4566` → Uses LocalStack.
Error Output:
The user-provided path my-code.zip does not exist.
6. Creating a ZIP Archive
Command:
powershell Compress-Archive -Path * -DestinationPath my-code.zip
Explanation:
- `Compress-Archive -Path * -DestinationPath my-code.zip` → Creates a ZIP archive of all files in the directory.
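`Compress-Archive` is Windows-only; if the same step is needed on another OS, a small Python sketch using only the standard library can do the equivalent job (the function name here is my own, not from the repository):

```python
import zipfile
from pathlib import Path


def zip_directory(src_dir: str, dest_zip: str) -> None:
    """Archive every file under src_dir into dest_zip,
    mirroring `Compress-Archive -Path * -DestinationPath my-code.zip`."""
    src = Path(src_dir)
    dest = Path(dest_zip)
    with zipfile.ZipFile(dest, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in src.rglob("*"):
            # Skip the archive itself in case it is being written inside src_dir
            if path.is_file() and path.resolve() != dest.resolve():
                zf.write(path, path.relative_to(src))
```

Calling `zip_directory(".", "my-code.zip")` from the project folder then produces the same artifact the `aws s3 cp` step expects.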
7. Uploading the ZIP File Again
Command:
aws --endpoint-url=http://localhost:4566 s3 cp my-code.zip s3://my-ci-cd-artifacts/
Output:
upload: .\my-code.zip to s3://my-ci-cd-artifacts/my-code.zip
8. Listing the Uploaded Files in S3
Command:
aws --endpoint-url=http://localhost:4566 s3 ls s3://my-ci-cd-artifacts/
Output:
2025-03-08 10:32:42 289415 my-code.zip
9. Setting Up a Remote Git Repository
Commands:
git remote add origin https://github.com/madhurimarawat/Cloud-Computing.git
git branch -M main
git push -u origin main
Explanation:
- `git remote add origin <repo-url>` → Links the local repository to GitHub.
- `git branch -M main` → Renames the current branch to `main`.
- `git push -u origin main` → Pushes the code to GitHub.
Error Output:
To https://github.com/madhurimarawat/Cloud-Computing.git
 ! [rejected]        main -> main (fetch first)
error: failed to push some refs to 'https://github.com/madhurimarawat/Cloud-Computing.git'
hint: Updates were rejected because the remote contains work that you do not
hint: have locally. This is usually caused by another repository pushing to
hint: the same ref. If you want to integrate the remote changes, use
hint: 'git pull' before pushing again.
Fix:
To resolve this issue, run:
git pull origin main --rebase
git push -u origin main
10. Pulling the Latest Changes from GitHub
Command:
git pull origin main --rebase
Explanation:
- Fetches changes from the remote repository and applies them using rebase instead of a merge.
- Ensures a linear commit history by reapplying local changes on top of the latest remote changes.
Output:
remote: Enumerating objects: 240, done.
remote: Counting objects: 100% (240/240), done.
remote: Compressing objects: 100% (212/212), done.
remote: Total 240 (delta 100), reused 43 (delta 21), pack-reused 0
Receiving objects: 100% (240/240), 9.22 MiB | 1.11 MiB/s, done.
Resolving deltas: 100% (100/100), done.
From https://github.com/madhurimarawat/Cloud-Computing
* branch main -> FETCH_HEAD
* [new branch] main -> origin/main
Successfully rebased and updated refs/heads/main.
11. Staging All Changes
Command:
git add .
Explanation:
- Stages all modified and newly created files in the current directory for the next commit.
12. Checking for an Ongoing Rebase
Command:
git rebase --continue
Explanation:
- Used to continue an ongoing rebase operation if there are conflicts.
- In this case, the error means there was no ongoing rebase, so this step was unnecessary.
Output:
fatal: no rebase in progress
13. Pushing Changes to GitHub
Command:
git push -u origin main
Explanation:
- Pushes local changes to the remote repository (`origin`), setting `main` as the upstream branch.
- This makes future `git push` commands simpler by automatically pushing to `origin main`.
Output:
Enumerating objects: 6, done.
Counting objects: 100% (6/6), done.
Delta compression using up to 8 threads
Compressing objects: 100% (5/5), done.
Writing objects: 100% (5/5), 282.43 KiB | 31.38 MiB/s, done.
Total 5 (delta 1), reused 1 (delta 0), pack-reused 0
remote: Resolving deltas: 100% (1/1), completed with 1 local object.
To https://github.com/madhurimarawat/Cloud-Computing.git
b201b02..eb4faf7 main -> main
branch 'main' set up to track 'origin/main'.
Output Breakdown:
- Delta compression → Reduces the size of transmitted data.
- Objects written successfully → Confirms the push was successful.
- Tracking branch set up → Future `git push` commands will default to `origin main`.
14. Viewing the YAML Deployment Workflow
Purpose:
This GitHub Actions workflow automates a manual deployment process by performing the following steps:
Workflow Breakdown:
- Triggering the Workflow
  - The workflow is manually triggered using `workflow_dispatch`, meaning it does not run automatically on commits or merges.
- Job Execution
  - A single job named `deploy` is executed on `ubuntu-latest`, the default GitHub-hosted runner.
Steps in the Workflow:
- Checkout Repository
  - Uses `actions/checkout@v4` to fetch the repository contents into the GitHub Actions runner.
- (Optional) Install AWS CLI
  - This step is commented out but would install the AWS CLI if needed.
- (Optional) Zip the Repository
  - Another commented-out step that creates a ZIP archive of the repository.
- (Optional) Upload to LocalStack S3
  - Demonstrates an attempt to upload the ZIP file to a **LocalStack S3 bucket**.
  - **⚠️ This step does NOT work in GitHub Actions**, since LocalStack would need to be running on the same machine.
- Print Success Message
  - Prints `"Successfully run!"` to indicate that the workflow has executed.
Key Considerations:
- This workflow is primarily a template for deploying to LocalStack S3.
- Since GitHub Actions runs on cloud-hosted VMs, it cannot access LocalStack running locally.
- We can modify this workflow to deploy to a real AWS S3 bucket by configuring proper AWS credentials.
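The workflow file itself is not reproduced in this post, but based on the breakdown above it would look roughly like the following sketch (step names and comments are reconstructed, not copied from the repository):

```yaml
name: Deploy to LocalStack S3

on:
  workflow_dispatch:        # manual trigger only; no run on commits or merges

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      # - name: Install AWS CLI
      #   run: pip install awscli

      # - name: Zip the repository
      #   run: zip -r my-code.zip .

      # - name: Upload to LocalStack S3
      #   # Would fail here: LocalStack is not running on the GitHub-hosted runner
      #   run: aws --endpoint-url=http://localhost:4566 s3 cp my-code.zip s3://my-ci-cd-artifacts/

      - name: Print success message
        run: echo "Successfully run!"
```

If the upload step were actually needed in CI, one option is to start LocalStack as a job service container, or to point the CLI at a real S3 bucket with proper AWS credentials configured as repository secrets.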
π Want to see how everything came together step by step? Check it out here:
π Experiment 10 Output (PDF)
π§ Curious about the exact commands and how they function? Explore the detailed input-output flow:
π₯οΈ CI/CD Setup Input-Output Flow (PDF)
π And thatβs a wrap on Cloud-based CI/CD Pipelines!
π Got awesome CI/CD tips, resources, or cheat sheets? Feel free to drop them in the comments β Iβd love to explore and share them!
π Want a compact version of all experiments with explanations and outputs in one place?
Grab the complete PDF here:
π Cloud Computing Lab Manual (PDF)
π This is the final article in this series.
Thank you all so much for sticking with me and exploring the world of Cloud Computing through each of these posts! Itβs been an incredible journey, and I hope you found each article insightful and practical.
π¬ Iβd love to hear from you:
β¨ Which article was your favorite?
β¨ How did you like the series overall?
β¨ And most importantly β what should I explore next?
Here are a few exciting ideas Iβm considering for the next series:
- π Big Data Analytics using Hadoop
- π’ Data Warehousing using MySQL
- ππ§Ή Data Wrangling using Python and MySQL
- π» A community-powered series based on my repo:
madhurimarawat / CodeCulture-Daily
A daily programming challenge repository where fun meets learning! With 39 challenges over 39 days, it helps coders enhance skills through practical tasks and interview prep. While maintenance is paused, contributions are welcome, and future expansions are planned to keep learning ongoing.
π Let me know in the comments which one you'd love to dive into next!
Until the next article β goodbye for now! π
Letβs meet again in the next series. ππ»π