Hello DevOps community!
I'm excited to share the highlights and achievements of my Week 2 in the DevOps journey. This week was intense, insightful, and filled with hands-on experience across AWS, Azure, SSH, Linux commands, CLI tools, and team collaboration frameworks.
Cloud Platforms – AWS and Azure
This week, I took a deep dive into AWS and Azure, focusing on real-world, hands-on learning.
Key Accomplishments:

- Created AWS and Azure accounts.
- Launched and configured virtual machines (instances) on AWS.
- Learned to create EC2 instances on AWS and manage key configurations.
- Successfully connected to AWS instances using SSH via private .pem key files.
Major Challenge (and Proud Moment):
I created the instance from Windows, so the private key file was downloaded there. I then needed to connect to that AWS instance from my Ubuntu system.
To solve this:

- I shared the .pem file from Windows to Ubuntu.
- Used Add Location in the file manager, entered the system's IP, and securely moved the .pem file.
- In VirtualBox, I used Shared Folders to manage file access between the host and the VM.

This took almost two days of research and troubleshooting, but I cracked it!
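For reference, here is a minimal sketch of how the key transfer can finish once the shared folder is in place. The mount point, key file name, and IP address below are placeholders, not my actual values.

```bash
# Copy the key out of the VirtualBox shared folder
# (the mount point and key name are placeholders)
cp /media/sf_shared/my-key.pem ~/.ssh/my-key.pem

# SSH refuses private keys that other users can read,
# so lock the permissions down first
chmod 400 ~/.ssh/my-key.pem

# Connect to the instance (swap in your instance's public IP)
ssh -i ~/.ssh/my-key.pem ubuntu@203.0.113.10
```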
SSH, Linux, and Shell Scripting
Understanding how to securely connect and automate processes is vital in DevOps.
Linux Commands Mastered:

- history – track command history
- ls -ltr, cat, touch, cd, pwd – navigation and file handling
- nproc – check the number of CPU cores
- chmod 777 – permission setup
- sudo – run commands as superuser
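To tie a few of these together, here is a small example session (the file name is just a placeholder):

```bash
pwd                   # print the current working directory
touch notes.txt       # create an empty file
ls -ltr               # long listing, sorted oldest to newest
cat notes.txt         # show the file's contents (empty for now)
chmod 777 notes.txt   # give everyone read/write/execute (fine in a lab, too open for real servers)
nproc                 # number of CPU cores available
history | tail -n 5   # the last five commands from the shell history
```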
SSH & Shell Scripting:

- Connected to EC2 instances using ssh -i "key.pem" ubuntu@ip-address.
- Automated tasks with shell scripts instead of executing commands one by one.
Sample shell workflow:
```bash
#!/bin/bash
sudo apt update
sudo apt install apache2 -y
echo "Apache Installed Successfully!"
```
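Assuming the script is saved as install_apache.sh (a name I'm using only for this example), running it looks like this:

```bash
chmod +x install_apache.sh   # make the script executable
./install_apache.sh          # run it; apt may prompt for your sudo password
```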
AWS CLI and S3
I also started working with AWS CLI to perform cloud operations from the terminal.
Hands-On with AWS CLI:

- Installed and configured the AWS CLI using aws configure.
- Created S3 buckets from the CLI.
- Uploaded folders and files to AWS S3 via the CLI.
- Learned to manage access policies and permissions.
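The commands I ran looked roughly like the sketch below; the bucket name and file paths are placeholders.

```bash
aws configure                                      # prompts for access key, secret key, default region, and output format
aws s3 mb s3://my-devops-week2-bucket              # create a bucket (names must be globally unique)
aws s3 cp notes.txt s3://my-devops-week2-bucket/   # upload a single file
aws s3 cp ./reports s3://my-devops-week2-bucket/reports --recursive   # upload a whole folder
aws s3 ls s3://my-devops-week2-bucket              # list the bucket's contents
```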
Team Collaboration & Agile Methodologies
This week, I also explored how cross-functional teams work in a DevOps environment.
**Understanding Team Roles:**

- **Business Analyst** – Gathers user requirements.
- **Product Owner** – Prioritizes and refines requirements.
- **Product Manager** – Aligns the product roadmap with business goals.
- **Software Architect** – Designs the system architecture.
- **UI/UX Designer** – Plans the user interface and experience.
- **Developers** – Implement features.
- **QA/Testers** – Ensure quality through testing.
- **DevOps Engineer** – Bridges development and operations.
- **SRE (Site Reliability Engineer)** – Ensures system uptime and performance.
Tools Explored:

- Jira – For Agile documentation, sprint planning, and issue tracking.
- Scrum Framework – Learned ceremonies such as Sprint Planning, Daily Standups, Reviews, and Retrospectives.
Infrastructure as Code (IaC)
While I haven't used it yet, I explored Terraform and learned that:

- It's a powerful Infrastructure as Code tool.
- It supports multiple cloud platforms, including AWS, Azure, and GCP.
- I'll be focusing on it in upcoming weeks for automated deployments.
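I haven't run Terraform myself yet, but from what I've read the basic CLI workflow looks something like this (shown only as a preview of what's coming):

```bash
terraform init      # download the provider plugins referenced in the configuration
terraform plan      # preview the changes Terraform would make
terraform apply     # create or update the infrastructure after confirmation
terraform destroy   # tear the resources back down when finished
```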
**Week 2 Summary**

- Created and connected virtual machines on AWS & Azure
- Managed private key sharing between Windows and Ubuntu
- Mastered shell scripting and basic Linux CLI
- Configured and used the AWS CLI to manage S3 buckets
- Learned DevOps team structure and Agile collaboration
- Explored tools like Jira and Scrum
- Discovered Terraform and IaC principles
Onward to Week 3…
Week 2 was all about getting hands-on, solving real-world issues, and building confidence.
I'm excited to go even deeper into automation, CI/CD, and containerization in the next leg of my journey.
Connect with me if you're on a similar DevOps path. Let's grow together!