<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Jeshlin P V</title>
    <description>The latest articles on DEV Community by Jeshlin P V (@jeshlin_pv_1628a63168e90).</description>
    <link>https://dev.to/jeshlin_pv_1628a63168e90</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1688599%2F435d50ba-7a34-4b83-8bf6-a7b70bdad0fa.png</url>
      <title>DEV Community: Jeshlin P V</title>
      <link>https://dev.to/jeshlin_pv_1628a63168e90</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/jeshlin_pv_1628a63168e90"/>
    <language>en</language>
    <item>
      <title>How I Passed the AZ-900 Exam: My Study Plan, Resources &amp; Tips</title>
      <dc:creator>Jeshlin P V</dc:creator>
      <pubDate>Sat, 01 Mar 2025 04:46:18 +0000</pubDate>
      <link>https://dev.to/jeshlin_pv_1628a63168e90/how-i-passed-the-az-900-exam-my-study-plan-resources-tips-2cam</link>
      <guid>https://dev.to/jeshlin_pv_1628a63168e90/how-i-passed-the-az-900-exam-my-study-plan-resources-tips-2cam</guid>
      <description>&lt;p&gt;In this blog, I'll share my experience, the resources I used, and the strategies that helped me pass the exam with confidence.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why I Took the AZ-900 Exam&lt;/strong&gt;&lt;br&gt;
As someone deeply interested in cloud computing, I wanted to solidify my understanding of Microsoft Azure and its core services. Since I already had hands-on experience with AWS, I thought learning Azure fundamentals would broaden my cloud knowledge and open up more opportunities.&lt;/p&gt;

&lt;p&gt;AZ-900 is an entry-level certification designed for beginners in cloud computing, making it perfect for students, IT professionals, and even non-technical individuals who want to understand cloud concepts.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;My Study Plan&lt;/strong&gt;&lt;br&gt;
I set aside two weeks to prepare for the exam. Here's how I structured my study plan:&lt;br&gt;
&lt;strong&gt;Week 1&lt;/strong&gt;: Focused on learning concepts from video tutorials&lt;br&gt;
&lt;strong&gt;Week 2&lt;/strong&gt;: Took practice tests and reviewed weak areas&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Resources I Used&lt;/strong&gt;&lt;br&gt;
I relied on two main resources that were incredibly helpful:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Adam Marczak's YouTube Videos&lt;/strong&gt;&lt;br&gt;
Adam Marczak is one of the best instructors for Azure. His free YouTube playlist on AZ-900 covers all topics in a simple, engaging, and practical manner. His real-world explanations helped me grasp concepts quickly. &lt;a href="https://youtube.com/playlist?list=PLGjZwEtPN7j-Q59JYso3L4_yoCjj2syrM&amp;amp;si=A7bUcmXgb0j9C39-" rel="noopener noreferrer"&gt;Link&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Tip&lt;/strong&gt;: Take notes while watching the videos and try to relate the concepts to real-world scenarios.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Udemy Practice Tests&lt;/strong&gt;&lt;br&gt;
Udemy provides high-quality practice tests that helped me understand the exam pattern and identify weak areas. The detailed explanations for each question were useful in reinforcing my learning. &lt;a href="https://www.udemy.com/course/practice-tests-azure-fundamentals/" rel="noopener noreferrer"&gt;Link&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Tip&lt;/strong&gt;: Don't just memorize answers - understand the logic behind them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Topics You Should Focus On&lt;/strong&gt;&lt;br&gt;
AZ-900 tests your understanding of Azure's core services, pricing, security, and compliance. Here are the key topics I focused on:&lt;br&gt;
 &lt;strong&gt;Cloud Concepts&lt;/strong&gt; - Types of cloud models &amp;amp; benefits&lt;br&gt;
 &lt;strong&gt;Azure Core Services&lt;/strong&gt; - Compute, Storage, Networking&lt;br&gt;
 &lt;strong&gt;Security &amp;amp; Compliance&lt;/strong&gt; - Identity, Governance, Policies&lt;br&gt;
 &lt;strong&gt;Azure Pricing &amp;amp; Support&lt;/strong&gt; - Cost management, SLAs&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;My Exam Day Experience&lt;/strong&gt;&lt;br&gt;
On the day of the exam:&lt;br&gt;
✔ Revised my notes for 30 minutes&lt;br&gt;
✔ Managed my time well during the exam&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Final Tips for AZ-900 Success&lt;/strong&gt;&lt;br&gt;
 &lt;strong&gt;1. Watch Adam Marczak's videos&lt;/strong&gt; - They simplify complex topics&lt;br&gt;
 &lt;strong&gt;2. Take Udemy practice tests&lt;/strong&gt; - Helps in self-assessment&lt;br&gt;
 &lt;strong&gt;3. Use the Microsoft Learn portal&lt;/strong&gt; - Explore official Azure documentation&lt;br&gt;
 &lt;strong&gt;4. Understand, don't memorize&lt;/strong&gt; - Focus on concepts&lt;br&gt;
 &lt;strong&gt;5. Stay confident on exam day&lt;/strong&gt; - You got this!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;My Takeaway&lt;/strong&gt;&lt;br&gt;
Passing AZ-900 boosted my confidence in cloud computing and helped me understand Microsoft Azure better. If you're preparing for the exam, trust the process, use the right resources, and you'll definitely succeed!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Automated Image Labeling System Using AWS Rekognition: A Serverless Approach</title>
      <dc:creator>Jeshlin P V</dc:creator>
      <pubDate>Sat, 17 Aug 2024 12:06:37 +0000</pubDate>
      <link>https://dev.to/jeshlin_pv_1628a63168e90/automated-image-labeling-system-using-aws-rekognition-a-serverless-approach-1aia</link>
      <guid>https://dev.to/jeshlin_pv_1628a63168e90/automated-image-labeling-system-using-aws-rekognition-a-serverless-approach-1aia</guid>
      <description>&lt;p&gt;To create an image labeling system using AWS services, you can leverage AWS Rekognition, a powerful image and video analysis service. Below are the general steps to set up an image labeling system:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-by-Step Guide&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Set Up Amazon S3 and IAM&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Go to the S3 console and create a new bucket.&lt;br&gt;
Upload images to this bucket. I uploaded an image named 'street.jpg'.&lt;br&gt;
Configure IAM Role: Ensure the IAM role has the necessary permissions to access S3 and use Rekognition.&lt;br&gt;
Attach the following policies to the IAM role:&lt;/p&gt;

&lt;p&gt;- AmazonRekognitionFullAccess&lt;br&gt;
- AmazonS3ReadOnlyAccess&lt;/p&gt;
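&lt;p&gt;If you prefer to script this setup rather than use the console, the same steps can be sketched with Boto3. This is a minimal sketch, not the article's original procedure: the name check is a simplified approximation of the S3 naming rules, and the bucket, photo, and region values are placeholders to replace with your own.&lt;/p&gt;

```python
import re

def is_valid_bucket_name(name):
    # Simplified S3 naming check: 3-63 chars of lowercase letters,
    # digits, and hyphens, starting and ending with a letter or digit
    return bool(re.fullmatch(r"[a-z0-9][a-z0-9-]{1,61}[a-z0-9]", name))

def create_bucket_and_upload(bucket, photo, region="us-east-1"):
    # Programmatic equivalent of the console steps above
    import boto3  # imported here so the name check runs without boto3 installed
    s3 = boto3.client("s3", region_name=region)
    if region == "us-east-1":
        s3.create_bucket(Bucket=bucket)  # us-east-1 rejects a LocationConstraint
    else:
        s3.create_bucket(
            Bucket=bucket,
            CreateBucketConfiguration={"LocationConstraint": region})
    s3.upload_file(photo, bucket, photo)  # object key defaults to the filename
```

&lt;p&gt;For example, create_bucket_and_upload('image-labels-bucket', 'street.jpg') would reproduce the bucket and object used by the script below.&lt;/p&gt;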

&lt;p&gt;&lt;strong&gt;2. Use AWS Rekognition to Detect Labels&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Python Script: Use Boto3, the AWS SDK for Python, to interact with Rekognition.&lt;br&gt;
To configure AWS, run the command "aws configure".&lt;br&gt;
Create a working directory and create a file named 'labels.py'.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fql729j0c8pcnxqnr94du.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fql729j0c8pcnxqnr94du.png" alt="Image description" width="583" height="278"&gt;&lt;/a&gt;&lt;br&gt;
Copy and paste the following code into 'labels.py'.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import boto3

def detect_labels(photo, bucket):
    # Rekognition reads the image directly from the S3 bucket
    client = boto3.client('rekognition')

    response = client.detect_labels(Image={'S3Object': {'Bucket': bucket, 'Name': photo}},
                                    MaxLabels=10)

    # Each label carries a name and a confidence percentage
    print('Detected labels for ' + photo)
    for label in response['Labels']:
        print(f"{label['Name']} : {label['Confidence']}")

def main():
    bucket = 'image-labels-bucket'
    photo = 'street.jpg'
    detect_labels(photo, bucket)

if __name__ == "__main__":
    main()

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;3. Run the Script&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Run your Python script:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python labels.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output:&lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftiz7dw6glwaj8910aue0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftiz7dw6glwaj8910aue0.png" alt="Image description" width="420" height="212"&gt;&lt;/a&gt;&lt;br&gt;
The image I used (street.jpg):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwlmlt2rcqeimdgijdriy.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwlmlt2rcqeimdgijdriy.jpg" alt="Image description" width="800" height="533"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;4. Test with Different Images&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can test the script with different images by uploading them to the bucket and updating the photo variable in the script.&lt;/p&gt;
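&lt;p&gt;A lightweight alternative, sketched here as an optional addition, is to read the photo and bucket names from the command line so the file never needs editing. The defaults simply mirror the values hard-coded in labels.py.&lt;/p&gt;

```python
def parse_args(argv):
    # argv mirrors sys.argv: the script name first, then an optional
    # photo key and bucket name; defaults match those used in labels.py
    photo = argv[1] if len(argv) > 1 else "street.jpg"
    bucket = argv[2] if len(argv) > 2 else "image-labels-bucket"
    return photo, bucket
```

&lt;p&gt;In main(), replace the hard-coded values with photo, bucket = parse_args(sys.argv); then "python labels.py dog.jpg" labels a different image, assuming it has already been uploaded to the bucket.&lt;/p&gt;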

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This project effectively demonstrates the use of AWS Rekognition to automate image labeling, providing an efficient and scalable solution for processing visual data. By utilizing a serverless architecture, the system is both cost-effective and easy to maintain, making it a practical approach for real-world applications.&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>aws</category>
      <category>cloudcomputing</category>
      <category>cloudskills</category>
    </item>
    <item>
      <title>Design and Implementation of a Fault-Tolerant VPC Architecture with Multi-AZ High Availability on AWS</title>
      <dc:creator>Jeshlin P V</dc:creator>
      <pubDate>Tue, 09 Jul 2024 07:42:44 +0000</pubDate>
      <link>https://dev.to/jeshlin_pv_1628a63168e90/design-and-implementation-of-a-fault-tolerant-vpc-architecture-with-multi-az-high-availability-on-aws-oop</link>
      <guid>https://dev.to/jeshlin_pv_1628a63168e90/design-and-implementation-of-a-fault-tolerant-vpc-architecture-with-multi-az-high-availability-on-aws-oop</guid>
      <description>&lt;p&gt;In today's digital era, ensuring high availability and fault tolerance for applications is paramount. AWS offers robust tools and services to help achieve these goals, making it possible to build resilient, scalable, and secure infrastructure. This blog will walk you through the design and implementation of a custom Virtual Private Cloud (VPC) infrastructure on AWS, focusing on fault tolerance and high availability across multiple Availability Zones (AZs).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;br&gt;
Creating a fault-tolerant architecture means designing systems that continue to operate even when some components fail. By leveraging AWS services such as VPC, subnets, Internet Gateway (IGW), NAT Gateway, Application Load Balancer (ALB), and Auto Scaling Groups, you can build a highly resilient infrastructure that ensures your applications remain available and performant.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Project Overview&lt;/strong&gt;&lt;br&gt;
The primary objective of this project is to set up a custom VPC with the following key components:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;VPC: A custom VPC with a defined CIDR block.&lt;/li&gt;
&lt;li&gt;Subnets: Two public subnets and two private subnets spread across two different Availability Zones (AZs) to ensure high availability.&lt;/li&gt;
&lt;li&gt;Internet Gateway (IGW): Provides internet access to resources in the public subnets.&lt;/li&gt;
&lt;li&gt;NAT Gateway: Allows instances in the private subnets to access the internet securely for updates and patches.&lt;/li&gt;
&lt;li&gt;Route Tables: Separate route tables for public and private subnets to manage traffic routing.&lt;/li&gt;
&lt;li&gt;Application Load Balancer (ALB): Distributes incoming traffic across multiple instances in different AZs.&lt;/li&gt;
&lt;li&gt;Auto Scaling Group: Manages and scales instances in the private subnets automatically based on demand.&lt;/li&gt;
&lt;/ol&gt;
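&lt;p&gt;The first four components can be sketched in Terraform roughly as follows. The CIDR blocks, names, and AZ are illustrative assumptions rather than the project's exact values, the second AZ's subnets follow the same pattern, and the aws_eip syntax assumes a recent AWS provider:&lt;/p&gt;

```hcl
resource "aws_vpc" "main" {
  cidr_block = "10.0.0.0/16"
}

resource "aws_subnet" "public_a" {
  vpc_id                  = aws_vpc.main.id
  cidr_block              = "10.0.1.0/24"
  availability_zone       = "us-east-1a"
  map_public_ip_on_launch = true
}

resource "aws_subnet" "private_a" {
  vpc_id            = aws_vpc.main.id
  cidr_block        = "10.0.3.0/24"
  availability_zone = "us-east-1a"
}

resource "aws_internet_gateway" "igw" {
  vpc_id = aws_vpc.main.id
}

# The NAT Gateway needs an Elastic IP and lives in a public subnet
resource "aws_eip" "nat" {
  domain = "vpc"
}

resource "aws_nat_gateway" "nat" {
  allocation_id = aws_eip.nat.id
  subnet_id     = aws_subnet.public_a.id
}
```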

&lt;p&gt;&lt;strong&gt;Architecture Design&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Multi-AZ Deployment&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Distributing resources across multiple AZs enhances fault tolerance by ensuring that an outage in one AZ doesn't bring down the entire system. This architecture includes:&lt;/p&gt;

&lt;p&gt;Public Subnets: Two public subnets located in different AZs.&lt;br&gt;
Private Subnets: Two private subnets located in different AZs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Internet Gateway and NAT Gateway&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;An Internet Gateway is attached to the VPC to facilitate inbound and outbound internet traffic for public-facing resources. A NAT Gateway is deployed in one of the public subnets to provide internet access to instances in the private subnets, ensuring they can download updates and patches securely without exposing them directly to the internet.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Route Tables&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Two separate route tables are configured:&lt;/p&gt;

&lt;p&gt;Public Route Table: Directs traffic from public subnets to the Internet Gateway.&lt;br&gt;
Private Route Table: Routes traffic from private subnets to the NAT Gateway.&lt;/p&gt;
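&lt;p&gt;In Terraform, the two route tables might look like this sketch; the referenced names (aws_vpc.main, aws_internet_gateway.igw, aws_nat_gateway.nat) are illustrative and must match resources defined in your own configuration:&lt;/p&gt;

```hcl
resource "aws_route_table" "public" {
  vpc_id = aws_vpc.main.id
  route {
    cidr_block = "0.0.0.0/0"  # all internet-bound traffic via the IGW
    gateway_id = aws_internet_gateway.igw.id
  }
}

resource "aws_route_table" "private" {
  vpc_id = aws_vpc.main.id
  route {
    cidr_block     = "0.0.0.0/0"  # outbound-only access via the NAT Gateway
    nat_gateway_id = aws_nat_gateway.nat.id
  }
}

# Each subnet is then associated with its matching table; the private
# and second-AZ associations follow the same pattern
resource "aws_route_table_association" "public_a" {
  subnet_id      = aws_subnet.public_a.id
  route_table_id = aws_route_table.public.id
}
```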

&lt;p&gt;&lt;strong&gt;Application Load Balancer (ALB)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The ALB is deployed in the public subnets across multiple AZs, distributing incoming traffic to healthy instances. This setup ensures continuous availability and load distribution, even if an instance or an AZ fails.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Auto Scaling Group&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The Auto Scaling Group launches instances in the private subnets, ensuring that the application scales automatically based on demand. By spreading instances across multiple AZs, the architecture can withstand AZ-level failures and maintain the desired capacity.&lt;/p&gt;
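&lt;p&gt;A hedged Terraform sketch of such a group, with illustrative names and sizes, a placeholder AMI, and subnet and target-group references that must exist in your own configuration:&lt;/p&gt;

```hcl
resource "aws_launch_template" "app" {
  name_prefix   = "app-"
  image_id      = "ami-0c55b159cbfafe1f0"  # placeholder AMI
  instance_type = "t2.micro"
}

resource "aws_autoscaling_group" "app" {
  min_size            = 2
  max_size            = 4
  desired_capacity    = 2
  # Spreading the group across both private subnets gives AZ-level resilience
  vpc_zone_identifier = [aws_subnet.private_a.id, aws_subnet.private_b.id]
  # Registers instances with the ALB's target group
  target_group_arns   = [aws_lb_target_group.app.arn]

  launch_template {
    id      = aws_launch_template.app.id
    version = "$Latest"
  }
}
```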

&lt;p&gt;&lt;strong&gt;Fault Tolerance Features&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Multi-AZ Distribution: By deploying resources in multiple AZs, the architecture can handle the failure of an entire AZ without affecting the application's availability.&lt;/li&gt;
&lt;li&gt;Elastic Load Balancer: The ALB automatically routes traffic to healthy instances. If an instance or an AZ fails, the ALB reroutes traffic to instances in other AZs.&lt;/li&gt;
&lt;li&gt;Auto Scaling Group: Ensures that the application maintains the required number of instances, launching new ones in different AZs if some instances fail.&lt;/li&gt;
&lt;li&gt;NAT Gateway: Using multiple NAT Gateways in different AZs can enhance fault tolerance, ensuring continuous internet access for instances in private subnets even if one NAT Gateway fails.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Benefits of the Architecture&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Scalability: The Auto Scaling group adjusts the number of running instances based on demand, ensuring optimal resource usage.&lt;/li&gt;
&lt;li&gt;High Availability: The multi-AZ deployment ensures that the application remains available even in the event of an AZ failure.&lt;/li&gt;
&lt;li&gt;Security: The separation of public and private subnets enhances security by restricting internet access to internal resources.&lt;/li&gt;
&lt;li&gt;Cost Efficiency: By automatically adjusting resources based on demand, the architecture helps minimize costs while maintaining performance.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Get the complete code &lt;a href="https://github.com/JESHLIN/cloud-infrastructure" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Procedures for Using Terraform Commands&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Terraform is a powerful tool for managing infrastructure as code. By following these steps, you can efficiently create, manage, and destroy AWS infrastructure using Terraform. Below are the key procedures and commands you'll use:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Initialize the Project&lt;/strong&gt;&lt;br&gt;
Initialize your Terraform project to download the necessary provider plugins.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform init

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step 2: Validate the Configuration&lt;/strong&gt;&lt;br&gt;
Validate your Terraform files to ensure there are no syntax errors.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform validate
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step 3: Plan the Infrastructure Changes&lt;/strong&gt;&lt;br&gt;
Generate an execution plan to see what changes Terraform will make to your infrastructure.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform plan
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkyer4carurv6o4vhu9xo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkyer4carurv6o4vhu9xo.png" alt="Image description" width="800" height="466"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Step 4: Apply the Infrastructure Changes&lt;/strong&gt;&lt;br&gt;
Apply the changes specified in the execution plan to create or update your infrastructure.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform apply
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step 5: Destroy the Infrastructure&lt;/strong&gt;&lt;br&gt;
When you no longer need the infrastructure, you can destroy it using:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform destroy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
Building a resilient and scalable VPC infrastructure on AWS involves leveraging various AWS services and best practices to ensure high availability and fault tolerance. By distributing resources across multiple AZs, utilizing load balancing, and implementing auto-scaling, you can create a robust environment capable of handling failures gracefully and maintaining continuous operation.&lt;/p&gt;

&lt;p&gt;This architecture is an excellent foundation for deploying secure, scalable, and highly available applications on AWS, ensuring that your infrastructure can adapt to changing demands and withstand unexpected failures.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Blog: Creating, Modifying, and Destroying an EC2 Instance in AWS with Terraform</title>
      <dc:creator>Jeshlin P V</dc:creator>
      <pubDate>Wed, 26 Jun 2024 15:42:21 +0000</pubDate>
      <link>https://dev.to/jeshlin_pv_1628a63168e90/blog-creating-modifying-and-destroying-an-ec2-instance-and-hosting-a-static-website-using-s3-in-aws-with-terraform-11md</link>
      <guid>https://dev.to/jeshlin_pv_1628a63168e90/blog-creating-modifying-and-destroying-an-ec2-instance-and-hosting-a-static-website-using-s3-in-aws-with-terraform-11md</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Hello Friends! I’ve been exploring the capabilities of Terraform and AWS. Recently, I worked on creating, modifying, and destroying an EC2 instance with Terraform. This blog will detail the steps I followed.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Understanding the tools&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Terraform&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Terraform is an open-source tool that allows you to define and provision infrastructure as code. Today, I learned some of the basics of Terraform and used it to create an EC2 instance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon EC2 (Elastic Compute Cloud)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Amazon EC2 provides scalable virtual servers in the cloud, allowing you to run applications and services with flexible computing power.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Here’s a brief overview of my Terraform journey:&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;● Installation: I started by installing Terraform on my local machine. This was straightforward, requiring just a few commands to download and install the binary.&lt;/p&gt;

&lt;p&gt;● Writing Configuration Files: Terraform uses configuration files written in HashiCorp Configuration Language (HCL). I created a simple configuration file to define an EC2 instance.&lt;/p&gt;

&lt;p&gt;● Applying the Configuration: Using the “terraform apply” command, I provisioned the resources defined in my configuration file. Terraform handled the creation of the EC2 instance, showing me the power of infrastructure as code.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Here's how I did it:&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Install Terraform&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Ensure Terraform is installed on your local machine. If not, download and install it from the Terraform website. I used the Chocolatey package manager to install Terraform.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Set Up Your Working Directory&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To configure AWS, open your terminal and run the command "aws configure".&lt;br&gt;
Create a new directory for your Terraform configuration files.&lt;br&gt;
Navigate to this directory in your terminal.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmaukkc8wgnchgnfu181p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmaukkc8wgnchgnfu181p.png" alt="Image description" width="614" height="282"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Create a Terraform Configuration File&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In your working directory, create a new file named 'main.tf'.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsawlo6wiejvhk29lqnq2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsawlo6wiejvhk29lqnq2.png" alt="Image description" width="427" height="108"&gt;&lt;/a&gt;&lt;br&gt;
Define the AWS provider and the region where the EC2 instance will be created:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;provider "aws" {
  region = "us-west-2"
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Define the EC2 instance resource with the desired AMI ID and instance type:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_instance" "ec2-instance" {
  ami           = "ami-0c55b159cbfafe1f0"  # Replace with your desired AMI ID
  instance_type = "t2.micro"
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;4. Initialize Terraform&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the terminal, run the following command to initialize Terraform and download the necessary plugins for the AWS provider:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform init

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;5. Apply the Configuration to Create the EC2 Instance&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Apply the configuration to create the EC2 instance by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform apply

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Terraform will display a plan of what will be created. Type yes to confirm and create the instance.&lt;/p&gt;
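&lt;p&gt;As an optional addition (not part of the original main.tf), an output block can surface the new instance's public IP once the apply completes:&lt;/p&gt;

```hcl
# Hypothetical addition to main.tf: printed after terraform apply
output "instance_public_ip" {
  value = aws_instance.ec2-instance.public_ip
}
```

&lt;p&gt;You can read it again later with terraform output instance_public_ip.&lt;/p&gt;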

&lt;p&gt;&lt;strong&gt;6. Modify the EC2 Instance&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To modify the EC2 instance, update the instance_type in the main.tf file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_instance" "ec2-instance" {
  ami           = "ami-0c55b159cbfafe1f0"
  instance_type = "t2.medium"  # Updated instance type
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Apply the changes by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform apply
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Terraform will detect the change and prompt you to confirm the update. Type yes to apply the modification.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;7. Destroy the EC2 Instance&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;When you no longer need the EC2 instance, destroy it by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform destroy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Terraform will ask for confirmation. Type yes to confirm and destroy the instance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Working with Terraform to create, modify, and destroy an EC2 instance was an enlightening experience. This hands-on approach demonstrated the power of infrastructure as code (IaC) in managing cloud resources efficiently and consistently. I'm excited to continue exploring Terraform and AWS and learning more about automating and managing infrastructure.&lt;/p&gt;

</description>
      <category>cloudcomputing</category>
      <category>aws</category>
      <category>cloudskills</category>
      <category>blog</category>
    </item>
  </channel>
</rss>
