<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Mohana Priya </title>
    <description>The latest articles on DEV Community by Mohana Priya  (@mohanapriya_s_1808).</description>
    <link>https://dev.to/mohanapriya_s_1808</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1689120%2F8a5c294d-ebac-4d6e-9ffc-0bdf6aaa03f1.jpg</url>
      <title>DEV Community: Mohana Priya </title>
      <link>https://dev.to/mohanapriya_s_1808</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/mohanapriya_s_1808"/>
    <language>en</language>
    <item>
      <title>From Learning in Silence to Leading a Community</title>
      <dc:creator>Mohana Priya </dc:creator>
      <pubDate>Tue, 20 Jan 2026 17:03:34 +0000</pubDate>
      <link>https://dev.to/mohanapriya_s_1808/from-learning-in-silence-to-leading-a-community-4cc1</link>
      <guid>https://dev.to/mohanapriya_s_1808/from-learning-in-silence-to-leading-a-community-4cc1</guid>
      <description>&lt;p&gt;&lt;strong&gt;Hi,&lt;/strong&gt; I’m Mohana Priya 👋&lt;/p&gt;

&lt;p&gt;A Cloud Captain, DevOps Engineer… and once a student who didn’t know where she belonged.&lt;br&gt;
There was a time when I sat quietly on the last bench, watching others speak with confidence, wondering if I would ever find my place. I didn’t have a clear dream, a strong voice, or a perfect plan for my future.&lt;/p&gt;

&lt;p&gt;What I had was curiosity. And honestly, a lot of self-doubt.&lt;/p&gt;

&lt;p&gt;I never imagined that one decision, joining Cloud Clubs, would slowly change the way I saw myself.&lt;/p&gt;

&lt;h2&gt;
  Before vs After Cloud Clubs: A Journey I’ll Always Cherish and Carry
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Before: Learning in Silence&lt;/strong&gt;&lt;br&gt;
Before Cloud Clubs, I was not confident. I rarely spoke, never asked questions, and mastered the art of sitting on the last bench without being noticed.&lt;/p&gt;

&lt;p&gt;I didn’t know what cloud computing was. All I knew was that I wanted to learn—but I was waiting for the “start here” sign to appear.&lt;/p&gt;

&lt;p&gt;Cloud Clubs was where I heard about AWS Cloud Computing for the first time. I remember opening the AWS Console and feeling a strange mix of excitement and fear. Everything looked powerful… and expensive. I was scared to click anything, worried I’d make a mistake that couldn’t be undone. &lt;/p&gt;

&lt;p&gt;I made plenty of beginner mistakes, some I laugh about now:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Creating resources and forgetting to delete them&lt;/li&gt;
&lt;li&gt;Mixing up regions and availability zones&lt;/li&gt;
&lt;li&gt;Attending sessions and still going home to Google the basics&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;But behind those funny moments was something deeper—I was trying. Even when I felt lost, I kept showing up.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Change I Felt:&lt;/strong&gt;&lt;br&gt;
What changed everything wasn’t just learning cloud services. It was the people.&lt;/p&gt;

&lt;p&gt;I joined the AWS Cloud Club at my university simply as a member—quiet, observant, unsure of where I fit. I attended sessions, listened closely, and learned at my own pace. But as I kept showing up, something began to shift.&lt;/p&gt;

&lt;p&gt;Watching my former Cloud Captains lead events, guide students, and create opportunities made me feel something I hadn’t felt before—belonging. For the first time, I didn’t feel invisible.&lt;/p&gt;

&lt;p&gt;Gradually, I became part of the core team. I started supporting my former Cloud Captain, helping with event coordination, assisting peers with certifications and projects, and being there wherever help was needed. Not because I knew everything, but because I wanted to grow alongside the community.&lt;/p&gt;

&lt;p&gt;With every small responsibility I was trusted with, my confidence grew.&lt;br&gt;
Helping others helped me believe in myself.&lt;/p&gt;

&lt;p&gt;Slowly, without even realizing it,&lt;br&gt;
I found my voice.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;After: Standing Where I Once Watched&lt;/strong&gt;&lt;br&gt;
When I became a Cloud Captain, it felt surreal. I wasn’t just learning anymore, I was leading.&lt;/p&gt;

&lt;p&gt;I still remember my first event as a Cloud Captain, volunteering with the UG WomenInTech team at AWSheTech Day. Standing on stage, delivering the welcome note, my heart was pounding. But in that moment, I realized something powerful—I had become the person I once looked up to.&lt;/p&gt;

&lt;p&gt;From that first step to organizing Student Community Day with more than 500 participants and 28 speakers across India, Cloud Clubs transformed me, not overnight, but step by step.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What This Journey Gave Me:&lt;/strong&gt;&lt;br&gt;
Cloud Clubs gave me more than technical skills:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Courage to speak even when my voice shook&lt;/li&gt;
&lt;li&gt;Confidence to take responsibility&lt;/li&gt;
&lt;li&gt;Leadership rooted in empathy&lt;/li&gt;
&lt;li&gt;The belief that I belonged here&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The certification support helped me prove my knowledge, secure two internships, and today, I work as a DevOps Engineer, a role I once thought was far beyond my reach.&lt;/p&gt;

&lt;p&gt;To the student reading this—&lt;/p&gt;

&lt;p&gt;If you often listen more than you speak,&lt;br&gt;
If curiosity lives inside you but fear holds you back,&lt;br&gt;
If you sometimes feel like everyone else is moving ahead while you’re still figuring things out…&lt;/p&gt;

&lt;p&gt;You’re not alone.&lt;br&gt;
I’ve stood exactly where you are. &lt;/p&gt;

&lt;p&gt;⏳ The Cloud Clubs application deadline is Jan 20, 2026. Just a few hours left.&lt;/p&gt;

&lt;p&gt;Don’t wait for confidence to come first.&lt;br&gt;
Sometimes, growth begins with just showing up.&lt;/p&gt;

&lt;p&gt;Apply for the Cloud Clubs program now: &lt;a href="https://pulse.aws/application/TAWKU3CJv" rel="noopener noreferrer"&gt;pulse.aws&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For more details, visit &lt;a href="https://builder.aws.com/cloud-clubs" rel="noopener noreferrer"&gt;builder.aws.com/cloud-clubs&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Take the step. Trust the journey. Your “after” is waiting. &lt;/p&gt;

</description>
      <category>awscloudclubs</category>
      <category>cloudcomputing</category>
      <category>devops</category>
      <category>aws</category>
    </item>
    <item>
      <title>Building a Continuous Delivery Pipeline for Web Applications on AWS</title>
      <dc:creator>Mohana Priya </dc:creator>
      <pubDate>Sat, 07 Dec 2024 11:35:21 +0000</pubDate>
      <link>https://dev.to/mohanapriya_s_1808/building-a-continuous-delivery-pipeline-for-web-applications-on-aws-chl</link>
      <guid>https://dev.to/mohanapriya_s_1808/building-a-continuous-delivery-pipeline-for-web-applications-on-aws-chl</guid>
      <description>&lt;p&gt;Imagine making changes to your application and deploying them live with just a few steps. Continuous Delivery Pipeline makes this possible! In this blog, we’ll walk through setting up an automated pipeline using AWS services for a simple memory-matching game built with HTML, CSS, and JavaScript.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/mohanapriyas1808/codepipeline-s3-game" rel="noopener noreferrer"&gt;https://github.com/mohanapriyas1808/codepipeline-s3-game&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is a Continuous Delivery Pipeline?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before we jump into the exciting process of building one, let’s first understand what a Continuous Delivery Pipeline is.&lt;/p&gt;

&lt;p&gt;A &lt;strong&gt;Continuous Delivery (CD) Pipeline&lt;/strong&gt; is a series of automated steps that streamline the journey of your application code from a developer's workstation to a live, production-ready environment. It ensures that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Code Changes are Tested:&lt;/strong&gt; Each update is automatically verified for functionality.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Changes are Reviewed:&lt;/strong&gt; A manual or automated review step ensures quality control.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Deployments are Seamless:&lt;/strong&gt; Updates go live without interrupting the user experience.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Think of it as an assembly line for your software. Just like cars in a factory pass through various stages (assembly, painting, quality checks, and packaging), your code moves through a pipeline: source control → build → test → review → deploy.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why Use a Continuous Delivery Pipeline?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Automation:&lt;/strong&gt; Eliminates manual deployment tasks, saving time and effort.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Faster Deployment:&lt;/strong&gt; Speeds up the release of new features and fixes.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Consistency:&lt;/strong&gt; Ensures every deployment follows a reliable, tested process.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;An AWS account&lt;/li&gt;
&lt;li&gt;A GitHub account&lt;/li&gt;
&lt;li&gt;Git installed on your computer&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Application Architecture&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fseghskgp5an01cycdlpc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fseghskgp5an01cycdlpc.png" alt=" " width="800" height="368"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How the Pipeline Works&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Here’s a quick overview of what we’re building:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Source Stage:&lt;/strong&gt; Fetches your code from GitHub.&lt;br&gt;
&lt;strong&gt;Build Stage:&lt;/strong&gt; Uses CodeBuild to test and package your game.&lt;br&gt;
&lt;strong&gt;Approval Stage:&lt;/strong&gt; Adds a manual review step to check your changes.&lt;br&gt;
&lt;strong&gt;Deploy Stage:&lt;/strong&gt; Automatically uploads your game to an S3 bucket with static web hosting.&lt;/p&gt;

&lt;p&gt;Simple, right? Let’s break it down step by step.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Set Up Your GitHub Repository&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create a repository on &lt;strong&gt;GitHub&lt;/strong&gt; and push your game application source code.&lt;/li&gt;
&lt;li&gt;Make sure your code is clean, well-structured, and ready to be deployed.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftctlfyd047lf92ih90du.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftctlfyd047lf92ih90du.png" alt=" " width="800" height="411"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Configure S3 for Hosting&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to the &lt;strong&gt;S3&lt;/strong&gt; dashboard in AWS and create a new bucket with &lt;strong&gt;Block all public access&lt;/strong&gt; disabled.&lt;/li&gt;
&lt;li&gt;Enable &lt;strong&gt;Static Web Hosting&lt;/strong&gt; in the bucket properties.&lt;/li&gt;
&lt;li&gt;Update the &lt;strong&gt;bucket policy&lt;/strong&gt; to make your application publicly accessible.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Bucket Policy
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::Bucket-Name/*"
            ]
        }
    ]
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
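&lt;p&gt;If you prefer to script this step instead of clicking through the console, the same policy can be built and attached with Python and boto3. This is a minimal sketch, not part of the original walkthrough: the bucket name is a placeholder, and the &lt;code&gt;apply_policy&lt;/code&gt; call needs valid AWS credentials unless you inject a client:&lt;/p&gt;

```python
import json

def public_read_policy(bucket_name):
    """Build the public-read bucket policy from the step above for a given bucket."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PublicReadGetObject",
                "Effect": "Allow",
                "Principal": "*",
                "Action": ["s3:GetObject"],
                "Resource": ["arn:aws:s3:::{}/*".format(bucket_name)],
            }
        ],
    })

def apply_policy(bucket_name, s3_client=None):
    """Attach the policy to the bucket (needs AWS credentials if no client is given)."""
    if s3_client is None:
        import boto3  # imported lazily so the policy builder has no AWS dependency
        s3_client = boto3.client("s3")
    s3_client.put_bucket_policy(Bucket=bucket_name,
                                Policy=public_read_policy(bucket_name))
```

&lt;p&gt;Generating the policy in code avoids the classic mistake of forgetting to replace &lt;code&gt;Bucket-Name&lt;/code&gt; when pasting the JSON by hand.&lt;/p&gt;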



&lt;p&gt;&lt;strong&gt;Step 3: Create a Build Project in CodeBuild&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open CodeBuild and create a new project.&lt;/li&gt;
&lt;li&gt;Connect to &lt;strong&gt;GitHub&lt;/strong&gt; using OAuth and select your source code repository.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Amazon Linux 2&lt;/strong&gt; from the Operating system dropdown menu.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Standard&lt;/strong&gt; from the Runtime(s) dropdown menu.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;aws/codebuild/amazonlinux2-x86_64-standard:5.0&lt;/strong&gt; from the Image dropdown menu.&lt;/li&gt;
&lt;li&gt;Confirm that &lt;strong&gt;Linux&lt;/strong&gt; is selected for Environment type.&lt;/li&gt;
&lt;li&gt;Confirm that &lt;strong&gt;New service role&lt;/strong&gt; is selected.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Insert build commands&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Choose &lt;strong&gt;Switch to editor&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Replace the Buildspec in the editor with the code below:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: 0.2
phases:
  install:
    commands:
      - echo "No dependencies to install for static HTML application"
  build:
    commands:
      - echo "No build step needed for static HTML, CSS, and JavaScript files"
artifacts:
  files:
    - '**/*.html'  
    - '**/*.css'  
    - '**/*.js'    
    - 'images/**'  
  discard-paths: no  

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;11. Click the &lt;strong&gt;Create project&lt;/strong&gt; button and hit &lt;strong&gt;Start build&lt;/strong&gt; to test it!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff9jssgf3z9qou8zp9jzu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff9jssgf3z9qou8zp9jzu.png" alt=" " width="800" height="436"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4: Set Up the Continuous Delivery Pipeline in AWS CodePipeline&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open &lt;strong&gt;CodePipeline&lt;/strong&gt; and &lt;strong&gt;create a new pipeline&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Confirm that &lt;strong&gt;New service role&lt;/strong&gt; is selected.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;GitHub version 1&lt;/strong&gt; from the Source provider dropdown menu.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Connect to GitHub&lt;/strong&gt; and select your source code repository.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;main&lt;/strong&gt; from the &lt;strong&gt;branch&lt;/strong&gt; dropdown menu&lt;/li&gt;
&lt;li&gt;Confirm that &lt;strong&gt;GitHub webhooks&lt;/strong&gt; is selected.&lt;/li&gt;
&lt;li&gt;Choose &lt;strong&gt;Next&lt;/strong&gt; and select the &lt;strong&gt;CodeBuild project&lt;/strong&gt; you just created.&lt;/li&gt;
&lt;li&gt;Choose &lt;strong&gt;Next&lt;/strong&gt; and select the &lt;strong&gt;S3 bucket&lt;/strong&gt; you created earlier with static web hosting enabled.&lt;/li&gt;
&lt;li&gt;Review the pipeline and click the &lt;strong&gt;create pipeline&lt;/strong&gt; button.&lt;/li&gt;
&lt;/ol&gt;
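&lt;p&gt;Once the pipeline exists, you can also inspect its stages from code rather than the console. The sketch below is an illustration, not part of the original setup; it uses boto3's CodePipeline &lt;code&gt;get_pipeline_state&lt;/code&gt; call, and the pipeline name is a placeholder:&lt;/p&gt;

```python
def pipeline_stage_summary(pipeline_name, cp_client=None):
    """Return a {stage_name: latest_status} map for the given pipeline."""
    if cp_client is None:
        import boto3  # requires AWS credentials when no client is injected
        cp_client = boto3.client("codepipeline")
    state = cp_client.get_pipeline_state(name=pipeline_name)
    summary = {}
    for stage in state["stageStates"]:
        # Stages that have never run carry no latestExecution entry.
        latest = stage.get("latestExecution", {})
        summary[stage["stageName"]] = latest.get("status", "NotStarted")
    return summary
```

&lt;p&gt;For the pipeline built above, a successful run would report something like &lt;code&gt;{"Source": "Succeeded", "Build": "Succeeded", "Deploy": "Succeeded"}&lt;/code&gt;.&lt;/p&gt;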

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb1tolarakpr9zgj9jpht.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb1tolarakpr9zgj9jpht.png" alt=" " width="800" height="438"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4mwyv2ai17pit9gd19yh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4mwyv2ai17pit9gd19yh.png" alt=" " width="800" height="431"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Access your application using the object URL in the S3 bucket.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4fmg1zrx41oimgu404dd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4fmg1zrx41oimgu404dd.png" alt=" " width="800" height="432"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhk3ju4f81atnaq62l2ec.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhk3ju4f81atnaq62l2ec.png" alt=" " width="800" height="434"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 5: Add a Review Stage&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Choose the &lt;strong&gt;Edit&lt;/strong&gt; button near the top of the page.&lt;/li&gt;
&lt;li&gt;Choose the &lt;strong&gt;Add stage&lt;/strong&gt; button between the Build and Deploy stages.&lt;/li&gt;
&lt;li&gt;In the created stage, choose the &lt;strong&gt;Add action group&lt;/strong&gt; button.&lt;/li&gt;
&lt;li&gt;From the &lt;strong&gt;Action provider&lt;/strong&gt; dropdown, select &lt;strong&gt;Manual approval&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Hit the &lt;strong&gt;Done&lt;/strong&gt; button, then the &lt;strong&gt;Save&lt;/strong&gt; button at the top of the page to save the changes.&lt;/li&gt;
&lt;li&gt;You will now see your pipeline with four stages: &lt;strong&gt;Source, Build, Review, and Deploy&lt;/strong&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc9j8e73quupepuh1rk27.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc9j8e73quupepuh1rk27.png" alt=" " width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 6: Test the Pipeline&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Commit&lt;/strong&gt; a change to your &lt;strong&gt;source code&lt;/strong&gt; in the GitHub repository (in this case, I changed the name MEME Matching Game to Image Matching Game).&lt;/li&gt;
&lt;li&gt;Watch the pipeline automatically start.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Approve&lt;/strong&gt; the changes in the &lt;strong&gt;review stage&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Your updated game will now be live on the &lt;strong&gt;S3 bucket&lt;/strong&gt;!&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb61qao4lz4thehz2wifi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb61qao4lz4thehz2wifi.png" alt=" " width="800" height="432"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4mn9b9ol5d7e9fr2c4rl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4mn9b9ol5d7e9fr2c4rl.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fye9az9hsdqny07e69bvt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fye9az9hsdqny07e69bvt.png" alt=" " width="750" height="772"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the Deploy stage is complete, refresh your application page to see the changes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpygftwnicipcb7nsp1lh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpygftwnicipcb7nsp1lh.png" alt=" " width="800" height="436"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flqbnxa1ccv10afw7qz0i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flqbnxa1ccv10afw7qz0i.png" alt=" " width="800" height="434"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;By combining AWS services like CodePipeline, CodeBuild, and S3, you can create a robust continuous delivery pipeline for your applications. This setup ensures a seamless deployment process, allowing you to focus on improving your application rather than worrying about deployments.&lt;/p&gt;

&lt;p&gt;Ready to build your own pipeline? Start experimenting with AWS today!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>🚀 Automate Your Resume Hosting with CI/CD and AWS EC2 🌐</title>
      <dc:creator>Mohana Priya </dc:creator>
      <pubDate>Sun, 14 Jul 2024 14:25:17 +0000</pubDate>
      <link>https://dev.to/mohanapriya_s_1808/automate-your-resume-hosting-with-cicd-and-aws-ec2-1jdh</link>
      <guid>https://dev.to/mohanapriya_s_1808/automate-your-resume-hosting-with-cicd-and-aws-ec2-1jdh</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;CI/CD is a method used in modern software development to automate the process of integrating code changes, testing them, and deploying applications. This approach ensures that code changes are continuously tested and deployed, leading to faster development cycles and more reliable applications.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;In this project, we use CI/CD to host a personal resume website on an EC2 instance using GitHub Actions. By integrating GitHub Actions with AWS, any update pushed to the resume repository is automatically reflected on the live website. This not only reduces manual effort but also ensures that the resume is always current and available to potential employers or collaborators.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;This article helps you understand how you can automatically deploy your code to AWS EC2 from GitHub&lt;/strong&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-1:&lt;/strong&gt; &lt;em&gt;Launch an EC2 instance with the Ubuntu server. Modify the security group inbound rules to allow SSH (port 22), HTTP (port 80), and HTTPS (port 443).&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyu81w8nszsovtmm6g6s2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyu81w8nszsovtmm6g6s2.png" alt=" " width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-2:&lt;/strong&gt; &lt;em&gt;Create a public repository on GitHub, for example Host-Resume-on-ec2-githubactions.&lt;/em&gt;&lt;br&gt;
&lt;em&gt;Add an index.html file containing the code for your resume website&lt;/em&gt;.&lt;br&gt;
&lt;em&gt;Add a CSS file for styling your resume&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmg2ai9vxr2icvy19vr3n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmg2ai9vxr2icvy19vr3n.png" alt=" " width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-3:&lt;/strong&gt; &lt;em&gt;Add secrets to the repository for secure information storage.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;EC2_SSH_KEY&lt;/strong&gt;: &lt;em&gt;The private key of your EC2 key pair, used for SSH access.&lt;/em&gt;&lt;br&gt;
&lt;strong&gt;HOST_DNS&lt;/strong&gt;: &lt;em&gt;The public DNS of your EC2 instance.&lt;/em&gt;&lt;br&gt;
&lt;strong&gt;USERNAME&lt;/strong&gt;: &lt;em&gt;The username for SSH (e.g., ubuntu).&lt;/em&gt;&lt;br&gt;
&lt;strong&gt;TARGET_DIR&lt;/strong&gt;: &lt;em&gt;The target directory on the EC2 instance (e.g., home).&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3xwecblcdmx70g57xe2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3xwecblcdmx70g57xe2.png" alt=" " width="800" height="451"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-4:&lt;/strong&gt; &lt;em&gt;Create a .github/workflows directory in your repository.&lt;/em&gt;&lt;br&gt;
&lt;em&gt;Add a YAML file (e.g., github-actions-ec2.yml) with the following code to automate deployment&lt;/em&gt;:&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Add the following code so that your actions only run when you push to the main branch.&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Trigger deployment only on push to main branch
on:
  push:
    branches:
      - main

jobs:
  deploy:
    name: Deploy to EC2 on main branch push
    runs-on: ubuntu-latest

    steps:
      - name: Checkout the files
        uses: actions/checkout@v2

      - name: Deploy to Server 1
        uses: easingthemes/ssh-deploy@main
        env:
          SSH_PRIVATE_KEY: ${{ secrets.EC2_SSH_KEY }}
          REMOTE_HOST: ${{ secrets.HOST_DNS }}
          REMOTE_USER: ${{ secrets.USERNAME }}
          TARGET: ${{ secrets.TARGET_DIR }}

      - name: Executing remote ssh commands using ssh key
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.HOST_DNS }}
          username: ${{ secrets.USERNAME }}
          key: ${{ secrets.EC2_SSH_KEY }}
          script: |
            sudo apt-get -y update
            sudo apt-get install -y apache2
            sudo systemctl start apache2
            sudo systemctl enable apache2
            cd home
            sudo mv * /var/www/html
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Once this is done, your workflow will be created, and using the public IP address of the EC2 instance you can view the resume.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Whenever you push an update to your main branch, you can see it live by refreshing the page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzos5rbymecmg7qxsdm12.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzos5rbymecmg7qxsdm12.png" alt=" " width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Hosting a resume on an EC2 instance using GitHub Actions combines the power of cloud infrastructure with the automation of CI/CD workflows. This method ensures that your resume is always up-to-date and accessible, demonstrating technical skills and modern development practices. This approach can be extended to various applications, making it a versatile solution for web hosting needs.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>github</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Automating EC2 Instance Management with AWS Lambda and EventBridge Using Terraform</title>
      <dc:creator>Mohana Priya </dc:creator>
      <pubDate>Mon, 08 Jul 2024 15:47:16 +0000</pubDate>
      <link>https://dev.to/mohanapriya_s_1808/automating-ec2-instance-management-with-aws-lambda-and-eventbridge-using-terraform-38jm</link>
      <guid>https://dev.to/mohanapriya_s_1808/automating-ec2-instance-management-with-aws-lambda-and-eventbridge-using-terraform-38jm</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;EC2 instances are virtual servers for running applications on the AWS infrastructure. They are crucial for providing scalable computing capacity, allowing users to deploy and manage applications efficiently in the cloud. EC2 instances are widely used for hosting websites, running databases, and handling various computing workloads.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Managing EC2 instances manually can be a daunting task, especially when dealing with multiple instances and varying usage patterns. Automating this process not only saves time but also ensures that your resources are used efficiently, leading to significant cost savings. By leveraging AWS Lambda, EventBridge, and Terraform, you can create an automated solution that starts and stops your EC2 instances based on a schedule, ensuring optimal resource utilization and cost efficiency.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;In this guide, we'll take you through the entire process of setting up this automation, from creating the EC2 instances to configuring the Lambda functions and EventBridge rules using Terraform. Let's dive in and unlock the potential of automated cloud resource management!&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Architecture:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2mb8ebt4kki6sbc0bdrl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2mb8ebt4kki6sbc0bdrl.png" alt=" " width="800" height="551"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;EC2:&lt;/strong&gt; &lt;em&gt;An EC2 instance is a virtual server used for running applications on the AWS infrastructure.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lambda:&lt;/strong&gt; &lt;em&gt;AWS Lambda is a serverless compute service that lets you run code without provisioning or managing servers. It automatically scales applications by running code in response to events. Lambda is widely used for event-driven applications, real-time file processing, and backend services.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;EventBridge:&lt;/strong&gt; &lt;em&gt;Amazon EventBridge is a serverless event bus service that makes it easy to connect applications using data from your own apps, SaaS apps, and AWS services. It simplifies event-driven architecture by routing events between services and allowing you to build scalable, event-driven workflows for various use cases such as application integration, automation, and observability.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;IAM Role:&lt;/strong&gt; &lt;em&gt;An IAM (Identity and Access Management) role in AWS defines permissions for entities like AWS services or users, ensuring secure access to AWS resources without needing long-term credentials. Roles are used to delegate permissions across AWS services and are integral for managing security and access control within cloud environments.&lt;/em&gt; &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pre-requisites:&lt;/strong&gt;&lt;br&gt;
Before we dive into the steps, let's ensure you have the following prerequisites in place:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;AWS Account:&lt;/strong&gt; &lt;em&gt;If you don't have one, sign up for an AWS account.&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Terraform Installed:&lt;/strong&gt; &lt;em&gt;Download and install Terraform from the official website.&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AWS CLI Installed:&lt;/strong&gt; &lt;em&gt;Install the AWS CLI by following the instructions here.&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AWS Credentials Configured:&lt;/strong&gt; &lt;em&gt;Configure your AWS CLI with your credentials by running aws configure.&lt;/em&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Step-By-Step Procedure:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;We'll walk you through the entire process of setting up this automation using Terraform. The steps include configuring the AWS provider, creating the EC2 instances, setting up IAM roles and policies, defining the Lambda functions, and creating the EventBridge rules.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-1:&lt;/strong&gt; &lt;em&gt;Create a main.tf file. This file contains the configuration for creating three EC2 instances, an IAM role that allows the Lambda functions to access the EC2 instances, the two Lambda functions for starting and stopping the instances, and the EventBridge rules that trigger the startec2instance and stopec2instance Lambda functions.&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

provider "aws" {
  region = "ap-south-1"
}

resource "aws_instance" "ec2" {
  count                  = var.instance_count
  ami                    = "ami-02a2af70a66af6dfb"  # Update with an AMI ID valid in your region
  instance_type          = "t2.micro"  # Update with your desired instance type
  vpc_security_group_ids = [var.security_group_id]
  subnet_id              = var.subnet_id
  key_name               = var.key
  tags = merge(var.default_ec2_tags,
    {
      Name = "${var.name}-${count.index + 1}"
    }
  )
}

resource "aws_iam_role" "lambda_role" {
  name = "lambda_role"

  # Terraform's "jsonencode" function converts a
  # Terraform expression result to valid JSON syntax.
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Action = "sts:AssumeRole"
        Effect = "Allow"
        Sid    = ""
        Principal = {
          Service = "lambda.amazonaws.com"
        }
      },
    ]
  })

  tags = {
    tag-key = "tag-value"
  }
}

resource "aws_iam_policy" "lambda_policy_start_stop_instance" {
  name        = "lambda_policy_start_stop_instance"
  path        = "/"
  description = "Allows Lambda to start/stop EC2 instances and write CloudWatch logs"


  # Terraform expression result to valid JSON syntax.
  policy = jsonencode({
    Version = "2012-10-17"
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ],
            "Resource": "arn:aws:logs:*:*:*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "ec2:Start*",
                "ec2:Stop*",
                "ec2:Describe*"
            ],
            "Resource": "*"
        }
    ]
  })
}

resource "aws_iam_role_policy_attachment" "test-attach" {
  role       = aws_iam_role.lambda_role.name
  policy_arn = aws_iam_policy.lambda_policy_start_stop_instance.arn
}


resource "aws_lambda_function" "stop_ec2_instance" {
  # If the file is not in the current working directory you will need to include a
  # path.module in the filename.
  filename      = "stopec2instance.zip"
  function_name = "stop_ec2_instance"
  role          =  aws_iam_role.lambda_role.arn
  handler       = "stopec2instance.lambda_handler"
  source_code_hash = filebase64sha256("stopec2instance.zip")

  runtime = "python3.11"
}

resource "aws_lambda_function" "start_ec2_instance" {
  # If the file is not in the current working directory you will need to include a
  # path.module in the filename.
  filename      = "startec2instance.zip"
  function_name = "startec2instance"
  role          =  aws_iam_role.lambda_role.arn
  handler       = "startec2instance.lambda_handler"
  source_code_hash = filebase64sha256("startec2instance.zip")

  runtime = "python3.11"
}


resource "aws_cloudwatch_event_rule" "stop_ec2_schedule" {
    name                = "stop_ec2_schedule"
    description         = "Schedule to trigger Lambda to stop EC2 instances every 2 minutes"
    schedule_expression = "rate(2 minutes)"
  }


resource "aws_cloudwatch_event_target" "stop_ec2_target" {
  rule      = aws_cloudwatch_event_rule.stop_ec2_schedule.name
  target_id = "lambda"
  arn       = aws_lambda_function.stop_ec2_instance.arn
}

resource "aws_lambda_permission" "allow_cloudwatch_stop" {
  statement_id  = "AllowExecutionFromCloudWatch"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.stop_ec2_instance.function_name
  principal     = "events.amazonaws.com"
  source_arn    = aws_cloudwatch_event_rule.stop_ec2_schedule.arn
}

resource "aws_cloudwatch_event_rule" "start_ec2_schedule" {
    name                = "start_ec2_schedule"
    description         = "Schedule to trigger Lambda to start EC2 instances every 1 minute"
    schedule_expression = "rate(1 minute)"
  }

resource "aws_cloudwatch_event_target" "start_ec2_target" {
  rule      = aws_cloudwatch_event_rule.start_ec2_schedule.name
  target_id = "lambda"
  arn       = aws_lambda_function.start_ec2_instance.arn
}

resource "aws_lambda_permission" "allow_cloudwatch_start" {
  statement_id  = "AllowExecutionFromCloudWatch"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.start_ec2_instance.function_name
  principal     = "events.amazonaws.com"
  source_arn    = aws_cloudwatch_event_rule.start_ec2_schedule.arn
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
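
&lt;p&gt;&lt;em&gt;Note: the rate(1 minute) and rate(2 minutes) expressions above are deliberately short so the demo is easy to observe. For a realistic cost-saving schedule you would typically swap in cron() expressions instead; EventBridge evaluates cron in UTC, and the hours below are illustrative assumptions, not part of the original setup:&lt;/em&gt;&lt;/p&gt;

```hcl
# Start Dev instances at 09:00 IST (03:30 UTC) on weekdays...
schedule_expression = "cron(30 3 ? * MON-FRI *)"

# ...and stop them at 19:00 IST (13:30 UTC) on weekdays.
schedule_expression = "cron(30 13 ? * MON-FRI *)"
```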



&lt;p&gt;&lt;strong&gt;Step-2:&lt;/strong&gt; &lt;em&gt;Create variables.tf file&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;variable "instance_count" {
  description = "Number of EC2 instances to create"
  default     = 3
}

variable "security_group_id" {
  description = "ID of the security group for EC2 instances"

}

variable "subnet_id" {
  description = "ID of the subnet for EC2 instances"

}

variable "key" {
  description = "Name of the SSH key pair for EC2 instances"

}

variable "name" {
  description = "Name prefix for EC2 instances"

}

variable "default_ec2_tags" {
  type        = map(string)
  description = "(optional) default tags for EC2 instances"
  default = {
    managed_by   = "terraform"
    Environment  = "Dev"
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-3:&lt;/strong&gt; &lt;em&gt;Create a terraform.tfvars file, which contains configuration values such as the number of instances, the security group ID, the subnet ID, the key pair name, and the instance name prefix.&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;instance_count     = 3
security_group_id  = "sg-0944b5d5471b421fb"
subnet_id          = "subnet-0582feff6651618d4"
key                = "mynewkeypair"
name               = "EC2-Test-Instance"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-4:&lt;/strong&gt; &lt;em&gt;Create two Python files, stopec2instance.py and startec2instance.py; these files contain the code for the Lambda functions. Make sure each file is zipped (stopec2instance.zip and startec2instance.zip) and that the archives are in the same directory as your Terraform configuration.&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#stopec2instance
import boto3

def is_dev(instance):
    is_dev = False
    if 'Tags' in instance:
        for tag in instance['Tags']:
            if tag['Key'] == 'Environment' and tag['Value'] == 'Dev':
                is_dev = True
                break
    return is_dev

def is_running(instance):
    return instance['State']['Name'] == 'running'

def lambda_handler(event, context):
    ec2 = boto3.client('ec2', region_name='ap-south-1')

    try:
        response = ec2.describe_instances()
        reservations = response['Reservations']

        for reservation in reservations:
            for instance in reservation['Instances']:
                if is_dev(instance) and is_running(instance):
                    instance_id = instance['InstanceId']
                    ec2.stop_instances(InstanceIds=[instance_id])
                    print(f'Stopping instance: {instance_id}')

    except Exception as e:
        print(f'Error stopping instances: {str(e)}')

    return {
        'statusCode': 200,
        'body': 'Function executed successfully'
    }

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
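
&lt;p&gt;&lt;em&gt;The tag-filtering logic above is plain Python, so it can be sanity-checked locally before deploying, with no AWS calls at all. A minimal sketch, using sample dictionaries shaped like boto3's describe_instances() response:&lt;/em&gt;&lt;/p&gt;

```python
# Local sanity check for the tag filter used by both Lambda functions.
# The dictionaries below mimic the shape of boto3's describe_instances() output.

def is_dev(instance):
    if 'Tags' in instance:
        for tag in instance['Tags']:
            if tag['Key'] == 'Environment' and tag['Value'] == 'Dev':
                return True
    return False

def is_running(instance):
    return instance['State']['Name'] == 'running'

dev_running = {
    'InstanceId': 'i-0123456789abcdef0',
    'State': {'Name': 'running'},
    'Tags': [{'Key': 'Environment', 'Value': 'Dev'}],
}
prod_running = {
    'InstanceId': 'i-0fedcba9876543210',
    'State': {'Name': 'running'},
    'Tags': [{'Key': 'Environment', 'Value': 'Prod'}],
}

# Only the Dev instance should be selected for stopping.
to_stop = [i['InstanceId'] for i in (dev_running, prod_running)
           if is_dev(i) and is_running(i)]
print(to_stop)  # ['i-0123456789abcdef0']
```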





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#startec2instance
import boto3

def is_dev(instance):
    is_dev = False
    if 'Tags' in instance:
        for tag in instance['Tags']:
            if tag['Key'] == 'Environment' and tag['Value'] == 'Dev':
                is_dev = True
                break
    return is_dev

def is_stopped(instance):
    return instance['State']['Name'] == 'stopped'

def lambda_handler(event, context):
    ec2 = boto3.client('ec2', region_name='ap-south-1')

    try:
        response = ec2.describe_instances()
        reservations = response['Reservations']

        for reservation in reservations:
            for instance in reservation['Instances']:
                if is_dev(instance) and is_stopped(instance):
                    instance_id = instance['InstanceId']
                    ec2.start_instances(InstanceIds=[instance_id])
                    print(f'Starting instance: {instance_id}')

    except Exception as e:
        print(f'Error starting instances: {str(e)}')

    return {
        'statusCode': 200,
        'body': 'Function executed successfully'
    }

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
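
&lt;p&gt;&lt;em&gt;Terraform's filebase64sha256 expects both zip archives to exist before terraform apply runs. If you prefer scripting the packaging step instead of zipping by hand, a standard-library-only Python helper does the job (the file names below are the ones assumed in main.tf). Note that the .py file must sit at the root of the archive so a handler path like stopec2instance.lambda_handler resolves:&lt;/em&gt;&lt;/p&gt;

```python
import zipfile

def package_lambda(module_name):
    """Zip <module_name>.py into <module_name>.zip, keeping the file at the
    archive root so a handler path like "stopec2instance.lambda_handler"
    resolves to the module inside the archive."""
    archive = f'{module_name}.zip'
    with zipfile.ZipFile(archive, 'w', zipfile.ZIP_DEFLATED) as zf:
        zf.write(f'{module_name}.py', arcname=f'{module_name}.py')
    return archive

# Usage (run from the Terraform working directory):
#   package_lambda('stopec2instance')
#   package_lambda('startec2instance')
```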



&lt;p&gt;&lt;strong&gt;terraform init:&lt;/strong&gt; &lt;em&gt;Initializes the backend. In this step Terraform checks which provider is used and downloads all of that provider's dependencies (AWS in our case). If everything is fine, the output will look something like this:&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2dk8crsa0vs3wov92krt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2dk8crsa0vs3wov92krt.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;terraform plan:&lt;/strong&gt; &lt;em&gt;In this step Terraform shows an execution plan listing the resources it will create:&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frzxaozrycwbb9uzbcw9b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frzxaozrycwbb9uzbcw9b.png" alt=" " width="800" height="404"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;terraform apply:&lt;/strong&gt; &lt;em&gt;In this step Terraform actually creates the resources shown in the plan.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg4ik6mfskecfto94nswx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg4ik6mfskecfto94nswx.png" alt=" " width="800" height="404"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Once all the resources are created the output will be like this:&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;EC2 instance&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fff4ply6dhblza2j7elnt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fff4ply6dhblza2j7elnt.png" alt=" " width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lambda Function&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8kn0xjwmytr1hacmkn4x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8kn0xjwmytr1hacmkn4x.png" alt=" " width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;EventBridge Rules&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fazdo2p0y1s7vc10zo5xv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fazdo2p0y1s7vc10zo5xv.png" alt=" " width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Whenever a Lambda function is triggered by an EventBridge rule, the output will look like this:&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa0fl9xfwvnrtfg20s7k1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa0fl9xfwvnrtfg20s7k1.png" alt=" " width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy47r4ikkntjswg91fm4v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy47r4ikkntjswg91fm4v.png" alt=" " width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fes4j6c177d8r3mwhg7hg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fes4j6c177d8r3mwhg7hg.png" alt=" " width="800" height="445"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;If you want to delete the resources, run the terraform destroy command.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion:&lt;/strong&gt; &lt;br&gt;
&lt;em&gt;By automating the start and stop of EC2 instances using Lambda, EventBridge, and Terraform, we've created an efficient and cost-effective solution for managing our cloud resources. This setup can be easily adapted to suit different schedules and requirements.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Happy automating!&lt;/em&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Streamlining Data Conversion: XML to JSON with AWS Lambda and S3</title>
      <dc:creator>Mohana Priya </dc:creator>
      <pubDate>Mon, 01 Jul 2024 15:21:50 +0000</pubDate>
      <link>https://dev.to/mohanapriya_s_1808/streamlining-data-conversion-xml-to-json-with-aws-lambda-and-s3-43nf</link>
      <guid>https://dev.to/mohanapriya_s_1808/streamlining-data-conversion-xml-to-json-with-aws-lambda-and-s3-43nf</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction:&lt;/strong&gt;&lt;br&gt;
In today's data-driven world, managing and transforming data formats efficiently is crucial for seamless integration and processing. XML (eXtensible Markup Language) and JSON (JavaScript Object Notation) are two widely used data formats, each with its own set of advantages. However, there are scenarios where converting data from XML to JSON becomes necessary to leverage JSON's simplicity and compatibility with modern web applications and APIs.&lt;/p&gt;

&lt;p&gt;AWS (Amazon Web Services) provides a robust and scalable solution for such data transformation tasks using S3 buckets and Lambda functions. In this blog, we will walk you through the process of setting up an automated workflow that converts XML files stored in an S3 bucket to JSON format and uploads them to another S3 bucket using AWS Lambda. By the end of this tutorial, you'll have a clear understanding of how to harness the power of AWS to simplify your data processing pipelines.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-1:&lt;/strong&gt; Setting Up S3 Buckets&lt;br&gt;
First, create two S3 buckets: one for storing the input XML files and another for storing the converted JSON files. For this tutorial, we'll refer to them as &lt;strong&gt;input-bucket&lt;/strong&gt; and &lt;strong&gt;output-bucket&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fksfzsc2nxla3500ajmwz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fksfzsc2nxla3500ajmwz.png" alt=" " width="800" height="433"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-2:&lt;/strong&gt; Creating the Lambda Function&lt;br&gt;
Navigate to the AWS Lambda console and create a new Lambda function using the Python 3.9 runtime. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fczj5805v2erqmbf8z5j7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fczj5805v2erqmbf8z5j7.png" alt=" " width="800" height="433"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-3:&lt;/strong&gt; Configuring the Lambda Function&lt;br&gt;
&lt;strong&gt;Permissions:&lt;/strong&gt; Ensure your Lambda function has the necessary permissions to read from the input-bucket and write to the output-bucket. Attach an IAM role with the AmazonS3FullAccess managed policy (or, better, a policy scoped to just these two buckets).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Code:&lt;/strong&gt; Use the following code to handle the XML to JSON conversion and upload the converted file to the output bucket:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json
import xml.etree.ElementTree as ET
import boto3

s3 = boto3.client('s3')

def parse_element(element):
    parsed_data = {}
    if element.attrib:
        parsed_data.update(('@' + k, v) for k, v in element.attrib.items())
    if element.text and element.text.strip():
        parsed_data['#text'] = element.text.strip()
    children = list(element)
    if children:
        child_data = {}
        for child in children:
            child_dict = parse_element(child)
            child_tag = child.tag
            if child_tag not in child_data:
                child_data[child_tag] = []
            child_data[child_tag].append(child_dict)
        for k, v in child_data.items():
            if len(v) == 1:
                parsed_data[k] = v[0]
            else:
                parsed_data[k] = v
    return parsed_data

def xml_to_json(xml_content):
    root = ET.fromstring(xml_content)
    return json.dumps(parse_element(root), indent=4)

def lambda_handler(event, context):
    source_bucket = 'input-bucket'  # Replace with your source bucket name
    destination_bucket = 'output-bucket'  # Replace with your destination bucket name

    try:
        records = event['Records']
    except KeyError:
        return {
            'statusCode': 400,
            'body': json.dumps('Event does not contain Records key.')
        }

    for record in records:
        try:
            key = record['s3']['object']['key']
        except KeyError:
            return {
                'statusCode': 400,
                'body': json.dumps('Record does not contain S3 object key.')
            }

        if not key.endswith('.xml'):
            return {
                'statusCode': 400,
                'body': json.dumps('The file is not an XML file.')
            }

        xml_obj = s3.get_object(Bucket=source_bucket, Key=key)
        xml_content = xml_obj['Body'].read().decode('utf-8')

        json_content = xml_to_json(xml_content)
        json_key = key.replace('.xml', '.json')
        s3.put_object(Bucket=destination_bucket, Key=json_key, Body=json_content)

    # Return after the loop so every record in the event is processed,
    # not just the first one
    return {
        'statusCode': 200,
        'body': json.dumps('XML to JSON conversion and upload successful!')
    }

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-4:&lt;/strong&gt; Setting Up S3 Trigger&lt;/p&gt;

&lt;p&gt;Configure an S3 trigger for the input-bucket to invoke the Lambda function whenever a new XML file is uploaded. This ensures that the function is automatically triggered and processes the file as soon as it is added to the bucket.&lt;/p&gt;
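
&lt;p&gt;&lt;em&gt;Console configuration works fine here, but if you script this step, the same trigger corresponds to a bucket notification configuration along the following lines (the Lambda ARN, account ID, and function name are placeholders), applied with aws s3api put-bucket-notification-configuration. The suffix filter keeps non-XML uploads from invoking the function at all:&lt;/em&gt;&lt;/p&gt;

```json
{
  "LambdaFunctionConfigurations": [
    {
      "LambdaFunctionArn": "arn:aws:lambda:ap-south-1:123456789012:function:xml-to-json-converter",
      "Events": ["s3:ObjectCreated:*"],
      "Filter": {
        "Key": { "FilterRules": [{ "Name": "suffix", "Value": ".xml" }] }
      }
    }
  ]
}
```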

&lt;p&gt;&lt;strong&gt;Step-5:&lt;/strong&gt; Testing the Workflow&lt;/p&gt;

&lt;p&gt;Upload a sample XML file to the &lt;strong&gt;input-bucket&lt;/strong&gt; and verify that the Lambda function is triggered. Check the &lt;strong&gt;output-bucket&lt;/strong&gt; for the converted JSON file. For example, you can use the following XML content for testing:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuvnpf69o5olodcweeos7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuvnpf69o5olodcweeos7.png" alt=" " width="800" height="433"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;?xml version="1.0" encoding="UTF-8"?&amp;gt;
&amp;lt;note&amp;gt;
    &amp;lt;to&amp;gt;Tove&amp;lt;/to&amp;gt;
    &amp;lt;from&amp;gt;Jani&amp;lt;/from&amp;gt;
    &amp;lt;heading&amp;gt;Reminder&amp;lt;/heading&amp;gt;
    &amp;lt;body&amp;gt;Don't forget me this weekend!&amp;lt;/body&amp;gt;
&amp;lt;/note&amp;gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
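
&lt;p&gt;&lt;em&gt;Before wiring up the trigger, the conversion logic itself can be exercised locally with this sample, using a condensed copy of the parse_element and xml_to_json functions from the Lambda code above (no AWS dependency needed):&lt;/em&gt;&lt;/p&gt;

```python
import json
import xml.etree.ElementTree as ET

# Condensed local copy of the Lambda's conversion helpers.
def parse_element(element):
    parsed_data = {}
    if element.attrib:
        parsed_data.update(('@' + k, v) for k, v in element.attrib.items())
    if element.text and element.text.strip():
        parsed_data['#text'] = element.text.strip()
    child_data = {}
    for child in element:
        child_data.setdefault(child.tag, []).append(parse_element(child))
    for k, v in child_data.items():
        parsed_data[k] = v[0] if len(v) == 1 else v
    return parsed_data

def xml_to_json(xml_content):
    root = ET.fromstring(xml_content)
    return json.dumps(parse_element(root), indent=4)

sample = """<?xml version="1.0" encoding="UTF-8"?>
<note>
    <to>Tove</to>
    <from>Jani</from>
    <heading>Reminder</heading>
    <body>Don't forget me this weekend!</body>
</note>"""

result = json.loads(xml_to_json(sample))
print(result['to'])  # {'#text': 'Tove'}
```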



&lt;p&gt;&lt;strong&gt;Output:&lt;/strong&gt;&lt;br&gt;
The output of this Lambda function will be:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fygoalvov55noqifyiz3k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fygoalvov55noqifyiz3k.png" alt=" " width="800" height="433"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6x58yxzmvk0y7tgp4jkr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6x58yxzmvk0y7tgp4jkr.png" alt=" " width="800" height="431"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion:&lt;/strong&gt;&lt;br&gt;
By following the steps outlined in this blog, you have successfully set up an automated system for converting XML files to JSON using AWS Lambda and S3. This approach leverages the scalability and reliability of AWS services to streamline your data processing tasks, making your workflows more efficient and easier to manage.&lt;/p&gt;

&lt;p&gt;Whether you are handling large datasets or integrating with systems that require JSON format, this solution provides a robust and scalable way to manage your data transformations. Start leveraging AWS Lambda and S3 today to simplify your data processing pipelines and enhance your application's performance.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Automating Email Notifications for S3 Object Uploads using SNS</title>
      <dc:creator>Mohana Priya </dc:creator>
      <pubDate>Sun, 30 Jun 2024 14:59:13 +0000</pubDate>
      <link>https://dev.to/mohanapriya_s_1808/automating-email-notifications-for-s3-object-uploads-using-sns-5efb</link>
      <guid>https://dev.to/mohanapriya_s_1808/automating-email-notifications-for-s3-object-uploads-using-sns-5efb</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In today's cloud-centric world, automation is key to managing and scaling infrastructure efficiently. Amazon Web Services (AWS) offers a robust suite of services that allow developers to build complex systems with relative ease. One such powerful combination is Amazon Simple Storage Service (S3) and Amazon Simple Notification Service (SNS). Together, they can automate notifications for various events, such as when a new object is uploaded to an S3 bucket. This blog post will guide you through the process of setting up an S3 bucket, creating an SNS topic, subscribing to the topic via email, and configuring the system to send notifications when objects are uploaded to the S3 bucket.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-1:&lt;/strong&gt; Create an S3 Bucket&lt;br&gt;
First, we need an S3 bucket where objects will be stored. Follow these steps to create an S3 bucket:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Navigate to the S3 console:&lt;/strong&gt; Open the AWS Management Console and navigate to the S3 service.&lt;br&gt;
&lt;strong&gt;2. Create a new bucket:&lt;/strong&gt; Click on "Create bucket" and provide a unique name for your bucket. Choose the appropriate region and configure other settings as per your requirements. For this example, we’ll stick with the default settings.&lt;br&gt;
&lt;strong&gt;3. Create the bucket:&lt;/strong&gt; Click on "Create bucket" to finalize the creation.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwjq7coqp5btq0gornjm1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwjq7coqp5btq0gornjm1.png" alt=" " width="800" height="435"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-2:&lt;/strong&gt; Create an SNS Topic&lt;br&gt;
Next, we’ll create an SNS topic to which notifications will be sent.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Navigate to the SNS console:&lt;/strong&gt; Open the AWS Management Console and navigate to the SNS service.&lt;br&gt;
&lt;strong&gt;2. Create a topic:&lt;/strong&gt; Click on "Create topic", select "Standard" for the topic type, and provide a name for your topic. Click on "Create topic" to finalize the creation.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F80jh40e8fgj27xh49a78.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F80jh40e8fgj27xh49a78.png" alt=" " width="800" height="431"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-3:&lt;/strong&gt; Subscribe to the SNS Topic via Email&lt;br&gt;
To receive notifications, we need to subscribe to the SNS topic using an email address.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Create a subscription:&lt;/strong&gt; In the SNS console, select your topic and click on "Create subscription".&lt;br&gt;
&lt;strong&gt;2. Set protocol and endpoint:&lt;/strong&gt; Choose "Email" as the protocol and enter your email address in the endpoint field.&lt;br&gt;
&lt;strong&gt;3. Confirm the subscription:&lt;/strong&gt; AWS will send a confirmation email to the provided address. Check your email and click on the confirmation link to complete the subscription.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-4:&lt;/strong&gt; Update the SNS Topic Access Policy&lt;br&gt;
To allow S3 to publish messages to the SNS topic, we need to modify the topic's access policy.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Edit the policy:&lt;/strong&gt; In the SNS console, select your topic and click on "Edit" under the Access Policy section.&lt;br&gt;
&lt;strong&gt;2. Add the policy:&lt;/strong&gt; Add the following policy to allow S3 to publish notifications:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "s3.amazonaws.com"
      },
      "Action": "SNS:Publish",
      "Resource": "arn:aws:sns:your-region:your-account-id:your-topic-name"
    }
  ]
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
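&lt;p&gt;If you are scripting the setup, the same policy document can be generated and attached with a short Python sketch (the topic ARN is a placeholder; attaching it requires boto3 and valid credentials):&lt;/p&gt;

```python
import json

def s3_publish_policy(topic_arn: str) -> str:
    """Return a policy JSON string that lets the S3 service publish to the topic."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "s3.amazonaws.com"},
            "Action": "SNS:Publish",
            "Resource": topic_arn,
        }],
    }
    return json.dumps(policy)

# With boto3 installed, attach the policy to the topic:
#   import boto3
#   boto3.client("sns").set_topic_attributes(
#       TopicArn=topic_arn, AttributeName="Policy",
#       AttributeValue=s3_publish_policy(topic_arn))
```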



&lt;p&gt;&lt;strong&gt;Step-5:&lt;/strong&gt; Configure S3 to Send Notifications to SNS&lt;br&gt;
Finally, configure the S3 bucket to send notifications to the SNS topic when an object is uploaded.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Navigate to the S3 bucket:&lt;/strong&gt; In the S3 console, select your bucket and click on the "Properties" tab.&lt;br&gt;
&lt;strong&gt;2. Add a notification:&lt;/strong&gt; Under the "Event notifications" section, click on "Create event notification".&lt;br&gt;
&lt;strong&gt;3. Configure the event:&lt;/strong&gt; Provide a name for the notification, select the "All object create events" event type, and choose SNS topic as the destination. Select your SNS topic from the dropdown and save the configuration.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwvm7iin0yll617g964d4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwvm7iin0yll617g964d4.png" alt=" " width="800" height="433"&gt;&lt;/a&gt;&lt;/p&gt;
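&lt;p&gt;The console steps above correspond to a single S3 API call. A minimal boto3 sketch of the equivalent configuration (bucket name and topic ARN are placeholders):&lt;/p&gt;

```python
def notification_config(topic_arn: str) -> dict:
    """NotificationConfiguration that routes all object-create events to an SNS topic."""
    return {
        "TopicConfigurations": [{
            "TopicArn": topic_arn,
            "Events": ["s3:ObjectCreated:*"],
        }]
    }

# With boto3 installed and the topic policy from Step-4 in place:
#   import boto3
#   boto3.client("s3").put_bucket_notification_configuration(
#       Bucket="my-unique-bucket-name",
#       NotificationConfiguration=notification_config(
#           "arn:aws:sns:your-region:your-account-id:your-topic-name"))
```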

&lt;p&gt;&lt;strong&gt;Step-6:&lt;/strong&gt; Upload an object&lt;br&gt;
Finally, upload an object to the S3 bucket you created.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbw699omyknqrajl7f9ic.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbw699omyknqrajl7f9ic.png" alt=" " width="800" height="419"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;SNS will send a notification to the subscribed email address once the object is uploaded to the S3 bucket.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion:&lt;/strong&gt;&lt;br&gt;
By following the steps outlined in this blog post, you have successfully set up an automated notification system that sends an email whenever a new object is uploaded to your S3 bucket. This integration between S3 and SNS not only demonstrates the power of AWS services but also showcases how they can be used to build robust and automated workflows. Whether for operational monitoring, security alerts, or just keeping track of changes in your storage, this setup is a valuable tool in any cloud infrastructure arsenal. Happy automating!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Creating and Managing IAM Users from Your EC2 Instance</title>
      <dc:creator>Mohana Priya </dc:creator>
      <pubDate>Sat, 29 Jun 2024 15:48:58 +0000</pubDate>
      <link>https://dev.to/mohanapriya_s_1808/creating-and-managing-iam-users-from-your-ec2-instance-dhk</link>
      <guid>https://dev.to/mohanapriya_s_1808/creating-and-managing-iam-users-from-your-ec2-instance-dhk</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction:&lt;/strong&gt;&lt;br&gt;
Managing user access on your EC2 instance is a fundamental task for maintaining security and efficiency in your AWS environment. In this guide, we'll walk you through creating a new user from the EC2 instance, setting up SSH access, and verifying the new user. Let's get started!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-1:&lt;/strong&gt; Create an EC2 instance&lt;br&gt;
Create an EC2 instance from the AWS Management Console.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbwgom9368qx5v0fkk17z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbwgom9368qx5v0fkk17z.png" alt=" " width="800" height="434"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-2:&lt;/strong&gt; Connect to your EC2 instance&lt;br&gt;
Connect to your EC2 instance using SSH. Open your terminal and run the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ssh -i path_to_your_key.pem ec2-user@your_ec2_public_dns

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace path_to_your_key.pem with the path to your private key file and your_ec2_public_dns with the public DNS of your EC2 instance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-3:&lt;/strong&gt; Create a new user&lt;br&gt;
Once connected to your EC2 instance, you can create a new user. Use the following command, replacing new_username with the desired username:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo adduser new_username

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command creates a new user and sets up a home directory for them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-4:&lt;/strong&gt; Set a password for the new user&lt;br&gt;
Next, set a password for the new user with the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo passwd new_username

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-5:&lt;/strong&gt; Verify the new user&lt;br&gt;
To verify that the new user has been created successfully, inspect /etc/passwd:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cat /etc/passwd

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should see an entry similar to this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;new_username:x:1001:1001::/home/new_username:/bin/bash

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This confirms that the new user has been added to the system.&lt;/p&gt;
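&lt;p&gt;As a quick sanity check, the colon-separated fields of a passwd entry can also be split programmatically. A minimal Python sketch parsing the example line above:&lt;/p&gt;

```python
def parse_passwd_line(line: str) -> dict:
    """Split one /etc/passwd entry into its seven colon-separated fields."""
    name, _pw, uid, gid, _gecos, home, shell = line.strip().split(":")
    return {"name": name, "uid": int(uid), "gid": int(gid), "home": home, "shell": shell}

entry = parse_passwd_line("new_username:x:1001:1001::/home/new_username:/bin/bash")
print(entry["home"], entry["shell"])  # /home/new_username /bin/bash
```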

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpqmj7hjselrnfjjxfzu0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpqmj7hjselrnfjjxfzu0.png" alt=" " width="800" height="408"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foxn8x8wqftzm8g5o1kvo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foxn8x8wqftzm8g5o1kvo.png" alt=" " width="800" height="399"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion:&lt;/strong&gt;&lt;br&gt;
Creating and managing users on your EC2 instance is a straightforward process that enhances your system's security and accessibility. By following these steps, you can efficiently add new users, set up SSH access, and verify their presence on your system.&lt;/p&gt;

&lt;p&gt;Regularly review and manage user accounts to maintain a secure and organized environment. Happy managing!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Transforming Cloud Infrastructure with Terraform: Build, Change, Deploy</title>
      <dc:creator>Mohana Priya </dc:creator>
      <pubDate>Thu, 27 Jun 2024 18:18:30 +0000</pubDate>
      <link>https://dev.to/mohanapriya_s_1808/transforming-cloud-infrastructure-with-terraform-build-change-deploy-54jn</link>
      <guid>https://dev.to/mohanapriya_s_1808/transforming-cloud-infrastructure-with-terraform-build-change-deploy-54jn</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction:&lt;/strong&gt;&lt;br&gt;
In today's fast-paced tech world, infrastructure as code (IaC) has become essential for managing and automating your cloud resources. Recently, I had the opportunity to dive into Terraform, an open-source IaC tool that allows you to define and provision your infrastructure using a simple, declarative programming language. In this blog, I'll walk you through my journey of building, changing, and deploying infrastructure, as well as querying data outputs using Terraform.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pre-requisites:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before we dive into the steps, let's ensure you have the following prerequisites in place:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;AWS Account: If you don't have one, sign up for an AWS account.&lt;/li&gt;
&lt;li&gt;Terraform Installed: Download and install Terraform from the official website.&lt;/li&gt;
&lt;li&gt;AWS CLI Installed: Install the AWS CLI by following the instructions here.&lt;/li&gt;
&lt;li&gt;AWS Credentials Configured: Configure your AWS CLI with your credentials by running aws configure.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Building the infrastructure&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-1:&lt;/strong&gt; Terraform Configuration&lt;/p&gt;

&lt;p&gt;Each Terraform configuration must be in its own working directory. Create a directory for your configuration.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir terraform-learning
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Change into the directory&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cd terraform-learning
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create a file to define your infrastructure.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;code main.tf
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Open main.tf in your text editor, give the configuration below, and save the file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~&amp;gt; 4.16"
    }
  }

  required_version = "&amp;gt;= 1.2.0"
}

provider "aws" {
  region  = "us-west-2"
}

resource "aws_instance" "My_app_server" {
  ami           = "ami-830c94e3"
  instance_type = "t2.micro"

  tags = {
    Name = "ExampleInstance"
  }
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-2:&lt;/strong&gt; Initialize the Terraform&lt;/p&gt;

&lt;p&gt;Initializing a configuration directory downloads and installs the providers defined in the configuration, which in this case is the aws provider.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-3:&lt;/strong&gt; Create the infrastructure&lt;/p&gt;

&lt;p&gt;Apply the configuration now with the terraform apply command.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform apply
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Enter yes to apply the configuration.&lt;br&gt;
You have now created infrastructure using Terraform! Visit the EC2 console and find your new EC2 instance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Changing the infrastructure&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-1:&lt;/strong&gt; Configuration of new ami&lt;/p&gt;

&lt;p&gt;Now update the ami of your instance. Change the aws_instance.My_app_server resource under the provider block in main.tf by replacing the current AMI ID with a new one.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; resource "aws_instance" "My_app_server" {
-  ami           = "ami-830c94e3"
+  ami           = "ami-08d70e59c07c61a3a"
   instance_type = "t2.micro"
 }

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-2:&lt;/strong&gt; Apply the changes&lt;/p&gt;

&lt;p&gt;After changing the configuration, run terraform apply again to see how Terraform will apply this change to the existing resources.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform apply
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Because the AMI changed, Terraform cannot update the instance in place: it destroys the existing instance first and then creates a new one to replace it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Destroy the infrastructure&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once you no longer need infrastructure, you may want to destroy it to reduce your security exposure and costs.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform destroy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The terraform destroy command terminates resources managed by your Terraform project.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Defining the Input Variable&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-1:&lt;/strong&gt; Set the instance name with variable&lt;/p&gt;

&lt;p&gt;Create a new file called variables.tf with a block defining a new instance_name variable.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;variable "instance_name" {
  description = "Value of the Name tag for the EC2 instance"
  type        = string
  default     = "ExampleInstance"
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-2:&lt;/strong&gt; Update main.tf&lt;/p&gt;

&lt;p&gt;In main.tf, update the aws_instance resource block to use the new variable. The instance_name variable block will default to its default value ("ExampleInstance") unless you declare a different value.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; resource "aws_instance" "My_app_server" {
   ami           = "ami-08d70e59c07c61a3a"
   instance_type = "t2.micro"

   tags = {
-    Name = "ExampleInstance"
+    Name = var.instance_name
   }
 }

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-3:&lt;/strong&gt; Apply Configuration&lt;/p&gt;

&lt;p&gt;Apply the configuration. Enter yes to confirm the configuration.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform apply
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-4:&lt;/strong&gt; Passing the variable&lt;/p&gt;

&lt;p&gt;Now apply the configuration again, this time overriding the default instance name by passing in a variable using the -var flag.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform apply -var "instance_name=SecondNameForInstance"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Terraform will update the instance's Name tag with the new name.&lt;/p&gt;
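&lt;p&gt;As an alternative to the -var flag, the same override can live in a terraform.tfvars file, which Terraform loads automatically (the value shown is just the example name from above):&lt;/p&gt;

```hcl
# terraform.tfvars -- loaded automatically by terraform plan and terraform apply
instance_name = "SecondNameForInstance"
```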

&lt;p&gt;&lt;strong&gt;Query the Data&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step-1:&lt;/strong&gt; Output EC2 instance configuration&lt;/p&gt;

&lt;p&gt;Create a file called outputs.tf in your terraform-learning directory.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;code output.tf
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
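&lt;p&gt;The post doesn't show the file's contents. A typical outputs.tf for this configuration (matching the aws_instance.My_app_server resource defined earlier) would look like:&lt;/p&gt;

```hcl
output "instance_id" {
  description = "ID of the EC2 instance"
  value       = aws_instance.My_app_server.id
}

output "instance_public_ip" {
  description = "Public IP address of the EC2 instance"
  value       = aws_instance.My_app_server.public_ip
}
```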



&lt;p&gt;&lt;strong&gt;Step-2:&lt;/strong&gt; Inspect output values&lt;/p&gt;

&lt;p&gt;Apply the configuration and enter yes to confirm it.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform apply
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-3:&lt;/strong&gt; Query Output value&lt;/p&gt;

&lt;p&gt;Query the outputs with the terraform output command.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform output
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Conclusion:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Working with Terraform has been an enlightening experience. Its simplicity and power make managing infrastructure a breeze. Whether you're setting up a static website or managing complex cloud environments, Terraform's declarative approach and extensive provider support have got you covered.&lt;br&gt;
Happy Coding!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Hosting a Static Website on Amazon S3 with Terraform: A Step-by-Step Guide</title>
      <dc:creator>Mohana Priya </dc:creator>
      <pubDate>Thu, 27 Jun 2024 04:30:02 +0000</pubDate>
      <link>https://dev.to/mohanapriya_s_1808/hosting-a-static-website-on-amazon-s3-with-terraform-a-step-by-step-guide-3m1</link>
      <guid>https://dev.to/mohanapriya_s_1808/hosting-a-static-website-on-amazon-s3-with-terraform-a-step-by-step-guide-3m1</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;br&gt;
In the world of web development, static websites offer a straightforward and efficient way to present content without the complexities of server-side processing. Amazon S3 provides a robust platform for hosting these static websites, ensuring high availability and scalability. To further streamline the deployment process, Terraform, an Infrastructure as Code (IaC) tool, can be used to automate the creation and management of your AWS resources.&lt;/p&gt;

&lt;p&gt;This guide will walk you through the process of hosting a static website on Amazon S3 using Terraform, leveraging a modular file structure for clarity and ease of management. By the end of this tutorial, you'll have a fully functional static website hosted on Amazon S3, managed entirely through Terraform.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites&lt;/strong&gt;&lt;br&gt;
Before we dive into the steps, let's ensure you have the following prerequisites in place:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;AWS Account: If you don't have one, sign up for an AWS account.&lt;/li&gt;
&lt;li&gt;Terraform Installed: Download and install Terraform from the official website.&lt;/li&gt;
&lt;li&gt;AWS CLI Installed: Install the AWS CLI by following the instructions here.&lt;/li&gt;
&lt;li&gt;AWS Credentials Configured: Configure your AWS CLI with your credentials by running aws configure.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Step-1:&lt;/strong&gt; Create the Directory Structure&lt;br&gt;
First, let's create a directory for our Terraform project and navigate into it.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir my-static-website
cd my-static-website
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-2:&lt;/strong&gt; Define your Terraform Configuration&lt;/p&gt;

&lt;p&gt;Create a file named terraform.tf and define your provider configuration. This configuration sets up Terraform to use the AWS provider, specifying your AWS profile and region.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Terraform
terraform {
  required_version = "1.8.5"
  required_providers {
    aws = {
        source = "hashicorp/aws"
        version = "5.40.0"
    }
  }
}

#Provider
provider "aws" {
  profile = "default"
  region = "us-east-1"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-3:&lt;/strong&gt; Create the S3 bucket&lt;br&gt;
Create a file named bucket.tf to define your S3 bucket and its configuration. This defines an S3 bucket and uploads an index.html file to it.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Create S3 Bucket
resource "aws_s3_bucket" "terraform-demo-1808" {
  bucket = "terraform-demo-1808"
}

# Upload file to S3
resource "aws_s3_object" "terraform_index" {
  bucket = aws_s3_bucket.terraform-demo-1808.id
  key = "index.html"
  source = "index.html"
  content_type = "text/html"
  etag = filemd5("index.html")
}

# S3 Web hosting
resource "aws_s3_bucket_website_configuration" "terraform_hosting" {
  bucket = aws_s3_bucket.terraform-demo-1808.id

  index_document {
    suffix = "index.html"
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-4:&lt;/strong&gt; Set up the bucket policies &lt;br&gt;
Create a file named policy.tf to define your S3 bucket policies to allow public access. This block temporarily disables S3’s default Block Public Access settings for this specific bucket.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# S3 public access
resource "aws_s3_bucket_public_access_block" "terraform-demo" {
    bucket = aws_s3_bucket.terraform-demo-1808.id
  block_public_acls = false
  block_public_policy = false
}

# S3 public Read policy
resource "aws_s3_bucket_policy" "open_access" {
  bucket = aws_s3_bucket.terraform-demo-1808.id

  policy = jsonencode({
    Version = "2012-10-17"
    Id      = "Public_access"
    Statement = [
      {
        Sid = "IPAllow"
        Effect = "Allow"
        Principal = "*"
        Action = ["s3:GetObject"]
        Resource = "${aws_s3_bucket.terraform-demo-1808.arn}/*"
      },
    ]
  })
  depends_on = [ aws_s3_bucket_public_access_block.terraform-demo ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-5:&lt;/strong&gt; Configure the Output variable&lt;br&gt;
Create a file named output.tf for your website's URL.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Website URL
output "website_url" {
  value = "http://${aws_s3_bucket.terraform-demo-1808.bucket}.s3-website.${aws_s3_bucket.terraform-demo-1808.region}.amazonaws.com"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
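&lt;p&gt;The interpolation above just assembles the standard S3 website endpoint. The same string could be built in Python (note that some regions also expose a dash-separated endpoint such as s3-website-us-east-1, so verify the form against the Terraform output):&lt;/p&gt;

```python
def website_url(bucket: str, region: str) -> str:
    """Build the S3 static-website endpoint that the output block interpolates."""
    return f"http://{bucket}.s3-website.{region}.amazonaws.com"

print(website_url("terraform-demo-1808", "us-east-1"))
```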



&lt;p&gt;&lt;strong&gt;Step-6:&lt;/strong&gt; Initialize your Terraform project&lt;br&gt;
terraform init prepares Terraform’s working directory for managing infrastructure. It downloads and installs any required provider plugins based on your configuration, such as the hashicorp/aws provider.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-7:&lt;/strong&gt; Terraform Validate&lt;br&gt;
It performs a static analysis of your Terraform configuration files and validates the overall syntax of your Terraform code, ensuring it adheres to the Terraform language rules.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform validate
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-8:&lt;/strong&gt; Terraform Plan&lt;br&gt;
terraform plan lets you review the changes Terraform intends to make to your infrastructure before actually applying them.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform plan
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-9:&lt;/strong&gt; Terraform Apply&lt;br&gt;
The terraform apply command executes the actions outlined in the plan generated by terraform plan. It’s the final step in making the desired infrastructure changes a reality.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform apply
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step-10:&lt;/strong&gt; Access your website&lt;br&gt;
After the apply completes, Terraform will output your website's URL. Visit the URL to see your static website live.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqwurje5mkpud5t3eqldh.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqwurje5mkpud5t3eqldh.jpg" alt=" " width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion:&lt;/strong&gt;&lt;br&gt;
Congratulations! You've successfully hosted a static website on Amazon S3 using Terraform. This approach not only makes deployment straightforward but also ensures your infrastructure is version-controlled and easily reproducible.&lt;/p&gt;

&lt;p&gt;By following this guide, you can quickly deploy static websites for various purposes, such as personal blogs, portfolios, or documentation sites. Explore the power of Infrastructure as Code with Terraform and take your web hosting to the next level!&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
