<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: usamanisarkhan</title>
    <description>The latest articles on DEV Community by usamanisarkhan (@usamanisarkhan).</description>
    <link>https://dev.to/usamanisarkhan</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1111299%2Fb6178e1b-a016-49dc-97f1-72a2f941b5c6.jpg</url>
      <title>DEV Community: usamanisarkhan</title>
      <link>https://dev.to/usamanisarkhan</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/usamanisarkhan"/>
    <language>en</language>
    <item>
      <title>What happens in Vegas, stays in Vegas - but not for AWS</title>
      <dc:creator>usamanisarkhan</dc:creator>
      <pubDate>Tue, 05 Dec 2023 18:54:03 +0000</pubDate>
      <link>https://dev.to/usamanisarkhan/what-happens-in-vegas-stays-in-vegas-but-not-for-aws-2aea</link>
      <guid>https://dev.to/usamanisarkhan/what-happens-in-vegas-stays-in-vegas-but-not-for-aws-2aea</guid>
      <description>&lt;p&gt;Here is a quick rundown of the announcements from the AWS re:Invent 2023 event.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. &lt;a href="https://youtu.be/y6P-3CPj7CQ"&gt;Amazon Q&lt;/a&gt;&lt;/strong&gt;&lt;br&gt;
Amazon Q is a new type of generative AI assistant specifically for work. It can be tailored to your business and is built with security and privacy in mind.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Expanded choice of models in Amazon Bedrock&lt;/strong&gt;&lt;br&gt;
With Amazon Bedrock, customers can drive rapid innovation with the latest versions of foundation models. Customers have even more choice of models to build and scale generative AI applications. This includes additions from Anthropic, Cohere, Meta, and Stability AI, as well as new models in the Amazon Titan family.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://aws.amazon.com/pm/sagemaker/?gclid=Cj0KCQiA35urBhDCARIsAOU7QwnGw5DiRil6v9vEjSW9GTsiQqtW_qEMXzN7EtrTNOvtVp-d4rMzzUsaAp2dEALw_wcB&amp;amp;trk=b6c2fafb-22b1-4a97-a2f7-7e4ab2c7aa28&amp;amp;sc_channel=ps&amp;amp;ef_id=Cj0KCQiA35urBhDCARIsAOU7QwnGw5DiRil6v9vEjSW9GTsiQqtW_qEMXzN7EtrTNOvtVp-d4rMzzUsaAp2dEALw_wcB:G:s&amp;amp;s_kwcid=AL!4422!3!651751060692!e!!g!!amazon%20sagemaker!19852662230!145019225977"&gt;&lt;strong&gt;3. 5 new Amazon SageMaker capabilities&lt;/strong&gt;&lt;/a&gt;&lt;br&gt;
Amazon SageMaker is a fully managed service that brings together a broad set of tools to enable high-performance, low-cost machine learning for any use case. Five new capabilities will make it even easier for customers to build, train, and deploy models for generative AI.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. AWS serverless innovations&lt;/strong&gt;&lt;br&gt;
Three new AWS serverless innovations for Amazon Aurora, Amazon Redshift, and Amazon ElastiCache build on the work AWS has been doing ever since its first service, Amazon S3 (Amazon Simple Storage Service), launched. The new offerings aim to help customers analyze and manage data at any scale while dramatically simplifying their operations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Amazon S3 Express One Zone&lt;/strong&gt;&lt;br&gt;
Amazon S3 is one of the most popular cloud object storage services, holding more than 350 trillion “objects,” or pieces of data, and averaging more than 100 million requests for data a second. Amazon S3 Express One Zone is a new purpose-built storage class for running applications that require extremely fast data access, to achieve the highest possible efficiency. &lt;/p&gt;

</description>
      <category>aws</category>
      <category>amazon</category>
      <category>communitybuilders</category>
    </item>
    <item>
      <title>Amazon Q - A Perfect ITSM</title>
      <dc:creator>usamanisarkhan</dc:creator>
      <pubDate>Wed, 29 Nov 2023 10:06:18 +0000</pubDate>
      <link>https://dev.to/usamanisarkhan/amazon-q-a-perfect-itsm-4ode</link>
      <guid>https://dev.to/usamanisarkhan/amazon-q-a-perfect-itsm-4ode</guid>
      <description>&lt;p&gt;AWS launched a new service named Amazon Q. It can be tailored to your business by connecting it to company data, information, and systems, made simple with more than 40 built-in connectors. You can make Amazon Q an expert in your organization's knowledge and experience by pointing it at your internal data repositories, as well as connecting it to more than 40 enterprise systems—including Amazon Simple Storage Service (Amazon S3), Salesforce, Google Drive, Microsoft 365, ServiceNow, Gmail, Slack, Atlassian, and Zendesk—to help people across your organization easily get the answers they need, make faster decisions, synthesize new information, and lift out of the drag of repetitive tasks.&lt;br&gt;
Find out more here: &lt;a href="https://youtube.com/watch?v=bZsIPinetV4"&gt;https://youtube.com/watch?v=bZsIPinetV4&lt;/a&gt;&lt;/p&gt;

</description>
      <category>itsm</category>
      <category>aws</category>
      <category>amazonq</category>
    </item>
    <item>
      <title>Amazon Rekognition - Is Too Good</title>
      <dc:creator>usamanisarkhan</dc:creator>
      <pubDate>Sat, 25 Nov 2023 22:43:22 +0000</pubDate>
      <link>https://dev.to/usamanisarkhan/amazon-rekognition-5c2o</link>
      <guid>https://dev.to/usamanisarkhan/amazon-rekognition-5c2o</guid>
      <description>&lt;p&gt;I am glad to see the confidence intervals in #AmazonRekognition. It's a powerful #AI service from #AWS with a well-trained model, and it's really helpful for building innovative applications that harness its image and video analysis capabilities.&lt;br&gt;
As we continue to explore the vast landscape of AI and AWS, I'm eager to see the positive impact this technology can bring to businesses and society.&lt;br&gt;
Let's drive innovation together!&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--s-nEi5Qv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kwtk1iq0iicx06k0xemv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--s-nEi5Qv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kwtk1iq0iicx06k0xemv.png" alt="Image description" width="800" height="386"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yllNsYew--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j8rpj7wkjfz67ow5jcvc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yllNsYew--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j8rpj7wkjfz67ow5jcvc.png" alt="Image description" width="800" height="393"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>aiops</category>
      <category>amazonrekognition</category>
      <category>aws</category>
    </item>
    <item>
      <title>A guide to Basic AWS provisioning through IAC using Terraform</title>
      <dc:creator>usamanisarkhan</dc:creator>
      <pubDate>Fri, 24 Nov 2023 19:37:10 +0000</pubDate>
      <link>https://dev.to/usamanisarkhan/a-guide-to-basic-aws-provisioning-through-iac-using-terraform-31fj</link>
      <guid>https://dev.to/usamanisarkhan/a-guide-to-basic-aws-provisioning-through-iac-using-terraform-31fj</guid>
      <description>
&lt;p&gt;This is a simple tutorial on IaC: using Terraform, we will provision the following resources:&lt;br&gt;
a. An EC2 instance&lt;br&gt;
b. An S3 bucket&lt;br&gt;
c. A VPC&lt;br&gt;
d. Security groups and subnets.&lt;br&gt;
Prerequisites:&lt;br&gt;
An AWS account&lt;br&gt;
Terraform installed on your local machine.&lt;br&gt;
Step 1:&lt;br&gt;
Using the AWS IAM console, create your access keys.&lt;/p&gt;

&lt;p&gt;In PowerShell, execute the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws configure
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This prompts for four entries: the access key ID and secret access key you just created, plus a default region and output format, which stay in effect until changed.&lt;/p&gt;
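&lt;p&gt;As a sketch, the same four values can also be set non-interactively with the aws configure set subcommand (the key values below are placeholders):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws configure set aws_access_key_id YOUR_ACCESS_KEY_ID
aws configure set aws_secret_access_key YOUR_SECRET_ACCESS_KEY
aws configure set region us-east-1
aws configure set output json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;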

&lt;p&gt;Open VS Code and create four files:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;main.tf: Terraform configuration file.&lt;/li&gt;
&lt;li&gt;variables.tf: Define variables for your project.&lt;/li&gt;
&lt;li&gt;outputs.tf: Define output variables for your project.&lt;/li&gt;
&lt;li&gt;provider.tf: Configure the AWS provider and region.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The code to be pasted in main.tf is below; it is commented for further clarity.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
# Create a custom VPC
resource "aws_vpc" "set14-vpc" {
  cidr_block       = "10.0.0.0/16"
  instance_tenancy = "default"

  tags = {
    Name = "set14-vpc"
  }
}

#Create a public subnet
resource "aws_subnet" "set14-public-subnet" {
  vpc_id = aws_vpc.set14-vpc.id
  cidr_block = "10.0.1.0/24"
  availability_zone = "us-east-1a"

  tags = {
    Name = "set14-public-subnet"
  }
}
#creating an IGW
resource "aws_internet_gateway" "set14-igw" {
  vpc_id = aws_vpc.set14-vpc.id

  tags = {
    Name = "main"
  }
}

resource "aws_s3_bucket" "set-14-s3-backend" {
  bucket = var.bucket_name

  tags = {
    Name        = "set-14-s3-backend"
    Environment = "Dev"
  }
}

# NOTE: a public-read ACL only takes effect if Block Public Access is
# disabled and the bucket's object ownership setting permits ACLs
resource "aws_s3_bucket_acl" "set14-acl" {
  bucket = aws_s3_bucket.set-14-s3-backend.id
  acl    = "public-read"
}

#security gp
resource "aws_security_group" "set14-sg" {
  name        = "allow_tls"
  description = "Allow TLS inbound traffic"
  vpc_id      = aws_vpc.set14-vpc.id

  ingress {
    description      = "SSH from anywhere"
    from_port        = var.port_ssh
    to_port          = var.port_ssh
    protocol         = "tcp"
    cidr_blocks      = ["0.0.0.0/0"]

  }
  ingress {
    description      = "All traffic (demo only)"
    from_port        = 0
    to_port          = 0
    protocol         = "-1"
    cidr_blocks      = ["0.0.0.0/0"]

  }

  egress {
    from_port        = 0
    to_port          = 0
    protocol         = "-1"
    cidr_blocks      = ["0.0.0.0/0"]

  }

  tags = {
    Name = "allow_tls"
  }
}
#creating ec2 Instance
resource "aws_instance" "set14-ec2" {
  ami           = "ami-05a5f6298acdb05b6"
  instance_type = "t2.micro"
  subnet_id = aws_subnet.set14-public-subnet.id
  vpc_security_group_ids = [aws_security_group.set14-sg.id]

user_data = &amp;lt;&amp;lt;-EOF
#!/bin/bash
sudo yum update -y
sudo yum install httpd -y
sudo systemctl start httpd
EOF

  tags = {
    Name = "HelloWorld"
    Owner = "Kenny"

  }

}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The code to be pasted in variables.tf is below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;variable "bucket_name"{
    default = "set-14-s3-backend"
}

variable "port_ssh"{
    default = 22
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
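&lt;p&gt;These defaults can be overridden at plan or apply time without editing the file, for example (2222 is just an illustrative port):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform plan -var="port_ssh=2222"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;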



&lt;p&gt;The code to be pasted in outputs.tf is below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;output "Kenny-ip-address" {
    value = aws_instance.set14-ec2.public_ip
}
output "Kenny-vpc-id" {
    value = aws_vpc.set14-vpc.id
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The code to be pasted in provider.tf is below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Configure the AWS Provider
provider "aws" {
  region = "us-east-1"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
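&lt;p&gt;Since the bucket is named set-14-s3-backend, you may later want to use it as a remote state backend. A minimal sketch, only valid once the bucket already exists, because a backend block cannot reference resources defined in the same configuration:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform {
  backend "s3" {
    bucket = "set-14-s3-backend"
    key    = "terraform.tfstate"
    region = "us-east-1"
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;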



&lt;p&gt;In the VS Code terminal, execute:&lt;br&gt;
&lt;em&gt;&lt;strong&gt;terraform init&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Then execute:&lt;br&gt;
&lt;strong&gt;&lt;em&gt;terraform plan&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Finally, execute:&lt;br&gt;
&lt;em&gt;&lt;strong&gt;terraform apply&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;If you open the AWS Console, you will be able to see the provisioned resources.&lt;br&gt;
To destroy all resources and return to the initial state, in the terminal execute:&lt;br&gt;
&lt;em&gt;&lt;strong&gt;terraform destroy&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;I hope you liked this tutorial! Let me know in the comments.&lt;/p&gt;

</description>
      <category>terraform</category>
      <category>iac</category>
      <category>aws</category>
      <category>devops</category>
    </item>
    <item>
      <title>Are you feeling hungry? How about a free food app</title>
      <dc:creator>usamanisarkhan</dc:creator>
      <pubDate>Wed, 22 Nov 2023 00:08:27 +0000</pubDate>
      <link>https://dev.to/usamanisarkhan/are-you-feeling-hungry-how-about-free-food-app-4kbd</link>
      <guid>https://dev.to/usamanisarkhan/are-you-feeling-hungry-how-about-free-food-app-4kbd</guid>
      <description>&lt;p&gt;With PartyRock, you can share apps and content you create with a single click. Publish your app for others to find with shareable links. Discover, play, and remix apps to make them your own. You can even create a snapshot link of the content your app generates to quickly share your results with your community. PartyRock is most fun with friends. &lt;br&gt;
&lt;strong&gt;Check out this food app I made in 3 minutes&lt;/strong&gt; &lt;a href="https://partyrock.aws/u/Usama/cL2O2jrFn/Community-Helper%3A-Get-Me-Food-for-Free"&gt;https://partyrock.aws/u/Usama/cL2O2jrFn/Community-Helper%3A-Get-Me-Food-for-Free&lt;/a&gt;&lt;/p&gt;

</description>
      <category>partyrock</category>
      <category>aws</category>
      <category>generative</category>
      <category>ai</category>
    </item>
    <item>
      <title>Building end-to-end AWS DevSecOps CI/CD pipeline (Part 1- Continuous Delivery)</title>
      <dc:creator>usamanisarkhan</dc:creator>
      <pubDate>Sat, 18 Nov 2023 23:51:10 +0000</pubDate>
      <link>https://dev.to/usamanisarkhan/building-end-to-end-aws-devsecops-cicd-pipeline-part-1-continuous-delivery-20jm</link>
      <guid>https://dev.to/usamanisarkhan/building-end-to-end-aws-devsecops-cicd-pipeline-part-1-continuous-delivery-20jm</guid>
      <description>&lt;p&gt;This series of articles will cover setting up an end-to-end AWS DevSecOps CI/CD pipeline. We will start with a very basic component, on the easier and faster side, and then set up the complete pipeline.&lt;br&gt;
DESIRED PRODUCT AT THE END OF THIS SERIES&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0YiB9YUq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zvxa158bsm232x2gau1k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0YiB9YUq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zvxa158bsm232x2gau1k.png" alt="Image description" width="800" height="561"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This can be a very good starting point for someone who is starting a career as a DevOps engineer, and also for showcasing on a GitHub profile.&lt;br&gt;
So let's kick off the first step. The tutorials for the CD pipeline are available on the AWS student hub, but I have made this one easier.&lt;br&gt;
DESIRED PRODUCT AT THE END OF THIS PART&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DE3N2ii_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vb8cjanm8abg92apwttp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DE3N2ii_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vb8cjanm8abg92apwttp.png" alt="Image description" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pre Requisites&lt;/strong&gt;&lt;br&gt;
A will to learn.&lt;br&gt;
An AWS account (free tier is sufficient).&lt;br&gt;
A GitHub account.&lt;br&gt;
Git and Visual Studio Code installed on your local machine.&lt;br&gt;
Time.&lt;br&gt;
&lt;strong&gt;Step 1 : Create an IAM Role&lt;/strong&gt;&lt;br&gt;
Sign in to the AWS Management Console and open the IAM console at &lt;a href="https://console.aws.amazon.com/iam/"&gt;https://console.aws.amazon.com/iam/&lt;/a&gt;.&lt;br&gt;
In the IAM console, in the navigation pane, choose Policies, and then choose Create policy.&lt;br&gt;
On the Specify permissions page, choose JSON.&lt;br&gt;
Remove the example JSON code.&lt;br&gt;
Paste the following code: { "Version": "2012-10-17", "Statement": [ { "Action": [ "s3:Get*", "s3:List*" ], "Effect": "Allow", "Resource": "*" } ]}&lt;br&gt;
Choose Next.&lt;br&gt;
On the Review and create page, in the Policy name box, type CodeDeployDemo-EC2-Permissions.&lt;br&gt;
(Optional) For Description, type a description for the policy.&lt;br&gt;
Choose Create policy.&lt;br&gt;
In the navigation pane, choose Roles, and then choose Create role.&lt;br&gt;
Under Use case, choose the EC2 use case.&lt;br&gt;
Choose Next.&lt;br&gt;
In the list of policies, select the check box next to the policy you just created (CodeDeployDemo-EC2-Permissions). If necessary, use the search box to find the policy.&lt;br&gt;
To use Systems Manager to install or configure the CodeDeploy agent, select the check box next to AmazonSSMManagedInstanceCore. This AWS managed policy enables an instance to use Systems Manager service core functionality. If necessary, use the search box to find the policy.&lt;br&gt;
Choose Next.&lt;br&gt;
On the Name, review, and create page, in Role name, enter a name for the service role (for example, CodeDeployDemo-EC2-Instance-Profile), and then choose Create role.&lt;br&gt;
You've now created an IAM instance profile to attach to your Amazon EC2 instances.&lt;/p&gt;
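&lt;p&gt;For readability, the policy JSON from the step above, pretty-printed:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "s3:Get*",
        "s3:List*"
      ],
      "Effect": "Allow",
      "Resource": "*"
    }
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;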

&lt;p&gt;&lt;strong&gt;Configure Elastic Beanstalk&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In a new browser tab, open the AWS Elastic Beanstalk console.&lt;br&gt;
Choose the orange Create Application button.&lt;br&gt;
Choose Web server environment under the Configure environment heading.&lt;br&gt;
In the text box under the heading Application name, enter DevOpsGettingStarted.&lt;br&gt;
In the Platform dropdown menu, under the Platform heading, select Node.js . Platform branch and Platform version will automatically populate with default selections.&lt;br&gt;
Confirm that the radio button next to Sample application under the Application code heading is selected.&lt;br&gt;
Confirm that the radio button next to Single instance (free tier eligible) under the Presets heading is selected.&lt;br&gt;
Select Next.&lt;br&gt;
On the Configure service access screen, choose Use an existing service role for Service Role.&lt;br&gt;
For the EC2 instance profile dropdown list, the values displayed may vary depending on whether your account has previously created a new environment.&lt;br&gt;
Now that you've created an IAM Role, and refreshed the list, it displays as a choice in the dropdown list. Select the IAM Role you just created from the EC2 instance profile dropdown list.&lt;br&gt;
Choose Skip to Review on the Configure service access page.&lt;br&gt;
The Review page displays a summary of all your choices.&lt;br&gt;
Choose Submit at the bottom of the page to initialize the creation of your new environment.&lt;br&gt;
While waiting for deployment, you should see:&lt;br&gt;
A screen that will display status messages for your environment.&lt;br&gt;
After a few minutes have passed, you will see a green banner with a checkmark at the top of the environment screen.&lt;br&gt;
Once you see the banner, you have successfully created an AWS Elastic Beanstalk application and deployed it to an environment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Connect Git&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In a new browser tab, navigate to GitHub and make sure you are logged into your account.&lt;br&gt;
In that same tab, open the aws-elastic-beanstalk-express-js-sample repo.&lt;br&gt;
Choose the white Fork button on the top right corner of the screen. Next, you will see a small window asking you where you would like to fork the repo.&lt;br&gt;
Verify it is showing your account and choose Create a fork. After a few seconds, your browser will display a copy of the repo in your account under Repositories.&lt;br&gt;
Go to the repository and choose the green Code button near the top of the page.&lt;br&gt;
To clone the repository using HTTPS, confirm that the heading says Clone with HTTPS. If not, select the Use HTTPS link.&lt;br&gt;
Choose the white button with a clipboard icon on it (to the right of the URL)&lt;br&gt;
If you're on a Mac or Linux computer, open your terminal. If you're on Windows, launch Git Bash. In the terminal, enter the following command and paste the URL you copied when you clicked the clipboard icon. Be sure to change "YOUR-USERNAME" to your GitHub username. You should see a message in your terminal that starts with Cloning into; this command creates a new folder with a copy of the files from the GitHub repo: git clone &lt;a href="https://github.com/YOUR-USERNAME/aws-elastic-beanstalk-express-js-sample"&gt;https://github.com/YOUR-USERNAME/aws-elastic-beanstalk-express-js-sample&lt;/a&gt;&lt;br&gt;
In the new folder there is a file named app.js. Open app.js in your favorite code editor.&lt;br&gt;
Change the message in line 5 to say something other than "Hello World!" and save the file.&lt;br&gt;
Go to the folder created with the name aws-elastic-beanstalk-express-js-sample/ and commit the change with the following commands: git add app.js, then git commit -m "change message"&lt;br&gt;
Push the local changes to the remote repo hosted on GitHub with the following command. Note that you need to configure a Personal access token (classic) under Developer Settings in GitHub for remote authentication: git push&lt;/p&gt;
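&lt;p&gt;The clone-edit-commit-push sequence above as one sketch (YOUR-USERNAME is a placeholder):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git clone https://github.com/YOUR-USERNAME/aws-elastic-beanstalk-express-js-sample
cd aws-elastic-beanstalk-express-js-sample
# edit the message on line 5 of app.js, then:
git add app.js
git commit -m "change message"
git push
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;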

&lt;p&gt;&lt;strong&gt;Configure Code Build&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In a new browser tab, open the AWS CodeBuild console.&lt;br&gt;
Choose the orange Create project button.&lt;br&gt;
In the Project name field, enter Build-DevOpsGettingStarted.&lt;br&gt;
Select GitHub from the Source provider dropdown menu.&lt;br&gt;
Confirm that the Connect using OAuth radio button is selected.&lt;br&gt;
Choose the white Connect to GitHub button. A new browser tab will open asking you to give AWS CodeBuild access to your GitHub repo.&lt;br&gt;
Choose the green Authorize aws-codesuite button.&lt;br&gt;
Enter your GitHub password.&lt;br&gt;
Choose the orange Confirm button.&lt;br&gt;
Select Repository in my GitHub account.&lt;br&gt;
Enter aws-elastic-beanstalk-express-js-sample in the search field.&lt;br&gt;
Confirm that Managed Image is selected.&lt;br&gt;
Select Amazon Linux 2 from the Operating system dropdown menu.&lt;br&gt;
Select Standard from the Runtime(s) dropdown menu.&lt;br&gt;
Select aws/codebuild/amazonlinux2-x86_64-standard:3.0 from the Image dropdown menu.&lt;br&gt;
Confirm that Always use the latest image for this runtime version is selected for Image version.&lt;br&gt;
Confirm that Linux is selected for Environment type.&lt;br&gt;
Confirm that New service role is selected.&lt;br&gt;
Select Insert build commands.&lt;br&gt;
Choose Switch to editor.&lt;br&gt;
Replace the Buildspec in the editor with the code below:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: 0.2
phases:
  build:
    commands:
      - npm i --save
artifacts:
  files:
    - '**/*'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Choose the orange Create build project button. You should now see a dashboard for your project.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Creating a New Pipeline&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In a browser window, open the AWS CodePipeline console.&lt;br&gt;
Choose the orange Create pipeline button. A new screen will open up so you can set up the pipeline.&lt;br&gt;
In the Pipeline name field, enter Pipeline-DevOpsGettingStarted.&lt;br&gt;
Confirm that New service role is selected.&lt;br&gt;
Choose the orange Next button.&lt;br&gt;
Select GitHub version 1 from the Source provider dropdown menu.&lt;br&gt;
Choose the white Connect to GitHub button. A new browser tab will open asking you to give AWS CodePipeline access to your GitHub repo.&lt;br&gt;
Choose the green Authorize aws-codesuite button. Next, you will see a green box with the message You have successfully configured the action with the provider.&lt;br&gt;
From the Repository dropdown, select the repo you created in Connect Git step.&lt;br&gt;
Select main from the branch dropdown menu.&lt;br&gt;
Confirm that GitHub webhooks is selected.&lt;br&gt;
Choose the orange Next button.&lt;br&gt;
From the Build provider dropdown menu, select AWS CodeBuild. Select Build-DevOpsGettingStarted under Project name. Choose the orange Next button.&lt;br&gt;
Configure the deploy stage&lt;br&gt;
Select AWS Elastic Beanstalk from the Deploy provider dropdown menu.&lt;br&gt;
Select the field under Application name and confirm you can see the app DevOpsGettingStarted created in Step Above.&lt;br&gt;
Select DevOpsGettingStarted-env from the Environment name textbox.&lt;br&gt;
Choose the orange Next button. You will now see a page where you can review the pipeline configuration.&lt;br&gt;
Choose the orange Create pipeline button&lt;br&gt;
Open the AWS CodePipeline console.&lt;br&gt;
You should see the pipeline we created, which was called Pipeline-DevOpsGettingStarted. Select this pipeline.&lt;br&gt;
Choose the white Edit button near the top of the page.&lt;br&gt;
Choose the white Add stage button between the Build and Deploy stages.&lt;br&gt;
In the Stage name field, enter Review.&lt;br&gt;
Choose the orange Add stage button.&lt;br&gt;
In the Review stage, choose the white Add action group button.&lt;br&gt;
Under Action name, enter Manual_Review.&lt;br&gt;
From the Action provider dropdown, select Manual approval.&lt;br&gt;
Confirm that the optional fields have been left blank.&lt;br&gt;
Choose the orange Done button.&lt;br&gt;
Choose the orange Save button at the top of the page.&lt;br&gt;
Choose the orange Save button to confirm the changes. You will now see your pipeline with four stages: Source, Build, Review, and Deploy.&lt;br&gt;
In your favorite code editor, open the app.js file from Connect Git.&lt;br&gt;
Change the message in Line 5.&lt;br&gt;
Save the file.&lt;br&gt;
Open your preferred Git client.&lt;br&gt;
Navigate to the folder created in Connect Git.&lt;br&gt;
Commit the change and push with the following commands: git add app.js, git commit -m "Full pipeline test", then git push&lt;br&gt;
Navigate to the AWS CodePipeline console.&lt;br&gt;
Select the pipeline named Pipeline-DevOpsGettingStarted. You should see the Source and Build stages switch from blue to green.&lt;br&gt;
When the Review stage switches to blue, choose the white Review button.&lt;br&gt;
Write an approval comment in the Comments textbox.&lt;br&gt;
Choose the orange Approve button.&lt;br&gt;
Wait for the Review and Deploy stages to switch to green.&lt;br&gt;
Select the AWS Elastic Beanstalk link in the Deploy stage. A new tab listing your Elastic Beanstalk environments will open.&lt;br&gt;
Select the URL in the Devopsgettingstarted-env row. You should see a webpage with a white background and the text you had in your most recent GitHub commit.&lt;br&gt;
Congratulations! You have a fully functional continuous delivery pipeline hosted on AWS.&lt;/p&gt;

</description>
      <category>cicd</category>
      <category>aws</category>
      <category>devops</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Secure your S3 Buckets</title>
      <dc:creator>usamanisarkhan</dc:creator>
      <pubDate>Thu, 09 Nov 2023 21:11:56 +0000</pubDate>
      <link>https://dev.to/usamanisarkhan/secure-your-s3-buckets-3jgk</link>
      <guid>https://dev.to/usamanisarkhan/secure-your-s3-buckets-3jgk</guid>
      <description>&lt;p&gt;Protection of data in the Cloud remains of paramount importance. In an ever-changing environment, the basic building blocks of Cloud infrastructure must be safeguarded right from the start. Here are 3 simple steps for Cloud engineers and managers to ensure the security of S3 buckets. #CloudSecurity #S3Buckets #Cybersecurity #AWS #awsacademy #awscloud&lt;/p&gt;

&lt;p&gt;STEP 1 :&lt;br&gt;
SELECT  “BLOCK ALL PUBLIC ACCESS” OPTION &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--q6-9JPHg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/77668o9ju6nlhrs2omk9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--q6-9JPHg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/77668o9ju6nlhrs2omk9.png" alt="Image description" width="800" height="352"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;STEP 2:&lt;br&gt;
SELECT  “DISABLE ACLs” OPTION IN OBJECT OWNERSHIP&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vHh7WZYy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/e9ol06ckl60w5361781t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vHh7WZYy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/e9ol06ckl60w5361781t.png" alt="Image description" width="767" height="310"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;STEP 3:&lt;br&gt;
SELECT  “ENABLE VERSIONING” OPTION &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--i6DLKZlt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rn0y41q9g2m0v2o2851f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--i6DLKZlt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rn0y41q9g2m0v2o2851f.png" alt="Image description" width="800" height="210"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloudstorage</category>
      <category>security</category>
      <category>devops</category>
    </item>
    <item>
      <title>AWS Cloud Institute</title>
      <dc:creator>usamanisarkhan</dc:creator>
      <pubDate>Tue, 07 Nov 2023 04:20:20 +0000</pubDate>
      <link>https://dev.to/usamanisarkhan/aws-cloud-institute-179n</link>
      <guid>https://dev.to/usamanisarkhan/aws-cloud-institute-179n</guid>
      <description>&lt;p&gt;AWS Cloud Institute is a virtual cloud-skills training &lt;a href="https://aws.amazon.com/training/aws-cloud-institute/"&gt;program&lt;/a&gt; that will help you launch your career as a cloud developer in as little as one year, regardless of your technical background. Come build the in-demand skills that put YOU in demand. This is going to be a game changer for all learners. (Available in the USA only)&lt;/p&gt;

</description>
      <category>devops</category>
      <category>aws</category>
      <category>cloud</category>
    </item>
  </channel>
</rss>
