<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Tanmay Shukla</title>
    <description>The latest articles on DEV Community by Tanmay Shukla (@tanmaygi).</description>
    <link>https://dev.to/tanmaygi</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F683427%2F3adf1746-30ca-4ec2-8ac9-8ea73a9bd039.jpg</url>
      <title>DEV Community: Tanmay Shukla</title>
      <link>https://dev.to/tanmaygi</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/tanmaygi"/>
    <language>en</language>
    <item>
      <title>Setup CloudFront &amp; Amazon S3 to Deliver objects on the Web Apps (securely &amp; efficiently)</title>
      <dc:creator>Tanmay Shukla</dc:creator>
      <pubDate>Mon, 20 Mar 2023 12:08:15 +0000</pubDate>
      <link>https://dev.to/aws-builders/setup-cloudfront-amazon-s3-to-deliver-objects-on-the-web-apps-securely-efficiently-2gnk</link>
      <guid>https://dev.to/aws-builders/setup-cloudfront-amazon-s3-to-deliver-objects-on-the-web-apps-securely-efficiently-2gnk</guid>
      <description>&lt;h2&gt;
  
  
  What is S3 ?
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;AWS S3 (Simple storage service)&lt;/strong&gt; is an object storage service offering industry-leading scalability, data availability, security, and performance.&lt;/li&gt;
&lt;li&gt;Businesses can store and retrieve any amount of data, at any time, from anywhere on the web. S3 is highly durable and designed to provide 99.999999999% durability of objects over a given year, making it a reliable and secure solution for storing critical data.&lt;/li&gt;
&lt;li&gt;S3 is widely used for storing and serving objects on the internet, whether images, videos, or any other unstructured data. Using S3 together with CloudFront adds a number of benefits on top of that.&lt;/li&gt;
&lt;/ul&gt;
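&lt;p&gt;As a quick sanity check on what 99.999999999% (11 nines) durability means in practice, the back-of-the-envelope numbers below assume a simplified independent-loss model:&lt;/p&gt;

```shell
# Expected annual object loss at 11 nines of durability (simplified model)
python3 -c 'objects = 10_000_000
durability = 0.99999999999
losses_per_year = objects * (1 - durability)
print(f"{losses_per_year:.4f}")    # expected objects lost per year
print(round(1 / losses_per_year))  # years per single object lost, on average'
```

&lt;p&gt;In other words, storing ten million objects you would on average expect to lose a single object once every 10,000 years, which is the figure AWS itself quotes.&lt;/p&gt;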

&lt;h2&gt;
  
  
  What is Cloudfront ?
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;CloudFront is the AWS content delivery network (CDN) that speeds up distribution of your static and dynamic web content, such as .html, .css, .js, and image files, to your users.&lt;/li&gt;
&lt;li&gt;CloudFront delivers your content through a worldwide network of data centers called edge locations.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Here are some applications of CloudFront:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Content Delivery:&lt;/strong&gt; CloudFront accelerates the delivery of your content by caching your content at edge locations worldwide. This reduces the time it takes for users to access your content and improves the performance of your web applications.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Video Streaming:&lt;/strong&gt; CloudFront can deliver high-quality video streaming to users worldwide. It can also integrate with AWS Elemental Media Services to provide a complete video streaming solution.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Website Security:&lt;/strong&gt; CloudFront integrates with AWS Shield to provide protection against DDoS attacks. It also supports SSL/TLS encryption to secure your content in transit.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Global Applications:&lt;/strong&gt; CloudFront provides a global presence to your application by caching your content at edge locations worldwide. This allows your users to access your content from a location that is geographically closer to them, reducing latency and improving their experience.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;API Gateway:&lt;/strong&gt; CloudFront can be used as a custom origin for API Gateway. This allows you to cache API responses at edge locations, reducing the load on your API backend and improving the performance of your APIs.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl6tx6ld5z5j04b1g0h6z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl6tx6ld5z5j04b1g0h6z.png" alt="Image 2" width="577" height="140"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  Serving S3 objects with CloudFront has several benefits:
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fok1q7v3nvorgut3rvoys.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fok1q7v3nvorgut3rvoys.png" alt="Image 1" width="800" height="418"&gt;&lt;/a&gt; &lt;br&gt;
&lt;strong&gt;1.) Improved performance:&lt;/strong&gt; CloudFront is a content delivery network (CDN) that caches content at edge locations around the world. By serving S3 objects with CloudFront, you can improve the performance of your application by reducing latency and improving load times for users located far away from the S3 bucket.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2.) Reduced costs:&lt;/strong&gt; Serving S3 objects with CloudFront can help reduce your data transfer costs by caching frequently accessed content at edge locations. This reduces the number of requests made to the S3 bucket, which can help reduce your data transfer costs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3.) Improved security:&lt;/strong&gt; CloudFront can help improve the security of your S3 objects by enabling you to restrict access to your content using a variety of authentication methods, such as signed URLs or AWS Identity and Access Management (IAM) policies.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4.) Customization:&lt;/strong&gt; CloudFront provides a range of customization options, such as custom SSL certificates, custom error pages, and content compression, which can help you optimize the delivery of your S3 objects to your users.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5.) Scalability:&lt;/strong&gt; CloudFront is designed to handle high levels of traffic and can automatically scale to accommodate increases in demand. This can help ensure that your application remains highly available and responsive, even during periods of high traffic.&lt;/p&gt;

&lt;p&gt;Overall, serving S3 objects with CloudFront is a best practice for many AWS customers who need to distribute their content globally, improve the performance of their applications, and reduce their costs.&lt;/p&gt;

&lt;h2&gt;
  
  
  Set up S3 and CloudFront to serve objects through a web application
&lt;/h2&gt;

&lt;h3&gt;
  
  
  &lt;u&gt;&lt;strong&gt;Objective&lt;/strong&gt;&lt;/u&gt;
&lt;/h3&gt;

&lt;p&gt;1.) Sign in to AWS management console.&lt;/p&gt;

&lt;p&gt;2.) Create and setup the &lt;strong&gt;&lt;em&gt;S3 Bucket (Private)&lt;/em&gt;&lt;/strong&gt; to store objects.&lt;/p&gt;

&lt;p&gt;3.) Create the &lt;strong&gt;&lt;em&gt;CloudFront distribution&lt;/em&gt;&lt;/strong&gt; for your S3 bucket.&lt;/p&gt;

&lt;p&gt;4.) Set up the &lt;em&gt;&lt;strong&gt;IAM role&lt;/strong&gt;&lt;/em&gt; and permissions for the web application.&lt;/p&gt;

&lt;p&gt;5.) Finally, test the application!&lt;/p&gt;

&lt;h3&gt;
  
  
  Steps for Implementation of the project:
&lt;/h3&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;1.) Sign in to AWS management console.&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt; First, sign in to the AWS console with your username and password and select the appropriate region like us-east-1.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;2.) Create and setup the S3 Bucket (Private) to store objects.&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Let's search for S3 and open the S3 console. &lt;/li&gt;
&lt;li&gt;Create S3 bucket:

&lt;ul&gt;
&lt;li&gt; Give a &lt;strong&gt;&lt;code&gt;globally unique name&lt;/code&gt;&lt;/strong&gt; and region to your bucket.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fooduck2q1rmmdawewg66.png" alt="s3 1" width="739" height="311"&gt;
&lt;/li&gt;
&lt;li&gt;Set the &lt;strong&gt;&lt;code&gt;object ownership&lt;/code&gt;&lt;/strong&gt; and keep &lt;strong&gt;&lt;code&gt;ACLs disabled&lt;/code&gt;&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh38e23vmm8p3kr07gida.png" alt="s3 2" width="719" height="366"&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;Block all public access&lt;/code&gt;&lt;/strong&gt; -  It provides an additional layer of security for your data, preventing unauthorized access and ensuring that your data is protected at all times.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmnj8mbc3b6ujibr4j5s1.png" alt="s3 3" width="752" height="527"&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;Enable the versioning&lt;/code&gt;&lt;/strong&gt; -  With S3 versioning, every time an object is updated or deleted, a new version of the object is created, allowing users to access and restore previous versions of the object.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi07o21symo7a139j1ygb.png" alt="s3 4" width="722" height="193"&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;Enable the encryption&lt;/code&gt;&lt;/strong&gt; - With server-side encryption, Amazon S3 encrypts your objects before saving them on disks in AWS data centers and decrypts them when you download them. All Amazon S3 buckets have encryption configured by default, and all new objects uploaded to an S3 bucket are automatically encrypted at rest.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu9m3xmtdezu6bhdl3c20.png" alt="S3 5" width="732" height="283"&gt;
&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;&lt;code&gt;Object lock&lt;/code&gt;&lt;/strong&gt; - It helps to prevent accidental deletion or modification of objects. With Object Lock, you can set a retention period for objects in a bucket, during which time the objects cannot be deleted or modified. For now we'll keep this default.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpblc7r05jqkzjqxdnz7x.png" alt="S3 6" width="727" height="291"&gt;
&lt;/li&gt;
&lt;li&gt;Finally click on the &lt;strong&gt;&lt;code&gt;Create bucket&lt;/code&gt;&lt;/strong&gt; &lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Now let's upload some objects in S3 bucket for testing or project.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fue9ci2fjbxq5kpq4t1tg.png" alt="S3 7" width="800" height="480"&gt;
&lt;/li&gt;

&lt;/ul&gt;
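&lt;p&gt;The console steps above can also be sketched with the AWS CLI. This is a sketch only: the bucket name &lt;code&gt;cloudfront-demo1-s3&lt;/code&gt; is a placeholder (it must be globally unique), and in regions other than us-east-1, &lt;code&gt;create-bucket&lt;/code&gt; additionally needs a &lt;code&gt;--create-bucket-configuration LocationConstraint=...&lt;/code&gt; argument:&lt;/p&gt;

```shell
# Sketch only: requires configured AWS credentials.
BUCKET=cloudfront-demo1-s3   # placeholder; must be globally unique
aws s3api create-bucket --bucket "$BUCKET" --region us-east-1
# Block all public access (the bucket stays private; CloudFront will front it)
aws s3api put-public-access-block --bucket "$BUCKET" \
  --public-access-block-configuration \
  BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
# Enable versioning and default SSE-S3 encryption
aws s3api put-bucket-versioning --bucket "$BUCKET" --versioning-configuration Status=Enabled
aws s3api put-bucket-encryption --bucket "$BUCKET" --server-side-encryption-configuration \
  '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"AES256"}}]}'
# Upload a test object
aws s3 cp ./index.html "s3://$BUCKET/index.html"
```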

&lt;h4&gt;
  
  
  &lt;strong&gt;3.) Create the &lt;em&gt;CloudFront distribution&lt;/em&gt; for your S3 bucket.&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Search for CloudFront in the AWS console search bar &amp;amp; open the CloudFront dashboard. &lt;/li&gt;
&lt;li&gt;Select the &lt;strong&gt;Origin domain&lt;/strong&gt; of your bucket from drop down &amp;amp; give the name of the cloudfront distribution. 
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo4ze2vs0vmlc4u66xvd6.png" alt="cf 1" width="658" height="458"&gt;
&lt;/li&gt;
&lt;li&gt;Set the &lt;strong&gt;origin access&lt;/strong&gt; as &lt;strong&gt;Origin access control settings&lt;/strong&gt;. 
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmq3nekrwncpfjvi2ffuq.png" alt="cf 2" width="800" height="495"&gt;
&lt;/li&gt;
&lt;li&gt;We also need to create the &lt;em&gt;&lt;code&gt;access control setting&lt;/code&gt;&lt;/em&gt; for this CloudFront distribution, like below:
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff6br0xsqf3hxcpe3hq5n.png" alt="cf 3" width="587" height="606"&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;Enable Origin Shield&lt;/code&gt;&lt;/strong&gt;, as it helps minimize your origin’s load, improve its availability, and reduce its operating costs.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdryiseq83g6speh9sjay.png" alt="cf 4" width="795" height="405"&gt;
&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;Default cache behaviour&lt;/strong&gt; settings, set the configurations as below; otherwise keep the defaults.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjdvk46t7qv6gggtorqdv.png" alt="cf 5" width="797" height="1136"&gt;
&lt;/li&gt;
&lt;li&gt;Let &lt;strong&gt;Function associations - optional&lt;/strong&gt; be as &lt;em&gt;default&lt;/em&gt; for now.&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;final section&lt;/strong&gt;, i.e. Settings, choose according to your use case and requirements, for example:

&lt;ul&gt;
&lt;li&gt;Here we need to choose the &lt;strong&gt;price class&lt;/strong&gt; and the AWS WAF web ACL we want to associate with the distribution.&lt;/li&gt;
&lt;li&gt;If we want to add a &lt;strong&gt;custom domain&lt;/strong&gt; to the CloudFront distribution, we can add it here.&lt;/li&gt;
&lt;li&gt;In &lt;strong&gt;default root object&lt;/strong&gt;, the object you specify is returned when a user requests the root URL.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxui9k6gn7qq9wme7ns6j.png" alt="cf 6" width="800" height="1062"&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Once the &lt;strong&gt;CloudFront distribution is created&lt;/strong&gt;, we need to add the generated policy to the respective &lt;strong&gt;&lt;code&gt;S3 bucket policy&lt;/code&gt;&lt;/strong&gt;. Copy the bucket policy from here.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa2f6fy1op406s2zqqukt.png" alt="Cf 7" width="800" height="300"&gt;
&lt;/li&gt;

&lt;li&gt;Then &lt;strong&gt;&lt;code&gt;go to S3 bucket &amp;gt; Permissions &amp;gt; Edit bucket policy&lt;/code&gt;&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5pqhpu8zx97unkkkgbf9.png" alt="cf 8" width="800" height="439"&gt;
&lt;/li&gt;

&lt;/ul&gt;
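&lt;p&gt;For reference, the policy CloudFront generates for Origin Access Control has roughly the shape below (&lt;code&gt;ACCOUNT_ID&lt;/code&gt; and &lt;code&gt;DISTRIBUTION_ID&lt;/code&gt; are placeholders, and the bucket name follows this walkthrough): it allows the CloudFront service principal to read objects, but only on behalf of your specific distribution.&lt;/p&gt;

```shell
# Write the policy to a file and validate that it is well-formed JSON.
printf '%s\n' '{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "AllowCloudFrontServicePrincipal",
    "Effect": "Allow",
    "Principal": { "Service": "cloudfront.amazonaws.com" },
    "Action": "s3:GetObject",
    "Resource": "arn:aws:s3:::cloudfront-demo1-s3/*",
    "Condition": { "StringEquals": {
      "AWS:SourceArn": "arn:aws:cloudfront::ACCOUNT_ID:distribution/DISTRIBUTION_ID"
    } }
  }]
}' > /tmp/bucket-policy.json
python3 -m json.tool /tmp/bucket-policy.json
```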

&lt;h4&gt;
  
  
  &lt;strong&gt;4.) Set up the IAM role and permissions for the web application.&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiqyqdb0rylspntq40s16.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiqyqdb0rylspntq40s16.png" alt="Image 232" width="715" height="271"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;So now, you can use this CloudFront distribution in your applications. 
Applications usually require &lt;code&gt;PutObject&lt;/code&gt;, &lt;code&gt;GetObject&lt;/code&gt; and &lt;code&gt;DeleteObject&lt;/code&gt; permissions on the S3 bucket; grant these to your application via an &lt;em&gt;&lt;strong&gt;IAM user or role (recommended)&lt;/strong&gt;&lt;/em&gt;.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Get operation (read access for web app users only, not anonymous users)&lt;/strong&gt; will be handled by CloudFront.&lt;/li&gt;
&lt;li&gt; For PutObject and DeleteObject, you need to create an IAM role and attach the IAM policy below.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::cloudfront-demo1-s3/*",
            ]
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt; Now, assign the &lt;strong&gt;IAM role&lt;/strong&gt; to the web application server and you are good to go.&lt;/li&gt;
&lt;/ul&gt;
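&lt;p&gt;If your web application runs on EC2, the role setup can be sketched with the AWS CLI like this (the role, profile and policy names are hypothetical; &lt;code&gt;/tmp/webapp-s3-policy.json&lt;/code&gt; stands for a file containing the PutObject/DeleteObject policy above):&lt;/p&gt;

```shell
# Sketch only: requires configured AWS credentials.
# Trust policy letting EC2 instances assume the role
printf '%s\n' '{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Service": "ec2.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }]
}' > /tmp/trust-policy.json
aws iam create-role --role-name webapp-s3-role \
  --assume-role-policy-document file:///tmp/trust-policy.json
# Attach the PutObject/DeleteObject policy as an inline policy
aws iam put-role-policy --role-name webapp-s3-role \
  --policy-name webapp-s3-write --policy-document file:///tmp/webapp-s3-policy.json
# Then attach the role to the instance via an instance profile
aws iam create-instance-profile --instance-profile-name webapp-s3-profile
aws iam add-role-to-instance-profile --instance-profile-name webapp-s3-profile \
  --role-name webapp-s3-role
```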

&lt;h4&gt;
  
  
  &lt;strong&gt;5.) Finally, test the Application !!&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt; Now we can test the CloudFront domain that is set up to deliver objects from the private S3 bucket.&lt;/li&gt;
&lt;li&gt;Hitting &lt;code&gt;https://Distribution-domain-name/s3-object-name&lt;/code&gt; returns the object, as below.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fre7p0cj9qxod1ku4rvht.png" alt="Cf 9" width="800" height="367"&gt;
&lt;/li&gt;
&lt;li&gt;If we hit just the domain &lt;code&gt;https://Distribution-domain-name&lt;/code&gt;, it shows the object we set as the default root object.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2l61gahqnsglein28t49.png" alt="cf 10" width="800" height="429"&gt;
&lt;/li&gt;
&lt;/ul&gt;
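&lt;p&gt;From a terminal, the same check can be sketched with &lt;code&gt;curl&lt;/code&gt; (&lt;code&gt;DISTRIBUTION_DOMAIN&lt;/code&gt; is a placeholder for your real &lt;code&gt;dxxxxxxxxxxxx.cloudfront.net&lt;/code&gt; domain, so this needs network access to run):&lt;/p&gt;

```shell
# Sketch only: replace DISTRIBUTION_DOMAIN with your distribution's domain name.
curl -sI https://DISTRIBUTION_DOMAIN/s3-object-name | grep -iE 'HTTP|x-cache'
# On the first request X-Cache is typically "Miss from cloudfront";
# repeating the request should show "Hit from cloudfront" once the edge has cached it.
```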

&lt;h3&gt;
  
  
  🚩🚩 Note: Some helpful Links:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://catalog.us-east-1.prod.workshops.aws/workshops/9331108e-464e-4699-8a9c-486090105878/en-US" rel="noopener noreferrer"&gt;Accelerate your content using Amazon CloudFront&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://catalog.us-east-1.prod.workshops.aws/workshops/4557215e-2a5c-4522-a69b-8d058aba088c/en-US/basic-configuration/create-cloudfront-distribution" rel="noopener noreferrer"&gt;Improve Your Architecture With Amazon CloudFront&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.wellarchitectedlabs.com/security/200_labs/200_automated_deployment_of_web_application_firewall/2_config_cloudfront/" rel="noopener noreferrer"&gt;CONFIGURE AMAZON CLOUDFRONT&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-service.html" rel="noopener noreferrer"&gt;Creating Role and setting permissions&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/cloudfront/getting-started/S3/" rel="noopener noreferrer"&gt;Set up a CloudFront distribution for Amazon S3&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffwfpcptd9hu6djzmrk4e.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffwfpcptd9hu6djzmrk4e.gif" width="400" height="168"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>performance</category>
      <category>scalability</category>
      <category>monitoring</category>
    </item>
    <item>
      <title>Jenkins 101 - Beginner to Advanced 🚀🚀 (PART -1/2)</title>
      <dc:creator>Tanmay Shukla</dc:creator>
      <pubDate>Tue, 14 Mar 2023 15:41:03 +0000</pubDate>
      <link>https://dev.to/tanmaygi/jenkins-101-beginner-to-advanced-part-12-5h25</link>
      <guid>https://dev.to/tanmaygi/jenkins-101-beginner-to-advanced-part-12-5h25</guid>
      <description>&lt;h3&gt;
  
  
  Introduction
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Jenkins&lt;/strong&gt; is a free and open-source automation server. It helps automate the parts of software development related to building, testing, and deploying, facilitating continuous integration and continuous delivery.&lt;/p&gt;

&lt;h3&gt;
  
  
  Installation of Jenkins
&lt;/h3&gt;

&lt;p&gt;There are multiple ways to install Jenkins:&lt;/p&gt;

&lt;h4&gt;
  
  
  1. Jenkins On &lt;a href="https://phoenixnap.com/kb/install-jenkins-ubuntu" rel="noopener noreferrer"&gt;Ubuntu&lt;/a&gt;:
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;First, &lt;a href="https://mkyong.com/java/how-to-install-java-jdk-on-ubuntu-linux/" rel="noopener noreferrer"&gt;install java&lt;/a&gt;:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt-get update
sudo apt-get -y upgrade
sudo apt search openjdk
sudo apt install openjdk-11-jdk
java -version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Now go to &lt;a href="https://pkg.jenkins.io/debian-stable/" rel="noopener noreferrer"&gt;Jenkins download&lt;/a&gt; and follow instructions. Then run:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo systemctl status jenkins
sudo systemctl enable jenkins
sudo systemctl start jenkins
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Finally go to:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;http://ip_address_or_domain:8080
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3vr2wjk9ovfrhmsuarcw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3vr2wjk9ovfrhmsuarcw.png" alt="jeniks unlock"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run this to get the initial admin password:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo cat /var/lib/jenkins/secrets/initialAdminPassword
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Select like below: 
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft8icu9bkeukrw4kzzxnn.png" alt="Image depluginn"&gt;
&lt;/li&gt;
&lt;li&gt;Create a first admin user
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flx5vfv5a9vo8hhb68cm9.png" alt="Image descriegegption"&gt;
&lt;/li&gt;
&lt;li&gt;Now your Jenkins is ready to use. Welcome to the world of Jenkins:
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcag74mla6tr2hzt0imzq.png" alt="Imajenk"&gt;
Note: Make sure port 8080 is open (if using EC2, allow a custom TCP rule for port 8080 in your security group's inbound rules). &lt;/li&gt;
&lt;/ul&gt;
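&lt;p&gt;On EC2, the inbound rule from the note above can be sketched with the AWS CLI (the security group ID is a placeholder):&lt;/p&gt;

```shell
# Sketch only: requires configured AWS credentials and your real security group ID.
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp --port 8080 --cidr 0.0.0.0/0   # consider restricting the CIDR
```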

&lt;h4&gt;
  
  
  2. On Docker container:
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt; Install docker:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt-get update
sudo apt-get -y upgrade
sudo apt install docker.io
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Now, to download the Jenkins image and run it as a container, we need to run the following command:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -p 8080:8080 -p 50000:50000 -d -v jenkins_home:/var/jenkins_home
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;8080:8080&lt;/strong&gt; --&amp;gt; publishes the port (binds the host (server) port 8080 to the Jenkins container port 8080) 
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzebvxak004cpnc7e48jc.png" alt="Image jenkins"&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;50000:50000&lt;/strong&gt; --&amp;gt; the port where the Jenkins master and worker agents communicate (Jenkins can be built and started as a cluster). If you run large workloads with Jenkins, communication between the master and worker nodes happens on this port. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;-d&lt;/code&gt; : To run the container in detached mode (background)&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-v&lt;/code&gt; : To mount the named volume jenkins_home
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcd2bwy5nxgsil82h0uc9.png" alt="Image volume"&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;jenkins/jenkins:lts&lt;/code&gt; : &lt;a href="https://hub.docker.com/r/jenkins/jenkins" rel="noopener noreferrer"&gt;Official docker image for jenkins&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Now check our Jenkins container with &lt;code&gt;docker ps&lt;/code&gt;:
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz3uyu39ho41910n1yux5.png" alt="Imdocker ps"&gt;
&lt;/li&gt;
&lt;li&gt;Finally, please go to: &lt;code&gt;http://ip_address_or_domain:8080&lt;/code&gt; &lt;/li&gt;
&lt;li&gt;After opening the Jenkins dashboard, follow the above steps.&lt;/li&gt;
&lt;li&gt;Run &lt;code&gt;docker exec -it &amp;lt;container_id&amp;gt; bash&lt;/code&gt; --&amp;gt; to run a command in a running container (i.e. log in as the jenkins user)&lt;/li&gt;
&lt;li&gt;Run &lt;code&gt;docker volume inspect jenkins_home&lt;/code&gt; --&amp;gt; to get info about our created volume.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
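&lt;p&gt;Putting the pieces above together, a complete form of the command (with the container named for convenience) might look like this sketch:&lt;/p&gt;

```shell
# Sketch: requires Docker to be installed and running.
docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  jenkins/jenkins:lts
# The initial admin password can then be read from inside the container:
docker exec jenkins cat /var/jenkins_home/secrets/initialAdminPassword
```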

&lt;h4&gt;
  
  
  3. On &lt;a href="https://www.jenkins.io/doc/tutorials/tutorial-for-installing-jenkins-on-AWS/" rel="noopener noreferrer"&gt;Amazon Linux 2&lt;/a&gt;/ &lt;a href="https://phoenixnap.com/kb/how-to-install-jenkins-on-centos-8" rel="noopener noreferrer"&gt;CentOS&lt;/a&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt; First install java: 
&lt;code&gt;sudo yum update -y
sudo yum install java-1.8.0-openjdk&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Now download latest jenkins package:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo wget -O /etc/yum.repos.d/jenkins.repo https://pkg.jenkins.io/redhat/jenkins.repo
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Import a key file from Jenkins-CI to enable installation from the package:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo rpm --import https://pkg.jenkins.io/redhat-stable/jenkins.io.key
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Install Jenkins
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo yum install jenkins -y
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Enable the Jenkins service to start at boot:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo systemctl enable jenkins
sudo systemctl start jenkins
sudo systemctl status jenkins
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  ✨✨Note: For CentOS, make sure to configure the firewall, as the Jenkins service uses port &lt;strong&gt;8080&lt;/strong&gt; to communicate. If you’re using the default firewalld service, enter the following commands to allow access:
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo firewall-cmd ––permanent ––zone=public ––add-port=8080/tcp
sudo firewall-cmd ––reload
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Open a web browser, and enter the following URL: &lt;code&gt;http://ip_address_or_domain:8080&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  4. Jenkins on Windows 10:
&lt;/h4&gt;

&lt;p&gt;While it is recommended to install the Jenkins controller on a Linux-based server (Windows has some complications, such as file-locking semantics), you can also install it on a local &lt;strong&gt;Windows&lt;/strong&gt; machine.&lt;br&gt;
Just go through these videos to get up &amp;amp; running:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.youtube.com/watch?v=XuMrEDA8cAI&amp;amp;t=409s" rel="noopener noreferrer"&gt;Install jenkins on Windows server 2022&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.youtube.com/watch?v=I--rP8MdQFE" rel="noopener noreferrer"&gt;Install jenkins on local windows 10&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://phoenixnap.com/kb/install-jenkins-on-windows" rel="noopener noreferrer"&gt;Blog on jenkins on Windows 10&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  5. Running jenkins as a WAR(&lt;a href="https://www.javatpoint.com/war-file" rel="noopener noreferrer"&gt;Web Application Resource or Archive&lt;/a&gt;) file
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Before installing Jenkins this way, note the following:

&lt;ul&gt;
&lt;li&gt;Running Jenkins from the WAR file is platform/OS independent. &lt;/li&gt;
&lt;li&gt;It only requires a JRE or JDK to be installed on the target machine.&lt;/li&gt;
&lt;li&gt;Only Java 8 and 11 are supported.&lt;/li&gt;
&lt;/ul&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Go to the &lt;a href="https://www.jenkins.io/download/" rel="noopener noreferrer"&gt;jenkins download page&lt;/a&gt; and copy the link to the Generic Java package (.war) file. &lt;/li&gt;

&lt;li&gt;Then use &lt;code&gt;wget &amp;lt;jenkins.war link&amp;gt;&lt;/code&gt; to download the war file inside your home directory.&lt;/li&gt;

&lt;li&gt;Now all you need to do to run Jenkins is invoke Java:
&lt;/li&gt;

&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;java -jar jenkins.war
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Finally go to
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;localhost or &amp;lt;your_ip_address&amp;gt;:8080
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;since Jenkins runs on port 8080 by default. Your Jenkins instance is now up and running.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg7cvdj7n96s1eqlmflsw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg7cvdj7n96s1eqlmflsw.png" alt="jenkins by war"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Note:
&lt;/h2&gt;

&lt;p&gt;We can also do this another way by &lt;a href="https://www.edureka.co/blog/install-jenkins/" rel="noopener noreferrer"&gt;installing Tomcat and then deploying the WAR file&lt;/a&gt;, since when we eventually run Java web applications on our servers we often need Tomcat (Apache Tomcat is a web server and servlet container used to deploy and serve Java web applications).&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;sudo apt install openjdk-11-jdk&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;sudo apt install wget&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Install Tomcat 9 with
&lt;code&gt;wget https://archive.apache.org/dist/tomcat/tomcat-9/v9.0.0.M10/bin/apache-tomcat-9.0.0.M10.tar.gz&lt;/code&gt;

&lt;ul&gt;
&lt;li&gt;Extract the archive: &lt;code&gt;tar xzf apache-tomcat-9.0.0.M10.tar.gz&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;To keep it simple, we will rename the extracted directory to tomcat9 using the mv command: 
&lt;code&gt;mv apache-tomcat-9.0.0.M10 tomcat9&lt;/code&gt;
&lt;/li&gt;

&lt;li&gt;Our next step is to provide a username and password for Apache Tomcat ---&amp;gt; &lt;code&gt;vim /home/edureka/tomcat9/conf/tomcat-users.xml&lt;/code&gt;
&lt;/li&gt;

&lt;li&gt;Now replace the tomcat-users.xml content with the following to create the Tomcat users:
&lt;/li&gt;

&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;?xml version='1.0' encoding='utf-8'?&amp;gt;
&amp;lt;tomcat-users&amp;gt;
    &amp;lt;role rolename="manager-gui"/&amp;gt;
    &amp;lt;role rolename="manager-script"/&amp;gt;
    &amp;lt;role rolename="manager-jmx"/&amp;gt;
    &amp;lt;role rolename="manager-jmx"/&amp;gt;
    &amp;lt;role rolename="admin-gui"/&amp;gt;
    &amp;lt;role rolename="admin-script"/&amp;gt;
    &amp;lt;user username="user1" password="password1" roles="manager-gui,manager-script,manager-jmx,manager-status,admin-gui,admin-script"/&amp;gt;
&amp;lt;/tomcat-users&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the above code, as you can see, I have defined several roles, and for all of them I have given one single username, &lt;code&gt;user1&lt;/code&gt;, and one password, i.e. &lt;code&gt;password1&lt;/code&gt;. If you want to assign a different username and password to different roles, you can do that as well.&lt;br&gt;
Now save and close the file to go back to the terminal.&lt;/p&gt;
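&lt;p&gt;One easy mistake in this file is assigning a user a role that is never declared with a &lt;code&gt;role&lt;/code&gt; element. A few lines of Python (a quick sketch, with the role names copied from the snippet above) can cross-check the two lists:&lt;/p&gt;

```python
# Roles declared by role elements in tomcat-users.xml.
declared = {"manager-gui", "manager-script", "manager-jmx",
            "manager-status", "admin-gui", "admin-script"}

# The comma-separated roles attribute assigned to user1.
assigned = set("manager-gui,manager-script,manager-jmx,manager-status,"
               "admin-gui,admin-script".split(","))

# Any role assigned but never declared is a configuration bug.
undeclared = sorted(assigned - declared)
print("assigned but not declared:", undeclared)
```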

&lt;ul&gt;
&lt;li&gt;We need to start Apache Tomcat now, but before that, change into the tomcat9 directory:
&lt;code&gt;cd tomcat9&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;To start Tomcat use the below command:
&lt;code&gt;./bin/startup.sh&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Now we need to download the Jenkins WAR file into Tomcat's &lt;code&gt;webapps&lt;/code&gt; directory so that Tomcat can deploy it.&lt;/li&gt;
&lt;/ul&gt;
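&lt;p&gt;The last step above can be sketched in a few lines of Python: the WAR just needs to land in Tomcat's &lt;code&gt;webapps&lt;/code&gt; directory, from where Tomcat deploys it automatically. The URL and paths below are assumptions; use the link you copied from the Jenkins download page and your own Tomcat directory.&lt;/p&gt;

```python
import pathlib
import urllib.request

# Assumed values: substitute the WAR link you copied and your Tomcat path.
WAR_URL = "https://get.jenkins.io/war-stable/latest/jenkins.war"
WEBAPPS = pathlib.Path("tomcat9/webapps")

def deploy_jenkins(url, webapps_dir):
    """Download the Jenkins WAR into Tomcat's webapps dir; Tomcat auto-deploys it."""
    webapps_dir.mkdir(parents=True, exist_ok=True)
    urllib.request.urlretrieve(url, str(webapps_dir / "jenkins.war"))

# deploy_jenkins(WAR_URL, WEBAPPS)  # then browse to localhost:8080/jenkins
```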




&lt;h3&gt;
  
  
  JENKINS FUNDAMENTALS
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Roles&lt;/strong&gt; in Jenkins: Jenkins has two kinds of roles.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn2gtpxwdnfugnarfa1e2.png" alt="Image roles"&gt;
&lt;/li&gt;
&lt;li&gt;We create jobs in Jenkins to automate application workflows.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Build Tools:&lt;/strong&gt; Maven (Java apps), Gradle, NPM (Node.js apps)

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.digitalocean.com/community/tutorials/how-to-install-node-js-on-ubuntu-20-04" rel="noopener noreferrer"&gt;Install Node.js and NPM on Ubuntu&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Create simple freestyle jobs and Configure Git Repository.&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Run tests and build a Java application
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fenq67iky91x4p1mezix0.png" alt="run tests"&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>devops</category>
      <category>jenkins</category>
      <category>cloud</category>
      <category>aws</category>
    </item>
    <item>
      <title>Automating the deletion of specific inbound rules from any security groups in AWS via Config</title>
      <dc:creator>Tanmay Shukla</dc:creator>
      <pubDate>Mon, 13 Feb 2023 23:29:36 +0000</pubDate>
      <link>https://dev.to/aws-builders/automating-the-deletion-of-specifc-inbound-rules-from-any-security-groups-in-aws-via-config-21lb</link>
      <guid>https://dev.to/aws-builders/automating-the-deletion-of-specifc-inbound-rules-from-any-security-groups-in-aws-via-config-21lb</guid>
      <description>&lt;h3&gt;
  
  
  Hello everyone 👋👋. Thanks for taking the time to read my blog. I hope you'll end up with a little more knowledge and practical experience after completing this one.
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Here is the simple workflow to understand what we want to achieve here.&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fri02sydea7h8waqg0zp5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fri02sydea7h8waqg0zp5.png" alt="Image Workflow"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As much fun as it is to create resources in a cloud like AWS, it is also very important to keep your security checks in place, because even one security breach can be enough to take your cloud infrastructure down. &lt;br&gt;
So let me tell you a little story. As a developer I used to create applications on &lt;strong&gt;Elastic Beanstalk&lt;/strong&gt;, an AWS managed service, since it is very easy to create and manage. After a while I noticed that Elastic Beanstalk creates the whole thing as a CloudFormation stack. One of the drawbacks of this is that it created the EC2 server with default security group rules (like &lt;strong&gt;SSH (port 22)&lt;/strong&gt; open to anywhere, i.e. &lt;strong&gt;0.0.0.0/0&lt;/strong&gt;, which is a BIG NO-NO!!)&lt;br&gt;
So every time EB creates, rebuilds, or updates the security groups, this SSH rule comes back by default. Earlier I used to manually delete those SSH rules after a warning from Trusted Advisor, which is tedious but, more importantly, risky, as we never know what could happen if we leave the SSH port open for even 5 minutes.&lt;/p&gt;

&lt;p&gt;I then found a way to set a rule in &lt;strong&gt;AWS Config&lt;/strong&gt; (a service that helps you assess, audit, and evaluate the configurations and relationships of your resources) that automatically deletes such rules with AWS SSM (Systems Manager) whenever Config detects them in any of your security groups. So let's get started and understand the whole process.&lt;/p&gt;

&lt;p&gt;AWS Config continuously monitors your resources and their configuration, and takes remediation actions based on Config rules.&lt;/p&gt;
&lt;h3&gt;
  
  
  Create the IAM Role for Systems Manager (AWS SSM)
&lt;/h3&gt;

&lt;p&gt;1.) Go to &lt;strong&gt;AWS Management Console&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;2.) Create an IAM role with permissions that &lt;strong&gt;allow SSM to call AWS services on your behalf.&lt;/strong&gt;&lt;br&gt;
 Click on Create role, then select AWS service.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjkfurdjld4ebm3zz24y7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjkfurdjld4ebm3zz24y7.png" alt="Image 2"&gt;&lt;/a&gt;&lt;br&gt;
and the use case as &lt;strong&gt;Systems Manager.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faph898obkdg6qz3fgphp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faph898obkdg6qz3fgphp.png" alt="Image 3"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the next step, choose the &lt;strong&gt;AmazonSSMAutomationRole&lt;/strong&gt; policy from the list.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0mx9i8cbho11k9afrphz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0mx9i8cbho11k9afrphz.png" alt="Image 4"&gt;&lt;/a&gt;&lt;br&gt;
3.) Finally, give the role the name &lt;strong&gt;SSMAutomationRoleForConfig&lt;/strong&gt;, review its permissions, and create the role.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0u1ao292a44hnps5h60a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0u1ao292a44hnps5h60a.png" alt="Image 5"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;4.) After this, attach one inline policy giving the role access to revoke security group rules. Below is the IAM policy for that.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "ec2:RevokeSecurityGroupIngress",
                "ec2:DescribeSecurityGroups"
            ],
            "Resource": "*"
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
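&lt;p&gt;If you prefer doing this step from code rather than the console, the same inline policy can be attached with boto3, the AWS SDK for Python. This is a hedged sketch, not a tested deployment script: the inline policy name below is my own choice, and the commented-out call assumes valid AWS credentials.&lt;/p&gt;

```python
import json

# The same policy document as shown above.
POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "ec2:RevokeSecurityGroupIngress",
                "ec2:DescribeSecurityGroups",
            ],
            "Resource": "*",
        }
    ],
}

def attach_inline_policy(iam_client, role_name="SSMAutomationRoleForConfig"):
    """Attach the policy above to the role as an inline policy."""
    iam_client.put_role_policy(
        RoleName=role_name,
        PolicyName="AllowRevokeSecurityGroupIngress",  # hypothetical policy name
        PolicyDocument=json.dumps(POLICY),
    )

# import boto3
# attach_inline_policy(boto3.client("iam"))
```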



&lt;h3&gt;
  
  
  Set up the AWS Config rule for restricting SSH access.
&lt;/h3&gt;

&lt;p&gt;5.) Go to AWS Config, click on &lt;strong&gt;Add Rule.&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw0rr8aevivzq2pua0r49.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw0rr8aevivzq2pua0r49.png" alt="Image 6"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;6.) Now select the rule type as &lt;strong&gt;AWS managed rule&lt;/strong&gt; and choose &lt;strong&gt;restricted-ssh&lt;/strong&gt; from the rules list.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhn2kbx6q0cj2dotx1fep.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhn2kbx6q0cj2dotx1fep.png" alt="Image 7"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;7.) Configure the rule and give it an intuitive name like &lt;strong&gt;restricted-ssh&lt;/strong&gt;,&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fshd9wgeir5tvfridqrzv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fshd9wgeir5tvfridqrzv.png" alt="Image 8"&gt;&lt;/a&gt;&lt;br&gt;
then choose &lt;strong&gt;Security groups&lt;/strong&gt; as the resources in the scope-of-changes section under the evaluation mode.&lt;/p&gt;

&lt;p&gt;You can leave the other optional settings, like parameters and tags, at their defaults. &lt;br&gt;
Finally, review the Config rule and click on Add Rule once done.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb44hgw7qmyoqpb5lgry1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb44hgw7qmyoqpb5lgry1.png" alt="Image 9"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Manage the remediation of the Config rule.
&lt;/h3&gt;

&lt;p&gt;8.) After creating the rule, select it, and under &lt;strong&gt;Actions&lt;/strong&gt; click on &lt;strong&gt;Manage remediation.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fobdxzq2djxcp8160znog.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fobdxzq2djxcp8160znog.png" alt="Image 10"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now set the remediation method to automatic. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwqsbh7a4y4m9ou1nr4z2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwqsbh7a4y4m9ou1nr4z2.png" alt="Image 4rtwfe"&gt;&lt;/a&gt;&lt;br&gt;
Then choose the remediation action &lt;strong&gt;AWS-DisableIncomingSSHOnPort22&lt;/strong&gt;. This will disable &lt;strong&gt;unrestricted incoming SSH traffic&lt;/strong&gt; on port 22 for EC2 security groups.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp98l0gfqpf7wznr0fynn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp98l0gfqpf7wznr0fynn.png" alt="Image dsanc"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can update the rate limits as per your use case or leave them at the defaults for now. In the resource ID parameter, select &lt;strong&gt;SecurityGroupIds&lt;/strong&gt; (alternatively, you can pass the resource ID of noncompliant resources to a remediation action by choosing a parameter that depends on the resource type). Here, though, we want to do it for all security groups.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi63h186n04j8674y817f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi63h186n04j8674y817f.png" alt="Image 11"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the next section of parameters, &lt;strong&gt;SecurityGroupIds&lt;/strong&gt; will be greyed out, so no need to worry about that. In &lt;strong&gt;AutomationAssumeRole&lt;/strong&gt;, put the ARN of the role you created above, i.e. &lt;strong&gt;SSMAutomationRoleForConfig&lt;/strong&gt; in our case.&lt;/p&gt;

&lt;p&gt;Finally review everything and &lt;strong&gt;Save Changes.&lt;/strong&gt;&lt;/p&gt;
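&lt;p&gt;For reference, the same remediation can also be kicked off manually through the SSM API. Below is a small boto3 sketch that builds the request; the parameter names follow the AWS-DisableIncomingSSHOnPort22 automation document as I understand it, so verify them in the SSM console before relying on this.&lt;/p&gt;

```python
def build_automation_request(group_id, role_arn):
    """Request body for ssm.start_automation_execution against the SSH document."""
    return {
        "DocumentName": "AWS-DisableIncomingSSHOnPort22",
        "Parameters": {
            # SSM automation parameters are passed as lists of strings.
            "SecurityGroupIds": [group_id],
            "AutomationAssumeRole": [role_arn],
        },
    }

# import boto3
# ssm = boto3.client("ssm")
# ssm.start_automation_execution(**build_automation_request(
#     "sg-0123456789abcdef0",
#     "arn:aws:iam::123456789012:role/SSMAutomationRoleForConfig"))
```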

&lt;h3&gt;
  
  
  Test if it really works 🤔.
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Go to the EC2 dashboard, select any instance, and edit the inbound rules of its security group to allow 🚨🚨 &lt;strong&gt;SSH (port 22) from 0.0.0.0/0. (Warning: please don't do this with production or otherwise important resources; use only test servers that have nothing that could be compromised.)&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F88tcuxlpjza46z0s4xo6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F88tcuxlpjza46z0s4xo6.png" alt="Image 12"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Suppose somebody adds an SSH (port 22) inbound rule to a critical server so that they can SSH into it and take away data or harm the application in some way. &lt;/li&gt;
&lt;li&gt;&lt;p&gt;Even if it is only for 5 minutes, we wouldn't even know that someone had SSHed into the virtual machine and taken data, and no alarm would be raised either. With Config, we can completely resolve this issue and secure our AWS cloud infrastructure.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;After a while you will see the remediation, as below. &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
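&lt;p&gt;Under the hood, the restricted-ssh check boils down to scanning each security group's IpPermissions for TCP port 22 open to 0.0.0.0/0. Here is a simplified Python sketch of that logic (the real managed rule also handles IPv6 ranges and other cases, so treat this as an approximation for intuition only):&lt;/p&gt;

```python
def rule_opens_ssh(perm):
    """True if one IpPermission entry (as returned by ec2.describe_security_groups)
    opens TCP port 22 to the whole internet."""
    proto = perm.get("IpProtocol")
    if proto not in ("tcp", "-1"):
        return False
    if proto == "-1":
        covers_22 = True  # "-1" means all traffic, no port range
    else:
        from_port = perm.get("FromPort", 0)
        to_port = perm.get("ToPort", 65535)
        # from_port must be at most 22 and to_port at least 22
        covers_22 = from_port in range(0, 23) and to_port in range(22, 65536)
    open_world = any(r.get("CidrIp") == "0.0.0.0/0" for r in perm.get("IpRanges", []))
    return covers_22 and open_world

bad_rule = {"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
            "IpRanges": [{"CidrIp": "0.0.0.0/0"}]}
print(rule_opens_ssh(bad_rule))  # the offending rule we just added above
```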

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ej4hwlj3s0r4cjrnrdf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ej4hwlj3s0r4cjrnrdf.png" alt="Image edfgsic"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fruuj6o3fn7danber09ws.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fruuj6o3fn7danber09ws.png" alt="Image dejjnjnon"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffj9yfghjtxafp15z65as.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffj9yfghjtxafp15z65as.png" alt="Image 342"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And you'll find that the SSH rule has been deleted by the Config rule via the AWS SSM &lt;strong&gt;StartAutomationExecution&lt;/strong&gt; API.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flnc1ui2scfio84oco432.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flnc1ui2scfio84oco432.png" alt="Image ddsfsvon"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  BONUS:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt; If you want to set up alerts and notifications when Config finds any resource non-compliant:
&lt;a href="https://aws.amazon.com/premiumsupport/knowledge-center/config-resource-non-compliant/" rel="noopener noreferrer"&gt;Get notified when an AWS resource is non-compliant using AWS Config?
&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;If you run into any problem while setting this up, you can &lt;a href="https://aws.amazon.com/premiumsupport/knowledge-center/config-remediation-executions/" rel="noopener noreferrer"&gt;troubleshoot failed remediation actions in AWS Config&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you find any issue while doing this, please let me know in the comments, or you can connect with me directly. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.linkedin.com/in/tanmay-shukla/" rel="noopener noreferrer"&gt;Linkedin&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="//tanmayshukla.bio.link"&gt;Bio.link Profile&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>security</category>
      <category>ai</category>
    </item>
    <item>
      <title>Learn AWS (Amazon web services) for FREE😎😎 in 2023 (Updated)</title>
      <dc:creator>Tanmay Shukla</dc:creator>
      <pubDate>Wed, 10 Aug 2022 13:10:00 +0000</pubDate>
      <link>https://dev.to/tanmaygi/learn-aws-amazon-web-services-for-free-in-2022-updated-4m0a</link>
      <guid>https://dev.to/tanmaygi/learn-aws-amazon-web-services-for-free-in-2022-updated-4m0a</guid>
      <description>&lt;p&gt;Hello everyone, and thank you for taking out your precious time to read this blog and investing in your future. I have started my journey of learning cloud(AWS) around 6 months back in late 2021. It was a time during AWS re:invent. It got me somewhat interested, and I just keep learning and dabbling into it. Throughout this learning journey, I have used 100% free resources to learn, practice hands-on. So, today, I want to show you some of the best resources to learn AWS without spending any single penny 💸💸. The best part is they are not third party training providers, instead we got to learn from those who build AWS. Lets go!!!&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://explore.skillbuilder.aws" rel="noopener noreferrer"&gt;AWS Skill Builder: &lt;/a&gt; Your learning center to build in-demand cloud skills&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff7d8ql0hyg46tiacg7ck.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff7d8ql0hyg46tiacg7ck.png" alt="Imag5n"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://explore.skillbuilder.aws/learn/course/11458/play/42651/play-cloud-quest-cloud-practitioner" rel="noopener noreferrer"&gt;AWS Cloud Quest - Cloud Practitioner(GAME):&lt;/a&gt; AWS Cloud Quest is the only role-playing game to help you build practical AWS Cloud skills. Whether you’re starting your cloud learning journey or diving into specialized skills, &lt;a href="https://explore.skillbuilder.aws/learn/course/11458/play/43072/trailer" rel="noopener noreferrer"&gt;AWS Cloud Quest&lt;/a&gt; helps you learn in an interactive, engaging way.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffsmz3ccvi5194im5zsb8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffsmz3ccvi5194im5zsb8.png" alt="Image cloud pra"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://www.awseducate.com" rel="noopener noreferrer"&gt;AWS Educate:&lt;/a&gt; AWS Educate offers hundreds of hours of self-paced training and resources for new-to-cloud learners—including hands-on labs in the AWS Management Console.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frfhepah29wkxneuapfbd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frfhepah29wkxneuapfbd.png" alt="Imag4on"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://aws.amazon.com/getting-started/hands-on" rel="noopener noreferrer"&gt;AWS Hands-on Tutorials:&lt;/a&gt; Discover tutorials, digital training, reference deployments and white papers for common AWS use cases.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6x528wmmd0kpb9yczpck.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6x528wmmd0kpb9yczpck.png" alt="Image4n"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://awsworkshop.io" rel="noopener noreferrer"&gt;AWS Workshops:&lt;/a&gt; This website lists workshops created by the teams at Amazon Web Services (AWS). Workshops are hands-on events designed to teach or introduce practical skills, techniques, or concepts which you can use to solve business problems.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F95y3g9huwsp6b95d30eh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F95y3g9huwsp6b95d30eh.png" alt="Image d43on"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://aws.amazon.com/developer/community/heroes/content-library" rel="noopener noreferrer"&gt;AWS Heroes Content Library:&lt;/a&gt; AWS Hero authored content including blogs, videos, slide presentations, podcasts, and more.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa5mlal1azz57ylrtyv27.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa5mlal1azz57ylrtyv27.png" alt="Image d43on"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://aws.amazon.com/architecture/back-to-basics" rel="noopener noreferrer"&gt;Back to Basics:&lt;/a&gt; Back to Basics' is a video series that explains, examines, and decomposes basic cloud architecture pattern best practices.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpqi0zuphvki7u5pujs0y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpqi0zuphvki7u5pujs0y.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://aws.amazon.com/whitepapers" rel="noopener noreferrer"&gt;AWS Whitepapers:&lt;/a&gt; Expand your knowledge of the cloud with AWS technical content authored by AWS and the AWS community, including technical whitepapers, technical guides, reference material, and reference architecture diagrams.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fncgcz41cz32589v4q7v1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fncgcz41cz32589v4q7v1.png" alt="Image34"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://aws.amazon.com/training/twitch" rel="noopener noreferrer"&gt;AWS Power Hour on twitch:&lt;/a&gt; AWS Training and Certification offers free live and on-demand training on Twitch.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr6e4q9g8yw1vgrdde6jl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr6e4q9g8yw1vgrdde6jl.png" alt="Image wang"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://aws.amazon.com/architecture/?nc2=h_ql_le_arc&amp;amp;cards-all.sort-by=item.additionalFields.sortDate&amp;amp;cards-all.sort-order=desc&amp;amp;awsf.content-type=*all&amp;amp;awsf.methodology=*all&amp;amp;awsf.tech-category=*all&amp;amp;awsf.industries=*all" rel="noopener noreferrer"&gt;AWS Architecture center:&lt;/a&gt; The AWS Architecture Center provides reference architecture diagrams, vetted architecture solutions, Well-Architected best practices, patterns, icons, and more. This expert guidance was contributed by cloud architecture experts from AWS, including AWS Solutions Architects, Professional Services Consultants, and Partners.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8vsz8vatecf4yhb5xyei.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8vsz8vatecf4yhb5xyei.png" alt="Im5"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  ✨✨BONUS✨✨
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://www.coursera.org/aws" rel="noopener noreferrer"&gt;Coursera's AWS Courses(Free to enroll via audit):&lt;/a&gt; AWS also provides various specializations in partnership with coursera&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxvn1qdtjmewo9rgxwiy5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxvn1qdtjmewo9rgxwiy5.png" alt="Image coursera"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://aws.amazon.com/podcasts" rel="noopener noreferrer"&gt;AWS podcast(Learn aws on the go, anytime)&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnwi524mv1dzuls22qpem.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnwi524mv1dzuls22qpem.png" alt="Ima232"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://www.youtube.com/playlist?list=PLs8YaskKVodPPkxJKAmubKMks2bbXdO0w" rel="noopener noreferrer"&gt;AWS Solutions Architect Bootcamp by AWS usergroup&lt;/a&gt; Free bootcamp for preparing for aws solutions architect exam &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F80lxbnm7yv0oa96uwdt6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F80lxbnm7yv0oa96uwdt6.png" alt="Imasolur"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://awsreskill.com/" rel="noopener noreferrer"&gt;re:Skill (In association with AWS)&lt;/a&gt; Make the best use of this platform to reskill yourself on AWS platform by going through the learning content and taking up the challenges.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwjyl2ktc6nmqpinzf3lo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwjyl2ktc6nmqpinzf3lo.png" alt="Imrocha"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://aws.amazon.com/training/ramp-up-guides" rel="noopener noreferrer"&gt;AWS Ramp-Up guides:&lt;/a&gt; Downloadable AWS Ramp-Up Guides offer a variety of resources to help you build your skills and knowledge of the AWS Cloud.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbb0rk2oqxanrbd2ake3x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbb0rk2oqxanrbd2ake3x.png" alt="Imaredd"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>cloudskills</category>
      <category>free</category>
    </item>
    <item>
      <title>Deploy a PHP(Laravel) app on AWS Elastic Beanstalk via CodePipeline</title>
      <dc:creator>Tanmay Shukla</dc:creator>
      <pubDate>Fri, 29 Jul 2022 07:14:00 +0000</pubDate>
      <link>https://dev.to/tanmaygi/deploy-a-phplaravel-app-on-aws-elastic-beanstalk-via-codepipeline-934</link>
      <guid>https://dev.to/tanmaygi/deploy-a-phplaravel-app-on-aws-elastic-beanstalk-via-codepipeline-934</guid>
      <description>&lt;p&gt;In this tutorial I will explain you to deploy a PHP laravel application on AWS Elastic Beanstalk service.&lt;br&gt;
&lt;strong&gt;About :&lt;/strong&gt; Amazon Elastic Beanstalk is an easy-to-use service for deploying and scaling web applications and services developed with Java, .NET, PHP, Node.js, Python, Ruby, Go, and Docker on familiar servers such as Apache, Nginx, Passenger, and IIS.&lt;br&gt;
&lt;strong&gt;Working:&lt;/strong&gt; You simply upload your code and Elastic Beanstalk automatically handles the deployment, from capacity provisioning, load balancing, and automatic scaling to web application health monitoring, with ongoing fully managed patch and security updates.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step-1) Create a Beanstalk Environment and APP
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Select environment as &lt;strong&gt;web server environment&lt;/strong&gt; 
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffeqjzry1lhsy3ey555sb.png" alt="Image1"&gt;
&lt;/li&gt;
&lt;li&gt;Create the Application and Environment name:
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2d6ijxakjnjgcfyq1upg.png" alt="Image 2"&gt;
&lt;/li&gt;
&lt;li&gt;Choose the platform on which you want to deploy your app. Here we will choose PHP for our Laravel project.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsc6ikul42se79sm8wh6y.png" alt="Image 3"&gt;
&lt;/li&gt;
&lt;li&gt;The next step is to upload the code (for now we will go with the sample code, as we will push the original code as the next version via CodePipeline).
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnzof40d3f0pz9vj2l379.png" alt="Image 4"&gt;
&lt;/li&gt;
&lt;li&gt;Click on Create and let the environment get created.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3roql837ihqx6ds6eaen.png" alt="Image 5"&gt;
Finally, it will show something like this:
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6970sspsg2av6gk3uh79.png" alt="Image 6"&gt;
And when you click the app URL, like &lt;code&gt;http://laravelapp-env.eba-muin3rqg.us-east-2.elasticbeanstalk.com/&lt;/code&gt;, it will open your app with the sample code.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkc70pc2dazaboegerhhf.png" alt="Image 7"&gt;
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Step-2) Create the CodePipeline
&lt;/h2&gt;

&lt;p&gt;Go to the search bar, type CodePipeline, and click Create pipeline.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk33e6ybcbxk99uvzuagb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk33e6ybcbxk99uvzuagb.png" alt="Image 8"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Now add the source stage: connect to your version control system (GitHub here) and select the repository (laraval-demo) and branch (dev)
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmd9wei3bbnzmoeijh3ku.png" alt="Image 9"&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Check &lt;strong&gt;Start the pipeline on source code change&lt;/strong&gt; to automatically start your pipeline when a change occurs in the source code.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frzw63wjvic6fmkztilll.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frzw63wjvic6fmkztilll.png" alt="Image 10"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Skip the build stage for now and add the deployment stage (pipelines must have at least two stages, and your second stage must be either a build or deployment stage).&lt;br&gt;
Choose Elastic Beanstalk as the deployment provider, then select the application name and environment name.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyhxwng9shyi15lifex5w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyhxwng9shyi15lifex5w.png" alt="Image 23"&gt;&lt;/a&gt;&lt;br&gt;
Click Next and review all the steps and configuration.&lt;br&gt;
Finally, click &lt;code&gt;Create pipeline&lt;/code&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftxynl5pi8i1auuk6yirh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftxynl5pi8i1auuk6yirh.png" alt="Image 23"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;After the pipeline is successfully created and deployed, it will look something like this:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3v2k5ja12ym89x5985ia.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3v2k5ja12ym89x5985ia.png" alt="Image 24"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now go to the Beanstalk environment dashboard and open the application versions; you will find that the latest version was deployed via CodePipeline.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjpprjw013h1sb4vblnvx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjpprjw013h1sb4vblnvx.png" alt="Image 26"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;See that the health of our app is &lt;code&gt;OK&lt;/code&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7kxh795t39ludmi6w14v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7kxh795t39ludmi6w14v.png" alt="Image 44"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click on Beanstalk App URL like &lt;code&gt;http://laravelapp-env.eba-muin3rqg.us-east-2.elasticbeanstalk.com/&lt;/code&gt; to see our deployed laravel app.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn196fw3hehm4urpiyx9w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn196fw3hehm4urpiyx9w.png" alt="Image 54"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Summary:
&lt;/h3&gt;

&lt;p&gt;The CodePipeline will trigger whenever we push changes to Git; hence it is our continuous integration pipeline.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>elastibeanstalk</category>
      <category>devops</category>
      <category>php</category>
    </item>
    <item>
      <title>How to create a SFTP server on EC2(CentOS/Ubuntu) ?</title>
      <dc:creator>Tanmay Shukla</dc:creator>
      <pubDate>Thu, 28 Jul 2022 18:59:00 +0000</pubDate>
      <link>https://dev.to/tanmaygi/how-to-create-a-sftp-server-on-ec2centosubuntu--1f0m</link>
      <guid>https://dev.to/tanmaygi/how-to-create-a-sftp-server-on-ec2centosubuntu--1f0m</guid>
      <description>&lt;p&gt;In this tutorial you'll learn to create a SFTP server. SFTP stands for SSH File Transfer Protocol, and is a secure way to transfer files between machines using an encrypted SSH connection. Although similar in name, this is a different protocol than FTP (File Transfer Protocol), but SFTP is widely supported by modern FTP clients.&lt;/p&gt;

&lt;p&gt;SFTP is available by default with no additional configuration on all servers with SSH access enabled. Though it’s secure and fairly straightforward to use, one disadvantage of SFTP is that in a standard configuration, the SSH server grants file transfer access and terminal shell access to all users with an account on the system. In many cases, it is more secure to apply granular control over user permissions. For example, you may want to allow certain users to only perform file transfers, but prevent them from gaining terminal access to the server over SSH.&lt;/p&gt;

&lt;p&gt;Here you’ll set up the SSH daemon to limit SFTP access to one directory with no SSH access allowed on a per-user basis.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pre-requisite:&lt;/strong&gt; First update the machine&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

sudo yum check-update
sudo yum update -y


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Note: For Ubuntu, just use the &lt;strong&gt;apt&lt;/strong&gt; package manager instead of &lt;strong&gt;yum&lt;/strong&gt; (CentOS).&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

sudo apt update
sudo apt upgrade 


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ul&gt;
&lt;li&gt;Install the &lt;strong&gt;OpenSSH server&lt;/strong&gt;:&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo yum install openssh-server
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h3&gt;
  
  
  Step-1) Create new users, of which the first two will be granted access to the SFTP server
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;adduser sftp-user1
adduser sftp-user2
adduser normal_user
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After this you will be prompted to create a password and fill in some additional info about the user. To check that SFTP restriction works, we will not add normal_user to the sftpusers group, so it will not get access to the SFTP server.&lt;br&gt;
Now you have created the users that will be granted access to the restricted directory.&lt;br&gt;
Let's create a group and add the users to it.&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;groupadd sftpusers
sudo usermod -a -G sftpusers sftp-user1
sudo usermod -a -G sftpusers sftp-user2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;If you want to give multiple users access to the SFTP server, add them to this group and specify the group name in the sshd_config file (Step-4 below) with &lt;strong&gt;Match Group &amp;lt;group name&amp;gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step-2) Create a directory for restricted access
&lt;/h3&gt;

&lt;p&gt;In order to restrict SFTP access to one directory, you first have to make sure the directory complies with the &lt;strong&gt;SSH server's&lt;/strong&gt; permission requirements, which are very specific: the directory itself and all directories above it in the filesystem tree must be owned by &lt;strong&gt;root&lt;/strong&gt; and not writable by anyone else. For the same reason, it's not possible to give restricted access to a user's home directory, because home directories are owned by the user, not root.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;There are different ways to work around this ownership issue. Here we will create &lt;strong&gt;/var/www/public_ftp&lt;/strong&gt; as the target upload directory. &lt;strong&gt;/var/www&lt;/strong&gt; will be owned by root and will not be writable by other users; the subdirectory public_ftp will be owned by sftp-user1.&lt;/li&gt;
&lt;/ul&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir -p /var/www/public_ftp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Set the owner of /var/www to root:&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo chown root:root /var/www
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Give root write permissions to the same directory, and give other users only read and execute rights:&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo chmod 755 /var/www
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now change the ownership of the public_ftp directory that you have just created. The following command will set the owner of this directory to &lt;strong&gt;sftp-user1&lt;/strong&gt; and its group to sftpusers.&lt;br&gt;
Now that the directory structure is in place, you can configure the SSH server itself.&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo chown sftp-user1:sftpusers /var/www/public_ftp/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ ls -lrt
total 0
drwxr-xr-x. 2 sftp-user1 sftpusers 26 Jul 28 12:06 public_ftp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
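The ownership rules above can be sanity-checked locally before touching a live server. A minimal sketch (the paths are stand-ins for /var/www, and the chown to root is skipped since it would need root privileges):

```shell
# Local demo only: reproduce the chroot permission layout in a throwaway
# directory and check the resulting mode string.
ROOT_DIR=$(mktemp -d)            # stands in for /var/www
mkdir -p "$ROOT_DIR/public_ftp"  # stands in for /var/www/public_ftp
chmod 755 "$ROOT_DIR"            # owner rwx, everyone else read/execute only
perm=$(ls -ld "$ROOT_DIR" | cut -c1-10)
echo "$perm"                     # drwxr-xr-x
```

The mode string `drwxr-xr-x` is exactly what the SSH server's chroot check expects on every directory above the upload directory.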



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
### Step-3) Restricting Access to only one directory.
In this step, you’ll modify the SSH server configuration to disallow terminal access for sftp-user1 but **allow file transfer access**.
Open the SSH server configuration file using nano or vim.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;sudo vim /etc/ssh/sshd_config&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Scroll to the very bottom of the file and add the following configuration snippet:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Match User sftp-user1
ForceCommand internal-sftp
PasswordAuthentication yes
ChrootDirectory /var/www
PermitTunnel no
AllowAgentForwarding no
AllowTcpForwarding no
X11Forwarding no
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;🚨 Note: Here's what each directive does:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Match User&lt;/strong&gt; tells the SSH server to apply the following directives only to the specified user. Here, we specify sftp-user1; make sure to update this with your own user's name, if different.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;ForceCommand internal-sftp&lt;/strong&gt; forces the SSH server to run the SFTP server upon login, disallowing shell access.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;PasswordAuthentication yes&lt;/strong&gt; allows password authentication for this user.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;ChrootDirectory /var/www&lt;/strong&gt; ensures that the user will not be allowed access to anything beyond the &lt;strong&gt;/var/www&lt;/strong&gt; directory.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;PermitTunnel no&lt;/strong&gt;, &lt;strong&gt;AllowAgentForwarding no&lt;/strong&gt;, &lt;strong&gt;AllowTcpForwarding no&lt;/strong&gt;, and &lt;strong&gt;X11Forwarding no&lt;/strong&gt; disable tunneling, agent forwarding, port forwarding, and X11 forwarding, respectively. These directives further limit this user's access to the server.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This set of directives, starting with Match User, can be copied and repeated for different users too. Make sure to modify the username in the Match User line accordingly.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step-4) Run the sshd command to test the changes, then restart the service and verify the configuration
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;Important: If this step is performed incorrectly, it might break your SSHD configuration.&lt;/code&gt;&lt;/p&gt;
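Since the Match block can be repeated per user, it helps to generate it rather than retype it. A scratch-file sketch (sftp-user2 is illustrative; on a real server you would append to /etc/ssh/sshd_config and then run `sshd -t`):

```shell
# Write the same chroot-only Match block for a second user into a scratch
# file, mirroring what would be appended to /etc/ssh/sshd_config.
SSHD_SNIPPET=$(mktemp)
cat >> "$SSHD_SNIPPET" <<'EOF'
Match User sftp-user2
    ForceCommand internal-sftp
    PasswordAuthentication yes
    ChrootDirectory /var/www
    PermitTunnel no
    AllowAgentForwarding no
    AllowTcpForwarding no
    X11Forwarding no
EOF
grep -c '^Match User' "$SSHD_SNIPPET"   # prints 1: one Match block written
```

Indenting the directives under `Match` is conventional; sshd treats everything after a `Match` line as part of that block regardless of indentation.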



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sshd -t
service sshd restart
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Now to verify the configuration:&lt;/strong&gt;&lt;br&gt;
Let's ensure that our new user can only transfer files. As mentioned previously, SFTP is used to transfer files between machines, and you can verify this works by testing a transfer between your local machine and the server.&lt;br&gt;
First, try logging into your server as the user you created in Step-1. Because of the settings you added to the SSH configuration file, this won't be possible:&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ssh sftp-user1@your_server_ip
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;You'll receive the following message before being returned to your original prompt:&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Output
This service allows sftp connections only.
Connection to your_server_ip closed.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This means that sftp-user1 can no longer access the server shell using SSH.&lt;br&gt;
Next, verify whether the user can successfully access SFTP for file transfer:&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sftp sftp-user1@your_server_ip
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Instead of an error message, this command will generate a successful login message with an interactive prompt:&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Output
Connected to your_server_ip
sftp&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;You can list the directory contents using ls at the prompt:&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sftp&amp;gt; ls
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This will show the &lt;strong&gt;public_ftp&lt;/strong&gt; directory that was created in the previous step and return you to the sftp&amp;gt; prompt:&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Output
public_ftp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h4&gt;
  
  
  Final check
&lt;/h4&gt;

&lt;p&gt;For sftp-user1:&lt;br&gt;
&lt;img src="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pyqlb1c5v6u6wmlhdlfn.png" alt="user1"&gt;&lt;br&gt;
For sftp-user2:&lt;br&gt;
&lt;img src="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/r2gwwfb59xbib6l7l4md.png" alt="sftp-user2"&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>security</category>
      <category>linux</category>
    </item>
    <item>
      <title>How can I connect or access an AWS S3 bucket from an Amazon EC2 Instance?</title>
      <dc:creator>Tanmay Shukla</dc:creator>
      <pubDate>Wed, 20 Jul 2022 18:30:00 +0000</pubDate>
      <link>https://dev.to/tanmaygi/how-can-i-connect-or-access-an-aws-s3-bucket-from-an-amazon-ec2-instance-45gd</link>
      <guid>https://dev.to/tanmaygi/how-can-i-connect-or-access-an-aws-s3-bucket-from-an-amazon-ec2-instance-45gd</guid>
      <description>&lt;p&gt;In this tutorial I'll show you step by step to connect &amp;amp; access your S3 bucket from EC2 instance via IAM Roles.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step-1) Create an IAM instance profile that grants access to Amazon S3.
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Open the &lt;strong&gt;IAM&lt;/strong&gt; console.&lt;/li&gt;
&lt;li&gt;Choose Roles, and then choose &lt;strong&gt;Create role&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select AWS Service, and then choose &lt;strong&gt;EC2&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Next: Permissions&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Create a custom policy that provides the minimum required permissions to access your S3 bucket.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F854qazxhxgsdnxz1rsox.png" alt="Image 0"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; Creating a policy with the minimum required permissions is a security best practice. However, to allow EC2 access to all your Amazon S3 buckets, you can use the AmazonS3ReadOnlyAccess or AmazonS3FullAccess managed IAM policy.&lt;/p&gt;
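For illustration, a least-privilege identity policy for the role might look like the following sketch (BUCKET-NAME is a placeholder; trim the actions to what your workload actually needs):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListTheBucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::BUCKET-NAME"
    },
    {
      "Sid": "ReadWriteObjects",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::BUCKET-NAME/*"
    }
  ]
}
```

Note that bucket-level actions like s3:ListBucket target the bucket ARN, while object-level actions like s3:GetObject target the `/*` object ARN.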

&lt;ul&gt;
&lt;li&gt;Select &lt;strong&gt;Next: Tags&lt;/strong&gt;, and then select &lt;strong&gt;Next: Review&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Enter a &lt;strong&gt;Role name&lt;/strong&gt;, and then select &lt;strong&gt;Create role&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step-2) Create an EC2 instance and attach IAM instance profile to this instance
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Create the EC2 instance &lt;a href="https://dev.to/tanmaygi/launch-an-ec2-instance-with-ami-in-1-minute-step-by-step-guide-5g6l"&gt;(Launch an EC2 instance in 1 min)&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For this click on checkbox on that instance and go to &lt;strong&gt;Actions tab &amp;gt; Security &amp;gt; Modify IAM role.&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqhbub4hjnkk88yag16wq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqhbub4hjnkk88yag16wq.png" alt="Image 1"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now select the IAM role that you created in step-1 then save it. This will assign the IAM role to your ec2 instance.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7vnvenb8gxhhfbt4vyfm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7vnvenb8gxhhfbt4vyfm.png" alt="Image 2"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step-3) Validate &lt;strong&gt;S3&lt;/strong&gt; to check permissions, if it has any denying policy attached to it.
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt; Go to &lt;strong&gt;Permissions &amp;gt; Bucket Policy&lt;/strong&gt;, then search for &lt;strong&gt;Effect: Deny&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Now in your bucket policy, edit or remove any &lt;strong&gt;Effect: Deny&lt;/strong&gt; statements that are denying the IAM instance profile access to your bucket.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fucy3u0ewekdeq1p37ojb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fucy3u0ewekdeq1p37ojb.png" alt="Image 3"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step-4) Check the network connectivity from EC2 instance to Amazon S3.
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Verify if EC2 has network connectivity to &lt;strong&gt;S3 Endpoints&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;We need to make sure that our instance satisfies one of the following:

&lt;ul&gt;
&lt;li&gt;An EC2 instance with a public IP address and a route table entry with the default route pointing to an Internet Gateway.&lt;/li&gt;
&lt;li&gt;A private EC2 instance with a default route through a NAT gateway.&lt;/li&gt;
&lt;li&gt;A private EC2 instance with connectivity to Amazon S3 using a Gateway VPC endpoint.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step-5) Validate access to S3 buckets
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;First we need to install the &lt;strong&gt;AWS CLI&lt;/strong&gt;: &lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html" rel="noopener noreferrer"&gt;Install or update AWS CLI&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Verify access to your S3 bucket by running the following command, replacing BUCKET-NAME with the name of your S3 bucket. This command will list all the objects in your bucket:&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws s3 ls s3://BUCKET-NAME
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;img src="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3tblxhd6h4dbu0fqm4il.png" alt="Image 4"&gt;&lt;br&gt;
&lt;img src="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aom8g08cxzhlkivt34l3.png" alt="Image 5"&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; Run the &lt;code&gt;aws s3 cp&lt;/code&gt; command to copy files to the S3 bucket and vice versa, but remember to give the IAM role the &lt;strong&gt;AmazonS3FullAccess&lt;/strong&gt; or &lt;strong&gt;AdministratorAccess&lt;/strong&gt; managed policy (or equivalent write permissions).&lt;br&gt;
Now that we have also installed the AWS CLI, you can simply use the following commands to copy files to the S3 bucket from EC2.&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# To list the S3 buckets
aws s3 ls s3://

# To copy a file from EC2 to S3
aws s3 cp &amp;lt;file-name&amp;gt; s3://&amp;lt;bucket-name&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Additional resources:&lt;/u&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/premiumsupport/knowledge-center/ec2-instance-access-s3-bucket/" rel="noopener noreferrer"&gt;https://aws.amazon.com/premiumsupport/knowledge-center/ec2-instance-access-s3-bucket/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-troubleshooting.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-troubleshooting.html&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Connect with me
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/tanmaycode2" rel="noopener noreferrer"&gt;Github&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.linkedin.com/in/tanmay-shukla/" rel="noopener noreferrer"&gt;Linkedin&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
      <category>ec2</category>
      <category>s3</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Amazon Elastic Container Service (ECS)- Practical Guide 🚀🚀(Cheat Sheet)</title>
      <dc:creator>Tanmay Shukla</dc:creator>
      <pubDate>Sat, 16 Jul 2022 12:22:21 +0000</pubDate>
      <link>https://dev.to/tanmaygi/amazon-elastic-container-service-ecs-practical-guide-cheat-sheet-155g</link>
      <guid>https://dev.to/tanmaygi/amazon-elastic-container-service-ecs-practical-guide-cheat-sheet-155g</guid>
<description>&lt;p&gt;&lt;strong&gt;Amazon Elastic Container Service&lt;/strong&gt; is a highly scalable, fast container management service that makes it easy to run, stop, and manage &lt;strong&gt;Docker&lt;/strong&gt; containers on a cluster of Amazon EC2 instances.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;It lets you launch and stop container-enabled applications with simple &lt;strong&gt;API calls&lt;/strong&gt;, allows you to get the state of your cluster from a centralized service, and gives you access to many familiar EC2 features.&lt;/li&gt;
&lt;li&gt;Through ECS we can &lt;strong&gt;schedule&lt;/strong&gt; the placement of containers across our cluster based on &lt;u&gt;resource needs&lt;/u&gt;, &lt;u&gt;isolation policies&lt;/u&gt;, and &lt;u&gt;availability requirements&lt;/u&gt;.&lt;/li&gt;
&lt;li&gt;ECS eliminates the need for you to operate your own cluster-management and configuration-management systems, or even worry about scaling your management infrastructure.&lt;/li&gt;
&lt;li&gt; ECS also integrates with the Application Load Balancer to expose your service to the world. &lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  ECS has two launch types:&lt;br&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;1. Amazon ECS - EC2 launch Type:&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8a56llkmmiv9pll4aq6j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8a56llkmmiv9pll4aq6j.png" alt="Image 2"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;ECS = Elastic Container Service&lt;/li&gt;
&lt;li&gt;Launch Docker containers on AWS = Launch &lt;strong&gt;ECS Tasks&lt;/strong&gt; on ECS Clusters&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;EC2 Launch Type: you must provision &amp;amp; maintain the infrastructure (the EC2 instances)&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Each EC2 Instance must run the &lt;strong&gt;ECS Agent&lt;/strong&gt; to register in the ECS Cluster&lt;/li&gt;
&lt;li&gt;AWS takes care of starting / stopping 
containers.
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;2. Amazon ECS - Fargate Launch Type:&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6j8dswsy8y7dkm7o4xsx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6j8dswsy8y7dkm7o4xsx.png" alt="Image 2"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Launch Docker containers on AWS.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;We do not have to provision the infrastructure&lt;/strong&gt; (EC2 instances), which makes it simpler and easier to use!&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;It's all Serverless.&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt; AWS runs the containers for us based on the CPU/RAM we need.&lt;/li&gt;
&lt;li&gt;We just create task definitions. The Fargate service will then launch the tasks, and there is no need to create EC2 instances beforehand (which makes it a lot easier).&lt;/li&gt;
&lt;li&gt;To access these Fargate tasks, we use &lt;strong&gt;elastic network interfaces (ENIs)&lt;/strong&gt; that are launched within our VPC to &lt;strong&gt;bind&lt;/strong&gt; each task to a network IP. The more tasks we have, the more ENIs are created; each ENI has a distinct private IP and is unique to its task, so make sure there are enough IP addresses in your VPC.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv0h6aqo230wux7dkum4o.png" alt="Image 3"&gt; &lt;/li&gt;
&lt;li&gt;To scale, we just need to increase the number of tasks. Much simpler than EC2 instances.&lt;/li&gt;
&lt;/ul&gt;
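The last point can be sketched with the AWS CLI: scaling a Fargate service is just a matter of raising the desired task count. The cluster and service names below are hypothetical:

```shell
# Sketch: scale an ECS service by changing its desired task count.
# Assumes the AWS CLI is configured; all names are placeholders.
scale_service() {
  aws ecs update-service \
    --cluster "$1" \
    --service "$2" \
    --desired-count "$3"
}

# Example: run 4 copies of the web service
# scale_service my-cluster web-service 4
```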




&lt;p&gt;🚩🚩🚩 &lt;strong&gt;Note:&lt;/strong&gt; These ECS tasks may need to perform operations on your AWS services. For example, they may need to interact with DynamoDB or S3, so they need IAM roles. Let's look at IAM roles for ECS tasks now.&lt;/p&gt;

&lt;h3&gt;
  
  
  IAM Roles for ECS Tasks
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;EC2 Instance Profile (EC2 Launch Type only):&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Used by the &lt;strong&gt;&lt;u&gt;ECS agent&lt;/u&gt;&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Makes API calls to the &lt;u&gt;ECS service&lt;/u&gt;
&lt;/li&gt;
&lt;li&gt;Sends container logs to &lt;u&gt;CloudWatch Logs&lt;/u&gt;
&lt;/li&gt;
&lt;li&gt;Pulls Docker images from &lt;u&gt;ECR&lt;/u&gt;
&lt;/li&gt;
&lt;li&gt;References sensitive data in Secrets Manager or SSM Parameter Store.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fos9n5g18r5bpsz62ds9s.png" alt="Image 4"&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;ECS Task Role:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;When you create a task, you attach a task role to it.&lt;/li&gt;
&lt;li&gt;The task role is defined in the &lt;u&gt;&lt;strong&gt;task definition.&lt;/strong&gt;&lt;/u&gt;
&lt;/li&gt;
&lt;li&gt;It allows each task to have a specific role. It's advisable to create a new ECS task role for each new task, as this ensures better security when accessing services.&lt;/li&gt;
&lt;li&gt;That's why we use different roles for the different ECS services we run.&lt;/li&gt;
&lt;li&gt;This gives a much better separation of security, and you have as many task roles as you have types of tasks.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
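The two roles above live in different places: the instance profile is attached to the EC2 instance, while the task role is named in the task definition. A hypothetical Fargate task definition (the family, account ID, role names, and image are all made up for illustration) shows where each role goes:

```shell
# Write a minimal, hypothetical task definition to disk.
# Account ID, role names, and image are placeholders.
cat > taskdef.json <<'EOF'
{
  "family": "my-web-app",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "256",
  "memory": "512",
  "executionRoleArn": "arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
  "taskRoleArn": "arn:aws:iam::123456789012:role/my-web-app-task-role",
  "containerDefinitions": [
    {
      "name": "web",
      "image": "nginx:latest",
      "portMappings": [{"containerPort": 80}]
    }
  ]
}
EOF

# Sanity-check that the file is valid JSON before registering it,
# e.g. with: aws ecs register-task-definition --cli-input-json file://taskdef.json
python3 -m json.tool taskdef.json > /dev/null && echo "taskdef is valid JSON"
```

Here `executionRoleArn` covers the agent-side work (pulling images, shipping logs), while `taskRoleArn` is what your application code assumes when it calls DynamoDB, S3, and so on.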

&lt;p&gt;🚩🚩🚩 &lt;strong&gt;Note:&lt;/strong&gt; You might ask how these services share data. It's a more complex topic than you'd think, so I'll cover it in another blog, but for now let's take a high-level view.&lt;/p&gt;

&lt;h3&gt;
  
  
  ECS Data Volumes:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;ECS has an integration with &lt;strong&gt;EFS (Elastic File System)&lt;/strong&gt;, which is an NFS (network file system).
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw0sn5r98850m0a3243gu.png" alt="Image 6"&gt;
&lt;/li&gt;
&lt;li&gt;If we have an EC2 instance and multiple tasks running, it is possible to create an EFS file system and &lt;strong&gt;&lt;u&gt;mount&lt;/u&gt;&lt;/strong&gt; it directly onto these ECS tasks. It works for both &lt;strong&gt;EC2 tasks&lt;/strong&gt; and &lt;strong&gt;Fargate tasks.&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Tasks launched in any AZ will be able to share the same data in the EFS volume.&lt;/li&gt;
&lt;li&gt;With the combination of Fargate + EFS we get a truly serverless offering (i.e. serverless data storage without managing servers).&lt;/li&gt;
&lt;li&gt;The main &lt;strong&gt;use case&lt;/strong&gt; is persistent multi-AZ shared storage for your containers.&lt;/li&gt;
&lt;/ul&gt;
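In the task definition, the EFS mount is declared as a volume that containers can then reference. A hypothetical fragment (the file-system ID is made up) looks like this:

```shell
# Hypothetical task-definition fragment: declare an EFS-backed volume
# that containerDefinitions can then reference via mountPoints.
cat > efs-volume.json <<'EOF'
{
  "volumes": [
    {
      "name": "shared-data",
      "efsVolumeConfiguration": {
        "fileSystemId": "fs-0123456789abcdef0",
        "rootDirectory": "/",
        "transitEncryption": "ENABLED"
      }
    }
  ]
}
EOF

python3 -m json.tool efs-volume.json > /dev/null && echo "fragment is valid JSON"
```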

&lt;h2&gt;
  
  
  Connect with me
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/tanmaycode2" rel="noopener noreferrer"&gt;Github&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.linkedin.com/in/tanmay-shukla/" rel="noopener noreferrer"&gt;Linkedin&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
      <category>docker</category>
      <category>ecs</category>
      <category>container</category>
    </item>
    <item>
      <title>Launch an EC2 instance with AMI in 1 minute (Step by Step guide) Updated 2022</title>
      <dc:creator>Tanmay Shukla</dc:creator>
      <pubDate>Fri, 15 Jul 2022 15:23:09 +0000</pubDate>
      <link>https://dev.to/tanmaygi/launch-an-ec2-instance-with-ami-in-1-minute-step-by-step-guide-5g6l</link>
      <guid>https://dev.to/tanmaygi/launch-an-ec2-instance-with-ami-in-1-minute-step-by-step-guide-5g6l</guid>
<description>&lt;p&gt;This article will guide you through getting started with AWS EC2: launching your first EC2 instance and configuring it.&lt;br&gt;
After this step-by-step tutorial, you will understand EC2 practically.&lt;/p&gt;

&lt;p&gt;First of all, go to the AWS Management Console and log in with your root or IAM user. Then select &lt;strong&gt;EC2&lt;/strong&gt; by searching for it in the global search box.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmdq3giczka2zuyj0kq8p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmdq3giczka2zuyj0kq8p.png" alt="1"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Launch Your Amazon EC2 Instance
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;In the AWS Management Console on the &lt;strong&gt;Services&lt;/strong&gt; menu, click &lt;strong&gt;EC2&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Click &lt;code&gt;Launch instances&lt;/code&gt; &amp;gt; &lt;strong&gt;Launch instances&lt;/strong&gt;.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8ucsrzay7wdcx5e5p5gn.png" alt="2"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step-1: Give name and tags
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;First, give the name and tags to your instance. &lt;strong&gt;Tags&lt;/strong&gt; enable you to categorize your AWS resources in different ways, for example, by purpose, owner, or environment.&lt;/li&gt;
&lt;li&gt;Click Add Tag then configure:

&lt;ul&gt;
&lt;li&gt;Key: Name&lt;/li&gt;
&lt;li&gt;Value: Web Server
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F82a5zzh3f00jrt5x687q.png" alt="47"&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step-2: Choose Application and OS Images [Amazon Machine Image(AMI)]
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Next, choose the application and OS image (AMI) as the 2nd step.&lt;/li&gt;
&lt;li&gt;An Amazon Machine Image (AMI) provides the information required to launch an instance, which is a virtual server in the cloud. An AMI includes:

&lt;ul&gt;
&lt;li&gt;A template for the root volume for the instance (for example, an operating system or an application server with applications)&lt;/li&gt;
&lt;li&gt;Launch permissions that control which AWS accounts can use the AMI to launch instances&lt;/li&gt;
&lt;li&gt;A block device mapping that specifies the volumes to attach to the instance when it is launched&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;We can also create our own AMI, but the &lt;strong&gt;Quick Start&lt;/strong&gt; list contains the most commonly used AMIs.&lt;/li&gt;

&lt;li&gt;Select &lt;strong&gt;Amazon Linux 2 AMI&lt;/strong&gt; (at the top of the list).
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3b7870h9ilc043by37a9.png" alt="4"&gt;
&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step-3: Choose an Instance Type
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Amazon EC2 provides a wide selection of instance types optimized to fit different use cases. Instance types comprise varying combinations of CPU, memory, storage, and networking capacity and give you the flexibility to choose the appropriate mix of resources for your applications.&lt;/li&gt;
&lt;li&gt;Under Instance type, from the Instance type list, you can select the hardware configuration for your instance.&lt;/li&gt;
&lt;li&gt;A t2.micro instance has 1 vCPU and 1 GiB Memory.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwdt04r3dn2re1sy44wpw.png" alt="5"&gt;
&lt;/li&gt;
&lt;li&gt;Choose the t2.micro instance type, which is selected by default. The t2.micro instance type is eligible for the free tier. In Regions where t2.micro is unavailable, you can use a t3.micro instance under the free tier.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step-4: Create Key pair (login)
&lt;/h3&gt;

&lt;p&gt;We use key pair to securely connect to our instance.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwysch13vhxqw3jbehi6y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwysch13vhxqw3jbehi6y.png" alt="6"&gt;&lt;/a&gt;&lt;br&gt;
Create a new key pair if you don't have one, then give it a name and select RSA. Finally, download the .pem or .ppk file.&lt;br&gt;&lt;br&gt;
🚨 ⚠️ &lt;strong&gt;Warning&lt;/strong&gt;&lt;br&gt;
Do not choose Proceed without a key pair (Not recommended). If you launch your instance without a key pair, then you can't connect to it.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step-5: Configure Network settings
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuhc7sni0r6j411zbt3s8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuhc7sni0r6j411zbt3s8.png" alt="7"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Next to Network settings, choose &lt;code&gt;Edit&lt;/code&gt;. For Security group name, you'll see that the wizard created and selected a security group for you.&lt;/li&gt;
&lt;li&gt;We can use this security group, or alternatively you can select the security group that you created when getting set up using the following steps:

&lt;ul&gt;
&lt;li&gt;Configure Security Group:

&lt;ul&gt;
&lt;li&gt;Keep the default selection, and select Create a new security group.&lt;/li&gt;
&lt;li&gt;Security group name: &lt;code&gt;Web Server security group&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Description: &lt;code&gt;Security group for my web server&lt;/code&gt;
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkjtqg1wisn8cet4cu3ay.png" alt="8"&gt; &lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;/li&gt;

&lt;li&gt;Add SSH (port 22) and HTTP (port 80) rules in &lt;code&gt;Inbound security groups rules&lt;/code&gt; to access our instance from PuTTY (SSH) and through a web browser (internet), respectively.&lt;/li&gt;

&lt;li&gt;Keep the default selections for the other configuration settings for your instance.&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step-6: Configure Storage
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Amazon EC2 stores data on a network-attached virtual disk called Amazon Elastic Block Store (EBS).
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp89erpns1ybnve99myz4.png" alt="9"&gt;
&lt;/li&gt;
&lt;li&gt;We will launch the Amazon EC2 instance using a default 8 GiB disk volume. This will be your root volume (also known as a 'boot' volume).&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step-7: Advanced Details
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;A field for &lt;strong&gt;User data&lt;/strong&gt; will appear.

&lt;ul&gt;
&lt;li&gt;When you launch an instance, you can pass user data to the instance that can be used to perform common automated configuration tasks and even run scripts after the instance starts.&lt;/li&gt;
&lt;li&gt;Your instance is running Amazon Linux, so you will provide a shell script that will run when the instance starts.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Copy the following commands and paste them into the User data field:
&lt;/li&gt;

&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/bin/bash
yum -y install httpd
systemctl enable httpd
systemctl start httpd
echo '&amp;lt;html&amp;gt;&amp;lt;h1&amp;gt;Hello From Your Web Server!&amp;lt;/h1&amp;gt;&amp;lt;/html&amp;gt;' &amp;gt; /var/www/html/index.html
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Leave everything else as &lt;code&gt;default&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;
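For completeness, the wizard's choices can also be expressed with the AWS CLI. This is only a sketch: the AMI ID, key name, and security-group ID below are placeholders for your own values, and `user-data.sh` is assumed to contain the script above:

```shell
# Sketch: launch the same instance from the CLI.
# All IDs and names below are placeholders -- substitute your own.
launch_web_server() {
  aws ec2 run-instances \
    --image-id ami-0123456789abcdef0 \
    --instance-type t2.micro \
    --key-name my-key \
    --security-group-ids sg-0123456789abcdef0 \
    --user-data file://user-data.sh \
    --tag-specifications 'ResourceType=instance,Tags=[{Key=Name,Value=WebServer}]'
}

# launch_web_server
```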

&lt;h2&gt;
  
  
  Review the &lt;strong&gt;Summary&lt;/strong&gt; and Click &lt;strong&gt;&lt;code&gt;Launch&lt;/code&gt;&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn86jqqfzblcwloo20adl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn86jqqfzblcwloo20adl.png" alt="10"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Connect to your Instance
&lt;/h2&gt;

&lt;p&gt;There are multiple ways to connect to your instance.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Connect to EC2 instance using PuTTY&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnec1cfz7mei3nmvrinuq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnec1cfz7mei3nmvrinuq.png" alt="11"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open PuTTY&lt;/li&gt;
&lt;li&gt;Provide the public IPv4 address of your EC2 instance in the Host Name section.&lt;/li&gt;
&lt;li&gt;On the left menu, expand SSH and click on Auth&lt;/li&gt;
&lt;li&gt;Click on Browse and open the .ppk file
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdxzzq6l1s7wwsr7lz3bz.png" alt="12"&gt;
&lt;/li&gt;
&lt;li&gt;Now click Open, accept the prompt if it appears, then log in as &lt;code&gt;ec2-user&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;🚩🚩 &lt;strong&gt;Important&lt;/strong&gt;&lt;br&gt;
You can't connect to your instance unless you launched it with a key pair for which you have the .pem file and you launched it with a security group that allows SSH access from your computer. If you can't connect to your instance, see Troubleshoot connecting to your instance for assistance.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Connect using EC2 Instance Connect&lt;/strong&gt; &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Supported Linux distributions:

&lt;ul&gt;
&lt;li&gt;Amazon Linux 2 (any version)&lt;/li&gt;
&lt;li&gt;Ubuntu 16.04 or later&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt; To connect to your instance using the browser-based client from the Amazon EC2 console

&lt;ul&gt;
&lt;li&gt;Open the Amazon EC2 console at &lt;a href="https://console.aws.amazon.com/ec2/" rel="noopener noreferrer"&gt;https://console.aws.amazon.com/ec2/&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;In the navigation pane, choose Instances.&lt;/li&gt;
&lt;li&gt;Select the instance and choose Connect.&lt;/li&gt;
&lt;li&gt;Choose EC2 Instance Connect.&lt;/li&gt;
&lt;li&gt;Verify the user name and choose Connect to open a terminal window.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkjsx55qj8n9wkplmji19.png" alt="67"&gt;
&lt;/li&gt;
&lt;li&gt;When you hit Connect, it will open a terminal in your browser like this: 
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4xigbu2tbgla4bbcqawf.png" alt="75"&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

</description>
      <category>aws</category>
      <category>ec2</category>
      <category>cloud</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Amazon EC2 (Elastic compute cloud ☁🖥) -Zero to Hero🚀🚀 (Cheat Sheet)</title>
      <dc:creator>Tanmay Shukla</dc:creator>
      <pubDate>Fri, 15 Jul 2022 07:58:52 +0000</pubDate>
      <link>https://dev.to/tanmaygi/new-exciting-announcements-from-aws-at-nyc-summit-2022-1nnf</link>
      <guid>https://dev.to/tanmaygi/new-exciting-announcements-from-aws-at-nyc-summit-2022-1nnf</guid>
<description>&lt;p&gt;In this series I am going to share multiple articles that will teach you EC2 from basic to advanced. So let's start!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq53n7u90itwqutybza2n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq53n7u90itwqutybza2n.png" alt="EC2 n"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What is EC2 ?
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;EC2 is a web service that provides resizable compute capacity in the cloud.&lt;/li&gt;
&lt;li&gt;It is designed to make web-scale cloud computing easier for developers.&lt;/li&gt;
&lt;li&gt;Most popular and most used AWS offering.
&lt;/li&gt;
&lt;li&gt;EC2 = Elastic Compute Cloud = Infrastructure as a Service (IaaS). It mainly consists of:

&lt;ul&gt;
&lt;li&gt;Renting virtual machines (EC2)&lt;/li&gt;
&lt;li&gt;Storing data on Virtual drives(EBS)&lt;/li&gt;
&lt;li&gt;Distributing load across machines(ELB)&lt;/li&gt;
&lt;li&gt;Scaling the services using an auto-scaling group(ASG)&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;You pay only for the capacity that you actually use.&lt;/li&gt;

&lt;li&gt;EC2 provides developers the tools to build failure-resilient applications and isolate themselves from common failure scenarios.&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Below are some features of EC2:&lt;/strong&gt;  &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Reliability&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;EC2 provides 99.9% availability in each region. The services are highly reliable, where replacement of instances can be done easily and rapidly.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;2. Cost Saving&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;EC2 is inexpensive, as it allows the user to select plans per their requirements. It helps users save cost and utilize resources fully. &lt;/li&gt;
&lt;li&gt;Users also benefit from AWS's scale, which lets them pay less for virtual servers than with other cloud providers. &lt;/li&gt;
&lt;li&gt;EC2 works on a pay-as-you-go model; as a customer, we pay only for the time we use EC2.&lt;/li&gt;
&lt;li&gt;With EC2, we can eliminate the need for upfront capital expenditure (CapEx) on hardware (servers).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;3. Elasticity&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Companies can easily increase or decrease capacity within minutes. They can also provision thousands of server instances simultaneously. &lt;/li&gt;
&lt;li&gt;Apart from that, all the server instances are handled by web service APIs that can scale up and down the servers as per the requirements.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;4. Scalability&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In EC2 we can scale in and scale out depending on load. It also provides auto-scaling capabilities.&lt;/li&gt;
&lt;li&gt;Auto-scaling is a capability built into AWS that ensures you have the right number of EC2 instances provisioned to handle your application's load. &lt;/li&gt;
&lt;li&gt;We can use EC2 to launch as many virtual machines as we need.&lt;/li&gt;
&lt;li&gt;It provides scalable computing capacity in the AWS cloud.&lt;/li&gt;
&lt;li&gt;It also helps in building applications with redundancy and resilience.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;5. Security&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS works with Amazon VPC to provide robust networking and security for the compute resources.&lt;/li&gt;
&lt;li&gt;All the compute instances are located in a VPC (Virtual Private Cloud) in a specific IP range. This helps the user decide which instances are exposed to the internet and which remain private.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi0.wp.com%2Fwww.daniloaz.com%2Fwp-content%2Fuploads%2F2017%2F08%2Faws-ec2-smaller-partitioned-root-volume.png%3Fresize%3D729%252C498%26ssl%3D1" alt="EC2."&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  EC2 sizing &amp;amp; configuration
&lt;/h2&gt;

&lt;p&gt;We can choose from various options in EC2 like below:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Operating System(OS): Linux, Windows or macOS&lt;/li&gt;
&lt;li&gt;Compute power, processors and cores(CPU)&lt;/li&gt;
&lt;li&gt;Random-access memory(RAM)&lt;/li&gt;
&lt;li&gt;Storage space:

&lt;ul&gt;
&lt;li&gt;Hardware(EC2 Instance Store)&lt;/li&gt;
&lt;li&gt;Network-attached storage(EBS &amp;amp; EFS)&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Firewall Rules: Security group&lt;/li&gt;

&lt;li&gt; Network card: speed of the card, Public IP address&lt;/li&gt;

&lt;/ul&gt;

&lt;h2&gt;
  
  
  EC2 User Data
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;We can bootstrap our instances using an EC2 User data script.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Bootstrapping&lt;/strong&gt; means launching commands when a machine starts&lt;/li&gt;
&lt;li&gt;This script runs only once, when the instance first starts.&lt;/li&gt;
&lt;li&gt; The use case for EC2 user data is to automate boot tasks such as:

&lt;ul&gt;
&lt;li&gt;Installing updates&lt;/li&gt;
&lt;li&gt;Installing software&lt;/li&gt;
&lt;li&gt;Downloading common files from the internet&lt;/li&gt;
&lt;li&gt;A lot more &lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;The EC2 user data script runs as the &lt;strong&gt;root&lt;/strong&gt; user&lt;/li&gt;

&lt;/ul&gt;
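As a small sketch of the bootstrapping idea above, the script below installs and starts Apache at first boot. It is written to a local file here only so we can syntax-check it; the package choice and contents are just examples, not part of any particular setup:

```shell
# Write a sample user-data script to disk and syntax-check it.
# On a real instance it would run once, as root, at first boot.
cat > user-data.sh <<'EOF'
#!/bin/bash
yum -y update                 # installing updates
yum -y install httpd          # installing software
systemctl enable httpd
systemctl start httpd
EOF

bash -n user-data.sh && echo "user-data script parses"
```

You would paste the script's contents into the console's User data field, or pass it to the CLI with `--user-data file://user-data.sh`.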

&lt;h3&gt;
  
  
  Security Groups
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Security groups (SGs) are the fundamentals of network security in AWS.&lt;/li&gt;
&lt;li&gt;SGs control how traffic is allowed into or out of our instances.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2zdbxaj40isu8n7ysw8s.png" alt="SG"&gt;
&lt;/li&gt;
&lt;li&gt;SGs only contain &lt;strong&gt;allow&lt;/strong&gt; rules&lt;/li&gt;
&lt;li&gt;SG rules can reference an IP range or another security group.&lt;/li&gt;
&lt;li&gt;Security groups act as a &lt;strong&gt;"firewall"&lt;/strong&gt; for EC2 instances.&lt;/li&gt;
&lt;li&gt;Security groups regulate: 

&lt;ul&gt;
&lt;li&gt;Authorized &lt;strong&gt;IP ranges&lt;/strong&gt; - IPv4 and IPv6&lt;/li&gt;
&lt;li&gt;Access to &lt;strong&gt;ports&lt;/strong&gt; (like SSH, HTTP and HTTPS) &lt;/li&gt;
&lt;li&gt;Control of inbound network traffic (from others to the instance)&lt;/li&gt;
&lt;li&gt;Control of outbound network traffic (from the instance to others)
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
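The allow-rules above map directly onto the CLI. A sketch, where the group ID and the admin CIDR range are placeholders rather than real resources:

```shell
# Sketch: add inbound allow rules to a security group.
# The group ID and CIDR ranges below are placeholders.
open_web_ports() {
  local sg="$1"
  # SSH only from a trusted admin range:
  aws ec2 authorize-security-group-ingress \
    --group-id "$sg" --protocol tcp --port 22 --cidr 203.0.113.0/24
  # HTTP from anywhere:
  aws ec2 authorize-security-group-ingress \
    --group-id "$sg" --protocol tcp --port 80 --cidr 0.0.0.0/0
}

# open_web_ports sg-0123456789abcdef0
```

Note that there is no "deny" variant: since SGs only contain allow rules, anything not explicitly allowed is blocked.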

&lt;h3&gt;
  
  
  EC2 Image Builder
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;It is used to automate the creation of virtual machine and container images.&lt;/li&gt;
&lt;li&gt;It's a free service, i.e. we pay only for the underlying resources.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Crux&lt;/strong&gt; - Automates the creation, maintenance, validation and testing of &lt;strong&gt;EC2 AMIs&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;We can run it on a schedule (weekly, or whenever packages are updated).
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frptjnzwj2s1ptr2gta74.png" alt="AWS images builder "&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Connect with me
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/tanmaycode2" rel="noopener noreferrer"&gt;Github&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.linkedin.com/in/tanmay-shukla/" rel="noopener noreferrer"&gt;Linkedin&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>serverless</category>
      <category>beginners</category>
    </item>
    <item>
      <title>AWS Identity and Access management-Practical Guide 🚀🚀(Cheat sheet)</title>
      <dc:creator>Tanmay Shukla</dc:creator>
      <pubDate>Thu, 14 Jul 2022 17:36:06 +0000</pubDate>
      <link>https://dev.to/tanmaygi/aws-identity-and-access-management-practical-guide-cheat-sheet-3528</link>
      <guid>https://dev.to/tanmaygi/aws-identity-and-access-management-practical-guide-cheat-sheet-3528</guid>
<description>&lt;p&gt;This is a practical guide to understand and revise the AWS IAM service. It can also be used as a quick-review cheat sheet.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvhjtf6jdib0wb048kkap.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvhjtf6jdib0wb048kkap.png" alt="IAM aws"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  IAM: Users &amp;amp; Groups
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;IAM&lt;/strong&gt; = Identity and Access Management, Global service&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Root&lt;/strong&gt; account created by default, shouldn’t be used or shared&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Users&lt;/strong&gt; are people within your organization, and can be grouped&lt;/li&gt;
&lt;li&gt;Groups only contain users, not other groups&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Users&lt;/strong&gt; don’t have to belong to a group, and user can belong to multiple groups&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  IAM: Permissions
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Users or Groups can be assigned JSON documents called policies.&lt;/li&gt;
&lt;li&gt;These policies define the permissions of the users. &lt;/li&gt;
&lt;li&gt;In AWS you apply the least privilege principle: don’t give more permissions than a user needs.&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  IAM Policies Structure
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbmhkldwac3k8ip8z7z5s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbmhkldwac3k8ip8z7z5s.png" alt="IAM policy"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;1. An IAM policy consists of:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Version:&lt;/strong&gt; policy language version, always include “2012-10-17”&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Id:&lt;/strong&gt; an identifier for the policy (optional) &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Statement:&lt;/strong&gt; one or more individual statements (required).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;2. A statement consists of:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Sid:&lt;/strong&gt; an identifier for the statement (optional) &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Effect:&lt;/strong&gt; whether the statement allows or denies access (Allow, Deny)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Principal:&lt;/strong&gt; the account/user/role to which this statement applies&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Action:&lt;/strong&gt; list of actions this policy allows or denies&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Resource:&lt;/strong&gt; list of resources to which the actions apply&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Condition:&lt;/strong&gt; conditions for when this policy is in effect (optional)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "FirstStatement",
      "Effect": "Allow",
      "Action": ["iam:ChangePassword"],
      "Resource": "*"
    },
    {
      "Sid": "SecondStatement",
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "*"
    },
    {
      "Sid": "ThirdStatement",
      "Effect": "Allow",
      "Action": [
        "s3:List*",
        "s3:Get*"
      ],
      "Resource": [
        "arn:aws:s3:::confidential-data",
        "arn:aws:s3:::confidential-data/*"
      ],
      "Condition": {"Bool": {"aws:MultiFactorAuthPresent": "true"}}
    }
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h3&gt;
  
  
  IAM – Password Policy
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Strong passwords = higher security for your account&lt;/li&gt;
&lt;li&gt;In AWS, you can setup a password policy:

&lt;ul&gt;
&lt;li&gt;Set a minimum password length&lt;/li&gt;
&lt;li&gt;Require specific character types: uppercase letters, lowercase letters, numbers, and non-alphanumeric characters&lt;/li&gt;
&lt;li&gt;Allow all IAM users to change their own passwords&lt;/li&gt;
&lt;li&gt;Require users to change their password after some time (password expiration)&lt;/li&gt;
&lt;li&gt;Prevent password re-use.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
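&lt;p&gt;As a sketch, the options above map onto the AWS CLI’s &lt;code&gt;update-account-password-policy&lt;/code&gt; command. The specific values here are illustrative, not recommendations:&lt;/p&gt;

```shell
# Illustrative values only -- tune them to your own security requirements.
aws iam update-account-password-policy \
  --minimum-password-length 12 \
  --require-uppercase-characters \
  --require-lowercase-characters \
  --require-numbers \
  --require-symbols \
  --allow-users-to-change-password \
  --max-password-age 90 \
  --password-reuse-prevention 5
```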




&lt;h3&gt;
  
  
  Multi Factor Authentication - MFA
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Users have access to your account and can possibly change 
configurations or delete resources in your AWS account&lt;/li&gt;
&lt;li&gt;You want to protect your Root Accounts and IAM users&lt;/li&gt;
&lt;li&gt;MFA = password you know + security device you own
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz8of749ukhphxr21lzbn.png" alt="MFA"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  MFA devices options in AWS
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Virtual MFA device: Google authenticator, Authy.&lt;/li&gt;
&lt;li&gt;Universal 2nd Factor (U2F) Security Key: YubiKey by Yubico (3rd party)&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  How can users access AWS?
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;To access AWS, you have three options:

&lt;ol&gt;
&lt;li&gt;AWS Management Console (protected by password + MFA)&lt;/li&gt;
&lt;li&gt;AWS Command Line Interface (CLI): protected by access keys&lt;/li&gt;
&lt;li&gt;AWS Software Developer Kit (SDK) - for code: protected by access keys&lt;/li&gt;
&lt;/ol&gt;
&lt;/li&gt;
&lt;li&gt;Access Keys are generated through the AWS Console&lt;/li&gt;
&lt;li&gt;Users manage their own access keys&lt;/li&gt;
&lt;li&gt;Access Keys are secret, just like a password. Don’t share them&lt;/li&gt;
&lt;li&gt;Access Key ID ~= username&lt;/li&gt;
&lt;li&gt;Secret Access Key ~= password&lt;/li&gt;
&lt;/ul&gt;
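&lt;p&gt;For example, after you run &lt;code&gt;aws configure&lt;/code&gt;, the CLI stores the key pair in &lt;code&gt;~/.aws/credentials&lt;/code&gt;. A minimal sketch of that file, using the placeholder keys from the AWS documentation:&lt;/p&gt;

```ini
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```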




&lt;h3&gt;
  
  
  IAM Roles for Services
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Some AWS services need to perform actions on your behalf&lt;/li&gt;
&lt;li&gt;To do so, we will assign &lt;strong&gt;permissions&lt;/strong&gt; to AWS services with &lt;strong&gt;IAM Roles&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Common roles: 

&lt;ul&gt;
&lt;li&gt;EC2 Instance Roles &lt;/li&gt;
&lt;li&gt;Lambda Function Roles &lt;/li&gt;
&lt;li&gt;Roles for CloudFormation
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1u3zuywpj7fu1suzbnm0.png" alt="iAM ROLES"&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
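&lt;p&gt;What makes a role assumable by a service is its trust policy; the permissions themselves are attached separately as regular policies. A minimal sketch of the trust policy behind an EC2 Instance Role:&lt;/p&gt;

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```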




&lt;h3&gt;
  
  
  IAM Guidelines &amp;amp; Best Practices
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Don’t use the root account except for AWS account setup&lt;/li&gt;
&lt;li&gt;One physical user = One AWS user&lt;/li&gt;
&lt;li&gt;Assign users to groups and assign permissions to groups&lt;/li&gt;
&lt;li&gt;Create a strong password policy&lt;/li&gt;
&lt;li&gt;Use and enforce the use of Multi-Factor Authentication (MFA)&lt;/li&gt;
&lt;li&gt;Create and use Roles for giving permissions to AWS services&lt;/li&gt;
&lt;li&gt;Use Access Keys for programmatic access (CLI / SDK)&lt;/li&gt;
&lt;li&gt;Audit the permissions of your account with the IAM Credentials Report&lt;/li&gt;
&lt;li&gt;Never share IAM users &amp;amp; Access Keys&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  IAM – Summary
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Users: mapped to a physical user, has a password for the AWS Console&lt;/li&gt;
&lt;li&gt;Groups: contain users only&lt;/li&gt;
&lt;li&gt;Policies: JSON documents that outline permissions for users or groups&lt;/li&gt;
&lt;li&gt;Roles: for EC2 instances or AWS services&lt;/li&gt;
&lt;li&gt;Security: MFA + Password Policy&lt;/li&gt;
&lt;li&gt;Access Keys: access AWS using the CLI or SDK&lt;/li&gt;
&lt;li&gt;Audit: IAM Credential Reports &amp;amp; IAM Access Advisor&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Connect with me
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/tanmaycode2" rel="noopener noreferrer"&gt;Github&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.linkedin.com/in/tanmay-shukla/" rel="noopener noreferrer"&gt;Linkedin&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
      <category>cheatsheet</category>
      <category>cloud</category>
      <category>devops</category>
    </item>
    <item>
      <title>How to Install LAMP(Linux Apache Mysql PHP) on Ubuntu 20.04(AWS-EC2)</title>
      <dc:creator>Tanmay Shukla</dc:creator>
      <pubDate>Thu, 14 Jul 2022 09:02:00 +0000</pubDate>
      <link>https://dev.to/tanmaygi/how-to-install-lamplinux-apache-mysql-php-on-ubuntu-2004-5fl1</link>
      <guid>https://dev.to/tanmaygi/how-to-install-lamplinux-apache-mysql-php-on-ubuntu-2004-5fl1</guid>
      <description>&lt;h2&gt;
  
  
  What is LAMP Stack?
&lt;/h2&gt;

&lt;p&gt;The LAMP stack is a popular open-source solution stack used primarily in web development.&lt;br&gt;
LAMP is an acronym for the four components needed to establish a fully functional web development environment: the &lt;strong&gt;Linux&lt;/strong&gt; operating system and the &lt;strong&gt;Apache&lt;/strong&gt; web server, with site data stored in a &lt;strong&gt;MySQL&lt;/strong&gt; database and dynamic content processed by &lt;strong&gt;PHP&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;The illustration below shows how the layers stack together:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fphoenixnap.com%2Fkb%2Fwp-content%2Fuploads%2F2022%2F01%2Fvisual-representation-of-the-lamp-stack-pnap.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fphoenixnap.com%2Fkb%2Fwp-content%2Fuploads%2F2022%2F01%2Fvisual-representation-of-the-lamp-stack-pnap.png" alt="Lamp"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Step-1 Install Apache and Update the Firewall.
&lt;/h3&gt;

&lt;p&gt;First we need to install Apache on our virtual machine, but before that let’s update the package index with the following command. As a reminder, Ubuntu uses the apt package manager, while Red Hat-based distros like CentOS use yum.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sudo apt update
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$sudo apt install apache2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To see the Apache version on Debian/Ubuntu Linux, run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apache2 -v
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For CentOS/RHEL/Fedora Linux server, type command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;httpd -v
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can locate the apache2 or httpd binary using the type, whereis, or command commands. For instance:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;type -a httpd
type -a apache2
whereis httpd
command -v httpd
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now you need to update the firewall so that it allows HTTP and HTTPS traffic. Ubuntu includes UFW (&lt;strong&gt;uncomplicated firewall&lt;/strong&gt;) by default, a firewall configuration tool that runs on top of iptables. List the available application profiles:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sudo ufw app list
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Output
Available applications:
  Apache
  Apache Full
  Apache Secure
  OpenSSH
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you look at the Apache Full profile details, you’ll see that it enables traffic to ports 80 and 443:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo ufw app info "Apache Full"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Output
Profile: Apache Full
Title: Web Server (HTTP,HTTPS)
Description: Apache v2 is the next generation of the omnipresent Apache web
server.

Ports:
  80,443/tcp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now to allow incoming HTTP and HTTPS traffic for this server, run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo ufw allow "Apache Full"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;OUTPUT

Rules updated
Rules updated (v6)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then, to check, visit your instance’s public IP address in a browser:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;http://your_server_ip
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note: to find your server’s public IP address, you can use the following commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sudo apt install curl
$ curl http://icanhazip.com
$ curl ifconfig.me
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It will show something like this with apache information:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fassets.digitalocean.com%2Farticles%2Fhow-to-install-lamp-ubuntu-18%2Fsmall_apache_default_1804.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fassets.digitalocean.com%2Farticles%2Fhow-to-install-lamp-ubuntu-18%2Fsmall_apache_default_1804.png" alt="Apache"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Step-2 Installing MySQL.
&lt;/h3&gt;

&lt;p&gt;Now that our web server is up and running, it’s time to install MySQL.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sudo apt install mysql-server
$ sudo mysql --version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command will show you a list of the packages that will be installed, along with the amount of disk space they’ll take up. Enter Y to continue.&lt;br&gt;
When installation is done, run a simple security script to remove some dangerous defaults and lock down access to your database system.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sudo mysql_secure_installation
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now read and complete the installation steps, such as setting the root password.&lt;br&gt;
When it’s done, test whether you can log in to the MySQL console.&lt;/p&gt;

&lt;p&gt;Answer Y for yes, or anything else to continue without enabling.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;VALIDATE PASSWORD PLUGIN can be used to test passwords
and improve security. It checks the strength of password
and allows the users to set only those passwords which are
secure enough. Would you like to setup VALIDATE PASSWORD plugin?

Press y|Y for Yes, any other key for No:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you’ve enabled password validation, you’ll be shown the password strength for the root password you just entered and your server will ask if you want to change that password. If you are happy with your current password, enter N for “no” at the prompt:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Using existing password for root.

Estimated strength of the password: 100
Change the password for root ? ((Press y|Y for Yes, any other key for No) : n
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Finally, type the following to enter the MySQL console:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sudo mysql
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Output
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 5
Server version: 5.7.34-0ubuntu0.18.04.1 (Ubuntu)

Copyright (c) 2000, 2021, Oracle and/or its affiliates.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql&amp;gt; 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and type&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;exit
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;br&gt;
  to get out of the console.&lt;/p&gt;
&lt;h3&gt;
  
  
  Step-3 Installing PHP
&lt;/h3&gt;

&lt;p&gt;Why install it? Because PHP can run scripts, connect to your MySQL databases to retrieve information, and hand the processed content over to your web server so that it can display the results to your visitors.&lt;br&gt;
Type the command below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sudo apt install php libapache2-mod-php php-mysql
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We've installed three packages in the above command:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;php&lt;/strong&gt; package&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;libapache2-mod-php&lt;/strong&gt; to integrate PHP into Apache webserver.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;php-mysql&lt;/strong&gt; package to allow PHP to connect to MySQL databases.
&lt;/li&gt;
&lt;/ul&gt;
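&lt;p&gt;To confirm the installation worked (assuming the three packages above installed cleanly), you can check the PHP version and verify that the MySQL modules were registered:&lt;/p&gt;

```shell
# Print the installed PHP version.
php -v
# List loaded PHP modules and filter for MySQL-related ones;
# with php-mysql installed this typically shows mysqli, mysqlnd and pdo_mysql.
php -m | grep -i mysql
```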

&lt;p&gt;To extend the functionality of PHP, you have the option to install some additional modules.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ apt search php- | less
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Use the arrow keys to scroll up and down, and press Q to quit.&lt;/p&gt;

&lt;p&gt;The results are all optional components that you can install. It will give you a short description for each:&lt;/p&gt;

&lt;p&gt;For example, to find out what the php-cli module does, you could type this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ apt show php-cli
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you decided that php-cli is something that you need, you could type:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt install php-cli
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step-4 Test the PHP Processing on your Web Server
&lt;/h3&gt;

&lt;p&gt;To test that your system is properly configured for PHP, create a PHP script called &lt;strong&gt;info.php&lt;/strong&gt;. In order for Apache to find this file and serve it correctly, it must be saved to your web &lt;strong&gt;root directory&lt;/strong&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sudo nano /var/www/**your_domain**/info.php
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Insert the following code, which displays information about your PHP installation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;?php
phpinfo();
?&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Save and close the file.&lt;br&gt;
Then visit the following address:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;http://your_domain/info.php
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The page that you come to should look something like this:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fassets.digitalocean.com%2Farticles%2Fhow-to-install-lamp-ubuntu-18%2Fsmall_php_info_1804.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fassets.digitalocean.com%2Farticles%2Fhow-to-install-lamp-ubuntu-18%2Fsmall_php_info_1804.png" alt="php"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that you have a LAMP stack installed, you have many choices for what to do next. You’ve installed a platform that will allow you to install most kinds of websites and web software on your server.&lt;/p&gt;

</description>
      <category>ubuntu</category>
      <category>devops</category>
      <category>cloud</category>
      <category>aws</category>
    </item>
  </channel>
</rss>
