<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Andy Lim</title>
    <description>The latest articles on DEV Community by Andy Lim (@andylim0221).</description>
    <link>https://dev.to/andylim0221</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F321804%2F578a23a7-5439-438c-bbf0-9967f617c321.jpeg</url>
      <title>DEV Community: Andy Lim</title>
      <link>https://dev.to/andylim0221</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/andylim0221"/>
    <language>en</language>
    <item>
      <title>UFW and Docker Security on Linux machine</title>
      <dc:creator>Andy Lim</dc:creator>
      <pubDate>Fri, 27 Aug 2021 08:53:11 +0000</pubDate>
      <link>https://dev.to/andylim0221/ufw-and-docker-security-on-linux-machine-oc3</link>
      <guid>https://dev.to/andylim0221/ufw-and-docker-security-on-linux-machine-oc3</guid>
      <description>&lt;p&gt;Uncomplicated Firewall (UFW) is served as a frontend tool for &lt;em&gt;iptables&lt;/em&gt; in Linux-based machine. It's used to provide easy interface to manage firewall so you don't have to manage them through complicated &lt;em&gt;iptables&lt;/em&gt; and &lt;em&gt;netfilter&lt;/em&gt; commands.&lt;/p&gt;

&lt;p&gt;So today I tried &lt;strong&gt;experimenting&lt;/strong&gt; with UFW and NGINX running as a Docker container on an Amazon Linux 2 EC2 instance in my AWS environment. &lt;/p&gt;

&lt;p&gt;Before this, I launched an Amazon Linux 2 EC2 instance and allowed incoming SSH and TCP port 80 traffic in the security group attached to the instance. &lt;/p&gt;
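
&lt;p&gt;If you prefer the CLI over the console, commands of this shape open those ports (the security-group ID below is a made-up placeholder; substitute your own):&lt;/p&gt;

```shell
# Hypothetical security-group ID; replace with your own.
SG_ID="sg-0123456789abcdef0"

# Allow inbound SSH (22) and HTTP (80) from anywhere:
aws ec2 authorize-security-group-ingress --group-id "$SG_ID" --protocol tcp --port 22 --cidr 0.0.0.0/0
aws ec2 authorize-security-group-ingress --group-id "$SG_ID" --protocol tcp --port 80 --cidr 0.0.0.0/0
```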

&lt;h2&gt;
  
  
  Install UFW
&lt;/h2&gt;

&lt;p&gt;First, ssh into the EC2 instance.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ ssh -i example.pem ec2-user@1.23.45.678


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;By default, UFW is not pre-installed on Amazon Linux 2 and is not available in its default repositories, so we need to enable the EPEL repository on our system first. &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ sudo yum update -y 
$ sudo amazon-linux-extras install epel -y


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then, install UFW by running the command below:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ sudo yum install --enablerepo="epel" ufw -y


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then, enable UFW. This starts the service and also configures it to start at boot time.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ sudo ufw enable


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
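
&lt;p&gt;One caution before this step: once enabled, UFW denies incoming connections by default, so make sure SSH is allowed first or you may lock yourself out of the instance:&lt;/p&gt;

```shell
# Allow SSH before turning the firewall on, then enable it:
sudo ufw allow ssh
sudo ufw enable
```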

&lt;p&gt;Run the command below to ensure UFW is running:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ sudo ufw status


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h2&gt;
  
  
  Install Docker
&lt;/h2&gt;

&lt;p&gt;Next, install the Docker runtime on the instance:&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ sudo amazon-linux-extras install docker
$ sudo service docker start
$ sudo usermod -a -G docker ec2-user


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Confirm Docker is installed; you should see the version as output:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ sudo docker -v

Docker version 20.10.7, build f0df350


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then, run an NGINX container using the command below:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ sudo docker run -d --name my-nginx -p 80:80 nginx


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This command pulls the Docker image from Docker Hub and runs it as a container on the local Docker runtime. Run the command below to see if the &lt;code&gt;my-nginx&lt;/code&gt; container is running:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ sudo docker ps

CONTAINER ID   IMAGE     COMMAND                  CREATED          STATUS          PORTS                               NAMES
c3aee3db60f1   nginx     "/docker-entrypoint.…"   22 seconds ago   Up 21 seconds   0.0.0.0:80-&amp;gt;80/tcp, :::80-&amp;gt;80/tcp   my-nginx


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Next, run a &lt;code&gt;curl&lt;/code&gt; command to see if your website is up, or open the address in your browser.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ curl -L 1.23.45.678:80
&amp;lt;!DOCTYPE html&amp;gt;
&amp;lt;html&amp;gt;
&amp;lt;head&amp;gt;
&amp;lt;title&amp;gt;Welcome to nginx!&amp;lt;/title&amp;gt;
&amp;lt;style&amp;gt;
    body {
        width: 35em;
        margin: 0 auto;
        font-family: Tahoma, Verdana, Arial, sans-serif;
    }
&amp;lt;/style&amp;gt;
&amp;lt;/head&amp;gt;
&amp;lt;body&amp;gt;
&amp;lt;h1&amp;gt;Welcome to nginx!&amp;lt;/h1&amp;gt;
&amp;lt;p&amp;gt;If you see this page, the nginx web server is successfully installed and
working. Further configuration is required.&amp;lt;/p&amp;gt;

&amp;lt;p&amp;gt;For online documentation and support please refer to
&amp;lt;a href="http://nginx.org/"&amp;gt;nginx.org&amp;lt;/a&amp;gt;.&amp;lt;br/&amp;gt;
Commercial support is available at
&amp;lt;a href="http://nginx.com/"&amp;gt;nginx.com&amp;lt;/a&amp;gt;.&amp;lt;/p&amp;gt;

&amp;lt;p&amp;gt;&amp;lt;em&amp;gt;Thank you for using nginx.&amp;lt;/em&amp;gt;&amp;lt;/p&amp;gt;
&amp;lt;/body&amp;gt;
&amp;lt;/html&amp;gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;You will see a beautiful NGINX website in front of you!&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy4t0aphh9goz8z5685jj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy4t0aphh9goz8z5685jj.png" alt="Nginx"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Problem arises
&lt;/h2&gt;

&lt;p&gt;Next, here is something fun: I want to use UFW to deny anyone access to this website!&lt;/p&gt;

&lt;p&gt;Go back to your EC2 instance. Run the command below to add a UFW rule denying incoming requests on port 80:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ sudo ufw deny 80


Rule updated
Rule updated (v6)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Validate that the deny rule is enabled:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ sudo ufw status
Status: active

To                         Action      From
--                         ------      ----
SSH                        ALLOW       Anywhere
224.0.0.251 mDNS           ALLOW       Anywhere
22                         ALLOW       Anywhere
80                         DENY        Anywhere
SSH (v6)                   ALLOW       Anywhere (v6)
ff02::fb mDNS              ALLOW       Anywhere (v6)
22 (v6)                    ALLOW       Anywhere (v6)
80 (v6)                    DENY        Anywhere (v6)



&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now, test from your host machine again. You should no longer be able to see the NGINX website!&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ curl -L 1.23.45.678:80
&amp;lt;!DOCTYPE html&amp;gt;
&amp;lt;html&amp;gt;
&amp;lt;head&amp;gt;
&amp;lt;title&amp;gt;Welcome to nginx!&amp;lt;/title&amp;gt;
&amp;lt;style&amp;gt;
    body {
        width: 35em;
        margin: 0 auto;
        font-family: Tahoma, Verdana, Arial, sans-serif;
    }
&amp;lt;/style&amp;gt;
&amp;lt;/head&amp;gt;
&amp;lt;body&amp;gt;
&amp;lt;h1&amp;gt;Welcome to nginx!&amp;lt;/h1&amp;gt;
&amp;lt;p&amp;gt;If you see this page, the nginx web server is successfully installed and
working. Further configuration is required.&amp;lt;/p&amp;gt;

&amp;lt;p&amp;gt;For online documentation and support please refer to
&amp;lt;a href="http://nginx.org/"&amp;gt;nginx.org&amp;lt;/a&amp;gt;.&amp;lt;br/&amp;gt;
Commercial support is available at
&amp;lt;a href="http://nginx.com/"&amp;gt;nginx.com&amp;lt;/a&amp;gt;.&amp;lt;/p&amp;gt;

&amp;lt;p&amp;gt;&amp;lt;em&amp;gt;Thank you for using nginx.&amp;lt;/em&amp;gt;&amp;lt;/p&amp;gt;
&amp;lt;/body&amp;gt;
&amp;lt;/html&amp;gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Why is this page still here.....&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy4t0aphh9goz8z5685jj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy4t0aphh9goz8z5685jj.png" alt="Nginx-2"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What's the problem and how to fix it
&lt;/h2&gt;

&lt;p&gt;From the &lt;a href="https://docs.docker.com/network/iptables/" rel="noopener noreferrer"&gt;Docker documentation&lt;/a&gt;, I learned that by default, Docker directly manipulates &lt;em&gt;iptables&lt;/em&gt; rules for network isolation and port publishing, so published ports bypass the chains that UFW manages. &lt;/p&gt;
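
&lt;p&gt;You can see this for yourself on the instance: Docker installs its own chains directly into iptables, ahead of the chains UFW manages.&lt;/p&gt;

```shell
# List the rules Docker added for published container ports:
sudo iptables -L DOCKER -n -v
```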

&lt;p&gt;There are several ways to work around this. One is simply not to publish port 80 on all interfaces. &lt;/p&gt;
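
&lt;p&gt;For example, publishing the port only on the loopback interface keeps it unreachable from outside (a sketch; adjust to your own container):&lt;/p&gt;

```shell
# Bind the published port to 127.0.0.1 so Docker's iptables rules
# don't expose it on external interfaces:
sudo docker run -d --name my-nginx -p 127.0.0.1:80:80 nginx
```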

&lt;p&gt;Another way is to disable this Docker behaviour entirely by creating or modifying /etc/docker/daemon.json:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ sudo vi /etc/docker/daemon.json

{ "iptables": false }

$ sudo service docker restart


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;With this setting, we don't have to worry about other ports being opened behind UFW's back when we run additional containers that publish different ports.&lt;/p&gt;

&lt;p&gt;Finally, try to connect again:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

$ curl -L 1.23.45.678:80

curl: (7) Failed to connect to 1.23.45.678 port 80: Operation timed out


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Done!&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;This article may not explain the fundamentals of Linux security well, and you may wonder why you would need UFW at all when there is a security group attached to the EC2 instance and you can configure allow/deny rules easily there. &lt;/p&gt;

&lt;p&gt;Well, it was for experimental purposes! I had never learned about UFW or security-related topics in Linux, and I used to think security groups were the only way to secure my instances. &lt;/p&gt;

&lt;p&gt;Let me know in the comments about your thoughts too! &lt;/p&gt;

&lt;p&gt;Happy coding! 💻 &lt;/p&gt;

</description>
      <category>aws</category>
      <category>docker</category>
      <category>linux</category>
    </item>
    <item>
      <title>The Art of Thinking Clearly - Book Review</title>
      <dc:creator>Andy Lim</dc:creator>
      <pubDate>Mon, 12 Apr 2021 04:18:19 +0000</pubDate>
      <link>https://dev.to/andylim0221/the-art-of-thinking-clearly-book-review-4h0k</link>
      <guid>https://dev.to/andylim0221/the-art-of-thinking-clearly-book-review-4h0k</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Recently I set a goal for myself to read up to 10 pages of a book every day. I know it may sound easy to some people, but for someone who doesn't really like to read, like me, it's like asking someone who doesn't know how to play guitar to play an F chord. It has been a real breakthrough in consistency; although I missed some days (on average one day per week), I managed to maintain the habit, just like coding. I would like to share my review of a book that I read recently, The Art of Thinking Clearly by Rolf Dobelli.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1r4iuk1f2zj87nyycmpt.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1r4iuk1f2zj87nyycmpt.jpeg" alt="16248196"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What's so special about this book
&lt;/h2&gt;

&lt;p&gt;This book covers the topic of cognitive biases, with each cognitive bias given its own chapter; there are 99 chapters in this book. For those who don't know, a cognitive bias is a type of error in thinking that occurs when people are processing and interpreting information in the world around them. It is often a result of your brain's attempt to simplify information processing, and it affects the way we see and think about the world. &lt;/p&gt;

&lt;h2&gt;
  
  
  How does it relate to my career journey
&lt;/h2&gt;

&lt;p&gt;Several cognitive biases caught my eye, and I find them relatable to my career.&lt;/p&gt;

&lt;h3&gt;
  
  
  Swimmer's body illusion
&lt;/h3&gt;

&lt;p&gt;This cognitive bias describes how we confuse traits with results. We think that we can get the body of an Olympic-level swimmer by swimming a lot, but the truth is, they are professional swimmers because of their good physiques. Similarly, female models advertise cosmetics, leading female customers to believe that these products will make them beautiful. But it's not the cosmetics that make these women model-like; those models were simply born attractive. Their bodies are a factor of selection, not the result of their activities.&lt;/p&gt;

&lt;p&gt;I thought that I could become smart by coding a lot, like a professional software developer, but the truth is, they are good because they are smart. Being smart is one of the traits of a good, professional software developer, not the result.&lt;/p&gt;

&lt;h3&gt;
  
  
  Déformation professionnelle
&lt;/h3&gt;

&lt;p&gt;This cognitive bias describes a tendency to look at things from the point of view of one's own profession or special expertise, rather than from a broader, more humane perspective. The book offers a good insight: &lt;strong&gt;&lt;em&gt;If your only tool is a hammer, all your problems will be nails&lt;/em&gt;&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;If you are a frontend developer who focuses only on the frontend and you are brought a backend problem, you will only be able to offer a solution from the frontend perspective, and vice versa. The lesson for me: don't expect an overall best solution from a single specialty. To better equip ourselves, we should at least learn some perspectives from various ends and become T-shaped people. &lt;/p&gt;

&lt;h3&gt;
  
  
  Not Invented-Here (NIH) Syndrome
&lt;/h3&gt;

&lt;p&gt;NIH syndrome causes you to fall in love with your own ideas, and we tend to avoid things that we didn't create ourselves. This is true to the extent that some of us want to reinvent the wheel because we think the things we build are far better than the solutions provided by third parties. I see this as a fallacy that we can easily fall into, especially when we are building our own solutions. When you face this situation, take a step back and examine the quality and drawbacks of the existing solutions first. &lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;It is hard to summarise this book in a sentence, because there are many cognitive biases and each applies to different situations. If you expect this book to give you steps or a roadmap on how to think clearly, well, you may want to look for another book. This book doesn't work that way. It gives you insights into your behaviour and actions when you face any situation. &lt;/p&gt;

&lt;p&gt;Surely some cognitive biases catch your eye too; feel free to share them with me in the comments. &lt;/p&gt;

&lt;p&gt;Happy coding. 💻&lt;/p&gt;

</description>
      <category>books</category>
    </item>
    <item>
      <title>Delete Multiple Versioning-Enabled S3 Buckets by boto3</title>
      <dc:creator>Andy Lim</dc:creator>
      <pubDate>Thu, 01 Apr 2021 04:32:16 +0000</pubDate>
      <link>https://dev.to/andylim0221/delete-multiple-versioning-enabled-s3-buckets-by-boto3-24o6</link>
      <guid>https://dev.to/andylim0221/delete-multiple-versioning-enabled-s3-buckets-by-boto3-24o6</guid>
      <description>&lt;p&gt;Have you ever tried to delete multiple non-empty S3 buckets with versioning-enabled on AWS Console? How irritating it is to go through each of every S3 bucket? To delete the bucket on AWS console, you have to:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;delete every object in the bucket&lt;/li&gt;
&lt;li&gt;empty the bucket by typing the bucket name&lt;/li&gt;
&lt;li&gt;delete the bucket by typing 'delete'&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;That's two confirmations per bucket. Say you have up to 20 buckets to delete: that's up to 40 confirmations, going through every single bucket, which is absolute hell. Hence, I chose to do it programmatically. In this post, I will walk you through writing a simple script to delete multiple buckets.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisite
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;AWS CLI installed. Download it from &lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-install.html" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;AWS credentials configured in the AWS CLI. Refer to &lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html" rel="noopener noreferrer"&gt;this document&lt;/a&gt; to learn how to configure the AWS CLI. Make sure you have access to list buckets, delete versioned objects in the buckets, and delete the buckets themselves.&lt;/li&gt;
&lt;li&gt;Git installed and enabled on your local machine. &lt;/li&gt;
&lt;li&gt;pip installed. pip is a package manager for Python, just like npm for NodeJS. Refer to &lt;a href="https://pip.pypa.io/en/stable/reference/pip_install/" rel="noopener noreferrer"&gt;this document&lt;/a&gt; to install pip.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Note that in this tutorial, I am using my administrator account. It is always recommended to grant users only least-privilege permissions.&lt;/p&gt;

&lt;h2&gt;
  
  
  Walkthrough
&lt;/h2&gt;

&lt;p&gt;In this project, we will be using:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;boto3: AWS SDK in Python&lt;/li&gt;
&lt;li&gt;inquirer: eases the process of asking end users questions, parsing and validating answers, managing hierarchical prompts, and providing error feedback&lt;/li&gt;
&lt;/ul&gt;
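
&lt;p&gt;Under the hood, the core deletion logic that boto3 enables is small. Here is a minimal sketch (the function names are mine for illustration, not necessarily those used in the repository):&lt;/p&gt;

```python
def chunk_keys(keys, size=1000):
    # S3's DeleteObjects API accepts at most 1,000 keys per request,
    # so larger listings must be deleted in batches.
    return [keys[i:i + size] for i in range(0, len(keys), size)]


def delete_versioned_bucket(bucket_name, profile_name=None):
    """Delete every object version and delete marker, then the bucket itself."""
    import boto3  # imported here so the pure helper above works without boto3 installed

    session = boto3.Session(profile_name=profile_name)
    bucket = session.resource("s3").Bucket(bucket_name)
    bucket.object_versions.delete()  # removes all versions and delete markers
    bucket.delete()
```

A versioning-enabled bucket cannot be deleted while any object versions or delete markers remain, which is why the script must clear them first.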

&lt;ol&gt;
&lt;li&gt;Clone the repository and go to the s3 directory in your terminal.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone https://github.com/andylim0221/aws_scripts.git

&lt;span class="nb"&gt;cd &lt;/span&gt;aws_scripts/s3
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Install the necessary packages such as boto3 and inquirer.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-r&lt;/span&gt; requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Run the script with your profile or default profile.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;python&lt;/span&gt; &lt;span class="n"&gt;delete_bucket&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;py&lt;/span&gt; &lt;span class="c1"&gt;# default profile
&lt;/span&gt;
&lt;span class="c1"&gt;# or with profile any
&lt;/span&gt;&lt;span class="n"&gt;python&lt;/span&gt; &lt;span class="n"&gt;delete_bucket&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;py&lt;/span&gt; &lt;span class="o"&gt;--&lt;/span&gt;&lt;span class="n"&gt;profile&lt;/span&gt; &lt;span class="nb"&gt;any&lt;/span&gt; 

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And you will see a list similar to the one below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxp3iqotu1gjro5qqhpjy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxp3iqotu1gjro5qqhpjy.png" alt="ScreenShot1"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can select the buckets you want to delete by pressing the &lt;code&gt;space&lt;/code&gt; bar, and navigate with the &lt;code&gt;up arrow&lt;/code&gt; and &lt;code&gt;down arrow&lt;/code&gt; keys. The selected options are shown in yellow. Pressing the &lt;code&gt;space&lt;/code&gt; bar again on a selected bucket removes it from the selection.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzoqnxchrrlxfa8v09sls.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzoqnxchrrlxfa8v09sls.png" alt="Screenshot2"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you have finished selecting, press the &lt;code&gt;Enter&lt;/code&gt; key and go to the next step.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5io3xxsknfrqleqh3g9m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5io3xxsknfrqleqh3g9m.png" alt="ScreenShot3"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You will see a question asking you to confirm deleting the buckets you chose earlier. Typing &lt;code&gt;delete&lt;/code&gt; confirms the action, and the script will proceed to delete the buckets. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzq0qp65ae4j4bjzglyu5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzq0qp65ae4j4bjzglyu5.png" alt="ScreenShot4"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And now you just have to wait for the deletion to finish; in the meanwhile, have a good cup of coffee.&lt;/p&gt;

&lt;p&gt;When the script is done, list your buckets to check whether the selected ones were deleted.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws s3 &lt;span class="nb"&gt;ls&lt;/span&gt; &lt;span class="c"&gt;#list all buckets&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note that if you want to cancel the action or reselect the buckets, you have to kill the script with &lt;code&gt;Ctrl-C&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Troubleshooting
&lt;/h2&gt;

&lt;p&gt;When you list the buckets, the deleted buckets sometimes still appear; the reason is that the list is not updated immediately. If you check your AWS Console, you will see the deleted buckets shown as below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1sq1oboqw6xf3djq2g8t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1sq1oboqw6xf3djq2g8t.png" alt="ScreenShot5"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Give it up to 24 hours and the buckets should disappear from the list. &lt;/p&gt;
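
&lt;p&gt;If you want a direct check instead of waiting on the list, you can probe a single bucket; a 404 error means it is really gone (the bucket name below is a placeholder):&lt;/p&gt;

```shell
# Returns a 404 Not Found error once the bucket no longer exists:
aws s3api head-bucket --bucket example-bucket
```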

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;With this one simple script, I can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;remove objects&lt;/li&gt;
&lt;li&gt;remove versioned objects&lt;/li&gt;
&lt;li&gt;empty buckets&lt;/li&gt;
&lt;li&gt;delete buckets&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Of course this can be improved in terms of security or efficiency, but it does the job well. Let me know in the comments if you find bugs or issues in this solution. &lt;/p&gt;

&lt;p&gt;Happy coding! 💻&lt;/p&gt;

</description>
      <category>aws</category>
      <category>python</category>
      <category>devops</category>
    </item>
    <item>
      <title>AWS CloudFormation and Github Action</title>
      <dc:creator>Andy Lim</dc:creator>
      <pubDate>Sun, 14 Mar 2021 08:09:54 +0000</pubDate>
      <link>https://dev.to/andylim0221/aws-cloudformation-and-github-action-2ck0</link>
      <guid>https://dev.to/andylim0221/aws-cloudformation-and-github-action-2ck0</guid>
      <description>&lt;h2&gt;
  
  
  Overview
&lt;/h2&gt;

&lt;p&gt;The focus of this article is to introduce the integration of AWS CloudFormation and GitHub Actions.&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;GitHub Actions&lt;/strong&gt; is a CI/CD service provided by GitHub, and it has a free tier. With GitHub Actions, you can run your CI/CD jobs on a self-hosted server or on GitHub-hosted runners. It's user friendly, and there are multiple plugins available in the GitHub Actions Marketplace, which reduce development time and help you avoid reinventing the wheel.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS CloudFormation&lt;/strong&gt; is a service that gives developers an easy way to automate infrastructure provisioning using Infrastructure as Code (IaC). What's special about Infrastructure as Code is that, because the infrastructure components are written in code, we can run tests against them.&lt;/p&gt;

&lt;p&gt;In this demo, we will integrate AWS CloudFormation with GitHub Actions and do the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Any branch other than main:

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Linting&lt;/strong&gt;: Lint the CloudFormation templates using &lt;code&gt;cfn-lint&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security check&lt;/strong&gt;: Check the security of each component in the CloudFormation templates using &lt;code&gt;cfn-nag&lt;/code&gt; to validate whether they adhere to AWS security best practices.&lt;/li&gt;
&lt;/ol&gt;
&lt;/li&gt;
&lt;li&gt;Main branch (protected):

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Deploy&lt;/strong&gt;: Deploy the CloudFormation template to S3.&lt;/li&gt;
&lt;/ol&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Prerequisite
&lt;/h2&gt;

&lt;p&gt;You will need the following to run this in your own projects:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;AWS credentials with programmatic access; you will need your AWS Access Key and AWS Secret Key&lt;/li&gt;
&lt;li&gt;An AWS S3 bucket with a bucket policy that allows put actions and ACLs; you will need the S3 bucket name, which must be globally unique&lt;/li&gt;
&lt;li&gt;A GitHub account&lt;/li&gt;
&lt;/ol&gt;
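
&lt;p&gt;For the second prerequisite, a bucket policy of roughly this shape works (the account ID, user, and bucket name below are placeholders, not values from this project):&lt;/p&gt;

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowWorkflowUploads",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:user/deploy-user" },
      "Action": ["s3:PutObject", "s3:PutObjectAcl"],
      "Resource": "arn:aws:s3:::example-deploy-bucket/*"
    }
  ]
}
```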

&lt;h2&gt;
  
  
  Project Structure
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;.&lt;/span&gt;
├── .github
│   └── workflows
│       ├── master.yml
│       └── test.yml
└── cloudformation
    ├── conformance-pack.yaml
    └── example.yaml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For this project demo, I wrote two CloudFormation templates:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;code&gt;conformance-pack.yaml&lt;/code&gt; to deploy a set of AWS Config rules and remediation actions.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;example.yaml&lt;/code&gt; to deploy a simple S3 bucket with an SNS trigger&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Nevertheless, the main focus is on the GitHub Actions workflows. Let's take a look at what they do. &lt;/p&gt;

&lt;h3&gt;
  
  
  Github Actions
&lt;/h3&gt;

&lt;p&gt;As you can see from the project structure, there are two workflow files, &lt;code&gt;master.yml&lt;/code&gt; and &lt;code&gt;test.yml&lt;/code&gt;, under the &lt;code&gt;.github/workflows&lt;/code&gt; directory. GitHub Actions uses YAML syntax to define the events, jobs, and steps, and these files are stored in a directory called .github/workflows. You need to create this directory so that whenever you push a commit to your repository, GitHub Actions runs automatically. &lt;/p&gt;

&lt;p&gt;In this directory, create two files, &lt;code&gt;master.yml&lt;/code&gt; and &lt;code&gt;test.yml&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="c1"&gt;# test.yaml&lt;/span&gt;

&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Testing&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Github&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Actions'&lt;/span&gt;


&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="nv"&gt;push&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;

&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;Cloudformation-checker&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Check linting and security concerns&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Checkout&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v2&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;cfn-lint-action&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ScottBrenner/cfn-lint-action@1.6.1&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;args&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;cloudformation/**/*.yaml"&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;cfn-nag-action&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;minchao/cfn-nag-action@v0.1&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;args&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;--input-path&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;cloudformation/&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This Github Action is triggered whenever a commit is pushed to the repository, on any branch. It validates the template syntax using &lt;code&gt;cfn-lint-action&lt;/code&gt; and, if that check passes, runs a security check using &lt;code&gt;cfn-nag-action&lt;/code&gt;. What's good about this Github Action is that all of these steps are ready-made plugins from the Github Action Marketplace, created and maintained by the Github community. For this project, I'm using &lt;code&gt;ScottBrenner/cfn-lint-action@1.6.1&lt;/code&gt; for &lt;code&gt;cfn-lint-action&lt;/code&gt; and&lt;br&gt;
&lt;code&gt;minchao/cfn-nag-action@v0.1&lt;/code&gt; for &lt;code&gt;cfn-nag-action&lt;/code&gt;. There are several similar plugins in the Marketplace; please make sure you read their &lt;code&gt;README.md&lt;/code&gt; in order to enable the actions correctly.&lt;br&gt;
&lt;/p&gt;
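&lt;p&gt;For reference, the same two checks can be run locally before pushing, which is handy for debugging workflow failures. This is a sketch assuming you have Python available for &lt;code&gt;cfn-lint&lt;/code&gt; and Ruby available for &lt;code&gt;cfn-nag&lt;/code&gt;:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Install the tools the two actions wrap
pip install cfn-lint
gem install cfn-nag

# Lint the template syntax (same args as cfn-lint-action)
cfn-lint cloudformation/**/*.yaml

# Run the security checks (same args as cfn-nag-action)
cfn_nag_scan --input-path cloudformation/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;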

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;S3Deploy&lt;/span&gt;

&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; 
  &lt;span class="na"&gt;push&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;main&lt;/span&gt;
  &lt;span class="na"&gt;pull_request&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;main&lt;/span&gt;

&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;build&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;

    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Checkout"&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v2&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;S3 Sync&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;jakejarvis/s3-sync-action@v0.5.1&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;args&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;--acl bucket-owner-read --follow-symlinks&lt;/span&gt;
        &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;AWS_ACCESS_KEY_ID&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{secrets.AWS_ACCESS_KEY}}&lt;/span&gt;
          &lt;span class="na"&gt;AWS_SECRET_ACCESS_KEY&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{secrets.AWS_SECRET_KEY}}&lt;/span&gt;
          &lt;span class="na"&gt;AWS_S3_BUCKET&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{secrets.AWS_BUCKET}}&lt;/span&gt;
          &lt;span class="na"&gt;SOURCE_DIR&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;./cloudformation'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Lastly, this Github Action copies the files in the &lt;code&gt;cloudformation&lt;/code&gt; directory to the S3 bucket whenever a commit is pushed or a pull request is made to the main branch. As you can see from the &lt;code&gt;env&lt;/code&gt; section, it relies on several repository secrets (AWS_ACCESS_KEY, AWS_SECRET_KEY, and AWS_BUCKET), which are mapped to the environment variables the action expects. Please refer to &lt;a href="https://docs.github.com/en/actions/reference/encrypted-secrets"&gt;this documentation site&lt;/a&gt; to learn how to add secrets to your Github Actions.&lt;/p&gt;
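&lt;p&gt;As a side note, if you prefer the command line over the repository settings page, the Github CLI can set these secrets too (the values below are placeholders):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Store the three secrets used by the workflow above
gh secret set AWS_ACCESS_KEY --body "AKIA..."
gh secret set AWS_SECRET_KEY --body "your-secret-key"
gh secret set AWS_BUCKET --body "my-template-bucket"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;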

&lt;p&gt;What's good about this is that you don't have to manually push the templates to the S3 bucket by running multiple commands or browsing the AWS Management Console. &lt;/p&gt;
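&lt;p&gt;Under the hood, &lt;code&gt;jakejarvis/s3-sync-action&lt;/code&gt; wraps the AWS CLI, so the manual equivalent would look roughly like this (the bucket name is a placeholder):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Manual equivalent of the S3 Sync step above
aws s3 sync ./cloudformation s3://my-template-bucket \
    --acl bucket-owner-read --follow-symlinks
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;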

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;Throughout this simple project, you have learned how to run a simple CI/CD pipeline that performs template syntax checks and security checks on CloudFormation templates and then pushes them into a dedicated S3 bucket. Of course, there is room for improvement: for example, automatically provisioning the infrastructure into different environments after a file is pushed to the S3 bucket. Feel free to extend from here, and I hope to see more on CI/CD for Infrastructure as Code.&lt;/p&gt;

&lt;h3&gt;
  
  
  Reference
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.github.com/en/actions/learn-github-actions"&gt;Github Action documentaions&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.github.com/en/actions/reference/encrypted-secrets"&gt;Github Action secrets&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/marketplace?type=actions"&gt;Github Action Marketplace&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/cloudformation/"&gt;CloudFormation&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
      <category>github</category>
    </item>
  </channel>
</rss>
