<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Alok Kumar</title>
    <description>The latest articles on DEV Community by Alok Kumar (@jhawithu).</description>
    <link>https://dev.to/jhawithu</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F958778%2Fb3baf2c7-f690-4aa5-b7d0-ecad2e67ef4d.jpeg</url>
      <title>DEV Community: Alok Kumar</title>
      <link>https://dev.to/jhawithu</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/jhawithu"/>
    <language>en</language>
    <item>
      <title>Agile and DevOps</title>
      <dc:creator>Alok Kumar</dc:creator>
      <pubDate>Sat, 25 Nov 2023 04:51:12 +0000</pubDate>
      <link>https://dev.to/aws-builders/agile-and-devops-47nd</link>
      <guid>https://dev.to/aws-builders/agile-and-devops-47nd</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzb3bq5lmcxow9183bzds.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzb3bq5lmcxow9183bzds.jpeg" alt="Image description" width="800" height="371"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Agile and DevOps
&lt;/h2&gt;

&lt;p&gt;Agile and DevOps are easiest to understand through a simple example.&lt;/p&gt;

&lt;p&gt;Imagine you're planning a dinner party with friends. Instead of deciding everything at once, you might plan it step by step, being open to changes along the way.&lt;/p&gt;

&lt;p&gt;You decide on the basics first - date, guest list, and location. This is like Agile's "sprints" where you focus on small parts of a bigger project.&lt;/p&gt;

&lt;p&gt;As you plan, you're open to adjustments. Just like Agile encourages adapting plans as you go, maybe a friend can't make it or you find a better location.&lt;/p&gt;

&lt;p&gt;During the party, you might notice some things could be better, like the music or food choice. Agile works similarly; after each sprint (or phase), you get feedback and improve for the next one.&lt;/p&gt;

&lt;p&gt;Agile is like planning a gathering step by step, staying flexible, and adapting as needed. DevOps is more like a well-coordinated kitchen where teams collaborate, use efficient tools, and continuously improve their process. &lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Test Automation and Shift-Left&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Test automation and shift-left are two of the strongest pillars of the Agile and DevOps methodologies.&lt;/p&gt;

&lt;p&gt;We can imagine test automation as a robot assistant that does our work faster than we could ourselves, saving a great deal of time and effort, while shift-left is planning ahead and gathering supplies early to reduce risk. &lt;/p&gt;

&lt;p&gt;Both actually aim to streamline processes, improve quality, and reduce risks in their respective domains.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Serverless&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Serverless is just like laundry service 😀 &lt;br&gt;
Where you drop your clothes without worrying about the washing machine, detergent, or timing. 😎 &lt;/p&gt;

&lt;p&gt;The laundry service (or the cloud provider, in DevOps terms) manages the infrastructure and scales it accordingly.&lt;br&gt;
If your laundry load increases, the service handles it without you worrying about machine capacity. &lt;/p&gt;

&lt;p&gt;Serverless automatically scales based on demand, handling varying workloads seamlessly.&lt;/p&gt;

&lt;p&gt;It simplifies tasks and allows users to focus more on their core activities without worrying about underlying processes.&lt;/p&gt;

</description>
      <category>agile</category>
      <category>aws</category>
      <category>devops</category>
      <category>programming</category>
    </item>
    <item>
      <title>Certified Kubernetes Administrator (CKA) - Several Successful Story</title>
      <dc:creator>Alok Kumar</dc:creator>
      <pubDate>Sat, 18 Nov 2023 05:10:35 +0000</pubDate>
      <link>https://dev.to/aws-builders/certified-kubernetes-administrator-cka-several-successful-story-41f0</link>
      <guid>https://dev.to/aws-builders/certified-kubernetes-administrator-cka-several-successful-story-41f0</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9-m3fcTf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rckyvbckoadf94u8h45x.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9-m3fcTf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rckyvbckoadf94u8h45x.JPG" alt="Image description" width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--oX4KcIB2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gynkdys5fe3vdj9gz9x7.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--oX4KcIB2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gynkdys5fe3vdj9gz9x7.JPG" alt="Image description" width="800" height="450"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jNgJ0AZl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/c50l6dnv9p4iwzl6tf00.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jNgJ0AZl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/c50l6dnv9p4iwzl6tf00.JPG" alt="Image description" width="800" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Are you ready to conquer the Kubernetes CKA (Certified Kubernetes Administrator) Exam?&lt;/p&gt;

&lt;p&gt;Introducing an exciting opportunity: my brand-new CKA Exam Questions Challenge Course, exclusively available on &lt;a href="https://www.udemy.com/course/certified-kubernetes-administrator-cka-lab-course/?couponCode=CKA-NOV-2"&gt;Udemy&lt;/a&gt;!&lt;/p&gt;

&lt;p&gt;This dynamic question challenge set is meticulously crafted to help you achieve remarkable scores in the CKA Exam.&lt;/p&gt;

&lt;p&gt;Each question is thoughtfully designed with varying levels of complexity, ensuring you're well-prepared before sitting for the exam.&lt;/p&gt;

&lt;p&gt;But wait, there's more! Our comprehensive video series is a must-watch, providing you with invaluable insights and guidance to tackle every question effectively.&lt;/p&gt;

&lt;p&gt;Still skeptical? Rest assured that over 1,000 individuals have already received notifications celebrating their successful completion of the CKA Exam, all thanks to our sample question sets.&lt;/p&gt;

&lt;p&gt;Some friends tagged me on LinkedIn after passing the CKA certification with the help of my Udemy course (you could be the next one :)); a few of their posts are linked below for reference:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7130758885799247872/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7127885010463846400/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7125145789059805185/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7120104625663320064/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7014130574432890880/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7005599330849955840/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7003013821833330688/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7002167131349925888/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7042381410388615168/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7040156541072654336/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7024419197044629504/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7046114059624013824/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7045756388374966272/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7050634770531135488/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7048354948291399680/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7084374694405033984/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7071203195460407296/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7091971533207715840/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7086764583595503616/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7113089648091922433/"&gt;CKA Tagging&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:7108042720786456577/"&gt;CKA Tagging&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;and many more ....&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;But we don't stop there. Each question comes with a complete lab, allowing you to gain hands-on experience and solidify your understanding of Kubernetes.&lt;/p&gt;

&lt;p&gt;Embark on your path to mastery and unlock the boundless potential of Kubernetes! &lt;/p&gt;

&lt;p&gt;Enroll now in the Kubernetes CKA Exam Questions &lt;a href="https://www.udemy.com/course/certified-kubernetes-administrator-cka-lab-course/?couponCode=CKA-NOV-2"&gt;Udemy Course&lt;/a&gt; using the heavily discounted link.&lt;/p&gt;

&lt;p&gt;#kubernetes #CKA #DevOps #CKAD #learningeveryday #AWS #Cloud #devopscommunity #cloudcomputing&lt;/p&gt;

</description>
      <category>kubernetes</category>
      <category>cka</category>
      <category>certification</category>
      <category>devops</category>
    </item>
    <item>
      <title>Terraform - Interview Question Series</title>
      <dc:creator>Alok Kumar</dc:creator>
      <pubDate>Tue, 14 Nov 2023 13:22:22 +0000</pubDate>
      <link>https://dev.to/aws-builders/terraform-interview-question-series-1enh</link>
      <guid>https://dev.to/aws-builders/terraform-interview-question-series-1enh</guid>
      <description>&lt;h2&gt;
  
  
  Please find the list of interview questions on Terraform
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Question-1: What is Terraform and what is it used for?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Answer: Terraform is an open-source infrastructure as code (IaC) software tool used for provisioning and managing cloud infrastructure, on-premises infrastructure, and other infrastructure resources in a consistent and efficient manner.&lt;/p&gt;
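&lt;p&gt;As a quick illustrative sketch (not part of the original answer), a minimal Terraform configuration declares a provider and a resource; the region and bucket name below are placeholder values:&lt;/p&gt;

```hcl
# Hypothetical minimal configuration: one provider, one resource.
terraform {
  required_providers {
    aws = {
      source = "hashicorp/aws"
    }
  }
}

provider "aws" {
  region = "us-east-1" # placeholder region
}

resource "aws_s3_bucket" "example" {
  bucket = "my-example-bucket-name" # placeholder bucket name
}
```

&lt;p&gt;Running &lt;code&gt;terraform init&lt;/code&gt; and then &lt;code&gt;terraform apply&lt;/code&gt; against this file creates the bucket and records it in the state file.&lt;/p&gt;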

&lt;p&gt;&lt;strong&gt;Question-2: What are the key features of Terraform?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Answer: Declarative Configuration: Terraform uses a declarative syntax to describe infrastructure resources, making it easier to manage complex infrastructure and ensuring consistency.&lt;/p&gt;

&lt;p&gt;Resource Management: Terraform provides a unified way to manage multiple resources, such as virtual machines, DNS entries, databases, and more.&lt;/p&gt;

&lt;p&gt;Versioning and History: Terraform maintains a history of changes and versions, making it easier to collaborate and roll back changes if necessary.&lt;/p&gt;

&lt;p&gt;Multi-Cloud Support: Terraform supports multiple cloud providers, including Amazon Web Services (AWS), Google Cloud Platform (GCP), Microsoft Azure, and more.&lt;/p&gt;

&lt;p&gt;Modular Design: Terraform allows users to define reusable components, making it easier to manage complex infrastructure and promoting consistency and organization.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;...&lt;/strong&gt;   &lt;a href="https://devdivetech.blogspot.com/search/label/Terraform-Interview"&gt;Visit my blog on Interview Question Series&lt;/a&gt; &lt;strong&gt;...&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Question-59: What is the difference between Terraform and other infrastructure as code tools such as Ansible and Chef?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Answer: Terraform, Ansible, and Chef are all popular tools for infrastructure as code, but they have different focuses and use cases.&lt;/p&gt;

&lt;p&gt;Terraform focuses on provisioning and managing infrastructure, providing a high-level description of the desired state of your infrastructure, and automating the process of creating and updating infrastructure to match that desired state. Terraform is best suited for tasks such as creating and managing cloud infrastructure, networking, and storage.&lt;/p&gt;

&lt;p&gt;Ansible, on the other hand, is focused on configuration management and deployment, providing a way to automate the deployment and configuration of software and applications. Ansible is best suited for tasks such as deploying and configuring applications and services, and managing the configuration of servers and other infrastructure components.&lt;/p&gt;

&lt;p&gt;Chef is another popular tool for infrastructure as code and configuration management, providing a way to automate the deployment and configuration of software and applications. Chef provides a more extensive and flexible automation framework, but requires a more significant learning curve and investment in terms of time and resources to use effectively.&lt;/p&gt;

&lt;p&gt;In short, Terraform, Ansible, and Chef are all valuable tools for infrastructure as code, but have different focuses and use cases, and can be used together in a complementary fashion to manage infrastructure and applications more effectively.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Question-60: How does Terraform handle rollbacks in case of failures during infrastructure changes?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Answer: Terraform provides a "plan and apply" approach, which makes it easy to preview and control changes to your infrastructure before they are actually made. This helps to reduce the risk of failures during infrastructure changes.&lt;/p&gt;

&lt;p&gt;In the event of a failure during an infrastructure change, Terraform provides several mechanisms for rolling back changes:&lt;/p&gt;

&lt;p&gt;Terraform state: The Terraform state file keeps track of the current state of your infrastructure, and can be used to revert changes in the event of a failure.&lt;/p&gt;

&lt;p&gt;Terraform destroy: The Terraform destroy command can be used to revert changes made by a Terraform apply, removing the resources that were created.&lt;/p&gt;

&lt;p&gt;Terraform taint: The Terraform taint command can be used to mark a specific resource as "tainted", which indicates to Terraform that the resource should be destroyed and recreated the next time Terraform apply is run.&lt;/p&gt;
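&lt;p&gt;The three rollback mechanisms above can be sketched with standard Terraform CLI commands (the resource address is a hypothetical placeholder):&lt;/p&gt;

```
# Preview changes before they are made
terraform plan

# Apply the change, then roll it back by destroying what was created
terraform apply
terraform destroy

# Mark one resource for destroy-and-recreate on the next apply
# ("aws_instance.web" is a placeholder resource address)
terraform taint aws_instance.web
terraform apply
```

&lt;p&gt;Note that newer Terraform releases also offer &lt;code&gt;terraform apply -replace=ADDRESS&lt;/code&gt; as an alternative to &lt;code&gt;taint&lt;/code&gt; for the same recreate workflow.&lt;/p&gt;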

&lt;p&gt;In addition, Terraform provides state management features, such as state backup and state import, which make it easier to manage and maintain the state file over time, and to revert changes in the event of a failure.&lt;/p&gt;

&lt;p&gt;In short, Terraform provides a flexible and comprehensive mechanism for handling rollbacks in case of failures during infrastructure changes, making it easier to ensure the stability and reliability of your infrastructure over time.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;...&lt;/strong&gt;   &lt;a href="https://devdivetech.blogspot.com/search/label/Terraform-Interview"&gt;Visit my blog on Interview Question Series&lt;/a&gt; &lt;strong&gt;...&lt;/strong&gt; &lt;/p&gt;

</description>
      <category>aws</category>
      <category>terraform</category>
      <category>interview</category>
      <category>devops</category>
    </item>
    <item>
      <title>YouTube 13k Subscriber crossed! Exciting Giveaway</title>
      <dc:creator>Alok Kumar</dc:creator>
      <pubDate>Sat, 04 Nov 2023 06:27:52 +0000</pubDate>
      <link>https://dev.to/aws-builders/youtube-13k-subscriber-crossed-exciting-giveaway-3gna</link>
      <guid>https://dev.to/aws-builders/youtube-13k-subscriber-crossed-exciting-giveaway-3gna</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gVTC5jyI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8l91iincv3whky45pnmx.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gVTC5jyI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8l91iincv3whky45pnmx.JPG" alt="Image description" width="340" height="185"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Dear Friends,&lt;/p&gt;

&lt;p&gt;My YouTube channel, &lt;a href="https://www.youtube.com/@AlokKumar"&gt;@alokkumar&lt;/a&gt;, has just reached a remarkable milestone of 13,000 subscribers. &lt;/p&gt;

&lt;p&gt;To celebrate this incredible achievement, I'm thrilled to announce a special giveaway. &lt;/p&gt;

&lt;p&gt;I'll be offering a generous gift of 100 free enrollments for my highly acclaimed Udemy course, "Certified Kubernetes Administrator (CKA): 100% Lab Course," exclusively to those candidates who have scheduled their CKA exam on or before January 31, 2024. &lt;/p&gt;

&lt;p&gt;My best wishes and good luck to all of you amazing individuals aiming for CKA success! 🚀🎉&lt;/p&gt;

&lt;p&gt;Please DM me on LinkedIn and provide me your CKA exam enrollment acknowledgement receipt.&lt;/p&gt;

&lt;p&gt;Please share it with your friends and repost this post.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.linkedin.com/in/alok-kumar-devops/"&gt;LinkedIn&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.youtube.com/@AlokKumar"&gt;YouTube&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.udemy.com/course/certified-kubernetes-administrator-cka-lab-course/"&gt;CKA Udemy Course Link&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Thanks&lt;/p&gt;

&lt;p&gt;#kubernetes #cka #aws #devopscommunity #giveaway #keeplearning #youtube #udemycoupon #udemyfree #udemy&lt;/p&gt;

</description>
      <category>kubernetes</category>
      <category>devops</category>
      <category>cka</category>
      <category>aws</category>
    </item>
    <item>
      <title>DevOps Interview Bootcamp: Sharpen Your Q&amp;A Skills - 900 Q&amp;A</title>
      <dc:creator>Alok Kumar</dc:creator>
      <pubDate>Wed, 01 Nov 2023 15:04:46 +0000</pubDate>
      <link>https://dev.to/aws-builders/devops-interview-bootcamp-sharpen-your-qa-skills-900-qa-34a4</link>
      <guid>https://dev.to/aws-builders/devops-interview-bootcamp-sharpen-your-qa-skills-900-qa-34a4</guid>
      <description>&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzz1ka1smjtvsdex96no0.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzz1ka1smjtvsdex96no0.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Dear Friends,&lt;/p&gt;

&lt;p&gt;Welcome to our exciting 2nd course on Udemy "DevOps Interview Bootcamp: Sharpen Your Q&amp;amp;A Skills - 900 Q&amp;amp;A".&lt;/p&gt;

&lt;p&gt;Are you tired of struggling with technical questions during DevOps interviews? Do you want to gain confidence and master the art of answering difficult questions? Look no further! Our comprehensive course is designed to help you excel in your DevOps interviews and land your dream job using MCQ question set with 900 Questions.&lt;/p&gt;

&lt;p&gt;Our expert instructors have crafted a detailed question and answer set for you to practice with. You'll learn how to interpret each question, understand the underlying concepts, and formulate the correct response. You'll also receive valuable feedback on your answers, helping you to identify and correct mistakes.&lt;/p&gt;

&lt;p&gt;This course is not just about memorizing answers. It's about developing a deep understanding of DevOps concepts and learning how to apply them to real-world situations. With our course, you'll be able to confidently tackle even the most challenging technical questions that may come your way during an interview.&lt;/p&gt;

&lt;p&gt;Our unique approach to DevOps interview preparation ensures that you will not only be well-prepared for your interviews but also excel in your role as a DevOps professional. Don't miss out on this opportunity to take your career to the next level. Enroll now and start your journey to DevOps Interview Mastery!&lt;/p&gt;

&lt;p&gt;Questions Covered From:&lt;/p&gt;

&lt;p&gt;Jenkins&lt;br&gt;
Docker&lt;br&gt;
Kubernetes&lt;br&gt;
Terraform&lt;br&gt;
Ansible&lt;br&gt;
Git&lt;br&gt;
AWS&lt;br&gt;
Others&lt;/p&gt;

&lt;p&gt;Enroll now in the Udemy course using the link below with a heavy-discount coupon:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.udemy.com/course/devops-interview-bootcamp-question-practice/?couponCode=DEVOPS-OCT-1" rel="noopener noreferrer"&gt;DevOps Interview Bootcamp: Sharpen Your Q&amp;amp;A Skills - 900 Q&amp;amp;A&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;#kubernetes #DevOps #learningeveryday #AWS #Cloud #devopscommunity #cloudcomputing #Jenkins #Docker #Terraform #Ansible #Git&lt;/p&gt;

</description>
      <category>devops</category>
      <category>kubernetes</category>
      <category>terraform</category>
      <category>aws</category>
    </item>
    <item>
      <title>'Kubernetes Complete Course In 10 Hours' – your ultimate beginner's guide to Kubernetes FREE!</title>
      <dc:creator>Alok Kumar</dc:creator>
      <pubDate>Fri, 06 Oct 2023 02:29:38 +0000</pubDate>
      <link>https://dev.to/aws-builders/kubernetes-complete-course-in-10-hours-your-ultimate-beginners-guide-to-kubernetes-free-58cb</link>
      <guid>https://dev.to/aws-builders/kubernetes-complete-course-in-10-hours-your-ultimate-beginners-guide-to-kubernetes-free-58cb</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0Mayt2xJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mw75jyg1zba43ucy2571.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0Mayt2xJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mw75jyg1zba43ucy2571.jpg" alt="Image description" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Dear Friends,&lt;/p&gt;

&lt;p&gt;The long-awaited moment has finally arrived! Introducing the 'Kubernetes Complete Course In 10 Hours' – your ultimate beginner's guide to Kubernetes.&lt;/p&gt;

&lt;p&gt;Let's spread the word! Share this video link far and wide to help those in need and build a vibrant learning community together.&lt;/p&gt;

&lt;p&gt;Kubernetes, often abbreviated as K8s, is like a super-smart manager for your computer programs. Imagine you have lots of applications and want to make sure they run smoothly, scale when needed, and recover from problems automatically. &lt;/p&gt;

&lt;p&gt;Kubernetes helps you do that by organizing and managing your applications, like a traffic cop for your software, making sure everything runs as it should on a cluster of computers or servers. &lt;/p&gt;

&lt;p&gt;It's like having a team of experts to take care of your software, so you don't have to worry about it constantly.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Which Kubernetes topics are we going to cover in this course?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Kubernetes Cluster Architecture &lt;br&gt;
EKS Cluster Setup &lt;br&gt;
Simple Pod&lt;br&gt;
ReplicaSet &lt;br&gt;
Deployments&lt;br&gt;
DaemonSet &lt;br&gt;
Label and Selector &lt;br&gt;
Static Pod&lt;br&gt;
Graceful and Forceful Termination of Resource&lt;br&gt;
NodeName and NodeSelector &lt;br&gt;
NodeAffinity&lt;br&gt;
Taint and Tolerations&lt;br&gt;
Namespaces &lt;br&gt;
Service Part-1 &lt;br&gt;
Service Part-2 &lt;br&gt;
Init Containers &lt;br&gt;
Multiple-Scheduler &lt;br&gt;
ConfigMaps &lt;br&gt;
Secrets &lt;br&gt;
Persistent Volume &amp;amp; Claim &lt;br&gt;
Network Policy &lt;br&gt;
ETCD Database backup and restore &lt;br&gt;
Cluster Upgrade&lt;br&gt;
EKS Cluster Scaling&lt;br&gt;
CKA Exam Tips&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;About Course&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;"We designed this course to be 80% hands-on labs and only 20% theory. This way, you can get the most out of it, even if you don't know anything about Kubernetes yet. All you need is a bit of familiarity with Docker, and you'll be good to go!"&lt;/p&gt;

&lt;p&gt;#Kubernetes #CKA #DevOps #CKAD #learningeveryday #AWS #Cloud #devopscommunity #cloudcomputing&lt;/p&gt;

&lt;p&gt;Don’t Forget To Like, Comment, Share &amp;amp; Subscribe to my Channel, It always motivates me.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://youtu.be/toLAU_QPF6o"&gt;Kubernetes Complete Course In 10 Hours&lt;/a&gt;&lt;/p&gt;

</description>
      <category>kubernetes</category>
      <category>aws</category>
      <category>devops</category>
      <category>cka</category>
    </item>
    <item>
      <title>Kubernetes CKA Exam Question Bank .. 17 Questions with Complete Lab .. In 3 hours .. FREE!</title>
      <dc:creator>Alok Kumar</dc:creator>
      <pubDate>Sun, 24 Sep 2023 16:27:48 +0000</pubDate>
      <link>https://dev.to/aws-builders/kubernetes-cka-exam-question-bank-17-questions-with-complete-lab-in-3-hours-free-4hie</link>
      <guid>https://dev.to/aws-builders/kubernetes-cka-exam-question-bank-17-questions-with-complete-lab-in-3-hours-free-4hie</guid>
      <description>&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffp9vsoh6lme1ffk7dxzs.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffp9vsoh6lme1ffk7dxzs.jpg" alt="Kubernetes CKA"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Hello friends, &lt;/p&gt;

&lt;p&gt;let me share my journey with you in a way that's both relatable and exciting. It's a story about my experience preparing for the Certified Kubernetes Administrator (CKA) exam and how it led to something amazing.&lt;/p&gt;

&lt;p&gt;Picture this: I was gearing up for the CKA exam, eager to conquer it. The only problem was, I had no clue about the type of questions that would pop up in the test. To make things challenging, there was hardly any content available on YouTube back in those days that could guide me. Frustrating, right?&lt;/p&gt;

&lt;p&gt;So, I made a bold decision. I decided to bridge this gap myself. I thought, "Why not create a course that provides questions similar to those asked in the CKA exam, along with step-by-step lab sections to tackle them?" That way, aspiring CKA candidates could not only understand the problems but also learn how to solve them effortlessly.&lt;/p&gt;

&lt;p&gt;As time rolled on, something incredible happened. People from all corners of the world started appreciating my efforts. They began clearing the CKA certification, all thanks to my "CKA Challenge Question Labs" course. You see, the CKA exam isn't a walk in the park, and my course wasn't just about questions. It was about boosting confidence.&lt;/p&gt;

&lt;p&gt;I didn't stop there, though. I kept adding more and more questions to the course, and my YouTube subscribers grew by the day. People were tagging me on LinkedIn, celebrating their certification victories. Over a thousand of them! But here's the kicker – my reach was still somewhat limited.&lt;/p&gt;

&lt;p&gt;That's when I had a vision. I needed to take my mission to the next level. So, I decided to transform my course into something bigger and better on Udemy. I put together a comprehensive CKA practice set course, and guess what? Friends from all over the world showed a keen interest and enrolled in the course.&lt;/p&gt;

&lt;p&gt;But wait, there's more. I wanted to make sure that everyone had access to these valuable resources. So, I did something special. I distributed some of my courses for free and even hosted exciting giveaways. Of course, there were paid options too, for those who wanted to dive deeper.&lt;/p&gt;

&lt;p&gt;In this video series, I cover a massive 17 challenging CKA exam questions. Each question is accompanied by a lab section right next to it, and they come in varying difficulty levels, all neatly packaged into one video. It's a breeze to watch and prepare for the exam.&lt;/p&gt;

&lt;p&gt;Now, here's the deal: If you appreciate my dedication and want to tackle more of these challenging questions, I invite you to enroll in my Udemy course. I'm constantly adding fresh content, so you'll stay connected with me and continue to supercharge your CKA exam preparation.&lt;/p&gt;

&lt;p&gt;Join me on this journey, and let's conquer the CKA exam together!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://youtu.be/udA3OWkmMUY" rel="noopener noreferrer"&gt;Kubernetes CKA Exam Question Bank&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.udemy.com/course/certified-kubernetes-administrator-cka-lab-course/?couponCode=CRACK-IT" rel="noopener noreferrer"&gt;Full Course On Udemy&lt;/a&gt;&lt;/p&gt;

</description>
      <category>kubernetes</category>
      <category>cka</category>
      <category>ckad</category>
      <category>aws</category>
    </item>
    <item>
      <title>Terraform Full Course in 9 hours .. Zero to Hero Series #FREE#</title>
      <dc:creator>Alok Kumar</dc:creator>
      <pubDate>Thu, 21 Sep 2023 04:26:04 +0000</pubDate>
      <link>https://dev.to/aws-builders/terraform-full-course-in-9-hours-zero-to-hero-series-5492</link>
      <guid>https://dev.to/aws-builders/terraform-full-course-in-9-hours-zero-to-hero-series-5492</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KCPjGGb2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/427bwkoqg0fwz4p532m0.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KCPjGGb2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/427bwkoqg0fwz4p532m0.jpg" alt="Image description" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Dear Friends,&lt;/p&gt;

&lt;p&gt;Please find my &lt;strong&gt;Terraform Full Course in 9 hours .. Zero to Hero Series&lt;/strong&gt; on YouTube for free .. #terraform #devops @AlokKumar&lt;/p&gt;

&lt;p&gt;Terraform is a tool that helps you create and manage your computer infrastructure, like servers, databases, and networks, in a way that's easy to understand and repeatable. It uses code to define your infrastructure, which means you can treat your infrastructure like software, allowing you to automate its creation and changes.&lt;/p&gt;

&lt;p&gt;Let's say you want to set up a web server on a cloud platform like Amazon Web Services (AWS). Instead of manually clicking through the AWS console to create the server, you can use Terraform to write a simple configuration file that describes the server's specifications, like its size, operating system, and security settings.&lt;/p&gt;
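&lt;p&gt;For example, such a configuration file can be sketched as below (this is only an illustrative sketch; the AMI ID is a hypothetical placeholder, and any valid Amazon Linux AMI in your region would do):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;provider "aws" {
  region = "us-east-1" # example region
}

resource "aws_instance" "web_server" {
  ami           = "ami-0abcdef1234567890" # hypothetical placeholder AMI ID
  instance_type = "t2.micro"

  tags = {
    Name = "web-server"
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Running terraform apply against this file creates the server, and running it again after editing the file updates the server to match, which is exactly the repeatability described above.&lt;/p&gt;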

&lt;p&gt;By using Terraform, you can automate the provisioning and management of your infrastructure, making it easier to scale, update, and maintain your applications and services.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Which Terraform topics are we going to cover in this course?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Creating First EC2 Instance&lt;br&gt;
tfstate file .. backup state file .. destroy target flag&lt;br&gt;
Terraform Resource Attribute and Output Values&lt;br&gt;
Terraform Provider Version handling&lt;br&gt;
Terraform Format, Validate, EIP Association with EC2&lt;br&gt;
Terraform Shared Credential, AWS CLI, Comments&lt;br&gt;
Running Nginx From Docker Container using Terraform&lt;br&gt;
Terraform Input Variable Part – 1&lt;br&gt;
Terraform Variable, Count and Generating and Applying Terraform Plan Part – 2&lt;br&gt;
Input Variable from terraform.tfvars, *.tfvars, *.auto.tfvars and environment variables Part – 3&lt;br&gt;
Implement Variable Type as String, Number, List and Map. Part – 4&lt;br&gt;
Terraform Meta-Argument for_each&lt;br&gt;
Terraform Meta-Argument – lifecycle&lt;br&gt;
Terraform Data Resource&lt;br&gt;
Terraform State Management using S3 and DynamoDB&lt;br&gt;
Terraform Workspace .. helps keep infrastructure consistent&lt;br&gt;
Terraform Taint .. recreate a degraded or damaged resource&lt;br&gt;
Numeric, String and Collection Function&lt;br&gt;
Encoding &amp;amp; FileSystem&lt;br&gt;
Terraform Provisioners .. File, Local-exec,  Remote-exec, Creation &amp;amp; Destroy-Time and Failure Behavior&lt;br&gt;
Terraform Modules&lt;br&gt;
Terraform Locals and Module Source&lt;/p&gt;
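&lt;p&gt;As a taste of the state-management topic above, a remote backend with S3 state storage and DynamoDB locking is configured along these lines (the bucket and table names here are placeholders, not values from the course):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform {
  backend "s3" {
    bucket         = "my-terraform-state-bucket" # placeholder bucket name
    key            = "prod/terraform.tfstate"
    region         = "ap-south-1"
    dynamodb_table = "terraform-state-lock"      # placeholder table used for state locking
    encrypt        = true
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;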

&lt;p&gt;&lt;strong&gt;About this course&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We designed this course to be 80% hands-on labs and only 20% theory. This way, you can get the most out of it, even if you don't know anything about Terraform yet. All you need is a bit of familiarity with AWS, and you'll be good to go!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;About me&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Hi, I'm Alok Kumar, and I've been working in the IT industry for over 15 years. I'm not only an instructor for three courses on Udemy but also share my knowledge through my YouTube channel. What's really exciting is that over 1000 students have already earned certifications through my courses and connected with me on LinkedIn!&lt;/p&gt;

&lt;h1&gt;
  
  
  Terraform #DevOps #learningeveryday #AWS #Cloud #devopscommunity #cloudcomputing
&lt;/h1&gt;

&lt;p&gt;&lt;strong&gt;Don’t Forget To Like, Comment, Share &amp;amp; Subscribe to my Channel, It always motivates me.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://youtu.be/8ajLARoQwHM"&gt;Free YouTube Course - Terraform Full Course in 9 hours&lt;/a&gt;&lt;/p&gt;

</description>
      <category>terraform</category>
      <category>devops</category>
      <category>aws</category>
      <category>cloudcomputing</category>
    </item>
    <item>
      <title>Certified Kubernetes Administrator (CKA) - Practice Question with Complete Details Lab</title>
      <dc:creator>Alok Kumar</dc:creator>
      <pubDate>Thu, 13 Apr 2023 02:46:45 +0000</pubDate>
      <link>https://dev.to/aws-builders/certified-kubernetes-administrator-cka-practice-question-with-complete-details-lab-3l9l</link>
      <guid>https://dev.to/aws-builders/certified-kubernetes-administrator-cka-practice-question-with-complete-details-lab-3l9l</guid>
      <description>&lt;p&gt;If you're looking to become a certified Kubernetes administrator, then our CKA course is an absolute must-watch! Kubernetes has become the de facto standard for container orchestration, and a CKA certification is a valuable asset to have on your resume. &lt;/p&gt;

&lt;p&gt;Our course is designed to give you the knowledge and skills you need to pass the CKA exam with flying colors.&lt;/p&gt;

&lt;p&gt;In this course, you will learn how to deploy and manage Kubernetes clusters, configure networking, and use the Kubernetes APIs to automate tasks. &lt;/p&gt;

&lt;p&gt;You'll gain hands-on experience with important concepts like containerization, pod deployment, and service management. &lt;/p&gt;

&lt;p&gt;As an experienced instructor, I will guide you through each step of the process, giving you the confidence you need to succeed on exam day.&lt;/p&gt;

&lt;p&gt;Whether you're just starting out with Kubernetes or looking to take your skills to the next level, our fully hands-on CKA course is the perfect choice. &lt;/p&gt;

&lt;p&gt;With comprehensive lab sessions, real-world examples, and interactive exercises, this course is designed to help you master the skills you need to become a successful Kubernetes administrator. &lt;/p&gt;

&lt;p&gt;Sign up today and take your first step towards CKA certification!&lt;/p&gt;

&lt;p&gt;This is a 100% lab series, with a proper solution to each question.&lt;/p&gt;

&lt;p&gt;Remember that this course gives you a better than 90% chance of passing the Certified Kubernetes Administrator exam. &lt;/p&gt;

&lt;p&gt;Read each and every question carefully and try to answer it on your own first.&lt;/p&gt;

&lt;p&gt;If you find a question difficult to solve, a proper, detailed solution to that problem is provided in the next video.&lt;/p&gt;

&lt;p&gt;After achieving great success on &lt;a href="https://www.youtube.com/@AlokKumar"&gt;YouTube&lt;/a&gt;, I'm bringing this valuable course to a wider audience.&lt;/p&gt;

&lt;p&gt;Don't miss out on this amazing opportunity - click on the link below to enroll now and take your career to the next level with the Certified Kubernetes Administrator - CKA certification.&lt;/p&gt;

&lt;p&gt;Your satisfaction is my top priority, and your positive rating and feedback are incredibly valuable to me. &lt;/p&gt;

&lt;p&gt;Not only do they give me a boost of confidence, but they also help me improve my skills and better serve you in the future. &lt;/p&gt;

&lt;p&gt;So, please don't hesitate to leave a rating and share your thoughts - I am eager to hear what you have to say!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.udemy.com/course/certified-kubernetes-administrator-cka-lab-course/?couponCode=CKA-NOV-2"&gt;My Udemy Course Link&lt;/a&gt; &lt;/p&gt;

</description>
      <category>cka</category>
      <category>kubernetes</category>
      <category>certification</category>
      <category>devops</category>
    </item>
    <item>
      <title>Creating Thumbnail Image using Terraform, Lambda, IAM, CloudWatch and S3 Bucket</title>
      <dc:creator>Alok Kumar</dc:creator>
      <pubDate>Tue, 13 Dec 2022 07:42:21 +0000</pubDate>
      <link>https://dev.to/aws-builders/creating-thumbnail-image-using-terraform-lambda-iam-cloudwatch-and-s3-bucket-23jp</link>
      <guid>https://dev.to/aws-builders/creating-thumbnail-image-using-terraform-lambda-iam-cloudwatch-and-s3-bucket-23jp</guid>
      <description>&lt;p&gt;In this Blog, we are going to use Terraform as IAC tools for resource provisioning.&lt;/p&gt;

&lt;p&gt;Using Terraform, we are going to create the resources below and establish the connectivity between them:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Lambda&lt;/li&gt;
&lt;li&gt; IAM&lt;/li&gt;
&lt;li&gt; CloudWatch&lt;/li&gt;
&lt;li&gt; S3 bucket (Original and Thumbnail)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;We are going to see how to use terraform to create an AWS Lambda function and configure an AWS S3 trigger.&lt;/p&gt;

&lt;p&gt;The use case is to generate a thumbnail image whenever an image is uploaded to S3 Original bucket.&lt;/p&gt;

&lt;p&gt;We will first write a Lambda function which will open the original image from a source bucket and create a thumbnail and store it in thumbnail bucket.&lt;/p&gt;

&lt;p&gt;Then we will configure an S3 trigger to trigger this Lambda function whenever an image is uploaded to the S3 bucket.&lt;/p&gt;

&lt;p&gt;The prerequisite for this project is to have Terraform installed and the AWS CLI configured.&lt;/p&gt;

&lt;p&gt;I have already configured the CLI on my system using the “aws configure” command.&lt;/p&gt;

&lt;p&gt;I will create a new directory for the project, which we’ll call “thumbnail_generation_lambda-main”, and open it in Visual Studio Code.&lt;/p&gt;

&lt;p&gt;We will write the Lambda function in Python and put it in a directory called src.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7f53wt4e2dg9qq1xfzis.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7f53wt4e2dg9qq1xfzis.JPG" alt="Image description" width="751" height="650"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will start writing the code. First, we need to define a Lambda handler, which will be the entry point for the Lambda function; it takes two parameters, event and context.&lt;/p&gt;

&lt;p&gt;I will import the logging library and configure the logger.&lt;br&gt;
At the start of the handler, we will log the event and the context.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu6ocadgvnpjyuar4g6bq.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu6ocadgvnpjyuar4g6bq.JPG" alt="Image description" width="800" height="534"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, we need to extract the bucket name and the key from the event. Whenever this Lambda function is invoked by an S3 trigger, the trigger passes in the event object.&lt;/p&gt;

&lt;p&gt;The event carries a record with the bucket and the key (the name of the file being uploaded), and we need to extract both.&lt;/p&gt;

&lt;p&gt;From the first item in the Records list we read the source bucket name and the key. Then we will define the destination bucket name and the destination key.&lt;/p&gt;

&lt;p&gt;We will call the destination bucket "alok-thumbnail-image-bucket-0007", and for the destination file name we simply append "_thumbnail" to the original name. For that I use os.path.splitext (after importing the os library) and pass it the key, which returns the name and the extension separately; from these I build the thumbnail key as the name, "_thumbnail", and the extension. This will be the destination key.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe5buykpgsc3czbnsc3ya.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe5buykpgsc3czbnsc3ya.JPG" alt="Image description" width="800" height="211"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, we will get access to S3, for which we import the boto3 library and create an S3 client.&lt;/p&gt;

&lt;p&gt;We open the object in S3 by passing the bucket and key, extract its body, and read it as a byte string.&lt;/p&gt;

&lt;p&gt;We then load this image into memory using BytesIO; for that we add "from io import BytesIO".&lt;/p&gt;

&lt;p&gt;Once the image is loaded in memory, we wrap it in a Pillow Image object.&lt;/p&gt;

&lt;p&gt;We pass the BytesIO object to Image.open. For information purposes, we log the size of the image before we actually compress it.&lt;/p&gt;

&lt;p&gt;To generate the thumbnail, we call the img.thumbnail method with the preferred size; here I will use (500, 500).&lt;/p&gt;

&lt;p&gt;With the antialias resampling option passed, the image has now been compressed, and we log the size again to see the final result. Next, we need to write this image to the destination bucket; for that, we write the image into a buffer and then pass the buffer directly to the destination bucket.&lt;/p&gt;

&lt;p&gt;We create a new BytesIO buffer object and save the image into it in the preferred format.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw1brue2giwsn19fvhza4.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw1brue2giwsn19fvhza4.JPG" alt="Image description" width="800" height="335"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now I will upload the buffer into the destination bucket, and we will use the response data to validate whether the upload was successful.&lt;/p&gt;

&lt;p&gt;If the HTTP status code is not equal to 200 (success), we raise an exception; otherwise we finish the Lambda function by returning successfully.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff7zmaeayimuwghkq1q86.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff7zmaeayimuwghkq1q86.JPG" alt="Image description" width="800" height="142"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here, in this case, I am just returning the original event object. With this, we have finished writing the Lambda function; below is the final code of the lambda.py file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import logging
import boto3
from io import BytesIO
from PIL import Image
import os

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    logger.info(f"event: {event}")
    logger.info(f"context: {context}")

    bucket = event["Records"][0]["s3"]["bucket"]["name"]
    key = event["Records"][0]["s3"]["object"]["key"]

    thumbnail_bucket = "alok-thumbnail-image-bucket-0007"
    thumbnail_name, thumbnail_ext = os.path.splitext(key)
    thumbnail_key = f"{thumbnail_name}_thumbnail{thumbnail_ext}"

    logger.info(f"Bucket name: {bucket}, file name: {key}, Thumbnail Bucket name: {thumbnail_bucket}, file name: {thumbnail_key}")

    s3_client = boto3.client('s3')

    # Below snippet of code will Load and open image from S3
    file_byte_string = s3_client.get_object(Bucket=bucket, Key=key)['Body'].read()
    img = Image.open(BytesIO(file_byte_string))
    logger.info(f"Size before compression: {img.size}")

    # Below snippet of code will generate the thumbnail (max 500x500, aspect ratio preserved)
    # Note: Image.ANTIALIAS was removed in Pillow 10; on newer Pillow versions use Image.LANCZOS
    img.thumbnail((500,500), Image.ANTIALIAS)
    logger.info(f"Size after compression: {img.size}")

    # Below snippet of code will Dump and save image to S3
    buffer = BytesIO()
    img.save(buffer, "JPEG")
    buffer.seek(0)

    sent_data = s3_client.put_object(Bucket=thumbnail_bucket, Key=thumbnail_key, Body=buffer)

    if sent_data['ResponseMetadata']['HTTPStatusCode'] != 200:
        raise Exception('Failed to upload image {} to bucket {}'.format(key, bucket))

    return event

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We will then move on to the Terraform configuration part (main.tf).&lt;/p&gt;

&lt;p&gt;We are going to define all the details in the Terraform configuration for provisioning resources like:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. AWS Provider&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "4.36.1"
    }
    archive = {
      source  = "hashicorp/archive"
      version = "~&amp;gt; 2.2.0"
    }
  }
  required_version = "~&amp;gt; 1.0"
}

provider "aws" {
  region = var.aws_region
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;2. Archive Provider - Used for bundling the source to zip file and upload it to lambda.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Original and Thumbnail S3 bucket.&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_s3_bucket" "thumbnail_original_image_bucket" {
  bucket = "alok-original-image-bucket-0007"
}

resource "aws_s3_bucket" "thumbnail_image_bucket" {
  bucket = "alok-thumbnail-image-bucket-0007"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;4. Setting the policy of Get (Original) and Put object (Thumbnail).&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_iam_policy" "thumbnail_s3_policy" {
  name = "thumbnail_s3_policy"
  policy = jsonencode({
    "Version" : "2012-10-17",
    "Statement" : [{
      "Effect" : "Allow",
      "Action" : "s3:GetObject",
      "Resource" : "arn:aws:s3:::alok-original-image-bucket-0007/*"
      }, {
      "Effect" : "Allow",
      "Action" : "s3:PutObject",
      "Resource" : "arn:aws:s3:::alok-thumbnail-image-bucket-0007/*"
    }]
  })
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;5. Lambda IAM Role to assume the role&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_iam_role" "thumbnail_lambda_role" {
  name = "thumbnail_lambda_role"
  assume_role_policy = jsonencode({
    "Version" : "2012-10-17",
    "Statement" : [{
      "Effect" : "Allow",
      "Principal" : {
        "Service" : "lambda.amazonaws.com"
      },
      "Action" : "sts:AssumeRole"
    }]
  })
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;6. IAM Policy Attachment, role for s3 and role for lambda&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_iam_policy_attachment" "thumbnail_role_s3_policy_attachment" {
  name       = "thumbnail_role_s3_policy_attachment"
  roles      = [aws_iam_role.thumbnail_lambda_role.name]
  policy_arn = aws_iam_policy.thumbnail_s3_policy.arn
}
resource "aws_iam_policy_attachment" "thumbnail_role_lambda_policy_attachment" {
  name       = "thumbnail_role_lambda_policy_attachment"
  roles      = [aws_iam_role.thumbnail_lambda_role.name]
  policy_arn = "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;7. Lambda function configuration, deploying the zipped code “my-lambda-code.zip”&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_lambda_function" "thumbnail_lambda" {
  function_name = "thumbnail_generation_lambda"
  filename      = "${path.module}/my-lambda-code.zip"

  runtime     = "python3.9"
  handler     = "lambda.lambda_handler"
  memory_size = 256

  source_code_hash = data.archive_file.thumbnail_lambda_source_archive.output_base64sha256

  role = aws_iam_role.thumbnail_lambda_role.arn

  layers = [
    "arn:aws:lambda:ap-south-1:770693421928:layer:Klayers-p39-pillow:1"
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
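&lt;p&gt;Note that source_code_hash above references an archive_file data source that is not shown in this post. A minimal sketch of it, assuming the Python source lives in the src directory, would be:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Bundle the Lambda source directory into the zip file the function deploys
data "archive_file" "thumbnail_lambda_source_archive" {
  type        = "zip"
  source_dir  = "${path.module}/src"
  output_path = "${path.module}/my-lambda-code.zip"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;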



&lt;p&gt;&lt;strong&gt;8. Setting lambda permission to Thumbnail bucket to put the thumbnail.&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_lambda_permission" "thumbnail_allow_bucket" {
  statement_id  = "AllowExecutionFromS3Bucket"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.thumbnail_lambda.arn
  principal     = "s3.amazonaws.com"
  source_arn    = aws_s3_bucket.thumbnail_original_image_bucket.arn
}

resource "aws_s3_bucket_notification" "thumbnail_notification" {
  bucket = aws_s3_bucket.thumbnail_original_image_bucket.id

  lambda_function {
    lambda_function_arn = aws_lambda_function.thumbnail_lambda.arn
    events              = ["s3:ObjectCreated:*"]
  }

  depends_on = [
    aws_lambda_permission.thumbnail_allow_bucket
  ]
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;9. Creation of CloudWatch log group for logging purpose and monitoring traces.&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_cloudwatch_log_group" "thumbnail_cloudwatch" {
  name = "/aws/lambda/${aws_lambda_function.thumbnail_lambda.function_name}"

  retention_in_days = 30
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Side by side, we created 2 more files:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1.variables.tf&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;variable "aws_region" {
  description = "AWS region for all resources."

  type    = string
  default = "ap-south-1"
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;2.    output.tf&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;output "iam_arn" {
  description = "IAM Policy ARN"
  value       = aws_iam_policy.thumbnail_s3_policy.arn
}

output "function_name" {
  description = "Lambda function name"
  value       = aws_lambda_function.thumbnail_lambda.function_name
}

output "cloud_watch_arn" {
  description = "Cloudwatch ARN"
  value       = aws_cloudwatch_log_group.thumbnail_cloudwatch.arn
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, it’s time to do the deployment by executing the below commands:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. terraform init =&amp;gt;&lt;/strong&gt; This initializes your working directory and downloads the providers required by your code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frtzwzgaqmsyoqqob4btx.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frtzwzgaqmsyoqqob4btx.JPG" alt="Image description" width="800" height="439"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. terraform fmt =&amp;gt;&lt;/strong&gt; This is used to format the written Terraform code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F78whqh67fddmdtqmkeyr.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F78whqh67fddmdtqmkeyr.JPG" alt="Image description" width="800" height="102"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. terraform validate =&amp;gt;&lt;/strong&gt; This is used to validate the written Terraform code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqrc8f9urbgytopyo14y8.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqrc8f9urbgytopyo14y8.JPG" alt="Image description" width="800" height="114"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. terraform plan =&amp;gt;&lt;/strong&gt; This is where you review the proposed changes and decide whether to accept them.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft73tpr1phgvwxqqdni86.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft73tpr1phgvwxqqdni86.JPG" alt="Image description" width="800" height="383"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. terraform apply =&amp;gt;&lt;/strong&gt; This is where you accept the changes and apply them against real infrastructure.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5hqv9i36ofbekiujly0d.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5hqv9i36ofbekiujly0d.JPG" alt="Image description" width="800" height="373"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Verification of S3 Bucket:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhwqryhh3oh8jxqkt837n.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhwqryhh3oh8jxqkt837n.JPG" alt="Image description" width="800" height="309"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Verification of Lambda Creation:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsg2j667an4djk3lxl9gp.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsg2j667an4djk3lxl9gp.JPG" alt="Image description" width="800" height="184"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Verification of CloudWatch Creation:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnbqo91jfyxivwtgyxphc.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnbqo91jfyxivwtgyxphc.JPG" alt="Image description" width="800" height="219"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we upload one image, “sample.jpg” (1.1 MB), into the original S3 bucket “alok-original-image-bucket-0007”.&lt;/p&gt;
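&lt;p&gt;The same upload can also be done from the AWS CLI (assuming sample.jpg is in the current directory):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Copy the local file into the original bucket; this upload fires the S3 trigger
aws s3 cp sample.jpg s3://alok-original-image-bucket-0007/sample.jpg
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;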

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5c117961km7x4t7h8upe.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5c117961km7x4t7h8upe.JPG" alt="Image description" width="720" height="83"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Verifying S3 Bucket for image uploaded:&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Original:&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5gjojpaer37913c13oa8.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5gjojpaer37913c13oa8.JPG" alt="Image description" width="800" height="267"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Thumbnail&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F52jlu4m9zyeuaojg81p5.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F52jlu4m9zyeuaojg81p5.JPG" alt="Image description" width="800" height="267"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;See how the image size decreased from 1.1 MB to 20.7 KB.&lt;br&gt;
You can destroy the created resources using the “terraform destroy” command, but before that you need to empty both buckets; otherwise Terraform will raise an error.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5sxhcc9nz6oesxqqllws.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5sxhcc9nz6oesxqqllws.JPG" alt="Image description" width="800" height="308"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;One of the best use cases of this pattern is shortlisting candidate resumes that contain selected keywords, then picking those resumes and moving them to another S3 bucket.&lt;/p&gt;

&lt;p&gt;A candidate uploads a resume, which lands in the original S3 bucket; the upload triggers the Lambda, which checks for the selected keywords and moves the shortlisted resumes to a shortlisted S3 bucket 😊.&lt;/p&gt;
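&lt;p&gt;A minimal sketch of the keyword check at the heart of that idea (the function name and keyword handling here are hypothetical, not from this post):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def is_shortlisted(resume_text, keywords):
    """Return True if the resume mentions every required keyword (case-insensitive)."""
    text = resume_text.lower()
    return all(keyword.lower() in text for keyword in keywords)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Inside the Lambda, a True result would lead to copying the object into the shortlisted bucket, mirroring the get_object/put_object pattern used for the thumbnail above.&lt;/p&gt;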

</description>
      <category>html</category>
      <category>discuss</category>
    </item>
    <item>
      <title>Amazon EKS Clusters Setup – Step by Step Instructions</title>
      <dc:creator>Alok Kumar</dc:creator>
      <pubDate>Fri, 02 Dec 2022 06:44:12 +0000</pubDate>
      <link>https://dev.to/aws-builders/amazon-eks-clusters-setup-step-by-step-instructions-2gp</link>
      <guid>https://dev.to/aws-builders/amazon-eks-clusters-setup-step-by-step-instructions-2gp</guid>
      <description>&lt;p&gt;A production-quality Kubernetes cluster requires planning and preparation. If your Kubernetes cluster is to run critical workloads, it must be configured to be resilient.&lt;/p&gt;

&lt;p&gt;There are several deployment tools to do the Kubernetes cluster steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Bootstrapping clusters with kubeadm&lt;/li&gt;
&lt;li&gt;Installing Kubernetes with kOps&lt;/li&gt;
&lt;li&gt;Installing Kubernetes with Kubespray&lt;/li&gt;
&lt;li&gt;Turnkey cloud solutions&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;We are going to use a turnkey cloud solution (Amazon EKS) and manage a production-ready cluster using the eksctl utility.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7x5uxk1tyn0ylpkl3lni.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7x5uxk1tyn0ylpkl3lni.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Several cloud providers support creating a Kubernetes cluster; this blog concentrates on creating a cluster on Amazon EKS.&lt;/p&gt;

&lt;p&gt;First, we are going to launch a Linux instance (t2.micro). This instance is used to manage the k8s cluster and send all the necessary instructions to the Kubernetes cluster using the eksctl and kubectl utilities.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw3wujrvvldc981lkq5cb.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw3wujrvvldc981lkq5cb.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Allow SSH, HTTP, and HTTPS traffic to that instance, leave the rest as default, and launch the instance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgy4cdjthk17hpqy6a6yx.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgy4cdjthk17hpqy6a6yx.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;SSH to the machine using its public IPv4 address.&lt;/p&gt;

&lt;p&gt;Check the AWS CLI version using the “aws --version” command. It is better to refer to the link below and use the current AWS CLI version:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/eks/latest/userguide/getting-started-console.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/eks/latest/userguide/getting-started-console.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can easily upgrade it to the latest version. For Linux, execute the commands below; for other operating systems, refer to the link that follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The above commands download the binaries and install the AWS CLI on your instance, replacing any older version you may have.&lt;/p&gt;

&lt;p&gt;If the new version does not appear, exit and log back in to the machine, as demonstrated below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feyxy8e4m0p4re28y335y.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feyxy8e4m0p4re28y335y.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, we are going to install the items below as prerequisites:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;kubectl&lt;/li&gt;
&lt;li&gt;eksctl&lt;/li&gt;
&lt;li&gt;Required IAM permissions for the instance&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;You can refer to the link below for more details:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/eks/latest/userguide/getting-started-eksctl.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/eks/latest/userguide/getting-started-eksctl.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;kubectl is a command-line tool for working with Kubernetes clusters.&lt;/p&gt;

&lt;p&gt;For a Linux machine, execute the command below; for other operating systems, refer to the link that follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -o kubectl https://s3.us-west-2.amazonaws.com/amazon-eks/1.24.7/2022-10-31/bin/linux/amd64/kubectl
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/eks/latest/userguide/install-kubectl.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/eks/latest/userguide/install-kubectl.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, apply execute permissions to the binary and copy it to a folder in your PATH. You can check the PATH by executing “echo $PATH”, as shown below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;chmod +x ./kubectl
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fctr5ozz06ifiayqkpdxw.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fctr5ozz06ifiayqkpdxw.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Check the kubectl version:&lt;/p&gt;
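&lt;p&gt;Since the cluster does not exist yet, you can check just the client-side version:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;kubectl version --client
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;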

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9dfr2tv495cfk4r6hf2h.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9dfr2tv495cfk4r6hf2h.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, install eksctl by executing the command below on a Linux machine; for other operating systems, refer to the link that follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl --silent --location "https://github.com/weaveworks/eksctl/releases/latest/download/eksctl_$(uname -s)_amd64.tar.gz" | tar xz -C /tmp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/eks/latest/userguide/eksctl.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/eks/latest/userguide/eksctl.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The binary gets downloaded into the /tmp folder; go to /tmp and copy the binary to a folder in your PATH, as shown below.&lt;/p&gt;
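&lt;p&gt;Assuming /usr/local/bin is on your PATH (it usually is on Amazon Linux), this step could look like:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# move the extracted binary onto the PATH, then verify it
sudo mv /tmp/eksctl /usr/local/bin
eksctl version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;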

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft94t2cy8dduyw5460yuy.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft94t2cy8dduyw5460yuy.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, it's time to create an IAM role with the required permissions and attach it to the Linux EC2 instance.&lt;/p&gt;

&lt;p&gt;Eks_role gets created with the permissions below attached. I am giving full access, but you can refine the permissions at your end accordingly.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi77sxfcyeq03pdl96jcz.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi77sxfcyeq03pdl96jcz.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Behind the scenes, the eksctl utility uses CloudFormation to do all the cluster creation, so the EC2 machine must have all the required permissions to perform these actions.&lt;/p&gt;

&lt;p&gt;Now, assign the role to EC2 Linux instance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frgnavnhlenxrsa9dmfz1.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frgnavnhlenxrsa9dmfz1.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffvod0jnoggj039gtoqi0.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffvod0jnoggj039gtoqi0.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, using the eksctl utility, we are going to create an EKS cluster.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7bg1hrwg0lr7cb1lbd1p.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7bg1hrwg0lr7cb1lbd1p.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This will create a cluster named “alok-devops” in the Mumbai region with 2 worker nodes of type t2.small.&lt;/p&gt;
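&lt;p&gt;Reconstructed from that description (Mumbai is region ap-south-1), the command would look something like this; your exact flags may differ:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;eksctl create cluster --name alok-devops --region ap-south-1 --node-type t2.small --nodes 2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;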

&lt;p&gt;Please be aware: this does not come under the free tier.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk5j7japqtgpkdbhnb578.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk5j7japqtgpkdbhnb578.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;CloudFormation stacks get created for the Kubernetes cluster by the eksctl utility. During stack creation it creates several roles, which is the reason we gave the EC2 machine IAM full access.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhzz9q4z4dvfxinsbn0cr.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhzz9q4z4dvfxinsbn0cr.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feknmhf34zaeeyll0ap2a.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feknmhf34zaeeyll0ap2a.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fizpez00f98bnqegkhelf.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fizpez00f98bnqegkhelf.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We have successfully created the EKS cluster with 2 worker nodes. It takes approximately 18 minutes.&lt;/p&gt;

&lt;p&gt;Now, it's time to validate the cluster by running a few Kubernetes commands with the kubectl utility.&lt;/p&gt;
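&lt;p&gt;For example, the following quick checks confirm that the control plane is reachable and both worker nodes have joined:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# list the worker nodes registered with the cluster
kubectl get nodes

# list the system pods running across all namespaces
kubectl get pods --all-namespaces
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;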

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyqayskiwegi7tquqfoqo.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyqayskiwegi7tquqfoqo.JPG" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can view the complete setup in my YouTube video:&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/1O-I7NtwCeE"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>cloud</category>
      <category>kubernetes</category>
    </item>
  </channel>
</rss>
