<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Joshua Muriki</title>
    <description>The latest articles on DEV Community by Joshua Muriki (@joshwizard).</description>
    <link>https://dev.to/joshwizard</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1248072%2F3dcaaf7a-ca18-4acd-a8bb-fa3a867e0450.jpg</url>
      <title>DEV Community: Joshua Muriki</title>
      <link>https://dev.to/joshwizard</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/joshwizard"/>
    <language>en</language>
    <item>
      <title>Build docker images with this Jenkins pipeline. Check it out.</title>
      <dc:creator>Joshua Muriki</dc:creator>
      <pubDate>Thu, 12 Dec 2024 18:51:15 +0000</pubDate>
      <link>https://dev.to/joshwizard/build-docker-images-with-this-jenkins-pipeline-check-it-out-4fp1</link>
      <guid>https://dev.to/joshwizard/build-docker-images-with-this-jenkins-pipeline-check-it-out-4fp1</guid>
      <description>&lt;div class="ltag__link"&gt;
  &lt;a href="/joshwizard" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__pic"&gt;
      &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1248072%2F3dcaaf7a-ca18-4acd-a8bb-fa3a867e0450.jpg" alt="joshwizard"&gt;
    &lt;/div&gt;
  &lt;/a&gt;
  &lt;a href="/joshwizard/how-to-automate-docker-build-and-push-with-jenkins-pipeline-33p6" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__content"&gt;
      &lt;h2&gt;How to Automate Docker Build and Push with Jenkins Pipeline.&lt;/h2&gt;
      &lt;h3&gt;Joshua Muriki ・ Dec 6&lt;/h3&gt;
      &lt;div class="ltag__link__taglist"&gt;
        &lt;span class="ltag__link__tag"&gt;#jenkins&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#docker&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#automation&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#devops&lt;/span&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/a&gt;
&lt;/div&gt;


</description>
    </item>
    <item>
      <title>How to Automate Docker Build and Push with Jenkins Pipeline.</title>
      <dc:creator>Joshua Muriki</dc:creator>
      <pubDate>Fri, 06 Dec 2024 11:02:38 +0000</pubDate>
      <link>https://dev.to/joshwizard/how-to-automate-docker-build-and-push-with-jenkins-pipeline-33p6</link>
      <guid>https://dev.to/joshwizard/how-to-automate-docker-build-and-push-with-jenkins-pipeline-33p6</guid>
      <description>&lt;p&gt;&lt;strong&gt;Jenkins&lt;/strong&gt; is an automation server that helps automate software development by building, testing, and deploying in continuous integration, and continuous delivery/deployment (CI/CD).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Docker&lt;/strong&gt; is a software platform developers use to build, test, and deploy applications as packages called containers, which bundle an application together with all the dependencies it needs to run.&lt;/p&gt;

&lt;h3&gt;
  
  
  Importance of automation in the CI/CD pipeline.
&lt;/h3&gt;

&lt;p&gt;Automation uses machines to accomplish tasks with minimal human intervention. Before automation, achieving reliable systems through DevOps operations was hectic: writing the same code repeatedly to keep operations running was time-consuming and costly. Automation was therefore incorporated into these operations to achieve a more flexible and reliable system with minimal human intervention, saving both labor and time. &lt;/p&gt;

&lt;p&gt;The following are some of the benefits of a well-developed CI/CD pipeline:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Accelerated software delivery and shorter development cycles.&lt;/li&gt;
&lt;li&gt;Fewer errors, since major and complex tasks are automated.&lt;/li&gt;
&lt;li&gt;Faster reviews and quicker resolution of failing tests, even across a large code base.&lt;/li&gt;
&lt;li&gt;Lower labor costs for building and testing code.&lt;/li&gt;
&lt;li&gt;Stable, high-quality software releases.&lt;/li&gt;
&lt;li&gt;Better use of time, since the process runs automatically once triggered.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This article explains in detail how to automate Docker image creation using Jenkins. Instead of reviewing code and building images with Docker manually, Jenkins automates the process, helping the user reduce errors and free up time for other important tasks.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Dockerfile
&lt;/h3&gt;

&lt;p&gt;You will need your Dockerfile and its dependencies ready, with the project uploaded to GitHub. To get started with creating a Dockerfile, refer to this guide: &lt;a href="https://dev.to/scorcism/dockerfile-12ho"&gt;Dockerfile&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Before moving forward to automate the Docker Image creation with Jenkins, make sure the Dockerfile can be executed manually to create an image. &lt;/p&gt;
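&lt;p&gt;For reference, here is a minimal Dockerfile sketch for a small Flask app, since that is the kind of image built later in this article. The file names, port, and base image are illustrative assumptions; your project will differ.&lt;/p&gt;

```dockerfile
# Illustrative only: a minimal image for a Flask app.
FROM python:3.11-slim
WORKDIR /app
# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the rest of the application code.
COPY . .
EXPOSE 5000
CMD ["python", "app.py"]
```

&lt;p&gt;Running &lt;code&gt;docker build -t yourname/flaskapp:latest .&lt;/code&gt; in the project directory should produce a working image before you hand the job over to Jenkins.&lt;/p&gt;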

&lt;h3&gt;
  
  
  Jenkins installations
&lt;/h3&gt;

&lt;p&gt;As explained above, Jenkins is an automation server used to automate building, testing, and deploying software. &lt;/p&gt;

&lt;p&gt;In this article, we will install our Jenkins on Windows OS. Visit this link &lt;a href="https://www.jenkins.io/doc/book/installing/windows/" rel="noopener noreferrer"&gt;Jenkins&lt;/a&gt; to get started on Jenkins installation.&lt;/p&gt;

&lt;p&gt;After successfully installing Jenkins, you should be able to access Jenkins via &lt;code&gt;localhost:8080&lt;/code&gt; in your browser, as shown in the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk0f5x5cagzp9bobpxe46.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk0f5x5cagzp9bobpxe46.jpg" alt="Image description" width="800" height="456"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Docker installation
&lt;/h3&gt;

&lt;p&gt;Docker, a platform for containerized applications, can comfortably be installed and run on Windows. Refer to this guide to get started with &lt;a href="https://docs.docker.com/desktop/install/windows-install/" rel="noopener noreferrer"&gt;Docker&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;After successfully installing Docker Desktop, your Docker Desktop app will be similar to the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxndips6az2ya71c1pd0f.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxndips6az2ya71c1pd0f.jpg" alt="Image description" width="800" height="429"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Docker Plugin for Jenkins
&lt;/h3&gt;

&lt;p&gt;The Docker plugin for Jenkins adds Docker support to Jenkins, automating Docker image creation and containerization when Jenkins jobs are triggered. Use this guide, &lt;a href="https://plugins.jenkins.io/docker-plugin/" rel="noopener noreferrer"&gt;Docker-Jenkins plugin&lt;/a&gt;, to learn more.&lt;/p&gt;
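&lt;p&gt;As a taste of what these plugins enable, the Docker Pipeline plugin exposes dedicated pipeline steps for building and pushing images. The sketch below uses a hypothetical image name and credential ID; the steps themselves (&lt;code&gt;docker.build&lt;/code&gt;, &lt;code&gt;docker.withRegistry&lt;/code&gt;) are part of the plugin.&lt;/p&gt;

```groovy
// Sketch: building and pushing an image through the Docker Pipeline
// plugin's steps instead of raw CLI calls (image name and credential
// ID are illustrative assumptions).
script {
    def image = docker.build('joshmurih/flaskapp:latest')
    docker.withRegistry('https://index.docker.io/v1/', 'dockerhubpassword') {
        image.push()
    }
}
```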

&lt;p&gt;Follow the steps below to install the Docker plugin from the Jenkins plugin manager.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Access the Jenkins dashboard in a web browser by using the default URL &lt;a href="http://localhost:8080" rel="noopener noreferrer"&gt;http://localhost:8080&lt;/a&gt; or your configured URL and type in your username and password to log in to your administrator account. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F14ev3vp2wcmkaykc0fg8.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F14ev3vp2wcmkaykc0fg8.jpg" alt="Image description" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the Jenkins dashboard, navigate to the left side menu and click &lt;code&gt;Manage Jenkins&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs16ypdz6bnn1jhtbv2u1.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs16ypdz6bnn1jhtbv2u1.jpg" alt="Image description" width="800" height="353"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Under &lt;strong&gt;System Configuration&lt;/strong&gt;, click &lt;code&gt;Plugins&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp4rcplumwo62t3an8m67.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp4rcplumwo62t3an8m67.jpg" alt="Image description" width="800" height="384"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Then, in the &lt;strong&gt;Plugins Section&lt;/strong&gt;, click &lt;code&gt;Available Plugins&lt;/code&gt; &lt;em&gt;(displays all the available plugins)&lt;/em&gt;. Use the search box to find the plugins that match your needs.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvawdb5tmc8p1da2bqvuz.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvawdb5tmc8p1da2bqvuz.jpg" alt="Image description" width="800" height="384"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the &lt;strong&gt;search box&lt;/strong&gt;, search for Docker to list all Docker-related plugins. &lt;/li&gt;
&lt;li&gt;Look for these plugins: &lt;code&gt;Docker Pipeline Plugin&lt;/code&gt;, &lt;code&gt;CloudBees Docker Build and Push plugin&lt;/code&gt;, &lt;code&gt;Docker Build Step&lt;/code&gt;, and &lt;code&gt;Docker Plugin&lt;/code&gt;. Check their respective checkboxes and click &lt;code&gt;Install&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The plugins will start to download immediately, and after successful installation each plugin will show a success message with a green tick on the right.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F08dm8wrq7mqjvyiwuncr.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F08dm8wrq7mqjvyiwuncr.jpg" alt="Image description" width="800" height="339"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Restart Jenkins for the changes to take effect. &lt;/li&gt;
&lt;li&gt;After restarting, navigate back to &lt;code&gt;Manage Plugins&lt;/code&gt; and check that the Docker plugins are listed in the &lt;code&gt;Installed Plugins&lt;/code&gt; tab. Type Docker in the search box to display all installed Docker plugins.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjkz9b9tyr0w5wzdjvmh6.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjkz9b9tyr0w5wzdjvmh6.jpg" alt="Image description" width="800" height="434"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up Docker for Jenkins
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Configuring Docker Host in Jenkins
&lt;/h3&gt;

&lt;p&gt;After successfully installing the Docker plugins and verifying their installation, you will need to configure Jenkins to use Docker as a building tool.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the &lt;strong&gt;Jenkins Dashboard&lt;/strong&gt;, click &lt;strong&gt;Manage Jenkins&lt;/strong&gt;, and under &lt;strong&gt;System Configuration&lt;/strong&gt;, click &lt;strong&gt;Cloud&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frienud6az25ty0n5da88.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frienud6az25ty0n5da88.jpg" alt="Image description" width="800" height="279"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Under &lt;strong&gt;Cloud&lt;/strong&gt;, select &lt;code&gt;Add New Cloud&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgl11rzbx7s48fevwklwj.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgl11rzbx7s48fevwklwj.jpg" alt="Image description" width="800" height="255"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Choose &lt;code&gt;Docker&lt;/code&gt; as your cloud provider and click &lt;code&gt;Create&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa3ovhlu1soaphc5kt7oc.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa3ovhlu1soaphc5kt7oc.jpg" alt="Image description" width="800" height="270"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Jenkins has to communicate with Docker Desktop via a URL.&lt;/li&gt;
&lt;li&gt;Open &lt;strong&gt;Docker Desktop &amp;gt; Settings &amp;gt; General&lt;/strong&gt;. &lt;/li&gt;
&lt;li&gt;Check &lt;code&gt;Expose daemon on tcp://localhost:2375 without TLS&lt;/code&gt;. &lt;em&gt;Ignore the warning&lt;/em&gt;. &lt;/li&gt;
&lt;li&gt;Apply and restart Docker.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4bl5b37mhu0mzh3wgomh.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4bl5b37mhu0mzh3wgomh.jpg" alt="Image description" width="800" height="445"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go back to the Jenkins Docker cloud settings and paste &lt;code&gt;tcp://localhost:2375&lt;/code&gt; as the Docker Host URI.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ihy2vu0vs112g8puy9u.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ihy2vu0vs112g8puy9u.jpg" alt="Image description" width="800" height="336"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click &lt;code&gt;Test connection&lt;/code&gt; to ensure Jenkins can communicate with Docker. &lt;/li&gt;
&lt;li&gt;If the connection is successful, it will display the Docker &lt;code&gt;version&lt;/code&gt; and &lt;code&gt;API version&lt;/code&gt;. Click &lt;code&gt;Save&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F68x7lydeambutps0fgq6.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F68x7lydeambutps0fgq6.jpg" alt="Image description" width="800" height="373"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Adding Docker Credentials to Jenkins
&lt;/h3&gt;

&lt;p&gt;This step is crucial for pushing the Docker image to Docker Hub. Jenkins will use these credentials to log in to your Docker Hub account.&lt;/p&gt;

&lt;p&gt;On the &lt;strong&gt;Jenkins Dashboard&lt;/strong&gt;, click &lt;code&gt;Manage Jenkins&lt;/code&gt;, and under &lt;code&gt;Security&lt;/code&gt;, click &lt;code&gt;Credentials&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiz82topnvjnl0wfgqlnz.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiz82topnvjnl0wfgqlnz.jpg" alt="Image description" width="800" height="368"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Add your &lt;code&gt;Docker Registry credentials&lt;/code&gt;, that is, your &lt;code&gt;Docker Hub username&lt;/code&gt; and &lt;code&gt;password&lt;/code&gt;, give them a suitable ID, then click &lt;code&gt;Create&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpsu59o6chb9i386y06tx.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpsu59o6chb9i386y06tx.jpg" alt="Image description" width="800" height="408"&gt;&lt;/a&gt;&lt;/p&gt;
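&lt;p&gt;The ID you set here is how the pipeline will refer to these credentials later. As a sketch, with a hypothetical ID of &lt;code&gt;dockerhubpassword&lt;/code&gt;, a pipeline step can bind the stored username and password to environment variables:&lt;/p&gt;

```groovy
// Sketch: binding the stored Docker Hub credentials inside a pipeline step.
// 'dockerhubpassword' is the credential ID chosen above (yours may differ).
withCredentials([usernamePassword(credentialsId: 'dockerhubpassword',
                                  usernameVariable: 'DOCKER_USER',
                                  passwordVariable: 'DOCKER_PASS')]) {
    // DOCKER_USER and DOCKER_PASS are available to build steps here,
    // and Jenkins masks their values in the console log.
    bat 'docker info'
}
```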

&lt;h3&gt;
  
  
  Adding other necessary tools in Jenkins
&lt;/h3&gt;

&lt;p&gt;For the Jenkins pipeline to run smoothly, install tools such as Maven, JDK, and Docker on the Jenkins server. &lt;/p&gt;

&lt;p&gt;On the &lt;code&gt;Jenkins Dashboard&lt;/code&gt; &amp;gt; &lt;code&gt;Manage Jenkins&lt;/code&gt; &amp;gt; &lt;code&gt;System Configuration&lt;/code&gt;, click &lt;code&gt;Tools&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;Under JDK installations, add a JDK and give it a name, for example, &lt;code&gt;OpenJDK8&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;Check the &lt;code&gt;Install automatically&lt;/code&gt; checkbox, then choose &lt;code&gt;Install from adoptium.net&lt;/code&gt; as the installer and select the latest version of the JDK.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdewoeqi9698964biryl9.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdewoeqi9698964biryl9.jpg" alt="Image description" width="800" height="337"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Scroll down to &lt;code&gt;Maven installations&lt;/code&gt; and add &lt;code&gt;Maven&lt;/code&gt;. Check the &lt;code&gt;Install automatically&lt;/code&gt; checkbox, choose &lt;code&gt;Install from Apache&lt;/code&gt; as the installer, and select the latest version of &lt;code&gt;Maven&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ir6ih4slso9xm53gjwk.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ir6ih4slso9xm53gjwk.jpg" alt="Image description" width="800" height="377"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, scroll down to &lt;code&gt;Docker installations&lt;/code&gt; and select &lt;code&gt;Add Docker&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Choose a suitable name, check the &lt;code&gt;Install automatically&lt;/code&gt; checkbox, and choose &lt;code&gt;Download from docker.com&lt;/code&gt; as the installer. Choose the latest version.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbq6b2e9rwgc7a7xi0zs6.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbq6b2e9rwgc7a7xi0zs6.jpg" alt="Image description" width="800" height="353"&gt;&lt;/a&gt;&lt;/p&gt;
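&lt;p&gt;Once the tools are configured, a pipeline can request them by the names you gave them. A sketch, assuming the tool names &lt;code&gt;OpenJDK8&lt;/code&gt; and &lt;code&gt;Maven&lt;/code&gt; chosen above:&lt;/p&gt;

```groovy
pipeline {
    agent any
    // Resolve the tools configured under Manage Jenkins > Tools by name,
    // and put them on the PATH for every stage below.
    tools {
        jdk 'OpenJDK8'   // name given under "JDK installations"
        maven 'Maven'    // name given under "Maven installations"
    }
    stages {
        stage('Verify') {
            steps {
                bat 'mvn -version'   // use sh instead on Unix/Linux agents
            }
        }
    }
}
```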

&lt;h3&gt;
  
  
  Configuring Maven POM file.
&lt;/h3&gt;

&lt;p&gt;We will use Maven as the build tool in our Jenkins pipeline.&lt;/p&gt;

&lt;p&gt;On your GitHub repository, create a new file and name it &lt;code&gt;pom.xml&lt;/code&gt; (Maven expects this exact lowercase file name). &lt;/p&gt;

&lt;p&gt;Paste the below code and save it. (This is a sample Maven Project. Learn more &lt;a href="https://maven.apache.org/" rel="noopener noreferrer"&gt;here&lt;/a&gt;)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"&amp;gt;
    &amp;lt;modelVersion&amp;gt;4.0.0&amp;lt;/modelVersion&amp;gt;

    &amp;lt;!-- Basic project information --&amp;gt;
    &amp;lt;groupId&amp;gt;com.example&amp;lt;/groupId&amp;gt;
    &amp;lt;artifactId&amp;gt;my-project&amp;lt;/artifactId&amp;gt;
    &amp;lt;version&amp;gt;1.0-SNAPSHOT&amp;lt;/version&amp;gt;
    &amp;lt;packaging&amp;gt;jar&amp;lt;/packaging&amp;gt; &amp;lt;!-- Can also be 'war' for web projects --&amp;gt;

    &amp;lt;name&amp;gt;My Maven Project&amp;lt;/name&amp;gt;
    &amp;lt;description&amp;gt;A simple Maven project.&amp;lt;/description&amp;gt;
    &amp;lt;url&amp;gt;http://www.example.com&amp;lt;/url&amp;gt;

    &amp;lt;!-- Dependencies --&amp;gt;
    &amp;lt;dependencies&amp;gt;
        &amp;lt;!-- Example: Add dependencies here --&amp;gt;
        &amp;lt;dependency&amp;gt;
            &amp;lt;groupId&amp;gt;org.springframework.boot&amp;lt;/groupId&amp;gt;
            &amp;lt;artifactId&amp;gt;spring-boot-starter&amp;lt;/artifactId&amp;gt;
            &amp;lt;version&amp;gt;3.1.4&amp;lt;/version&amp;gt;
        &amp;lt;/dependency&amp;gt;
    &amp;lt;/dependencies&amp;gt;

    &amp;lt;!-- Build configuration --&amp;gt;
    &amp;lt;build&amp;gt;
        &amp;lt;plugins&amp;gt;
            &amp;lt;!-- Example: Maven Compiler Plugin --&amp;gt;
            &amp;lt;plugin&amp;gt;
                &amp;lt;groupId&amp;gt;org.apache.maven.plugins&amp;lt;/groupId&amp;gt;
                &amp;lt;artifactId&amp;gt;maven-compiler-plugin&amp;lt;/artifactId&amp;gt;
                &amp;lt;version&amp;gt;3.10.1&amp;lt;/version&amp;gt;
                &amp;lt;configuration&amp;gt;
                    &amp;lt;source&amp;gt;11&amp;lt;/source&amp;gt; &amp;lt;!-- Java source version --&amp;gt;
                    &amp;lt;target&amp;gt;11&amp;lt;/target&amp;gt; &amp;lt;!-- Java target version --&amp;gt;
                &amp;lt;/configuration&amp;gt;
            &amp;lt;/plugin&amp;gt;
        &amp;lt;/plugins&amp;gt;
    &amp;lt;/build&amp;gt;
&amp;lt;/project&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The above file is a Maven POM (Project Object Model), the central configuration file for Maven, a build automation tool for Java projects.&lt;/p&gt;

&lt;h3&gt;
  
  
  Code Breakdown
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;XML Declaration and Schema&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;```
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"&lt;br&gt;
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"&amp;gt;&lt;br&gt;
    4.0.0&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;


This code declares the XML namespace for the POM schema and ensures the file conforms to Maven’s POM structure version 4.0.0.

**Project Data.**



    ```
&amp;lt;!-- Basic project information --&amp;gt;
    &amp;lt;groupId&amp;gt;com.example&amp;lt;/groupId&amp;gt;
    &amp;lt;artifactId&amp;gt;my-project&amp;lt;/artifactId&amp;gt;
    &amp;lt;version&amp;gt;1.0-SNAPSHOT&amp;lt;/version&amp;gt;
    &amp;lt;packaging&amp;gt;jar&amp;lt;/packaging&amp;gt; &amp;lt;!-- Can also be 'war' for web projects --&amp;gt;

    &amp;lt;name&amp;gt;My Maven Project&amp;lt;/name&amp;gt;
    &amp;lt;description&amp;gt;A simple Maven project.&amp;lt;/description&amp;gt;
    &amp;lt;url&amp;gt;http://www.example.com&amp;lt;/url&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;&lt;code&gt;groupId&lt;/code&gt;: A unique identifier for the project’s group.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;artifactId&lt;/code&gt;: The unique name of the project within that group.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;version&lt;/code&gt;: The project version. &lt;/p&gt;

&lt;p&gt;&lt;code&gt;packaging&lt;/code&gt;: The type of artifact to produce, for example &lt;code&gt;jar&lt;/code&gt; for libraries.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;name&lt;/code&gt;: The name of the project.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;description&lt;/code&gt;: A short description of the project.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;url&lt;/code&gt;: The project’s website.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Dependencies&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;```
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;dependencies&amp;gt;
    &amp;lt;!-- Example: Add dependencies here --&amp;gt;
    &amp;lt;dependency&amp;gt;
        &amp;lt;groupId&amp;gt;org.springframework.boot&amp;lt;/groupId&amp;gt;
        &amp;lt;artifactId&amp;gt;spring-boot-starter&amp;lt;/artifactId&amp;gt;
        &amp;lt;version&amp;gt;3.1.4&amp;lt;/version&amp;gt;
    &amp;lt;/dependency&amp;gt;
&amp;lt;/dependencies&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;


This part declares the libraries required for the project. It specifies Spring Boot Starter dependency version 3.1.4.

**Build Configuration**



    ```
&amp;lt;!-- Build configuration --&amp;gt;
    &amp;lt;build&amp;gt;
        &amp;lt;plugins&amp;gt;
            &amp;lt;!-- Example: Maven Compiler Plugin --&amp;gt;
            &amp;lt;plugin&amp;gt;
                &amp;lt;groupId&amp;gt;org.apache.maven.plugins&amp;lt;/groupId&amp;gt;
                &amp;lt;artifactId&amp;gt;maven-compiler-plugin&amp;lt;/artifactId&amp;gt;
                &amp;lt;version&amp;gt;3.10.1&amp;lt;/version&amp;gt;
                &amp;lt;configuration&amp;gt;
                    &amp;lt;source&amp;gt;11&amp;lt;/source&amp;gt; &amp;lt;!-- Java source version --&amp;gt;
                    &amp;lt;target&amp;gt;11&amp;lt;/target&amp;gt; &amp;lt;!-- Java target version --&amp;gt;
                &amp;lt;/configuration&amp;gt;
            &amp;lt;/plugin&amp;gt;
        &amp;lt;/plugins&amp;gt;
    &amp;lt;/build&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This code snippet configures the project build process. The Maven Compiler Plugin extends Maven’s functionality and compiles the code with Java 11 as both the source and target version.&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating a Jenkins pipeline
&lt;/h2&gt;

&lt;p&gt;Jenkins Pipeline is a suite of plugins that lets you define automated steps Jenkins executes to build, test, and deploy applications. It is through these pipelines that continuous integration and delivery are implemented to achieve highly flexible and robust applications.&lt;/p&gt;

&lt;h3&gt;
  
  
  Writing a Jenkinsfile
&lt;/h3&gt;

&lt;p&gt;Below is a sample Jenkins pipeline designed to automate checking out the code, building a Docker image, and pushing the created image to Docker Hub. &lt;/p&gt;

&lt;h4&gt;
  
  
  Sample Jenkinsfile
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pipeline {
    agent any 

    stages {
        stage ('Checkout Code') {
            steps {
                checkout scmGit(branches: [[name: '*/main']], extensions: [], userRemoteConfigs: [[url: 'https://github.com/joshwizard/Docker-Image-Build-Automation-with-Jenkins.git']])
            bat 'mvn clean install'
            }
        }

        stage('Build Docker Image') {
            steps {
                script{
                    bat 'docker build -t joshmurih/flaskapp:latest .'
                }
            }
        }

        stage('Push Image to Hub') {
            steps {
                script {
                    withCredentials([usernamePassword(credentialsId: 'dockerhubpassword', usernameVariable: 'DOCKER_USER', passwordVariable: 'DOCKER_PASS')]) {
                        bat """
                        echo $DOCKER_PASS | docker login -u $DOCKER_USER --password-stdin
                        docker push joshmurih/flaskapp:latest
                        """
            }
        }
    }
}

    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Below is an explanation of each stage in the pipeline. &lt;/p&gt;

&lt;h3&gt;
  
  
  Pipeline Overview
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pipeline {
    agent any
    ...
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Pipeline&lt;/strong&gt;: This is what defines the entire Jenkins pipeline. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Agent any&lt;/strong&gt;: Tells Jenkins to run the pipeline on any available agent. You can specify a particular agent or container if needed. &lt;/p&gt;
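&lt;p&gt;If you need the pipeline to run on a specific machine rather than any available agent, a label can be specified instead. A minimal sketch, assuming an agent labeled &lt;code&gt;windows&lt;/code&gt; exists in your Jenkins setup:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pipeline {
    // Run only on agents carrying the 'windows' label (hypothetical label).
    agent { label 'windows' }
    stages {
        // ...
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;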

&lt;h3&gt;
  
  
  Stages and Steps
&lt;/h3&gt;

&lt;p&gt;The stages in the pipeline define the sequential phases the Jenkins pipeline moves through to accomplish specific tasks. &lt;/p&gt;

&lt;h4&gt;
  
  
  Stage 1: Checkout Code
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;stage('Checkout Code') {
    steps {
        checkout scmGit(branches: [[name: '*/main']], extensions: [], userRemoteConfigs: [[url: 'https://github.com/joshwizard/Docker-Image-Build-Automation-with-Jenkins.git']])
        bat 'mvn clean install'
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;Checkout Code&lt;/code&gt;: This stage retrieves the latest code from a Git repository and builds it using Maven. &lt;/p&gt;

&lt;p&gt;&lt;code&gt;steps&lt;/code&gt;: Contains the tasks to be executed in this stage. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;checkout scmGit&lt;/code&gt;&lt;strong&gt;: &lt;/strong&gt;Checks out the code from the specified branch (&lt;code&gt;main&lt;/code&gt;) of the Git repository.

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;branches&lt;/code&gt;: Specifies the branch to fetch the code from.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;userRemoteConfigs&lt;/code&gt;: Contains the GitHub repository URL.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt; &lt;code&gt;bat 'mvn clean install'&lt;/code&gt; : Runs the Maven command to clean and build the project. &lt;code&gt;bat&lt;/code&gt; runs the command on Windows agents; use &lt;code&gt;sh&lt;/code&gt; on Unix/Linux agents.&lt;/li&gt;

&lt;/ul&gt;
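&lt;p&gt;For simple cases, the same checkout can also be written with the shorter &lt;code&gt;git&lt;/code&gt; step; a sketch, assuming the repository is public so no credentials are needed:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;stage('Checkout Code') {
    steps {
        // Equivalent shorthand for checking out the main branch.
        git branch: 'main', url: 'https://github.com/joshwizard/Docker-Image-Build-Automation-with-Jenkins.git'
        bat 'mvn clean install'
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;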

&lt;h4&gt;
  
  
  Stage 2: Build Docker Image
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;stage('Build Docker Image') {
    steps {
        script {
            bat 'docker build -t joshmurih/flaskapp:latest .'
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;Build Docker Image&lt;/code&gt;: This stage will build a Docker image using the project’s &lt;code&gt;Dockerfile&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;&lt;code&gt;script&lt;/code&gt;: Allows scripted (Groovy) steps to run inside the declarative stage.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;bat 'docker build -t joshmurih/flaskapp:latest .'&lt;/code&gt;: Builds the Docker image from the &lt;code&gt;Dockerfile&lt;/code&gt; in the workspace and tags it &lt;code&gt;joshmurih/flaskapp:latest&lt;/code&gt;. &lt;/p&gt;

&lt;h4&gt;
  
  
  Stage 3: Push Docker Image
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;stage('Push Image to Hub') {
    steps {
        script {
            withCredentials([usernamePassword(credentialsId: 'dockerhubpassword', usernameVariable: 'DOCKER_USER', passwordVariable: 'DOCKER_PASS')]) {
                bat """
                echo $DOCKER_PASS | docker login -u $DOCKER_USER --password-stdin
                docker push joshmurih/flaskapp:latest
                """
            }
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;Push Docker Image&lt;/code&gt;: After the image is built, this stage pushes it to Docker Hub. &lt;/p&gt;

&lt;p&gt;&lt;code&gt;withCredentials&lt;/code&gt;: Securely injects credentials from the Jenkins credentials store. &lt;/p&gt;

&lt;p&gt;&lt;code&gt;credentialsId&lt;/code&gt;: References the Docker Hub credentials (username and password) stored in Jenkins.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;usernameVariable&lt;/code&gt;: This step binds the Docker Hub username to &lt;code&gt;DOCKER_USER&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;passwordVariable&lt;/code&gt;: This step binds the Docker Hub password to &lt;code&gt;DOCKER_PASS&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The commands in the &lt;code&gt;bat&lt;/code&gt; block log in to Docker Hub by piping the password to &lt;code&gt;docker login --password-stdin&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;docker push joshmurih/flaskapp:latest&lt;/code&gt;: This command pushes the tagged image to Docker Hub. &lt;/p&gt;
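&lt;p&gt;Note that pushing only the &lt;code&gt;latest&lt;/code&gt; tag overwrites the previous image on every build. As a sketch, you could additionally tag each image with the Jenkins build number so older versions stay available on Docker Hub:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;script {
    // Tag the freshly built image with this build's number and push that tag too.
    bat "docker tag joshmurih/flaskapp:latest joshmurih/flaskapp:${env.BUILD_NUMBER}"
    bat "docker push joshmurih/flaskapp:${env.BUILD_NUMBER}"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;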

&lt;h2&gt;
  
  
  Running the pipeline Job
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Creating the pipeline job in Jenkins
&lt;/h3&gt;

&lt;p&gt;With the pipeline script ready, a job must be created so that the &lt;code&gt;build and push&lt;/code&gt; can run effectively. &lt;/p&gt;

&lt;p&gt;Once the job is created, you can schedule the build and push with a trigger of your choice or run it manually. &lt;/p&gt;
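&lt;p&gt;For example, a declarative &lt;code&gt;triggers&lt;/code&gt; block can run the pipeline on a schedule or poll the repository for new commits (the schedules below are only illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pipeline {
    agent any
    triggers {
        // Build once a day at around 2 AM.
        cron('H 2 * * *')
        // Alternatively, poll the repository for new commits every 15 minutes:
        // pollSCM('H/15 * * * *')
    }
    stages {
        // ...
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;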

&lt;p&gt;To create a Jenkins pipeline:&lt;/p&gt;

&lt;p&gt;Sign in to your &lt;code&gt;Jenkins dashboard&lt;/code&gt; and click &lt;code&gt;New Item&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcfptlunxy5snqjzwr9bj.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcfptlunxy5snqjzwr9bj.jpg" alt="Image description" width="800" height="565"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A new page is displayed and asks for an &lt;code&gt;item name&lt;/code&gt;. Enter a suitable name for your Jenkins pipeline. &lt;/p&gt;

&lt;p&gt;Under &lt;code&gt;Select Item Type&lt;/code&gt;, choose &lt;code&gt;Pipeline&lt;/code&gt; and click &lt;code&gt;OK&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fby2pzig7eg0poemk3trm.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fby2pzig7eg0poemk3trm.jpg" alt="Image description" width="800" height="441"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A new configuration page is displayed where all the settings of the pipeline take place. &lt;/p&gt;

&lt;p&gt;Type a short and brief description of your project in the text area provided.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjrwullrwnmh54eqzt3b0.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjrwullrwnmh54eqzt3b0.jpg" alt="Image description" width="800" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Check &lt;code&gt;Discard old builds&lt;/code&gt; to avoid accumulating old builds that have no use in your dashboard. &lt;/p&gt;

&lt;p&gt;Set a maximum of, say, 2 builds to keep (this depends on your preference).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1grs06zzkxjd0qaod3y2.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1grs06zzkxjd0qaod3y2.jpg" alt="Image description" width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Under Build Triggers&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Check the checkbox for the &lt;code&gt;GitHub hook trigger for GITScm polling&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2kmvmtbs9yyagc7ydhjg.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2kmvmtbs9yyagc7ydhjg.jpg" alt="Image description" width="800" height="263"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;: To better understand the build triggers, click the question mark at the end of each option; it provides more detailed information about what that particular option does.&lt;/p&gt;

&lt;p&gt;In the &lt;code&gt;Pipeline&lt;/code&gt; section, select &lt;code&gt;Pipeline script&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;Copy and paste the sample pipeline code into the script area, click &lt;code&gt;Apply&lt;/code&gt; to capture the changes, then &lt;code&gt;Save&lt;/code&gt; to close the window. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feps44m0jepzdapj3liy8.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feps44m0jepzdapj3liy8.jpg" alt="Image description" width="800" height="336"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A page is displayed and on the left side of the page, click &lt;code&gt;Build Now&lt;/code&gt; to start building your project. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fevf4rdu6t5nz7m4y2t3e.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fevf4rdu6t5nz7m4y2t3e.jpg" alt="Image description" width="800" height="448"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;While the project builds, open your pipeline project, click the build number (starting with #), and click &lt;code&gt;Console Output&lt;/code&gt; to monitor the project's progress. (&lt;em&gt;It also contains the error logs if the project fails to execute properly&lt;/em&gt;).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F94ehvkvk2inucw3dih1d.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F94ehvkvk2inucw3dih1d.jpg" alt="Image description" width="800" height="348"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If the process is successful, open Docker Desktop and confirm that the Docker image exists locally. Then open Docker Hub in your browser and check that your image has been published.&lt;/p&gt;

&lt;h2&gt;
  
  
  Best practices
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Security Considerations
&lt;/h3&gt;

&lt;p&gt;Publishing Docker images to Docker Hub can expose you to risk, especially if Docker Hub credentials are not securely configured. &lt;/p&gt;

&lt;p&gt;Use Jenkins credentials to manage sensitive information like API keys, usernames and passwords, and Docker Hub credentials.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pipeline {
    agent any
    environment {
        DOCKER_CREDS = credentials('docker-registry-credentials-id')
    }
    stages {
        stage('Login to Docker Registry') {
            steps {
                sh 'echo $DOCKER_CREDS_PSW | docker login -u $DOCKER_CREDS_USR --password-stdin'
            }
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Pipeline Simplicity
&lt;/h3&gt;

&lt;p&gt;Use clean, well-structured code, which is easier to debug when an error occurs.&lt;/p&gt;

&lt;p&gt;Maintain consistent indentation and flow so the script can be reused in future projects by changing only a few commands, steps, or stages. &lt;/p&gt;

&lt;h3&gt;
  
  
  Docker Image Management
&lt;/h3&gt;

&lt;p&gt;Ensure a proper workspace cleanup to free up space and avoid disk-space issues. &lt;/p&gt;

&lt;p&gt;Cleaning up unused Docker resources leaves room to build and run newly created images without running into space constraints.&lt;/p&gt;

&lt;p&gt;Use the code below in your pipeline script to clean up space after the image has been built.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;stage('Clean Up') {
    steps {
        sh 'docker system prune -f'
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;This article has covered the steps to build a Docker image and push it to Docker Hub using a well-defined Jenkins pipeline: checking out the code, building the image, and pushing it to Docker Hub. &lt;/p&gt;

&lt;p&gt;The pace of technological innovation is accelerating, so always check for new plugin updates to avoid unnecessary errors. &lt;/p&gt;

&lt;p&gt;Also, save the pipeline script as a &lt;code&gt;Jenkinsfile&lt;/code&gt; in your GitHub repository so you can reuse it later by selecting &lt;code&gt;Pipeline script from SCM&lt;/code&gt; in the Jenkins pipeline configuration instead of writing a new script. &lt;/p&gt;

&lt;p&gt;Explore other automation possibilities with Jenkins and Docker to make work easier and pave the way for other project priorities. &lt;/p&gt;

</description>
      <category>jenkins</category>
      <category>docker</category>
      <category>automation</category>
      <category>devops</category>
    </item>
    <item>
      <title>Understanding the Differences Between Blue-Green Deployment and Canary Deployment</title>
      <dc:creator>Joshua Muriki</dc:creator>
      <pubDate>Wed, 28 Aug 2024 04:31:20 +0000</pubDate>
      <link>https://dev.to/joshwizard/understanding-the-differences-between-blue-green-deployment-and-canary-deployment-3oec</link>
      <guid>https://dev.to/joshwizard/understanding-the-differences-between-blue-green-deployment-and-canary-deployment-3oec</guid>
      <description>&lt;p&gt;Effective deployment of software to end users is one of the most vital aspects of the software development lifecycle. Thorough testing and validation allow DevOps Engineers to reduce errors or conflicts thus increasing collaboration and management during the deployment process.&lt;/p&gt;

&lt;p&gt;The most widely used deployment strategies in the industry are Blue-Green deployment and Canary deployment. Companies like &lt;a href="https://research.facebook.com/blog/2018/8/building-switch-software-at-facebook-scale/" rel="noopener noreferrer"&gt;Facebook&lt;/a&gt; and &lt;a href="https://cloud.google.com/deploy/docs/deploy-app-canary#:~:text=A%20canary%20deployment%20splits%20traffic,to%20a%20proportion%20of%20pods." rel="noopener noreferrer"&gt;Google&lt;/a&gt; leverage canary deployment, while &lt;a href="https://www.quali.com/blog/netflix-like-approach-to-devops-environment-delivery/" rel="noopener noreferrer"&gt;Netflix&lt;/a&gt; and &lt;a href="https://docs.aws.amazon.com/whitepapers/latest/blue-green-deployments/welcome.html" rel="noopener noreferrer"&gt;Amazon&lt;/a&gt; use blue-green deployment to update their services consistently without delay.&lt;/p&gt;

&lt;h3&gt;
  
  
  Deployment strategies
&lt;/h3&gt;

&lt;p&gt;Deployment strategies are ways to implement changes and upgrades to existing applications. Upgrading carefully is important because it determines whether end users can enjoy every new feature without disruption. &lt;/p&gt;

&lt;h3&gt;
  
  
  Importance of Safe Deployment
&lt;/h3&gt;

&lt;p&gt;When a deployment is carried out, several factors must be taken into account, including but not limited to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Security of user data against third parties.&lt;/li&gt;
&lt;li&gt;Minimizing downtime to ensure high availability of services.&lt;/li&gt;
&lt;li&gt;Decrease in errors that may result in loss of data.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Blue-Green Deployment
&lt;/h2&gt;

&lt;p&gt;Blue-Green deployment is a software release strategy that involves the creation of two identical environments (blue and green) where traffic is often directed to the blue environment and the new version is deployed to the green environment. After thorough testing and changes are confirmed, the traffic is switched to the green environment.&lt;/p&gt;

&lt;p&gt;The main advantages of this deployment are minimal downtime, since updates can be rolled out without disrupting the user experience, and simplified testing in a production-like environment. &lt;/p&gt;

&lt;p&gt;The drawbacks of this strategy are that it is expensive, since it requires two complete environments to run, and that managing the two environments can be complex, leading to delayed updates and higher labor costs. &lt;/p&gt;

&lt;h2&gt;
  
  
  Canary Deployment
&lt;/h2&gt;

&lt;p&gt;Canary deployment is a software release practice where an application is deployed in stages to a limited subset of users first. These users monitor and test the application’s performance and provide user feedback accordingly. Once updates and changes are approved, the updates are released to all the users.&lt;/p&gt;

&lt;p&gt;The advantages of this deployment are that there is minimal risk where the impact of damage is limited to a small group, it is flexible and adaptive, and also engineers can monitor usage and catch concerns early.&lt;/p&gt;

&lt;p&gt;Its disadvantages are that it requires robust monitoring tools, has potential user segmentation issues, and has a slower release compared to blue-green deployment. &lt;/p&gt;

&lt;h2&gt;
  
  
  Comparisons
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Resource Requirements
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Load balancer
&lt;/h4&gt;

&lt;p&gt;In canary deployment, the load balancer routes limited traffic to the new version until testing is complete while most traffic is directed to the stable version (old).&lt;/p&gt;

&lt;p&gt;In blue-green deployment, the load balancer switches between two environments which are the current version (blue) and the new version (green). Upon successful testing, the load balancer shifts all the traffic to the green environment (new version).&lt;/p&gt;

&lt;p&gt;The main tools used as load balancers include AWS Elastic Load Balancing (ELB), Google Cloud Load Balancing, and NGINX.&lt;/p&gt;
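&lt;p&gt;As a minimal sketch of how a blue-green cutover might look with NGINX, the active environment is selected in an &lt;code&gt;upstream&lt;/code&gt; block and switched by editing the configuration and reloading (the server names and ports here are assumptions):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Point the upstream at the active (blue) environment.
upstream app {
    server blue.example.internal:8080;
    # To cut over, comment the line above, uncomment the line
    # below, and reload NGINX (nginx -s reload).
    # server green.example.internal:8080;
}

server {
    listen 80;
    location / {
        proxy_pass http://app;
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;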

&lt;h4&gt;
  
  
  Application environments
&lt;/h4&gt;

&lt;p&gt;Canary deployment uses a single environment to host the new version and the stable version simultaneously, while in blue-green deployment two identical environments are created separately, one hosting the new version and the other the live production version.&lt;/p&gt;

&lt;p&gt;The common tools used in creating application environments for these deployments are Kubernetes, Docker Swarm, Virtual Machines, and Cloud environments such as Google Cloud, AWS, and Azure.&lt;/p&gt;

&lt;h4&gt;
  
  
  Traffic Management
&lt;/h4&gt;

&lt;p&gt;Traffic in canary deployment is shifted from the stable version to the canary version based on set-out rules and tools to control the traffic. &lt;/p&gt;

&lt;p&gt;In blue-green deployment, traffic is more straightforward whereby traffic is either fully in the Blue environment or fully in the Green environment. It requires precise timing to manage the traffic. &lt;br&gt;
Common tools used in traffic management are Istio, Envoy, AWS App Mesh, and Consul.&lt;/p&gt;

&lt;h4&gt;
  
  
  Monitoring and Observability
&lt;/h4&gt;

&lt;p&gt;Monitoring is critical in software deployment because the main goal is to collect metrics, logs, and user feedback to detect any issues early and prevent disasters. &lt;/p&gt;

&lt;p&gt;In canary deployment, monitoring will ensure traffic conditions for both versions are collected and analyzed simultaneously to detect any malfunctions while in blue-green deployment, monitoring focuses on ensuring the green environment is functioning before and after the switch from the blue environment. &lt;/p&gt;

&lt;p&gt;The main tools used to achieve this aspect are Prometheus, Grafana, Datadog, ELK (Elasticsearch, Logstash, Kibana), and New Relic.&lt;/p&gt;

&lt;h4&gt;
  
  
  CI/CD Pipelines
&lt;/h4&gt;

&lt;p&gt;CI/CD pipelines streamline every stage of software delivery by automating building, testing, and deployment. Continuous Integration ensures the code changes are integrated and tested and Continuous Deployment ensures the software changes are deployed automatically into production. &lt;/p&gt;

&lt;p&gt;In canary deployment, the CI/CD pipeline needs to support regular deployment processes by automating traffic shifting, rolling back, or scaling up the canary.&lt;/p&gt;

&lt;p&gt;In this scenario, a Docker image is deployed on the staging environment in Kubernetes where tests are run frequently to validate the image. The Canary version receives 10% of the traffic for testing. Jenkins is used to control traffic distribution between the two versions. &lt;br&gt;
After all the tests are done and the canary version is stable, all the traffic is directed to the canary version entirely. &lt;/p&gt;
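&lt;p&gt;A weighted split like the 10% canary described above can be sketched with an Istio &lt;code&gt;VirtualService&lt;/code&gt; (the host and subset names are assumptions):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apiVersion: networking.istio.io/v1beta1
kind: VirtualService
metadata:
  name: flaskapp
spec:
  hosts:
    - flaskapp.example.internal
  http:
    - route:
        # 90% of traffic stays on the stable version.
        - destination:
            host: flaskapp
            subset: stable
          weight: 90
        # 10% goes to the canary under test.
        - destination:
            host: flaskapp
            subset: canary
          weight: 10
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;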

&lt;p&gt;In blue-green deployment, the CI/CD pipeline supports deploying to a separate environment and also manages the clean-up process where the old environment can be used for updates for the next version release.&lt;/p&gt;

&lt;p&gt;Here, a Docker image is deployed to a staging environment similar to the canary pipeline, but a Green environment is set up as a copy of the Blue environment. The new Docker image is deployed to the Green environment in Kubernetes, where tests are run continuously to ensure the image functions correctly. If the application passes the tests, a load balancer switches all traffic from the old version (Blue) to the new version (Green).&lt;/p&gt;

&lt;p&gt;The main tools used for CI/CD processes are Jenkins, GitLab CI/CD, CircleCI, Spinnaker, and ArgoCD.&lt;/p&gt;

&lt;h2&gt;
  
  
  Risk management
&lt;/h2&gt;

&lt;p&gt;Risk management in software deployment involves identifying and mitigating the risks associated with the release of new versions with new features into a production environment. &lt;/p&gt;

&lt;p&gt;Potential risks can be identified from code bugs, compatibility issues, performance impact, user experience, and security vulnerabilities where the main sources of risk are new features and functionality, infrastructure changes such as servers, databases or networks, and third-party dependencies i.e. third-party libraries or services. &lt;/p&gt;

&lt;p&gt;In canary deployment, the risk is mitigated by deploying the new version to a small subset of users for feedback. If issues arise, then a small number of the user base is affected and rollback is easier. Monitoring tools are set up to detect issues early and notify the team of any vulnerabilities allowing for quick intervention. Feature flags are also used to enable and disable specific features that may be causing malfunctioning. &lt;/p&gt;

&lt;p&gt;In blue-green deployment, a dual environment is created where the new version of the software is deployed to the Green environment while the live version runs in the Blue environment. Functional, performance, and security testing are performed thoroughly before live traffic is switched to the Green environment. If issues are detected, traffic is quickly rolled back to the original (Blue) environment.&lt;/p&gt;

&lt;h2&gt;
  
  
  Implementation complexity
&lt;/h2&gt;

&lt;p&gt;When deploying software, complexities involved in the process are a key factor that influences the choice put into place. &lt;/p&gt;

&lt;p&gt;Canary deployment offers significant benefits in traffic distribution where traffic is directed to a small subset of users, provides continuous feedback in real-time, and issues can be detected earlier to minimize damage impact.&lt;/p&gt;

&lt;p&gt;Canary deployment complexity is high. It requires sophisticated tools to manage and route traffic between the canary and stable versions. The monitoring and alerting tools used to produce real-time analytics on when to promote the new version can themselves be complex. Automation and CI/CD integration involve writing non-trivial scripts with specialized tools, adding another layer of complexity. Finally, rollback mechanisms need to scale traffic back gradually if issues are detected, adding further complexity.&lt;/p&gt;

&lt;p&gt;Blue-Green deployment offers outstanding benefits in software deployment as it has simplified traffic switching from one environment (Blue) to another environment (Green) once testing is complete, isolation of environments ensures risks of conflicts between the new version and old version are reduced and rollback is simplified reducing the complexity of the recovery process.&lt;/p&gt;

&lt;p&gt;Blue-Green deployment complexity is moderate whereby coordinating and testing across two environments require more careful planning and resources, synchronization of databases and configurations between the two environments can be complex, and higher costs can be incurred when managing the infrastructure of two separate environments.&lt;/p&gt;

&lt;h2&gt;
  
  
  Use cases
&lt;/h2&gt;

&lt;p&gt;The choice between Canary and Blue-Green deployment strategies depends on the requirements of the software deployed, the type of infrastructure to be used, and the ability to mitigate risk effectively in case a threat arises.&lt;/p&gt;

&lt;h3&gt;
  
  
  Canary deployment
&lt;/h3&gt;

&lt;p&gt;This type of deployment strategy is suitable in the following scenarios:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;High Traffic&lt;/strong&gt;: In applications with high traffic, a single issue can cause system failure with massive impact, so a new feature is introduced to a small segment of users and feedback is gathered quickly. An example is a social media platform. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Gradual Rollout&lt;/strong&gt;: Applications that need a gradual rollout deploy a new feature to a small subset of users to monitor its performance, slowly increasing the user base if no threats or issues are detected. An example is an e-commerce platform.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Blue-Green deployment
&lt;/h3&gt;

&lt;p&gt;This type of deployment strategy is suitable in the following scenarios:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;High Availability&lt;/strong&gt;: This deployment allows switching between environments without downtime, ensuring near-constant uptime and high availability. An example of high-availability software is a financial services application.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Infrastructure Migration&lt;/strong&gt;: Suitable for organizations that need to migrate their services to new infrastructure, allowing for thorough testing before the switch is done.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Regulations and Compliance&lt;/strong&gt;: Suitable when a clear separation is needed between new and old versions that can be audited and documented to meet regulatory requirements. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Long Deployment Applications&lt;/strong&gt;: Some applications have a long deployment process that cannot withstand interruptions. Blue-Green deployment lets the long process complete because the new version is deployed in a separate environment. &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;The main difference between Canary deployment and Blue-Green deployment is that in Canary deployment the application is released in stages to a small subset of users before the full release, while Blue-Green deployment deploys the software in two separate environments, one for the live version (Blue) and the other for the new version (Green).&lt;/p&gt;

&lt;p&gt;The type of deployment to choose depends on the infrastructure used, the nature of the application (for example, the size of its user base), the traffic it is expected to handle, the risk-mitigation strategies available, and the complexity of the software deployed. &lt;/p&gt;

&lt;p&gt;Before deciding which deployment strategy to put in place for your software application, it is vital to research the available strategies thoroughly and weigh their advantages and disadvantages.&lt;/p&gt;

</description>
      <category>devops</category>
      <category>cloud</category>
      <category>softwaredevelopment</category>
      <category>software</category>
    </item>
    <item>
      <title>Integration of Artificial Intelligence and Machine Learning in Cloud Security</title>
      <dc:creator>Joshua Muriki</dc:creator>
      <pubDate>Tue, 16 Jul 2024 20:57:07 +0000</pubDate>
      <link>https://dev.to/joshwizard/integration-of-artificial-intelligence-and-machine-learning-in-cloud-security-62b</link>
      <guid>https://dev.to/joshwizard/integration-of-artificial-intelligence-and-machine-learning-in-cloud-security-62b</guid>
      <description>&lt;p&gt;Lately, companies, startups, governments, and other organizations have opted to use &lt;a href="https://en.wikipedia.org/wiki/Cloud_computing" rel="noopener noreferrer"&gt;cloud computing&lt;/a&gt; only for its reliability to store their data without the risk of losing the data either by computer viruses, theft, human errors, software corruption, natural disasters, or hardware impairment among others. &lt;/p&gt;

&lt;p&gt;Artificial Intelligence and Machine Learning have become crucial in enhancing cloud security. Automated threat detection identifies potential threats in real time and activates machine learning models to counter them. Automated response blocks malicious activity and triggers workflows that isolate affected systems and start recovery processes, improving efficiency and reducing human error. Finally, AI and ML systems are self-learning: they improve over time as they adapt to newly emerging threats, strengthening detection and response capabilities. &lt;/p&gt;

&lt;p&gt;In this article, I will explore how Artificial Intelligence and Machine Learning are transforming cloud security, along with the challenges that remain to be addressed.&lt;/p&gt;

&lt;h2&gt;
  
  
  The role of AI and ML in cloud security
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Enhancing Threat Detection and Response
&lt;/h3&gt;

&lt;h4&gt;
  
  
  AI-Driven Threat Detection
&lt;/h4&gt;

&lt;p&gt;AI algorithms can analyze large amounts of data to identify unusual patterns, such as hacker activity or unauthorized attempts to access data. Good examples are Google’s Chronicle and Amazon’s Macie, both cloud-based security services that help organizations detect potential security incidents across their entire network infrastructure. &lt;/p&gt;

&lt;h4&gt;
  
  
  Automated Incident Response
&lt;/h4&gt;

&lt;p&gt;Machine Learning models can be tuned to respond automatically to already-identified threats, mitigating the damage that a security breach and unauthorized data access would cause. These models react very quickly, reducing both response time and the potential damage from a breach.&lt;/p&gt;

&lt;p&gt;AI-driven Security Information and Event Management (SIEM) systems like Splunk and IBM QRadar leverage Machine Learning for real-time threat detection and automated responses. They surface operational insights into vulnerabilities and report on security breaches as they unfold. &lt;/p&gt;

&lt;h3&gt;
  
  
  Predictive Analytics for Proactive Security
&lt;/h3&gt;

&lt;p&gt;Artificial Intelligence models can predict potential threats based on historical data and emerging trends. This makes them valuable in ethical hacking, where they are used to detect vulnerabilities in systems before attackers can exploit them. &lt;/p&gt;

&lt;p&gt;Machine Learning models analyze users' behavior to establish a baseline and identify deviations that may indicate compromised accounts or insider threats. &lt;/p&gt;
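&lt;p&gt;As a minimal sketch of this idea (the data, threshold, and function names here are illustrative, not any vendor's implementation), a behavioral baseline can be as simple as the mean and standard deviation of a user's historical login hours, with logins flagged when they deviate too far:&lt;/p&gt;

```python
import statistics

def build_baseline(login_hours):
    # Summarize a user's historical login hours as (mean, standard deviation)
    return statistics.mean(login_hours), statistics.stdev(login_hours)

def is_anomalous(hour, baseline, threshold=3.0):
    # Flag a login whose z-score against the baseline exceeds the threshold
    mean, stdev = baseline
    if stdev == 0:
        return hour != mean
    return abs(hour - mean) / stdev > threshold
```

&lt;p&gt;Production systems use far richer features (location, device, API call patterns), but the principle of learning a baseline and scoring deviations from it is the same.&lt;/p&gt;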

&lt;h3&gt;
  
  
  Enhanced Access Management
&lt;/h3&gt;

&lt;p&gt;Artificial Intelligence systems can adjust authentication requirements based on real-time risk assessment, and most major providers have incorporated this technique to counter malicious sign-in attempts. An example is Microsoft’s Azure Active Directory Identity Protection, which uses Artificial Intelligence to detect and respond to suspicious sign-ins. Google applies the same idea: when you sign in to your Gmail account from a new device, a confirmation message is sent to your email or phone to verify the sign-in's authenticity.&lt;/p&gt;

&lt;p&gt;AI and ML models are also used for continuous monitoring: trained on what normal sign-ins and access patterns look like, they alert on irregularities and, when an abnormality is detected, trigger a response to the access attempt before damage is inflicted on the system. A service such as Amazon GuardDuty provides intelligent threat detection for Amazon Web Services (AWS) resources by continuously monitoring account activity within the AWS environment.&lt;/p&gt;

&lt;h3&gt;
  
  
  Challenges and Considerations
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Data Privacy and Ethics
&lt;/h4&gt;

&lt;p&gt;The main challenge when working with Artificial Intelligence and Machine Learning models is ensuring they comply with data privacy and ethical standards. These systems require large datasets for training, which often include personal information, so compliance with regulations such as GDPR, HIPAA, and CCPA is a necessity.&lt;/p&gt;

&lt;p&gt;If AI and ML models are not carefully designed for their end goal, they can breach data privacy or reach conclusions in opaque ways, so potential biases in Machine Learning models must be fully addressed. &lt;/p&gt;

&lt;h4&gt;
  
  
  Scalability and Integration
&lt;/h4&gt;

&lt;p&gt;AI and ML models require substantial computational resources to train on large datasets, which makes it challenging to handle growing data volumes and complete complex computations efficiently. &lt;/p&gt;

&lt;p&gt;These computational demands translate into high operational costs, which must be weighed when running and maintaining AI and ML systems. &lt;/p&gt;

&lt;h4&gt;
  
  
  Continuous Learning and Adaptation
&lt;/h4&gt;

&lt;p&gt;Attacks on systems and technology are ever-evolving, necessitating frequent updates and retraining of AI and ML models to handle new types of attacks. Keeping models current, and retaining personnel to retrain them, is a major challenge that raises management costs while operations must remain undisrupted.&lt;/p&gt;

&lt;h2&gt;
  
  
  Future Trends in AI and ML for Cloud Security
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Advanced Threat Detection Response
&lt;/h3&gt;

&lt;p&gt;Future AI and ML systems will integrate more data sources to detect potential threats with higher accuracy. These models will correlate signals across multiple environments for more detailed threat detection and automatically execute the best response protocols against identified threats, reducing human intervention and error. &lt;/p&gt;

&lt;h3&gt;
  
  
  Integration with Quantum Computing
&lt;/h3&gt;

&lt;p&gt;With quantum computing technology, many traditional encryption methods will become vulnerable; AI and ML systems will therefore need to adopt quantum-resistant (post-quantum) cryptographic algorithms to protect and secure data systems against future quantum attacks.&lt;/p&gt;

&lt;p&gt;Quantum computing will also boost Artificial Intelligence itself, providing the power to train complex models and process very large datasets. &lt;/p&gt;

&lt;h3&gt;
  
  
  Federated Learning
&lt;/h3&gt;

&lt;p&gt;Federated learning will enhance data privacy by allowing AI and ML models to be trained across multiple decentralized devices without sharing personal information. Organizations will benefit from collaborative learning while keeping data local and sharing only model updates, staying within data privacy regulations. &lt;/p&gt;

&lt;h3&gt;
  
  
  Improved Transparency and Trust
&lt;/h3&gt;

&lt;p&gt;Future AI and ML models will be explainable, making AI security solutions more transparent and understandable to humans and thereby increasing trust in these systems. They will provide clear explanations for the automated response actions security systems take, which also helps satisfy regulatory compliance requirements.&lt;/p&gt;

&lt;h3&gt;
  
  
  Enhanced User Behavior Analytics (UBA)
&lt;/h3&gt;

&lt;p&gt;AI and ML monitoring of user behavior will keep advancing: models will continuously learn users' behavior and adapt to new patterns, improving the detection of newly invented threats. &lt;/p&gt;

&lt;p&gt;UBA will also integrate with IAM (Identity Access Management) to provide real-time and up-to-date authentication based on user activity hence decreasing data breaches.&lt;/p&gt;

&lt;h3&gt;
  
  
  AI for IoT Security
&lt;/h3&gt;

&lt;p&gt;As IoT devices proliferate, AI and ML models will play a vital role in securing their endpoints, detecting and responding to threats at the edge so that remote and distributed environments stay protected against any form of cyber threat. &lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Artificial Intelligence and Machine Learning have positively impacted security in cloud computing: they reduce the need for human intervention during attacks, use advanced monitoring to track unusual user behavior, trigger automated workflows to respond quickly to affected systems, and activate recovery processes after an attack. These benefits give organizations stronger security while saving on the operational and management costs of manual intervention.&lt;/p&gt;

&lt;p&gt;The future of AI and ML in cloud computing is revolutionary, ushering in a new era of security systems that are intelligent and adaptive to emerging threats. &lt;/p&gt;

</description>
      <category>cybersecurity</category>
      <category>devops</category>
      <category>machinelearning</category>
      <category>ai</category>
    </item>
    <item>
      <title>How to create an AWS DynamoDB database and add data using Python Boto3 library.</title>
      <dc:creator>Joshua Muriki</dc:creator>
      <pubDate>Wed, 15 May 2024 19:30:05 +0000</pubDate>
      <link>https://dev.to/joshwizard/how-to-create-an-aws-dynamodb-database-and-add-data-using-python-boto3-library-31mk</link>
      <guid>https://dev.to/joshwizard/how-to-create-an-aws-dynamodb-database-and-add-data-using-python-boto3-library-31mk</guid>
      <description>&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/dynamodb/" rel="noopener noreferrer"&gt;Amazon DynamoDB&lt;/a&gt; is a managed NoSQL serverless database service providing fast and predictable performance. With Amazon DynamoDB you do not need to provision or manage servers since no installation or operating software is needed. It also scales in and out to adjust the client’s usage in capacity. &lt;/p&gt;

&lt;p&gt;The benefits of adding your data programmatically to DynamoDB are that it reduces the errors that may occur when adding data manually and saves a lot of time, because the task is automated. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/pythonsdk/" rel="noopener noreferrer"&gt;Boto3&lt;/a&gt; is the official AWS SDK for Python. With well structured Python Scripts, the user can comfortably create, update and delete AWS resources. Boto3 is widely used to manage other AWS services such as creating instances, uploading and downloading files among others. &lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;AWS Account (Create one &lt;a href="https://portal.aws.amazon.com/billing/signup?refid=em_127222&amp;amp;redirect_url=https%3A%2F%2Faws.amazon.com%2Fregistration-confirmation#/start/email" rel="noopener noreferrer"&gt;here&lt;/a&gt; if not created)&lt;br&gt;
Python installed (Click this &lt;a href="https://www.python.org/downloads/" rel="noopener noreferrer"&gt;link&lt;/a&gt; if Python is not installed)&lt;br&gt;
Visual Studio Code installed (Install using this &lt;a href="https://code.visualstudio.com/download" rel="noopener noreferrer"&gt;link&lt;/a&gt; if not installed)&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up Boto3 and AWS DynamoDB
&lt;/h2&gt;

&lt;p&gt;On your AWS root account, create an IAM user. For security, avoid working as the root user: under AWS's shared responsibility model, the customer is responsible for security in the cloud. &lt;/p&gt;

&lt;h3&gt;
  
  
  Create an IAM user
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Search for IAM in the AWS Management Console and, under Users, create a new user with a password to sign in. &lt;/li&gt;
&lt;li&gt;Attach the necessary policies, such as AdministratorAccess, to the user. &lt;/li&gt;
&lt;li&gt;Review your settings and click the Create user button. &lt;/li&gt;
&lt;li&gt;Open the new user you have created and create a new access key.&lt;/li&gt;
&lt;li&gt;Follow the steps and select Command Line Interface (CLI) as the access key's use case. &lt;/li&gt;
&lt;li&gt;Download the .csv file for ease of use.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Configuring AWS credentials on your terminal
&lt;/h3&gt;

&lt;p&gt;Open a new VS Code window and open a new terminal. &lt;/p&gt;

&lt;p&gt;Type this command on your terminal:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

aws configure


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Enter your &lt;code&gt;Access Key ID&lt;/code&gt; and &lt;code&gt;Secret access key&lt;/code&gt; as prompted. &lt;/p&gt;

&lt;p&gt;If you are okay with the default &lt;code&gt;region&lt;/code&gt; listed, press &lt;code&gt;Enter&lt;/code&gt; on your keyboard; otherwise, use this command to change the &lt;code&gt;region&lt;/code&gt;: &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

aws configure set region your_region


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Press &lt;code&gt;Enter&lt;/code&gt; on the default format &lt;code&gt;[json]&lt;/code&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Creating a DynamoDB table
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Search &lt;code&gt;DynamoDB&lt;/code&gt; on the &lt;code&gt;AWS console&lt;/code&gt; and create a new table. &lt;/li&gt;
&lt;li&gt;Give your table a new name e.g. &lt;code&gt;graduates&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Set the partition key (primary key) as &lt;code&gt;grad_id&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Proceed to create the table.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Writing Python Code with Boto3
&lt;/h2&gt;

&lt;p&gt;Install the &lt;code&gt;boto3&lt;/code&gt; library by running this command on your terminal:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

pip install boto3


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After successfully installing &lt;code&gt;boto3&lt;/code&gt;, create a new &lt;code&gt;Python&lt;/code&gt; file named &lt;code&gt;python_boto3.py&lt;/code&gt;, import the necessary library, and write the full Python code as follows:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

import boto3

# Create a DynamoDB resource object
db = boto3.resource('dynamodb')

# Specify the DynamoDB table
table = db.Table('graduates')

# Add the first item
table.put_item(
    Item = {
        'grad_id': "1001",
        'Name': "John Doe",
        'Gender': "Male",
        'Phone' : "030 903",
        'email' : "johndoe@mail.com",
        'address' : "Nairobi, Kenya"
    }
)

# Add another Item
table.put_item(
    Item = {
        'grad_id': "1002",
        'Name': "Precious Ellie",
        'Gender': "Female",
        'Phone' : "0789 90878",
        'email' : "elliep@mail.com",
        'address' : "NewYork, USA"
    }
)

print ("Data added successfully")


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the above code snippet:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;import boto3&lt;/code&gt; provides the interface for working with &lt;code&gt;AWS services&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;We create a DynamoDB resource object named &lt;code&gt;db&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;We specify the &lt;code&gt;DynamoDB&lt;/code&gt; table named &lt;code&gt;graduates&lt;/code&gt; using the &lt;code&gt;Table&lt;/code&gt; method of the resource object.&lt;/li&gt;
&lt;li&gt;We add each item to the table using the &lt;code&gt;put_item&lt;/code&gt; method, providing the item's attributes as a dictionary.&lt;/li&gt;
&lt;li&gt;Finally, the &lt;code&gt;print&lt;/code&gt; function outputs &lt;code&gt;Data added successfully&lt;/code&gt; after the items have been added to the &lt;code&gt;table&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To add the data above directly to the DynamoDB table you created on AWS, run the following command on your terminal &lt;em&gt;(python followed by the name of your Python file)&lt;/em&gt;: &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

python python_boto3.py


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;When the data has been uploaded successfully, you should see the data clearly after opening the &lt;code&gt;graduates&lt;/code&gt; table on AWS DynamoDB. For example,&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqyftd7xabeooaasvi85o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqyftd7xabeooaasvi85o.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Best Practices and Considerations for using Boto3 with DynamoDB
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Configuring AWS IAM with proper Permissions
&lt;/h3&gt;

&lt;p&gt;Since you are responsible for security in the cloud, ensure your AWS IAM users have the minimum permissions necessary to perform their tasks. Follow the principle of least privilege: grant only the specific permissions needed for DynamoDB operations, and regularly review and update IAM policies to keep them aligned with it. Also prefer temporary access via roles rather than long-term credentials to enhance security. &lt;/p&gt;

&lt;h3&gt;
  
  
  Optimizing Performance with Batch Operations
&lt;/h3&gt;

&lt;p&gt;When you need to add multiple items to a DynamoDB table, use the &lt;code&gt;BatchWriteItem&lt;/code&gt; operation to reduce the number of HTTP requests and improve performance. With this operation, you can put or delete up to 25 items in a single request. &lt;/p&gt;

&lt;p&gt;Likewise, use &lt;code&gt;BatchGetItem&lt;/code&gt; to retrieve multiple items at once. This minimizes the number of requests and reduces latency.&lt;/p&gt;
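&lt;p&gt;As an illustrative sketch (the table name comes from the example above; error handling and credentials setup are omitted), Boto3's &lt;code&gt;batch_writer&lt;/code&gt; helper buffers &lt;code&gt;put_item&lt;/code&gt; calls and flushes them as &lt;code&gt;BatchWriteItem&lt;/code&gt; requests for you:&lt;/p&gt;

```python
def chunk(items, size=25):
    # BatchWriteItem accepts at most 25 items per request; this helper
    # only illustrates where the batching boundaries fall
    return [items[i:i + size] for i in range(0, len(items), size)]

def put_graduates(items, table_name="graduates"):
    import boto3  # imported here so chunk() stays usable without AWS set up
    table = boto3.resource("dynamodb").Table(table_name)
    # batch_writer buffers the puts and sends them in batches automatically
    with table.batch_writer() as batch:
        for item in items:
            batch.put_item(Item=item)
```

&lt;p&gt;Note that &lt;code&gt;batch_writer&lt;/code&gt; handles the 25-item batching and retries of unprocessed items itself, so you rarely need to chunk manually.&lt;/p&gt;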

&lt;h3&gt;
  
  
  Handling Large Datasets Efficiently
&lt;/h3&gt;

&lt;p&gt;Use &lt;code&gt;Parallel Processing&lt;/code&gt; to distribute the load across multiple threads or processes. This operation will reduce the time required to process large amounts of data.&lt;/p&gt;
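&lt;p&gt;A minimal sketch of this pattern (the &lt;code&gt;worker&lt;/code&gt; function is a hypothetical stand-in for a per-chunk DynamoDB write) distributes work across a thread pool:&lt;/p&gt;

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_process(items, worker, workers=4):
    # Apply worker to every item across a pool of threads, collecting
    # results in input order
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(worker, items))
```

&lt;p&gt;Because DynamoDB calls are I/O-bound, threads are usually sufficient; separate processes are only needed for CPU-heavy transformations.&lt;/p&gt;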

&lt;h3&gt;
  
  
  Managing DynamoDB Capacity and Throughput
&lt;/h3&gt;

&lt;p&gt;To avoid throttling and ensure optimal performance, monitor your tables with Amazon CloudWatch and enable Auto Scaling so that DynamoDB automatically adjusts capacity based on traffic patterns. This ensures your table can handle variable workloads without manual intervention. &lt;/p&gt;

&lt;p&gt;Make sure to configure appropriate scaling policies to balance cost and performance. &lt;/p&gt;

&lt;h3&gt;
  
  
  Security considerations
&lt;/h3&gt;

&lt;p&gt;It is essential to protect sensitive data in your DynamoDB tables. Use AWS server-side encryption (SSE) with AWS Key Management Service (KMS). You can also restrict access to specific items with fine-grained IAM policies to enhance data security and privacy.&lt;/p&gt;
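&lt;p&gt;As a hedged sketch of enabling SSE-KMS from Boto3 (the table name &lt;code&gt;graduates_secure&lt;/code&gt; is hypothetical, and actually running &lt;code&gt;create_encrypted_table&lt;/code&gt; requires configured AWS credentials):&lt;/p&gt;

```python
def sse_spec(kms_key_id=None):
    # Server-side encryption settings accepted by create_table / update_table
    spec = {"Enabled": True, "SSEType": "KMS"}
    if kms_key_id:
        # Omit this to use the AWS-managed key for DynamoDB
        spec["KMSMasterKeyId"] = kms_key_id
    return spec

def create_encrypted_table(name="graduates_secure"):
    import boto3  # requires AWS credentials to actually run
    client = boto3.client("dynamodb")
    return client.create_table(
        TableName=name,
        AttributeDefinitions=[{"AttributeName": "grad_id", "AttributeType": "S"}],
        KeySchema=[{"AttributeName": "grad_id", "KeyType": "HASH"}],
        BillingMode="PAY_PER_REQUEST",
        SSESpecification=sse_spec(),
    )
```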

&lt;h3&gt;
  
  
  Data Modelling and Design
&lt;/h3&gt;

&lt;p&gt;Design your DynamoDB tables and indexes to support efficient queries. Plan your data access patterns in advance to avoid inefficient queries by appropriately using primary keys, sort keys and secondary indexes to optimize data retrieval.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;We have explored how to add data to Amazon DynamoDB using the Boto3 library in Python, starting with the basics of installing the necessary tools and configuring AWS credentials and region settings. We then explained the Python code in detail and finished by highlighting best practices for efficient usage of DynamoDB.&lt;/p&gt;

&lt;p&gt;To keep your DynamoDB operations efficient, scalable, and secure, adhere to these best practices while carefully considering performance. Managing DynamoDB with Boto3 gives you powerful and flexible data-handling capabilities.&lt;/p&gt;

&lt;p&gt;I encourage you to further explore other Boto3 functionalities to fully leverage its capabilities on DynamoDB for your specific use. &lt;/p&gt;

</description>
      <category>boto3</category>
      <category>python</category>
      <category>aws</category>
      <category>devops</category>
    </item>
    <item>
      <title>Creating a Chatbot using Python Tkinter GUI toolkit</title>
      <dc:creator>Joshua Muriki</dc:creator>
      <pubDate>Tue, 07 May 2024 07:02:13 +0000</pubDate>
      <link>https://dev.to/joshwizard/creating-a-chatbot-using-python-tkinter-gui-toolkit-1cdo</link>
      <guid>https://dev.to/joshwizard/creating-a-chatbot-using-python-tkinter-gui-toolkit-1cdo</guid>
      <description>&lt;p&gt;&lt;a href="https://www.googleadservices.com/pagead/aclk?sa=L&amp;amp;ai=DChcSEwiX2enVjeiFAxXVREECHcf0D9wYABABGgJ3cw&amp;amp;ase=2&amp;amp;gclid=Cj0KCQjwir2xBhC_ARIsAMTXk86lRABiiGgYB1p7eEeg3Y9J4aJR9M5pbLB-xB8Fz40mXBvb_AdRdqEaAuzpEALw_wcB&amp;amp;ohost=www.google.com&amp;amp;cid=CAESV-D2w6fWWsPHqRqqBaR3EuNKS9oQngLjc5hZ4JtCdxlnb7yBRdJjBXgvd1dKeB9EqkIKrbfuTdNUnYOZpetPo3edB0es6lQ59zNSIKuxku6JdHkFxpG73w&amp;amp;sig=AOD64_0SVpqtVL2w55qPFe5scjmJLTlZUA&amp;amp;q&amp;amp;nis=4&amp;amp;adurl&amp;amp;ved=2ahUKEwiw5-HVjeiFAxX2caQEHeYvDj4Q0Qx6BAgNEAE" rel="noopener noreferrer"&gt;Chatbot&lt;/a&gt; is a software application that simulates human conversation through text or voice. They are created to answer FAQs (Frequently Asked Questions), provide full-time support to customers, handle inquiries, and reach out to agents for further support on issues that require human intervention.&lt;/p&gt;

&lt;p&gt;In this project, I will show you how to create a Python chatbot powered by the Tkinter GUI (Graphical User Interface) toolkit. This chatbot will handle inquiries and answer questions about the DevOps engineering field.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up the Environment
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Install &lt;code&gt;Python&lt;/code&gt; on your operating system by following these steps &lt;a href="https://www.python.org/downloads/" rel="noopener noreferrer"&gt;here&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Install &lt;code&gt;Tkinter&lt;/code&gt; by running this command in your bash or VS Code terminal&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

pip install tk


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ul&gt;
&lt;li&gt;Create a folder and name it &lt;code&gt;Chatbot&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Open the folder with &lt;code&gt;VS Code&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Create a Python file and call it &lt;code&gt;Chatbot.py&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Designing the User Interface
&lt;/h2&gt;

&lt;p&gt;For convenient User interactions with the chatbot, consider designing a simple Graphical User Interface like this one.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7a37dktjfjr3ynacgfnm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7a37dktjfjr3ynacgfnm.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To achieve the above layout, you need to import the necessary libraries and put your code in order as follows:&lt;/p&gt;

&lt;h4&gt;
  
  
  Importing the necessary libraries
&lt;/h4&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

from tkinter import *
import tkinter as tk
from PIL import Image, ImageTk
import datetime


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h4&gt;
  
  
  Create a date and time function to greet the client as per visiting time.
&lt;/h4&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

# greetings based on time opening chatbot
current_time = datetime.datetime.now()

if 6 &amp;lt;= current_time.hour &amp;lt; 12:
    greeting = "Good Morning!"
elif 12 &amp;lt;= current_time.hour &amp;lt; 17:
    greeting = "Good Afternoon!"
else:
    greeting = "Good Evening!"


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h4&gt;
  
  
  Create the main Tkinter GUI based on your size, color, and font.
&lt;/h4&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

# Tkinter UI/UX design
root=Tk()
root.title('DevOps Chatbot')
root.geometry('720x400+150+100')
root.configure(bg='light blue')
root.resizable(False, False)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h4&gt;
  
  
  Create the heading of the chatbot
&lt;/h4&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

# Heading label to inform the user what to do
heading = Label(root, text=f"{greeting} 😄. I'm DevOps Chatbot. I have DevOps answers for you from the drop-down menu.", fg='#000', bg='white', font=('Microsoft YaHei UI Light', 10, 'bold'))
heading.place(x=10, y=5)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After the heading comes the dropdown menu. Design it appropriately, adding the questions users are most likely to ask. Write each question exactly as it appears in the JSON file, because the questions serve as the keys used to fetch data from it.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

# Dropdown Menu for the user to select
dropdown = tk.StringVar()
dropdown.set("Select a response")
dropdown_menu = tk.OptionMenu(root, dropdown, "What is DevOps", "DevOps Principles", "Benefits of adopting DevOps", "DevOps Career Paths", "DevOps Tools", "Learning Resources")
dropdown_menu.grid(row=1, column=0, padx=100, pady=100)
dropdown_menu.place(x=10, y=40)
dropdown_menu.config(width=70)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h4&gt;
  
  
  Create an area where answers will be displayed
&lt;/h4&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

#This is where the user answers will be displayed
chat_history = Text(root, fg='black', border=2, bg='white', height=17, width=57, font=("cambria", 11))
chat_history.place(x=10, y=90)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h3&gt;
  
  
  Create necessary buttons for navigation and interaction with the chatbot.
&lt;/h3&gt;

&lt;p&gt;This Chatbot will have three buttons;&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;Click for answers&lt;/code&gt; button will display answers corresponding to the selection of the question.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;Clear Screen&lt;/code&gt; button will clear all the data in the chat area. &lt;/p&gt;

&lt;p&gt;The &lt;code&gt;Close Window&lt;/code&gt; button will close the chatbot.&lt;/p&gt;

&lt;p&gt;The following code shows how to design the buttons. &lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

# Button to ask the question
ask = Button(root, width=25, pady=7, text="Click for Answers", fg='black')
ask.place(x=500, y=90)

# Button to clear the screen
clear = Button(root, width=25, pady=7, text="Clear Screen", fg='black')
clear.place(x=500, y=150)

# Button to exit the chatbot
exit = Button(root, width=25, pady=7, text="Close Window", fg='black')
exit.place(x=500, y=210)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h3&gt;
  
  
  Displaying DevOps image
&lt;/h3&gt;

&lt;p&gt;Download the image and copy it to your current directory &lt;/p&gt;

&lt;p&gt;Resize the image to specific dimensions as you may need.&lt;/p&gt;

&lt;p&gt;Set the place to display your image. &lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

# Open the image using Pillow
pil_img = Image.open('devops.png')

# Resize the image to specific dimensions (adjust width and height as needed)
width, height = 180, 110
pil_img = pil_img.resize((width, height), Image.BICUBIC)

# Convert Pillow image to Tkinter-compatible format
tk_img = ImageTk.PhotoImage(pil_img)

# Create a Label image to display the image
Label(root, image=tk_img, bg="white").place(x=500, y=270)

root.mainloop()


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Run your code using the &lt;code&gt;VS Code&lt;/code&gt; run command or by running the following command;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

python Chatbot.py


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h2&gt;
  
  
  Building the Chatbot logic
&lt;/h2&gt;
&lt;h3&gt;
  
  
  Create a Backend
&lt;/h3&gt;

&lt;p&gt;This chatbot will load its data from a JSON file, store the data in a dictionary, and keep lists of responses inside the dictionary. &lt;/p&gt;

&lt;p&gt;In your directory, create a new file and call it &lt;code&gt;answers.json&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;After coming up with the desired FAQs from clients, create topics from the questions. &lt;br&gt;
The questions will be used in the dropdown menu, while further elaborations on the topics will be stored in the &lt;code&gt;answers.json&lt;/code&gt; file.&lt;/p&gt;

&lt;p&gt;For example, a question such as &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;code&gt;DevOps Career Paths&lt;/code&gt;&lt;br&gt;
will have an answer such as &lt;code&gt;DevOps Engineer&lt;/code&gt; with a little explanation of the roles like &lt;code&gt;Plays a central role in implementing and managing DevOps Practices. They are responsible for automating build, deployment and release processes, configuring and managing infrastructure and ensuring smooth collaboration between development and operations teams&lt;/code&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Choose a niche of your taste and work on it. &lt;/p&gt;
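&lt;p&gt;As a small illustrative example (the topics and wording below are placeholders for your own content), each dropdown question in &lt;code&gt;answers.json&lt;/code&gt; maps to a list of answers, and the file can even be generated from Python:&lt;/p&gt;

```python
import json

# Each dropdown question is a key; its value is the list of answers
# the chatbot cycles through (content here is illustrative).
answers = {
    "What is DevOps": [
        "DevOps is a set of practices that combines software development and IT operations to shorten delivery cycles.",
    ],
    "DevOps Career Paths": [
        "DevOps Engineer: automates build, deployment and release processes.",
        "Site Reliability Engineer: focuses on reliability, monitoring and incident response.",
    ],
}

with open("answers.json", "w") as file:
    json.dump(answers, file, indent=4)
```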
&lt;h3&gt;
  
  
  Handling user input and generating responses
&lt;/h3&gt;

&lt;p&gt;Our chatbot has a dropdown menu, a chat area, a Click for Answers button, a Clear Screen button, and a Close Window button. &lt;/p&gt;

&lt;p&gt;When the user selects a question from the &lt;code&gt;dropdown menu&lt;/code&gt; and clicks the &lt;code&gt;click for answers&lt;/code&gt; button, an answer must be displayed in the &lt;code&gt;chat area&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;The logic works as follows:&lt;/p&gt;
&lt;h4&gt;
  
  
  Click for answers
&lt;/h4&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

# Create a dictionary to keep track of the last selected response index
last_selected_index = {}

# Response answer based on the selected question
def random_answers():
    selected_response = dropdown.get()
    if selected_response in answers:
        responses_list = answers[selected_response]
        if responses_list:

            # Display the questions in order
            last_index = last_selected_index.get(selected_response, -1)
            last_index = (last_index + 1) % len(responses_list)

            selected_response_text = responses_list[last_index]
            last_selected_index[selected_response] = last_index

            # View answers on the chatbot
            chat_history.config(state=tk.NORMAL)
            chat_history.insert(tk.END, f"{selected_response_text}\n\n")
            chat_history.config(state=tk.DISABLED)
        else:
            chat_history.config(state=tk.NORMAL)
            chat_history.insert(tk.END, "I don't have a response for that.\n\n")
            chat_history.config(state=tk.DISABLED)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Create a function to retrieve the data in the JSON file in order and view the answers on the chat area when the &lt;code&gt;Click for answers&lt;/code&gt; button is clicked. &lt;/p&gt;

&lt;p&gt;Update the Tkinter user interface code to appear like this&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

# Button to ask the question
ask = Button(root, command=random_answers, width=25, pady=7, text="Click for Answers", fg='black')
ask.place(x=500, y=90)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h4&gt;
  
  
  Clear Screen
&lt;/h4&gt;

&lt;p&gt;When the user clicks this button, all the data in the chat area must be deleted. (The chat history will not be saved but it is possible to save the chat history).&lt;/p&gt;

&lt;p&gt;Create a function for the &lt;code&gt;Clear Screen&lt;/code&gt; button to delete chat history.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

# Clear the chat history
def clear_screen():
    chat_history.config(state=tk.NORMAL)
    chat_history.delete('1.0', tk.END)
    chat_history.config(state=tk.DISABLED)

# Attach the function to the Clear Screen button
clear.config(command=clear_screen)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h4&gt;
  
  
  Close Window
&lt;/h4&gt;

&lt;p&gt;This button will exit the user interface and end all interactions with the user. &lt;/p&gt;

&lt;p&gt;The function behind this button is as hereunder;&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

# Exit the chatbot
def exit_chat():
    root.quit()

# Attach the function to the Close Window button
exit.config(command=exit_chat)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After completing the above code, you should be able to use the &lt;code&gt;dropdown menu&lt;/code&gt;, &lt;code&gt;clear screen&lt;/code&gt;, and &lt;code&gt;close window&lt;/code&gt;. &lt;code&gt;Click for answers&lt;/code&gt; will not work yet since we have not integrated the frontend with the backend.&lt;/p&gt;

&lt;h2&gt;
  
  
  Integrating the Frontend with the Backend
&lt;/h2&gt;

&lt;p&gt;After designing the user interface and JSON file, the final goal is to achieve a functional chatbot that will return responses when the user interacts with your interface. &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

import json

# Load initial responses from a JSON file
answers = {}

try:
    with open('answers.json', 'r') as file:
        answers = json.load(file)
except FileNotFoundError:
    answers = {}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This Python code loads initial responses from a JSON file named &lt;code&gt;answers.json&lt;/code&gt;. It first initializes an empty dictionary called &lt;code&gt;answers&lt;/code&gt;. Then, it attempts to open the JSON file in read mode using a &lt;code&gt;with&lt;/code&gt; statement. If the file is found, it loads the contents of the file into the &lt;code&gt;answers&lt;/code&gt; dictionary using the &lt;code&gt;json.load()&lt;/code&gt; function. If a &lt;code&gt;FileNotFoundError&lt;/code&gt; is raised instead, the &lt;code&gt;answers&lt;/code&gt; dictionary simply remains empty. &lt;/p&gt;
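&lt;p&gt;One refinement worth considering: &lt;code&gt;json.load()&lt;/code&gt; also raises &lt;code&gt;json.JSONDecodeError&lt;/code&gt; when the file exists but contains invalid JSON, so the &lt;code&gt;except&lt;/code&gt; clause can be widened to cover that case too. A sketch (the name &lt;code&gt;load_initial_answers&lt;/code&gt; is hypothetical, for illustration):&lt;/p&gt;

```python
import json

def load_initial_answers(path="answers.json"):
    """Like the snippet above, but also tolerates a malformed JSON file."""
    try:
        with open(path, "r") as f:
            return json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        # Missing or unreadable data: fall back to an empty dictionary.
        return {}
```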

&lt;h3&gt;
  
  
  Testing the integration
&lt;/h3&gt;

&lt;p&gt;Run this command in your terminal to open the user interface, then interact with the chatbot to verify that it is fully functional and ready for deployment and distribution. &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

python chatbot.py


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;If the chatbot is running without errors, you can now decide to use it on your machine or create an executable file for deployment and distribution. &lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Chatbots are ideal communication tools for an organization: they can answer Frequently Asked Questions (FAQs), and they can be designed to redirect users to further assistance elsewhere by providing links to websites and contacts for the organization’s customer care.&lt;/p&gt;

&lt;p&gt;Chatbots can be easily modified and updated depending on the user’s or organization’s demands. Nowadays, Artificial Intelligence is integrated into chatbots, so they can learn as users interact with them and adjust their responses accordingly.&lt;/p&gt;

</description>
      <category>chatbot</category>
      <category>python</category>
      <category>softwaredevelopment</category>
      <category>developer</category>
    </item>
    <item>
      <title>How to containerize a Flask Python application using Docker</title>
      <dc:creator>Joshua Muriki</dc:creator>
      <pubDate>Tue, 23 Apr 2024 22:28:44 +0000</pubDate>
      <link>https://dev.to/joshwizard/how-to-containerize-a-flask-python-application-using-docker-2i2g</link>
      <guid>https://dev.to/joshwizard/how-to-containerize-a-flask-python-application-using-docker-2i2g</guid>
      <description>&lt;p&gt;Before &lt;a href="https://docs.docker.com/"&gt;Docker&lt;/a&gt;, developers, and operations engineers had a huge setback when deploying an application to the servers because dependencies were to be installed on every machine to run the desired application. This approach created a lot of errors and misunderstandings.&lt;/p&gt;

&lt;p&gt;Docker solves this problem by packaging the application into a single image that incorporates all of its dependencies and configuration, reducing errors as well as the work and expertise needed to install it on different machines. Images run consistently regardless of your operating system, they are lightweight, they can be scaled up or down, and containers can be easily managed and automated.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pre-requisites
&lt;/h2&gt;

&lt;p&gt;Docker Installed and running.&lt;br&gt;
Basic understanding of Python and Docker concepts&lt;br&gt;
Visual Studio Code&lt;br&gt;
Bash terminal&lt;/p&gt;
&lt;h2&gt;
  
  
  Setting up the Python application
&lt;/h2&gt;

&lt;p&gt;In this project, you will design a Python application using the Flask framework to handle HTTP requests. Flask is a framework for creating web applications and ships with a built-in development server.&lt;/p&gt;
&lt;h3&gt;
  
  
  Project Files
&lt;/h3&gt;

&lt;p&gt;In this tutorial, I will show and guide you on how to create a simple Flask app that will include the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A basic structure for the Flask application&lt;/li&gt;
&lt;li&gt;A route that handles GET requests and returns a simple response.&lt;/li&gt;
&lt;li&gt;Running the Python Flask app on the terminal by using Python commands.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Follow these steps to design the application.&lt;/p&gt;
&lt;h3&gt;
  
  
  Set up an ideal environment
&lt;/h3&gt;

&lt;p&gt;Make sure you have the latest Python and Flask installed.&lt;br&gt;
Create a new directory for your Flask app.&lt;br&gt;
Navigate to that directory in your terminal.&lt;/p&gt;

&lt;p&gt;Create and activate a virtual environment&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python -m venv venv
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;On Windows, use &lt;code&gt;venv\Scripts\activate&lt;/code&gt;. On macOS or Linux, run&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;source venv/bin/activate
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Install Flask
&lt;/h3&gt;

&lt;p&gt;Run this command in your terminal&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install Flask
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Create a Flask app
&lt;/h3&gt;

&lt;p&gt;In your project directory, create a new file and call it &lt;code&gt;web.py&lt;/code&gt;&lt;br&gt;
In &lt;code&gt;web.py&lt;/code&gt;, &lt;code&gt;import Flask&lt;/code&gt; and create an instance of the Flask application.&lt;br&gt;
Define a route that listens to GET requests on the root URL ‘/’ and returns a simple message.&lt;/p&gt;

&lt;p&gt;Your code should look like this;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from flask import Flask

#Create an instance of the Flask application
web = Flask(__name__)

#Define a route for the root URL ("/")
@web.route('/')

def index():
    return 'Hello DevOps Engineer, This is your Python Flask App. Congratulations!'

# Run the application
if __name__ == '__main__':
    web.run(host="0.0.0.0", port=3000, debug=True)

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the above code snippet;&lt;/p&gt;

&lt;p&gt;You import the Flask &lt;code&gt;class&lt;/code&gt; and create an instance of the Flask application named &lt;code&gt;web&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;The route decorator &lt;code&gt;@web.route(‘/’)&lt;/code&gt; maps the root URL to the function &lt;code&gt;index()&lt;/code&gt;, which returns a simple message: &lt;code&gt;“Hello DevOps Engineer, This is your Python Flask App. Congratulations!”&lt;/code&gt; &lt;/p&gt;

&lt;p&gt;The &lt;code&gt;web.run()&lt;/code&gt; method starts the application whereas &lt;code&gt;debug=True&lt;/code&gt; enables debugging mode.&lt;/p&gt;

&lt;h3&gt;
  
  
  Run the app
&lt;/h3&gt;

&lt;p&gt;For the app to run, execute the following command in your terminal.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python web.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;On the terminal, you should see output indicating that your Flask app is running, along with the address to access it: &lt;a href="http://127.0.0.1:3000/"&gt;http://127.0.0.1:3000/&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Navigate to your browser with the specified URL and see your Flask app in action.&lt;/p&gt;

&lt;h2&gt;
  
  
  Dockerfile
&lt;/h2&gt;

&lt;p&gt;A Dockerfile is a text document that contains all the commands a developer would otherwise run on the command line to build an image. It typically specifies your app’s base image, its working directory, its dependency file, optional environment settings, the port your app listens on, and the command to run when a container starts.&lt;/p&gt;

&lt;h3&gt;
  
  
  How to create a Dockerfile.
&lt;/h3&gt;

&lt;p&gt;Since the server is still running, stop it by pressing &lt;code&gt;Ctrl + C&lt;/code&gt; on your keyboard. &lt;/p&gt;

&lt;p&gt;Navigate to the home directory and create a Dockerfile by running the following command;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;touch Dockerfile
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After successfully creating the Dockerfile, the next step is writing the instructions Docker will use to build an image. &lt;/p&gt;

&lt;h3&gt;
  
  
  Creating Dockerfile Instructions for the Python application
&lt;/h3&gt;

&lt;p&gt;The Dockerfile has the following instructions:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
# Use the official Python image as a base
FROM python

# Set the working directory in the container
WORKDIR /app

# Copy the application files to the working directory
COPY . /app

# Install the required dependencies
RUN pip install -r requirements.txt

EXPOSE 3000

# Define the command to run the app
CMD python ./web.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Explanation of each Dockerfile instruction
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM python
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Uses the official Python base image. If no tag is specified, Docker automatically pulls the latest Python image; in practice it is safer to pin a version, e.g. &lt;code&gt;python:3.12&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;WORKDIR /app
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Sets the working directory to &lt;code&gt;/app&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;COPY . /app
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command copies all files from the current directory (Dockerfile location) to the &lt;code&gt;/app&lt;/code&gt; directory in the container.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;RUN pip install -r requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command installs the dependencies listed in &lt;code&gt;requirements.txt&lt;/code&gt; inside the image.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;EXPOSE 3000
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This instruction documents that the container listens on port &lt;code&gt;3000&lt;/code&gt;; the port is actually published with the &lt;code&gt;-p&lt;/code&gt; flag at run time.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CMD python ./web.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command defines the command to run the &lt;code&gt;Flask app&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Since there is no file named &lt;code&gt;requirements.txt&lt;/code&gt; yet, create one with &lt;code&gt;pip freeze&lt;/code&gt;, which lists all the packages installed in your environment. To store the list in a file, redirect the output by running this command in your terminal.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip freeze &amp;gt; requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
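&lt;p&gt;For intuition, &lt;code&gt;pip freeze&lt;/code&gt; essentially reads the metadata of every distribution installed in the active environment. A rough standard-library approximation (illustrative only, not how pip is actually implemented):&lt;/p&gt;

```python
from importlib import metadata

def frozen_requirements():
    """List installed packages as name==version lines, like pip freeze."""
    return sorted(
        "{}=={}".format(dist.metadata["Name"], dist.version)
        for dist in metadata.distributions()
    )
```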



&lt;h2&gt;
  
  
  Building the Docker image
&lt;/h2&gt;

&lt;p&gt;Now that the &lt;code&gt;Dockerfile&lt;/code&gt; is set up and free of typos, build an image by running this command in your terminal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker build -t &amp;lt;username:version&amp;gt; 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;docker build&lt;/code&gt; is a docker command to build the image.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;-t&lt;/code&gt; will tag the image.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;username&lt;/code&gt; is your Docker registry username plus your desired image name, separated by a forward slash.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;version&lt;/code&gt; can be expressed in different styles, e.g. &lt;code&gt;v1.0.0&lt;/code&gt; or &lt;code&gt;0.0.1-RELEASE&lt;/code&gt;, among others. The trailing &lt;code&gt;.&lt;/code&gt; in the build command is the build context: the directory whose files are sent to the Docker daemon.&lt;/p&gt;
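&lt;p&gt;To make the naming convention concrete, here is a tiny helper (hypothetical, purely illustrative) that assembles a full image reference from its parts:&lt;/p&gt;

```python
def image_reference(username, app_name, version):
    """Combine registry username, image name, and tag into one reference."""
    return "{}/{}:{}".format(username, app_name, version)

# Matches the example build command below:
# image_reference("joshmurih", "webapp", "0.0")  ->  "joshmurih/webapp:0.0"
```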

&lt;p&gt;Assume my command to build my container was this;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker build -t joshmurih/webapp:0.0 .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Running the Container
&lt;/h2&gt;

&lt;p&gt;After successfully creating a &lt;code&gt;Docker image&lt;/code&gt;, it is necessary to run a container from it to check for errors and make sure it works as expected, so that the image can later be pushed to &lt;code&gt;Docker Hub&lt;/code&gt; for deployment.&lt;/p&gt;

&lt;h3&gt;
  
  
  Suitable commands to run the image
&lt;/h3&gt;

&lt;p&gt;On the terminal run the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker container run -d -p 3000:3000 &amp;lt;username:version&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;docker container run&lt;/code&gt;: This is a Docker command to run images.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;-d&lt;/code&gt; runs the container in “detached” mode: it runs in the background and its output is not attached to your terminal, but you can still visit the site in your browser.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;-p 3000:3000&lt;/code&gt;: This flag maps port 3000 on your host machine to port 3000 inside the container, the port the Flask app listens on.&lt;/p&gt;
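&lt;p&gt;The &lt;code&gt;HOST:CONTAINER&lt;/code&gt; syntax can be made explicit with a small parsing sketch (a hypothetical helper, for illustration only):&lt;/p&gt;

```python
def parse_port_mapping(mapping):
    """Split a Docker -p HOST:CONTAINER flag value into its two port numbers."""
    host_port, container_port = mapping.split(":")
    return int(host_port), int(container_port)

# parse_port_mapping("3000:3000") -> (3000, 3000)
```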

&lt;p&gt;&lt;code&gt;&amp;lt;username:version&amp;gt;&lt;/code&gt; This will be the exact name you defined when creating the Docker image. &lt;/p&gt;

&lt;h3&gt;
  
  
  Testing the final containerized application
&lt;/h3&gt;

&lt;p&gt;To verify that the container is running in the background, run this command in your terminal.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker container ls
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It displays all the containers that are running. The most recently started container is listed first. &lt;/p&gt;

&lt;p&gt;Open your browser and visit &lt;a href="http://127.0.0.1:3000/"&gt;http://127.0.0.1:3000/&lt;/a&gt; again.&lt;/p&gt;

&lt;p&gt;If the page loads and shows the app’s message, the containerized application works; otherwise, repeat the process, double-checking for minor typos and other errors.&lt;/p&gt;

&lt;h2&gt;
  
  
  Docker Hub
&lt;/h2&gt;

&lt;p&gt;Docker Hub is a software platform built for developers to find, use, and share containers from anywhere in the world.&lt;/p&gt;

&lt;p&gt;For your team to use the final image you have created, you push it to &lt;code&gt;Docker Hub&lt;/code&gt; for deployment, and they, in turn, can &lt;code&gt;docker pull&lt;/code&gt; the final containerized application to their machines for further use. &lt;/p&gt;

&lt;h3&gt;
  
  
  Pushing the docker image for deployment
&lt;/h3&gt;

&lt;p&gt;Since you created the &lt;code&gt;docker image&lt;/code&gt; with the tag of your &lt;code&gt;Docker Hub username&lt;/code&gt; there is no need to tag it again. &lt;/p&gt;

&lt;p&gt;Proceed to &lt;code&gt;docker login&lt;/code&gt; and enter the required username and password when prompted. &lt;/p&gt;

&lt;p&gt;Once logged in you can push your &lt;code&gt;image&lt;/code&gt; to &lt;code&gt;Docker Hub&lt;/code&gt; using the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker push &amp;lt;username:version&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Containerized applications run in isolated environments, making it easier to manage dependencies and avoid conflicts between applications. Their portability and lightweight nature make deployment and testing easier. Since containers share the host operating system’s kernel, they use hardware resources efficiently. Lastly, containers can be started and stopped quickly, which lets developers ship updates faster and brings new features to market sooner.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Deep Dive into the Pros of Amazon Route 53</title>
      <dc:creator>Joshua Muriki</dc:creator>
      <pubDate>Wed, 03 Jan 2024 22:03:22 +0000</pubDate>
      <link>https://dev.to/joshwizard/deep-dive-into-the-pros-of-amazon-route-53-85f</link>
      <guid>https://dev.to/joshwizard/deep-dive-into-the-pros-of-amazon-route-53-85f</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Amazon Route 53 is a highly available and scalable Domain Name System (DNS) web service offering reliable domain registration, routing capabilities, internet traffic management, consistent health checks and failover, and, importantly, strong security. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key Features of Amazon Route 53&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Domain Registration&lt;/strong&gt;&lt;br&gt;
&lt;em&gt;Domain is a unique address used to access websites&lt;/em&gt;. &lt;/p&gt;

&lt;p&gt;First, check whether your domain is available to use. If your chosen domain is already taken, try changing the top-level domain, for example from .com to a country-based one such as .ke, .us, or .uk, to .org for an organization, or to others like .io. &lt;br&gt;
During registration, you will provide your name and contact details. After the registration process, Amazon sends your information to the domain registrar, and the registry finalizes the task by storing your domain information in its database. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;DNS Routing:&lt;/strong&gt;&lt;br&gt;
It ensures that queries and responses between devices and DNS servers are distributed according to the configured weights.&lt;br&gt;
When a user clicks a link in a browser, the browser looks up the corresponding IP address by sending a DNS request to the servers, and a response is returned according to the user’s request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Traffic Management:&lt;/strong&gt;&lt;br&gt;
Amazon Route 53 uses rules to dictate how traffic is routed on the web, for example by sending set percentages of traffic to particular endpoints or by defining a backup endpoint to use when a server is unhealthy. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Health Checks and Failover&lt;/strong&gt;&lt;br&gt;
Amazon Route 53 includes health monitoring that checks the health and performance of your web applications, web servers, Amazon CloudWatch alarms, and other resources. &lt;br&gt;
Carrying out frequent health checks on services is important, as they show whether or not a particular server can perform work successfully. &lt;br&gt;
You can view the current status of your health checks in the Route 53 console, the AWS SDKs, the AWS CLI, or the Route 53 API. &lt;br&gt;
You can also configure an Amazon CloudWatch alarm to receive a notification about the status of a health check. &lt;br&gt;
In the case of a failover, you can configure Route 53 to route your traffic from an unhealthy resource to a healthy one. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;DNS Security:&lt;/strong&gt;&lt;br&gt;
Security in Amazon Route 53 is the highest priority. Security is a shared responsibility between you and AWS: AWS is responsible for protecting AWS services and their infrastructure, while your responsibility is determined by the AWS services you use. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;COMMON USES&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Hosting websites&lt;/strong&gt;&lt;br&gt;
Amazon Route 53 enables users to host websites: after purchasing a domain name, you upload your website to an S3 bucket and then point your domain to that bucket. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Load balancing traffic&lt;/strong&gt;&lt;br&gt;
Route 53 helps you distribute network traffic evenly across a pool of resources that support an application. It controls internet traffic between application servers and clients, thereby improving scalability, availability, and performance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Disaster Recovery&lt;/strong&gt;&lt;br&gt;
Route 53 is designed following the Well-Architected Framework to ensure data remains available in a crisis. Using automatic failover, you can keep your resources available and easily accessible by establishing a failover routing policy in Route 53. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Global Content Delivery&lt;/strong&gt;&lt;br&gt;
By creating a domain name with Amazon Route 53, the domain becomes accessible all over the world with a single click on a link. Going global is achieved by building your infrastructure across AWS Regions and Availability Zones, which are connected with low-latency, high-throughput networking.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Routing based on health checks&lt;/strong&gt;&lt;br&gt;
Health checks help maintain a web service that is reliable and available. With Amazon Route 53 Application Recovery Controller, you can set up routing control health checks to manage internet traffic failover for your application.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Failover routing&lt;/strong&gt;&lt;br&gt;
Failover routing will redirect your internet production traffic from the affected region to the recovery region. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;DNS ROUTING&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Simple routing policy:&lt;/strong&gt;&lt;br&gt;
This is the use of a single resource to perform a given function for your domain. This policy can be used to create records in a private hosted zone.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Weighted routing:&lt;/strong&gt;&lt;br&gt;
This is used to route traffic to multiple resources based on proportions, or weights, that you assign. Use it in scenarios where you want to control the distribution of traffic among different resources.&lt;/p&gt;
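&lt;p&gt;Conceptually, weighted routing resolves each query in proportion to the assigned weights. This is not Route 53’s actual implementation, just a sketch of proportional selection using hypothetical names:&lt;/p&gt;

```python
def pick_endpoint(weighted_endpoints, roll):
    """Select an endpoint for one request.

    weighted_endpoints: list of (endpoint, weight) pairs.
    roll: a number in [0, total_weight), e.g. drawn uniformly at random.
    """
    upto = 0
    for endpoint, weight in weighted_endpoints:
        upto += weight
        if roll < upto:
            return endpoint
    raise ValueError("roll must be below the total weight")

# With weights [("primary", 3), ("canary", 1)], rolls below 3 resolve to
# "primary" (75% of traffic) and rolls in [3, 4) resolve to "canary" (25%).
```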

&lt;p&gt;&lt;strong&gt;Latency-based routing:&lt;/strong&gt;&lt;br&gt;
Use this policy when you have resources in multiple AWS Regions and want each request served from the Region that offers the lowest latency.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Geolocation based routing:&lt;/strong&gt;&lt;br&gt;
This type of routing is used when your users are in different locations and you want to route traffic to best suit their needs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Multi-value routing:&lt;/strong&gt;&lt;br&gt;
Use this policy when you want Route 53 to respond to DNS queries with up to eight healthy records selected at random. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Failover routing:&lt;/strong&gt;&lt;br&gt;
Use this policy to automatically redirect traffic from an unhealthy resource to a healthy one in a crisis, enhancing the resilience of your applications. &lt;/p&gt;

&lt;p&gt;&lt;em&gt;Amazon Route 53 is a powerful web service for hosting websites and supporting other cutting-edge workloads, ensuring that both the developer and the client receive the best quality of service.&lt;/em&gt;&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
