<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Lidor Ettinger</title>
    <description>The latest articles on DEV Community by Lidor Ettinger (@naturalett).</description>
    <link>https://dev.to/naturalett</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F811493%2F835516b2-98e4-4066-870a-aec44c019f0c.jpeg</url>
      <title>DEV Community: Lidor Ettinger</title>
      <link>https://dev.to/naturalett</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/naturalett"/>
    <language>en</language>
    <item>
      <title>Amplify Your Tech Stack with Jenkins Shared Libraries</title>
      <dc:creator>Lidor Ettinger</dc:creator>
      <pubDate>Tue, 23 May 2023 17:18:47 +0000</pubDate>
      <link>https://dev.to/naturalett/amplify-your-tech-stack-with-jenkins-shared-libraries-434k</link>
      <guid>https://dev.to/naturalett/amplify-your-tech-stack-with-jenkins-shared-libraries-434k</guid>
      <description>&lt;p&gt;With the growing adoption of the Jenkins Pipeline in organizations, common patterns emerge, necessitating the sharing of reusable components to enhance code management and eliminate redundancies. Jenkins’ “Shared Libraries” feature facilitates this by allowing the definition of libraries in external repositories, seamlessly integrated into existing Pipelines.&lt;/p&gt;

&lt;p&gt;In this post, I will guide you through the process of harnessing the full potential of shared functions in Groovy pipelines. By implementing these shared functions, you will be able to unlock scalability, maintainability, and efficiency in your pipelines, resulting in heightened productivity and accelerated development processes.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;What should you expect to learn?&lt;/p&gt;
&lt;/blockquote&gt;

&lt;ol&gt;
&lt;li&gt;Achieving consistency and modularity with Shared Libraries in DevOps processes&lt;/li&gt;
&lt;li&gt;Orchestrating pipeline workflows by modularizing components&lt;/li&gt;
&lt;li&gt;Reusability and benefits of shared common functions for collaboration&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--oo48AyGV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hgwatnynumz2uhml8fhk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--oo48AyGV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hgwatnynumz2uhml8fhk.png" alt="Image description" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Consistency through Modularization
&lt;/h2&gt;

&lt;p&gt;By leveraging Shared Libraries, DevOps teams can modularize processes across the entire project lifecycle, enabling tasks like configuration changes, quality checks, and dependency gathering to be encapsulated within reusable library functions. This approach minimizes error risks and fosters consistency throughout the development process.&lt;/p&gt;
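
&lt;p&gt;As a concrete sketch of what such an encapsulated task might look like, here is a hypothetical quality-check step exposed as a global variable (the file name, linter, and image are assumptions for illustration, not part of the library discussed below):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/usr/bin/env groovy
// vars/runLint.groovy -- hypothetical reusable quality-check step
// Every pipeline that calls runLint() gets the same toolchain and rules
def call(Map args = [:]) {
    def image = args.image ?: 'python:3.11'
    docker.image(image).inside {
        sh 'pip install flake8 &amp;&amp; flake8 .'
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;A Jenkinsfile would then invoke it with a single line, e.g. &lt;code&gt;runLint(image: 'python:3.11')&lt;/code&gt;.&lt;/p&gt;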

&lt;h2&gt;
  
  
  Orchestrating the Pipeline Workflow
&lt;/h2&gt;

&lt;p&gt;As DevOps becomes the central focal point for a myriad of requests to create and customize pipelines, it is crucial to advance towards a more sophisticated orchestration approach. Rather than writing numerous repetitive pipelines, we can deconstruct workflow processes into modular components.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;(root)
+- src
|   +- org
|       +- foo
|           +- functions
|           |   +- docker.groovy
|           +- pipelines
|               +- backend.groovy
+- vars
|   +- commonPipeline.groovy
+- resources
|   +- org
|       +- foo
|           +- docker.yaml (not discussed yet)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Unveiling the skeleton pipeline
&lt;/h2&gt;

&lt;p&gt;Let’s lay the foundation for a standardized skeleton pipeline that provides developers with a clear framework to follow.&lt;/p&gt;

&lt;p&gt;Here is the Core Pipeline Blueprint:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;/vars/commonPipeline.groovy&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/usr/bin/env groovy
def call(sharedLibrary, svcName, buildCommands, pod, slackChannel) {
...
    pipeline {
        agent {
...
        }
        stages {
            stage('Initialization') {
...
            }
            stage('Compilation') {
                when { expression { buildCommands['compileData'].run } }
                steps {
                    echo "Starting Compilation stage"
                    script {
                        try {
                            sharedLibrary.executeStage("compile", buildCommands['compileData'])
                        }  catch(Exception e) {
                            echo "Failed in compilation stage: ${e.toString()}"
                            throw e
                        }
                    }
                }
            }
            stage('Unit test') {
...
            }
            stage('Build and Upload Artifact') {
...
            }
            stage('Integration Tests') {
...
            }
            stage('Deployment') {
...
            }
        }
        post {
...
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
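
&lt;p&gt;With the skeleton in place, a consuming Jenkinsfile stays tiny. The sketch below shows one plausible way to wire it up; the library name, pod label, and Slack channel are placeholder assumptions:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;@Library('my-shared-library') _

// Instantiate the team-specific workflow class from the shared library
def sharedLibrary = new org.foo.pipelines.backend()

// Delegate everything to the skeleton in vars/commonPipeline.groovy
commonPipeline(
    sharedLibrary,
    'my-service',                  // svcName
    [compileData: [run: true]],    // buildCommands
    'build-pod',                   // pod
    '#ci-alerts'                   // slackChannel
)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;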



&lt;h2&gt;
  
  
  The Reusability of Common Functions
&lt;/h2&gt;

&lt;p&gt;Shared common functions foster collaboration and promote consistent, professional development practices.&lt;/p&gt;

&lt;p&gt;Here is an example of a frequently repeated function:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;/src/org/foo/functions/docker.groovy&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def buildImage(Map args=[:]) {
    if (!args.containerName) args.containerName = containerName
    if (!args.version) args.version = "latest"
    container(args.containerName) {
        return docker.withRegistry(
                urlRegistry,
                login
            ) {
            docker.build("${args.imageName}:${args.version}")
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The responsibility of the &lt;code&gt;buildImage&lt;/code&gt; function is to build a Docker image.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How often will this function be used throughout the lifespan of our workflow?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;If the answer is: often&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Consider placing the function in the common function section, structured in a logical hierarchy.&lt;/p&gt;

&lt;p&gt;In our scenario, we have created a folder where we store shared functions related to Docker. This is where we create and store all the Docker functions for our various use cases. Subsequently, we can import the docker.groovy class and conveniently reuse all the functions it contains.&lt;/p&gt;
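
&lt;p&gt;Importing and reusing the class can then be as short as the following sketch (the image name and version are placeholder values):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Inside a pipeline script or another shared-library class
def dockerActions = new org.foo.functions.docker()

// Reuse the shared build logic instead of re-implementing it per pipeline
def image = dockerActions.buildImage(
    imageName: "naturalett/my-service",
    version: "1.0.0"
)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;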

&lt;h2&gt;
  
  
  Building the pipeline workflow
&lt;/h2&gt;

&lt;p&gt;The workflow class, located in the shared library, is distinct and tailored to the specific needs of the team or product. It serves as an individualized class within the library and can potentially utilize other shared classes, such as the docker.groovy mentioned earlier. Essentially, the purpose of the pipeline workflow is to incorporate and build upon the skeleton structure we previously established.&lt;/p&gt;

&lt;p&gt;Here is an example:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;/src/org/foo/pipelines/backend.groovy&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;package org.foo.pipelines;
import groovy.transform.Field

@Field String svcName = (scm.getUserRemoteConfigs()[0].getUrl().tokenize('/')[3].split("\\.")[0]).toLowerCase()
@Field String containerName = 'docker', organization = "naturalett"
@Field def dockerActions = new org.foo.functions.docker()

def executeStage(stageName, stageData, tag="") {
    switch (stageName) {
        case "initialization":
            this.initialization(stageData)
            break;
        case "compile":
            this.compile(stageData)
            break;
        case "test":
            this.test(stageData)
            break;
        case "artifact":
            this.artifact(stageData)
            break;
        case "int-test":
            this.intTest(stageData)
            break;
        case "deployment":
            this.deployment(stageData)
            break;
    }
}
def initialization(stageData) {
    echo "TODO"
}
def compile(stageData) {
    container(containerName) {
        creds.setupCredentials()
        image = dockerActions.buildImage(
            imageName: "${organization}/${svcName}"
        )
        // Alternative way to build the image: image = docker.build("${organization}/${svcName}")
    }
}
def test(stageData) {
    echo "TODO"
}
def artifact(stageData) {
    echo "TODO"
}
def intTest(stageData) {
    echo "TODO"
}
def deployment(stageData) {
    echo "TODO"
}
def successStep() {
    echo "TODO"
}
return this
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The differentiation between each pipeline occurs within the workflow itself. Each pipeline defines its own workflow and utilizes the shared classes from the shared library. This approach allows us to have a tailored workflow for each team or product while also promoting the reusability of functions.&lt;/p&gt;
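
&lt;p&gt;For instance, a second team could add its own workflow class alongside &lt;code&gt;backend.groovy&lt;/code&gt; and reuse the same shared functions. The sketch below is hypothetical; a &lt;code&gt;frontend.groovy&lt;/code&gt; does not exist in the example repository:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;package org.foo.pipelines;

// Hypothetical sibling of backend.groovy for a frontend team
def dockerActions = new org.foo.functions.docker()

def executeStage(stageName, stageData, tag="") {
    switch (stageName) {
        case "compile":
            // Same shared Docker helper, different image name
            dockerActions.buildImage(imageName: "naturalett/frontend")
            break;
    }
}
return this
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;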

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;We’ve explored the concept of Jenkins Shared Libraries and their benefits in optimizing Jenkins Pipelines. Shared Libraries enable the reuse of components across projects, enhancing code management and eliminating redundancies. By modularizing processes and encapsulating them within reusable library functions, organizations can achieve consistency and modularity in their DevOps processes.&lt;/p&gt;

&lt;h2&gt;
  
  
  Our next steps involve…
&lt;/h2&gt;

&lt;p&gt;There are additional practices that deserve discussion of their own. In the upcoming post, I will delve into automated pipeline decisions, including:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Configuring project language for initial setup&lt;/li&gt;
&lt;li&gt;Jenkinsfile implementation&lt;/li&gt;
&lt;/ol&gt;




&lt;p&gt;&lt;em&gt;Are you looking to enhance your DevOps skills and learn how to implement continuous integration using Jenkins container pipelines? Join my course, where you’ll gain practical experience in various pipeline scenarios and master production skills. Take the first step towards becoming a DevOps expert today.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;a href="https://www.udemy.com/course/hands-on-mastering-devops-ci-and-jenkins-pipelines-in-aws/?referralCode=0F32127B68008428C23F"&gt;Hands-On Mastering DevOps — CI and Jenkins Container Pipelines&lt;/a&gt;
&lt;/h2&gt;

</description>
      <category>devops</category>
      <category>jenkins</category>
      <category>pipelines</category>
      <category>codemanagement</category>
    </item>
    <item>
      <title>Optimize Development with Jenkins Pipelines and Continuous Integration</title>
      <dc:creator>Lidor Ettinger</dc:creator>
      <pubDate>Fri, 19 May 2023 22:42:13 +0000</pubDate>
      <link>https://dev.to/naturalett/optimize-development-with-jenkins-pipelines-and-continuous-integration-1h49</link>
      <guid>https://dev.to/naturalett/optimize-development-with-jenkins-pipelines-and-continuous-integration-1h49</guid>
      <description>&lt;p&gt;Developing and releasing new software versions is an ongoing process that demands careful attention to detail. It’s essential to have the ability to track and analyze the entire process, even retrospectively, to identify any issues and take corrective measures.&lt;/p&gt;

&lt;p&gt;This is where the concept of continuous integration comes into the picture.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_BXjvC-L--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4kkusfzghzytagui7434.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_BXjvC-L--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4kkusfzghzytagui7434.png" alt="Continuous Integration and Jenkins Pipelines in AWS" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By adopting a continuous integration approach, we can effectively monitor the various stages of the software development process and analyze the results. This helps us to identify any potential issues, analyze them, and make the necessary adjustments to improve the overall development process.&lt;/p&gt;

&lt;p&gt;In this blog post, we will explore the concept of continuous integration and its benefits. We will discuss how to achieve continuous integration using Jenkins and provide insights on how this approach can help you streamline your software development process.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Want hands-on practice with Continuous Integration and Jenkins?&lt;br&gt;
You are invited to take a look at the course I’ve created, where you will write a complete pipeline in AWS based on SDLC principles, with my guidance.&lt;br&gt;
&lt;a href="https://www.udemy.com/course/hands-on-mastering-devops-ci-and-jenkins-pipelines-in-aws/?referralCode=0F32127B68008428C23F"&gt;Continuous Integration and Jenkins Pipelines in AWS&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  What triggers continuous integration from the perspective of the development team?
&lt;/h2&gt;

&lt;p&gt;The development team triggers continuous integration by pushing code changes to the code repository. This action triggers an automated pipeline that builds, tests, and deploys the updated software version.&lt;/p&gt;
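
&lt;p&gt;In Jenkins declarative syntax, this push-based triggering is usually wired up through an SCM webhook, with SCM polling as a common fallback when webhooks are unavailable. A minimal sketch (the polling schedule is an arbitrary choice):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pipeline {
    agent any
    triggers {
        // Check the repository for new commits every five minutes;
        // a webhook from GitHub/GitLab makes this near-instant instead
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Triggered by a code change'
            }
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;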

&lt;p&gt;The concept of a structured workflow is to establish a standardized order of operations that developers agree upon, enabling them to ensure that their subsequent versions are built in accordance with the software development life cycle (SDLC) defined by management. Let’s now examine some of the primary advantages of continuous integration:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Version history —&lt;/strong&gt; Developers can easily track their production versions and compare the performance of different versions during development. In addition, they can quickly roll back to a previous version in case of any production issues.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Quality testing —&lt;/strong&gt; Developers can test their versions on a staging environment to demonstrate how the new version performs in an environment similar to production. Instead of running the version on their local machine, which may not be comparable to the real environment, they can define a set of tests, such as unit tests, integration tests, and more, that will take their new version through a predefined workflow. This testing process will serve as their signature and ensure that the new version is safe to be deployed in a production environment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Schedule triggering —&lt;/strong&gt; Developers no longer need to manually trigger their pipeline or define a new pipeline for each new project. As a DevOps team, it is our responsibility to create a robust system that will attach to each project its own pipeline. Whether it’s a common pipeline with slight changes to match the project or the same pipeline, developers can focus on writing code while the continuous integration takes care of the rest. Scheduling an automatic triggering, for example, every morning or evening, will ensure that the current code in Github is always ready for release.&lt;/p&gt;

&lt;h2&gt;
  
  
  Jenkins in the Age of Continuous Integration
&lt;/h2&gt;

&lt;p&gt;To achieve the desired pipeline workflow, we’ll deploy Jenkins and create a comprehensive pipeline that prioritizes version control, automated testing, and triggers.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A virtual machine with a docker engine&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Containerize Jenkins&lt;/strong&gt;&lt;br&gt;
We will deploy Jenkins in a Docker container in order to simplify the deployment of our CI/CD pipelines.&lt;/p&gt;

&lt;p&gt;Let’s deploy Jenkins:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -d \
        --name jenkins -u root \
        -p 8080:8080 -p 50000:50000 \
        -v /var/run/docker.sock:/var/run/docker.sock \
        naturalett/jenkins:2.387-jdk11-hello-world
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Validate the Jenkins Container:&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;docker ps | grep -i jenkins&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;Retrieve the Jenkins initial password:&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;docker exec jenkins bash -c -- 'cat /var/jenkins_home/secrets/initialAdminPassword'&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;Connect to Jenkins at &lt;a href="http://localhost:8080"&gt;http://localhost:8080&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you're looking for assistance with setting up Jenkins in a Docker container, I encourage you to visit my YouTube channel, where you'll find detailed tutorials and valuable insights. Feel free to subscribe for regular updates:&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/4Fko2cLyDKo?start=119"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h2&gt;
  
  
  Building a Continuous Integration Pipeline
&lt;/h2&gt;

&lt;p&gt;I chose to use Groovy in Jenkins pipelines because it offers a number of benefits:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Groovy is a scripting language that is easy to learn and use&lt;/li&gt;
&lt;li&gt;Groovy offers features for writing concise, readable, and maintainable code&lt;/li&gt;
&lt;li&gt;Groovy’s syntax is similar to Java, making it easier for Java developers to adopt&lt;/li&gt;
&lt;li&gt;Groovy has excellent support for working with data formats commonly used in software development&lt;/li&gt;
&lt;li&gt;Groovy provides an efficient and effective way to build robust and flexible CI/CD pipelines in Jenkins&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Our pipeline consists of four phases:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Phase 1 — The Agent&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To ensure the code is built as expected with no incompatible dependencies, each pipeline requires a virtual environment. We create an agent (virtual environment) in a Docker container during the following phase. Since Jenkins is also running in a Docker container, we’ll mount the Docker socket to enable agent execution.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pipeline {
    agent {
        docker {
            image 'docker:19.03.12'
            args '-v /var/run/docker.sock:/var/run/docker.sock'
        }
    }
...
...
...
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Phase 2 — The History of Versions&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Versioning is an essential practice that allows developers to track changes to their code and compare software performance, enabling them to make informed decisions on whether to roll back to a previous version or release a new one. In the next phase, we create a Docker image from our code and assign it a tag based on our predetermined set of definitions.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;For example: Date — Jenkins Build Number — Commit Hash&lt;br&gt;
&lt;/p&gt;
&lt;/blockquote&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pipeline {
    agent {
...
    }
    stages {
        stage('Build') {
            steps {
                script {
                  def currentDate = new java.text.SimpleDateFormat("MM-dd-yyyy").format(new Date())
                  def shortCommit = sh(returnStdout: true, script: "git log -n 1 --pretty=format:'%h'").trim()
                  customImage = docker.build("naturalett/hello-world:${currentDate}-${env.BUILD_ID}-${shortCommit}")
                }
            }
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After completing the previous phase, a Docker image of our code was generated and can now be accessed in our local environment.&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;docker images | grep -i hello-world&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Phase 3 — The Test&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Testing is a critical step to ensure that a new release version meets all functional requirements. In the following stage, we run tests against the Docker image generated in the previous stage, which contains the potential next release.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pipeline {
    agent {
...
    }
    stages {
        stage('Test') {
            steps {
                script {
                    customImage.inside {
                        sh """#!/bin/bash
                        cd /app
                        pytest test_*.py -v --junitxml='test-results.xml'"""
                    }
                }
            }
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Phase 4 — The Scheduling Trigger&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Automating the pipeline trigger allows developers to focus on code writing while ensuring stability and readiness for the next release. To achieve this, we define a morning schedule that triggers the pipeline automatically as the development team starts their day.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pipeline {
    agent {
...
    }
    triggers {
        // https://crontab.guru
        cron '00 7 * * *'
    }
    stages {
...
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;An end-to-end pipeline of the process&lt;/strong&gt;&lt;br&gt;
To execute our pipeline, we have included a pre-defined pipeline in Jenkins. Simply run the “my-first-pipeline” Jenkins job to begin.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Agent —&lt;/strong&gt; is a virtual environment used for the pipeline&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Trigger —&lt;/strong&gt; is used for automatic scheduling in the pipeline&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Clone stage —&lt;/strong&gt; is responsible for cloning the project repository&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Build stage —&lt;/strong&gt; creates the Docker image for the project; to access the latest commit and other Git features, we install the Git package&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Test stage —&lt;/strong&gt; involves performing tests on our Docker image
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pipeline {
    agent {
        docker {
            image 'docker:19.03.12'
            args '-v /var/run/docker.sock:/var/run/docker.sock'
        }
    }
    triggers {
        // https://crontab.guru
        cron '00 7 * * *'
    }
    stages {
        stage('Clone') {
            steps {
                git branch: 'main', url: 'https://github.com/naturalett/hello-world.git'
            }
        }
        stage('Build') {
            steps {
                script {
                  sh 'apk add git'
                  def currentDate = new java.text.SimpleDateFormat("MM-dd-yyyy").format(new Date())
                  def shortCommit = sh(returnStdout: true, script: "git log -n 1 --pretty=format:'%h'").trim()
                  customImage = docker.build("naturalett/hello-world:${currentDate}-${env.BUILD_ID}-${shortCommit}")
                }
            }
        }
        stage('Test') {
            steps {
                script {
                    customImage.inside {
                        sh """#!/bin/bash
                        cd /app
                        pytest test_*.py -v --junitxml='test-results.xml'"""
                    }
                }
            }
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Summary&lt;/strong&gt;&lt;br&gt;
We’ve seen how continuous integration fits into our daily work and tried out some key pipeline workflows firsthand.&lt;/p&gt;

&lt;p&gt;If you’re interested in further exploring different pipeline scenarios and learning practical production skills for implementing continuous integration in your own work, check out my course:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;a href="https://www.udemy.com/course/hands-on-mastering-devops-ci-and-jenkins-pipelines-in-aws/?referralCode=0F32127B68008428C23F"&gt;Hands-On Mastering DevOps — CI and Jenkins Container Pipelines&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>jenkins</category>
      <category>devops</category>
      <category>pipelines</category>
      <category>cloud</category>
    </item>
  </channel>
</rss>
