<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Kunal Das</title>
    <description>The latest articles on DEV Community by Kunal Das (@kunaldas).</description>
    <link>https://dev.to/kunaldas</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F825483%2Fd7ebb885-2e73-41cc-958f-5f655ef63a37.jpeg</url>
      <title>DEV Community: Kunal Das</title>
      <link>https://dev.to/kunaldas</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/kunaldas"/>
    <language>en</language>
    <item>
      <title>Setting Up a Comprehensive Python Build Validation Pipeline in Azure DevOps</title>
      <dc:creator>Kunal Das</dc:creator>
      <pubDate>Thu, 03 Oct 2024 16:49:00 +0000</pubDate>
      <link>https://dev.to/kunaldas/setting-up-a-comprehensive-python-build-validation-pipeline-in-azure-devops-3d9k</link>
      <guid>https://dev.to/kunaldas/setting-up-a-comprehensive-python-build-validation-pipeline-in-azure-devops-3d9k</guid>
      <description>&lt;h1&gt;
  
  
  Setting Up a Comprehensive Python Build Validation Pipeline in Azure DevOps
&lt;/h1&gt;

&lt;p&gt;Hey there, &lt;br&gt;
Almost everyone who deploys Python code wants it thoroughly checked and held to proper standards, but setting up a code quality pipeline in your CI-CD process can be cumbersome for various reasons.&lt;/p&gt;

&lt;p&gt;Mainly,&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;You have to set it up for each individual pipeline, which is a problem: many of us have multiple repos across multiple projects, and the setup needs to be standardized so it just works without tinkering in every repo.&lt;/li&gt;
&lt;li&gt;There are plenty of tools out there, and you have to find the right set and make sure they work together without issues.&lt;/li&gt;
&lt;li&gt;Publishing the output also matters: it helps developers easily see what is lacking so they can start improving it.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Python developers and DevOps enthusiasts! Today, we're going to walk through setting up a robust build validation pipeline for your Python projects using Azure DevOps. This guide will help you ensure code quality, maintain consistency across multiple repositories, and streamline your development process - all without relying on external tools.&lt;/p&gt;

&lt;p&gt;Not long ago I was tasked with finding a FREE solution, without any external tools, that had to meet the following requirements:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Must work with multiple repositories.&lt;/li&gt;
&lt;li&gt;Must provide meaningful output.&lt;/li&gt;
&lt;li&gt;Should improve overall code quality.&lt;/li&gt;
&lt;li&gt;Developers must be able to run the checks locally before pushing.&lt;/li&gt;
&lt;li&gt;All code in the main branch must follow best-in-class industry standards.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;While developing the build validation pipeline (an Azure DevOps term: a pipeline that must run successfully before a pull request can be merged into a particular branch), I started looking for Python coding standards. I stumbled upon many articles and was unsure which one to follow, but the popularity and near-unanimous adoption of PEP 8 made the decision much easier.&lt;/p&gt;

&lt;p&gt;Ref: &lt;a href="https://peps.python.org/pep-0008/" rel="noopener noreferrer"&gt;https://peps.python.org/pep-0008/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;OK, so on top of a PEP 8 validator I thought of adding isort, another great tool that validates whether your imports are correctly sorted, and on top of that I added Black.&lt;br&gt;
If we look at Black's definition, we see:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Black is the uncompromising Python code formatter. By using it, you agree to cede control over minutiae of hand-formatting. In return, Black gives you speed, determinism, and freedom from pycodestyle nagging about formatting. You will save time and mental energy for more important matters."
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Ref: &lt;a href="https://pypi.org/project/black/" rel="noopener noreferrer"&gt;https://pypi.org/project/black/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So for the code quality part, I ended up using three tools, or validators:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Black (code formatter)&lt;/li&gt;
&lt;li&gt;isort (import sorter)&lt;/li&gt;
&lt;li&gt;Flake8 (linter)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I also added pytest and code coverage checks to this pipeline, to make sure developers diligently write unit tests before they push code to the main branch:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;pytest (testing framework)&lt;/li&gt;
&lt;li&gt;Code coverage analysis&lt;/li&gt;
&lt;/ol&gt;
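&lt;p&gt;To show what the pipeline expects from developers, here is a minimal, self-contained pytest example. The &lt;code&gt;slugify&lt;/code&gt; function is made up for illustration; in a real project the code under test would live in &lt;code&gt;sourcecode/&lt;/code&gt; and be imported from there.&lt;/p&gt;

```python
# test_slugify.py -- a minimal pytest example.
# The function under test is inlined here so the sketch runs on its own;
# in a real repo it would be imported from the sourcecode/ package.

def slugify(title: str) -> str:
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

def test_slugify_basic():
    assert slugify("Hello World") == "hello-world"

def test_slugify_collapses_whitespace():
    assert slugify("  Python   CI  ") == "python-ci"
```

&lt;p&gt;Dropping files like this under &lt;code&gt;tests/&lt;/code&gt; is what makes the pytest and coverage steps later in the pipeline meaningful.&lt;/p&gt;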

&lt;p&gt;&lt;code&gt;The steps here are for Azure DevOps Pipelines, but they can easily be replicated in GitHub Actions or any other CI-CD tool with minimal tweaking.&lt;br&gt;
&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;So let's look at the pipeline I've been describing.&lt;/p&gt;
&lt;h2&gt;
  
  
  Step-by-Step Setup
&lt;/h2&gt;
&lt;h3&gt;
  
  
  Step 1: Create the Azure Pipeline YAML File
&lt;/h3&gt;

&lt;p&gt;First, create a file named &lt;code&gt;azure-pipelines.yml&lt;/code&gt; in the root of your repository. This file will define our pipeline.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;trigger&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;none&lt;/span&gt;

&lt;span class="na"&gt;variables&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;sourcePath&lt;/span&gt;
    &lt;span class="na"&gt;value&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;sourcecode'&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;coverageThreshold&lt;/span&gt;
    &lt;span class="na"&gt;value&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;80&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;group&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Tokens&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;CodeDirectory&lt;/span&gt;
    &lt;span class="na"&gt;value&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(System.DefaultWorkingDirectory)'&lt;/span&gt;

&lt;span class="na"&gt;pool&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;vm../images/image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ubuntu-latest'&lt;/span&gt;

&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;job&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;BuildTestAndAnalyze&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;🚀&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Build,&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Test,&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;and&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Analyze'&lt;/span&gt;
  &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;UsePythonVersion@0&lt;/span&gt;
    &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;versionSpec&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3.9'&lt;/span&gt;
    &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;🐍&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Use&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Python&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;3.9'&lt;/span&gt;

  &lt;span class="c1"&gt;# More steps will be added here&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This sets up the basic structure of our pipeline, specifying the Python version and defining some variables we'll use later.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Now if you see ths issue we need this build validation pipeline in multiple repositories which is a bit tricky since we all know if we add this yaml to a particular repo then it will only work on that and not any other repo instead I wanted it to be a seperate but clone the repo for which it get's triggered in the pull request.

The solution? 
look at he below step :)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 2: Clone the PR Repository
&lt;/h3&gt;

&lt;p&gt;Add this step to clone the PR repository:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;bash&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
      &lt;span class="s"&gt;echo "🚀 Initiating repository clone process..."&lt;/span&gt;
      &lt;span class="s"&gt;REPO_URL=$(System.PullRequest.SourceRepositoryURI)&lt;/span&gt;
      &lt;span class="s"&gt;AUTH_URL=$(echo $REPO_URL | sed "s|cyncly-engineering@|$(Code_Read_PAT)@|")&lt;/span&gt;
      &lt;span class="s"&gt;FULL_BRANCH=$(System.PullRequest.SourceBranch)&lt;/span&gt;
      &lt;span class="s"&gt;BRANCH=$(echo $FULL_BRANCH | sed 's|^refs/heads/||')&lt;/span&gt;
      &lt;span class="s"&gt;CLONE_DIR=$(System.DefaultWorkingDirectory)/cloned_repo&lt;/span&gt;

      &lt;span class="s"&gt;git clone --branch "$BRANCH" "$AUTH_URL" "$CLONE_DIR"&lt;/span&gt;

      &lt;span class="s"&gt;if [ $? -eq 0 ]; then&lt;/span&gt;
        &lt;span class="s"&gt;echo "✅ Git clone successful"&lt;/span&gt;
        &lt;span class="s"&gt;echo "##vso[task.setvariable variable=CodeDirectory]$CLONE_DIR"&lt;/span&gt;
      &lt;span class="s"&gt;else&lt;/span&gt;
        &lt;span class="s"&gt;echo "❌ Git clone failed"&lt;/span&gt;
        &lt;span class="s"&gt;exit 1&lt;/span&gt;
      &lt;span class="s"&gt;fi&lt;/span&gt;
    &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;🔄&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Clone&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;PR&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Repository'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This step clones the PR repository, allowing us to work with the latest code.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;This is the trickiest part; for other CI-CD tools, this is the step that needs the most modification.&lt;/code&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Install Dependencies
&lt;/h3&gt;

&lt;p&gt;Next, let's install the project dependencies and our code quality tools:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
      &lt;span class="s"&gt;echo "🔧 Setting up Python environment..."&lt;/span&gt;
      &lt;span class="s"&gt;python -m pip install --upgrade pip&lt;/span&gt;
      &lt;span class="s"&gt;if [ -f $(sourcePath)/requirements.txt ]; then&lt;/span&gt;
        &lt;span class="s"&gt;pip install -r $(sourcePath)/requirements.txt&lt;/span&gt;
      &lt;span class="s"&gt;fi&lt;/span&gt;
    &lt;span class="na"&gt;workingDirectory&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;$(CodeDirectory)&lt;/span&gt;
    &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;📦&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Install&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Project&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Dependencies'&lt;/span&gt;

  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
      &lt;span class="s"&gt;echo "🛠️ Installing code quality and testing tools..."&lt;/span&gt;
      &lt;span class="s"&gt;pip install black flake8 isort pytest pytest-cov httpx&lt;/span&gt;
    &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;🛠️&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Install&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Code&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Quality&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;and&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Testing&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Tools'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 4: Run Code Quality Checks
&lt;/h3&gt;

&lt;p&gt;Now, let's add steps to run our code quality checks:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
      &lt;span class="s"&gt;echo "🖌️ Running Black formatter check..."&lt;/span&gt;
      &lt;span class="s"&gt;black --check --line-length 79 $(sourcePath) &lt;/span&gt;
    &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;🖌️&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Run&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Black'&lt;/span&gt;
    &lt;span class="na"&gt;workingDirectory&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;$(CodeDirectory)&lt;/span&gt;

  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
      &lt;span class="s"&gt;echo "🔀 Running isort import sorter check..."&lt;/span&gt;
      &lt;span class="s"&gt;isort --check-only $(sourcePath)&lt;/span&gt;
    &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;🔀&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Run&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;isort'&lt;/span&gt;
    &lt;span class="na"&gt;workingDirectory&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;$(CodeDirectory)&lt;/span&gt;

  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
      &lt;span class="s"&gt;echo "🔍 Running Flake8 linter..."&lt;/span&gt;
      &lt;span class="s"&gt;flake8 $(sourcePath)&lt;/span&gt;
    &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;🔍&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Run&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Flake8'&lt;/span&gt;
    &lt;span class="na"&gt;workingDirectory&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;$(CodeDirectory)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These steps will check your code formatting, import sorting, and linting.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 5: Run Tests and Generate Coverage Report
&lt;/h3&gt;

&lt;p&gt;Add this step to run tests and generate a coverage report:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
      &lt;span class="s"&gt;echo "🧪 Running unit tests and generating coverage report..."&lt;/span&gt;
      &lt;span class="s"&gt;pytest --cov=$(sourcePath) --cov-report=xml --cov-report=html ./tests&lt;/span&gt;
    &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;🧪&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Run&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Tests&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;with&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;pytest'&lt;/span&gt;
    &lt;span class="na"&gt;workingDirectory&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;$(CodeDirectory)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 6: Check Code Coverage Threshold
&lt;/h3&gt;

&lt;p&gt;Let's ensure our code meets a minimum coverage threshold:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
      &lt;span class="s"&gt;echo "📊 Checking code coverage threshold..."&lt;/span&gt;
      &lt;span class="s"&gt;coverage_percentage=$(python -c "import xml.etree.ElementTree as ET; tree = ET.parse('coverage.xml'); root = tree.getroot(); print(float(root.attrib['line-rate']) * 100)")&lt;/span&gt;
      &lt;span class="s"&gt;echo "Current code coverage: $coverage_percentage%"&lt;/span&gt;
      &lt;span class="s"&gt;echo "Coverage threshold: $COVERAGE_THRESHOLD%"&lt;/span&gt;
      &lt;span class="s"&gt;if (( $(echo "$coverage_percentage &amp;lt; $COVERAGE_THRESHOLD" | bc -l) )); then&lt;/span&gt;
        &lt;span class="s"&gt;echo "❌ ##vso[task.logissue type=error]Code coverage ($coverage_percentage%) is below the threshold of $COVERAGE_THRESHOLD%"&lt;/span&gt;
        &lt;span class="s"&gt;exit 1&lt;/span&gt;
      &lt;span class="s"&gt;else&lt;/span&gt;
        &lt;span class="s"&gt;echo "✅ Code coverage ($coverage_percentage%) meets or exceeds the threshold of $COVERAGE_THRESHOLD%"&lt;/span&gt;
      &lt;span class="s"&gt;fi&lt;/span&gt;
    &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;📊&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Check&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Code&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Coverage&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Threshold'&lt;/span&gt;
    &lt;span class="na"&gt;workingDirectory&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;$(CodeDirectory)&lt;/span&gt;
    &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;COVERAGE_THRESHOLD&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;$(coverageThreshold)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
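&lt;p&gt;The inline &lt;code&gt;python -c&lt;/code&gt; one-liner above is just lightweight XML parsing. Unpacked into plain Python (with a tiny inline Cobertura-style snippet standing in for the real &lt;code&gt;coverage.xml&lt;/code&gt; so the sketch is self-contained), it looks like this:&lt;/p&gt;

```python
import xml.etree.ElementTree as ET

# Stripped-down stand-in for the coverage.xml that pytest-cov writes;
# only the line-rate attribute matters for the threshold check.
SAMPLE = '<coverage line-rate="0.8732" branch-rate="0.5"></coverage>'

def coverage_percentage(xml_text: str) -> float:
    """Return line coverage as a percentage, as the pipeline step computes it."""
    root = ET.fromstring(xml_text)
    return float(root.attrib["line-rate"]) * 100

pct = coverage_percentage(SAMPLE)
print(f"Current code coverage: {pct:.2f}%")
assert pct >= 80, "below threshold"
```

&lt;p&gt;The bash wrapper in the step above then compares this percentage against &lt;code&gt;$(coverageThreshold)&lt;/code&gt; and fails the job when coverage falls short.&lt;/p&gt;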



&lt;h3&gt;
  
  
  Step 7: Publish Coverage Report
&lt;/h3&gt;

&lt;p&gt;Finally, let's publish our coverage report:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PublishCodeCoverageResults@1&lt;/span&gt;
    &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;codeCoverageTool&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;cobertura'&lt;/span&gt;
      &lt;span class="na"&gt;summaryFileLocation&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(CodeDirectory)/coverage.xml'&lt;/span&gt;
      &lt;span class="na"&gt;reportDirectory&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(CodeDirectory)/htmlcov'&lt;/span&gt;
    &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;📈&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Publish&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Coverage&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Report'&lt;/span&gt;

  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;publish&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;$(CodeDirectory)/htmlcov&lt;/span&gt;
    &lt;span class="na"&gt;artifact&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;CodeCoverageReport&lt;/span&gt;
    &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;📦&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Publish&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Coverage&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Report&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;as&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Artifact'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Setting Up Local Development Environment
&lt;/h2&gt;

&lt;p&gt;To ensure developers can run these checks locally, add the following instructions to your project's README:&lt;/p&gt;

&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Python 3.9&lt;/li&gt;
&lt;li&gt;pip (Python package manager)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Setting Up Your Environment
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Clone the repository:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;   git clone &amp;lt;repository-url&amp;gt;
   cd &amp;lt;repository-name&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Create a virtual environment:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;   python -m venv venv
   source venv/bin/activate  # On Windows, use `venv\Scripts\activate`
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Install dependencies:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;   pip install -r sourcecode/requirements.txt
   pip install black flake8 isort pytest pytest-cov httpx
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Running Checks Locally
&lt;/h3&gt;

&lt;p&gt;To ensure your code passes the pipeline checks, run these commands from the project root:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Black (code formatting):
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;   black --check --line-length 79 sourcecode
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;isort (import sorting):
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;   isort --check-only sourcecode
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Flake8 (linting):
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;   flake8 sourcecode
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;pytest (unit tests and coverage):
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;   pytest --cov=sourcecode --cov-report=xml --cov-report=html ./tests
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Setting Up Pre-commit Hooks
&lt;/h2&gt;

&lt;p&gt;To catch issues before they even make it to the pipeline, set up pre-commit hooks:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Install pre-commit:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;   pip install pre-commit
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Create a &lt;code&gt;.pre-commit-config.yaml&lt;/code&gt; file in your project root:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;   &lt;span class="na"&gt;repos&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
   &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;repo&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;https://github.com/psf/black&lt;/span&gt;
     &lt;span class="na"&gt;rev&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;22.3.0&lt;/span&gt;
     &lt;span class="na"&gt;hooks&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
     &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;black&lt;/span&gt;
       &lt;span class="na"&gt;args&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="nv"&gt;--line-length=79&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;
   &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;repo&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;https://github.com/PyCQA/isort&lt;/span&gt;
     &lt;span class="na"&gt;rev&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;5.10.1&lt;/span&gt;
     &lt;span class="na"&gt;hooks&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
     &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;isort&lt;/span&gt;
   &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;repo&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;https://github.com/PyCQA/flake8&lt;/span&gt;
     &lt;span class="na"&gt;rev&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;4.0.1&lt;/span&gt;
     &lt;span class="na"&gt;hooks&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
     &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;flake8&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Install the hooks:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;   pre-commit install
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Configuring isort
&lt;/h2&gt;

&lt;p&gt;To ensure consistent import sorting that aligns with Black's formatting, create a file named &lt;code&gt;.isort.cfg&lt;/code&gt; in the &lt;code&gt;sourcecode/&lt;/code&gt; directory:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ini"&gt;&lt;code&gt;&lt;span class="nn"&gt;[settings]&lt;/span&gt;
&lt;span class="py"&gt;profile&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;black&lt;/span&gt;
&lt;span class="py"&gt;multi_line_output&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;3&lt;/span&gt;
&lt;span class="py"&gt;include_trailing_comma&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;True&lt;/span&gt;
&lt;span class="py"&gt;force_grid_wrap&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;0&lt;/span&gt;
&lt;span class="py"&gt;use_parentheses&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;True&lt;/span&gt;
&lt;span class="py"&gt;ensure_newline_before_comments&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;True&lt;/span&gt;
&lt;span class="py"&gt;line_length&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;79&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
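&lt;p&gt;Flake8's default 79-character limit already matches the &lt;code&gt;--line-length 79&lt;/code&gt; we pass to Black, but the two tools can still disagree on a couple of whitespace rules. A minimal &lt;code&gt;.flake8&lt;/code&gt; sketch (the &lt;code&gt;E203&lt;/code&gt;/&lt;code&gt;W503&lt;/code&gt; ignores are the ones commonly recommended for Black compatibility; adjust the excludes to your repo layout):&lt;/p&gt;

```ini
[flake8]
max-line-length = 79
# Black's slice spacing and line-break placement trip these two checks
extend-ignore = E203, W503
exclude = venv,.git,__pycache__
```

&lt;p&gt;With this in place, running &lt;code&gt;flake8 sourcecode&lt;/code&gt; locally and in the pipeline reports the same findings.&lt;/p&gt;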



&lt;p&gt;OK, now how can we set up this pipeline so that all of our code gets scanned properly before it reaches the main branch?&lt;/p&gt;

&lt;p&gt;First, we need a PAT to clone any repo across the Azure DevOps organization, so let's create that first and then set up the pipeline as a branch policy.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 1: Create a Personal Access Token (PAT)
&lt;/h2&gt;

&lt;p&gt;To clone repositories across your Azure DevOps organization, you'll need a PAT with appropriate permissions. Here's how to create one:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Sign in to your Azure DevOps organization (&lt;a href="https://dev.azure.com/%7Byour-organization%7D" rel="noopener noreferrer"&gt;https://dev.azure.com/{your-organization}&lt;/a&gt;).&lt;/li&gt;
&lt;li&gt;In the top right corner, click on your profile picture and select "Personal access tokens".&lt;/li&gt;
&lt;li&gt;Click on "New Token".&lt;/li&gt;
&lt;li&gt;Give your token a name (e.g., "Build Validation Pipeline").&lt;/li&gt;
&lt;li&gt;Set the organization to your Azure DevOps organization.&lt;/li&gt;
&lt;li&gt;For expiration, choose an appropriate timeframe (e.g., 1 year).&lt;/li&gt;
&lt;li&gt;Under "Scopes", select "Custom defined" and then check the following permissions:

&lt;ul&gt;
&lt;li&gt;Code (Read &amp;amp; Write)&lt;/li&gt;
&lt;li&gt;Build (Read &amp;amp; Execute)&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Click "Create".
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi0jqnem40vtdk53yeuzr.png" alt="alt text"&gt;
&lt;/li&gt;
&lt;li&gt;Copy the generated token and store it securely. You won't be able to see it again.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Step 2: Add the PAT as a Pipeline Variable
&lt;/h2&gt;

&lt;p&gt;Now that you have a PAT, you need to add it as a variable in your pipeline:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;In Azure DevOps, go to "Pipelines" &amp;gt; "Library".&lt;/li&gt;
&lt;li&gt;Click on "+ Variable group".&lt;/li&gt;
&lt;li&gt;Name the group (e.g., "BuildValidationTokens").&lt;/li&gt;
&lt;li&gt;Add a new variable:

&lt;ul&gt;
&lt;li&gt;Name: &lt;code&gt;Code_Read_PAT&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Value: Paste your PAT here&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Check the "Keep this value secret" box.&lt;/li&gt;
&lt;li&gt;Click "Save".&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Step 3: Update Your Pipeline YAML
&lt;/h2&gt;

&lt;p&gt;Update your &lt;code&gt;azure-pipelines.yml&lt;/code&gt; to use the variable group:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;variables&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;group&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;BuildValidationTokens&lt;/span&gt;
  &lt;span class="c1"&gt;# ... other variables ...&lt;/span&gt;

&lt;span class="c1"&gt;# ... rest of your pipeline configuration ...&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 4: Create a Branch Policy
&lt;/h2&gt;

&lt;p&gt;To ensure this pipeline runs on all pull requests to your main branch:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to your Azure DevOps project.&lt;/li&gt;
&lt;li&gt;Navigate to "Repos" &amp;gt; "Branches".&lt;/li&gt;
&lt;li&gt;Find your main branch (often named &lt;code&gt;main&lt;/code&gt; or &lt;code&gt;master&lt;/code&gt;).&lt;/li&gt;
&lt;li&gt;Click the ellipsis (...) next to the branch name and select "Branch policies".&lt;/li&gt;
&lt;li&gt;Under "Build Validation", click "+ Add build policy".&lt;/li&gt;
&lt;li&gt;Select your build validation pipeline.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzibb3e0idaq31tjlx8u8.png" alt="alt text"&gt;
&lt;/li&gt;
&lt;li&gt;Set "Trigger" to "Automatic".&lt;/li&gt;
&lt;li&gt;Set "Policy requirement" to "Required".&lt;/li&gt;
&lt;li&gt;Set "Build expiration" as per your preference (e.g., "Immediately").&lt;/li&gt;
&lt;li&gt;Click "Save".&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Step 5: Configure Branch Protection
&lt;/h2&gt;

&lt;p&gt;To prevent direct pushes to the main branch:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Stay on the branch policies page for your main branch.&lt;/li&gt;
&lt;li&gt;Under "Require a minimum number of reviewers", check the box.&lt;/li&gt;
&lt;li&gt;Set the minimum number of reviewers (e.g., 1 or 2).&lt;/li&gt;
&lt;li&gt;Optionally, check "Allow requestors to approve their own changes" based on your team's preferences.&lt;/li&gt;
&lt;li&gt;Click "Save".&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Step 6: Update Repository Settings
&lt;/h2&gt;

&lt;p&gt;Ensure your repository settings align with your new policy:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to "Project settings" &amp;gt; "Repositories".&lt;/li&gt;
&lt;li&gt;Select your repository.&lt;/li&gt;
&lt;li&gt;Under "Policies":

&lt;ul&gt;
&lt;li&gt;Check "Require a minimum number of reviewers".&lt;/li&gt;
&lt;li&gt;Set "Minimum number of reviewers" (should match your branch policy).&lt;/li&gt;
&lt;li&gt;Check "Check for linked work items".&lt;/li&gt;
&lt;li&gt;Check "Check for comment resolution".&lt;/li&gt;
&lt;li&gt;Optionally, check other policies as per your team's needs.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Click "Save".&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Step 7: Educate Your Team
&lt;/h2&gt;

&lt;p&gt;Finally, make sure your team understands the new process:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create documentation explaining the new pipeline and its checks.&lt;/li&gt;
&lt;li&gt;Highlight the importance of running checks locally before creating a pull request.&lt;/li&gt;
&lt;li&gt;Explain how to interpret and act on the pipeline results.&lt;/li&gt;
&lt;li&gt;Encourage the use of pre-commit hooks for catching issues early.&lt;/li&gt;
&lt;/ol&gt;
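The pre-commit hooks mentioned above can be wired up with a config along these lines (illustrative; the `rev` tags are examples you should pin to the versions your team uses):

```yaml
# Example .pre-commit-config.yaml using the same tools as the pipeline
repos:
  - repo: https://github.com/psf/black
    rev: 24.8.0        # pin to your chosen version
    hooks:
      - id: black
  - repo: https://github.com/PyCQA/isort
    rev: 5.13.2        # pin to your chosen version
    hooks:
      - id: isort
  - repo: https://github.com/PyCQA/flake8
    rev: 7.1.1         # pin to your chosen version
    hooks:
      - id: flake8
```

Run `pip install pre-commit &amp;&amp; pre-commit install` once per clone, and the same checks the pipeline enforces will run locally on every commit.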

&lt;p&gt;By following these steps, you've now set up a robust system where:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;All code changes must go through a pull request.&lt;/li&gt;
&lt;li&gt;Each pull request automatically triggers your build validation pipeline.&lt;/li&gt;
&lt;li&gt;The pipeline checks code formatting, linting, tests, and coverage.&lt;/li&gt;
&lt;li&gt;Pull requests cannot be merged until the pipeline passes and required reviews are completed.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Now, in the repo where you added the build validation, try to merge code into the main branch.&lt;br&gt;
You will see that the build validation pipeline is automatically queued, and you can view its logs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fri85bqpi0m0irgary8sc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fri85bqpi0m0irgary8sc.png" alt="alt text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After that, if we step back into the pipeline run summary, there are two important things to note:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv6gqoakjd6cz3172yy7e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv6gqoakjd6cz3172yy7e.png" alt="alt text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At the top left there is a tab called "Code Coverage".&lt;br&gt;
Clicking it shows a line-by-line coverage report for each file.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl2egksgh61h7vfxea2ay.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl2egksgh61h7vfxea2ay.png" alt="alt text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This setup ensures that all code going into your main branch meets your quality standards. Remember to periodically review and update your pipeline as your project evolves and new best practices emerge.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;By following these steps, you'll have a robust build validation pipeline that ensures code quality across your Python projects. This setup:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Enforces consistent code style&lt;/li&gt;
&lt;li&gt;Catches potential bugs early&lt;/li&gt;
&lt;li&gt;Encourages comprehensive testing&lt;/li&gt;
&lt;li&gt;Provides clear feedback to developers&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Remember, the key to success is continuous improvement. Don't hesitate to iterate on this pipeline as you learn what works best for your team and projects. Happy coding!&lt;/p&gt;

&lt;p&gt;It takes a lot of time to write this up and share it with the community. If you think it can benefit others, please spread the word!&lt;/p&gt;

</description>
      <category>azure</category>
      <category>python</category>
      <category>pytest</category>
    </item>
    <item>
      <title>Creating Azure Web App CI/CD with Terraform and Azure DevOps</title>
      <dc:creator>Kunal Das</dc:creator>
      <pubDate>Sun, 21 Jan 2024 12:38:01 +0000</pubDate>
      <link>https://dev.to/kunaldas/creating-azure-web-app-cicd-with-terraform-and-azure-devops-5hn4</link>
      <guid>https://dev.to/kunaldas/creating-azure-web-app-cicd-with-terraform-and-azure-devops-5hn4</guid>
      <description>&lt;h1&gt;
  
  
  Step-by-Step Guide: Creating Azure Web App CI/CD with Terraform and Azure DevOps
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" alt="Kunal Das, Author" width="44" height="44"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Reach at : &lt;a href="https://heylink.me/kunaldas"&gt;https://heylink.me/kunaldas&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TfUrg6pQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Ahj3ezml_2bs9uDUssU4oLQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TfUrg6pQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Ahj3ezml_2bs9uDUssU4oLQ.png" alt="" width="700" height="383"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Introduction:&lt;br&gt;&lt;br&gt;
In today’s fast-paced development environment, implementing Continuous Integration and Continuous Deployment (CI/CD) is crucial for efficient software delivery. In this tutorial, we will walk through the process of setting up a CI/CD pipeline for an Azure web app using Terraform and Azure DevOps. By following these steps, you can automate the deployment process and ensure high-quality code reaches your production environment.&lt;/p&gt;

&lt;p&gt;Prerequisites:&lt;br&gt;&lt;br&gt;
Before we begin, make sure you have the following prerequisites in place:&lt;br&gt;&lt;br&gt;
- An Azure DevOps account&lt;br&gt;&lt;br&gt;
- Access to an Azure subscription&lt;br&gt;&lt;br&gt;
- Basic knowledge of Terraform and Docker&lt;/p&gt;

&lt;p&gt;Step 1: Creating Infrastructure with Terraform&lt;br&gt;&lt;br&gt;
To ensure a consistent and repeatable infrastructure setup, we will use Terraform. Terraform allows us to define our infrastructure as code and manage it through version control. Follow these steps to create the necessary infrastructure for your Azure web app:&lt;/p&gt;

&lt;p&gt;1. Set up a new Azure DevOps project.&lt;br&gt;&lt;br&gt;
2. Create a new repository within your project and clone it to your local machine.&lt;br&gt;&lt;br&gt;
3. Create a folder structure for your Terraform code, such as “infra/environment1” and “infra/environment2” for two environments.&lt;br&gt;&lt;br&gt;
4. Within each environment folder, create a main.tf file and define the necessary Azure resources, such as resource groups, app services, and databases. Use Terraform modules to ensure a modular and reusable infrastructure setup.&lt;br&gt;&lt;br&gt;
5. Define input variables in a separate variable.tf file to make your infrastructure code dynamic and easily configurable.&lt;br&gt;&lt;br&gt;
6. Initialize Terraform within each environment folder by running the &lt;code&gt;terraform init&lt;/code&gt; command. This will download the required providers and modules.&lt;br&gt;&lt;br&gt;
7. Create a build pipeline in Azure DevOps to trigger the Terraform deployment. Use the “Terraform CLI” task to execute Terraform commands, passing the appropriate input variables for each environment.&lt;/p&gt;

&lt;p&gt;Step 2: CI Steps&lt;br&gt;&lt;br&gt;
In this step, we will set up Continuous Integration (CI) to ensure code quality and generate artifacts for deployment.&lt;/p&gt;

&lt;p&gt;1. Configure your source code repository in Azure DevOps to trigger the CI pipeline on code changes.&lt;br&gt;&lt;br&gt;
2. Set up a build pipeline in Azure DevOps with the following steps:&lt;br&gt;&lt;br&gt;
a. Use a task to install the required dependencies for your application, such as Python packages.&lt;br&gt;&lt;br&gt;
b. Run unit tests using your preferred testing framework (e.g., pytest) to ensure code functionality.&lt;br&gt;&lt;br&gt;
c. Use linters (flake8, black) and formatters (isort) to enforce code style consistency.&lt;br&gt;&lt;br&gt;
d. Generate a code coverage report using coverage.py to track the percentage of code covered by tests.&lt;br&gt;&lt;br&gt;
e. Integrate with SonarQube for static code analysis and maintain code quality.&lt;br&gt;&lt;br&gt;
f. Build a Docker image containing your application code and dependencies.&lt;br&gt;&lt;br&gt;
g. Push the Docker image to an Azure Container Registry (ACR) for later use in deployment.&lt;/p&gt;

&lt;p&gt;Step 3: CD — Continuous Deployment&lt;br&gt;&lt;br&gt;
Now that we have our infrastructure in place and a reliable CI process, we can proceed to set up Continuous Deployment (CD) to automate the deployment process.&lt;/p&gt;

&lt;p&gt;1. Determine the deployment strategy based on your requirements. For example:&lt;br&gt;&lt;br&gt;
— Two environments (dev and prod) can follow a simple deployment flow where the code is deployed to dev first and then promoted to prod after successful testing.&lt;br&gt;&lt;br&gt;
— Three environments (dev, test, and prod) can follow a more complex deployment flow where code moves through each environment sequentially.&lt;br&gt;&lt;br&gt;
2. Create release pipelines in Azure DevOps for each environment, using the appropriate deployment strategy.&lt;br&gt;&lt;br&gt;
3. Configure post-deployment tests specific to each environment to ensure the deployed application is functioning as expected.&lt;br&gt;&lt;br&gt;
4. Add performance testing steps after the deployment to the dev environment to validate system&lt;/p&gt;

&lt;p&gt;performance under real-world conditions. Use tools like Apache JMeter or Azure Application Insights to conduct performance tests.&lt;/p&gt;

&lt;p&gt;Below are code snippets for each step, following a generic, parameterized approach and best practices:&lt;/p&gt;

&lt;p&gt;Step 1: Creating Infrastructure with Terraform&lt;/p&gt;

&lt;p&gt;1. Set up Azure DevOps pipeline:&lt;br&gt;&lt;br&gt;
— Create a new pipeline in Azure DevOps and select your repository.&lt;br&gt;&lt;br&gt;
— Choose the appropriate trigger for your pipeline, such as a repository branch or pull request.&lt;/p&gt;

&lt;p&gt;2. Configure Terraform backend:&lt;br&gt;&lt;br&gt;
— Create an Azure Storage Account to store Terraform state.&lt;br&gt;&lt;br&gt;
— Add the following code to your &lt;code&gt;main.tf&lt;/code&gt; file within each environment folder:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight terraform"&gt;&lt;code&gt;&lt;span class="k"&gt;terraform&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
 &lt;span class="nx"&gt;backend&lt;/span&gt; &lt;span class="err"&gt;“&lt;/span&gt;&lt;span class="nx"&gt;azurerm&lt;/span&gt;&lt;span class="err"&gt;”&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
 &lt;span class="nx"&gt;storage_account_name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="err"&gt;“&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;storage_account_name&lt;/span&gt;&lt;span class="err"&gt;&amp;gt;”&lt;/span&gt;
 &lt;span class="nx"&gt;container_name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="err"&gt;“&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;container_name&lt;/span&gt;&lt;span class="err"&gt;&amp;gt;”&lt;/span&gt;
 &lt;span class="nx"&gt;key&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="err"&gt;“&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;environment&lt;/span&gt;&lt;span class="err"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;tfstate&lt;/span&gt;&lt;span class="err"&gt;”&lt;/span&gt;
 &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;3. Define variables:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight terraform"&gt;&lt;code&gt;&lt;span class="k"&gt;variable&lt;/span&gt; &lt;span class="s2"&gt;"environment"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
 &lt;span class="nx"&gt;type&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;string&lt;/span&gt;
 &lt;span class="nx"&gt;description&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Environment name"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="k"&gt;variable&lt;/span&gt; &lt;span class="s2"&gt;"resource_group_name"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
 &lt;span class="nx"&gt;type&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;string&lt;/span&gt;
 &lt;span class="nx"&gt;description&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Name of the resource group"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="c1"&gt;# Add more variables as per your infrastructure requirements&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;— Create a &lt;code&gt;variables.tf&lt;/code&gt; file within each environment folder to hold these input variables.&lt;/p&gt;

&lt;p&gt;4. Create infrastructure resources:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight terraform"&gt;&lt;code&gt;&lt;span class="k"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"azurerm_resource_group"&lt;/span&gt; &lt;span class="s2"&gt;"rg"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
 &lt;span class="nx"&gt;name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="kd"&gt;var&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;resource_group_name&lt;/span&gt;
 &lt;span class="nx"&gt;location&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"West US"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="k"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"azurerm_app_service_plan"&lt;/span&gt; &lt;span class="s2"&gt;"app_service_plan"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
 &lt;span class="nx"&gt;name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="kd"&gt;var&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;environment&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;-app-service-plan"&lt;/span&gt;
 &lt;span class="nx"&gt;location&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;azurerm_resource_group&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;rg&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;location&lt;/span&gt;
 &lt;span class="nx"&gt;resource_group_name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;azurerm_resource_group&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;rg&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;
 &lt;span class="nx"&gt;kind&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Linux"&lt;/span&gt;
 &lt;span class="nx"&gt;reserved&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
&lt;span class="nx"&gt;sku&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
 &lt;span class="nx"&gt;tier&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Basic"&lt;/span&gt;
 &lt;span class="nx"&gt;size&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"B1"&lt;/span&gt;
 &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="c1"&gt;# Add more resources as per your infrastructure requiremen&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The code above creates a resource group and an App Service plan; add it to the &lt;code&gt;main.tf&lt;/code&gt; file within each environment folder. Next, add the following steps to your Azure DevOps pipeline YAML to run Terraform:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight terraform"&gt;&lt;code&gt;&lt;span class="nx"&gt;-&lt;/span&gt; &lt;span class="nx"&gt;task&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;UsePythonVersion&lt;/span&gt;&lt;span class="err"&gt;@&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;
 &lt;span class="nx"&gt;inputs&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt;
 &lt;span class="nx"&gt;versionSpec&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'3.x'&lt;/span&gt;
 &lt;span class="nx"&gt;addToPath&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
&lt;span class="nx"&gt;-&lt;/span&gt; &lt;span class="nx"&gt;task&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;TerraformInstaller&lt;/span&gt;&lt;span class="err"&gt;@&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;
 &lt;span class="nx"&gt;inputs&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt;
 &lt;span class="nx"&gt;terraformVersion&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'latest'&lt;/span&gt;
&lt;span class="nx"&gt;-&lt;/span&gt; &lt;span class="nx"&gt;task&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;TerraformCLI&lt;/span&gt;&lt;span class="err"&gt;@&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;
 &lt;span class="nx"&gt;inputs&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt;
 &lt;span class="nx"&gt;command&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'init'&lt;/span&gt;
 &lt;span class="nx"&gt;workingDirectory&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'infra/environment1'&lt;/span&gt;
&lt;span class="nx"&gt;-&lt;/span&gt; &lt;span class="nx"&gt;task&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;TerraformCLI&lt;/span&gt;&lt;span class="err"&gt;@&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;
 &lt;span class="nx"&gt;inputs&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt;
 &lt;span class="nx"&gt;command&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'apply'&lt;/span&gt;
 &lt;span class="nx"&gt;workingDirectory&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'infra/environment1'&lt;/span&gt;
 &lt;span class="nx"&gt;environmentServiceNameAzureRM&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&amp;lt;connection_name&amp;gt;'&lt;/span&gt;
 &lt;span class="nx"&gt;commandOptions&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'-auto-approve'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;5. Execute Terraform commands in the Azure DevOps pipeline:&lt;br&gt;&lt;br&gt;
— The YAML above installs Python and Terraform, then runs &lt;code&gt;terraform init&lt;/code&gt; and &lt;code&gt;terraform apply&lt;/code&gt; for each environment folder.&lt;/p&gt;

&lt;p&gt;Step 2: CI Steps&lt;/p&gt;

&lt;p&gt;1. Configure CI pipeline triggers and variables:&lt;br&gt;&lt;br&gt;
— Define your pipeline triggers and variables in the Azure DevOps YAML file, as per your requirements.&lt;/p&gt;

&lt;p&gt;2. Set up build steps:&lt;br&gt;&lt;br&gt;
— Use appropriate task names and commands based on your specific needs. Here’s an example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;UsePythonVersion@0&lt;/span&gt;
 &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
 &lt;span class="na"&gt;versionSpec&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3.x'&lt;/span&gt;
 &lt;span class="na"&gt;addToPath&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PythonScript@0&lt;/span&gt;
 &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
 &lt;span class="na"&gt;scriptSource&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;filePath'&lt;/span&gt;
 &lt;span class="na"&gt;scriptPath&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;scripts/install_dependencies.py'&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PythonScript@0&lt;/span&gt;
 &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
 &lt;span class="na"&gt;scriptSource&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;filePath'&lt;/span&gt;
 &lt;span class="na"&gt;scriptPath&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;scripts/run_unit_tests.py'&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;CmdLine@2&lt;/span&gt;
 &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
 &lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
 &lt;span class="s"&gt;pip install coverage&lt;/span&gt;
 &lt;span class="s"&gt;coverage run - source=src -m pytest&lt;/span&gt;
 &lt;span class="s"&gt;coverage xml -o coverage.xml&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;SonarQubePrepare@4&lt;/span&gt;
 &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
 &lt;span class="na"&gt;SonarQube&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;&amp;lt;SonarQube_connection_name&amp;gt;'&lt;/span&gt;
 &lt;span class="na"&gt;scannerMode&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;CLI'&lt;/span&gt;
 &lt;span class="na"&gt;cliProjectKey&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;&amp;lt;project_key&amp;gt;'&lt;/span&gt;
 &lt;span class="na"&gt;cliProjectName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;&amp;lt;project_name&amp;gt;'&lt;/span&gt;
 &lt;span class="na"&gt;cliSources&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;src'&lt;/span&gt;
 &lt;span class="na"&gt;extraProperties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
 &lt;span class="s"&gt;sonar.coverage.reportPaths=coverage.x&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
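The pipeline above calls helper scripts such as &lt;code&gt;scripts/run_unit_tests.py&lt;/code&gt;. A minimal sketch of what such a script might look like; the script name and the &lt;code&gt;tests&lt;/code&gt; directory layout are assumptions, not a fixed convention:

```python
# Illustrative sketch of a scripts/run_unit_tests.py helper. It shells out to
# pytest and emits a JUnit XML file that PublishTestResults can pick up.
import subprocess
import sys


def build_pytest_command(test_dir: str = "tests",
                         junit_xml: str = "test-results.xml") -> list:
    """Assemble the pytest invocation so the pipeline can publish JUnit results."""
    return [sys.executable, "-m", "pytest", test_dir, f"--junitxml={junit_xml}"]


if __name__ == "__main__":
    # Propagate pytest's exit code so a failing test fails the pipeline step.
    sys.exit(subprocess.call(build_pytest_command()))
```

Keeping the command construction in a small function makes the invocation easy to unit-test and to extend (e.g., adding coverage flags) without touching the pipeline YAML.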



&lt;p&gt;3. Build and push Docker image:&lt;br&gt;&lt;br&gt;
— Use the appropriate task names and commands based on your specific needs. Here’s an example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Docker@2&lt;/span&gt;
 &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
 &lt;span class="na"&gt;containerRegistry&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;&amp;lt;container_registry_connection_name&amp;gt;'&lt;/span&gt;
 &lt;span class="na"&gt;repository&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;&amp;lt;image_repository_name&amp;gt;'&lt;/span&gt;
 &lt;span class="na"&gt;command&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;buildAndPush'&lt;/span&gt;
 &lt;span class="na"&gt;Dockerfile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Dockerfile'&lt;/span&gt;
 &lt;span class="na"&gt;tags&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(Build.BuildId)'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Step 3: CD — Continuous Deployment&lt;/p&gt;

&lt;p&gt;1. Determine deployment strategy and set up release pipelines:&lt;br&gt;&lt;br&gt;
— Based on your deployment strategy, define the release pipelines in Azure DevOps with appropriate stages and triggers.&lt;/p&gt;

&lt;p&gt;2. Configure post-deployment tests:&lt;br&gt;&lt;br&gt;
— Use the appropriate tasks and scripts to run post-deployment tests. Here’s an example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PythonScript@0&lt;/span&gt;  
&lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;  
&lt;span class="na"&gt;scriptSource&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;‘filePath’&lt;/span&gt;  
&lt;span class="na"&gt;scriptPath&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;‘scripts/post\_deployment\_tests.py’&lt;/span&gt;  
&lt;span class="na"&gt;arguments&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;‘-e $(environment)’&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PublishTestResults@2&lt;/span&gt;  
&lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;  
&lt;span class="na"&gt;testResultsFormat&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;‘JUnit’&lt;/span&gt;  
&lt;span class="na"&gt;testResultsFiles&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;‘\*\*/test-results.xml’&lt;/span&gt;  
&lt;span class="na"&gt;searchFolder&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;‘$(System.DefaultWorkingDirectory)’&lt;/span&gt;  
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;3. Perform performance testing:&lt;br&gt;&lt;br&gt;
— Use the appropriate tasks and scripts to perform performance testing. Here’s an example using Apache JMeter:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DownloadFile@1&lt;/span&gt;  
&lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;  
&lt;span class="na"&gt;sourceUrl&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;‘[https://archive.apache.org/dist/jmeter/binaries/apache-jmeter-5.4.1.tgz'](https://archive.apache.org/dist/jmeter/binaries/apache-jmeter-5.4.1.tgz')&lt;/span&gt;  
&lt;span class="na"&gt;targetPath&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;‘$(Pipeline.Workspace)/jmeter/apache-jmeter-5.4.1.tgz’&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ExtractFiles@1&lt;/span&gt;  
&lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;  
&lt;span class="na"&gt;archiveFilePatterns&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;‘$(Pipeline.Workspace)/jmeter/apache-jmeter-5.4.1.tgz’&lt;/span&gt;  
&lt;span class="na"&gt;destinationFolder&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;‘$(Pipeline.Workspace)/jmeter/’&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;CmdLine@2&lt;/span&gt;  
&lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;  
&lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;  
&lt;span class="err"&gt;$&lt;/span&gt;&lt;span class="s"&gt;(Pipeline.Workspace)/jmeter/apache-jmeter-5.4.1/bin/jmeter -n -t $(Pipeline.Workspace)/jmeter/performance\_test.jmx -l $(Pipeline.Workspace)/jmeter/results.jtl  &lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note: The provided code snippets serve as examples and may require modifications based on your specific application, environment, and tools being used.&lt;/p&gt;

&lt;p&gt;Remember to update the connection names, resource names, and specific command arguments according to your project setup and requirements.&lt;/p&gt;

&lt;h2&gt;
  
  
  Read my blogs:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://kunaldaskd.medium.com"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FrhCZTrV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/TgYYM9w.png" alt="Medium Logo" width="62" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.to/kunaldas"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rhp9uSXy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/bp3qHWb.png" alt="Dev.to Logo" width="35" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://kunaldas.hashnode.dev"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_veP3uic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/iwZwo2S.png" alt="Hashnode Logo" width="204" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Connect with Me:
&lt;/h2&gt;

&lt;p&gt;
&lt;a href="https://twitter.com/kunald_official"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--o4A7Waxi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/VaorXDP.png" alt="kunald_official" width="43" height="35"&gt;&lt;/a&gt;
&lt;a href="https://linkedin.com/in/kunaldaskd"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U70a7xDy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/ktIHVxm.png" alt="kunaldaskd" width="35" height="35"&gt;&lt;/a&gt;
&lt;/p&gt;

</description>
      <category>azure</category>
      <category>terraform</category>
      <category>devops</category>
    </item>
    <item>
      <title>Streamlining Synapse CI/CD &amp; Dedicated SQL pool with Azure DevOps</title>
      <dc:creator>Kunal Das</dc:creator>
      <pubDate>Sun, 21 Jan 2024 12:36:58 +0000</pubDate>
      <link>https://dev.to/kunaldas/streamlining-synapse-cicd-dedicated-sql-pool-with-azure-devops-1g7h</link>
      <guid>https://dev.to/kunaldas/streamlining-synapse-cicd-dedicated-sql-pool-with-azure-devops-1g7h</guid>
      <description>&lt;h1&gt;
  
  
  Streamlining Synapse CI/CD &amp;amp; Dedicated SQL pool with Azure DevOps: Best Practices and Implementation Tips
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" alt="Kunal Das, Author" width="44" height="44"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Reach me at: &lt;a href="https://heylink.me/kunaldas"&gt;https://heylink.me/kunaldas&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
Streamlining Synapse CI/CD &amp;amp; Dedicated SQL pool with Azure DevOps: Best Practices and Implementation Tips

&lt;ul&gt;
&lt;li&gt;What is Synapse Analytics?&lt;/li&gt;
&lt;li&gt;What is the need for CI/CD for Synapse Analytics?&lt;/li&gt;
&lt;li&gt;What are the different components of Synapse Analytics?&lt;/li&gt;
&lt;li&gt;Why is it hard to implement CI/CD for Synapse Analytics?&lt;/li&gt;
&lt;li&gt;Let’s understand the CI/CD approach:&lt;/li&gt;
&lt;li&gt;Continuous Integration:&lt;/li&gt;
&lt;li&gt;Let’s look at the high-level workflow of the things we have done so far!&lt;/li&gt;
&lt;li&gt;CI-CD for dedicated SQL pool&lt;/li&gt;
&lt;li&gt;First, let’s set up the repository for the SQL pool&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;IMPLEMENTING THE RELEASE PIPELINE TO DEPLOY THE ABOVE-GENERATED BUILD ARTIFACTS&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It is crucial to have a simplified and effective process for developing, testing, and implementing solutions as data and analytics become more and more important for enterprises. Microsoft’s cloud-based analytics solution, Synapse Analytics, has strong data warehousing, machine learning, and integration capabilities. However, without the proper tooling and knowledge, building a Continuous Integration and Continuous Deployment (CI/CD) procedure for Synapse can be difficult. Azure DevOps gives teams a complete set of CI/CD pipeline capabilities to automate the deployment of Synapse solutions. In this article, we will examine best practices and implementation tips for streamlining Synapse CI/CD using Azure DevOps.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Wjwm9KdX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2ACoufxbMwN_U4G5CV" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Wjwm9KdX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2ACoufxbMwN_U4G5CV" alt="" width="700" height="700"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Synapse Analytics?
&lt;/h2&gt;

&lt;p&gt;Microsoft created the cloud-based analytics solution known as Synapse Analytics. It is a platform that gives businesses the ability to ingest, prepare, manage, and provide data for their urgent needs in business intelligence and machine learning. Data integration, big data, data warehousing, and AI are just a few of the tools and services that Synapse Analytics integrates into a single workspace to provide a seamless end-to-end data analytics experience. The platform provides cutting-edge functions like code-free or code-first data integration, big data processing based on SQL and Spark, machine learning, and Power BI reporting capabilities. Synapse Analytics offers a single, end-to-end development and deployment experience while allowing customers to work with their current tools, languages, and frameworks.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is the need for CI/CD for Synapse Analytics?
&lt;/h2&gt;

&lt;p&gt;First of all, Synapse Analytics is a cloud-based analytics service that is frequently updated and added to with new capabilities. The danger of potential bugs and other problems is reduced by a well-defined CI/CD process, which makes sure that the most recent changes are integrated and tested in the current system before being deployed to production.&lt;/p&gt;

&lt;p&gt;Second, a well-defined CI/CD procedure that guarantees seamless cooperation and effective deployment is crucial because Synapse Analytics frequently involves numerous developers working on various project components.&lt;/p&gt;

&lt;p&gt;Lastly, A well-defined CI/CD approach also ensures that the pipeline is tested and verified at every level, resulting in greater quality, more stable solutions, and a quicker time to market. Synapse Analytics frequently entails complicated data processing and integration.&lt;/p&gt;

&lt;p&gt;Overall, teams can manage the development and deployment of solutions more effectively while maintaining high quality and lowering the risk of errors and downtime by adopting a CI/CD process for Synapse Analytics with technologies like Azure DevOps.&lt;/p&gt;

&lt;h2&gt;
  
  
  What are the different components of Synapse Analytics?
&lt;/h2&gt;

&lt;p&gt;Synapse Analytics is a cloud-based analytics service made up of several components that work together to offer a complete analytics solution. Its principal components are:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Synapse Studio&lt;/strong&gt;: A web-based integrated development environment (IDE) called Synapse Studio offers a centralized workspace for creating, overseeing, and maintaining Synapse applications. It consists of a number of technologies and services, including big data, machine learning, data warehousing, and data integration.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Synapse SQL:&lt;/strong&gt; Synapse SQL is a distributed SQL engine that lets users execute SQL queries against both structured and unstructured data. It enables petabyte-scale data warehousing and supports both serverless and provisioned models.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Synapse Spark:&lt;/strong&gt; Synapse Spark is a big data processing engine that enables customers to use Apache Spark to process massive amounts of data. Built for big data analytics, it offers a high-performance computing environment that supports both batch and real-time data processing.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Synapse Pipelines:&lt;/strong&gt; Users can design, orchestrate, and monitor data pipelines using the data integration service known as Synapse Pipelines. It can integrate data from numerous sources and supports both code-free and code-first data integration scenarios.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Synapse Serverless:&lt;/strong&gt; Synapse Serverless is a serverless SQL pool that lets users analyze data with SQL queries without needing to set up or maintain a dedicated cluster.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Synapse Dedicated SQL Pool:&lt;/strong&gt; The Synapse dedicated SQL pool is a provisioned cluster that offers high-performance data warehousing and analytics.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Synapse Notebooks:&lt;/strong&gt; Users can work with code and data in a group setting using Synapse Notebooks, a collaborative notebook environment. Python, Scala, and .NET are just a few of the many programming languages it supports.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SjVHbpkb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:541/1%2A6KsmHKmVqEAqPPG1BFyayw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SjVHbpkb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:541/1%2A6KsmHKmVqEAqPPG1BFyayw.png" alt="" width="541" height="651"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Why is it hard to implement CI/CD for Synapse Analytics?
&lt;/h2&gt;

&lt;p&gt;For a number of reasons, implementing a Continuous Integration and Continuous Deployment (CI/CD) procedure for Synapse Analytics can be difficult.&lt;/p&gt;

&lt;p&gt;First off, Synapse Analytics is a sophisticated cloud-based analytics service that combines a variety of parts and offerings, each with specific needs and dependencies. It may be challenging to develop a simplified and effective CI/CD pipeline that manages all the many components of the service due to its complexity.&lt;/p&gt;

&lt;p&gt;Second, it can be difficult to develop a testing and deployment procedure that is quick and successful since Synapse Analytics frequently requires processing and handling massive volumes of data. It can take a lot of time to test and validate data pipelines, and handling enormous amounts of data makes it challenging to set up a testing environment that precisely mimics the production environment.&lt;/p&gt;

&lt;p&gt;Thirdly, it can be difficult to make sure that everyone is using the most recent code and data because Synapse Analytics is frequently utilized by numerous developers and teams. Version control problems and other issues may result from this, which may slow down the development and deployment process.&lt;/p&gt;

&lt;p&gt;Last but not least, establishing a CI/CD process for Synapse Analytics requires experience in a variety of fields, including data integration, data warehousing, big data, and machine learning. As a result, teams may find it difficult to locate the skills and resources needed to build a reliable and effective CI/CD pipeline.&lt;/p&gt;

&lt;p&gt;Overall, implementing a CI/CD process for Synapse Analytics requires careful planning and a thorough understanding of all the different components and services involved. To build a streamlined and effective pipeline that can handle the complexity of Synapse Analytics, it’s crucial to collaborate with professionals in data engineering, DevOps, and cloud computing.&lt;/p&gt;

&lt;h2&gt;
  
  
  Let’s understand the CI/CD approach:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Aq1_LFtd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2ABWLgWiFl2NK106GjlgnXuA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Aq1_LFtd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2ABWLgWiFl2NK106GjlgnXuA.png" alt="" width="700" height="1336"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Resource Groups:&lt;/strong&gt; Here we have two resource groups. In an enterprise, you may have multiple resource groups and multiple Synapse workspaces; to deploy those, you can use my Terraform guide, which makes it very easy.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Artifacts:&lt;/strong&gt; For artifact deployment, I have used the ARM template deployment approach and deployed all the artifacts that way.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;em&gt;The most important thing: if your artifacts include multiple linked services, you have to manually edit the ARM templates before deployment for it to work.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;3. &lt;strong&gt;Dedicated SQL pool:&lt;/strong&gt; The dedicated SQL pool scripts, stored procedures, views, etc. can also be automated using CI/CD, but I have used a secondary pipeline for them, which makes sense as the development of the two will be asynchronous and follow different lifecycles.&lt;/p&gt;

&lt;h2&gt;
  
  
  Continuous Integration:
&lt;/h2&gt;

&lt;p&gt;For this example, let’s assume we have two Synapse workspaces representing two environments.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--E6IAbc3P--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2A-fjFCeudSibAyUomdONW_Q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--E6IAbc3P--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2A-fjFCeudSibAyUomdONW_Q.png" alt="" width="700" height="478"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now log into the dev Synapse workspace by going to the portal and clicking on the Open Synapse Studio dialog.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CXYPAAFj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:404/1%2A9oL7oOlWimT5D94EZKqsPw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CXYPAAFj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:404/1%2A9oL7oOlWimT5D94EZKqsPw.png" alt="" width="404" height="190"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you are in the dev Synapse Studio, go to Settings and then Git configuration.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5GpOOCLc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:386/1%2Aohhgl34OOCXcvasXNAAflQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5GpOOCLc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:386/1%2Aohhgl34OOCXcvasXNAAflQ.png" alt="" width="386" height="842"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then click on ‘Configure’.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--valVSRLL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AB-bcu3WuDAgFhFagRZ0KxA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--valVSRLL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AB-bcu3WuDAgFhFagRZ0KxA.png" alt="" width="700" height="260"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select GitHub or Azure DevOps as the repository type.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tyb7S57e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AHFd49zarfzjmvwN3rC53-w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tyb7S57e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AHFd49zarfzjmvwN3rC53-w.png" alt="" width="700" height="264"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then proceed to select the collaboration branch and publish branch.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--m4Q3E5tQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AJPLjTkXdfFWWImzfjWAaJw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--m4Q3E5tQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AJPLjTkXdfFWWImzfjWAaJw.png" alt="" width="700" height="545"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The collaboration branch and publish branch are two crucial ideas in Synapse Studio that are connected to source control and versioning of Synapse artifacts.&lt;/p&gt;

&lt;p&gt;The collaboration branch is where numerous developers can work together on a single workspace. Usually, Synapse artifacts are developed, tested, and validated in this branch. Developers update the collaboration branch as they create and modify Synapse items like pipelines, notebooks, data flows, and SQL scripts. When you establish a new workspace in Synapse Studio, the collaboration branch, which is the default branch, is created automatically.&lt;/p&gt;

&lt;p&gt;The publish branch, however, serves a different purpose: Synapse publishes an ARM template into that branch which captures everything in the workspace, including pipelines, notebooks, and linked service details.&lt;/p&gt;

&lt;p&gt;Once you are done it will look something like this,&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--97GbubMT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Aq2wVkCmmGsUl6z_WGg3ZcA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--97GbubMT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Aq2wVkCmmGsUl6z_WGg3ZcA.png" alt="" width="700" height="503"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now commit all and then publish.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--nTZwVt-9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:523/1%2AmWeYAg0qzsdtSvyyj2nDcQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--nTZwVt-9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:523/1%2AmWeYAg0qzsdtSvyyj2nDcQ.png" alt="" width="523" height="82"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This will start generating the ARM templates and save them in the repo mentioned.&lt;/p&gt;

&lt;p&gt;If you go to the repo you will see a folder showing your synapse workspace name which will contain two ARM templates.&lt;/p&gt;
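
&lt;p&gt;The layout in the publish branch looks roughly like this (the folder name matches your workspace name; here it is assumed to be synapse-dev, the same folder the CopyFiles step uses later):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;synapse-dev/
├── TemplateForWorkspace.json
└── TemplateParametersForWorkspace.json
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;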

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--AbykBbxk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:429/1%2A8obf7NpjjZ-5ghruiXYQLA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--AbykBbxk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:429/1%2A8obf7NpjjZ-5ghruiXYQLA.png" alt="" width="429" height="280"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The build and release concept looks like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---mDVF4Rb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AxOtUyHXBbKEJjo8TRYQRgQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---mDVF4Rb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AxOtUyHXBbKEJjo8TRYQRgQ.png" alt="" width="700" height="542"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So we are using a three-branch strategy.&lt;/p&gt;

&lt;p&gt;Develop branch: This branch would be the primary branch for developers to work on Synapse artifacts, including pipelines, notebooks, and data flows. When a feature is complete and tested, its changes are merged here via pull request, so the develop branch always contains the latest tested changes from all the feature branches.&lt;/p&gt;

&lt;p&gt;Feature branch: Each developer would create their own feature branch from the develop branch, where they would make and commit their changes. Feature branches are typically named after the feature or issue that they are addressing. Once a developer has completed their changes and testing, they would submit a pull request to merge their changes into the develop branch.&lt;/p&gt;

&lt;p&gt;Main branch: The main branch would be the publish branch where you would deploy your Synapse artifacts to a live environment. The main branch would only contain the latest, tested changes that have been approved and merged from the develop branch.&lt;/p&gt;

&lt;p&gt;In this setup, developers would use the develop branch to collaborate and integrate their changes before publishing to the main branch. This enables developers to work in parallel while maintaining a single source of truth for the Synapse artifacts. The use of feature branches ensures that changes are isolated and tested before being integrated into the develop branch. Finally, the main branch would contain only the tested and approved changes that are ready for deployment. By using this branch policy, you can help ensure that changes to your Synapse artifacts are properly tested and validated, and that version control is maintained throughout the development process.&lt;/p&gt;
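
&lt;p&gt;In YAML terms, this policy maps onto the pipeline’s triggers. As a sketch (branch names follow the policy described above; note that for repositories hosted in Azure Repos, pull request validation is configured through branch policies rather than the pr: keyword, which applies to GitHub-hosted repositories):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;# Build on pushes to the publish branch; validate PRs targeting develop
trigger:
  branches:
    include:
      - workspace_publish

pr:
  branches:
    include:
      - develop
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;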

&lt;p&gt;Create a pipeline with the below code&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="c1"&gt;# The pipeline copies files from workspace publish branch and publishes artifact&lt;/span&gt;
&lt;span class="c1"&gt;# @author  Kunal Das&lt;/span&gt;
&lt;span class="c1"&gt;# @version 1.0&lt;/span&gt;
&lt;span class="c1"&gt;# @since   10-11-2022&lt;/span&gt;
&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; 
    &lt;span class="s"&gt;-synapse-CI-low&lt;/span&gt;
&lt;span class="na"&gt;trigger&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;workspace_publish&lt;/span&gt;

&lt;span class="na"&gt;pool&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;vmImage&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-22.04&lt;/span&gt;

&lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;CopyFiles@2&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Copy&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Files&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;to&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Build&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Pipeline&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;directory'&lt;/span&gt;
  &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;SourceFolder&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;synapse-dev'&lt;/span&gt;
    &lt;span class="na"&gt;TargetFolder&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(Build.ArtifactStagingDirectory)/ARM'&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PublishPipelineArtifact@1&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Publish&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Pipeline&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Artifact'&lt;/span&gt;
  &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;targetPath&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(Build.ArtifactStagingDirectory)'&lt;/span&gt;
    &lt;span class="na"&gt;artifact&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;drop&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is a simplified YAML pipeline for Synapse CI/CD named ‘synapse-CI-low’. The pipeline triggers on pushes to the workspace_publish branch, i.e. whenever changes are published from the Synapse workspace. It runs on an Ubuntu 22.04 virtual machine agent.&lt;/p&gt;

&lt;p&gt;The first step in the pipeline is to copy files from a folder called ‘synapse-dev’ to a directory called ‘ARM’ in the build pipeline. This step is accomplished using the ‘CopyFiles’ task.&lt;/p&gt;

&lt;p&gt;The second and final step in the pipeline is to publish the Synapse artifacts as a pipeline artifact. This step is accomplished using the ‘PublishPipelineArtifact’ task. The target path for the artifacts is set to the build artifact staging directory, and the artifact is named ‘drop’.&lt;/p&gt;

&lt;p&gt;This pipeline is a simple example of a Synapse CI/CD pipeline that copies Synapse artifacts from a source directory and publishes them as a pipeline artifact. This pipeline can be further expanded to include additional steps for building, testing, and deploying the Synapse artifacts.&lt;/p&gt;
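
&lt;p&gt;For example, a small validation step could be added before publishing the artifact. This is only a sketch: it assumes Python is available on the agent (it is on the hosted Ubuntu images) and simply checks that the copied ARM templates are valid JSON:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;- task: Bash@3
  displayName: 'Validate generated ARM templates'
  inputs:
    targetType: 'inline'
    script: |
      # Fail the build early if any copied template is not valid JSON
      for f in $(Build.ArtifactStagingDirectory)/ARM/*.json; do
        python3 -m json.tool "$f" &gt; /dev/null || exit 1
      done
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;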

&lt;p&gt;Now to deploy the ARM template one extension is required, which can be added from the &lt;a href="https://marketplace.visualstudio.com/items?itemName=AzureSynapseWorkspace.synapsecicd-deploy&amp;amp;targetId=30894b01-00f5-4e06-962b-d1eec4db15f5&amp;amp;utm_source=vstsproduct&amp;amp;utm_medium=ExtHubManageList&amp;amp;source=post_page-----59746afe022f--------------------------------"&gt;marketplace&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;To install the extension follow these steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Log in to your Azure DevOps organization and navigate to the Extensions tab in the left-hand menu.&lt;/li&gt;
&lt;li&gt; Click on Browse Marketplace to access the Visual Studio Marketplace.&lt;/li&gt;
&lt;li&gt; Search for the extension you want to install by typing the name of the extension in the search bar.&lt;/li&gt;
&lt;li&gt; Click on the extension from the list of results to open the extension page.&lt;/li&gt;
&lt;li&gt; Click on the Get it free button.&lt;/li&gt;
&lt;li&gt; Select the Azure DevOps organization where you want to install the extension and review the terms and conditions.&lt;/li&gt;
&lt;li&gt; Click on the Install button to install the extension in your Azure DevOps organization.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Now for the release, we need to add the main step:&lt;br&gt;&lt;br&gt;
which is the workspace deployment. For that, we add the below task:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AzureSynapseWorkspace.synapsecicd-deploy.synapse-deploy.Synapse workspace deployment@2&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Synpase&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;deployment&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;task&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;for&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;workspace:&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;synapse-qa'&lt;/span&gt;
  &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;TemplateFile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(System.DefaultWorkingDirectory)/_Synapse-CI-pipeline/drop/ARM/TemplateForWorkspace.json'&lt;/span&gt;
    &lt;span class="na"&gt;ParametersFile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(System.DefaultWorkingDirectory)/_Synapse-CI-pipeline/drop/ARM/TemplateParametersForWorkspace.json'&lt;/span&gt;
    &lt;span class="na"&gt;azureSubscription&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;syn-sp'&lt;/span&gt;
    &lt;span class="na"&gt;ResourceGroupName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;synapseQA-RG'&lt;/span&gt;
    &lt;span class="na"&gt;TargetWorkspaceName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;synapse-qa'&lt;/span&gt;
    &lt;span class="na"&gt;OverrideArmParameters&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;-LS_AKV_QA_properties_typeProperties_baseUrl&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;&amp;lt;Synapse&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;QA&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;AKV&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;URL&amp;gt;'&lt;/span&gt;
    &lt;span class="na"&gt;FailOnMissingOverrides&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is a YAML code snippet for deploying a Synapse workspace using the Synapse Workspace Deployment task in Azure DevOps. The code contains the following steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; The task is defined with the name ‘Synapse deployment task for workspace: synapse-qa’ and the version number of the Synapse Workspace Deployment task is set to 2.&lt;/li&gt;
&lt;li&gt; The template and parameter files for the Synapse workspace are specified. The template file is named ‘TemplateForWorkspace.json’ and the parameters file is named ‘TemplateParametersForWorkspace.json’. These files are located in the default working directory of the pipeline.&lt;/li&gt;
&lt;li&gt; The Azure subscription to be used for the deployment is specified as ‘syn-sp’.&lt;/li&gt;
&lt;li&gt; The resource group name where the Synapse workspace will be deployed is set to ‘synapseQA-RG’.&lt;/li&gt;
&lt;li&gt; The target Synapse workspace name is set to ‘synapse-qa’.&lt;/li&gt;
&lt;li&gt; The override ARM parameters are specified as a parameter name followed by its new value. In this case, the parameter ‘-LS_AKV_QA_properties_typeProperties_baseUrl’ is set to the QA Key Vault base URL (shown as a placeholder in the snippet above).&lt;/li&gt;
&lt;li&gt; The ‘FailOnMissingOverrides’ option is set to true, which means that the deployment will fail if any of the specified override parameters are not found.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In summary, this code snippet deploys a Synapse workspace to the specified Azure subscription, resource group, and Synapse workspace name. The override ARM parameters allow you to customize the deployment settings for the workspace.&lt;/p&gt;
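&lt;p&gt;The OverrideArmParameters input is a single space-separated string of ‘-Name value’ pairs. As a sketch (the helper name and example URL below are hypothetical), building that string programmatically keeps multiple overrides readable:&lt;/p&gt;

```python
def build_override_args(overrides):
    """Join ARM parameter overrides into the single '-Name value ...'
    string the Synapse Workspace Deployment task expects."""
    return " ".join(f"-{name} {value}" for name, value in overrides.items())

# Hypothetical QA values; substitute your own Key Vault URL.
args = build_override_args({
    "LS_AKV_QA_properties_typeProperties_baseUrl": "https://example-kv.vault.azure.net/",
})
```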

&lt;p&gt;Notice that before we deploy the ARM template, a few environment-specific values usually need to change:&lt;/p&gt;

&lt;p&gt;1. SQL pool name&lt;/p&gt;

&lt;p&gt;2. Linked services details&lt;/p&gt;

&lt;p&gt;3. Spark pool name&lt;/p&gt;

&lt;p&gt;If these names are identical across environments you may not need to touch the ARM template, but otherwise we need to modify them before running the deployment step above.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PythonScript@0&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Change&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;SQL&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Pool'&lt;/span&gt;
  &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;scriptSource&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;inline&lt;/span&gt;
    &lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
     &lt;span class="s"&gt;search_text = 'synapsesqlpooldev'&lt;/span&gt;
     &lt;span class="s"&gt;replace_text = 'synapsesqlpoolqa'&lt;/span&gt;

     &lt;span class="s"&gt;with open(r'$(System.DefaultWorkingDirectory)/_Synapse-CI-pipeline/drop/ARM/TemplateForWorkspace.json', 'r') as file:&lt;/span&gt;
        &lt;span class="s"&gt;data = file.read()&lt;/span&gt;
        &lt;span class="s"&gt;data = data.replace(search_text, replace_text)&lt;/span&gt;
     &lt;span class="s"&gt;with open(r'$(System.DefaultWorkingDirectory)/_Synapse-CI-pipeline/drop/ARM/TemplateForWorkspace.json', 'w') as file:&lt;/span&gt;
        &lt;span class="s"&gt;file.write(data)&lt;/span&gt;
     &lt;span class="s"&gt;print("SQL pool changed")&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is a YAML code snippet for a PythonScript task in Azure DevOps. The task changes the name of a SQL pool in an ARM template file for a Synapse workspace deployment. The code contains the following steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; The task is defined with the name ‘Change SQL Pool’ and the task version is set to 0.&lt;/li&gt;
&lt;li&gt; The script source is set to ‘inline’, which means that the script is written in the YAML file directly.&lt;/li&gt;
&lt;li&gt; The Python script is defined, which replaces the search text ‘synapsesqlpooldev’ with the replacement text ‘synapsesqlpoolqa’ in the ARM template file.&lt;/li&gt;
&lt;li&gt; The ARM template file is read and its contents are stored in the ‘data’ variable.&lt;/li&gt;
&lt;li&gt; The ‘replace’ method is used to replace the search text with the replacement text.&lt;/li&gt;
&lt;li&gt; The modified data is written back to the ARM template file.&lt;/li&gt;
&lt;li&gt; A message ‘SQL pool changed’ is printed to the console.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In summary, this code snippet uses a Python script to modify the ARM template file for a Synapse workspace deployment by changing the name of a SQL pool from ‘synapsesqlpooldev’ to ‘synapsesqlpoolqa’. The modified ARM template file is used in the subsequent tasks for deploying the Synapse workspace.&lt;/p&gt;

&lt;p&gt;The same kind of script can be used for the Spark pool and linked services as well.&lt;/p&gt;
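&lt;p&gt;Rather than one inline script per value, the same idea can be generalized to a single pass over the template. This is a minimal sketch, assuming hypothetical dev-to-QA names; adjust the mapping to your own pools and linked services:&lt;/p&gt;

```python
from pathlib import Path

# Hypothetical dev -> QA renames; extend with your own pool and linked-service names.
REPLACEMENTS = {
    "synapsesqlpooldev": "synapsesqlpoolqa",
    "synapsesparkpooldev": "synapsesparkpoolqa",
    "LS_AKV_DEV": "LS_AKV_QA",
}

def retarget_template(path):
    """Apply every rename to the ARM template file in one read/write pass."""
    text = Path(path).read_text()
    for old, new in REPLACEMENTS.items():
        text = text.replace(old, new)
    Path(path).write_text(text)
    return text
```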

&lt;p&gt;If you prefer PowerShell, here are the equivalent commands.&lt;br&gt;&lt;br&gt;
Let us change the linked Azure Key Vault in the same way.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Get-Content&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-path&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;$&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;System.DefaultWorkingDirectory&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="nx"&gt;/_Synapse-CI-pipeline/drop/ARM/TemplateForWorkspace.json&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;-replace&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'LS_AKV_DEV'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="s1"&gt;'LS_AKV_QA'&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;Set-Content&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Path&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;$&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;System.DefaultWorkingDirectory&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="nx"&gt;/_Synapse-CI-pipeline/drop/ARM/TemplateForWorkspace.json&lt;/span&gt;&lt;span class="w"&gt;


&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Get-Content&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-path&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;$&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;System.DefaultWorkingDirectory&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="nx"&gt;/_Synapse-CI-pipeline/drop/ARM/TemplateParametersForWorkspace.json&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;-replace&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'LS_AKV_DEV'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="s1"&gt;'LS_AKV_QA'&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;Set-Content&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Path&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;$&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;System.DefaultWorkingDirectory&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="nx"&gt;/_Synapse-CI-pipeline/drop/ARM/TemplateParametersForWorkspace.json&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt; The first command replaces the string ‘LS_AKV_DEV’ with ‘LS_AKV_QA’ in the contents of the ARM template file named ‘TemplateForWorkspace.json’.&lt;/li&gt;
&lt;li&gt; The second command replaces the same string in the contents of the ARM template parameter file named ‘TemplateParametersForWorkspace.json’.&lt;/li&gt;
&lt;li&gt; The ‘-path’ parameter specifies the file path for each file, which is set to the default working directory in the build pipeline.&lt;/li&gt;
&lt;li&gt; The ‘-replace’ parameter is used to search for the specified string and replace it with the replacement string in each file.&lt;/li&gt;
&lt;li&gt; The ‘|’ (pipe) character is used to send the modified contents to the ‘Set-Content’ command, which writes the modified contents back to the same file.&lt;/li&gt;
&lt;li&gt; The modified ARM template files are used in the subsequent tasks for deploying the Synapse workspace.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In summary, this PowerShell script code uses the ‘Get-Content’ and ‘Set-Content’ commands to modify the contents of two ARM template files for a Synapse workspace deployment in Azure DevOps. The ‘replace’ parameter is used to replace a string in each file, and the modified files are used in the subsequent deployment tasks.&lt;/p&gt;

&lt;p&gt;Now, if the Synapse workspace has any triggers, the deployment will fail because triggers must be turned off before deployment. To achieve this, add the task below just before the deployment task.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AzureSynapseWorkspace.synapsecicd-deploy.toggle-trigger.toggle-triggers-dev@2&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Toggle&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Azure&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Synapse&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Triggers'&lt;/span&gt;
  &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;azureSubscription&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;syn-sp'&lt;/span&gt;
    &lt;span class="na"&gt;ResourceGroupName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Synapseqa-RG'&lt;/span&gt;
    &lt;span class="na"&gt;WorkspaceName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;synapse-qa'&lt;/span&gt;
    &lt;span class="na"&gt;ToggleOn&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This pipeline step uses the Azure Synapse CI/CD extension to turn off triggers for a Synapse workspace. The task is named “Toggle Azure Synapse Triggers” and the version being used is 2. The task has several inputs including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  azureSubscription: The Azure subscription that contains the Synapse workspace.&lt;/li&gt;
&lt;li&gt;  ResourceGroupName: The name of the resource group where the Synapse workspace is located.&lt;/li&gt;
&lt;li&gt;  WorkspaceName: The name of the Synapse workspace where the triggers will be turned off.&lt;/li&gt;
&lt;li&gt;  ToggleOn: A boolean value indicating whether to turn on or off the Synapse triggers. In this case, it is set to false, meaning the triggers will be turned off.&lt;/li&gt;
&lt;/ul&gt;
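&lt;p&gt;Under the hood, this extension talks to the Synapse data-plane REST API to stop or start each trigger. A minimal sketch of the requests involved (endpoint shape only; the api-version and names are assumptions to verify against current docs, and bearer-token authentication is omitted):&lt;/p&gt;

```python
def trigger_toggle_requests(workspace, trigger_names, toggle_on=False):
    """Build the (method, url) pairs that start or stop each trigger
    via the Synapse artifacts REST API. api-version is an assumption."""
    base = f"https://{workspace}.dev.azuresynapse.net"
    action = "start" if toggle_on else "stop"
    return [
        ("POST", f"{base}/triggers/{name}/{action}?api-version=2020-12-01")
        for name in trigger_names
    ]

# Hypothetical trigger name for the QA workspace.
reqs = trigger_toggle_requests("synapse-qa", ["daily-load"], toggle_on=False)
```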

&lt;p&gt;Once you complete all these steps, you are finally ready to deploy the Synapse workspace!&lt;/p&gt;

&lt;h2&gt;
  
  
  let’s look at the high-level workflow of the things we have done so far!
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt; Developers create or modify Synapse artifacts, such as notebooks, and pipelines, in the development environments.&lt;/li&gt;
&lt;li&gt; The changes are pushed to a source control repository, such as Azure DevOps Repos, GitHub, or Bitbucket.&lt;/li&gt;
&lt;li&gt; A build pipeline is triggered, which compiles the changes, and creates an artifact, such as an ARM template.&lt;/li&gt;
&lt;li&gt; The artifact is published to a release pipeline, which deploys it to the Synapse workspace.&lt;/li&gt;
&lt;li&gt; The deployment process may involve creating or updating Synapse artifacts, deploying resources to Azure, and configuring the Synapse workspace.&lt;/li&gt;
&lt;li&gt; After the deployment is completed, the Synapse triggers that were turned off before deployment may need to be toggled back on.&lt;/li&gt;
&lt;li&gt; If any issues are found during testing, the pipeline may be rolled back or the code may be fixed and re-deployed.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This CI-CD pipeline takes care of the Spark notebooks, pipelines, linked services, and integration runtimes, but one major gap remains: the stored procedures, views, and other objects in the dedicated SQL pool. Publishing those changes can be handled with a second pipeline.&lt;/p&gt;

&lt;h2&gt;
  
  
  CI-CD for dedicated SQL pool
&lt;/h2&gt;

&lt;p&gt;Let’s look at the high-level workflow&lt;/p&gt;

&lt;p&gt;Here’s a detailed explanation of the CI/CD workflow for a dedicated SQL pool using Azure Data Studio:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Connect to dedicated SQL pool:&lt;/strong&gt; First, connect to the dedicated SQL pool using Azure Data Studio. This can be done by selecting the dedicated SQL pool in the Object Explorer and providing the necessary login credentials.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Create a project:&lt;/strong&gt; Once connected, create a new project in Azure Data Studio by selecting File -&amp;gt; New Project. Choose the appropriate project type based on the requirements of the dedicated SQL pool.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Add source control:&lt;/strong&gt; Add the project to a source control repository such as Git by selecting View -&amp;gt; Source Control and following the prompts to initialize the repository and commit the project.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Build pipeline:&lt;/strong&gt; Create a build pipeline in Azure DevOps that takes the *.sqlproj file from the source control repository and builds a DACPAC file using the vsbuild Task. This task compiles the SQL scripts and T-SQL code into a single package that can be deployed to the dedicated SQL pool.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Release pipeline:&lt;/strong&gt; Create a release pipeline in Azure DevOps that deploys the DACPAC file to the next environment, such as the QA environment. This can be done using the dacpac Deploy Task. This task deploys the DACPAC file to the specified SQL Server instance or dedicated SQL pool, and can also handle any necessary pre- or post-deployment scripts.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Here is the complete workflow diagram of the same.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gxi0xadc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:693/1%2AvMZpqncYYtNPkveUX_Tdqw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gxi0xadc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:693/1%2AvMZpqncYYtNPkveUX_Tdqw.png" alt="" width="693" height="696"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  First, let’s setup the repository for the SQL pool
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt; Create or use an existing Azure DevOps project.&lt;/li&gt;
&lt;li&gt; Click on Repos from the sidebar.&lt;/li&gt;
&lt;li&gt; Click on New Repository and fill in the necessary details, such as the repo name. In this case, the repo name used in this example is “Synapse_SQLpool”. Click on Create.&lt;/li&gt;
&lt;li&gt; Configure Azure Data Studio (ADS) to connect to the above-created repository. Open ADS and select the New Connection option from the welcome page.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EX1StCee--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:570/0%2AdGDlzzBforon6Jdu" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EX1StCee--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:570/0%2AdGDlzzBforon6Jdu" alt="" width="570" height="270"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;5. A new pop-up window will appear; fill in all the details required for the connection.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3Gu690jy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:591/1%2ACQ-ndkbQDFKEws59X0Zbig.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3Gu690jy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:591/1%2ACQ-ndkbQDFKEws59X0Zbig.png" alt="" width="591" height="536"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;6. Click on &lt;strong&gt;connect&lt;/strong&gt; and wait for the browser window to open. Log in with your Azure credentials; after a successful login you will be redirected back to Azure Data Studio.&lt;/p&gt;

&lt;p&gt;7. At this point, if you click on the &lt;strong&gt;Connections&lt;/strong&gt; side pane on the left, you will see the connection listed there.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7yJ0TFll--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:538/1%2ABqwP_pVdJpAoNNkz7qGhyQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7yJ0TFll--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:538/1%2ABqwP_pVdJpAoNNkz7qGhyQ.png" alt="" width="538" height="428"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now select the server connection from here, then right-click and select the ‘&lt;strong&gt;create project from database’&lt;/strong&gt; option.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--k1wtgko2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:339/1%2Ays6bRnBDcQPLENVD0MQGFA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--k1wtgko2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:339/1%2Ays6bRnBDcQPLENVD0MQGFA.png" alt="" width="339" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Keep the settings as shown in the below image but select the folder and project name appropriately.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DOD2ihEt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:463/1%2AKIqHT-Ev2o2KKOmxquDBFg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DOD2ihEt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:463/1%2AKIqHT-Ev2o2KKOmxquDBFg.png" alt="" width="463" height="783"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Once you are done with it, you will see the project in the left side pane.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SFRkx4do--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:639/0%2A9-6cgsXMYYCfjsBd" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SFRkx4do--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:639/0%2A9-6cgsXMYYCfjsBd" alt="" width="639" height="475"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Now the only step left is to add the project to the Azure repo. There are plenty of ways to do that, but you can do it inside Azure Data Studio itself via the Source Control option in the left pane.&lt;/li&gt;
&lt;li&gt;  Click on the three-dot menu and then select the add remote option.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6L1q02Eu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:591/0%2A_qW1ADElqbUDF5dU" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6L1q02Eu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:591/0%2A_qW1ADElqbUDF5dU" alt="" width="591" height="624"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Here you just have to paste the Azure repo link; if it fails, open the terminal by pressing “ctrl + `” and run the git init command to initialize the directory.&lt;/li&gt;
&lt;li&gt;  Once the Azure repo link is set, any changes in your project can be pushed to the Azure repo easily.&lt;/li&gt;
&lt;li&gt;  Verify the remote repo with the command ‘git remote -v’.&lt;/li&gt;
&lt;li&gt;  The Sync option can also be used, as it automatically pulls from and then pushes to the Azure repo.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--76ZoVIz7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:469/0%2AID5HsaSWZmebcFgC" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--76ZoVIz7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:469/0%2AID5HsaSWZmebcFgC" alt="" width="469" height="727"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Open Azure Repos, navigate to your repository, and confirm the files are updated.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yGkT6o9H--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2A9N2UwcsgcFCexLD_" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yGkT6o9H--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2A9N2UwcsgcFCexLD_" alt="" width="700" height="222"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we have to create a build pipeline for it.&lt;br&gt;&lt;br&gt;
Go to Pipelines and create a new pipeline.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Dvx2Vj3S--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AbzWD9lhx-L2wjEP6UKaJBQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Dvx2Vj3S--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AbzWD9lhx-L2wjEP6UKaJBQ.png" alt="" width="700" height="127"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select Azure Repos Git and locate your repository.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--N5NeicwJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2AxMfhg4gEwppeelg4" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--N5NeicwJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2AxMfhg4gEwppeelg4" alt="" width="700" height="610"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  The third step would be just selecting the starter pipeline option.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cPJN6wW_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2AOAwRILYqxVLKnDaX" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cPJN6wW_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2AOAwRILYqxVLKnDaX" alt="" width="700" height="368"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Then in the yaml file paste the below code,&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# @Author : Kunal Das
# Date : 30-12-2022

variables:
  poolstagingarea: $(Build.ArtifactStagingDirectory)\poolstaging
  BuildConfiguration: release
  SQLPoolartifactname: AzureSQLPool
  SQLPooldacpacfile: $(System.ArtifactsDirectory)\$(SQLPoolartifactname)\synapseSQLpoolDEV.sqlproj

trigger:
- main

pool:
  vmImage: windows-2019

stages:
- stage: Pooldacpac
  displayName: 'Build dacpac'

  jobs:
  - job: 'Builddacpac'
    displayName: 'Build SQL Pool dacpac'

    steps:
    - task: VSBuild@1
      displayName: 'Builds the dacpac'
      inputs:
        solution: synapseSQLpoolDEV.sqlproj
        configuration: $(BuildConfiguration)

    - task: PublishBuildArtifacts@1
      displayName: 'Publishes dacpac as an artifact'
      # Publishes the dacpac as part of an artifact within Azure DevOps
      inputs:
        PathtoPublish: 'bin\$(BuildConfiguration)'
        ArtifactName: $(SQLPoolartifactname)
        publishLocation: 'Container'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ul&gt;
&lt;li&gt;  And done! The build pipeline is ready. Notice that the main branch is set as the trigger at the top, so every time someone updates the main branch the pipeline runs.&lt;/li&gt;
&lt;li&gt;  Save and run the pipeline; you should see all jobs complete successfully.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--g4-vE9_X--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2A720OlcbqBDkb8Wyc" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--g4-vE9_X--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2A720OlcbqBDkb8Wyc" alt="" width="700" height="353"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Click on the pipeline run and check the job artifacts; in this case the dacpac file has been published.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--71f3XUiu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2A2a1QYjOx2ldAAwAk" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--71f3XUiu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2A2a1QYjOx2ldAAwAk" alt="" width="700" height="93"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  &lt;strong&gt;IMPLEMENTING THE RELEASE PIPELINE TO DEPLOY THE ABOVE-GENERATED BUILD ARTIFACTS&lt;/strong&gt;
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;  Now the release pipeline can be created easily by going to Pipelines -&amp;gt; Releases and then selecting the option to create a new release pipeline.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_y7yGwyR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:221/1%2AGaDP_GX0fDAbO2IjhJ06tA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_y7yGwyR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:221/1%2AGaDP_GX0fDAbO2IjhJ06tA.png" alt="" width="221" height="127"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Go to the pipeline and add just one task “SQL dacpac deployment”&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Ykowyd7K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:682/0%2ALMyAPzwF4L6ZI4-Q" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Ykowyd7K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:682/0%2ALMyAPzwF4L6ZI4-Q" alt="" width="682" height="242"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  This task points to the QA (or next) environment and deploys the dacpac file there.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Then locate and select the dacpac file.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Zg8aiUJn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:574/0%2AcArsKbJYMgjNjCX1" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Zg8aiUJn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:574/0%2AcArsKbJYMgjNjCX1" alt="" width="574" height="588"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Pass an additional argument in the field &lt;strong&gt;Additional SqlPackage.exe Arguments&lt;/strong&gt; as /p:BlockOnPossibleDataLoss=false&lt;/li&gt;
&lt;li&gt;  Just save and run the pipeline.&lt;/li&gt;
&lt;li&gt;  The job will run and deploy the changes to the next environment,&lt;/li&gt;
&lt;/ul&gt;
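&lt;p&gt;The release task ultimately shells out to SqlPackage.exe. As a sketch of the equivalent invocation (the server, database, and file names are placeholders, and credentials from the service connection are omitted):&lt;/p&gt;

```python
def sqlpackage_publish_cmd(dacpac, server, database):
    """Assemble a SqlPackage publish invocation mirroring the release task,
    including the BlockOnPossibleDataLoss override."""
    return [
        "SqlPackage",
        "/Action:Publish",
        f"/SourceFile:{dacpac}",
        f"/TargetServerName:{server}",
        f"/TargetDatabaseName:{database}",
        "/p:BlockOnPossibleDataLoss=false",
    ]

# Placeholder QA targets for illustration only.
cmd = sqlpackage_publish_cmd("AzureSQLPool.dacpac",
                             "synapse-qa.sql.azuresynapse.net",
                             "synapsesqlpoolqa")
```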

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zmsM6pu8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2AulXYtTqFJ3q7xzEm" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zmsM6pu8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2AulXYtTqFJ3q7xzEm" alt="" width="700" height="207"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In conclusion, the effectiveness and collaboration of data engineering and BI projects can be significantly enhanced by connecting Azure DevOps with Azure Synapse Analytics workspace. You may connect your Azure DevOps project with your Synapse workspace by following the instructions in this article, which will also help you optimize your CI/CD pipeline for better data asset distribution and maintenance. To ensure a seamless and successful implementation, it is crucial to test carefully and make any necessary adjustments. Your firm can benefit from a smooth DevOps and data analytics process with proper design and implementation.&lt;/p&gt;
&lt;h2&gt;
  
  
  Read my blogs:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://kunaldaskd.medium.com"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FrhCZTrV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/TgYYM9w.png" alt="Medium Logo" width="62" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.to/kunaldas"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rhp9uSXy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/bp3qHWb.png" alt="Dev.to Logo" width="35" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://kunaldas.hashnode.dev"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_veP3uic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/iwZwo2S.png" alt="Hashnode Logo" width="204" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Connect with Me:
&lt;/h2&gt;

&lt;p&gt;
&lt;a href="https://twitter.com/kunald_official"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--o4A7Waxi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/VaorXDP.png" alt="kunald_official" width="43" height="35"&gt;&lt;/a&gt;
&lt;a href="https://linkedin.com/in/kunaldaskd"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U70a7xDy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/ktIHVxm.png" alt="kunaldaskd" width="35" height="35"&gt;&lt;/a&gt;
&lt;/p&gt;

</description>
      <category>azure</category>
      <category>cloud</category>
      <category>devops</category>
    </item>
    <item>
      <title>A comprehensive list of Azure Best Practices</title>
      <dc:creator>Kunal Das</dc:creator>
      <pubDate>Sun, 21 Jan 2024 12:36:57 +0000</pubDate>
      <link>https://dev.to/kunaldas/a-comprehensive-list-of-azure-best-practices-pp8</link>
      <guid>https://dev.to/kunaldas/a-comprehensive-list-of-azure-best-practices-pp8</guid>
      <description>&lt;h1&gt;
  
  
  Azure Best Practices
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" alt="Kunal Das, Author" width="44" height="44"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Reach me at: &lt;a href="https://heylink.me/kunaldas"&gt;https://heylink.me/kunaldas&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DVk_mRrP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2AJFxFUjxO8pAiWsq0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DVk_mRrP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2AJFxFUjxO8pAiWsq0.png" alt="" width="700" height="438"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
Azure Best Practices

&lt;ul&gt;
&lt;li&gt;Security:&lt;/li&gt;
&lt;li&gt;Enable Azure AD Conditional Access and enforce MFA&lt;/li&gt;
&lt;li&gt;Disable RDP/SSH from the Internet&lt;/li&gt;
&lt;li&gt;Management:&lt;/li&gt;
&lt;li&gt;Operations:&lt;/li&gt;
&lt;li&gt;Cost Optimization:&lt;/li&gt;
&lt;li&gt;Services:&lt;/li&gt;
&lt;li&gt;Database&lt;/li&gt;
&lt;li&gt;AppService:&lt;/li&gt;
&lt;li&gt;Conclusion:&lt;/li&gt;
&lt;li&gt;Credits:&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Azure best practices help businesses make the most of Azure resources to create and maintain scalable, cost-efficient, and secure solutions on the Microsoft Azure cloud. Because Azure serves as the foundation of many modern organizations, following the right practices can make or break a firm. In this document, I’ve covered the fundamental best practices that every Azure administrator needs to know, along with advice on how to build a secure, reliable, and effective Azure infrastructure. These practices must be followed in tandem, because none of them alone can adequately secure your systems. As always, select the options that best fit your environment and requirements.&lt;/p&gt;

&lt;h2&gt;
  
  
  Security:
&lt;/h2&gt;

&lt;p&gt;The most important consideration before anything else is security. The recommendations below can help ensure more robust Azure security, but they are not a complete substitute for a full security program. Here are the top practices that I believe will help you strengthen and safeguard your Azure environment.&lt;/p&gt;

&lt;h2&gt;
  
  
  Enable Azure AD Conditional Access and enforce MFA
&lt;/h2&gt;

&lt;p&gt;Azure AD Conditional Access helps administrators keep users productive whenever and wherever they choose while safeguarding the company’s assets. Conditional Access automates access-control decisions based on security, business, and compliance requirements. Azure AD Multi-Factor Authentication (MFA) adds a crucial security layer that protects access to data and applications. To plan an Azure AD Conditional Access deployment and evaluate deployment considerations for Azure AD MFA, refer to the Azure documentation.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--JvD08K3T--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AujYvSNEzVu05zQDpAmkkjQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--JvD08K3T--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AujYvSNEzVu05zQDpAmkkjQ.png" alt="" width="700" height="350"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Disable RDP/SSH from the Internet
&lt;/h2&gt;

&lt;p&gt;Use JIT and Azure Bastion. Do not expose RDP and SSH access over the Internet. To provide remote access to Windows and Linux VMs in Azure from anywhere in the world without compromising security, use one of these techniques:&lt;/p&gt;

&lt;p&gt;✓ &lt;strong&gt;Enable site-to-site VPN or ExpressRoute connections for Just-In-Time (JIT) VM access:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;JIT offers time-limited VM access through RDP and SSH, which reduces exposure to brute-force attacks. In essence, Network Security Groups (NSGs) lock down the RDP and SSH ports and only open them to authorized users for a predetermined time. Users request JIT VM access using Azure AD and Role-Based Access Control (RBAC) permissions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;✓ Configure Azure Bastion inside your virtual network.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Azure Bastion enables direct RDP/SSH connectivity to your VMs over TLS from the Azure portal. As a PaaS service, Azure Bastion does away with the requirement for VMs to have public IP addresses, agents, or specialized client software.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--uSGRDciV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AyHCrH7KDdmEHkkpwLYhQbg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--uSGRDciV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AyHCrH7KDdmEHkkpwLYhQbg.png" alt="" width="700" height="320"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Secure privileged access with Azure AD PIM&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Azure AD Privileged Identity Management (PIM) enables management, control, and monitoring of access to vital resources in Azure, Microsoft 365, and Intune. To use a privileged role, admins must activate or elevate their privileges through PIM for a brief period of time.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Sls7E5nU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AQSUyijEFHVX-AiE5FA9eqg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Sls7E5nU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AQSUyijEFHVX-AiE5FA9eqg.png" alt="" width="700" height="221"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Management:
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Use Resource Groups; Tag individual resources:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Create a resource group strategy that meets your company’s needs, then plan, create, and implement it. Resource groups, which are logical collections of Azure resources, containerize related resources in a group for administration simplicity, security, and cost tracking for your workloads.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hxhsL54E--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AF7hE-TH6xttMSB9m2_wQWg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hxhsL54E--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AF7hE-TH6xttMSB9m2_wQWg.png" alt="" width="700" height="433"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A best-practice resource group strategy should group resources by:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Environment: prod, dev, uat, stg, perf, sit&lt;/li&gt;
&lt;li&gt;Application: BI, DWH&lt;/li&gt;
&lt;li&gt;Business Unit: ML, DS&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Follow a well-defined naming standard&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Include the resource type, application or business unit, environment, Azure region, and a sequential instance number.&lt;/p&gt;

&lt;p&gt;For example, dev-ause-asy-01 denotes an Azure Synapse workspace in the Australia Southeast region, in the DEV environment.&lt;/p&gt;
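&lt;p&gt;To make such a convention stick across teams, it helps to generate and validate names programmatically. A minimal Python sketch (the segment abbreviations are illustrative, not an official Azure scheme):&lt;/p&gt;

```python
def resource_name(env, region, rtype, seq):
    """Compose a name like dev-ause-asy-01 from its parts."""
    return f"{env}-{region}-{rtype}-{seq:02d}"

def parse_name(name):
    """Split a hyphenated resource name back into its labeled parts."""
    env, region, rtype, seq = name.split("-")
    return {"env": env, "region": region, "type": rtype, "seq": int(seq)}

print(resource_name("dev", "ause", "asy", 1))  # dev-ause-asy-01
```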

&lt;p&gt;&lt;strong&gt;Leverage Resource tagging&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Tag all resources inside resource groups to communicate valuable information to your teams, discover resources, and manage costs; tags also make it easier to identify resources for deletion.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--JsYIiQdT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:627/1%2A2ci7kwnh5Untp4I65UVGuw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--JsYIiQdT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:627/1%2A2ci7kwnh5Untp4I65UVGuw.png" alt="" width="627" height="361"&gt;&lt;/a&gt;&lt;/p&gt;
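&lt;p&gt;Tag compliance is easy to automate once you export a resource inventory. A minimal sketch of such a check in Python (the required tag names are an illustrative policy, not an Azure default):&lt;/p&gt;

```python
REQUIRED_TAGS = {"env", "app", "costCenter"}  # illustrative policy

def missing_tags(resource):
    """Return the required tags a resource is missing."""
    return REQUIRED_TAGS - set(resource.get("tags", {}))

inventory = [
    {"name": "dev-ause-asy-01", "tags": {"env": "dev", "app": "BI"}},
    {"name": "prd-ause-sql-01", "tags": {"env": "prd", "app": "DWH", "costCenter": "4200"}},
]
for r in inventory:
    gaps = missing_tags(r)
    if gaps:
        print(r["name"], "is missing tags:", sorted(gaps))
```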

&lt;h2&gt;
  
  
  Operations:
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Use Azure Advisor:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Azure Advisor offers personalized, practical recommendations for cost, security, reliability, operational excellence, and performance, helping organizations optimize their deployments in accordance with Microsoft best practices. These suggestions reflect practices that work well for most businesses and are compiled from resource configuration analysis and usage telemetry in your Azure tenant.&lt;/p&gt;

&lt;h2&gt;
  
  
  Cost Optimization:
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Use reserved instances:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Reserved instances are useful in certain circumstances, such as when you run VMs of the same size continuously. Domain controllers operating on Azure are an excellent example. On these VMs, three-year reserved instances offer savings of up to 72%.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Delete unneeded resources:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;After VMs are decommissioned, orphaned resources are frequently forgotten and left behind in a tenant. These resources are pricey! Common examples of Azure orphaned resources include network interface cards and OS disks. Fortunately, the Azure Portal helps remind administrators to delete unnecessary resources at provisioning and deletion time.&lt;/p&gt;

&lt;h2&gt;
  
  
  Services:
&lt;/h2&gt;

&lt;p&gt;In this section, let’s look in detail at the most commonly used resources and the best ways to use them.&lt;/p&gt;

&lt;p&gt;➢ Virtual Machine&lt;/p&gt;

&lt;p&gt;✓ Enforce multi-factor authentication (MFA) and complex passwords. MFA can help limit the threat of compromised credentials. Complex passwords help reduce the effectiveness of brute-force password attacks.&lt;/p&gt;

&lt;p&gt;✓ Use just-in-time (JIT) virtual machine access. JIT access works with NSGs and the Azure firewall and helps you layer in role-based access controls (RBAC) and time-bindings on access to virtual machines.&lt;/p&gt;

&lt;p&gt;✓ Have a patch process in place. If you’re not patching your workloads, all your other efforts may be for nothing. A single unpatched vulnerability can lead to a breach. A patch process to keep your operating systems and applications up to date helps you mitigate this risk.&lt;/p&gt;

&lt;p&gt;✓ Lock down administrative ports. Unless necessary, restrict access to SSH, RDP, WinRM, and other administrative ports.&lt;/p&gt;

&lt;p&gt;✓ Use the Azure firewall and network security groups (NSGs) to limit access to workloads. Consistent with the principle of least privilege, use NSGs and the Azure firewall to restrict workload access.&lt;/p&gt;

&lt;p&gt;✓ Apply the Latest OS Patches Ensure that the latest OS patches available for Microsoft Azure virtual machines are applied.&lt;/p&gt;

&lt;p&gt;✓ Approved Azure Machine Image in Use Ensure that all your Azure virtual machine instances are launched from approved machine images only.&lt;/p&gt;

&lt;p&gt;✓ Enable Auto-Shutdown Configure your Microsoft Azure virtual machines to automatically shut down on a daily basis.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;➢ Storage&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;✓ Restrict database and storage access. Use firewalls and access controls to limit what level of access users, devices, and services have to your databases and storage blobs.&lt;/p&gt;

&lt;p&gt;✓ Leverage auditing. Turn on auditing for your Azure databases. Doing so enables you to gain visibility into all database changes.&lt;/p&gt;

&lt;p&gt;✓ Configure threat detection for Azure SQL. If you use Azure SQL, activating threat detection helps you identify security issues faster and limit dwell time.&lt;/p&gt;

&lt;p&gt;✓ Set log alerts in Azure Monitor. It isn’t enough to simply log events. Make sure you are alerting against security-related events in Azure Monitor so you can remediate issues quickly (and automatically when possible).&lt;/p&gt;

&lt;p&gt;✓ Enable Azure Defender for your storage accounts. Azure Defender helps you harden and secure your Azure storage accounts.&lt;/p&gt;

&lt;p&gt;✓ Use soft deletes. Soft deletes help you ensure data is still retrievable (for 14 days) in the event a malicious actor or user error deletes data you wanted to keep.&lt;/p&gt;

&lt;p&gt;✓ Use shared access signatures (SAS). SAS enables you to implement granular access controls and time limits on client access to data.&lt;/p&gt;

&lt;p&gt;✓ Disable Anonymous Access to Blob Containers Ensure that anonymous access to blob containers is disabled within your Azure Storage account.&lt;/p&gt;

&lt;p&gt;✓ Disable public access to storage accounts with blob containers Ensure that public access to blob containers is disabled for your Azure storage accounts to override any ACL configurations allowing access.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;➢ Network&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;✓ Encrypt data in transit. As we mentioned in the encryption and data security section: encryption of data in transit (and at rest) is a must. Leverage modern encryption protocols for all network traffic.&lt;/p&gt;

&lt;p&gt;✓ Implement zero trust. By default, network policies should deny access unless there is an explicit allow rule.&lt;/p&gt;

&lt;p&gt;✓ Limit open ports and Internet-facing endpoints. Unless there is a well-defined business reason for a port to be open or workload to be Internet-facing, don’t let it happen.&lt;/p&gt;

&lt;p&gt;✓ Monitor device access. Monitoring access to your workloads and devices (e.g. using a SIEM or Azure Monitor) helps you proactively detect threats&lt;/p&gt;

&lt;p&gt;✓ Segment your networks. Logical network segmentation can help improve visibility, make your networks easier to manage and limit east-west movement in the event of a breach.&lt;/p&gt;

&lt;p&gt;✓ Check for NSG Flow Log Retention Period Ensure that the Network Security Group (NSG) flow log retention period is greater than or equal to 90 days.&lt;/p&gt;

&lt;p&gt;✓ Check for Network Security Groups with Port Ranges Ensure there are no network security groups with a range of ports opened to allow incoming traffic.&lt;/p&gt;
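&lt;p&gt;The port-range rule above can be screened for mechanically. A rough Python sketch over exported NSG rules (the rule fields here are simplified from the real NSG schema):&lt;/p&gt;

```python
def has_port_range(rule):
    """True if a rule opens a span of ports (e.g. '5985-5986') or all ports ('*')."""
    dpr = rule["destination_port_range"]
    return dpr == "*" or "-" in dpr

rules = [
    {"name": "allow-https", "destination_port_range": "443"},
    {"name": "allow-winrm", "destination_port_range": "5985-5986"},
]
flagged = [r["name"] for r in rules if has_port_range(r)]  # ['allow-winrm']
```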

&lt;p&gt;✓ Enable DDoS Standard Protection for Virtual Networks Ensure that DDoS standard protection is enabled for production Azure virtual networks.&lt;/p&gt;

&lt;p&gt;✓ Monitor Network Security Group Configuration Changes Ensure that network security group configuration changes are detected and reviewed in your Microsoft Azure cloud account.&lt;/p&gt;

&lt;p&gt;✓ Review Network Interfaces with IP Forwarding Enabled Ensure that the Azure network interfaces with IP forwarding enabled are regularly reviewed.&lt;/p&gt;

&lt;h2&gt;
  
  
  ➢ Database
&lt;/h2&gt;

&lt;p&gt;➢ &lt;strong&gt;Cosmos DB:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;✓ Enable Advanced Threat Protection Ensure that Advanced Threat Protection is enabled for all Microsoft Azure Cosmos DB accounts.&lt;/p&gt;

&lt;p&gt;✓ Enable Automatic Failover Enable automatic failover for Microsoft Azure Cosmos DB accounts.&lt;/p&gt;

&lt;p&gt;✓ Restrict Default Network Access for Azure Cosmos DB Accounts Ensure that default network access (i.e. public access) is denied within your Azure Cosmos DB accounts configuration.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;➢ SQL:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;✓ Advanced Data Security for SQL Servers Ensure that Advanced Data Security (ADS) is enabled at the Azure SQL database server level.&lt;/p&gt;

&lt;p&gt;✓ Check for Publicly Accessible SQL Servers Ensure that Azure SQL database servers are accessible via private endpoints only.&lt;/p&gt;

&lt;p&gt;✓ Check for Sufficient Point in Time Restore (PITR) Backup Retention Period Ensure there is a sufficient PITR backup retention period configured for Azure SQL databases.&lt;/p&gt;

&lt;p&gt;✓ Check for Unrestricted SQL Database Access Ensure that no SQL databases allow unrestricted inbound access from 0.0.0.0/0 (any IP address).&lt;/p&gt;
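&lt;p&gt;The 0.0.0.0/0 check above translates directly into a rule audit. A minimal Python sketch, modeling firewall rules as start/end IP pairs the way Azure SQL server-level rules are expressed:&lt;/p&gt;

```python
import ipaddress

def is_unrestricted(start_ip, end_ip):
    """Flag a firewall rule that spans the entire IPv4 address space."""
    return (ipaddress.ip_address(start_ip) == ipaddress.ip_address("0.0.0.0")
            and ipaddress.ip_address(end_ip) == ipaddress.ip_address("255.255.255.255"))

print(is_unrestricted("0.0.0.0", "255.255.255.255"))  # True
print(is_unrestricted("10.0.0.0", "10.0.0.255"))      # False
```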

&lt;p&gt;✓ Configure “AuditActionGroup” for SQL Server Auditing Ensure that “AuditActionGroup” property is well configured at the Azure SQL database server level.&lt;/p&gt;

&lt;p&gt;✓ Configure Emails for Vulnerability Assessment Scan Reports and Alerts Ensure that “Send scan reports to” setting is configured for SQL database servers.&lt;/p&gt;

&lt;p&gt;✓ Detect Create, Update, and Delete SQL Server Firewall Rule Events Ensure that SQL Server firewall rule changes are detected in your Microsoft Azure cloud account.&lt;/p&gt;

&lt;p&gt;✓ Enable All Types of Threat Detection on SQL Servers Enable all types of threat detection for your Microsoft Azure SQL database servers.&lt;/p&gt;

&lt;p&gt;✓ Enable Auditing for SQL Servers Ensure that database auditing is enabled at the Azure SQL database server level.&lt;/p&gt;

&lt;p&gt;✓ Enable Auto-Failover Groups Ensure that your Azure SQL database servers are configured to use auto-failover groups.&lt;/p&gt;

&lt;p&gt;✓ Enable Automatic Tuning for SQL Database Servers Ensure that Automatic Tuning feature is enabled for Microsoft Azure SQL database servers.&lt;/p&gt;

&lt;p&gt;✓ Enable Transparent Data Encryption for SQL Databases Ensure that Transparent Data Encryption (TDE) is enabled for every Azure SQL database.&lt;/p&gt;

&lt;p&gt;✓ Enable Vulnerability Assessment Email Notifications for Admins and Subscription Owners Ensure that the Vulnerability Assessment setting “Also send email notification to admins and subscription owners” is enabled.&lt;/p&gt;

&lt;p&gt;✓ Enable Vulnerability Assessment Periodic Recurring Scans Ensure that the Vulnerability Assessment Periodic Recurring Scans setting is enabled for SQL database servers.&lt;/p&gt;

&lt;p&gt;✓ Enable Vulnerability Assessment for Microsoft SQL Servers Ensure that Vulnerability Assessment is enabled for Microsoft SQL database servers.&lt;/p&gt;

&lt;p&gt;✓ SQL Auditing Retention Ensure that SQL database auditing has a sufficient log data retention period configured.&lt;/p&gt;

&lt;p&gt;✓ Use Azure Active Directory Admin for SQL Authentication Ensure that an Azure Active Directory (AAD) admin is configured for SQL authentication.&lt;/p&gt;

&lt;p&gt;✓ Use BYOK for Transparent Data Encryption Use Bring Your Own Key (BYOK) support for Transparent Data Encryption (TDE).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;➢ PostgreSQL&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;✓ Check for PostgreSQL Log Retention Period Ensure that PostgreSQL database servers have a sufficient log retention period configured.&lt;/p&gt;

&lt;p&gt;✓ Check for PostgreSQL Major Version Ensure that PostgreSQL database servers are using the latest major version of PostgreSQL database.&lt;/p&gt;

&lt;p&gt;✓ Enable “CONNECTION_THROTTLING” Parameter for PostgreSQL Servers Ensure that “connection_throttling” parameter is set to “ON” within your Azure PostgreSQL server settings.&lt;/p&gt;

&lt;p&gt;✓ Enable “LOG_CHECKPOINTS” Parameter for PostgreSQL Servers Enable “log_checkpoints” parameter for your Microsoft Azure PostgreSQL database servers.&lt;/p&gt;

&lt;p&gt;✓ Enable “LOG_CONNECTIONS” Parameter for PostgreSQL Servers Enable “log_connections” parameter for your Microsoft Azure PostgreSQL database servers.&lt;/p&gt;

&lt;p&gt;✓ Enable “LOG_DISCONNECTIONS” Parameter for PostgreSQL Servers Enable “log_disconnections” parameter for your Microsoft Azure PostgreSQL database servers.&lt;/p&gt;

&lt;p&gt;✓ Enable “LOG_DURATION” Parameter for PostgreSQL Servers Enable “log_duration” parameter on your Microsoft Azure PostgreSQL database servers.&lt;/p&gt;

&lt;p&gt;✓ Enable “log_checkpoints” Parameter for PostgreSQL Flexible Servers Enable “log_checkpoints” parameter for your Microsoft Azure PostgreSQL flexible database servers.&lt;/p&gt;

&lt;p&gt;✓ Enable In-Transit Encryption for PostgreSQL Database Servers Ensure that in-transit encryption is enabled for your Azure PostgreSQL database servers.&lt;/p&gt;

&lt;p&gt;✓ Enable Infrastructure Double Encryption for Single Servers Ensure that infrastructure double encryption is enabled for Single Server Azure PostgreSQL database servers.&lt;/p&gt;

&lt;p&gt;✓ Use Azure Active Directory Admin for PostgreSQL Authentication Ensure that an Azure Active Directory (AAD) admin is configured for PostgreSQL authentication.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;➢ MySQL:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;✓ Configure TLS Version for MySQL Flexible Database Servers Ensure that the ‘tls_version’ parameter is set to a minimum of ‘TLSV1.2’ for all MySQL flexible database servers.&lt;/p&gt;

&lt;p&gt;✓ Enable In-Transit Encryption for MySQL Servers Ensure that in-transit encryption is enabled for your Azure MySQL database servers.&lt;/p&gt;

&lt;h2&gt;
  
  
  ➢ AppService:
&lt;/h2&gt;

&lt;p&gt;✓ Check for Latest Version of .NET Framework Ensure that Azure App Service web applications are using the latest version of the .NET Framework.&lt;/p&gt;

&lt;p&gt;✓ Check for Latest Version of Java Ensure that Azure App Service web applications are using the latest stable version of Java.&lt;/p&gt;

&lt;p&gt;✓ Check for Latest Version of PHP Ensure that Azure App Service web applications are using the latest version of PHP.&lt;/p&gt;

&lt;p&gt;✓ Check for Latest Version of Python Ensure that Azure App Service web applications are using the latest version of Python.&lt;/p&gt;

&lt;p&gt;✓ Check for Sufficient Backup Retention Period Ensure there is a sufficient backup retention period configured for Azure App Services applications.&lt;/p&gt;

&lt;p&gt;✓ Check for TLS Protocol Latest Version Ensure that Azure App Service web applications are using the latest version of TLS encryption.&lt;/p&gt;

&lt;p&gt;✓ Disable Plain FTP Deployment Ensure that FTP access is disabled for your Azure App Services web applications.&lt;/p&gt;

&lt;p&gt;✓ Disable Remote Debugging Disable Remote Debugging feature for your Microsoft Azure App Services web applications.&lt;/p&gt;

&lt;p&gt;✓ Enable Always On Ensure that your Azure App Services web applications stay loaded all the time by enabling the Always On feature.&lt;/p&gt;

&lt;p&gt;✓ Enable App Service Authentication Ensure that App Service Authentication is enabled within your Microsoft Azure cloud account.&lt;/p&gt;

&lt;p&gt;✓ Enable Application Insights Ensure that Azure App Services applications are configured to use Application Insights feature.&lt;/p&gt;

&lt;p&gt;✓ Enable Automated Backups Ensure that all your Azure App Services applications are using the Backup and Restore feature.&lt;/p&gt;

&lt;p&gt;✓ Enable FTPS-Only Access Enable FTPS-only access for your Microsoft Azure App Services web applications.&lt;/p&gt;

&lt;p&gt;✓ Enable HTTP/2 Ensure that Azure App Service web applications are using the latest stable version of HTTP.&lt;/p&gt;

&lt;p&gt;✓ Enable HTTPS-Only Traffic Enable HTTP to HTTPS redirects for your Microsoft Azure App Service web applications.&lt;/p&gt;

&lt;p&gt;✓ Enable Incoming Client Certificates Ensure that Azure App Service web applications are using incoming client certificates.&lt;/p&gt;

&lt;p&gt;✓ Enable Registration with Azure Active Directory Ensure that registration with Azure Active Directory is enabled for Azure App Service applications.&lt;/p&gt;

&lt;p&gt;✓ Use Key Vaults to Store App Service Application Secrets Ensure that Azure Key Vaults are used to store App Service application secrets.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;➢ KeyVault&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;✓ App Tier Customer-Managed Key In Use Ensure that a Customer-Managed Key is created for your Azure cloud application tier.&lt;/p&gt;

&lt;p&gt;✓ Check for Allowed Certificate Key Types Ensure that Azure Key Vault certificates are using the appropriate key type(s).&lt;/p&gt;

&lt;p&gt;✓ Check for Azure Key Vault Keys Expiration Date Ensure that your Azure Key Vault encryption keys are renewed prior to their expiration date.&lt;/p&gt;

&lt;p&gt;✓ Check for Azure Key Vault Secrets Expiration Date Ensure that your Azure Key Vault secrets are renewed prior to their expiration date.&lt;/p&gt;
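&lt;p&gt;Both expiration checks above boil down to comparing an expiry timestamp against a renewal window. A minimal Python sketch (the 30-day window is an illustrative policy, not a Key Vault default):&lt;/p&gt;

```python
from datetime import datetime, timedelta, timezone

def expiring_soon(expires_on, within_days=30):
    """True if a key or secret expires inside the renewal window (or already has)."""
    return timedelta(days=within_days) >= expires_on - datetime.now(timezone.utc)

soon = datetime.now(timezone.utc) + timedelta(days=5)
later = datetime.now(timezone.utc) + timedelta(days=365)
print(expiring_soon(soon), expiring_soon(later))  # True False
```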

&lt;p&gt;✓ Check for Certificate Minimum Key Size Ensure that Azure Key Vault RSA certificates are using the appropriate key size.&lt;/p&gt;

&lt;p&gt;✓ Check for Key Vault Full Administrator Permissions Ensure that no Azure user, group or application has full permissions to access and manage Key Vaults.&lt;/p&gt;

&lt;p&gt;✓ Check for Sufficient Certificate Auto-Renewal Period Ensure there is a sufficient period configured for the SSL certificates auto-renewal.&lt;/p&gt;

&lt;p&gt;✓ Database Tier Customer-Managed Key In Use Ensure that a Customer-Managed Key is created for your Microsoft Azure cloud database tier.&lt;/p&gt;

&lt;p&gt;✓ Enable AuditEvent Logging for Azure Key Vaults Ensure that AuditEvent logging is enabled for your Microsoft Azure Key Vaults.&lt;/p&gt;

&lt;p&gt;✓ Enable Certificate Transparency Ensure that certificate transparency is enabled for all your Azure Key Vault certificates.&lt;/p&gt;

&lt;p&gt;✓ Enable Key Vault Recoverability Ensure that your Microsoft Azure Key Vault instances are recoverable.&lt;/p&gt;

&lt;p&gt;✓ Enable SSL Certificate Auto-Renewal Ensure that Auto-Renewal feature is enabled for your Azure Key Vault SSL certificates.&lt;/p&gt;

&lt;p&gt;✓ Enable Trusted Microsoft Services for Key Vault Access Allow trusted Microsoft services to access your Azure Key Vault resources (i.e. encryption keys, secrets and certificates).&lt;/p&gt;

&lt;p&gt;✓ Restrict Default Network Access for Azure Key Vaults Ensure that default network access (i.e. public access) rule is set to “Deny” within your Azure Key Vaults configuration.&lt;/p&gt;

&lt;p&gt;✓ Set Azure Secret Key Expiration Ensure that an expiration date is set for all your Microsoft Azure secret keys.&lt;/p&gt;

&lt;p&gt;✓ Set Encryption Key Expiration Ensure that an expiration date is configured for all your Microsoft Azure encryption keys.&lt;/p&gt;

&lt;p&gt;✓ Web Tier Customer-Managed Key In Use Ensure that a Customer-Managed Key is created for your Microsoft Azure cloud web tier&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion:
&lt;/h2&gt;

&lt;p&gt;There are numerous Azure features and services that require ongoing maintenance in terms of security. There are countless ways to attack a system, and poorly protected systems are the ones that hackers most frequently target. By keeping a few simple things in mind, you can strengthen your network considerably. With some investment and your work, you can make your Azure secure and robust by using a variety of Azure services.&lt;/p&gt;

&lt;h2&gt;
  
  
  Credits:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://learn.microsoft.com/en-us/azure/security/fundamentals/best-practices-and-patterns"&gt;https://learn.microsoft.com/en-us/azure/security/fundamentals/best-practices-and-patterns&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Read my blogs:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://kunaldaskd.medium.com"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FrhCZTrV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/TgYYM9w.png" alt="Medium Logo" width="62" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.to/kunaldas"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rhp9uSXy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/bp3qHWb.png" alt="Dev.to Logo" width="35" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://kunaldas.hashnode.dev"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_veP3uic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/iwZwo2S.png" alt="Hashnode Logo" width="204" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Connect with Me:
&lt;/h2&gt;

&lt;p&gt;
&lt;a href="https://twitter.com/kunald_official"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--o4A7Waxi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/VaorXDP.png" alt="kunald_official" width="43" height="35"&gt;&lt;/a&gt;
&lt;a href="https://linkedin.com/in/kunaldaskd"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U70a7xDy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/ktIHVxm.png" alt="kunaldaskd" width="35" height="35"&gt;&lt;/a&gt;
&lt;/p&gt;

</description>
      <category>azure</category>
      <category>cloud</category>
      <category>microsoft</category>
    </item>
    <item>
      <title>Mastering Private Repositories in Enterprise with GitHub</title>
      <dc:creator>Kunal Das</dc:creator>
      <pubDate>Sun, 21 Jan 2024 12:36:56 +0000</pubDate>
      <link>https://dev.to/kunaldas/mastering-private-repositories-in-enterprise-with-github-1dj5</link>
      <guid>https://dev.to/kunaldas/mastering-private-repositories-in-enterprise-with-github-1dj5</guid>
      <description>&lt;h1&gt;
  
  
  Mastering Private Repositories in Enterprise with GitHub: A Comprehensive Guide
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" alt="Kunal Das, Author" width="44" height="44"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Reach me at: &lt;a href="https://heylink.me/kunaldas"&gt;https://heylink.me/kunaldas&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qJsBLARr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Ao_r7JdrqQTcB0kkb3AF1UA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qJsBLARr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Ao_r7JdrqQTcB0kkb3AF1UA.jpeg" alt="" width="700" height="391"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
Mastering Private Repositories in Enterprise with GitHub: A Comprehensive Guide

&lt;ul&gt;
&lt;li&gt;Getting Started with GitHub and Git&lt;/li&gt;
&lt;li&gt;Securing Your Connection with SSH&lt;/li&gt;
&lt;li&gt;Additional Tips and Techniques&lt;/li&gt;
&lt;li&gt;1. Leverage GitHub’s Issue Tracker&lt;/li&gt;
&lt;li&gt;2. Use Branching Strategically&lt;/li&gt;
&lt;li&gt;3. Take Advantage of GitHub Actions&lt;/li&gt;
&lt;li&gt;4. Protect Sensitive Information with .gitignore&lt;/li&gt;
&lt;li&gt;5. Stay Informed with Webhooks&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In the modern era of software development, understanding how to effectively use tools like GitHub is crucial. This guide is designed to help you navigate the world of GitHub, specifically focusing on working with private repositories in an enterprise setting. By the end of this guide, you’ll have a solid foundation in GitHub, git, and SSH, empowering you to manage your codebase efficiently and securely.&lt;/p&gt;

&lt;h2&gt;
  
  
  Getting Started with GitHub and Git
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt; Create Your GitHub Account&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Start your journey by setting up a GitHub account. GitHub is a web-based hosting service for version control and is a key player in the open-source community. Having a GitHub account opens up a world of opportunities for collaboration and project management.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/join"&gt;https://github.com/join&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;2. Install Git&lt;/p&gt;

&lt;p&gt;Git is the backbone of GitHub. It’s a distributed version control system that allows multiple people to work on a project without overwriting each other’s changes. Download and install git on your local machine to start leveraging its power.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://git-scm.com/book/en/v2/Getting-Started-Installing-Git"&gt;https://git-scm.com/book/en/v2/Getting-Started-Installing-Git&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3. Configure Git&lt;/p&gt;

&lt;p&gt;Personalize your git setup by adding your username and email. This information will be associated with any commits you make. Open your terminal or shell and type:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git config --global user.name "Your name here"git config --global user.email "your_email@example.com"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To enhance your git experience, enable colored output in the terminal and set your preferred editor. This can make navigating git responses easier and ensure you’re comfortable when git opens an editor for you:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git config - global color.ui  
git config - global core.editor
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Securing Your Connection with SSH
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Establish SSH Connection&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Security is paramount when working with code, especially in an enterprise setting. SSH, or Secure Shell, is a protocol that provides a secure channel between your local machine and GitHub. You can follow this &lt;a href="https://www.cyberithub.com/how-to-setup-passwordless-authentication-for-git-push-in-github/"&gt;comprehensive guide&lt;/a&gt; for setting up password-less logins. GitHub also provides a &lt;a href="https://docs.github.com/en/authentication/connecting-to-github-with-ssh/generating-a-new-ssh-key-and-adding-it-to-the-ssh-agent"&gt;detailed guide&lt;/a&gt; for generating SSH keys.&lt;/p&gt;

&lt;p&gt;Check if you have the files ~/.ssh/id_rsa and ~/.ssh/id_rsa.pub. If not, create these public/private keys:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ssh-keygen -t rsa -C "your_email@example.com"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now copy your public key; you will paste it into your GitHub account in the steps below.&lt;/p&gt;
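&lt;p&gt;A small sketch for printing the public key so you can copy it (assuming the default key path from the ssh-keygen step above):&lt;/p&gt;

```shell
# Print the public key for copying into GitHub > Settings > SSH Keys.
# Assumes the default path used by the ssh-keygen command above.
KEY="${HOME}/.ssh/id_rsa.pub"
if [ -f "$KEY" ]; then
  cat "$KEY"    # copy this entire line into GitHub
else
  echo "No public key found at $KEY - run ssh-keygen first."
fi
```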

&lt;ol start="2"&gt;
&lt;li&gt; &lt;strong&gt;Update SSH Key in GitHub Account&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Navigate to your GitHub Account Settings and click on “SSH Keys”. Add a new SSH key with a label (like “VS Code”) and paste the public key you copied earlier.&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt; To verify your setup, type the following in your terminal:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ssh -T git@github.com
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Hi username! You’ve successfully authenticated, but GitHub does not provide shell access.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;If you see the above message, congratulations! You’re all set to start working with private repositories in an enterprise setting.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Additional Tips and Techniques
&lt;/h2&gt;

&lt;h2&gt;
  
  
  1. Leverage GitHub’s Issue Tracker
&lt;/h2&gt;

&lt;p&gt;GitHub’s issue tracker is a powerful tool for managing tasks, bugs, and feature requests. Each issue provides a platform for discussion, allowing team members to communicate about the task at hand.&lt;/p&gt;

&lt;h2&gt;
  
  
  2. Use Branching Strategically
&lt;/h2&gt;

&lt;p&gt;Branching is a core feature of git that allows you to work on different features or bugs in isolation. Developing a good branching strategy can help keep your codebase organized and make the development process smoother.&lt;/p&gt;
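&lt;p&gt;The idea can be sketched with plain git commands; the repo, branch, and file names below are purely illustrative:&lt;/p&gt;

```shell
# A minimal feature-branch workflow: develop in isolation, then merge back.
set -e
cd "$(mktemp -d)"
git init -q demo
cd demo
git config user.name "Demo"
git config user.email "demo@example.com"
git commit -q --allow-empty -m "initial commit"   # main line of development
git checkout -q -b feature/login                  # work on the feature in isolation
echo "login page" > login.txt
git add login.txt
git commit -q -m "Add login page"
git checkout -q -                                 # back to the default branch
git merge -q --no-ff feature/login -m "Merge feature/login"
git log --oneline                                 # shows the merge on top
```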

&lt;h2&gt;
  
  
  3. Take Advantage of GitHub Actions
&lt;/h2&gt;

&lt;p&gt;GitHub Actions is a CI/CD service provided by GitHub. It allows you to automate tasks like building, testing, and deploying your code. This can save you time and help ensure consistent quality in your codebase.&lt;/p&gt;

&lt;h2&gt;
  
  
  4. Protect Sensitive Information with .gitignore
&lt;/h2&gt;

&lt;p&gt;The .gitignore file allows you to specify files or directories that git should ignore. This is crucial for keeping sensitive information, like API keys or configuration files with passwords, out of your codebase.&lt;/p&gt;
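&lt;p&gt;A quick demonstration of this in practice (file names are illustrative):&lt;/p&gt;

```shell
# Show .gitignore keeping sensitive files out of version control.
set -e
cd "$(mktemp -d)"
git init -q
printf '.env\n*.key\n' > .gitignore   # never commit env files or private keys
touch .env api.key tracked.txt
git add .                             # the ignored files are skipped automatically
git ls-files                          # lists only .gitignore and tracked.txt
```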

&lt;h2&gt;
  
  
  5. Stay Informed with Webhooks
&lt;/h2&gt;

&lt;p&gt;Webhooks allow you to set up automatic notifications when specific events occur in your repository. This can help keep you informed about the state of your project and respond quickly to changes.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;By mastering these tools and techniques, you’ll be well-equipped to manage private repositories in an enterprise setting. Whether you’re a seasoned developer or just starting out, GitHub offers a wealth of features to streamline your workflow and enhance collaboration.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Read my blogs:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://kunaldaskd.medium.com"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FrhCZTrV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/TgYYM9w.png" alt="Medium Logo" width="62" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.to/kunaldas"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rhp9uSXy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/bp3qHWb.png" alt="Dev.to Logo" width="35" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://kunaldas.hashnode.dev"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_veP3uic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/iwZwo2S.png" alt="Hashnode Logo" width="204" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Connect with Me:
&lt;/h2&gt;

&lt;p&gt;
&lt;a href="https://twitter.com/kunald_official"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--o4A7Waxi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/VaorXDP.png" alt="kunald_official" width="43" height="35"&gt;&lt;/a&gt;
&lt;a href="https://linkedin.com/in/kunaldaskd"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U70a7xDy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/ktIHVxm.png" alt="kunaldaskd" width="35" height="35"&gt;&lt;/a&gt;
&lt;/p&gt;

</description>
      <category>azure</category>
      <category>github</category>
      <category>git</category>
    </item>
    <item>
      <title>Exploring Azure Storage Services</title>
      <dc:creator>Kunal Das</dc:creator>
      <pubDate>Sun, 21 Jan 2024 12:36:06 +0000</pubDate>
      <link>https://dev.to/kunaldas/exploring-azure-storage-services-59ho</link>
      <guid>https://dev.to/kunaldas/exploring-azure-storage-services-59ho</guid>
      <description>&lt;h1&gt;
  
  
  Exploring Azure Storage Services 🌐🗄️
&lt;/h1&gt;

&lt;p&gt;Azure, Microsoft's cloud computing service, offers a range of storage solutions designed to meet the diverse needs of modern businesses. In this guide, we'll delve into the various Azure storage services, exploring their unique features and use cases. Whether you're a developer, a data scientist, or an IT professional, understanding these services can enhance your cloud strategy.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" alt="Kunal Das, Author" width="44" height="44"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Reach me at: &lt;a href="https://heylink.me/kunaldas"&gt;https://heylink.me/kunaldas&lt;/a&gt;
&lt;/h2&gt;

&lt;h2&gt;
  
  
  Table of Contents
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
Exploring Azure Storage Services 🌐🗄️

&lt;ul&gt;
&lt;li&gt;Table of Contents&lt;/li&gt;
&lt;li&gt;Introduction to Azure Storage&lt;/li&gt;
&lt;li&gt;Azure Blob Storage&lt;/li&gt;
&lt;li&gt;What is Blob Storage? 🤔&lt;/li&gt;
&lt;li&gt;Key Features 🌟&lt;/li&gt;
&lt;li&gt;Use Cases 🛠️&lt;/li&gt;
&lt;li&gt;Azure File Storage&lt;/li&gt;
&lt;li&gt;What is File Storage? 📂&lt;/li&gt;
&lt;li&gt;Key Features 🌟&lt;/li&gt;
&lt;li&gt;Use Cases 🛠️&lt;/li&gt;
&lt;li&gt;Azure Queue Storage&lt;/li&gt;
&lt;li&gt;What is Queue Storage? 📨&lt;/li&gt;
&lt;li&gt;Key Features 🌟&lt;/li&gt;
&lt;li&gt;Use Cases 🛠️&lt;/li&gt;
&lt;li&gt;Azure Table Storage&lt;/li&gt;
&lt;li&gt;What is Table Storage? 📊&lt;/li&gt;
&lt;li&gt;Key Features 🌟&lt;/li&gt;
&lt;li&gt;Use Cases 🛠️&lt;/li&gt;
&lt;li&gt;Azure Disk Storage&lt;/li&gt;
&lt;li&gt;What is Disk Storage? 💽&lt;/li&gt;
&lt;li&gt;Key Features 🌟&lt;/li&gt;
&lt;li&gt;Use Cases 🛠️&lt;/li&gt;
&lt;li&gt;Choosing the Right Azure Storage Service&lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;li&gt;Read my blogs:&lt;/li&gt;
&lt;li&gt;Connect with Me:&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Introduction to Azure Storage
&lt;/h2&gt;

&lt;p&gt;Azure Storage is a cloud service at the core of Azure. It offers scalable, durable, and secure storage options for a variety of data types. In this guide, we will explore the different Azure Storage services, understand their functionalities, and see how they can be applied in real-world scenarios.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--GuVqw8hM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://media.licdn.com/dms/image/D4D22AQHX6-ddbaicLQ/feedshare-shrink_800/0/1702440527380%3Fe%3D1708560000%26v%3Dbeta%26t%3D8LQUTRTjxHB-0ovl1xQX2badC2j8wdEjjbMawyK_0Ec" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GuVqw8hM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://media.licdn.com/dms/image/D4D22AQHX6-ddbaicLQ/feedshare-shrink_800/0/1702440527380%3Fe%3D1708560000%26v%3Dbeta%26t%3D8LQUTRTjxHB-0ovl1xQX2badC2j8wdEjjbMawyK_0Ec" alt="azure storage" width="784" height="937"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Azure Blob Storage
&lt;/h2&gt;

&lt;h3&gt;
  
  
  What is Blob Storage? 🤔
&lt;/h3&gt;

&lt;p&gt;Blob Storage is Azure's object storage solution for the cloud. It is optimized for storing massive amounts of unstructured data, such as text or binary data.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Features 🌟
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Scalability:&lt;/strong&gt; Easily handles massive amounts of data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security:&lt;/strong&gt; Advanced security and encryption features.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Accessibility:&lt;/strong&gt; Data is accessible from anywhere in the world over HTTP or HTTPS.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Use Cases 🛠️
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Storing Images and Videos:&lt;/strong&gt; Ideal for storing media files for websites and applications.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data Backup:&lt;/strong&gt; Efficient for backing up data and disaster recovery solutions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Big Data Analysis:&lt;/strong&gt; Can store large datasets for analytics purposes.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Azure File Storage
&lt;/h2&gt;

&lt;h3&gt;
  
  
  What is File Storage? 📂
&lt;/h3&gt;

&lt;p&gt;Azure File Storage offers fully managed file shares in the cloud using the standard SMB protocol.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Features 🌟
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;SMB Protocol:&lt;/strong&gt; Uses the standard SMB 3.0 protocol.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Shared Access:&lt;/strong&gt; Files can be shared across applications and virtual machines.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Use Cases 🛠️
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Lift and Shift:&lt;/strong&gt; Ideal for migrating on-premises applications that rely on file shares to Azure.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Hybrid Storage:&lt;/strong&gt; Works well for scenarios requiring both on-premises and cloud storage.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Azure Queue Storage
&lt;/h2&gt;

&lt;h3&gt;
  
  
  What is Queue Storage? 📨
&lt;/h3&gt;

&lt;p&gt;Queue Storage provides a messaging queue for reliable messaging between application components.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Features 🌟
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Decoupling:&lt;/strong&gt; Helps in decoupling application components.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scalability:&lt;/strong&gt; Scales to handle a large number of messages.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Use Cases 🛠️
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Message Processing:&lt;/strong&gt; Perfect for asynchronous message processing in applications.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Task Scheduling:&lt;/strong&gt; Useful in task scheduling scenarios.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Azure Table Storage
&lt;/h2&gt;

&lt;h3&gt;
  
  
  What is Table Storage? 📊
&lt;/h3&gt;

&lt;p&gt;Table Storage is a NoSQL data store for semi-structured data.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Features 🌟
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;NoSQL:&lt;/strong&gt; Ideal for storing large volumes of non-relational data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Flexible:&lt;/strong&gt; Easy to adapt to changing data requirements.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Use Cases 🛠️
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Storing User Data:&lt;/strong&gt; Great for storing user data for web applications.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Address Books:&lt;/strong&gt; Suitable for storing address books, user profiles, etc.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Azure Disk Storage
&lt;/h2&gt;

&lt;h3&gt;
  
  
  What is Disk Storage? 💽
&lt;/h3&gt;

&lt;p&gt;Azure Disk Storage offers high-performance, durable block storage for Azure Virtual Machines.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Features 🌟
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;High Performance:&lt;/strong&gt; Designed for I/O-intensive workloads.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data Durability:&lt;/strong&gt; Provides persistent data storage.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Use Cases 🛠️
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Virtual Machines:&lt;/strong&gt; Ideal for databases and other high-performance applications running on VMs.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Choosing the Right Azure Storage Service
&lt;/h2&gt;

&lt;p&gt;When selecting an Azure Storage service, consider factors like data type, access patterns, and scalability requirements. Each service is tailored to specific scenarios, so understanding your application's needs is crucial.&lt;/p&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Azure's diverse storage options offer flexible, scalable, and secure solutions to meet the ever-evolving data storage needs of businesses. By understanding the strengths and applications of each service, you can make informed decisions about your cloud storage strategy.&lt;/p&gt;




&lt;h2&gt;
  
  
  Read my blogs:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://kunaldaskd.medium.com"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FrhCZTrV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/TgYYM9w.png" alt="Medium Logo" width="62" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.to/kunaldas"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rhp9uSXy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/bp3qHWb.png" alt="Dev.to Logo" width="35" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://kunaldas.hashnode.dev"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_veP3uic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/iwZwo2S.png" alt="Hashnode Logo" width="204" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Connect with Me:
&lt;/h2&gt;

&lt;p&gt;
&lt;a href="https://twitter.com/kunald_official"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--o4A7Waxi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/VaorXDP.png" alt="kunald_official" width="43" height="35"&gt;&lt;/a&gt;
&lt;a href="https://linkedin.com/in/kunaldaskd"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U70a7xDy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/ktIHVxm.png" alt="kunaldaskd" width="35" height="35"&gt;&lt;/a&gt;
&lt;/p&gt;

</description>
      <category>azure</category>
      <category>docker</category>
      <category>kubernetes</category>
    </item>
    <item>
      <title>Seamless Integration and Deployment of Azure Data Factory by using Azure DevOps</title>
      <dc:creator>Kunal Das</dc:creator>
      <pubDate>Sun, 21 Jan 2024 12:36:05 +0000</pubDate>
      <link>https://dev.to/kunaldas/seamless-integration-and-deployment-of-azure-data-factory-by-using-azure-devops-4ia0</link>
      <guid>https://dev.to/kunaldas/seamless-integration-and-deployment-of-azure-data-factory-by-using-azure-devops-4ia0</guid>
      <description>&lt;h1&gt;
  
  
  Seamless Integration and Deployment of Azure Data Factory by using Azure DevOps
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" alt="Kunal Das, Author" width="44" height="44"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Reach me at: &lt;a href="https://heylink.me/kunaldas"&gt;https://heylink.me/kunaldas&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
Seamless Integration and Deployment of Azure Data Factory by using Azure DevOps

&lt;ul&gt;
&lt;li&gt;Introduction&lt;/li&gt;
&lt;li&gt;Embracing Source Control in ADF&lt;/li&gt;
&lt;li&gt;Advantages of Git Integration with Azure Data Factory&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Architecture Flow diagram&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Connecting to a Git Repository&lt;/li&gt;
&lt;li&gt;Implementing the Pipeline Template&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Azure Data Factory (ADF) offers a robust platform for data integration and transformation, and when combined with continuous integration and delivery (CI/CD), it becomes a powerhouse. CI/CD in ADF context is the seamless transition of data pipelines across different environments like development, testing, and production. ADF leverages Azure Resource Manager (ARM) templates to encapsulate the configurations of its entities, such as pipelines, datasets, and data flows. There are primarily two recommended ways to transition a data factory across environments:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Leveraging Azure Pipelines for an automated deployment.&lt;/li&gt;
&lt;li&gt; Manually uploading an ARM template through the Data Factory user interface integrated with Azure Resource Manager.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Embracing Source Control in ADF
&lt;/h2&gt;

&lt;p&gt;Azure Data Factory’s integration with ARM templates facilitates the deployment of pipelines. Notably, there’s a distinct ADF Publish branch and a collaboration branch.&lt;/p&gt;

&lt;p&gt;Steps for Integrating Source Control with Branching Strategy:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Initialize the Git Repository: Start by initializing a single Git repository. This repository will house multiple ADF configurations, each tailored for specific pipelines.&lt;/li&gt;
&lt;li&gt; Branching Blueprint: Designate a unique branch for every ADF to act as its collaboration branch. This setup leads to distinct folders under the &lt;code&gt;adf_publish&lt;/code&gt; branch. It's crucial to have separate development branches so individual features can be developed, tested, and deployed in isolation. Note that we may not use the &lt;code&gt;adf_publish&lt;/code&gt; branch at all, since we will use the automated ARM template publishing method.&lt;/li&gt;
&lt;li&gt; Integrate Development Branches with Source Control: Only link the development branches with source control. This ensures continuous validation and checking of the code during its development phase. Keeping UAT/Production deployments separate ensures a clear demarcation between development and deployment phases.&lt;/li&gt;
&lt;li&gt; Pipeline Deployment: Utilize the ARM templates produced by ADF to deploy your pipelines, ensuring a uniform deployment process.&lt;/li&gt;
&lt;li&gt; Final Integration: Post thorough testing, merge the feature branches with the collaboration branch. The final version in the collaboration branch should be the one deployed to production.&lt;/li&gt;
&lt;/ol&gt;
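&lt;p&gt;The branching blueprint above can be sketched with plain git commands; the repository, branch, and folder names here are purely illustrative:&lt;/p&gt;

```shell
# One repo, a collaboration branch per data factory, and a short-lived
# feature branch that is merged back after testing (all names illustrative).
set -e
cd "$(mktemp -d)"
git init -q adf-pipelines
cd adf-pipelines
git config user.name "Demo"
git config user.email "demo@example.com"
git commit -q --allow-empty -m "initial commit"
git branch adf-sales-collab                               # collaboration branch for one ADF
git checkout -q -b feature/new-dataset adf-sales-collab   # per-feature development branch
mkdir -p adf-sales/dataset
echo '{}' > adf-sales/dataset/orders.json
git add .
git commit -q -m "Add orders dataset"
git checkout -q adf-sales-collab
git merge -q --ff-only feature/new-dataset                # final integration after testing
git log --oneline
```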

&lt;h2&gt;
  
  
  Advantages of Git Integration with Azure Data Factory
&lt;/h2&gt;

&lt;p&gt;1. &lt;strong&gt;Enhanced Source Control:&lt;/strong&gt; As ADF tasks become increasingly critical, it’s essential to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Seamlessly track and audit changes.&lt;/li&gt;
&lt;li&gt;  Effortlessly revert unwanted modifications.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;2. &lt;strong&gt;Flexible Drafting:&lt;/strong&gt; Unlike direct authoring, which mandates validation for every save, Git integration allows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Drafting or partial saves.&lt;/li&gt;
&lt;li&gt;  Incremental modifications without validation, ensuring only thoroughly tested changes are published.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;3. &lt;strong&gt;Collaborative Environment &amp;amp; Role-Based Access:&lt;/strong&gt; Git facilitates:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Collaborative code reviews.&lt;/li&gt;
&lt;li&gt;  Differentiated permissions, dictating who can edit via Git and who has publishing rights.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;4. &lt;strong&gt;Streamlined CI/CD Process:&lt;/strong&gt; Git aids in:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Automating release pipelines upon changes.&lt;/li&gt;
&lt;li&gt;  Customizing ARM template properties for cleaner configuration management.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;5. &lt;strong&gt;Boosted Performance:&lt;/strong&gt; ADFs integrated with Git load significantly faster, up to 10 times quicker, thanks to more efficient resource downloading.&lt;/p&gt;

&lt;p&gt;It’s worth noting that direct authoring in the Azure Data Factory UI becomes disabled once a Git repository is integrated. However, modifications made via PowerShell or SDK are directly published to the Data Factory service, bypassing Git.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Architecture Flow diagram&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--G9kwmz4f--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Az3Maq-fWswhWLoPDNva7fA.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--G9kwmz4f--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Az3Maq-fWswhWLoPDNva7fA.gif" alt="" width="800" height="614"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Connecting to a Git Repository
&lt;/h2&gt;

&lt;p&gt;Configuration using Management Hub:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Navigate to the management hub within the ADF UI.&lt;/li&gt;
&lt;li&gt;  Under the Source control section, select Git configuration.&lt;/li&gt;
&lt;li&gt;  If no repository is linked, click on Configure.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;When setting up Git in the Azure Portal, certain settings, such as the project name and repository name, must be entered manually.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--w6_baj9q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Af5Qo3S3JmR4vclpCLWMQWA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--w6_baj9q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Af5Qo3S3JmR4vclpCLWMQWA.png" alt="" width="700" height="305"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Azure Repos Settings:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The configuration pane will display various Azure Repos code repository settings. These settings are essential to apply the CI-CD template. For instance:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Repository Type: Specifies the type of the Azure Repos code repository.&lt;/li&gt;
&lt;li&gt;  Azure Active Directory: Your Azure AD tenant name.&lt;/li&gt;
&lt;li&gt;  Azure Repos Organization: Your Azure Repos organization name.&lt;/li&gt;
&lt;li&gt;  Project Name: Your Azure Repos project name.&lt;/li&gt;
&lt;li&gt;  Repository Name: Your Azure Repos code repository name.&lt;/li&gt;
&lt;li&gt;  Collaboration Branch: Your Azure Repos collaboration branch used for publishing.&lt;/li&gt;
&lt;li&gt;  Publish Branch: The branch where ARM templates related to publishing are stored.&lt;/li&gt;
&lt;li&gt;  Root Folder: Your root folder in your Azure Repos collaboration branch.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Ensure you enable the option in the management hub to include global parameters in the ARM template if you have declared any global parameters.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Parameter Definitions: Allows configuration without diving deep into the script.&lt;/span&gt;
&lt;span class="na"&gt;parameters&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;envTarget&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Deployment&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Environment'&lt;/span&gt;
  &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;string&lt;/span&gt;
  &lt;span class="na"&gt;values&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;Stage&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;Prod&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;azureDFName&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Azure&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Data&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Factory&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Identifier'&lt;/span&gt;
  &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;string&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;gitADFPath&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Git&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Path&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;for&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;ADF&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Publishing'&lt;/span&gt;
  &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;string&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;azureRegion&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Azure&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Deployment&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Region'&lt;/span&gt;
  &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;string&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;azureResourceGroup&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Resource&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Group&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;in&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Azure'&lt;/span&gt;
  &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;string&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;azureSubID&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Azure&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Subscription&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Identifier'&lt;/span&gt;
  &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;string&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;azureRMConnectionName&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Azure&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Resource&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Manager&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Connection&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Identifier'&lt;/span&gt;
  &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;string&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;sourceDFName&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Source&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Data&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Factory&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;for&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;ARM&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Template'&lt;/span&gt;
  &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;string&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;targetDFName&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Target&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Data&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Factory&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;for&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Deployment'&lt;/span&gt;
  &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;string&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;modifyGlobalParams&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Modify&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Global&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Parameters'&lt;/span&gt;
  &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;boolean&lt;/span&gt;
  &lt;span class="na"&gt;default&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;

&lt;span class="c1"&gt;# Build Phase: Validate and Create ARM templates for Data Factory using npm.&lt;/span&gt;
&lt;span class="na"&gt;stages&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;stage&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Construct&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Compile&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;and&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Confirm'&lt;/span&gt;
  &lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;job&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;CompileAndCheck&lt;/span&gt;
    &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Compile&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;and&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Confirm&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Azure&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Data&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Factory'&lt;/span&gt;
    &lt;span class="na"&gt;pool&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;vmImage&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ubuntu-latest'&lt;/span&gt;

    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="c1"&gt;# Set up Node.js for npm tasks.&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;NodeTool@0&lt;/span&gt;
        &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;versionSpec&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;14.x'&lt;/span&gt;
        &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Set&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;up&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Node.js'&lt;/span&gt;

      &lt;span class="c1"&gt;# Set up required npm packages for Data Factory.&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Npm@1&lt;/span&gt;
        &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;command&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;install'&lt;/span&gt;
          &lt;span class="na"&gt;verbose&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
          &lt;span class="na"&gt;workingDir&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(Build.Repository.LocalPath)/data-factory/'&lt;/span&gt;
        &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Set&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;up&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;npm&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;modules'&lt;/span&gt;

      &lt;span class="c1"&gt;# Confirm Data Factory setup.&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Npm@1&lt;/span&gt;
        &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;command&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;custom'&lt;/span&gt;
          &lt;span class="na"&gt;workingDir&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(Build.Repository.LocalPath)/data-factory/'&lt;/span&gt;
          &lt;span class="na"&gt;customCommand&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;run&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;build&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;validate&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;$(Build.Repository.LocalPath)/data-factory&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;/subscriptions/$(azureSubID)/resourceGroups/$(azureResourceGroup)/providers/Microsoft.DataFactory/factories/$(azureDFName)'&lt;/span&gt;
        &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Confirm&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Data&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Factory&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Setup'&lt;/span&gt;

      &lt;span class="c1"&gt;# Create ARM template for Data Factory.&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Npm@1&lt;/span&gt;
        &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;command&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;custom'&lt;/span&gt;
          &lt;span class="na"&gt;workingDir&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(Build.Repository.LocalPath)/data-factory/'&lt;/span&gt;
          &lt;span class="na"&gt;customCommand&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;run&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;build&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;export&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;$(Build.Repository.LocalPath)/data-factory&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;/subscriptions/$(azureSubID)/resourceGroups/$(azureResourceGroup)/providers/Microsoft.DataFactory/factories/$(azureDFName)&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;"ArmTemplate"'&lt;/span&gt;
        &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Create&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;ARM&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;template'&lt;/span&gt;

      &lt;span class="c1"&gt;# Share the created ARM template for later stages.&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PublishPipelineArtifact@1&lt;/span&gt;
        &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;targetPath&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(Build.Repository.LocalPath)/data-factory/ArmTemplate'&lt;/span&gt;
          &lt;span class="na"&gt;artifact&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ArmTemplateArtifact'&lt;/span&gt;
          &lt;span class="na"&gt;publishLocation&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;pipeline'&lt;/span&gt;
        &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Share&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;ARM&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;template'&lt;/span&gt;

&lt;span class="c1"&gt;# Deployment Phase: Deploy the Data Factory using ARM template.&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;stage&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DeployPhase&lt;/span&gt;
  &lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;deployment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DeployToTarget&lt;/span&gt;
    &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Deploy&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;to&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;parameters.envTarget&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;|&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;ADF:&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;parameters.azureDFName&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}'&lt;/span&gt;
    &lt;span class="na"&gt;dependsOn&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Construct&lt;/span&gt;
    &lt;span class="na"&gt;condition&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;succeeded()&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ parameters.envTarget }}&lt;/span&gt;
    &lt;span class="na"&gt;pool&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;vmImage&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ubuntu-latest'&lt;/span&gt;
    &lt;span class="na"&gt;strategy&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;runOnce&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;deploy&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
            &lt;span class="c1"&gt;# Skip repo checkout for faster deployment.&lt;/span&gt;
            &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;checkout&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;none&lt;/span&gt;

            &lt;span class="c1"&gt;# Retrieve the ARM template from the build phase.&lt;/span&gt;
            &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DownloadPipelineArtifact@2&lt;/span&gt;
              &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
                &lt;span class="na"&gt;buildType&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;current'&lt;/span&gt;
                &lt;span class="na"&gt;artifactName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ArmTemplateArtifact'&lt;/span&gt;
                &lt;span class="na"&gt;targetPath&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(Pipeline.Workspace)'&lt;/span&gt;
              &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Retrieve&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;ARM&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;template"&lt;/span&gt;

            &lt;span class="c1"&gt;# Optionally modify global parameters if needed.&lt;/span&gt;
            &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;${{ if eq(parameters.modifyGlobalParams, &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="s"&gt;) }}&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt;
              &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AzurePowerShell@5&lt;/span&gt;
                &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;(Optional)&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Modify&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Global&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Parameters'&lt;/span&gt;
                &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
                  &lt;span class="na"&gt;azureSubscription&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ parameters.azureRMConnectionName }}&lt;/span&gt;
                  &lt;span class="na"&gt;azurePowerShellVersion&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;LatestVersion'&lt;/span&gt;
                  &lt;span class="na"&gt;ScriptType&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;FilePath'&lt;/span&gt;
                  &lt;span class="na"&gt;ScriptPath&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(Pipeline.Workspace)/GlobalParametersUpdateScript.ps1'&lt;/span&gt;
                  &lt;span class="na"&gt;ScriptArguments&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;-globalParametersFilePath&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;"$(Pipeline.Workspace)/*_GlobalParameters.json"&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-resourceGroupName&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;"${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;parameters.azureResourceGroup&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}"&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-dataFactoryName&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;"${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;parameters.sourceDFName&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}"'&lt;/span&gt;

            &lt;span class="c1"&gt;# Deactivate ADF Triggers after deployment.&lt;/span&gt;
            &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;toggle-adf-trigger@2&lt;/span&gt;
              &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
                &lt;span class="na"&gt;azureSubscription&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ parameters.azureRMConnectionName }}&lt;/span&gt;
                &lt;span class="na"&gt;ResourceGroupName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ parameters.azureResourceGroup }}&lt;/span&gt;
                &lt;span class="na"&gt;DatafactoryName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ parameters.targetDFName }}&lt;/span&gt;
                &lt;span class="na"&gt;TriggerStatus&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;stop'&lt;/span&gt;

            &lt;span class="c1"&gt;# Deploy using the ARM template. Override source ADF name with target ADF name.&lt;/span&gt;
            &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AzureResourceManagerTemplateDeployment@3&lt;/span&gt;
              &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Deploy&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;using&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;ARM&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Template'&lt;/span&gt;
              &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
                &lt;span class="na"&gt;azureResourceManagerConnection&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ parameters.azureRMConnectionName }}&lt;/span&gt;
                &lt;span class="na"&gt;subscriptionId&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ parameters.azureSubID }}&lt;/span&gt;
                &lt;span class="na"&gt;resourceGroupName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ parameters.azureResourceGroup }}&lt;/span&gt;
                &lt;span class="na"&gt;location&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ parameters.azureRegion }}&lt;/span&gt;
                &lt;span class="na"&gt;csmFile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(Pipeline.Workspace)/ARMTemplateForFactory.json'&lt;/span&gt;
                &lt;span class="na"&gt;csmParametersFile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(Pipeline.Workspace)/ARMTemplateParametersForFactory.json'&lt;/span&gt;
                &lt;span class="na"&gt;overrideParameters&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;-factoryName&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;"${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;parameters.targetDFName&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}"'&lt;/span&gt;

            &lt;span class="c1"&gt;# Activate ADF Triggers after deployment.&lt;/span&gt;
            &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;toggle-adf-trigger@2&lt;/span&gt;
              &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
                &lt;span class="na"&gt;azureSubscription&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ parameters.azureRMConnectionName }}&lt;/span&gt;
                &lt;span class="na"&gt;ResourceGroupName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ parameters.azureResourceGroup }}&lt;/span&gt;
                &lt;span class="na"&gt;DatafactoryName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ parameters.targetDFName }}&lt;/span&gt;
                &lt;span class="na"&gt;TriggerStatus&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;start'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Implementing the Pipeline Template
&lt;/h2&gt;

&lt;p&gt;This guide provides a comprehensive walkthrough on setting up and utilizing the Azure Data Factory CI/CD pipeline as defined in the YAML file. The pipeline streamlines the build and deployment of ADF artifacts to designated environments.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Azure Data Factory Instance: An active ADF instance in Azure.&lt;/li&gt;
&lt;li&gt;  Azure DevOps: The YAML is tailored for Azure DevOps. Ensure you have an active Azure DevOps organization and project.&lt;/li&gt;
&lt;li&gt;  Azure DevOps Agent: The pipeline employs the &lt;code&gt;ubuntu-latest&lt;/code&gt; VM image.&lt;/li&gt;
&lt;li&gt;  Node.js: The initial stage requires Node.js.&lt;/li&gt;
&lt;li&gt;  Azure Subscription: Necessary permissions on your Azure subscription are required.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To download the necessary npm packages, keep a &lt;code&gt;package.json&lt;/code&gt; file in the parent directory:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "scripts":{
        "build":"node node_modules/@microsoft/azure-data-factory-utilities/lib/index"
    },
    "dependencies":{
        "@microsoft/azure-data-factory-utilities":"^1.0.0"
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
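&lt;p&gt;The pipeline's custom npm tasks pass the Data Factory's full Azure resource ID as an argument. As a sketch, here are the local equivalents of the validate and export steps (the subscription ID, resource group, and factory name below are placeholders, not real values):&lt;/p&gt;

```shell
# Build the factory resource ID that the pipeline hands to the npm tasks.
# All values are placeholder assumptions; substitute your own.
SUB_ID="00000000-0000-0000-0000-000000000000"
RG="my-resource-group"
DF_NAME="my-data-factory"
FACTORY_ID="/subscriptions/${SUB_ID}/resourceGroups/${RG}/providers/Microsoft.DataFactory/factories/${DF_NAME}"

# Local equivalents of the pipeline's validate and export steps,
# run from the folder containing package.json:
#   npm run build validate . "$FACTORY_ID"
#   npm run build export . "$FACTORY_ID" "ArmTemplate"
echo "$FACTORY_ID"
```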



&lt;p&gt;&lt;strong&gt;Repository Structure:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The ADF code should be housed in a folder named &lt;code&gt;data-factory&lt;/code&gt; at the root of your repository, matching the &lt;code&gt;workingDir&lt;/code&gt; used by the pipeline's npm tasks.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Setup &amp;amp; Execution:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Azure Service Connection: Establish a service connection in Azure DevOps linked to your Azure subscription.&lt;/li&gt;
&lt;li&gt; GlobalParametersUpdateScript.ps1: If the &lt;code&gt;modifyGlobalParams&lt;/code&gt; flag is set to true, ensure a PowerShell script named &lt;code&gt;GlobalParametersUpdateScript.ps1&lt;/code&gt; is available to the deployment stage at &lt;code&gt;$(Pipeline.Workspace)&lt;/code&gt;, as referenced by the task's &lt;code&gt;ScriptPath&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt; Using the Pipeline: Upload the YAML file to your Azure DevOps repository, create a new pipeline, fill in the parameters, trigger the pipeline, and monitor the build and deployment.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Running the Pipeline&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Push some changes to the &lt;code&gt;development&lt;/code&gt; branch, and the pipeline will be triggered automatically.&lt;/p&gt;
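&lt;p&gt;The trigger section is not shown in the YAML excerpt above; assuming the pipeline is wired to the &lt;code&gt;development&lt;/code&gt; branch, its trigger block would look something like this:&lt;/p&gt;

```yaml
# Hypothetical trigger block: runs the pipeline on pushes to development.
trigger:
  branches:
    include:
      - development
```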

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wx_PI0sj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:378/1%2AMnhcym8CgT0JHrUtyJ3Gyg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wx_PI0sj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:378/1%2AMnhcym8CgT0JHrUtyJ3Gyg.png" alt="" width="378" height="555"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the pipeline finishes, check the artifact that was published.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--a_3QI8iy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:251/1%2A81EyoKuXbjacpHwoUPn--Q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--a_3QI8iy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:251/1%2A81EyoKuXbjacpHwoUPn--Q.png" alt="" width="251" height="66"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Similarly, once the pipeline gets approval for deployment, it deploys the updated template to production.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NweNGyzv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:328/1%2A1YrpZ6Xhv8BGBuERjN1k2g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NweNGyzv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:328/1%2A1YrpZ6Xhv8BGBuERjN1k2g.png" alt="" width="328" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Modifications &amp;amp; Best Practices:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Global Parameter Update: Adjust the logic in the optional global parameter update task if needed.&lt;/li&gt;
&lt;li&gt;  Additional Tasks: Insert any extra tasks within the steps section of the stages.&lt;/li&gt;
&lt;li&gt;  Permissions: Not every team member should have update permissions. Implement a system where only a select few can publish to the Data Factory.&lt;/li&gt;
&lt;li&gt;  Using Azure Key Vault: For security, store connection strings or passwords in Azure Key Vault or use managed identity authentication for ADF Linked Services.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In conclusion, integrating CI/CD with Azure Data Factory not only streamlines the deployment process but also enhances collaboration, auditing, and overall efficiency. With the right setup and best practices, teams can ensure seamless and error-free deployments.&lt;/p&gt;

&lt;h2&gt;
  
  
  Read my blogs:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://kunaldaskd.medium.com"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FrhCZTrV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/TgYYM9w.png" alt="Medium Logo" width="62" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.to/kunaldas"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rhp9uSXy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/bp3qHWb.png" alt="Dev.to Logo" width="35" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://kunaldas.hashnode.dev"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_veP3uic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/iwZwo2S.png" alt="Hashnode Logo" width="204" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Connect with Me:
&lt;/h2&gt;

&lt;p&gt;
&lt;a href="https://twitter.com/kunald_official"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--o4A7Waxi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/VaorXDP.png" alt="kunald_official" width="43" height="35"&gt;&lt;/a&gt;
&lt;a href="https://linkedin.com/in/kunaldaskd"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U70a7xDy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/ktIHVxm.png" alt="kunaldaskd" width="35" height="35"&gt;&lt;/a&gt;
&lt;/p&gt;

</description>
      <category>azure</category>
      <category>cloud</category>
      <category>devops</category>
    </item>
    <item>
      <title>Mastering Kubernetes A Deep Dive into Cluster Management Tools Every DevOps Engineer Should Know</title>
      <dc:creator>Kunal Das</dc:creator>
      <pubDate>Sun, 21 Jan 2024 12:36:04 +0000</pubDate>
      <link>https://dev.to/kunaldas/mastering-kubernetes-a-deep-dive-into-cluster-management-tools-every-devops-engineer-should-know-371l</link>
      <guid>https://dev.to/kunaldas/mastering-kubernetes-a-deep-dive-into-cluster-management-tools-every-devops-engineer-should-know-371l</guid>
      <description>&lt;h1&gt;
  
  
  Mastering Kubernetes: A Deep Dive into Cluster Management Tools Every DevOps Engineer Should Know
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" alt="Kunal Das, Author" width="44" height="44"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Reach me at: &lt;a href="https://heylink.me/kunaldas"&gt;https://heylink.me/kunaldas&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1jStNhAS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:608/1%2AXwnKb_efOFg5HtGeGjIGjg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1jStNhAS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:608/1%2AXwnKb_efOFg5HtGeGjIGjg.png" alt="" width="608" height="366"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
Mastering Kubernetes: A Deep Dive into Cluster Management Tools Every DevOps Engineer Should Know

&lt;ul&gt;
&lt;li&gt;1. Kustomize&lt;/li&gt;
&lt;li&gt;2. k9s&lt;/li&gt;
&lt;li&gt;3. Kudo&lt;/li&gt;
&lt;li&gt;4. node-problem-detector&lt;/li&gt;
&lt;li&gt;5. k0s&lt;/li&gt;
&lt;li&gt;6. Helm&lt;/li&gt;
&lt;li&gt;7. ClusterPedia&lt;/li&gt;
&lt;li&gt;8. KEDA&lt;/li&gt;
&lt;li&gt;9. kubectl snapshot&lt;/li&gt;
&lt;li&gt;10. Cert-manager&lt;/li&gt;
&lt;li&gt;11. Prometheus&lt;/li&gt;
&lt;li&gt;12. Metalk8s&lt;/li&gt;
&lt;li&gt;13. Kube-ops-view&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Kubernetes, often abbreviated as K8s, has revolutionized the world of container orchestration. As its popularity grows, so does the ecosystem around it, offering a myriad of tools designed to simplify, enhance, and optimize the Kubernetes experience. For DevOps engineers, understanding these tools is paramount. In this comprehensive guide, we’ll explore some of the most promising cluster management tools that can elevate your Kubernetes game.&lt;/p&gt;

&lt;h2&gt;
  
  
  1. Kustomize
&lt;/h2&gt;

&lt;p&gt;Kustomize is not just another configuration management tool. It stands out by allowing Kubernetes-native applications to be customized without the need for templates. This means you can manage variations of Kubernetes YAML configurations without diving into complex templating engines.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://kubectl.docs.kubernetes.io/guides/introduction/kustomize/"&gt;Learn more about Kustomizer&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  2. k9s
&lt;/h2&gt;

&lt;p&gt;Imagine having a bird’s-eye view of your Kubernetes clusters right from your terminal. k9s offers a terminal-based UI that provides real-time insights into cluster activities and resources. It’s like having a dashboard but with the power of the command line.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://k9scli.io/"&gt;Explore k9s&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  3. Kudo
&lt;/h2&gt;

&lt;p&gt;Building Kubernetes Operators can be challenging. Enter KUDO — the Kubernetes Universal Declarative Operator. It’s an open-source toolkit that simplifies the creation of Operators using a declarative approach, making the management of stateful applications on Kubernetes a breeze.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://kudo.dev/"&gt;Discover Kudo&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  4. node-problem-detector
&lt;/h2&gt;

&lt;p&gt;Ensuring the health of your nodes is crucial. The node-problem-detector tool identifies common node issues, bridging the gap between the kernel and cluster management layers. It’s like having a health check-up for your nodes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/kubernetes/node-problem-detector"&gt;Check out node-problem-detector&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  5. k0s
&lt;/h2&gt;

&lt;p&gt;In environments where resources are limited, such as edge computing or IoT, k0s shines. It’s a lightweight, certified Kubernetes distribution tailored for such scenarios, ensuring you don’t compromise on performance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://k0sproject.io/"&gt;Dive into k0s&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  6. Helm
&lt;/h2&gt;

&lt;p&gt;Helm is often dubbed the “package manager for Kubernetes.” It allows users to define, install, and upgrade even the most complex Kubernetes applications using charts, which are packages of pre-configured Kubernetes resources.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://helm.sh/"&gt;Discover Helm&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  7. ClusterPedia
&lt;/h2&gt;

&lt;p&gt;Searching and managing resources across clusters can be daunting. ClusterPedia, a unified search platform for Kubernetes clusters, streamlines this process, making resource management efficient and hassle-free.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://chat.openai.com/?model=gpt-4-plugins#"&gt;Learn about ClusterPedia&lt;/a&gt; &lt;em&gt;(Note: Link to be updated when available)&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  8. KEDA
&lt;/h2&gt;

&lt;p&gt;Event-driven architectures are gaining traction. KEDA (Kubernetes-based Event-Driven Autoscaling) is a component that brings event-driven autoscaling to your Kubernetes applications, ensuring they scale based on real-time demand.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://keda.sh/"&gt;Explore KEDA&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  9. kubectl snapshot
&lt;/h2&gt;

&lt;p&gt;Documentation and debugging are made easier with kubectl snapshot. This tool captures the current state of a Kubernetes cluster, providing a snapshot that can be analyzed or shared.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://chat.openai.com/?model=gpt-4-plugins#"&gt;Discover kubectl snapshot&lt;/a&gt; &lt;em&gt;(Note: Link to be updated when available)&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  10. Cert-manager
&lt;/h2&gt;

&lt;p&gt;Security is paramount. Cert-manager steps in as a native Kubernetes certificate management controller, automating the issuance and renewal of certificates from various sources, ensuring your applications remain secure.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cert-manager.io/"&gt;Check out Cert-manager&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  11. Prometheus
&lt;/h2&gt;

&lt;p&gt;Monitoring is crucial in a Kubernetes environment. Prometheus, an open-source monitoring and alerting toolkit, integrates seamlessly with Kubernetes, providing insights into your clusters and applications.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://prometheus.io/"&gt;Learn more about Prometheus&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  12. Metalk8s
&lt;/h2&gt;

&lt;p&gt;For those focusing on long-term on-prem deployments, especially on bare-metal machines, Metalk8s is a go-to Kubernetes distribution. It’s opinionated, ensuring stability and performance in such specific scenarios.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://metal-k8s.readthedocs.io/"&gt;Dive into Metalk8s&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  13. Kube-ops-view
&lt;/h2&gt;

&lt;p&gt;Visual representation can simplify complex operations. Kube-ops-view offers a read-only system dashboard for multiple K8s clusters, providing a graphical overview of cluster operations.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/hjacobs/kube-ops-view"&gt;Explore Kube-ops-view&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion: The Kubernetes ecosystem is vast and ever-evolving. For DevOps engineers, staying updated with the right tools can make the difference between a smoothly running cluster and a chaotic environment. This guide provides a starting point, but always evaluate tools against your unique needs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Comment with the tools you use; I will keep updating this list whenever I find something cool.&lt;br&gt;&lt;br&gt;
Stay tuned!&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Read my blogs:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://kunaldaskd.medium.com"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FrhCZTrV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/TgYYM9w.png" alt="Medium Logo" width="62" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.to/kunaldas"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rhp9uSXy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/bp3qHWb.png" alt="Dev.to Logo" width="35" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://kunaldas.hashnode.dev"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_veP3uic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/iwZwo2S.png" alt="Hashnode Logo" width="204" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Connect with Me:
&lt;/h2&gt;

&lt;p&gt;
&lt;a href="https://twitter.com/kunald_official"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--o4A7Waxi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/VaorXDP.png" alt="kunald_official" width="43" height="35"&gt;&lt;/a&gt;
&lt;a href="https://linkedin.com/in/kunaldaskd"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U70a7xDy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/ktIHVxm.png" alt="kunaldaskd" width="35" height="35"&gt;&lt;/a&gt;
&lt;/p&gt;

</description>
      <category>azure</category>
      <category>docker</category>
      <category>kubernetes</category>
    </item>
    <item>
      <title>COST ESTIMATION FOR INFRASTRUCTURE</title>
      <dc:creator>Kunal Das</dc:creator>
      <pubDate>Sun, 21 Jan 2024 12:36:03 +0000</pubDate>
      <link>https://dev.to/kunaldas/cost-estimation-for-infrastructure-44lo</link>
      <guid>https://dev.to/kunaldas/cost-estimation-for-infrastructure-44lo</guid>
      <description>&lt;h1&gt;
  
  
  COST ESTIMATION FOR INFRASTRUCTURE
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" alt="Kunal Das, Author" width="44" height="44"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Reach me at: &lt;a href="https://heylink.me/kunaldas"&gt;https://heylink.me/kunaldas&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
COST ESTIMATION FOR INFRASTRUCTURE

&lt;ul&gt;
&lt;li&gt;1. Design Goals&lt;/li&gt;
&lt;li&gt;1.1. Cost estimation&lt;/li&gt;
&lt;li&gt;1.2. Options available&lt;/li&gt;
&lt;li&gt;1.3. Available tools&lt;/li&gt;
&lt;li&gt;2. Infracost&lt;/li&gt;
&lt;li&gt;2.1. Overview&lt;/li&gt;
&lt;li&gt;2.2. Advantages&lt;/li&gt;
&lt;li&gt;2.3 Limitations&lt;/li&gt;
&lt;li&gt;2.4 Use cases&lt;/li&gt;
&lt;li&gt;2.5 Impact analysis&lt;/li&gt;
&lt;li&gt;3. Workflow&lt;/li&gt;
&lt;li&gt;3.1 Basic workflow&lt;/li&gt;
&lt;li&gt;3.2 DevOps workflow&lt;/li&gt;
&lt;li&gt;3.2.1 Azure DevOps&lt;/li&gt;
&lt;li&gt;3.2.2 Jenkins&lt;/li&gt;
&lt;li&gt;4. Implementation&lt;/li&gt;
&lt;li&gt;4.1 Steps for Azure DevOps&lt;/li&gt;
&lt;li&gt;4.1.1 Installing extension&lt;/li&gt;
&lt;li&gt;4.1.2 Making the repo ready&lt;/li&gt;
&lt;li&gt;4.1.3 Adding Infracost code in pipeline&lt;/li&gt;
&lt;li&gt;4.1.4 Additional settings&lt;/li&gt;
&lt;li&gt;4.1.5 Creating a Pull Request&lt;/li&gt;
&lt;li&gt;4.1.6 Viewing the dashboard&lt;/li&gt;
&lt;li&gt;4.1.7 Dashboard use cases&lt;/li&gt;
&lt;li&gt;5. Navigation&lt;/li&gt;
&lt;li&gt;5.1 PR cost difference&lt;/li&gt;
&lt;li&gt;5.2 Detailed Cost estimation&lt;/li&gt;
&lt;li&gt;6. Diagram&lt;/li&gt;
&lt;li&gt;6.1 New Branch&lt;/li&gt;
&lt;li&gt;6.2 Making changes&lt;/li&gt;
&lt;li&gt;6.3 Pull Request&lt;/li&gt;
&lt;li&gt;6.4 Release&lt;/li&gt;
&lt;li&gt;7. Scope for improvement&lt;/li&gt;
&lt;li&gt;8. Cost of Implementation&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  1. Design Goals
&lt;/h2&gt;

&lt;h2&gt;
  
  
  1.1. Cost estimation
&lt;/h2&gt;

&lt;p&gt;Cost estimation is the process of predicting the cost of a project, product, or service. It involves identifying all the resources that will be needed and determining the associated costs to arrive at a total estimate. Cost estimation is a key part of project management and helps stakeholders understand the financial implications of their decisions. It can also help organizations make informed decisions about whether to pursue a project or service and how to allocate budgets for the best return on investment. There are many factors that can affect the cost of a project, including the complexity of the work, the skills and expertise of the team, and the availability of resources.&lt;/p&gt;

&lt;h2&gt;
  
  
  1.2. Options available
&lt;/h2&gt;

&lt;p&gt;There are several options for cost estimation in the cloud:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Cost calculators: Most cloud providers offer cost calculators that allow you to estimate the cost of various cloud resources based on your usage patterns and requirements. These calculators can help you compare the cost of different options and choose the most cost-effective solution.&lt;/li&gt;
&lt;li&gt;Cost optimization tools: Many cloud providers offer tools and services to help organizations optimize their resource usage and costs. These tools can help identify cost drivers, forecast future costs, and recommend optimization strategies.&lt;/li&gt;
&lt;li&gt;Pricing plans: Many cloud providers offer a variety of pricing plans that provide different levels of resources and support at different price points. Carefully reviewing and comparing these plans can help you choose the one that best meets your needs and budget.&lt;/li&gt;
&lt;li&gt;Cost monitoring and management tools: A variety of tools and services can help you monitor your cloud resource usage and costs in real time. These tools can alert you to unexpected cost spikes and help you identify opportunities for cost optimization.&lt;/li&gt;
&lt;li&gt;Negotiating with vendors: Some cloud providers may be willing to negotiate pricing or offer discounts for large or long-term commitments. It may be worth negotiating with your provider to see if you can get a better deal on your cloud resources.&lt;/li&gt;
&lt;li&gt;Resource optimization: Identifying and eliminating unnecessary or underutilized resources can help reduce your overall cloud costs. This could include retiring or decommissioning old resources, optimizing resource allocation, and consolidating resources where possible.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  1.3. Available tools
&lt;/h2&gt;

&lt;p&gt;There are several tools available for cost estimation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Cost calculators: Many cloud providers offer cost calculators that allow you to estimate the cost of various cloud resources based on your usage patterns and requirements. These calculators can help you compare the cost of different options and choose the most cost-effective solution, e.g., the Total Cost of Ownership (TCO) Calculator in AWS, the Azure Pricing Calculator, and the Google Cloud Pricing Calculator.&lt;/li&gt;
&lt;li&gt;Cost monitoring and management tools: A variety of tools and services can help you monitor your cloud resource usage and costs in real time. These tools can alert you to unexpected cost spikes and help you identify opportunities for cost optimization.&lt;/li&gt;
&lt;li&gt;Project management software: Many project management software platforms offer cost estimation and budgeting tools to help organizations plan and track project costs.&lt;/li&gt;
&lt;li&gt;Specialized cost estimation software: There are also specialized software tools designed specifically for cost estimation. These often offer advanced features and capabilities, such as the ability to create detailed cost breakdowns and incorporate risk analysis.&lt;/li&gt;
&lt;li&gt;Infracost: Infracost is a tool that helps users optimize their costs when using Terraform for infrastructure management. It provides real-time cost estimates for infrastructure resources and suggests ways to reduce costs through optimization.&lt;/li&gt;
&lt;li&gt;TeraCost: TeraCost is a cloud cost management and optimization platform that helps organizations optimize their cloud costs by providing visibility into resource usage and costs, identifying opportunities for optimization, and recommending actions to reduce costs. It offers a range of features, including cost forecasting, budget tracking, and resource optimization recommendations, and is designed to work with a variety of cloud providers, including Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP).&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  2. Infracost
&lt;/h2&gt;

&lt;h2&gt;
  
  
  2.1. Overview
&lt;/h2&gt;

&lt;p&gt;Infracost is a tool that is specifically designed to optimize costs in Terraform environments. It works by analysing the resources that are being created in a Terraform configuration and providing recommendations for cost optimization. This can include suggestions for using lower cost resource types, resizing resources to a more cost-effective size, or identifying resources that are no longer needed and can be safely deleted. Infracost also provides real-time cost estimates for the resources in a Terraform configuration, which can help teams make more informed decisions about the cost of their infrastructure. Overall, Infracost is a powerful tool that can help teams save money on their infrastructure costs by identifying opportunities for cost optimization and providing actionable recommendations for implementing those optimizations.&lt;/p&gt;

&lt;h2&gt;
  
  
  2.2. Advantages
&lt;/h2&gt;

&lt;p&gt;Infracost is a tool that can be used for cost optimization in Terraform, a popular Infrastructure as Code (IaC) tool. It provides several advantages over other cost optimization tools available for Terraform:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Real-time cost estimation: Infracost provides real-time cost estimates for all the resources in your Terraform code, so you can make informed decisions about your infrastructure costs.&lt;/li&gt;
&lt;li&gt;Integration with Terraform: Infracost integrates seamlessly with Terraform, so you can see the cost estimates for your infrastructure as you write your code. This helps you avoid costly mistakes and optimize your infrastructure costs from the start.&lt;/li&gt;
&lt;li&gt;Custom pricing: Infracost allows you to set custom pricing for your resources, so you can better reflect your organization’s negotiated rates or your unique usage patterns.&lt;/li&gt;
&lt;li&gt;Resource filtering: Infracost allows you to filter resources by tag, type, and name, making it easier to focus on specific resources and optimize their costs.&lt;/li&gt;
&lt;li&gt;Resource grouping: Infracost groups similar resources together, making it easier to compare costs and identify opportunities for optimization.&lt;/li&gt;
&lt;li&gt;Easy to use: Infracost is easy to use and requires minimal setup, so you can start optimizing your infrastructure costs quickly and easily.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Overall, Infracost provides a comprehensive and user-friendly solution for cost optimization in Terraform, making it an excellent choice for organizations looking to optimize their infrastructure costs.&lt;/p&gt;
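
&lt;p&gt;For usage-based resources, Infracost can also read a usage file that supplies your expected consumption; a minimal sketch (the resource address and the values are illustrative):&lt;/p&gt;

```yaml
# infracost-usage.yml, passed via: infracost breakdown --path . --usage-file infracost-usage.yml
version: 0.1
resource_usage:
  aws_lambda_function.my_fn:
    monthly_requests: 1000000   # expected invocations per month
    request_duration_ms: 250    # average duration per invocation
```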

&lt;h2&gt;
  
  
  2.3 Limitations
&lt;/h2&gt;

&lt;p&gt;Infracost is a useful tool for gaining visibility into the cost of your infrastructure and optimizing your spend on cloud resources. However, it does have some limitations to consider:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Limited to supported cloud providers: Infracost currently supports only a limited number of cloud providers, including AWS, Azure, and Google Cloud. If you are using a different cloud provider or a combination of multiple providers, Infracost may not be able to provide cost estimates for your infrastructure.&lt;/li&gt;
&lt;li&gt;Dependent on accurate resource pricing: Infracost relies on accurate resource pricing information from the supported cloud providers. If the pricing information is incorrect or out of date, Infracost cost estimates may not be accurate.&lt;/li&gt;
&lt;li&gt;Limited to resources supported by the cloud providers: Infracost can only provide cost estimates for resources that are supported by the cloud providers it integrates with. If you are using custom resources or resources from third-party providers, Infracost may not be able to provide cost estimates for them.&lt;/li&gt;
&lt;li&gt;Requires manual integration with infrastructure as code: While Infracost integrates with popular infrastructure-as-code tools such as Terraform and CloudFormation, it requires manual integration. This means you will need to set up Infracost and configure it to work with your infrastructure-as-code tools, which can be time-consuming and require some technical expertise.&lt;/li&gt;
&lt;li&gt;Limited to infrastructure costs: Infracost only provides estimates for the cost of your infrastructure resources, such as compute instances and storage. It does not include other costs such as data transfer fees or licensing costs for third-party software.&lt;/li&gt;
&lt;li&gt;Limited to current infrastructure: Infracost only provides cost estimates for your current infrastructure. It does not allow you to compare the cost of different infrastructure configurations or predict the cost of future infrastructure changes.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Overall, while Infracost is a useful tool for gaining visibility into the cost of your infrastructure and optimizing your spend on cloud resources, it is limited in its scope and may not be suitable for all organizations or infrastructure configurations.&lt;/p&gt;

&lt;h2&gt;
  
  
  2.4 Use cases
&lt;/h2&gt;

&lt;p&gt;Some possible use cases for Infracost include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Identifying and reducing over-provisioned resources: Infracost can help you identify resources that are over-provisioned, meaning they have more capacity than is necessary to meet your workload demands. By reducing the capacity of these resources, you can save on your cloud costs.&lt;/li&gt;
&lt;li&gt;Optimizing resource usage: Infracost can help you optimize resource usage by identifying underutilized resources and suggesting ways to improve their utilization. For example, if you have a virtual machine that is only running at 20% capacity, Infracost may suggest moving workloads to that machine to better utilize its capacity.&lt;/li&gt;
&lt;li&gt;Estimating costs for infrastructure changes: Infracost can be used to estimate the costs of making changes to your infrastructure, such as adding or removing resources. This can help you plan for changes and ensure that they are cost-effective.&lt;/li&gt;
&lt;li&gt;Analysing cloud cost trends: Infracost can provide detailed cost breakdowns and graphs that allow you to understand your cloud costs over time and identify trends that may impact your budget. This can help you plan and make decisions to optimize your costs.&lt;/li&gt;
&lt;li&gt;Auditing and compliance: Infracost can be used to audit your infrastructure and ensure that it complies with company policies or regulatory requirements. It can also help you identify resources that may be overpriced or misconfigured, allowing you to take corrective action.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  2.5 Impact analysis
&lt;/h2&gt;

&lt;p&gt;Infracost is a tool that allows users to perform cost analysis on their infrastructure-as-code (IaC) resources. This means that users can see the estimated cost of their resources before they are deployed, allowing them to make informed decisions about their infrastructure and optimize for cost. One key use case for Infracost is in the development phase, where users can use it to identify and address potential cost issues before they become a problem. It can also be used in the operations phase to monitor the ongoing cost of infrastructure and identify opportunities for optimization. By providing visibility into the cost of infrastructure, Infracost can help organizations make informed decisions about how to allocate their resources and stay within budget. Overall, the use of Infracost can have a significant impact on the cost efficiency of an organization’s infrastructure, helping to reduce waste and ensure that resources are being used effectively.&lt;/p&gt;

&lt;h2&gt;
  
  
  3. Workflow
&lt;/h2&gt;

&lt;h2&gt;
  
  
  3.1 Basic workflow
&lt;/h2&gt;

&lt;p&gt;The infracost workflow involves the following steps:&lt;/p&gt;

&lt;p&gt;Install the infracost CLI tool: Infracost can be installed as a standalone CLI tool. One way to install it is with the following command:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -sfL https://raw.githubusercontent.com/infracost/infracost/master/scripts/install.sh | sh
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Register an API key: Infracost uses a free API key to look up cloud prices. Run the following command and follow the prompts:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;infracost auth login
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;An optional configuration file (infracost.yml) can be added to your repository to describe multiple Terraform projects, which you can customize as per your needs.&lt;/p&gt;

&lt;p&gt;Add infracost to your Terraform workflow: run infracost alongside terraform plan or terraform apply. For example, run the following command to generate a cost estimate for the current configuration:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;infracost breakdown --path .
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Alternatively, run the following command to see how pending changes shift costs compared with what is currently deployed:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;infracost diff --path .
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Review the cost estimate: Infracost generates a cost estimate in the form of a table, showing the estimated monthly cost of each resource in your Terraform configuration. Review it and make any necessary changes to your configuration to optimize costs.&lt;/p&gt;

&lt;p&gt;Repeat the process: re-run infracost before or after applying your changes as many times as you like, until you are satisfied with the cost estimate.&lt;/p&gt;

&lt;p&gt;Use infracost with version control: you can use infracost with version control systems like Git to track changes to your cost estimates over time. This is useful for comparing the costs of different versions of your Terraform configuration and for identifying cost-saving opportunities.&lt;/p&gt;

&lt;p&gt;Use infracost with CI/CD: you can integrate infracost with your CI/CD pipeline to automate the process of generating and reviewing cost estimates. This is especially useful if you have a large number of Terraform configurations that need to be regularly reviewed for cost optimization.&lt;/p&gt;

&lt;h2&gt;
  
  
  3.2 DevOps workflow
&lt;/h2&gt;

&lt;h2&gt;
  
  
  3.2.1 Azure DevOps
&lt;/h2&gt;

&lt;p&gt;The workflow for using infracost with Azure DevOps involves integrating infracost into your Azure DevOps pipeline. This can be done by adding the infracost task to your pipeline and configuring it to run at the appropriate stage.&lt;/p&gt;

&lt;p&gt;To begin, install the infracost extension from the Azure DevOps marketplace. Next, create a pipeline and add the infracost task to it. In the task configuration, specify the path to your Terraform configuration files, as well as any additional arguments you want to pass to infracost.&lt;/p&gt;

&lt;p&gt;Once the task is configured, you can run your pipeline as usual. As part of the pipeline execution, infracost will analyze your Terraform configuration and provide cost estimates for the resources being created. This information is displayed in the pipeline output, letting you easily see the cost implications of your changes.&lt;/p&gt;

&lt;p&gt;In addition to running infracost as part of your pipeline, you can also use it to create pull request checks. This allows you to enforce cost limits on pull requests, ensuring that any changes made do not exceed a certain budget.&lt;/p&gt;

&lt;p&gt;Overall, integrating infracost into your Azure DevOps workflow can help you better understand the cost implications of your infrastructure changes and ensure that you are making cost-effective decisions as you develop and deploy your applications.&lt;/p&gt;
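
&lt;p&gt;The workflow above can be sketched as a pipeline definition; the task name and version (InfracostSetup@2) come from the marketplace extension and may differ, and $(infracostApiKey) is an assumed secret pipeline variable:&lt;/p&gt;

```yaml
trigger: none
pr:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

steps:
  - task: InfracostSetup@2          # installs the CLI and registers the API key
    inputs:
      apiKey: $(infracostApiKey)
  - script: |
      infracost breakdown --path=. --format=json --out-file=/tmp/infracost.json
    displayName: Generate Infracost cost estimate
  - script: infracost output --path=/tmp/infracost.json --format=table
    displayName: Show cost table
```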

&lt;h2&gt;
  
  
  3.2.2 Jenkins
&lt;/h2&gt;

&lt;p&gt;The workflow for using infracost with Jenkins typically involves the following steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Install the infracost plugin in Jenkins. This can be done through the Jenkins Plugin Manager.&lt;/li&gt;
&lt;li&gt;Configure the infracost plugin on the Jenkins Global Configuration page. This includes specifying the AWS access keys, the Terraform version, and the desired frequency for cost scans.&lt;/li&gt;
&lt;li&gt;In your Jenkins job, add a build step to run infracost. This can be done by selecting “Infracost” from the Add Build Step dropdown menu.&lt;/li&gt;
&lt;li&gt;In the infracost build step, specify the path to your Terraform configuration files and any additional command line options.&lt;/li&gt;
&lt;li&gt;Run the Jenkins job to trigger the infracost scan. The plugin will scan your Terraform configuration files and generate a report on the estimated cost of your infrastructure.&lt;/li&gt;
&lt;li&gt;View the infracost report in the Jenkins build output. The report contains a breakdown of the cost estimates for each resource, as well as a total estimate for the entire infrastructure.&lt;/li&gt;
&lt;li&gt;Optionally, configure infracost to fail the Jenkins build if the estimated cost exceeds a certain threshold. This can help prevent overspending on infrastructure costs.&lt;/li&gt;
&lt;/ol&gt;
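
&lt;p&gt;A threshold gate of this kind can also be scripted directly. The sketch below assumes the JSON report from infracost breakdown --format=json carries a top-level totalMonthlyCost field, and it fabricates a small sample report so the gate logic can run end to end; the budget value and file name are illustrative:&lt;/p&gt;

```shell
# In a real job the report would come from:
#   infracost breakdown --path=. --format=json --out-file=infracost.json
# Here we write a minimal sample report so the gate can be demonstrated.
printf '%s\n' '{"totalMonthlyCost": "123.45"}' > infracost.json

BUDGET=500
# Pull the estimated monthly total out of the JSON report.
TOTAL=$(python3 -c 'import json; print(json.load(open("infracost.json"))["totalMonthlyCost"])')

# Compare numerically; a non-zero exit fails the CI build.
if awk -v t="$TOTAL" -v b="$BUDGET" 'BEGIN { exit (t+0 > b+0) }'; then
  echo "OK: ${TOTAL} USD/month is within the ${BUDGET} USD budget"
else
  echo "FAIL: ${TOTAL} USD/month exceeds the ${BUDGET} USD budget"
  exit 1
fi
```

&lt;p&gt;Wiring this script in as the final build step makes the job fail automatically whenever a change pushes the estimate over budget.&lt;/p&gt;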

&lt;p&gt;By integrating infracost into your Jenkins workflow, you can easily track the cost of your infrastructure and make informed decisions about resource allocation and optimization.&lt;/p&gt;

&lt;p&gt;Beyond these, Infracost supports many other CI/CD platforms, shown below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UT3A4a9H--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AYE7MA6bv8msUvs771jnRAA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UT3A4a9H--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AYE7MA6bv8msUvs771jnRAA.png" alt="" width="700" height="327"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  4. Implementation
&lt;/h2&gt;

&lt;h2&gt;
  
  
  4.1 Steps for Azure DevOps
&lt;/h2&gt;

&lt;h2&gt;
  
  
  4.1.1 Installing extension
&lt;/h2&gt;

&lt;p&gt;Go to your organization settings page in Azure DevOps:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://dev.azure.com/{org\_name}/\_settings/extensions

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Go to the marketplace link and install the Infracost extension:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://marketplace.visualstudio.com/items?itemName=Infracost.infracost-tasks"&gt;https://marketplace.visualstudio.com/items?itemName=Infracost.infracost-tasks&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Go to the Infracost dashboard and log in with your account:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dashboard.infracost.io/"&gt;https://dashboard.infracost.io/&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  4.1.2 Making the repo ready
&lt;/h2&gt;

&lt;p&gt;Upon login, go to the organization settings, copy the API key, and keep it in a safe place for later use.&lt;/p&gt;

&lt;p&gt;If you do not already have a Terraform file, create one or copy from my repo:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/Kunaldastiger/Terraform-Sample/blob/main/Virtual-machines/MicrosoftWindowsServer.tf"&gt;https://github.com/Kunaldastiger/Terraform-Sample/blob/main/Virtual-machines/MicrosoftWindowsServer.tf&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You must also have a pipeline YAML file, or a pipeline created in the classic editor;&lt;br&gt;&lt;br&gt;
the code for the same can be found here:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/Kunaldastiger/Terraform-Sample/blob/main/azure-deployment.yaml"&gt;https://github.com/Kunaldastiger/Terraform-Sample/blob/main/azure-deployment.yaml&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now run the pipeline and check whether the resource gets created. Needless to say, you also need a service connection to Azure.&lt;/p&gt;

&lt;p&gt;If you do not have a service connection, follow the steps below:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Sign in to Azure DevOps and navigate to the Project Settings page.&lt;/li&gt;
&lt;li&gt;In the left menu, under “Pipelines,” select “Service Connections.”&lt;/li&gt;
&lt;li&gt;Click the “New Service Connection” button.&lt;/li&gt;
&lt;li&gt;In the “Add new service connection” dialog, select “Azure Resource Manager” from the “Type” dropdown menu.&lt;/li&gt;
&lt;li&gt;Give the service connection a name and click “Next.”&lt;/li&gt;
&lt;li&gt;In the “Authorize Azure Resource Manager” dialog, select the Azure subscription you want to use and click “Authorize.”&lt;/li&gt;
&lt;li&gt;After the authorization process is complete, click “Finish” to create the service connection.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;You can now use this service connection in your Azure DevOps pipelines to access resources in your Azure subscription. For example, you could use it to deploy an application to an Azure web app, create or delete Azure resources, or run Azure Resource Manager templates.&lt;/p&gt;
&lt;h2&gt;
  
  
  4.1.3 Adding Infracost code in pipeline
&lt;/h2&gt;

&lt;p&gt;Create a new pipeline:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Select &lt;strong&gt;Azure Repos Git&lt;/strong&gt; when prompted in the &lt;strong&gt;“Connect”&lt;/strong&gt; stage&lt;/li&gt;
&lt;li&gt;Select the repo you wish to integrate Infracost with in the &lt;strong&gt;“Select”&lt;/strong&gt; stage&lt;/li&gt;
&lt;li&gt;Choose “Starter Pipeline” in the &lt;strong&gt;“Configure”&lt;/strong&gt; stage&lt;/li&gt;
&lt;li&gt;Replace the Starter Pipeline YAML with the following:&lt;/li&gt;
&lt;/ol&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="c1"&gt;# The Azure Pipelines docs (https://docs.microsoft.com/en-us/azure/devops/pipelines/process/tasks) describe other options.&lt;/span&gt;
&lt;span class="s"&gt;6.&lt;/span&gt; &lt;span class="c1"&gt;# Running on pull requests to `master` (or your default branch) is a good default.&lt;/span&gt;
&lt;span class="na"&gt;7. pr&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="s"&gt;8.   - main&lt;/span&gt;
&lt;span class="s"&gt;9.&lt;/span&gt; 
&lt;span class="na"&gt;10. variables&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="na"&gt;11.   - name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;TF_ROOT&lt;/span&gt;
&lt;span class="na"&gt;12.     value&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Terraform&lt;/span&gt;
&lt;span class="s"&gt;13.&lt;/span&gt;   &lt;span class="c1"&gt;# If you use private modules you'll need this env variable to use &lt;/span&gt;
&lt;span class="s"&gt;14.&lt;/span&gt;   &lt;span class="c1"&gt;# the same ssh-agent socket value across all steps. &lt;/span&gt;
&lt;span class="na"&gt;15.   - name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;SSH_AUTH_SOCK&lt;/span&gt;
&lt;span class="na"&gt;16.     value&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;/tmp/ssh_agent.sock&lt;/span&gt;
&lt;span class="s"&gt;17.&lt;/span&gt;   &lt;span class="c1"&gt;# This instructs the CLI to send cost estimates to Infracost Cloud. Our SaaS product&lt;/span&gt;
&lt;span class="s"&gt;18.&lt;/span&gt;   &lt;span class="c1"&gt;#   complements the open source CLI by giving teams advanced visibility and controls.&lt;/span&gt;
&lt;span class="s"&gt;19.&lt;/span&gt;   &lt;span class="c1"&gt;#   The cost estimates are transmitted in JSON format and do not contain any cloud &lt;/span&gt;
&lt;span class="s"&gt;20.&lt;/span&gt;   &lt;span class="c1"&gt;#   credentials or secrets (see https://infracost.io/docs/faq/ for more information).&lt;/span&gt;
&lt;span class="na"&gt;21.   - name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;INFRACOST_ENABLE_CLOUD&lt;/span&gt;
&lt;span class="na"&gt;22.     value&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
&lt;span class="s"&gt;23.&lt;/span&gt;   &lt;span class="c1"&gt;# If you're using Terraform Cloud/Enterprise and have variables stored on there&lt;/span&gt;
&lt;span class="s"&gt;24.&lt;/span&gt;   &lt;span class="c1"&gt;# you can specify the following to automatically retrieve the variables:&lt;/span&gt;
&lt;span class="s"&gt;25.&lt;/span&gt;   &lt;span class="c1"&gt;# env:&lt;/span&gt;
&lt;span class="s"&gt;26.&lt;/span&gt;   &lt;span class="c1"&gt;# - name: INFRACOST_TERRAFORM_CLOUD_TOKEN&lt;/span&gt;
&lt;span class="s"&gt;27.&lt;/span&gt;   &lt;span class="c1"&gt;#   value: $(tfcToken)&lt;/span&gt;
&lt;span class="s"&gt;28.&lt;/span&gt;   &lt;span class="c1"&gt;# - name: INFRACOST_TERRAFORM_CLOUD_HOST&lt;/span&gt;
&lt;span class="s"&gt;29.&lt;/span&gt;   &lt;span class="c1"&gt;#   value: app.terraform.io # Change this if you're using Terraform Enterprise&lt;/span&gt;
&lt;span class="s"&gt;30.&lt;/span&gt; 
&lt;span class="na"&gt;31. jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="na"&gt;32.   - job&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;infracost&lt;/span&gt;
&lt;span class="na"&gt;33.     displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Run Infracost&lt;/span&gt;
&lt;span class="na"&gt;34.     pool&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="na"&gt;35.       vmImage&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
&lt;span class="s"&gt;36.&lt;/span&gt; 
&lt;span class="na"&gt;37.     steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="s"&gt;38.&lt;/span&gt;       &lt;span class="c1"&gt;# If you use private modules, add a base 64 encoded secret&lt;/span&gt;
&lt;span class="s"&gt;39.&lt;/span&gt;       &lt;span class="c1"&gt;# called gitSshKeyBase64 with your private key, so Infracost can access&lt;/span&gt;
&lt;span class="s"&gt;40.&lt;/span&gt;       &lt;span class="c1"&gt;# private repositories (similar to how Terraform/Terragrunt does).&lt;/span&gt;
&lt;span class="s"&gt;41.&lt;/span&gt;       &lt;span class="c1"&gt;# - bash: |&lt;/span&gt;
&lt;span class="s"&gt;42.&lt;/span&gt;       &lt;span class="c1"&gt;#     ssh-agent -a $(SSH_AUTH_SOCK)&lt;/span&gt;
&lt;span class="s"&gt;43.&lt;/span&gt;       &lt;span class="c1"&gt;#     mkdir -p ~/.ssh&lt;/span&gt;
&lt;span class="s"&gt;44.&lt;/span&gt;       &lt;span class="c1"&gt;#     echo "$(echo $GIT_SSH_KEY_BASE_64 | base64 -d)" | tr -d '\r' | ssh-add -&lt;/span&gt;
&lt;span class="s"&gt;45.&lt;/span&gt;       &lt;span class="c1"&gt;#     # Update this to github.com, gitlab.com, bitbucket.org, ssh.dev.azure.com or your source control server's domain&lt;/span&gt;
&lt;span class="s"&gt;46.&lt;/span&gt;       &lt;span class="c1"&gt;#     ssh-keyscan ssh.dev.azure.com &amp;gt;&amp;gt; ~/.ssh/known_hosts&lt;/span&gt;
&lt;span class="s"&gt;47.&lt;/span&gt;       &lt;span class="c1"&gt;#   displayName: Add GIT_SSH_KEY&lt;/span&gt;
&lt;span class="s"&gt;48.&lt;/span&gt;       &lt;span class="c1"&gt;#   env:&lt;/span&gt;
&lt;span class="s"&gt;49.&lt;/span&gt;       &lt;span class="c1"&gt;#     GIT_SSH_KEY_BASE_64: $(gitSshKeyBase64)&lt;/span&gt;
&lt;span class="s"&gt;50.&lt;/span&gt; 
&lt;span class="s"&gt;51.&lt;/span&gt;       &lt;span class="c1"&gt;# Install the Infracost CLI, see https://github.com/infracost/infracost-azure-devops#infracostsetup&lt;/span&gt;
&lt;span class="s"&gt;52.&lt;/span&gt;       &lt;span class="c1"&gt;# for other inputs such as version, and pricingApiEndpoint (for self-hosted users).&lt;/span&gt;
&lt;span class="na"&gt;53.       - task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;InfracostSetup@1&lt;/span&gt;
&lt;span class="na"&gt;54.         displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Setup Infracost&lt;/span&gt;
&lt;span class="na"&gt;55.         inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="na"&gt;56.           apiKey&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;$(infracostApiKey)&lt;/span&gt;
&lt;span class="s"&gt;57.&lt;/span&gt; 
&lt;span class="s"&gt;58.&lt;/span&gt;       &lt;span class="c1"&gt;# Clone the base branch of the pull request (e.g. main/master) into a temp directory.&lt;/span&gt;
&lt;span class="s"&gt;59.&lt;/span&gt;       &lt;span class="c1"&gt;# - bash: |&lt;/span&gt;
&lt;span class="s"&gt;60.&lt;/span&gt;       &lt;span class="c1"&gt;#     branch=$(System.PullRequest.TargetBranch)&lt;/span&gt;
&lt;span class="s"&gt;61.&lt;/span&gt;       &lt;span class="c1"&gt;#     branch=${branch#refs/heads/}&lt;/span&gt;
&lt;span class="s"&gt;62.&lt;/span&gt;       &lt;span class="c1"&gt;#     # Try adding the following to git clone if you're having issues cloning a private repo: --config http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)"&lt;/span&gt;
&lt;span class="na"&gt;63.       - task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Bash@3&lt;/span&gt;
&lt;span class="na"&gt;64.         inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="na"&gt;65.               targetType&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;inline'&lt;/span&gt;
&lt;span class="na"&gt;66.               script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;git&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;clone&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;@dev.azure.com/{orgname}/{project_name}/{repo_path}"&amp;gt;https://$(PATToken)@dev.azure.com/{orgname}/{project_name}/{repo_path}&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;--branch=main&lt;/span&gt;&lt;span class="nv"&gt;  &lt;/span&gt;&lt;span class="s"&gt;/tmp/base'&lt;/span&gt;
&lt;span class="s"&gt;67.&lt;/span&gt;       &lt;span class="c1"&gt;#   displayName: Checkout base branch&lt;/span&gt;
&lt;span class="s"&gt;68.&lt;/span&gt; 
&lt;span class="s"&gt;69.&lt;/span&gt;       &lt;span class="c1"&gt;# Generate an Infracost cost estimate baseline from the comparison branch, so that Infracost can compare the cost difference.&lt;/span&gt;
&lt;span class="na"&gt;70.       - bash&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
&lt;span class="err"&gt;7&lt;/span&gt;&lt;span class="s"&gt;1.           infracost breakdown --path=$(TF_ROOT) \&lt;/span&gt;
&lt;span class="err"&gt;7&lt;/span&gt;&lt;span class="s"&gt;2.                               --format=json \&lt;/span&gt;
&lt;span class="err"&gt;7&lt;/span&gt;&lt;span class="s"&gt;3.                               --out-file=/tmp/infracost-base.json&lt;/span&gt;
&lt;span class="err"&gt;7&lt;/span&gt;&lt;span class="s"&gt;4.         displayName: Generate Infracost cost estimate baseline&lt;/span&gt;
&lt;span class="err"&gt;7&lt;/span&gt;&lt;span class="s"&gt;5. &lt;/span&gt;
&lt;span class="err"&gt;7&lt;/span&gt;&lt;span class="s"&gt;6.       # Generate an Infracost diff and save it to a JSON file.&lt;/span&gt;
&lt;span class="err"&gt;7&lt;/span&gt;&lt;span class="s"&gt;7.       - bash: |&lt;/span&gt;
&lt;span class="err"&gt;7&lt;/span&gt;&lt;span class="s"&gt;8.           infracost diff --path=$(TF_ROOT) \&lt;/span&gt;
&lt;span class="err"&gt;7&lt;/span&gt;&lt;span class="s"&gt;9.                          --format=json \&lt;/span&gt;
&lt;span class="err"&gt;8&lt;/span&gt;&lt;span class="s"&gt;0.                          --compare-to=/tmp/infracost-base.json \&lt;/span&gt;
&lt;span class="err"&gt;8&lt;/span&gt;&lt;span class="s"&gt;1.                          --out-file=/tmp/infracost.json&lt;/span&gt;
&lt;span class="err"&gt;8&lt;/span&gt;&lt;span class="s"&gt;2.         displayName: Generate Infracost diff&lt;/span&gt;
&lt;span class="err"&gt;8&lt;/span&gt;&lt;span class="s"&gt;3. &lt;/span&gt;
&lt;span class="err"&gt;8&lt;/span&gt;&lt;span class="s"&gt;4.       # Posts a comment to the PR using the 'update' behavior.&lt;/span&gt;
&lt;span class="err"&gt;8&lt;/span&gt;&lt;span class="s"&gt;5.       # This creates a single comment and updates it. The "quietest" option.&lt;/span&gt;
&lt;span class="err"&gt;8&lt;/span&gt;&lt;span class="s"&gt;6.       # The other valid behaviors are:&lt;/span&gt;
&lt;span class="err"&gt;8&lt;/span&gt;&lt;span class="s"&gt;7.       #   delete-and-new - Delete previous comments and create a new one.&lt;/span&gt;
&lt;span class="err"&gt;8&lt;/span&gt;&lt;span class="s"&gt;8.       #   new - Create a new cost estimate comment on every push.&lt;/span&gt;
&lt;span class="err"&gt;8&lt;/span&gt;&lt;span class="s"&gt;9.       # See https://www.infracost.io/docs/features/cli_commands/#comment-on-pull-requests for other options.&lt;/span&gt;
&lt;span class="err"&gt;9&lt;/span&gt;&lt;span class="s"&gt;0.       - bash: |&lt;/span&gt;
&lt;span class="err"&gt;9&lt;/span&gt;&lt;span class="s"&gt;1.            infracost comment azure-repos --path=/tmp/infracost.json \&lt;/span&gt;
&lt;span class="err"&gt;9&lt;/span&gt;&lt;span class="s"&gt;2.                                          --azure-access-token=$(System.AccessToken) \&lt;/span&gt;
&lt;span class="err"&gt;9&lt;/span&gt;&lt;span class="s"&gt;3.                                          --pull-request=$(System.PullRequest.PullRequestId) \&lt;/span&gt;
&lt;span class="err"&gt;9&lt;/span&gt;&lt;span class="s"&gt;4.                                          --repo-url= '@dev.azure.com/kunaldas0966/test_project/_git/terracost-sample'"&amp;gt;https://$(PATToken)@dev.azure.com/kunaldas0966/test_project/_git/terracost-sample' \&lt;/span&gt;
&lt;span class="err"&gt;9&lt;/span&gt;&lt;span class="s"&gt;5.                                          --behavior=update&lt;/span&gt;
&lt;span class="err"&gt;9&lt;/span&gt;&lt;span class="s"&gt;6.         displayName: Post Infracost comment&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
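&lt;p&gt;Two small string details in the pipeline above can be sketched as plain shell, assuming the standard Azure DevOps variable formats: the checkout comments strip the refs/heads/ prefix from System.PullRequest.TargetBranch, and the --repo-url flag is simply the repository’s web URL:&lt;/p&gt;

```shell
# Sketch of two string manipulations used in the pipeline above.

# The PR target branch arrives as a full ref (e.g. refs/heads/main);
# the checkout snippet strips the prefix to get the plain branch name.
strip_ref() {
  printf '%s' "${1#refs/heads/}"
}

# The --repo-url passed to "infracost comment azure-repos" is the repository's
# web URL, built from the organization, project and repository names.
repo_url() {
  printf 'https://dev.azure.com/%s/%s/_git/%s' "$1" "$2" "$3"
}

strip_ref "refs/heads/main"                                # prints main
repo_url "kunaldas0966" "test_project" "terracost-sample"  # prints the repo URL used above
```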


&lt;p&gt;In the pipeline variables, add the below variables:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HPbNcWWL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:643/1%2AN40nSKnlEH0lfQDLAvpEPQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HPbNcWWL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:643/1%2AN40nSKnlEH0lfQDLAvpEPQ.png" alt="" width="643" height="388"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To do the same:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Sign in to Azure DevOps and navigate to your project.&lt;/li&gt;
&lt;li&gt;In the left menu, under “Pipelines,” select “Pipelines.”&lt;/li&gt;
&lt;li&gt;Select the pipeline you want to add a variable to and click the “Edit” button.&lt;/li&gt;
&lt;li&gt;In the pipeline editor, click the “Variables” tab.&lt;/li&gt;
&lt;li&gt;Click the “Add” button to add a new variable.&lt;/li&gt;
&lt;li&gt;In the “Add variable” dialog, enter a name and value for the variable. You can also specify whether the variable is secret and should be masked in the log output.&lt;/li&gt;
&lt;li&gt;Click “Save” to add the variable to the pipeline.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;You can now use the pipeline variable in your pipeline tasks by using the syntax $(variableName). You can also use pipeline variables to control the flow of your pipeline, by using conditions or branching.&lt;/p&gt;
&lt;h2&gt;
  
  
  4.1.4 Additional settings
&lt;/h2&gt;

&lt;p&gt;Enable pull request build triggers. Without this, Azure Pipelines does not trigger builds with the pull request ID, so comments cannot be posted by the integration.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;From your Azure DevOps organization, click on your project &amp;gt; Project Settings &amp;gt; Repositories&lt;/li&gt;
&lt;li&gt;Select the repository that you created the pipeline for in step 1&lt;/li&gt;
&lt;li&gt;Select the Policies tab and, under Branch Policies, select your default branch (master or main)&lt;/li&gt;
&lt;li&gt;Scroll to Build Validation and click the + sign to add one if you don’t have one already&lt;/li&gt;
&lt;li&gt;Set ‘Build pipeline’ to the pipeline you created in step 1, leave ‘Path filter’ blank, set ‘Trigger’ to Automatic, and ‘Policy requirement’ to Optional (you can also use Required, but we don’t recommend it)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Enable Azure Pipelines to post pull request comments:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;From your Azure DevOps organization, click on your project &amp;gt; Project Settings &amp;gt; Repositories &amp;gt; your repository.&lt;/li&gt;
&lt;li&gt;Click on the Security tab, scroll down to Users, click on the ‘[project name] Build Service ([org name])’ user, and set ‘Contribute to pull requests’ to Allow.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If you are using GitHub instead of Azure Repos, the code for the same is available at the link below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://marketplace.visualstudio.com/items?itemName=Infracost.infracost-tasks&amp;amp;targetId=10277548-b72b-4200-99f2-565bf446c70d&amp;amp;utm_source=vstsproduct&amp;amp;utm_medium=ExtHubManageList"&gt;https://marketplace.visualstudio.com/items?itemName=Infracost.infracost-tasks&amp;amp;targetId=10277548-b72b-4200-99f2-565bf446c70d&amp;amp;utm_source=vstsproduct&amp;amp;utm_medium=ExtHubManageList&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  4.1.5 Creating a Pull Request
&lt;/h2&gt;

&lt;p&gt;Now you have to create a new branch and make some changes in it so that the resource cost increases or decreases.&lt;br&gt;&lt;br&gt;
To create a new branch and open a pull request in Azure Repos, follow these steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Navigate to the Azure Repos page in Azure DevOps.&lt;/li&gt;
&lt;li&gt;Select the repository that you want to create the branch and pull request in.&lt;/li&gt;
&lt;li&gt;On the repository page, click on the “Branches” tab.&lt;/li&gt;
&lt;li&gt;Click on the “New branch” button.&lt;/li&gt;
&lt;li&gt;Enter a name for the new branch and select the branch that you want to base the new branch on.&lt;/li&gt;
&lt;li&gt;Click on the “Create branch” button to create the new branch.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Once the new branch has been created, you can switch to it and make the necessary changes. When you’re ready to submit the changes for review, follow the steps below to create a pull request.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Navigate to the Azure Repos page in Azure DevOps.&lt;/li&gt;
&lt;li&gt;Select the repository that you want to create the pull request in.&lt;/li&gt;
&lt;li&gt;On the repository page, click on the “Pull requests” tab.&lt;/li&gt;
&lt;li&gt;Click on the “New pull request” button.&lt;/li&gt;
&lt;li&gt;Select the source and target branches for the pull request. The source branch is the branch that contains the changes that you want to merge, and the target branch is the branch that you want to merge the changes into.&lt;/li&gt;
&lt;li&gt;Review the changes that will be included in the pull request. You can use the “Files changed” tab to view the individual changes and make any necessary adjustments.&lt;/li&gt;
&lt;li&gt;Add a title and optional description for the pull request.&lt;/li&gt;
&lt;li&gt;If you want to request a review from specific people or teams, you can use the “Reviewers” field to add them.&lt;/li&gt;
&lt;li&gt;When you’re ready to create the pull request, click on the “Create pull request” button.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;After this, you should see your pipeline start running, as the trigger is set to pull requests targeting main!&lt;/p&gt;
&lt;h2&gt;
  
  
  4.1.6 Viewing the dashboard
&lt;/h2&gt;

&lt;p&gt;Log in to &lt;a href="https://dashboard.infracost.io/"&gt;https://dashboard.infracost.io/&lt;/a&gt; and select the repo:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--irT-WVhA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Alr335paLDs5BXxIMUFiJdw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--irT-WVhA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Alr335paLDs5BXxIMUFiJdw.png" alt="" width="700" height="482"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You should see a similar dashboard; there are many things to go through in the GUI.&lt;/p&gt;

&lt;p&gt;The Infracost dashboard is a web-based tool that provides an overview of the cost of your infrastructure resources. It displays information about the total cost of your resources, as well as detailed information about each resource and its cost.&lt;/p&gt;

&lt;p&gt;The dashboard is divided into several sections:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Overview: This section provides a summary of your infrastructure costs, including the total cost of your resources and the breakdown of costs by resource type.&lt;/li&gt;
&lt;li&gt;Resources: This section lists all of the infrastructure resources in your project, along with their cost, resource type, and resource ID. You can use this section to view the cost of individual resources and identify opportunities for cost optimization.&lt;/li&gt;
&lt;li&gt;Billing: This section provides information about your billing history, including the total amount billed, the billing period, and the date of the most recent bill.&lt;/li&gt;
&lt;li&gt;Costs by resource type: This section displays a breakdown of your costs by resource type, showing you which types of resources are the most expensive.&lt;/li&gt;
&lt;li&gt;Costs by tag: This section displays a breakdown of your costs by tag, allowing you to see which resources are associated with a particular tag and their costs.&lt;/li&gt;
&lt;li&gt;Cost trends: This section displays a chart showing the trend in your infrastructure costs over time. You can use this chart to identify trends in your costs and identify opportunities for cost optimization.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  4.1.7 Dashboard use cases
&lt;/h2&gt;

&lt;p&gt;There are several things that you can do from the Infracost dashboard to manage and optimize the cost of your infrastructure resources:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;View a summary of your infrastructure costs: The dashboard provides an overview of your total infrastructure costs, as well as a breakdown of costs by resource type. This can help you understand the overall cost of your infrastructure and identify areas where you might be able to save money.&lt;/li&gt;
&lt;li&gt;View detailed information about individual resources: The dashboard provides detailed information about each infrastructure resource, including its cost, resource type, and resource ID. This can help you understand the cost of individual resources and identify opportunities for cost optimization.&lt;/li&gt;
&lt;li&gt;View your billing history: The dashboard includes a section on billing that provides information about your total amount billed, the billing period, and the date of the most recent bill. This can help you understand your billing history and track changes in your costs over time.&lt;/li&gt;
&lt;li&gt;View costs by resource type and tag: The dashboard provides a breakdown of costs by resource type and tag, which can help you understand which types of resources are the most expensive and which resources are associated with a particular tag.&lt;/li&gt;
&lt;li&gt;View cost trends over time: The dashboard includes a chart showing the trend in your infrastructure costs over time. This can help you identify trends in your costs and identify opportunities for cost optimization.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  5. Navigation
&lt;/h2&gt;
&lt;h2&gt;
  
  
  5.1 PR cost difference
&lt;/h2&gt;

&lt;p&gt;In the Infracost dashboard, click on Repos and select the repository you are using:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WNLCywjy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2APoTqM2wA9xHgEBiPttzqSw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WNLCywjy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2APoTqM2wA9xHgEBiPttzqSw.png" alt="" width="700" height="258"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here you will see the price difference before you choose to merge the PR.&lt;/p&gt;
&lt;h2&gt;
  
  
  5.2 Detailed Cost estimation
&lt;/h2&gt;

&lt;p&gt;Click to see the full cost estimate; the detailed cost estimate will open separately:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6B-qb2-z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AmneFcpcqfz2on7hPsXwpjQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6B-qb2-z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AmneFcpcqfz2on7hPsXwpjQ.png" alt="" width="700" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the diff output tab, you can see the difference that will occur if you merge the changes:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gQSc62yG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2A70chuQkrt0JR9XwO85e3BQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gQSc62yG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2A70chuQkrt0JR9XwO85e3BQ.png" alt="" width="700" height="214"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click on the Table output tab to see all the costs involved:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WJc6d1Zs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AGEAfTOXK8nr2XZyWioBklg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WJc6d1Zs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AGEAfTOXK8nr2XZyWioBklg.png" alt="" width="700" height="316"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Under each resource you can see a detailed cost report of the individual items:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cDSXVQ4p--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2ABUm6Xnn_x2r0dWh7dSvisA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cDSXVQ4p--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2ABUm6Xnn_x2r0dWh7dSvisA.png" alt="" width="700" height="113"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You may also choose to export the data in CSV format by clicking on export data.&lt;/p&gt;
&lt;h2&gt;
  
  
  6. Diagram
&lt;/h2&gt;

&lt;p&gt;I have prepared a simple flow diagram that is very easy to understand; let’s go through each block one by one.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xninMLjl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2ACjGEdkxtCmDqcuMmCvS8PA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xninMLjl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2ACjGEdkxtCmDqcuMmCvS8PA.png" alt="" width="700" height="218"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  6.1 New Branch
&lt;/h2&gt;

&lt;p&gt;Here we create a new branch so that we can start making changes to the existing infrastructure. Creating a new branch in a version control system like Azure Repos allows you to work on a copy of your codebase and make changes to it without affecting the main branch (often called the “master” branch). This is useful when you want to make changes to your infrastructure, as it allows you to test and validate your changes before merging them into the main branch.&lt;/p&gt;

&lt;p&gt;By creating a new branch, you can make changes to your infrastructure in a safe and isolated environment, without worrying about breaking anything in the main branch. This can help you avoid disruptions to your infrastructure and ensure that any changes you make are well-tested and ready for production.&lt;/p&gt;
&lt;h2&gt;
  
  
  6.2 Making changes
&lt;/h2&gt;

&lt;p&gt;When you make changes to your codebase in a feature branch, you are working on a copy of the code that is separate from the main branch (often called the “master” branch). This allows you to make changes and test them without affecting the main branch, which is useful when you are developing new features or making changes to your infrastructure. So you can simply pull the code into your IDE, such as VS Code, and start making changes in the TF file; once done, push it to the dev/feature branch.&lt;/p&gt;
&lt;h2&gt;
  
  
  6.3 Pull Request
&lt;/h2&gt;

&lt;p&gt;Create a pull request: when you’re ready to merge your changes into the main branch, you’ll need to create a pull request. This allows other members of your team to review your changes and decide whether to merge them into the main branch.&lt;/p&gt;

&lt;p&gt;Once you create a pull request, the pipeline will start running.&lt;/p&gt;

&lt;p&gt;Once the pipeline run finishes, you can go to the Infracost dashboard and view the cost estimate.&lt;/p&gt;
&lt;h2&gt;
  
  
  6.4 Release
&lt;/h2&gt;

&lt;p&gt;If you are satisfied with the cost estimate, merge the changes; as soon as they are merged to the main branch, the Terraform pipeline will be triggered and will create the desired resources.&lt;/p&gt;
&lt;h2&gt;
  
  
  7. Scope for improvement
&lt;/h2&gt;

&lt;p&gt;The pipeline can be further automated to complete the pull request automatically if certain conditions are met.&lt;br&gt;&lt;br&gt;
Suppose the cost increases or decreases by less than 5%; the pipeline could read that and merge the code automatically.&lt;br&gt;&lt;br&gt;
This needs further research and collaboration.&lt;/p&gt;
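&lt;p&gt;One possible sketch of such a guardrail (this is not a built-in Infracost feature; the function names and the 5% threshold below are illustrative): a pipeline step could read the past and new monthly costs from the diff JSON and only auto-complete the PR when the change stays under the threshold:&lt;/p&gt;

```shell
# Hypothetical cost guardrail: compare the past and new monthly costs
# (as reported in Infracost's JSON output) against a percentage threshold.

# pct_change PAST NEW -> percentage change, printed with 2 decimals
pct_change() {
  awk -v past="$1" -v new="$2" 'BEGIN { printf "%.2f", (new - past) / past * 100 }'
}

# within_threshold PAST NEW [THRESHOLD] -> exit 0 when the absolute change
# is at most THRESHOLD percent (default 5), exit 1 otherwise
within_threshold() {
  awk -v past="$1" -v new="$2" -v t="${3:-5}" 'BEGIN {
    change = (new - past) / past * 100
    if (-change > change) change = -change   # absolute value
    exit (change > t) ? 1 : 0
  }'
}

# In the pipeline, the costs would come from the diff JSON, e.g.:
#   past=$(jq -r .pastTotalMonthlyCost /tmp/infracost.json)
pct_change 100 104         # prints 4.00
within_threshold 100 104   # exits 0: under 5%
```

&lt;p&gt;Whether auto-merging on cost alone is safe is a policy decision; in practice this would be combined with the existing branch policies and reviews.&lt;/p&gt;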
&lt;h2&gt;
  
  
  8. Cost of Implementation
&lt;/h2&gt;

&lt;p&gt;Infracost is an open-source tool, though it also has paid tiers; the free version provides plenty of options to get the job done. The features offered in the free version are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open source&lt;/li&gt;
&lt;li&gt;Get cost breakdowns and diffs&lt;/li&gt;
&lt;li&gt;CI/CD integrations (GitHub, GitLab, Bitbucket, Azure DevOps…)&lt;/li&gt;
&lt;li&gt;Works with Terraform Cloud &amp;amp; Terragrunt&lt;/li&gt;
&lt;li&gt;Use the hosted Cloud Pricing API or self-host&lt;/li&gt;
&lt;li&gt;Community supported&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For an annual fee of $30, some additional functionality can be availed.&lt;/p&gt;

&lt;p&gt;In addition to the Infracost Community features:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Visibility across all changes; see pull requests that increase/decrease costs the most&lt;/li&gt;
&lt;li&gt;Guardrails with custom messages/notifications/actions&lt;/li&gt;
&lt;li&gt;Policies in pull requests (e.g. gp2 to gp3)&lt;/li&gt;
&lt;li&gt;Custom price books and discounts&lt;/li&gt;
&lt;li&gt;Jira integration&lt;/li&gt;
&lt;li&gt;Custom reports&lt;/li&gt;
&lt;li&gt;GitHub App integration with pull request status/metadata&lt;/li&gt;
&lt;li&gt;Team management&lt;/li&gt;
&lt;li&gt;SSO&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Read my blogs:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://kunaldaskd.medium.com"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FrhCZTrV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/TgYYM9w.png" alt="Medium Logo" width="62" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.to/kunaldas"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rhp9uSXy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/bp3qHWb.png" alt="Dev.to Logo" width="35" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://kunaldas.hashnode.dev"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_veP3uic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/iwZwo2S.png" alt="Hashnode Logo" width="204" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Connect with Me:
&lt;/h2&gt;

&lt;p&gt;
&lt;a href="https://twitter.com/kunald_official"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--o4A7Waxi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/VaorXDP.png" alt="kunald_official" width="43" height="35"&gt;&lt;/a&gt;
&lt;a href="https://linkedin.com/in/kunaldaskd"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U70a7xDy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/ktIHVxm.png" alt="kunaldaskd" width="35" height="35"&gt;&lt;/a&gt;
&lt;/p&gt;

</description>
      <category>azure</category>
      <category>cloud</category>
      <category>devops</category>
    </item>
    <item>
      <title>How to publish from Release Pipeline in Azure DevOps</title>
      <dc:creator>Kunal Das</dc:creator>
      <pubDate>Sun, 21 Jan 2024 12:36:02 +0000</pubDate>
      <link>https://dev.to/kunaldas/how-to-publish-from-release-pipeline-in-azure-devops-4gki</link>
      <guid>https://dev.to/kunaldas/how-to-publish-from-release-pipeline-in-azure-devops-4gki</guid>
      <description>&lt;h1&gt;
  
  
  How to publish from Release Pipeline in Azure DevOps
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" alt="Kunal Das, Author" width="44" height="44"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Reach at : &lt;a href="https://heylink.me/kunaldas"&gt;https://heylink.me/kunaldas&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UwzusFiG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AA_IveIUhnhHOzi_a04hecA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UwzusFiG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AA_IveIUhnhHOzi_a04hecA.png" alt="" width="700" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
How to publish from Release Pipeline in Azure DevOps

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Pushing the artifact in a git repo&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this section, I will describe how you can get the artifact from the release pipeline.&lt;/p&gt;

&lt;p&gt;Publishing an artifact from the build pipeline is pretty easy, and there are plenty of tutorials available on the internet.&lt;/p&gt;

&lt;p&gt;To publish an artifact from a build pipeline in Azure DevOps, you can use the &lt;code&gt;Publish Build Artifacts&lt;/code&gt; task. Here's how you can do it:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; In your build pipeline, add the &lt;code&gt;Publish Build Artifacts&lt;/code&gt; task to a phase that runs after the build is completed.&lt;/li&gt;
&lt;li&gt; In the &lt;code&gt;Path to publish&lt;/code&gt; field, specify the path to the folder that contains the artifacts you want to publish. You can use a wildcard pattern to include multiple files and folders.&lt;/li&gt;
&lt;li&gt; In the &lt;code&gt;Artifact name&lt;/code&gt; field, enter a name for the artifact.&lt;/li&gt;
&lt;li&gt; In the &lt;code&gt;Artifact publish location&lt;/code&gt; field, choose whether you want to publish the artifact to the pipeline or to a file share.&lt;/li&gt;
&lt;li&gt; If you are publishing to a file share, enter the path to the share in the &lt;code&gt;Path to publish on the server&lt;/code&gt; field.&lt;/li&gt;
&lt;li&gt; Save and run your pipeline.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;That’s it! The &lt;code&gt;Publish Build Artifacts&lt;/code&gt; task will publish the specified artifacts to the pipeline or file share you specified. You can then use the artifact in a release pipeline or download it from the Artifacts page in Azure DevOps.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PublishPipelineArtifact@1&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Publish&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Pipeline&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Artifact'&lt;/span&gt;
  &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;targetPath&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;$(Build.ArtifactStagingDirectory)'&lt;/span&gt;
    &lt;span class="na"&gt;artifact&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;drop&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
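&lt;p&gt;Note that the numbered steps above describe the classic &lt;code&gt;Publish Build Artifacts&lt;/code&gt; task, whose YAML equivalent is &lt;code&gt;PublishBuildArtifacts@1&lt;/code&gt; (a different task from the &lt;code&gt;PublishPipelineArtifact@1&lt;/code&gt; snippet shown). A minimal sketch, with the path and artifact name as placeholders:&lt;/p&gt;

```yaml
- task: PublishBuildArtifacts@1
  displayName: 'Publish Build Artifacts'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'  # step 2: path to publish
    ArtifactName: 'drop'                                # step 3: artifact name
    publishLocation: 'Container'                        # step 4: 'Container' (pipeline) or 'FilePath' (file share)
```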



&lt;p&gt;Now, you may want to get the same result from a release pipeline as well. Unfortunately, the PublishPipelineArtifact@1 task is not supported in release pipelines, and if you run it there you will get an error. To work around this, you can do one thing:&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Pushing the artifact in a git repo&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;In this way, you can essentially save a copy of your updated artifacts in a separate folder inside your git repo.&lt;/p&gt;

&lt;p&gt;Let me show you my scenario.&lt;/p&gt;

&lt;p&gt;We follow the Git Flow branching strategy, so for each environment we need to save a copy of the updated ARM template.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cvupmaHt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AN02QCq_JlDm7I_FWyNanSg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cvupmaHt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AN02QCq_JlDm7I_FWyNanSg.png" alt="" width="700" height="332"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;How did I achieve this?&lt;/p&gt;

&lt;p&gt;Well, let me guide you step by step.&lt;/p&gt;

&lt;p&gt;First, grant all the permissions required so that the pipeline can access the repo and push to it.&lt;/p&gt;

&lt;p&gt;Go to project settings → Repository → select your repo →security → project collection Administrators → contribute → ALLOW&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cJhZhHLS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AAlJk6VeImXA3urf7XfHt_Q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cJhZhHLS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AAlJk6VeImXA3urf7XfHt_Q.png" alt="" width="700" height="324"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This setting essentially allows pushing from the build pipeline to the repo.&lt;/p&gt;

&lt;p&gt;Also, you have to check one more setting.&lt;/p&gt;

&lt;p&gt;For the classic pipeline, select the option below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1t9O6_Ny--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:328/1%2AImXJ-rMt4oEHY9BAHriYaw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1t9O6_Ny--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:328/1%2AImXJ-rMt4oEHY9BAHriYaw.png" alt="" width="328" height="31"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For YAML, add the code below at the top of your pipeline definition.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;variables&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;system_accesstoken&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;$(System.AccessToken)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now add this Bash task to the pipeline.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;bash&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
   &lt;span class="s"&gt;git config --global user.email "azuredevops@microsoft.com"&lt;/span&gt;
   &lt;span class="s"&gt;git config --global user.name "Azure DevOps"&lt;/span&gt;

   &lt;span class="s"&gt;REPO="$(System.TeamFoundationCollectionUri)$(System.TeamProject)/_git/$(Build.Repository.Name)"&lt;/span&gt;
   &lt;span class="s"&gt;EXTRAHEADER="Authorization: Bearer $(System.AccessToken)"&lt;/span&gt;
   &lt;span class="s"&gt;git -c http.extraheader="$EXTRAHEADER" clone $REPO &lt;/span&gt;
   &lt;span class="s"&gt;cd $(Build.Repository.Name)&lt;/span&gt;

   &lt;span class="s"&gt;mkdir 'QA-env'&lt;/span&gt;
   &lt;span class="s"&gt;cd 'QA-env'&lt;/span&gt;

   &lt;span class="s"&gt;cp '$(System.DefaultWorkingDirectory)/ARM/TemplateForWorkspace.json' .&lt;/span&gt;
   &lt;span class="s"&gt;cp '$(System.DefaultWorkingDirectory)/ARM/TemplateParametersForWorkspace.json' .&lt;/span&gt;

   &lt;span class="s"&gt;cd ..&lt;/span&gt;

   &lt;span class="s"&gt;git add Template-QA/TemplateForWorkspace.json&lt;/span&gt;
   &lt;span class="s"&gt;git add Template-QA/TemplateParametersForWorkspace.json&lt;/span&gt;


   &lt;span class="s"&gt;MAINBRANCHNAME=main&lt;/span&gt;
   &lt;span class="s"&gt;git config http.$REPO.extraHeader "$EXTRAHEADER"&lt;/span&gt;
   &lt;span class="s"&gt;git commit -a -m "added QA json updated files"&lt;/span&gt;

   &lt;span class="s"&gt;echo -- Merge $(Build.SourceBranchName) to $MAINBRANCHNAME --&lt;/span&gt;
   &lt;span class="s"&gt;git fetch origin $(Build.SourceBranchName) --prune&lt;/span&gt;
   &lt;span class="s"&gt;git merge origin/$(Build.SourceBranchName) -m "merge $(Build.SourceBranchName) to $MAINBRANCHNAME" --no-ff --allow-unrelated-histories&lt;/span&gt;



   &lt;span class="s"&gt;git push origin $MAINBRANCHNAME&lt;/span&gt;
   &lt;span class="s"&gt;git push origin --tags&lt;/span&gt;

  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Bash&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Script'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you are using a classic pipeline, add a Bash task.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ddEyB0v2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AdKCLnmrNT6qGFCkxWOYD6A.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ddEyB0v2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AdKCLnmrNT6qGFCkxWOYD6A.png" alt="" width="700" height="339"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And add the script below as an inline script.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git config &lt;span class="nt"&gt;--global&lt;/span&gt; user.email &lt;span class="s2"&gt;"azuredevops@microsoft.com"&lt;/span&gt;
git config &lt;span class="nt"&gt;--global&lt;/span&gt; user.name &lt;span class="s2"&gt;"Azure DevOps"&lt;/span&gt;

&lt;span class="nv"&gt;REPO&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;System.TeamFoundationCollectionUri&lt;span class="si"&gt;)$(&lt;/span&gt;System.TeamProject&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt;/_git/&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;Build.Repository.Name&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
&lt;span class="nv"&gt;EXTRAHEADER&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"Authorization: Bearer &lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;System.AccessToken&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
git &lt;span class="nt"&gt;-c&lt;/span&gt; http.extraheader&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$EXTRAHEADER&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; clone &lt;span class="nv"&gt;$REPO&lt;/span&gt; 
&lt;span class="nb"&gt;cd&lt;/span&gt; &lt;span class="si"&gt;$(&lt;/span&gt;Build.Repository.Name&lt;span class="si"&gt;)&lt;/span&gt;

&lt;span class="nb"&gt;mkdir&lt;/span&gt; &lt;span class="s1"&gt;'qa1-ause-asy-01'&lt;/span&gt;
&lt;span class="nb"&gt;cd&lt;/span&gt; &lt;span class="s1"&gt;'qa1-ause-asy-01'&lt;/span&gt;

&lt;span class="nb"&gt;cp&lt;/span&gt; &lt;span class="s1"&gt;'$(System.DefaultWorkingDirectory)/_Synapse-CI-pipeline/drop/ARM/TemplateForWorkspace.json'&lt;/span&gt; &lt;span class="nb"&gt;.&lt;/span&gt;
&lt;span class="nb"&gt;cp&lt;/span&gt; &lt;span class="s1"&gt;'$(System.DefaultWorkingDirectory)/_Synapse-CI-pipeline/drop/ARM/TemplateParametersForWorkspace.json'&lt;/span&gt; &lt;span class="nb"&gt;.&lt;/span&gt;

&lt;span class="nb"&gt;cd&lt;/span&gt; ..

git add qa1-ause-asy-01/TemplateForWorkspace.json
git add qa1-ause-asy-01/TemplateParametersForWorkspace.json


&lt;span class="nv"&gt;MAINBRANCHNAME&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;main
git config http.&lt;span class="nv"&gt;$REPO&lt;/span&gt;.extraHeader &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$EXTRAHEADER&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
git commit &lt;span class="nt"&gt;-a&lt;/span&gt; &lt;span class="nt"&gt;-m&lt;/span&gt; &lt;span class="s2"&gt;"added QA json updated files"&lt;/span&gt;

&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="nt"&gt;--&lt;/span&gt; Merge &lt;span class="si"&gt;$(&lt;/span&gt;Build.SourceBranchName&lt;span class="si"&gt;)&lt;/span&gt; to &lt;span class="nv"&gt;$MAINBRANCHNAME&lt;/span&gt; &lt;span class="nt"&gt;--&lt;/span&gt;
git fetch origin &lt;span class="si"&gt;$(&lt;/span&gt;Build.SourceBranchName&lt;span class="si"&gt;)&lt;/span&gt; &lt;span class="nt"&gt;--prune&lt;/span&gt;
git merge origin/&lt;span class="si"&gt;$(&lt;/span&gt;Build.SourceBranchName&lt;span class="si"&gt;)&lt;/span&gt; &lt;span class="nt"&gt;-m&lt;/span&gt; &lt;span class="s2"&gt;"merge &lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;Build.SourceBranchName&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt; to &lt;/span&gt;&lt;span class="nv"&gt;$MAINBRANCHNAME&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="nt"&gt;--no-ff&lt;/span&gt; &lt;span class="nt"&gt;--allow-unrelated-histories&lt;/span&gt;



git push origin &lt;span class="nv"&gt;$MAINBRANCHNAME&lt;/span&gt;
git push origin &lt;span class="nt"&gt;--tags&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can update the commit message according to your needs.&lt;/p&gt;

&lt;p&gt;In this script, the following actions are being performed:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; The &lt;code&gt;mkdir&lt;/code&gt; command is creating a new directory called &lt;code&gt;QA-env&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt; The &lt;code&gt;cd&lt;/code&gt; command is changing the current working directory to &lt;code&gt;QA-env&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt; The &lt;code&gt;cp&lt;/code&gt; command is copying the files &lt;code&gt;TemplateForWorkspace.json&lt;/code&gt; and &lt;code&gt;TemplateParametersForWorkspace.json&lt;/code&gt; from the &lt;code&gt;ARM&lt;/code&gt; subdirectory of the default working directory to the current working directory (&lt;code&gt;QA-env&lt;/code&gt;).&lt;/li&gt;
&lt;li&gt; The &lt;code&gt;cd ..&lt;/code&gt; command is changing the current working directory back to the parent directory.&lt;/li&gt;
&lt;li&gt; The &lt;code&gt;git add&lt;/code&gt; command is adding the files &lt;code&gt;TemplateForWorkspace.json&lt;/code&gt; and &lt;code&gt;TemplateParametersForWorkspace.json&lt;/code&gt; to the staging area in Git. This means that these files will be included in the next commit.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Once done, you will see the changes merged in the repo.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zgT5ECDX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AuFy8Cz23_1TFnnbKBoOCEQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zgT5ECDX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AuFy8Cz23_1TFnnbKBoOCEQ.png" alt="" width="700" height="102"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;That is it!&lt;/p&gt;

&lt;p&gt;Try the steps and get back to me if any help is required!&lt;/p&gt;

&lt;p&gt;Credit :&lt;/p&gt;

&lt;p&gt;&lt;a href="https://learn.microsoft.com/en-us/azure/devops/pipelines/release/?view=azure-devops"&gt;https://learn.microsoft.com/en-us/azure/devops/pipelines/release/?view=azure-devops&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/publish-build-artifacts-v1?view=azure-pipelines"&gt;https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/publish-build-artifacts-v1?view=azure-pipelines&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://stackoverflow.com/questions/52837980/how-to-allow-scripts-to-access-oauth-token-from-yaml-builds"&gt;https://stackoverflow.com/questions/52837980/how-to-allow-scripts-to-access-oauth-token-from-yaml-builds&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://chuvash.eu/2021/04/09/git-merge-develop-to-main-in-an-azure-devops-release/"&gt;https://chuvash.eu/2021/04/09/git-merge-develop-to-main-in-an-azure-devops-release/&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Read my blogs:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://kunaldaskd.medium.com"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FrhCZTrV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/TgYYM9w.png" alt="Medium Logo" width="62" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.to/kunaldas"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rhp9uSXy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/bp3qHWb.png" alt="Dev.to Logo" width="35" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://kunaldas.hashnode.dev"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_veP3uic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/iwZwo2S.png" alt="Hashnode Logo" width="204" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Connect with Me:
&lt;/h2&gt;

&lt;p&gt;
&lt;a href="https://twitter.com/kunald_official"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--o4A7Waxi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/VaorXDP.png" alt="kunald_official" width="43" height="35"&gt;&lt;/a&gt;
&lt;a href="https://linkedin.com/in/kunaldaskd"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U70a7xDy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/ktIHVxm.png" alt="kunaldaskd" width="35" height="35"&gt;&lt;/a&gt;
&lt;/p&gt;

</description>
      <category>azure</category>
      <category>cloud</category>
      <category>devops</category>
    </item>
    <item>
      <title>The trick to not using a self-hosted agent in Azure DevOps</title>
      <dc:creator>Kunal Das</dc:creator>
      <pubDate>Sun, 21 Jan 2024 12:36:01 +0000</pubDate>
      <link>https://dev.to/kunaldas/the-trick-to-not-using-a-self-hosted-agent-in-azure-devops-24pa</link>
      <guid>https://dev.to/kunaldas/the-trick-to-not-using-a-self-hosted-agent-in-azure-devops-24pa</guid>
      <description>&lt;h1&gt;
  
  
  The trick to not using a self-hosted agent in Azure DevOps
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tIEuJl5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fill:44:44/1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" alt="Kunal Das, Author" width="44" height="44"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Reach at : &lt;a href="https://heylink.me/kunaldas"&gt;https://heylink.me/kunaldas&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WnfAYKNd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2AabFuG6O6narmeqHG.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WnfAYKNd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/0%2AabFuG6O6narmeqHG.png" alt="" width="700" height="201"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When you design the build and release pipeline for any software project, one of the first things that comes to mind is where the whole operation should run.&lt;br&gt;&lt;br&gt;
There are two ways of doing it:&lt;br&gt;&lt;br&gt;
The &lt;strong&gt;free&lt;/strong&gt; and easy way is to use a Microsoft-hosted build agent,&lt;br&gt;&lt;br&gt;
but in some cases we may need to use a self-hosted build agent.&lt;br&gt;&lt;br&gt;
Let’s see what Microsoft has to say about self-hosted build agents.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“&lt;em&gt;An agent that you set up and manage on your own to run jobs is a self-hosted agent. You can use self-hosted agents in Azure Pipelines or Azure DevOps Server, formerly named Team Foundation Server (TFS). Self-hosted agents give you more control to install dependent software needed for your builds and deployments. Also, machine-level caches and configuration persist from run to run, which can boost speed.&lt;/em&gt;”&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;So, self-hosted build agents do provide some benefits, but in my experience some of these benefits can be achieved with Microsoft-hosted build agents as well.&lt;/p&gt;

&lt;p&gt;For instance, if you have a relatively small build/release pipeline, there is no need to cache dependencies; it may give you a faster build time, but do you really need it?&lt;/p&gt;

&lt;p&gt;Now let’s see how Microsoft describes a Microsoft-hosted agent:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“If your pipelines are in Azure Pipelines, then you’ve got a convenient option to run your jobs using a Microsoft-hosted agent. With Microsoft-hosted agents, maintenance and upgrades are taken care of for you. Each time you run a pipeline, you get a fresh virtual machine for each job in the pipeline. The virtual machine is discarded after one job (which means any change that a job makes to the virtual machine file system, such as checking out code, will be unavailable to the next job). Microsoft-hosted agents can run jobs &lt;a href="https://learn.microsoft.com/en-us/azure/devops/pipelines/process/phases?view=azure-devops"&gt;directly on the VM&lt;/a&gt; or &lt;a href="https://learn.microsoft.com/en-us/azure/devops/pipelines/process/container-phases?view=azure-devops"&gt;in a container&lt;/a&gt;.”&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;So, as you can see, these agents are ephemeral by nature, so we cannot use them for anything permanent, right?&lt;br&gt;&lt;br&gt;
Well, yes, but there is another way to look at it.&lt;/p&gt;

&lt;p&gt;In this article, I have a solution for a different problem: using a self-hosted build agent just to overcome firewall restrictions set on various Azure resources.&lt;br&gt;&lt;br&gt;
But first, let’s understand the problem.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--h5A4rmt5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Arwz0eIvDmzvio7MFr43h3A.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--h5A4rmt5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Arwz0eIvDmzvio7MFr43h3A.png" alt="" width="700" height="486"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Firewall restriction in a Synapse workspace&lt;/p&gt;

&lt;p&gt;In the image, you can see that public network access can be disabled. In that case, you will not be able to connect to this resource from the DevOps pipeline.&lt;br&gt;&lt;br&gt;
In some places I have seen people using a self-hosted agent just for this purpose, but this is actually not required.&lt;/p&gt;

&lt;p&gt;The process here is to add the IP address as a rule in the firewall.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--agIeluFo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Ag2IzHucCpL6hTXDkObsuNQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--agIeluFo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2Ag2IzHucCpL6hTXDkObsuNQ.png" alt="" width="700" height="109"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And you can do just that from the CI/CD pipeline itself.&lt;br&gt;&lt;br&gt;
Let’s look at how to achieve that.&lt;/p&gt;

&lt;p&gt;I am taking Synapse as the example.&lt;br&gt;&lt;br&gt;
Just before the deployment task, add an Azure CLI task to the pipeline.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UVT299Ra--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AYgKInr79mSDS2UZ7BkKwCw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UVT299Ra--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://miro.medium.com/v2/resize:fit:700/1%2AYgKInr79mSDS2UZ7BkKwCw.png" alt="" width="700" height="527"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then write the PowerShell script to add the build agent IP to the Synapse workspace firewall. Needless to say, you need to give the service connection the proper role to perform this action.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;az synapse workspace firewall-rule create &lt;span class="nt"&gt;--name&lt;/span&gt; devops-build-agent-ip&lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--workspace-name&lt;/span&gt; synapse-prod &lt;span class="nt"&gt;--resource-group&lt;/span&gt; synapse-prod-rg&lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--start-ip-address&lt;/span&gt;  &lt;span class="o"&gt;(&lt;/span&gt;Invoke-RestMethod http://ipinfo.io/json | Select &lt;span class="nt"&gt;-exp&lt;/span&gt; ip&lt;span class="o"&gt;)&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--end-ip-address&lt;/span&gt;  &lt;span class="o"&gt;(&lt;/span&gt;Invoke-RestMethod http://ipinfo.io/json | Select &lt;span class="nt"&gt;-exp&lt;/span&gt; ip&lt;span class="o"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you like to use Yaml then here is the code!&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AzureCLI@2&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Add&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Build&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;agent&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Ip&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;to&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;firewall'&lt;/span&gt;
  &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;azureSubscription&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;sp-dev'&lt;/span&gt;
    &lt;span class="na"&gt;scriptType&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ps&lt;/span&gt;
    &lt;span class="na"&gt;scriptLocation&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;inlineScript&lt;/span&gt;
    &lt;span class="na"&gt;inlineScript&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;az&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;synapse&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;workspace&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;firewall-rule&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;create&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;--name&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;devops-build-agent-ip&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;--workspace-name&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;qa1-ause-asy-01&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;--resource-group&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;QA1-AUSE-ASY-ARG-01&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;--start-ip-address&lt;/span&gt;&lt;span class="nv"&gt;  &lt;/span&gt;&lt;span class="s"&gt;(Invoke-RestMethod&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;http://ipinfo.io/json&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;|&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Select&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-exp&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;ip)&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;--end-ip-address&lt;/span&gt;&lt;span class="nv"&gt;  &lt;/span&gt;&lt;span class="s"&gt;(Invoke-RestMethod&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;http://ipinfo.io/json&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;|&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span 
class="s"&gt;Select&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-exp&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;ip)'&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Just as we can add the firewall rule, we can also delete it once the deployment is done. For that, use the command below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;az synapse workspace firewall-rule create &lt;span class="nt"&gt;--name&lt;/span&gt; devops-build-agent-ip&lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--workspace-name&lt;/span&gt; synapse-prod &lt;span class="nt"&gt;--resource-group&lt;/span&gt; synapse-prod-rg &lt;span class="nt"&gt;--yes&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And that is it!&lt;br&gt;&lt;br&gt;
When the pipeline runs, it fetches the build agent's public IP, adds it to the workspace firewall, performs the deployment, and then deletes the rule again.&lt;/p&gt;
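&lt;p&gt;As a rough sketch (the service connection name is hypothetical, not from the original pipeline), the whole flow can be expressed as AzureCLI steps in azure-pipelines.yml, with the deployment happening between them:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;steps:
  # 1. Allow the build agent's current public IP through the workspace firewall
  - task: AzureCLI@2
    displayName: Add firewall rule
    inputs:
      azureSubscription: my-service-connection   # hypothetical
      scriptType: ps
      scriptLocation: inlineScript
      inlineScript: |
        $ip = (Invoke-RestMethod http://ipinfo.io/json).ip
        az synapse workspace firewall-rule create --name devops-build-agent-ip `
          --workspace-name qa1-ause-asy-01 --resource-group QA1-AUSE-ASY-ARG-01 `
          --start-ip-address $ip --end-ip-address $ip

  # 2. ... run the Synapse deployment steps here ...

  # 3. Remove the temporary rule again
  - task: AzureCLI@2
    displayName: Delete firewall rule
    inputs:
      azureSubscription: my-service-connection   # hypothetical
      scriptType: ps
      scriptLocation: inlineScript
      inlineScript: |
        az synapse workspace firewall-rule delete --name devops-build-agent-ip `
          --workspace-name qa1-ause-asy-01 --resource-group QA1-AUSE-ASY-ARG-01 --yes
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;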

&lt;p&gt;Feel free to reach out in case of any issues!&lt;br&gt;&lt;br&gt;
Adios!&lt;/p&gt;
&lt;h2&gt;
  
  
  Read my blogs:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://kunaldaskd.medium.com"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FrhCZTrV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/TgYYM9w.png" alt="Medium Logo" width="62" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.to/kunaldas"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rhp9uSXy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/bp3qHWb.png" alt="Dev.to Logo" width="35" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://kunaldas.hashnode.dev"&gt;&lt;br&gt;
    &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_veP3uic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/iwZwo2S.png" alt="Hashnode Logo" width="204" height="35"&gt;&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Connect with Me:
&lt;/h2&gt;

&lt;p&gt;
&lt;a href="https://twitter.com/kunald_official"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--o4A7Waxi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/VaorXDP.png" alt="kunald_official" width="43" height="35"&gt;&lt;/a&gt;
&lt;a href="https://linkedin.com/in/kunaldaskd"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U70a7xDy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.imgur.com/ktIHVxm.png" alt="kunaldaskd" width="35" height="35"&gt;&lt;/a&gt;
&lt;/p&gt;

</description>
      <category>azure</category>
      <category>cicd</category>
      <category>devops</category>
    </item>
    <item>
      <title>Deploying Azure Data Factory, Azure Databricks, Azure Data Lake Storage &amp; MySQL DB using Terraform</title>
      <dc:creator>Kunal Das</dc:creator>
      <pubDate>Thu, 18 Jan 2024 18:58:09 +0000</pubDate>
      <link>https://dev.to/kunaldas/deploying-azure-data-factory-azure-data-bricks-azure-data-lake-storage-mysql-db-using-terraform-4fo7</link>
      <guid>https://dev.to/kunaldas/deploying-azure-data-factory-azure-data-bricks-azure-data-lake-storage-mysql-db-using-terraform-4fo7</guid>
      <description>&lt;h1&gt;
  
  
  Deploying Azure Data Factory, Azure Databricks, Azure Data Lake Storage &amp;amp; MySQL DB using Terraform
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fv2%2Fresize%3Afill%3A44%3A44%2F1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fv2%2Fresize%3Afill%3A44%3A44%2F1%2AkfaefcgQPHrPsNobjuiiSg.jpeg" alt="Kunal Das, Author"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Reach me at: &lt;a href="https://heylink.me/kunaldas" rel="noopener noreferrer"&gt;https://heylink.me/kunaldas&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here I am going to share Terraform code to deploy ADF, ADLS, ADB, and the other necessary supporting resources.&lt;/p&gt;

&lt;h2&gt;
  
  
  Table of Contents
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
Deploying Azure Data Factory, Azure Databricks, Azure Data Lake Storage &amp;amp; MySQL DB using Terraform

&lt;ul&gt;
&lt;li&gt;Table of Contents&lt;/li&gt;
&lt;li&gt;Resource Group&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Azure Data Factory:&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Azure Data Bricks:&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Virtual network:&lt;/li&gt;
&lt;li&gt;Network Security group for ADB:&lt;/li&gt;
&lt;li&gt;Public subnet for Databricks:&lt;/li&gt;
&lt;li&gt;Private subnet for Databricks:&lt;/li&gt;
&lt;li&gt;Network security group association for public subnet:&lt;/li&gt;
&lt;li&gt;Network security group association for private subnet:&lt;/li&gt;
&lt;li&gt;Data Lake storage account:&lt;/li&gt;
&lt;li&gt;Storage account container:&lt;/li&gt;
&lt;li&gt;SQL Admin password:&lt;/li&gt;
&lt;li&gt;SQL server :&lt;/li&gt;
&lt;li&gt;SQL Database:&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;Let’s start with a resource group where we will store all the resources required.&lt;/p&gt;

&lt;h2&gt;
  
  
  Resource Group
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight terraform"&gt;&lt;code&gt;&lt;span class="k"&gt;data&lt;/span&gt; &lt;span class="s2"&gt;"azurerm_client_config"&lt;/span&gt; &lt;span class="s2"&gt;"Current"&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;
&lt;span class="k"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"azurerm_resource_group"&lt;/span&gt; &lt;span class="s2"&gt;"RG"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;name&lt;/span&gt;     &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="kd"&gt;var&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ResourceGroup&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;Name&lt;/span&gt;
  &lt;span class="nx"&gt;location&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="kd"&gt;var&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ResourceGroup&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;Location&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Note that the RG name and location from this resource group are referenced in the resource declarations that follow.&lt;/em&gt;&lt;/p&gt;
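&lt;p&gt;The &lt;code&gt;var.ResourceGroup&lt;/code&gt; object used above is assumed to be declared roughly as follows; the default values are placeholders, not from the original code:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight terraform"&gt;&lt;code&gt;variable "ResourceGroup" {
  type = object({
    Name     = string
    Location = string
  })
  default = {
    Name     = "my-resource-group" # placeholder
    Location = "australiaeast"     # placeholder
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;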

&lt;h2&gt;
  
  
  &lt;strong&gt;Azure Data Factory:&lt;/strong&gt;
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight terraform"&gt;&lt;code&gt;&lt;span class="k"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"azurerm_data_factory"&lt;/span&gt; &lt;span class="s2"&gt;"DataFactory"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;name&lt;/span&gt;                &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"DataFactory Name"&lt;/span&gt;
  &lt;span class="nx"&gt;location&lt;/span&gt;            &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;azurerm_resource_group&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;RG&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;location&lt;/span&gt;
  &lt;span class="nx"&gt;resource_group_name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;azurerm_resource_group&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;RG&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;

  &lt;span class="nx"&gt;identity&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;type&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"SystemAssigned"&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  &lt;strong&gt;Azure Data Bricks:&lt;/strong&gt;
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight terraform"&gt;&lt;code&gt;&lt;span class="k"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"azurerm_databricks_workspace"&lt;/span&gt; &lt;span class="s2"&gt;"Databricks"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;location&lt;/span&gt;                      &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;azurerm_resource_group&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;RG&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;location&lt;/span&gt;
  &lt;span class="nx"&gt;name&lt;/span&gt;                          &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Databricks Name"&lt;/span&gt;
  &lt;span class="nx"&gt;resource_group_name&lt;/span&gt;           &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;azurerm_resource_group&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;RG&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;
  &lt;span class="nx"&gt;managed_resource_group_name&lt;/span&gt;   &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Databricks Managed Resource Group"&lt;/span&gt;
  &lt;span class="nx"&gt;sku&lt;/span&gt;                           &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Databricks Sku"&lt;/span&gt;

  &lt;span class="nx"&gt;custom_parameters&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;no_public_ip&lt;/span&gt;        &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
    &lt;span class="nx"&gt;virtual_network_id&lt;/span&gt;  &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;azurerm_virtual_network&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;DatabricksVnet&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;
    &lt;span class="nx"&gt;public_subnet_name&lt;/span&gt;  &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;azurerm_subnet&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;DatabricksSubnetPublic&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;
    &lt;span class="nx"&gt;private_subnet_name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;azurerm_subnet&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;DatabricksSubnetPrivate&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;

  &lt;span class="nx"&gt;depends_on&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="nx"&gt;azurerm_subnet_network_security_group_association&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;public&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="nx"&gt;azurerm_subnet_network_security_group_association&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;private&lt;/span&gt;
  &lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Virtual network:
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight terraform"&gt;&lt;code&gt;&lt;span class="k"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"azurerm_virtual_network"&lt;/span&gt; &lt;span class="s2"&gt;"DatabricksVnet"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;name&lt;/span&gt;                     &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"VNET NAME"&lt;/span&gt;
  &lt;span class="nx"&gt;resource_group_name&lt;/span&gt;      &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;azurerm_resource_group&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;RG&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;
  &lt;span class="nx"&gt;location&lt;/span&gt;                 &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;azurerm_resource_group&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;RG&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;location&lt;/span&gt;
  &lt;span class="nx"&gt;address_space&lt;/span&gt;            &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"VNET CIDR"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Network Security group for ADB:
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "azurerm_network_security_group" "DatabricksNSG" {
  name                     = "VirtualNetwork NSG Name"
  resource_group_name      = azurerm_resource_group.RG.name
  location                 = azurerm_resource_group.RG.location
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Public subnet for Databricks:
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "azurerm_subnet" "DatabricksSubnetPublic" {
  name                 = "VirtualNetwork PublicSubnet Name"
  resource_group_name  = azurerm_resource_group.RG.name
  virtual_network_name = azurerm_virtual_network.DatabricksVnet.name
  address_prefixes     = ["VirtualNetwork PublicSubnet CIDR"]
  service_endpoints    = ["Microsoft.Storage"]

  delegation {
    name = "Microsoft.Databricks.workspaces"
    service_delegation {
      name = "Microsoft.Databricks/workspaces"
      actions = [
        "Microsoft.Network/virtualNetworks/subnets/join/action",
        "Microsoft.Network/virtualNetworks/subnets/prepareNetworkPolicies/action",
        "Microsoft.Network/virtualNetworks/subnets/unprepareNetworkPolicies/action"]
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Private subnet for Databricks:
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "azurerm_subnet" "DatabricksSubnetPrivate" {
  name                 = "VirtualNetwork PrivateSubnet Name"
  resource_group_name  = azurerm_resource_group.RG.name
  virtual_network_name = azurerm_virtual_network.DatabricksVnet.name
  address_prefixes     = ["VirtualNetwork PrivateSubnet CIDR"]

  delegation {
    name = "Microsoft.Databricks.workspaces"
    service_delegation {
      name = "Microsoft.Databricks/workspaces"
      actions = [
        "Microsoft.Network/virtualNetworks/subnets/join/action",
        "Microsoft.Network/virtualNetworks/subnets/prepareNetworkPolicies/action",
        "Microsoft.Network/virtualNetworks/subnets/unprepareNetworkPolicies/action"]
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Network security group association for public subnet:
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "azurerm_subnet_network_security_group_association" "public" {
  subnet_id                 = azurerm_subnet.DatabricksSubnetPublic.id
  network_security_group_id = azurerm_network_security_group.DatabricksNSG.id
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Network security group association for private subnet:
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "azurerm_subnet_network_security_group_association" "private" {
  subnet_id                 = azurerm_subnet.DatabricksSubnetPrivate.id
  network_security_group_id = azurerm_network_security_group.DatabricksNSG.id
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now that all the associated network configuration is done, let’s move on to creating the Data Lake storage account.&lt;/p&gt;

&lt;h2&gt;
  
  
  Data Lake storage account:
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "azurerm_storage_account" "DataLake" {
  name                     = "DataLake Name"
  resource_group_name      = azurerm_resource_group.RG.name
  location                 = azurerm_resource_group.RG.location
  account_tier             = "DataLake Tier"
  account_replication_type = "DataLake Replication"
  is_hns_enabled           = true
  min_tls_version          = "DataLake TLSVersion"

  network_rules {
    # bypass                     = "AzureServices"
    default_action             = "Allow"    
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Storage account container:
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "azurerm_storage_container" "DataLakeContainer" {  
  # for_each needs a set or map, so wrap the placeholder container names in toset()
  for_each              = toset(["DataLake Container"])
  name                  = each.key
  storage_account_name  = azurerm_storage_account.DataLake.name
  container_access_type = "private"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, let us create the SQL-related resources.&lt;/p&gt;

&lt;h2&gt;
  
  
  SQL Admin password:
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "random_string" "SQLAdminPassword" {
  length      = 5
  special     = true
  min_upper   = 2
  min_numeric = 2
  min_special = 2
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
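&lt;p&gt;A side note: since this value is used as a password, the &lt;code&gt;random_password&lt;/code&gt; resource from the same provider is usually a better fit, as it takes the same arguments but marks the result as sensitive in plans and state output. A minimal sketch:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight terraform"&gt;&lt;code&gt;resource "random_password" "SQLAdminPassword" {
  length      = 16
  special     = true
  min_upper   = 2
  min_numeric = 2
  min_special = 2
}

# reference it as random_password.SQLAdminPassword.result
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;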



&lt;h2&gt;
  
  
  SQL server :
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "azurerm_mssql_server" "SQLServer" {
  name                         = "SQLServer Name"
  resource_group_name          = azurerm_resource_group.RG.name
  location                     = azurerm_resource_group.RG.location
  version                      = "SQLServer Version"
  administrator_login          = "SQLServer AdministratorLogin"
  administrator_login_password = random_string.SQLAdminPassword.result
  minimum_tls_version          = "SQLServer TLS Version"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  SQL Database:
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "azurerm_mssql_database" "SQLDatabase" {
  name           = "SQLDatabase Name"
  server_id      = azurerm_mssql_server.SQLServer.id
  collation      = "SQL_collation"
  max_size_gb    = "SQLDatabase MaxSizeGB"
  sku_name       = "SQLDatabase SKU"
  zone_redundant = "SQLDatabase ZoneRedundant"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These are complete, part-by-part snippets to create a running ADB and ADF system. Feel free to reach out in case any clarification is required!&lt;/p&gt;

&lt;h2&gt;
  
  
  Read my blogs:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://kunaldaskd.medium.com" rel="noopener noreferrer"&gt;&lt;br&gt;
    &lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi.imgur.com%2FTgYYM9w.png" alt="Medium Logo"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.to/kunaldas"&gt;&lt;br&gt;
    &lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi.imgur.com%2Fbp3qHWb.png" alt="Dev.to Logo"&gt;&lt;br&gt;
&lt;/a&gt;&lt;br&gt;
&lt;a href="https://kunaldas.hashnode.dev" rel="noopener noreferrer"&gt;&lt;br&gt;
    &lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi.imgur.com%2FiwZwo2S.png" alt="Hashnode Logo"&gt;&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Connect with Me:
&lt;/h2&gt;

&lt;p&gt;
&lt;a href="https://twitter.com/kunald_official" rel="noopener noreferrer"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi.imgur.com%2FVaorXDP.png" alt="kunald_official"&gt;&lt;/a&gt;
&lt;a href="https://linkedin.com/in/kunaldaskd" rel="noopener noreferrer"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi.imgur.com%2FktIHVxm.png" alt="kunaldaskd"&gt;&lt;/a&gt;
&lt;/p&gt;

</description>
      <category>azure</category>
      <category>terraform</category>
      <category>devops</category>
    </item>
  </channel>
</rss>
