<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: lotanna obianefo</title>
    <description>The latest articles on DEV Community by lotanna obianefo (@lotanna_obianefo).</description>
    <link>https://dev.to/lotanna_obianefo</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2948194%2Ff50ef024-0b79-4753-bbd2-086f89d51ebc.png</url>
      <title>DEV Community: lotanna obianefo</title>
      <link>https://dev.to/lotanna_obianefo</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/lotanna_obianefo"/>
    <language>en</language>
    <item>
      <title>Building a Complete CI/CD Pipeline with Node.js, Docker, and K8s.</title>
      <dc:creator>lotanna obianefo</dc:creator>
      <pubDate>Tue, 03 Mar 2026 16:53:45 +0000</pubDate>
      <link>https://dev.to/lotanna_obianefo/building-a-complete-cicd-pipeline-with-nodejs-docker-and-k8s-4o44</link>
      <guid>https://dev.to/lotanna_obianefo/building-a-complete-cicd-pipeline-with-nodejs-docker-and-k8s-4o44</guid>
      <description>&lt;p&gt;Modern software delivery demands speed, reliability, and scalability. Continuous Integration and Continuous Deployment (CI/CD) pipelines enable teams to ship code faster while maintaining high quality and operational stability. When combined with Node.js, Docker, and Kubernetes (K8s), CI/CD becomes a powerful system for building cloud-native, production-ready applications.&lt;/p&gt;

&lt;p&gt;This article provides a complete technical walkthrough of designing and implementing a CI/CD pipeline.&lt;/p&gt;

&lt;p&gt;Before starting, you need to install these tools on your machine.&lt;/p&gt;

&lt;p&gt;Required Downloads:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Node.js&lt;/strong&gt; (v18 or higher) - Current LTS: v20.x&lt;br&gt;
Download from: &lt;a href="https://nodejs.org/" rel="noopener noreferrer"&gt;https://nodejs.org/&lt;/a&gt;&lt;br&gt;
Choose the LTS version (20.x as of 2025)&lt;br&gt;
Verify installation: &lt;strong&gt;node --version&lt;/strong&gt; and &lt;strong&gt;npm --version&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Git&lt;/strong&gt; - Latest stable version&lt;br&gt;
Download from: &lt;a href="https://git-scm.com/downloads" rel="noopener noreferrer"&gt;https://git-scm.com/downloads&lt;/a&gt;&lt;br&gt;
Choose your operating system version&lt;br&gt;
Verify installation: &lt;strong&gt;git --version&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Docker Desktop&lt;/strong&gt; - Latest version&lt;br&gt;
Download from: &lt;a href="https://www.docker.com/products/docker-desktop/" rel="noopener noreferrer"&gt;https://www.docker.com/products/docker-desktop/&lt;/a&gt;&lt;br&gt;
Install and start Docker Desktop&lt;br&gt;
Verify installation: &lt;strong&gt;docker --version&lt;/strong&gt; and &lt;strong&gt;docker compose version&lt;/strong&gt; (Compose v2 ships with Docker Desktop)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;GitHub Account&lt;/strong&gt;&lt;br&gt;
Sign up at: &lt;a href="https://github.com" rel="noopener noreferrer"&gt;https://github.com&lt;/a&gt;&lt;br&gt;
You'll need this for hosting your code and CI/CD pipeline&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;VS Code&lt;/strong&gt;&lt;br&gt;
Download: &lt;a href="https://code.visualstudio.com/" rel="noopener noreferrer"&gt;https://code.visualstudio.com/&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Verify Everything is Installed&lt;/strong&gt;&lt;/p&gt;
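The loop below offers a quick way to confirm each tool is on your PATH before moving on. It is a minimal POSIX-shell sketch: the version numbers printed will differ per machine, and `check_tool` is just an illustrative helper name.

```shell
# check_tool prints a tool's version, or a warning when it is not installed
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    printf '%s: ' "$1"
    "$1" --version
  else
    echo "$1: NOT FOUND - install it before continuing"
  fi
}

for tool in node npm git docker; do
  check_tool "$tool"
done
```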

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fevt00cs722v94g0jejbk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fevt00cs722v94g0jejbk.png" alt="ygtrty" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1&lt;/strong&gt;: &lt;strong&gt;Set Up Git for Version Control&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This step configures Git on your machine so commits are attributed to you, and sets the default branch name for new repositories.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;One-time Git Configuration&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;     git config --global user.name "Your Name"
     git config --global user.email "you@example.com"
     git config --global init.defaultBranch main
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3wyx720h7b837asgreb0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3wyx720h7b837asgreb0.png" alt="trreyu" width="800" height="365"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To display all Git configuration settings currently applied in your environment, use the command &lt;strong&gt;git config --list&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fogt7pfhxk3u7gevcepps.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fogt7pfhxk3u7gevcepps.png" alt="jdjsjcsds" width="800" height="363"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create and Initialize Project&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To create and initialize the project, create a new directory, navigate into it, and initialize a local Git repository with the commands below.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;     mkdir my-devops-project
     cd my-devops-project
     git init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flxkk9863l2dgfbguvsj7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flxkk9863l2dgfbguvsj7.png" alt="kjgtft" width="800" height="385"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2&lt;/strong&gt;: &lt;strong&gt;Build a Node.js Web App&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This step creates a web application using Node.js that serves web pages and API endpoints.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Initialize Node.js Project&lt;/strong&gt;&lt;br&gt;
This command creates a package.json file that describes your project and manages dependencies.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      npm init -y
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;em&gt;The -y flag skips the interactive questionnaire and generates a package.json populated with default values&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz5ih33v2ibxc7urmlx4y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz5ih33v2ibxc7urmlx4y.png" alt="gfytftut" width="800" height="391"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Update package.json&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This customizes the package.json with proper scripts and metadata for your DevOps project.&lt;/p&gt;

&lt;p&gt;Create/edit &lt;strong&gt;package.json&lt;/strong&gt;:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      {
        "name": "devops-project01",
        "version": "1.0.0",
        "description": "DevOps learning project with Node.js",
        "main": "app.js",
        "scripts": {
          "start": "node app.js",
          "test": "jest",
          "dev": "node app.js",
          "lint": "eslint ."
        },
        "keywords": ["devops", "nodejs", "docker"],
        "author": "lotanna",
        "license": "MIT",
        "engines": {
          "node": "&amp;gt;=18.0.0"
        },
        "devDependencies": {
          "jest": "^29.7.0",
          "eslint": "^8.57.0",
          "supertest": "^7.1.4"
        }
      }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmxn2wcffbtkv8s9iefbv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmxn2wcffbtkv8s9iefbv.png" alt="uhgtfft" width="800" height="477"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create Application File&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To create the application file, use the &lt;strong&gt;touch&lt;/strong&gt; command to generate &lt;strong&gt;app.js&lt;/strong&gt;, then paste in the code below, which provides the following functionality:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;HTTP server that listens on port 3000&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Serves different endpoints (/, /health, /info, /metrics)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Includes security headers and proper error handling&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Provides graceful shutdown capability&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Exports the server for testing&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Create&lt;/strong&gt; &lt;strong&gt;app.js&lt;/strong&gt;:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      // core modules
      const http = require("http");
      const url = require("url");

      // environment configuration
      const PORT = process.env.PORT || 3000;
      const ENVIRONMENT = process.env.NODE_ENV || "development";

      let requestCount = 0;

      // helper: send JSON responses
      function sendJSON(res, statusCode, data) {
        res.statusCode = statusCode;
        res.setHeader("Content-Type", "application/json");
        res.end(JSON.stringify(data, null, 2));
      }

      // helper: send HTML responses
      function sendHTML(res, statusCode, content) {
        res.statusCode = statusCode;
        res.setHeader("Content-Type", "text/html");
        res.end(content);
      }

      // helper: send Prometheus metrics
      function sendMetrics(res) {
        const mem = process.memoryUsage();
        const metrics = `
      # HELP http_requests_total Total HTTP requests
      # TYPE http_requests_total counter
      http_requests_total ${requestCount}

      # HELP app_uptime_seconds Application uptime in seconds
      # TYPE app_uptime_seconds gauge
      app_uptime_seconds ${process.uptime()}

      # HELP nodejs_memory_usage_bytes Node.js memory usage
      # TYPE nodejs_memory_usage_bytes gauge
      nodejs_memory_usage_bytes{type="rss"} ${mem.rss}
      nodejs_memory_usage_bytes{type="heapUsed"} ${mem.heapUsed}
      nodejs_memory_usage_bytes{type="heapTotal"} ${mem.heapTotal}
      nodejs_memory_usage_bytes{type="external"} ${mem.external}
      `;
        res.statusCode = 200;
        res.setHeader("Content-Type", "text/plain");
        res.end(metrics);
      }

      // main server
      const server = http.createServer((req, res) =&amp;gt; {
        requestCount++;
        const timestamp = new Date().toISOString();
        const { pathname } = url.parse(req.url, true);

        // logging
        console.log(
          `${timestamp} - ${req.method} ${pathname} - ${
            req.headers["user-agent"] || "Unknown"
          }`
        );

        // CORS headers
        res.setHeader("Access-Control-Allow-Origin", "*");
        res.setHeader("Access-Control-Allow-Methods", "GET, POST, PUT, DELETE");
        res.setHeader("Access-Control-Allow-Headers", "Content-Type");

        // security headers
        res.setHeader("X-Content-Type-Options", "nosniff");
        res.setHeader("X-Frame-Options", "DENY");
        res.setHeader("X-XSS-Protection", "1; mode=block");

        // route handling
        switch (pathname) {
          case "/":
            sendHTML(
              res,
              200,
              `
      &amp;lt;!DOCTYPE html&amp;gt;
      &amp;lt;html&amp;gt;
      &amp;lt;head&amp;gt;
        &amp;lt;title&amp;gt;DevOps Lab 2025&amp;lt;/title&amp;gt;
        &amp;lt;style&amp;gt;
          body { font-family: Arial, sans-serif; max-width: 800px; margin: 50px auto; padding: 20px; }
          .header { background: linear-gradient(135deg, #667eea 0%, #764ba2 100%); color: white; padding: 20px; border-radius: 8px; }
          .endpoint { background: #f8f9fa; padding: 15px; margin: 10px 0; border-radius: 5px; border-left: 4px solid #007bff; }
        &amp;lt;/style&amp;gt;
      &amp;lt;/head&amp;gt;
      &amp;lt;body&amp;gt;
        &amp;lt;div class="header"&amp;gt;
          &amp;lt;h1&amp;gt;DevOps Lab 2025&amp;lt;/h1&amp;gt;
          &amp;lt;p&amp;gt;Modern Node.js application with CI/CD pipeline&amp;lt;/p&amp;gt;
        &amp;lt;/div&amp;gt;
        &amp;lt;h2&amp;gt;Available Endpoints:&amp;lt;/h2&amp;gt;
        &amp;lt;div class="endpoint"&amp;gt;&amp;lt;strong&amp;gt;GET /&amp;lt;/strong&amp;gt; - This welcome page&amp;lt;/div&amp;gt;
        &amp;lt;div class="endpoint"&amp;gt;&amp;lt;strong&amp;gt;GET /health&amp;lt;/strong&amp;gt; - Health check (JSON)&amp;lt;/div&amp;gt;
        &amp;lt;div class="endpoint"&amp;gt;&amp;lt;strong&amp;gt;GET /info&amp;lt;/strong&amp;gt; - System information&amp;lt;/div&amp;gt;
        &amp;lt;div class="endpoint"&amp;gt;&amp;lt;strong&amp;gt;GET /metrics&amp;lt;/strong&amp;gt; - Prometheus metrics&amp;lt;/div&amp;gt;
        &amp;lt;p&amp;gt;Environment: &amp;lt;strong&amp;gt;${ENVIRONMENT}&amp;lt;/strong&amp;gt;&amp;lt;/p&amp;gt;
        &amp;lt;p&amp;gt;Server time: &amp;lt;strong&amp;gt;${timestamp}&amp;lt;/strong&amp;gt;&amp;lt;/p&amp;gt;
        &amp;lt;p&amp;gt;Requests served: &amp;lt;strong&amp;gt;${requestCount}&amp;lt;/strong&amp;gt; &amp;lt;/p&amp;gt;
      &amp;lt;/body&amp;gt;
      &amp;lt;/html&amp;gt;`
            );
            break;

          case "/health":
            sendJSON(res, 200, {
              status: "healthy",
              timestamp,
              uptime: process.uptime(),
              environment: ENVIRONMENT,
              version: "1.0.0",
              node_version: process.version,
              requests_served: requestCount,
            });
            break;

          case "/info":
            sendJSON(res, 200, {
              platform: process.platform,
              architecture: process.arch,
              node_version: process.version,
              memory_usage: process.memoryUsage(),
              environment: ENVIRONMENT,
              pid: process.pid,
              uptime: process.uptime(),
            });
            break;

          case "/metrics":
            sendMetrics(res);
            break;

          default:
            sendJSON(res, 404, {
              error: "Not Found",
              message: `Route ${pathname} not found`,
              timestamp,
            });
        }
      });

      // graceful shutdown
      function shutdown(signal) {
        console.log(`\nReceived ${signal}, shutting down gracefully...`);
        server.close(() =&amp;gt; {
          console.log("Server closed");
          process.exit(0);
        });
      }
      process.on("SIGTERM", () =&amp;gt; shutdown("SIGTERM"));
      process.on("SIGINT", () =&amp;gt; shutdown("SIGINT"));

      // start server
      server.listen(PORT, () =&amp;gt; {
        console.log(`🚀 Server running at http://localhost:${PORT}/`);
        console.log(`Environment: ${ENVIRONMENT}`);
        console.log(`Node.js version: ${process.version}`);
      });

      // export for testing
      module.exports = server;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Run &lt;strong&gt;touch app.js&lt;/strong&gt; to create the file, then paste in the code above:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd08whnqwketisfeklhn6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd08whnqwketisfeklhn6.png" alt="jdjdjdj" width="800" height="358"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy09zacedz2pv4zpu2vfz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy09zacedz2pv4zpu2vfz.png" alt="idjdjdj" width="800" height="491"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Install Dependencies&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      # Install testing and development tools
      npm install --save-dev jest eslint supertest

      # Install all dependencies (creates node_modules folder)
      npm install
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;npm install --save-dev jest eslint supertest&lt;/strong&gt; installs development-only tools for testing (Jest), code quality enforcement (ESLint), and HTTP/API testing (Supertest), and records them under devDependencies.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;npm install&lt;/strong&gt; installs all dependencies defined in package.json (both dependencies and devDependencies), creates the node_modules directory, and ensures version consistency using package-lock.json.&lt;/p&gt;

&lt;p&gt;After running these commands, you'll see:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A node_modules/ folder with all installed packages&lt;/li&gt;
&lt;li&gt;A package-lock.json file that locks dependency versions&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2o683opyob0xir9w6s4k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2o683opyob0xir9w6s4k.png" alt="rgefsrge" width="800" height="431"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3&lt;/strong&gt;: &lt;strong&gt;Create Proper Tests&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This step sets up automated testing so you can verify your application works correctly every time you make changes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create tests directory and test file&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      # Create a folder for your tests
      mkdir tests

      # Create the main test file
      touch tests/app.test.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftogtozgu5mgetb8w8pyw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftogtozgu5mgetb8w8pyw.png" alt="defeqfw" width="800" height="387"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create test file&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Copy this code into the previously created &lt;strong&gt;tests/app.test.js&lt;/strong&gt;:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      const request = require('supertest');
      const server = require('../app');

      describe('App Endpoints', () =&amp;gt; {
        afterAll(() =&amp;gt; {
          server.close();
        });

        test('GET / should return welcome page', async () =&amp;gt; {
          const response = await request(server).get('/');
          expect(response.status).toBe(200);
          expect(response.text).toContain('DevOps Lab 2025');
        });

        test('GET /health should return health status', async () =&amp;gt; {
          const response = await request(server).get('/health');
          expect(response.status).toBe(200);
          expect(response.body.status).toBe('healthy');
          expect(response.body.timestamp).toBeDefined();
          expect(typeof response.body.uptime).toBe('number');
        });

        test('GET /info should return system info', async () =&amp;gt; {
          const response = await request(server).get('/info');
          expect(response.status).toBe(200);
          expect(response.body.platform).toBeDefined();
          expect(response.body.node_version).toBeDefined();
        });

        test('GET /metrics should return prometheus metrics', async () =&amp;gt; {
          const response = await request(server).get('/metrics');
          expect(response.status).toBe(200);
          expect(response.text).toContain('http_requests_total');
          expect(response.text).toContain('app_uptime_seconds');
        });

        test('GET /nonexistent should return 404', async () =&amp;gt; {
          const response = await request(server).get('/nonexistent');
          expect(response.status).toBe(404);
          expect(response.body.error).toBe('Not Found');
        });
      });
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6z9qe6ib8ke9o2ie5aqe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6z9qe6ib8ke9o2ie5aqe.png" alt="sgrwesrgew" width="800" height="484"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create Jest configuration&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Create a &lt;strong&gt;jest.config.js&lt;/strong&gt; file with the following content and save it:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      module.exports = {
        testEnvironment: 'node',
        collectCoverage: true,
        coverageDirectory: 'coverage',
        testMatch: ['**/tests/**/*.test.js'],
        verbose: true
      };
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmubpcvt57h0y1waumy2o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmubpcvt57h0y1waumy2o.png" alt="ugyftftty" width="800" height="360"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu7866tmjakmysgiudy1p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu7866tmjakmysgiudy1p.png" alt="sagrgewr" width="800" height="365"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4&lt;/strong&gt;: &lt;strong&gt;GitHub Actions CI/CD Pipeline&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This creates an automated pipeline that runs tests and builds Docker images every time you push code to GitHub.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create workflow directory&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;.github/workflows&lt;/strong&gt; directory is where GitHub Actions looks for workflow definitions; any YAML file placed here is picked up and run by GitHub's CI/CD runners.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      # Create the GitHub Actions directory structure
      mkdir -p .github/workflows
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frlf29akivccadwazdp1r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frlf29akivccadwazdp1r.png" alt="gyftyft" width="800" height="363"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create CI/CD pipeline file&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Create &lt;strong&gt;.github/workflows/ci.yml&lt;/strong&gt;: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2s1bjd2yxvdkn6sastwj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2s1bjd2yxvdkn6sastwj.png" alt="ugygty" width="800" height="376"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now add the following pipeline configuration and save the file.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      name: CI/CD Pipeline

      on:
        push:
          branches: [ main, develop ]
          tags: [ 'v*' ]
        pull_request:
          branches: [ main ]

      env:
        REGISTRY: ghcr.io
        IMAGE_NAME: ${{ github.repository }}

      concurrency:
        group: ${{ github.workflow }}-${{ github.ref }}
        cancel-in-progress: true

      jobs:
        test:
          name: Test &amp;amp; Lint
          runs-on: ubuntu-latest

          strategy:
            matrix:
              node-version: [20, 22]

          steps:
            - name: Checkout code
              uses: actions/checkout@v4

            - name: Setup Node.js ${{ matrix.node-version }}
              uses: actions/setup-node@v4
              with:
                node-version: ${{ matrix.node-version }}
                cache: 'npm'

            - name: Install dependencies
              run: npm ci

            - name: Run linting
              run: npm run lint

            - name: Run tests
              run: npm test

            - name: Security audit
              run: npm audit --audit-level=critical || echo "Audit completed with warnings"

        build:
          name: Build &amp;amp; Push Image
          runs-on: ubuntu-latest
          needs: test
          if: github.event_name == 'push'

          permissions:
            contents: read
            packages: write
            security-events: write

          outputs:
            image-tag: ${{ steps.meta.outputs.tags }}
            image-digest: ${{ steps.build.outputs.digest }}

          steps:
            - name: Checkout code
              uses: actions/checkout@v4

            - name: Set up Docker Buildx
              uses: docker/setup-buildx-action@v3
              with:
                platforms: linux/amd64,linux/arm64

            - name: Log in to Container Registry
              uses: docker/login-action@v3
              with:
                registry: ${{ env.REGISTRY }}
                username: ${{ github.actor }}
                password: ${{ secrets.GITHUB_TOKEN }}

            - name: Extract metadata
              id: meta
              uses: docker/metadata-action@v5
              with:
                images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
                tags: |
                  type=ref,event=branch
                  type=ref,event=pr
                  type=semver,pattern={{version}}
                  type=semver,pattern={{major}}.{{minor}}
                  type=sha,prefix={{branch}}-
                  type=raw,value=${{ github.run_id }}
                  type=raw,value=latest,enable={{is_default_branch}}
                labels: |
                  org.opencontainers.image.title=DevOps Lab 2025
                  org.opencontainers.image.description=Modern Node.js DevOps application

            - name: Build and push Docker image
              id: build
              uses: docker/build-push-action@v5
              with:
                context: .
                platforms: linux/amd64,linux/arm64
                push: true
                tags: ${{ steps.meta.outputs.tags }}
                labels: ${{ steps.meta.outputs.labels }}
                cache-from: type=gha
                cache-to: type=gha,mode=max
                target: production

            - name: Run Trivy vulnerability scanner
              uses: aquasecurity/trivy-action@0.24.0
              with:
                image-ref: ${{ steps.meta.outputs.tags }}
                format: 'sarif'
                output: 'trivy-results.sarif'
                severity: 'CRITICAL,HIGH'
              continue-on-error: true

            - name: Upload Trivy scan results
              uses: github/codeql-action/upload-sarif@v3
              if: always() &amp;amp;&amp;amp; hashFiles('trivy-results.sarif') != ''
              with:
                sarif_file: 'trivy-results.sarif'

        deploy-staging:
          name: Deploy to Staging
          runs-on: ubuntu-latest
          needs: build
          if: github.ref == 'refs/heads/develop'
          environment: staging

          steps:
            - name: Deploy to Staging
              run: |
                echo "🚀 Deploying to staging environment..."
                echo "Image: ${{ needs.build.outputs.image-tag }}"
                echo "Digest: ${{ needs.build.outputs.image-digest }}"
                # Add your staging deployment commands here (kubectl, helm, etc.)

        deploy-production:
          name: Deploy to Production
          runs-on: ubuntu-latest
          needs: build
          if: github.ref == 'refs/heads/main'
          environment: production

          steps:
            - name: Deploy to Production
              run: |
                echo "🎯 Deploying to production environment..."
                echo "Image: ${{ needs.build.outputs.image-tag }}"
                echo "Digest: ${{ needs.build.outputs.image-digest }}"
                # Add your production deployment commands here
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F19dimrsamyp2iet5tlrk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F19dimrsamyp2iet5tlrk.png" alt="ytfrtryt" width="800" height="471"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 5&lt;/strong&gt;: &lt;strong&gt;Dockerfile&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This step defines the build instructions Docker uses to create a portable container image of the application, ensuring consistent execution across environments. &lt;/p&gt;

&lt;p&gt;The Dockerfile performs the following functions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Uses multi-stage builds for smaller image size&lt;/li&gt;
&lt;li&gt;Installs curl for health checks&lt;/li&gt;
&lt;li&gt;Creates a non-root user for security&lt;/li&gt;
&lt;li&gt;Sets up proper file permissions&lt;/li&gt;
&lt;li&gt;Configures health checks&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Create the &lt;strong&gt;Dockerfile&lt;/strong&gt;, add the following instructions, and &lt;strong&gt;save&lt;/strong&gt;:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      # Multi-stage build for optimized image
      FROM node:20-alpine AS dependencies

      # Update packages for security
      RUN apk update &amp;amp;&amp;amp; apk upgrade --no-cache

      WORKDIR /app

      # Copy package files first for better caching
      COPY package*.json ./

      # Install only production dependencies (--omit=dev supersedes the deprecated --only=production)
      RUN npm ci --omit=dev &amp;amp;&amp;amp; npm cache clean --force

      # Production stage  
      FROM node:20-alpine AS production

      # Update packages and install necessary tools
      RUN apk update &amp;amp;&amp;amp; apk upgrade --no-cache &amp;amp;&amp;amp; \
          apk add --no-cache curl dumb-init &amp;amp;&amp;amp; \
          rm -rf /var/cache/apk/*

      # Create non-root user with proper permissions
      RUN addgroup -g 1001 -S nodejs &amp;amp;&amp;amp; \
          adduser -S nodeuser -u 1001 -G nodejs

      WORKDIR /app

      # Copy dependencies from previous stage with proper ownership
      COPY --from=dependencies --chown=nodeuser:nodejs /app/node_modules ./node_modules

      # Copy application code with proper ownership
      COPY --chown=nodeuser:nodejs package*.json ./
      COPY --chown=nodeuser:nodejs app.js ./

      # Switch to non-root user
      USER nodeuser

      # Expose port
      EXPOSE 3000

      # Health check with proper timing for Node.js startup
      HEALTHCHECK --interval=30s --timeout=10s --start-period=15s --retries=3 \
        CMD curl -f http://localhost:3000/health || exit 1

      # Use dumb-init for proper signal handling in containers
      ENTRYPOINT ["dumb-init", "--"]

      # Start application
      CMD ["npm", "start"]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn3ai8gpju4indainshsw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn3ai8gpju4indainshsw.png" alt="Ihjgffd" width="800" height="362"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5sgvhbqpxz0oz91bqgf3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5sgvhbqpxz0oz91bqgf3.png" alt="kyfttfrfyh" width="800" height="477"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 6&lt;/strong&gt;: &lt;strong&gt;Essential Configuration Files&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This creates configuration files that tell various tools what to ignore, how to behave, and what settings to use.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create&lt;/strong&gt; &lt;strong&gt;.dockerignore&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The .dockerignore file keeps dependencies, local secrets, and tooling out of the Docker build context, which speeds up builds and shrinks images. Add the following entries to &lt;strong&gt;.dockerignore&lt;/strong&gt; and &lt;strong&gt;save&lt;/strong&gt;.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      # =========================
      # Dependencies &amp;amp; Package Managers
      # =========================
      node_modules
      npm-debug.log*
      coverage
      .nyc_output

      # =========================
      # Environment Variables
      # =========================
      .env
      .env.local
      .env.*.local

      # =========================
      # Version Control
      # =========================
      .git
      .github

      # =========================
      # Logs
      # =========================
      logs
      *.log

      # =========================
      # Editor &amp;amp; IDE Files
      # =========================
      .vscode
      .idea
      *.swp
      *.swo

      # =========================
      # OS Files
      # =========================
      .DS_Store
      Thumbs.db

      # =========================
      # Testing &amp;amp; Tooling
      # =========================
      tests/
      jest.config.js
      .eslintrc*

      # =========================
      # Documentation
      # =========================
      README.md
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3eczbbwk4xg7xhu3hvez.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3eczbbwk4xg7xhu3hvez.png" alt="Ikfdgdgd" width="800" height="386"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fflta88x9k6uc0vbc85zh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fflta88x9k6uc0vbc85zh.png" alt="rgereyeryw" width="800" height="474"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create&lt;/strong&gt; &lt;strong&gt;.gitignore&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Add the following entries to the already created &lt;strong&gt;.gitignore&lt;/strong&gt; and &lt;strong&gt;save&lt;/strong&gt;.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;         # ===============================
         # Dependencies
         # ===============================
         node_modules/
         npm-debug.log*
         yarn-debug.log*
         yarn-error.log*

         # ===============================
         # Runtime Data
         # ===============================
         pids
         *.pid
         *.seed
         *.pid.lock

         # ===============================
         # Coverage &amp;amp; Test Reports
         # ===============================
         coverage/
         .nyc_output/

         # ===============================
         # Environment Variables
         # ===============================
         .env
         .env.local
         .env.*.local

         # ===============================
         # Logs
         # ===============================
         logs/
         *.log

         # ===============================
         # IDE / Editor
         # ===============================
         .vscode/
         .idea/
         *.swp
         *.swo

         # ===============================
         # OS Files
         # ===============================
         .DS_Store
         Thumbs.db
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz0008ea0808ln5og7usr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz0008ea0808ln5og7usr.png" alt="hggchg" width="800" height="385"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frueolxjcdmaqwxjbho9b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frueolxjcdmaqwxjbho9b.png" alt="ffdfghf" width="800" height="499"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create&lt;/strong&gt; &lt;strong&gt;environment&lt;/strong&gt; &lt;strong&gt;template&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The .env.example file documents the environment variables the application expects without committing real values. Add the following to the already created &lt;strong&gt;.env.example&lt;/strong&gt; and &lt;strong&gt;save&lt;/strong&gt;.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;       # ===============================
       # Server Configuration
       # ===============================
       PORT=3000
       NODE_ENV=production

       # ===============================
       # Logging Configuration
       # ===============================
       LOG_LEVEL=info
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
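&lt;p&gt;In the application these variables are typically read from &lt;code&gt;process.env&lt;/code&gt; with the template values as fallbacks. A minimal sketch of such a helper (the &lt;code&gt;config.js&lt;/code&gt; module is illustrative, not one of the project files above):&lt;/p&gt;

```javascript
// config.js — illustrative helper for the variables documented in .env.example.
// Values from the environment win; the fallbacks mirror the template above.
const config = {
  port: parseInt(process.env.PORT, 10) || 3000,
  nodeEnv: process.env.NODE_ENV || 'development',
  logLevel: process.env.LOG_LEVEL || 'info',
};

module.exports = config;
```

&lt;p&gt;With no variables set, &lt;code&gt;config.port&lt;/code&gt; falls back to 3000, matching the template default.&lt;/p&gt;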

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvy0wgwu0mp6zl238maro.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvy0wgwu0mp6zl238maro.png" alt="dfsdgre" width="800" height="376"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhoocl8ji6b80o6qfq473.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhoocl8ji6b80o6qfq473.png" alt="rhrhtregr" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create&lt;/strong&gt; &lt;strong&gt;ESLint&lt;/strong&gt; &lt;strong&gt;configuration&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Add the following to the already created &lt;strong&gt;.eslintrc.js&lt;/strong&gt; and &lt;strong&gt;save&lt;/strong&gt;.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      module.exports = {
        env: {
          node: true,
          es2021: true,
          jest: true
        },
        extends: ['eslint:recommended'],
        parserOptions: {
          ecmaVersion: 12,
          sourceType: 'module'
        },
        rules: {
          'no-console': 'off',
          'no-unused-vars': ['error', { 'argsIgnorePattern': '^_' }]
        }
      };
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
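&lt;p&gt;The &lt;code&gt;argsIgnorePattern: '^_'&lt;/code&gt; setting deserves a quick illustration: parameters prefixed with an underscore are exempt from &lt;code&gt;no-unused-vars&lt;/code&gt;, which suits Express-style handlers that must keep a fixed signature. The handler below is a hypothetical example, not code from the project:&lt;/p&gt;

```javascript
// The four-argument signature marks this as an Express-style error handler.
// _req and _next are unused, but the leading underscore exempts them from
// no-unused-vars via argsIgnorePattern: '^_'.
function errorHandler(err, _req, res, _next) {
  res.status(500).json({ error: err.message });
}

// Quick manual check with a stubbed response object:
const calls = [];
const res = {
  status(code) { calls.push(code); return this; },
  json(body) { calls.push(body); return this; },
};
errorHandler(new Error('boom'), {}, res, function () {});
console.log(calls.length); // 2: the status code and the JSON body
```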

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fabnh2wjj0bqnirii01bd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fabnh2wjj0bqnirii01bd.png" alt="Ihgfytin" width="800" height="388"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3qr1qpjvyrfag1tn9cjn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3qr1qpjvyrfag1tn9cjn.png" alt="hgyugyuhuiu" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 7&lt;/strong&gt;: &lt;strong&gt;Docker Compose for Development&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This creates a Docker Compose file that makes it easy to run your application and any supporting services with a single command.&lt;/p&gt;

&lt;p&gt;Create docker-compose.yml:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      version: '3.8'

      services:
        app:
          build: .
          ports:
            - "3000:3000"
          environment:
            - NODE_ENV=development
            - PORT=3000
          restart: unless-stopped
          healthcheck:
            test: ["CMD", "curl", "-f", "http://localhost:3000/health"]
            interval: 30s
            timeout: 10s
            retries: 3
            start_period: 10s
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsw2vused0ddmkmwpkbjx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsw2vused0ddmkmwpkbjx.png" alt="I76tr65ry6n" width="800" height="378"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9t05b4bhso0rbk995nfw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9t05b4bhso0rbk995nfw.png" alt="yuttyun" width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 8&lt;/strong&gt;: &lt;strong&gt;Test Everything Locally&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This step runs and tests the application locally before it is deployed.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Install and Test Locally&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      # Install all dependencies from package.json
      npm install
      # Run your test suite to make sure everything works
      npm test
      # Start the application server
      npm start
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdrpu08k7doqegzymzzug.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdrpu08k7doqegzymzzug.png" alt="ytfyfytfyt" width="800" height="357"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9tf8khj66fefkhp7o0kq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9tf8khj66fefkhp7o0kq.png" alt="ygyuyy" width="800" height="503"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;em&gt;Tests should pass with green checkmarks: ✓ GET / should return welcome page&lt;/em&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F238yw0r6f2ifss2831fz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F238yw0r6f2ifss2831fz.png" alt="hfdtrtyhg" width="800" height="366"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;em&gt;Server starts and shows: 🚀 Server running at &lt;a href="http://localhost:3000/" rel="noopener noreferrer"&gt;http://localhost:3000/&lt;/a&gt;&lt;/em&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Test endpoints (in a new terminal window)&lt;/strong&gt;:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      curl http://localhost:3000/         # Homepage
      curl http://localhost:3000/health   # Health check JSON
      curl http://localhost:3000/info     # System info JSON  
      curl http://localhost:3000/metrics  # Prometheus metrics
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwkzwmw0ckfyxwb7erxrg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwkzwmw0ckfyxwb7erxrg.png" alt="jdfcjsc" width="800" height="404"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqljsj1pooebwgll0t6gy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqljsj1pooebwgll0t6gy.png" alt="idsfsfs" width="800" height="396"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fts5cnfitdirop20ksrs1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fts5cnfitdirop20ksrs1.png" alt="fgdgeage" width="800" height="401"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn1ulk7b8ahhqcwq5irxb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn1ulk7b8ahhqcwq5irxb.png" alt="drwbhywsr" width="800" height="396"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Docker Commands&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      # Build image
      docker build -t my-devops-app:latest .

      # Run container
      docker run -d \
        --name my-devops-container \
        -p 3000:3000 \
        --restart unless-stopped \
        my-devops-app:latest

      # Check container status
      docker ps
      docker logs my-devops-container

      # Test health check
      curl http://localhost:3000/health

      # Stop container
      docker stop my-devops-container
      docker rm my-devops-container
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Build image&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnj9b09fcxxkabgz532wj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnj9b09fcxxkabgz532wj.png" alt="htfytytdt" width="800" height="426"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Run container&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq1uu950whvbjoypykagb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq1uu950whvbjoypykagb.png" alt="Iygytfyy" width="800" height="372"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Check container status&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4cag8jzdco8xg1v8p1g5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4cag8jzdco8xg1v8p1g5.png" alt="Iyugyu" width="800" height="375"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Test health check&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fasnzzzpcmotveozer9hx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fasnzzzpcmotveozer9hx.png" alt="dsfcsd" width="800" height="404"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Stop container&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3o7kqw0p1hnkd7tiskv7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3o7kqw0p1hnkd7tiskv7.png" alt="Iytytytn" width="800" height="380"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Docker Compose Commands&lt;/strong&gt;&lt;br&gt;
Docker Compose is used for local development and testing, not for production deployments.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      # Start all services defined in docker-compose.yml
      docker-compose up -d

      # View real-time logs from all services
      docker-compose logs -f

      # Stop all services and clean up
      docker-compose down
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Start all services defined in docker-compose.yml&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flkj8dkk0j9icxyu9c86i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flkj8dkk0j9icxyu9c86i.png" alt="frdtrgfy" width="800" height="503"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;View real-time logs from all services&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjo7nq44ta0vuk5jabptc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjo7nq44ta0vuk5jabptc.png" alt="jgytfyt" width="800" height="364"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Stop all services and clean up&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fudvlfbc13nl741k5kkwj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fudvlfbc13nl741k5kkwj.png" alt="yugytfyt" width="800" height="359"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 9&lt;/strong&gt;: &lt;strong&gt;Deploy to GitHub&lt;/strong&gt;&lt;br&gt;
This commits your code to Git and pushes it to GitHub so the automated CI/CD pipeline can start working.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Initial commit&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      # Add all files to Git staging area
      git add .

      # Create your first commit with a descriptive message
      git commit -m "Initial commit: Complete DevOps setup with working CI/CD"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkr6v09q9hs2hjaejq7oo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkr6v09q9hs2hjaejq7oo.png" alt="Iygtftfyy" width="800" height="386"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Connect to GitHub&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before running these commands:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to &lt;strong&gt;GitHub.com&lt;/strong&gt; and create a new repository called &lt;strong&gt;my-devops-project&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;DO NOT initialize it with &lt;strong&gt;README&lt;/strong&gt;, &lt;strong&gt;.gitignore&lt;/strong&gt;, or &lt;strong&gt;license&lt;/strong&gt; (we already have these)&lt;/li&gt;
&lt;li&gt;Copy the repository &lt;strong&gt;URL&lt;/strong&gt; from GitHub&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Replace &lt;strong&gt;YOUR_GITHUB_USERNAME&lt;/strong&gt; below with your actual GitHub username&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;   # Set main as the default branch
   git branch -M main

   # Connect to your GitHub repository (UPDATE THIS URL!)
   git remote add origin https://github.com/YOUR_GITHUB_USERNAME/my-devops-project.git

   # Push your code to GitHub for the first time
   git push -u origin main
&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fear17dvgmu4tg7lsr08z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fear17dvgmu4tg7lsr08z.png" alt="Ikjhyftrdy" width="800" height="368"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Your code appears on GitHub, and the CI/CD pipeline starts running automatically&lt;/em&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9f3w9ryvb0ktnvlf26sc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9f3w9ryvb0ktnvlf26sc.png" alt="hygtyuj" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 10&lt;/strong&gt;: &lt;strong&gt;Kubernetes Deployment Configurations&lt;/strong&gt;&lt;br&gt;
This step creates Kubernetes configuration files that define how your application should run in staging and production environments.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create directories&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;       # Create directories for Kubernetes configurations
       mkdir -p k8s/staging k8s/production
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe0ayjn4iu04ap1e1d1ju.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe0ayjn4iu04ap1e1d1ju.png" alt="gftfyhgy" width="800" height="355"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create Staging Deployment&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Create&lt;/strong&gt; k8s/staging/deployment.yml&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      apiVersion: apps/v1
      kind: Deployment
      metadata:
        name: devops-app-staging
        namespace: staging
      spec:
        replicas: 2
        selector:
          matchLabels:
            app: devops-app
            environment: staging
        template:
          metadata:
            labels:
              app: devops-app
              environment: staging
          spec:
            containers:
            - name: app
              image: ghcr.io/YOUR_GITHUB_USERNAME/my-devops-project:develop-latest
              ports:
              - containerPort: 3000
              env:
              - name: NODE_ENV
                value: "staging"
              - name: PORT
                value: "3000"
              livenessProbe:
                httpGet:
                  path: /health
                  port: 3000
                initialDelaySeconds: 30
                periodSeconds: 10
              readinessProbe:
                httpGet:
                  path: /health
                  port: 3000
                initialDelaySeconds: 5
                periodSeconds: 5
      ---
      apiVersion: v1
      kind: Service
      metadata:
        name: devops-app-service-staging
        namespace: staging
      spec:
        selector:
          app: devops-app
          environment: staging
        ports:
        - protocol: TCP
          port: 80
          targetPort: 3000
        type: LoadBalancer
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
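Once a cluster is available, the staging manifest above can be applied and verified with standard kubectl commands. This is a sketch, assuming kubectl is already configured against your cluster and that the staging namespace does not exist yet:

```shell
# Create the namespace once, then apply the staging manifest.
kubectl create namespace staging
kubectl apply -f k8s/staging/deployment.yml

# Verify that both replicas become Ready and the Service gets an address.
kubectl get pods -n staging -l app=devops-app,environment=staging
kubectl get service devops-app-service-staging -n staging
```

If the pods stay in CrashLoopBackOff, the liveness probe path or the image tag is the usual suspect; kubectl describe pod on a failing pod shows the probe failures directly.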

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsgzp2ncpl8rivcihwj6z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsgzp2ncpl8rivcihwj6z.png" alt="isisfifisi" width="800" height="445"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create Production Deployment&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Create&lt;/strong&gt; k8s/production/deployment.yml&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      apiVersion: apps/v1
      kind: Deployment
      metadata:
        name: devops-app-production
        namespace: production
      spec:
        replicas: 3
        selector:
          matchLabels:
            app: devops-app
            environment: production
        template:
          metadata:
            labels:
              app: devops-app
              environment: production
          spec:
            containers:
            - name: app
              image: ghcr.io/YOUR_GITHUB_USERNAME/my-devops-project:latest
              ports:
              - containerPort: 3000
              env:
              - name: NODE_ENV
                value: "production"
              - name: PORT
                value: "3000"
              resources:
                requests:
                  memory: "128Mi"
                  cpu: "100m"
                limits:
                  memory: "256Mi"
                  cpu: "200m"
              livenessProbe:
                httpGet:
                  path: /health
                  port: 3000
                initialDelaySeconds: 30
                periodSeconds: 10
              readinessProbe:
                httpGet:
                  path: /health
                  port: 3000
                initialDelaySeconds: 5
                periodSeconds: 5
      ---
      apiVersion: v1
      kind: Service
      metadata:
        name: devops-app-service-production
        namespace: production
      spec:
        selector:
          app: devops-app
          environment: production
        ports:
        - protocol: TCP
          port: 80
          targetPort: 3000
        type: LoadBalancer
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
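For production, the apply step is usually paired with a rollout check and a quick rollback path. A sketch under the same assumptions (a configured kubectl context; deployment names matching the manifest above):

```shell
# Apply the production manifest and wait for the rollout to complete.
kubectl create namespace production
kubectl apply -f k8s/production/deployment.yml
kubectl rollout status deployment/devops-app-production -n production

# Confirm the resource requests and limits were picked up.
kubectl describe deployment devops-app-production -n production | grep -A 2 -E "Requests|Limits"

# Roll back quickly if the new image misbehaves.
kubectl rollout undo deployment/devops-app-production -n production
```

The rollout status command blocks until all three replicas are updated and Ready, which makes it a convenient gate in a deploy job.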

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffere3m06answ5mxp421k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffere3m06answ5mxp421k.png" alt="Ikdsfwewf" width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 11&lt;/strong&gt;: &lt;strong&gt;Complete Deployment Workflow&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This step shows how to operate the complete CI/CD pipeline with a branch-based strategy for staging and production deployments.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Branch-based Deployment Strategy&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;develop branch&lt;/strong&gt; → Automatically deploys to staging environment&lt;br&gt;
&lt;strong&gt;main branch&lt;/strong&gt; → Automatically deploys to production environment&lt;br&gt;
&lt;strong&gt;Pull requests&lt;/strong&gt; → Run tests only (no deployment)&lt;/p&gt;
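In the GitHub Actions workflow itself, this branch-based strategy typically maps onto the on: triggers and per-job conditions roughly as follows. This is a sketch, not the exact workflow from the earlier steps, and the job names here are illustrative:

```yaml
on:
  push:
    branches: [main, develop]    # push to develop deploys staging; push to main deploys production
  pull_request:
    branches: [main, develop]    # pull requests only run the test job

jobs:
  deploy-staging:
    if: github.ref == 'refs/heads/develop'
    # ... staging deploy steps go here ...
  deploy-production:
    if: github.ref == 'refs/heads/main'
    # ... production deploy steps go here ...
```

Because pull_request events run on a merge ref rather than refs/heads/main or refs/heads/develop, neither deploy job fires for a pull request, which is exactly the "tests only" behaviour described above.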

&lt;p&gt;&lt;strong&gt;Deploy Changes&lt;/strong&gt;&lt;br&gt;
Deploy to staging:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      # Create and switch to develop branch
      git checkout -b develop

      # Make your changes, then commit and push
      git add .
      git status
      git commit -m "Add new feature"
      git push origin develop
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feybd22jwesep4c382ndb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feybd22jwesep4c382ndb.png" alt="dfghsrthr" width="800" height="502"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;GitHub Actions automatically runs tests and deploys to staging&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Deploy to production:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      # Switch to main branch
      git checkout main

      # Merge changes from develop
      git merge develop

      # Push to trigger production deployment
      git push origin main
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsqgcp7s2te84457jwtks.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsqgcp7s2te84457jwtks.png" alt="jdfvdijgids" width="800" height="501"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7kl2tva9w4xf0xuihelt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7kl2tva9w4xf0xuihelt.png" alt="Ioikgkrjgwes" width="800" height="405"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Monitor Deployments&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can use the following commands to monitor the GitHub Actions workflow execution and verify the health status of the deployed application.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      # Check GitHub Actions status
      # Visit: https://github.com/YOUR_GITHUB_USERNAME/my-devops-project/actions

      # Check your container registry
      # Visit: https://github.com/YOUR_GITHUB_USERNAME/my-devops-project/pkgs/container/my-devops-project

      # Test your deployed applications (once you have URLs)
      curl https://your-staging-url.com/health      # Staging health check
      curl https://your-production-url.com/health   # Production health check
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
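Manually re-running curl until a deployment settles gets tedious; a small polling helper makes the health check repeatable. This is a sketch, not part of the pipeline above; the function name and the retry and delay values are arbitrary choices:

```shell
#!/bin/sh
# Poll a health endpoint until it responds successfully, or give up.
wait_healthy() {
  url="$1"
  attempts="${2:-10}"
  i=1
  while [ "$i" -le "$attempts" ]; do
    # -f makes curl fail on HTTP errors, -sS keeps output quiet but shows errors.
    if curl -fsS "$url" > /dev/null 2> /dev/null; then
      echo "healthy after $i attempt(s)"
      return 0
    fi
    if [ "$i" -lt "$attempts" ]; then sleep 3; fi
    i=$((i + 1))
  done
  echo "still unhealthy after $attempts attempts" 1>&2
  return 1
}

# Example (placeholder URL -- substitute your real staging URL):
# wait_healthy https://your-staging-url.com/health 20
```

A non-zero exit code from the helper fails the CI step, so a broken deployment is caught instead of silently reported as green.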

&lt;p&gt;This project implements a production-grade CI/CD pipeline for a containerized Node.js application using GitHub Actions, Docker, and Kubernetes, demonstrating end-to-end DevOps engineering practices. The application exposes health, metrics, and system endpoints, with graceful shutdown handling for Kubernetes environments.&lt;/p&gt;

</description>
      <category>docker</category>
      <category>k8s</category>
      <category>cicd</category>
      <category>node</category>
    </item>
    <item>
      <title>Mastering Git for Production: Branching, Merging &amp; Squash Strategies Every Engineer Should Know</title>
      <dc:creator>lotanna obianefo</dc:creator>
      <pubDate>Thu, 19 Feb 2026 12:23:27 +0000</pubDate>
      <link>https://dev.to/lotanna_obianefo/mastering-git-for-production-branching-merging-squash-strategies-every-engineer-should-know-5flp</link>
      <guid>https://dev.to/lotanna_obianefo/mastering-git-for-production-branching-merging-squash-strategies-every-engineer-should-know-5flp</guid>
      <description>&lt;p&gt;Modern software development depends heavily on version control systems to manage code changes, support collaboration, and maintain stability. Git is the industry-standard distributed version control system that enables teams to work concurrently without overwriting each other’s progress.&lt;/p&gt;

&lt;p&gt;Core to Git’s power are its branching and history-management capabilities: branching, merging, squashing, and rebasing. Understanding these concepts is essential for maintaining a clean, traceable, and production-ready codebase.&lt;/p&gt;

&lt;p&gt;By the end of this article, engineers will understand how to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Create branches to work safely&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Merge branches using different strategies&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Fix merge conflicts&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Squash many commits into one&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Push a complete project to GitHub&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;PROJECT SETUP&lt;/strong&gt; -&lt;strong&gt;Part 1&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1&lt;/strong&gt;. &lt;strong&gt;Create Project Folder&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      mkdir git-merge-lab
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffsnuoz1je41ufqs8k748.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffsnuoz1je41ufqs8k748.png" alt="kdsfAW" width="800" height="371"&gt;&lt;/a&gt;&lt;br&gt;
This creates a new, empty directory named &lt;strong&gt;git-merge-lab&lt;/strong&gt;, establishing an isolated workspace that prevents interference with existing files and ensures a controlled environment for Git merge practice.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2&lt;/strong&gt;. &lt;strong&gt;Enter the Folder&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;     cd git-merge-lab
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fywms2o0xrqrl29ctkx6k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fywms2o0xrqrl29ctkx6k.png" alt="utytr5fr" width="800" height="412"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This navigates the terminal session into the project directory, setting the folder as the current working environment.&lt;/p&gt;

&lt;p&gt;Purpose:&lt;br&gt;
Git operations are context-sensitive and can only be executed within an initialized repository or project directory.&lt;/p&gt;

&lt;p&gt;Outcome:&lt;br&gt;
All subsequent files, commits, and version control activities will be contained within this directory, ensuring organized project structure and traceability.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3&lt;/strong&gt;. &lt;strong&gt;Initialize Git&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft34accsd62kqfds0kkvt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft34accsd62kqfds0kkvt.png" alt="ygfdredt" width="800" height="404"&gt;&lt;/a&gt;&lt;br&gt;
This initializes the directory as a Git repository by generating a hidden .git subdirectory that stores metadata, configuration settings, and the complete version history.&lt;/p&gt;

&lt;p&gt;Purpose:&lt;br&gt;
Repository initialization is required for Git to begin tracking file changes and managing version control.&lt;/p&gt;

&lt;p&gt;Outcome:&lt;br&gt;
The directory is now under Git management, enabling change tracking, staging, committing, and collaboration workflows.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4&lt;/strong&gt;. &lt;strong&gt;Create README File&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      echo "# Team Project" &amp;gt; README.md
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpho8hh7muam7jwfhdq5m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpho8hh7muam7jwfhdq5m.png" alt="fytftygyt" width="800" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This creates a README.md file and populates it with initial content.&lt;/p&gt;

&lt;p&gt;Purpose:&lt;br&gt;
Git requires at least one tracked file to generate a commit and establish version history within the repository.&lt;/p&gt;

&lt;p&gt;Outcome:&lt;br&gt;
The README.md file is present in the working directory but remains untracked until it is staged and committed to the repository.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5&lt;/strong&gt;. &lt;strong&gt;Stage the File&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git add README.md
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe8zekbww35z1m1x19wf6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe8zekbww35z1m1x19wf6.png" alt="hjgfdrrr" width="800" height="380"&gt;&lt;/a&gt;&lt;br&gt;
This stages the file for inclusion in the upcoming commit by adding it to the Git index.&lt;/p&gt;

&lt;p&gt;Purpose:&lt;br&gt;
Git only records changes that have been explicitly staged, allowing developers to control which modifications are captured in a commit.&lt;/p&gt;

&lt;p&gt;Outcome:&lt;br&gt;
README.md is now in the staging area and prepared to be committed to the repository history.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6&lt;/strong&gt;. &lt;strong&gt;Commit the File&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git commit -m "Initial commit"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5jj8apiisx9hxyehc3pi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5jj8apiisx9hxyehc3pi.png" alt="rdsthryuty" width="800" height="470"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This creates a commit that captures a snapshot of the project state within the repository history.&lt;/p&gt;

&lt;p&gt;Purpose:&lt;br&gt;
Committing establishes a restore point, enabling version rollback and facilitating traceability if future changes introduce issues.&lt;/p&gt;

&lt;p&gt;Outcome:&lt;br&gt;
The repository now contains its initial commit, marking the first recorded version of the project.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;7&lt;/strong&gt;. &lt;strong&gt;Rename Branch to main&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git branch -M main
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F85o67x060ynxwdbtxf49.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F85o67x060ynxwdbtxf49.png" alt="ftrdryugftr" width="800" height="440"&gt;&lt;/a&gt;&lt;br&gt;
This renames the active branch to main, aligning it with modern repository naming conventions.&lt;/p&gt;

&lt;p&gt;Purpose:&lt;br&gt;
Most hosting platforms and enterprise workflows designate main as the default branch, promoting consistency across development environments.&lt;/p&gt;

&lt;p&gt;Outcome:&lt;br&gt;
The primary development branch is now labeled main, serving as the central integration point for future changes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;FAST-FORWARD MERGE&lt;/strong&gt; -&lt;strong&gt;Part 2&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;8&lt;/strong&gt;. &lt;strong&gt;Create Feature Branch&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;     git checkout -b feature-login
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft03h0ekud8q39ew5khs6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft03h0ekud8q39ew5khs6.png" alt="Ivggfn" width="800" height="397"&gt;&lt;/a&gt;&lt;br&gt;
This creates a new Git branch and checks it out as the active working branch.&lt;/p&gt;

&lt;p&gt;Purpose:&lt;br&gt;
Feature development is isolated in dedicated branches to prevent instability in the main (or production) branch and to support controlled integration workflows.&lt;/p&gt;

&lt;p&gt;Outcome:&lt;br&gt;
The working directory is now tracking the feature-login branch instead of main, ensuring all subsequent commits apply only to this feature stream.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;9&lt;/strong&gt;. &lt;strong&gt;Add Feature File&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      echo "Login page created" &amp;gt; login.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxmjaen7y7tr7d8dbzr21.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxmjaen7y7tr7d8dbzr21.png" alt="jgfytuy" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This creates a new source file associated with the login functionality.&lt;/p&gt;

&lt;p&gt;Purpose:&lt;br&gt;
This step represents the introduction of new application logic and simulates the implementation of a feature within the codebase.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;10&lt;/strong&gt;. &lt;strong&gt;Commit Feature&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git add login.txt
      git commit -m "Add login feature"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh8dliytyzh3imt8w8yqu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh8dliytyzh3imt8w8yqu.png" alt="iuytden" width="800" height="412"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Purpose of both commands:&lt;br&gt;
&lt;strong&gt;git add&lt;/strong&gt; stages the modified file by placing it into the index, preparing it for version control tracking.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;git commit&lt;/strong&gt; records the staged changes into the repository history as a new snapshot.&lt;/p&gt;

&lt;p&gt;Outcome:&lt;br&gt;
The feature changes are now securely versioned and stored within the feature branch’s commit history.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;11&lt;/strong&gt;. &lt;strong&gt;Switch Back to Main&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git checkout main
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzoamgmddu1lfr57iltzd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzoamgmddu1lfr57iltzd.png" alt="Ikhyt6tyt" width="800" height="405"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This switches the working context back to the main branch.&lt;/p&gt;

&lt;p&gt;Purpose:&lt;br&gt;
In Git, merge operations are executed into the currently checked-out branch, so you must be on the target branch before initiating a merge.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;12&lt;/strong&gt;. &lt;strong&gt;Merge Feature (Fast-Forward)&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git merge feature-login
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxpzs4oa4vbzpv17sqd9x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxpzs4oa4vbzpv17sqd9x.png" alt="khftfyuy" width="800" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The main branch advances its pointer to incorporate the commits from the feature branch.&lt;/p&gt;

&lt;p&gt;Purpose:&lt;br&gt;
Since main has not diverged and contains no new commits since the branch was created, Git performs a fast-forward merge rather than generating an additional merge commit.&lt;/p&gt;
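The difference between a fast-forward and an explicit merge commit is easy to see in a throwaway repository. The sketch below replays the steps above, then contrasts them with git merge --no-ff, which is not used in this walkthrough but records a merge commit even when a fast-forward is possible:

```shell
# Set up a scratch repository so the demo cannot affect real work.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

echo "# Team Project" > README.md
git add README.md
git commit -qm "Initial commit"
git branch -M main

# Branch, commit, and merge while main has not moved: fast-forward.
git checkout -qb feature-login
echo "Login page created" > login.txt
git add login.txt
git commit -qm "Add login feature"
git checkout -q main
git merge feature-login            # reports "Fast-forward": main's pointer just moves

# The same shape of merge with --no-ff records an explicit merge commit,
# preserving the feature branch boundary in history.
git checkout -qb feature-logout
echo "Logout page created" > logout.txt
git add logout.txt
git commit -qm "Add logout feature"
git checkout -q main
git merge --no-ff -m "Merge feature-logout" feature-logout
git log --oneline --graph          # the --no-ff merge shows up with two parents
```

Teams that want every feature visible as a bubble in history enforce --no-ff; teams that prefer a linear history rely on fast-forwards (or rebasing) instead.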

&lt;p&gt;&lt;strong&gt;3-WAY MERGE (PARALLEL WORK)&lt;/strong&gt; -&lt;strong&gt;Part 3&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;13&lt;/strong&gt;. &lt;strong&gt;Create Profile Feature&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;     git checkout -b feature-profile
     echo "Profile page created" &amp;gt; profile.txt
     git add profile.txt
     git commit -m "Add profile feature"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff7nsf8n6jzshokjeddgk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff7nsf8n6jzshokjeddgk.png" alt="iuytrtt" width="800" height="372"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This establishes an isolated development context that allows a new feature to be implemented independently, without impacting the stability or history of other branches.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;:&lt;br&gt;
&lt;em&gt;&lt;strong&gt;git checkout -b feature-profile&lt;/strong&gt;&lt;/em&gt; creates a new branch named feature-profile from main and immediately switches to it. This isolates the profile work from the main codebase.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;echo "Profile page created" &amp;gt; profile.txt&lt;/strong&gt;&lt;/em&gt; creates a new file called profile.txt and writes initial content to it, representing the implementation of the profile feature.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;git add profile.txt&lt;/strong&gt;&lt;/em&gt; stages the newly created file by adding it to the Git index, marking it for inclusion in the next commit.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;git commit -m "Add profile feature"&lt;/strong&gt;&lt;/em&gt; creates a commit that records the staged changes in the repository history with a descriptive message, capturing the initial implementation of the profile feature.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;14&lt;/strong&gt;. &lt;strong&gt;Create Settings Feature&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git checkout main
      git checkout -b feature-settings
      echo "Settings page created" &amp;gt; settings.txt
      git add settings.txt
      git commit -m "Add settings feature"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2mxsi5g9kz9dbpjzkk2r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2mxsi5g9kz9dbpjzkk2r.png" alt="Ihuygyu" width="800" height="399"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At this point, the codebase has diverged, with independent changes introduced on two separate branches, resulting in distinct commit histories.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;15&lt;/strong&gt;. &lt;strong&gt;Merge Profile&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git checkout main
      git merge feature-profile
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqlome97me4xua8ap5h89.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqlome97me4xua8ap5h89.png" alt="Iugtt" width="800" height="378"&gt;&lt;/a&gt;&lt;br&gt;
This ensures the main branch is updated first with the profile feature, establishing it as the authoritative baseline before integrating additional feature work.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;16&lt;/strong&gt;. &lt;strong&gt;Merge Settings (3-Way)&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git merge feature-settings
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0jlvhqaook144a6i68et.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0jlvhqaook144a6i68et.png" alt="Iifisfd" width="800" height="355"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;:&lt;br&gt;
&lt;em&gt;Because this merge is not a fast-forward, Git opens the default editor (Vim here) for the merge commit message; accept it by pressing &lt;strong&gt;Esc&lt;/strong&gt; followed by &lt;strong&gt;:wq&lt;/strong&gt; to write and quit&lt;/em&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff5i8jl7daaapvynss1rk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff5i8jl7daaapvynss1rk.png" alt="dsfsdsdf" width="800" height="470"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Git generates a merge commit that reconciles and combines the divergent commit histories from both branches.&lt;/p&gt;

&lt;p&gt;Outcome:&lt;br&gt;
The main and feature branches have progressed independently, resulting in non-linear histories that require an explicit merge commit to integrate their changes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;MERGE CONFLICT&lt;/strong&gt; -&lt;strong&gt;PART 4&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;17&lt;/strong&gt;. &lt;strong&gt;Bugfix Changes README&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git checkout -b bugfix-title
      echo "# Team Project Version 2" &amp;gt; README.md
      git add README.md
      git commit -m "Update title in bugfix"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffrgxpp6ourr1937sonhp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffrgxpp6ourr1937sonhp.png" alt="bgrgtrgrt" width="800" height="383"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;18&lt;/strong&gt;. &lt;strong&gt;Feature Also Changes README&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git checkout main
      git checkout -b feature-title-update
      echo "# Awesome Team Project" &amp;gt; README.md
      git add README.md
      git commit -m "Update title in feature"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffziutlekwkvhv2yep0rb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffziutlekwkvhv2yep0rb.png" alt="Iigtfrruyg" width="800" height="386"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;19&lt;/strong&gt;. &lt;strong&gt;Trigger Conflict&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git checkout main
      git merge bugfix-title
      git merge feature-title-update
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F02pfmrelqfnvyju497rt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F02pfmrelqfnvyju497rt.png" alt="kjsdsdf" width="800" height="371"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Git is unable to automatically complete the merge process and halts the operation.&lt;/p&gt;

&lt;p&gt;Outcome:&lt;br&gt;
Conflicting changes exist, and Git cannot determine which version should be applied, requiring manual conflict resolution by the developer.&lt;/p&gt;
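&lt;p&gt;The failed merge typically reports something like the following (the exact wording can vary slightly between Git versions):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      Auto-merging README.md
      CONFLICT (content): Merge conflict in README.md
      Automatic merge failed; fix conflicts and then commit the result.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;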

&lt;p&gt;&lt;strong&gt;20&lt;/strong&gt;. &lt;strong&gt;Resolve Conflict&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Open &lt;strong&gt;README.md&lt;/strong&gt;, delete the conflict markers, and replace the conflicted section with the following content.&lt;/p&gt;

&lt;p&gt;Note: &lt;em&gt;Merge conflicts must be resolved manually by the developer. To do this, open the affected file in the Vim editor using &lt;strong&gt;vim README.md&lt;/strong&gt;, review the conflict markers, and explicitly select or reconcile the appropriate changes&lt;/em&gt;.&lt;/p&gt;
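&lt;p&gt;Inside &lt;strong&gt;README.md&lt;/strong&gt;, Git delimits the conflicting region with markers like these (the labels reflect the current branch, &lt;strong&gt;HEAD&lt;/strong&gt;, and the branch being merged in):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      &amp;lt;&amp;lt;&amp;lt;&amp;lt;&amp;lt;&amp;lt;&amp;lt; HEAD
      # Team Project Version 2
      =======
      # Awesome Team Project
      &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; feature-title-update
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;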

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      # Awesome Team Project Version 2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git add README.md
      git commit -m "Resolve merge conflict"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F659zabm5tjnmorujzyvb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F659zabm5tjnmorujzyvb.png" alt="jgytft" width="800" height="380"&gt;&lt;/a&gt;&lt;br&gt;
Staging the corrected file and committing notifies Git that the merge conflict has been resolved and records the finalized version of the file.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;SQUASH MERGE&lt;/strong&gt; - &lt;strong&gt;PART 5&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;21&lt;/strong&gt;. &lt;strong&gt;Create Feature With Many Commits &amp;amp; Squash It&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git checkout -b feature-dashboard
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      echo "Dashboard layout" &amp;gt; dashboard.txt
      git add dashboard.txt
      git commit -m "Add dashboard layout"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkppofrra4a92o1ej2cz7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkppofrra4a92o1ej2cz7.png" alt="hyugyty" width="800" height="372"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      echo "Add charts" &amp;gt;&amp;gt; dashboard.txt
      git add dashboard.txt
      git commit -m "Add charts"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7dow8ksth39akaquh35g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7dow8ksth39akaquh35g.png" alt="ytfytrytyt" width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      echo "Fix alignment" &amp;gt;&amp;gt; dashboard.txt
      git add dashboard.txt
      git commit -m "Fix alignment"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzh4au9z3o4k4x457sl3l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzh4au9z3o4k4x457sl3l.png" alt="ygytfygyu" width="800" height="350"&gt;&lt;/a&gt;&lt;br&gt;
Now squash:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git checkout main
      git merge --squash feature-dashboard
      git commit -m "Add dashboard feature"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0d6i1eif4kx2d20nqpik.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0d6i1eif4kx2d20nqpik.png" alt="Ihftrhgyft" width="800" height="367"&gt;&lt;/a&gt;&lt;br&gt;
Squashing is used to consolidate multiple incremental development commits into a single, cohesive commit, ensuring that the main branch history remains clean, concise, and focused on meaningful changes rather than granular development steps.&lt;/p&gt;
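&lt;p&gt;One caveat worth noting: because a squash merge creates an ordinary commit rather than a merge commit, Git does not consider &lt;strong&gt;feature-dashboard&lt;/strong&gt; merged. If you later want to delete the branch, &lt;strong&gt;git branch -d&lt;/strong&gt; will refuse, and you will need the force variant once you are sure the work has landed in main:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git branch -D feature-dashboard
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;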

&lt;p&gt;&lt;strong&gt;22&lt;/strong&gt;. &lt;strong&gt;View History&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;     git log --oneline --graph --all
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ppt1x009rmdom5ytrkp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ppt1x009rmdom5ytrkp.png" alt="Ifgdgresd" width="800" height="385"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This lets you visually inspect and validate the type of merge performed, confirming that the resulting commit history matches the intended integration strategy.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;PUSH TO GITHUB&lt;/strong&gt; - &lt;strong&gt;PART 6&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;23&lt;/strong&gt;. &lt;strong&gt;Create Repo on GitHub&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A remote repository must be created and accessible prior to performing a push operation, as Git requires a valid remote endpoint to receive and store the local commits.&lt;/p&gt;

&lt;p&gt;DO NOT check &lt;strong&gt;Add a README file&lt;/strong&gt; or &lt;strong&gt;Add .gitignore&lt;/strong&gt;; initializing the remote with files would create a commit that conflicts with your local history on the first push.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;24&lt;/strong&gt;. &lt;strong&gt;Connect Local to GitHub&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git remote add origin https://github.com/YOUR_USERNAME/git-merge-lab.git
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4tcdarc2oqlj2z3usnjf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4tcdarc2oqlj2z3usnjf.png" alt="jgtftrt" width="800" height="366"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This defines the remote endpoint, informing Git of the destination repository to which local commits and branches should be transmitted.&lt;/p&gt;
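&lt;p&gt;You can confirm that the remote was registered correctly with &lt;strong&gt;git remote -v&lt;/strong&gt;, which lists the fetch and push URLs for each configured remote:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      git remote -v
      origin  https://github.com/YOUR_USERNAME/git-merge-lab.git (fetch)
      origin  https://github.com/YOUR_USERNAME/git-merge-lab.git (push)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;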

&lt;p&gt;&lt;strong&gt;25&lt;/strong&gt;. &lt;strong&gt;Push All Branches&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;         git push -u origin --all
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvg3dn79qj1bt8oikvijc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvg3dn79qj1bt8oikvijc.png" alt="ifdvdfjsid" width="800" height="467"&gt;&lt;/a&gt;&lt;br&gt;
This pushes all local branches (such as the main, feature, and bugfix branches) to the remote GitHub repository and establishes upstream tracking relationships, enabling simplified and consistent push and pull operations in subsequent workflows.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F22vvtxna3aocl4pmlh4l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F22vvtxna3aocl4pmlh4l.png" alt="kfdfsfs" width="800" height="404"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Branching, merging, and squashing form the backbone of efficient Git workflows. Branching enables safe experimentation, merging integrates work, and squashing keeps history readable. Mastering when and how to use each technique allows engineering teams to scale collaboration while maintaining a stable and maintainable codebase.&lt;/p&gt;

&lt;p&gt;For modern DevOps environments, where automation, rapid releases, and distributed teams are the norm, these practices are not optional; they are foundational to high-quality software delivery.&lt;/p&gt;

</description>
      <category>github</category>
      <category>devops</category>
      <category>cicd</category>
      <category>azure</category>
    </item>
    <item>
      <title>Get Started with Elastic Beanstalk</title>
      <dc:creator>lotanna obianefo</dc:creator>
      <pubDate>Sat, 27 Dec 2025 15:01:28 +0000</pubDate>
      <link>https://dev.to/lotanna_obianefo/get-started-with-elastic-beanstalk-3936</link>
      <guid>https://dev.to/lotanna_obianefo/get-started-with-elastic-beanstalk-3936</guid>
      <description>&lt;p&gt;Amazon Elastic Beanstalk (EB) is a Platform as a Service (PaaS) offering from AWS that simplifies the deployment, management, and scaling of web applications. It allows developers to focus on writing code while AWS handles infrastructure provisioning, load balancing, auto scaling, monitoring, and application health management.&lt;/p&gt;

&lt;p&gt;Elastic Beanstalk is ideal for teams that want speed, scalability, and operational simplicity without sacrificing control over the underlying AWS resources.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;To create an application&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Search for &lt;strong&gt;Elastic Beanstalk&lt;/strong&gt; in the console search bar and click &lt;strong&gt;Create application&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkq2txtbvekfjpyr0f2yp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkq2txtbvekfjpyr0f2yp.png" alt="hjhduhcs" width="800" height="302"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The console provides a six-step process for creating an application and configuring an environment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1&lt;/strong&gt;: &lt;strong&gt;Configure Environment&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the &lt;strong&gt;Environment Tier&lt;/strong&gt; section, select &lt;strong&gt;Web Server Environment&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Provide an &lt;strong&gt;Application Name&lt;/strong&gt; (for example, Fintec_app25).&lt;br&gt;
The system will automatically generate the corresponding &lt;strong&gt;Environment Name&lt;/strong&gt; by appending _env to the application name.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Under the &lt;strong&gt;Platform configuration&lt;/strong&gt;, select &lt;strong&gt;Node.js&lt;/strong&gt;.&lt;br&gt;
The recommended Platform Branch and Version will be automatically populated based on the selection.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmpd4a066aiidtmnjedo0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmpd4a066aiidtmnjedo0.png" alt="ijuhygyy" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fox4r9at1ej9kw2m6kkgj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fox4r9at1ej9kw2m6kkgj.png" alt="jhugtft" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Leave all other settings at their default values, then proceed by clicking &lt;strong&gt;Next&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwao0cac9rcinmyg6pmo6.png" alt="dffgdge" width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Step 2&lt;/strong&gt;: &lt;strong&gt;Configure Service Access&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For the Service Role, select &lt;strong&gt;Create role&lt;/strong&gt; if no existing role is available.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;During the role creation process, leave all configuration parameters at their default values.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Proceed by clicking &lt;strong&gt;Next&lt;/strong&gt; on both the &lt;strong&gt;Select trusted entity&lt;/strong&gt; and &lt;strong&gt;Add permissions steps&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Update the &lt;strong&gt;Role name&lt;/strong&gt; to a unique and descriptive identifier.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Complete the process by clicking &lt;strong&gt;Create role&lt;/strong&gt;. &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7gentx2gpd5ooisckwjv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7gentx2gpd5ooisckwjv.png" alt="zfdsssfs" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzrvvhklpnhq642cst0dy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzrvvhklpnhq642cst0dy.png" alt="fsfwefw" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdfn11x1ntok3sfdt8udu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdfn11x1ntok3sfdt8udu.png" alt="Isscs" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw83ftgguq9kgclic7lz7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw83ftgguq9kgclic7lz7.png" alt="dfgdge" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0cclsosv48igvindhi91.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0cclsosv48igvindhi91.png" alt="sfsfw" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Just as with the service role, for the &lt;strong&gt;EC2 instance profile&lt;/strong&gt;, select &lt;strong&gt;Create role&lt;/strong&gt; if no existing role is available.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;During the role creation process, leave all configuration parameters at their default values.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Proceed by clicking &lt;strong&gt;Next&lt;/strong&gt; on both the &lt;strong&gt;Select trusted entity&lt;/strong&gt; and &lt;strong&gt;Add permissions steps&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Update the &lt;strong&gt;Role name&lt;/strong&gt; to a unique and descriptive identifier.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Complete the process by clicking &lt;strong&gt;Create role&lt;/strong&gt;. &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdpvms37six52sgcqp2qb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdpvms37six52sgcqp2qb.png" alt="jhgttttrd" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1f2cv9gohmt7zpu5a9qp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1f2cv9gohmt7zpu5a9qp.png" alt="ytrruyyy" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For the EC2 key pair, open the &lt;strong&gt;Key Pairs&lt;/strong&gt; section of the EC2 service using the AWS Management Console search bar.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Create a new key pair by providing a unique and descriptive name.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Leave all other configuration options at their default settings.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Complete the process by selecting &lt;strong&gt;Create key pair&lt;/strong&gt;. &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2tzek9uuawxe15qipw0q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2tzek9uuawxe15qipw0q.png" alt="gygftfyu" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmamiwww8f8qw838yheaf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmamiwww8f8qw838yheaf.png" alt="hgtftrty" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Afterwards, click &lt;strong&gt;Next&lt;/strong&gt; on the Configure service access page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzesa99q6mfd0uat5e2rg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzesa99q6mfd0uat5e2rg.png" alt="**Image description**" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3&lt;/strong&gt;: &lt;strong&gt;Set up networking, database, and tags&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Select the default VPC from the available options in the dropdown menu.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enable &lt;strong&gt;Public IP address&lt;/strong&gt; assignment for the instances.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;From the Instance Subnet options, select any available subnet.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Leave all remaining parameters at their default values, then click &lt;strong&gt;Next&lt;/strong&gt; to proceed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmp8awveg9d8ccxvs01r0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmp8awveg9d8ccxvs01r0.png" alt="IXSLS" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0mn9stmrawbeqilhi9rc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0mn9stmrawbeqilhi9rc.png" alt="dgdge" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4&lt;/strong&gt;: &lt;strong&gt;Configure instance traffic and scaling&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select an existing &lt;strong&gt;EC2 Security Group&lt;/strong&gt; from the dropdown list.&lt;/li&gt;
&lt;li&gt;Leave all other configuration parameters at their default values, then click &lt;strong&gt;Next&lt;/strong&gt; to continue.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9wj1luuhr69hxfjerscm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9wj1luuhr69hxfjerscm.png" alt="ygtdrdrt" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcghnxykjrysh7b6b3z2j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcghnxykjrysh7b6b3z2j.png" alt="uhytt" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 5&lt;/strong&gt;: &lt;strong&gt;Configure updates, monitoring, and logging&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select the appropriate &lt;strong&gt;CloudWatch metrics instance&lt;/strong&gt; and the corresponding &lt;strong&gt;CloudWatch metrics environment&lt;/strong&gt; options in their respective fields.&lt;/li&gt;
&lt;li&gt;Leave all other configuration parameters at their default values, then click &lt;strong&gt;Next&lt;/strong&gt; to continue.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwmla12xb54k9yoo9cg8p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwmla12xb54k9yoo9cg8p.png" alt="hgtftgyt" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4t5njonugshebdnoczec.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4t5njonugshebdnoczec.png" alt="uhyttrrt" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now review all the configuration steps and click &lt;strong&gt;Create&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Once the environment is successfully launched and the health status shows green with &lt;strong&gt;OK&lt;/strong&gt;, click the URL listed for &lt;strong&gt;Domain&lt;/strong&gt; to browse your application.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F06l1mmc3hqd39mtt1rng.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F06l1mmc3hqd39mtt1rng.png" alt="DFDGES" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fczaptnfjm6mrqoawhfqp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fczaptnfjm6mrqoawhfqp.png" alt="sddrgevev" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;To access the &lt;strong&gt;Elastic Beanstalk–managed EC2 instance shell&lt;/strong&gt;, navigate to &lt;strong&gt;Amazon Elastic Compute Cloud (EC2)&lt;/strong&gt; and select the relevant &lt;strong&gt;Instance ID&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;From the &lt;strong&gt;Instance&lt;/strong&gt; Overview page, click &lt;strong&gt;Connect&lt;/strong&gt;, then select &lt;strong&gt;Connect&lt;/strong&gt; again to initiate an in-browser SSH session.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpsa7lowtgzx12y4yy811.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpsa7lowtgzx12y4yy811.png" alt="Idfefd" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8v0wpxfg5zutdzx4v720.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8v0wpxfg5zutdzx4v720.png" alt="4t4fref" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fygd69ju8xh7pepfy3wcn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fygd69ju8xh7pepfy3wcn.png" alt="regdgrg" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F315t16n8nn67iuitj1uv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F315t16n8nn67iuitj1uv.png" alt="rgewge" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Amazon Elastic Beanstalk strikes a balance between simplicity and control. It abstracts infrastructure management while still allowing developers to customize and scale their applications using familiar AWS services.&lt;/p&gt;

&lt;p&gt;For teams seeking fast deployments, built-in scaling, and reduced operational overhead, Elastic Beanstalk remains a powerful and reliable choice.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloudnative</category>
      <category>devops</category>
      <category>awschallenge</category>
    </item>
    <item>
      <title>Prepare your app deployment tools and resources</title>
      <dc:creator>lotanna obianefo</dc:creator>
      <pubDate>Tue, 02 Dec 2025 02:40:32 +0000</pubDate>
      <link>https://dev.to/lotanna_obianefo/prepare-your-app-deployment-tools-and-resources-4ia6</link>
      <guid>https://dev.to/lotanna_obianefo/prepare-your-app-deployment-tools-and-resources-4ia6</guid>
      <description>&lt;p&gt;Many companies are looking for ways to simplify and modernize their DevOps processes. In some cases, their teams already run containerized applications on Azure Kubernetes Service (AKS) but aren’t fully using its advanced features, such as custom service mesh and autoscaling. To reduce complexity and improve efficiency, Azure Container Apps offers a lighter, more scalable, and cost-effective alternative.&lt;/p&gt;

&lt;p&gt;By switching to Azure Container Apps, teams can streamline how they deploy and manage containerized applications, cut down on DevOps overhead, and benefit from built-in autoscaling and scale-to-zero features to make better use of resources.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Setup the environment&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Install Docker Desktop&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open your web browser and go to the Docker Desktop installation page &lt;a href="https://docs.docker.com/desktop/install/windows-install/" rel="noopener noreferrer"&gt;https://docs.docker.com/desktop/install/windows-install/&lt;/a&gt;. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi4472sdy4uhr1wev2218.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi4472sdy4uhr1wev2218.png" alt="f7ttfyy" width="800" height="430"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Make sure your computer meets the required system specifications, and then follow the instructions provided on the website to complete the Docker Desktop installation.&lt;/li&gt;
&lt;/ul&gt;
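&lt;p&gt;Once the installation finishes, you can confirm that Docker Desktop is running from any terminal (a quick sanity check; the hello-world step is optional and pulls a tiny test image):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Print the installed Docker version
docker --version

# Optional: pull and run a minimal test image to verify the daemon works
docker run hello-world
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;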

&lt;p&gt;&lt;strong&gt;Install the .NET Software Development Kit&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open a browser window and then navigate to the .NET download page. &lt;a href="https://dotnet.microsoft.com/download" rel="noopener noreferrer"&gt;https://dotnet.microsoft.com/download&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpvn1m83yf58a1j5qduit.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpvn1m83yf58a1j5qduit.png" alt="Ibvgft6" width="800" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Select the latest Long-Term Support (LTS) version, and then double-click the downloaded installation file to begin the installation process. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the .NET SDK Installer window, select &lt;strong&gt;Install&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
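&lt;p&gt;To confirm the SDK installed correctly, you can open a new terminal window and check the version (the exact version numbers will depend on which LTS release you downloaded):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Show the active SDK version
dotnet --version

# List every SDK installed on this machine
dotnet --list-sdks
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;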

&lt;p&gt;&lt;strong&gt;Install Visual Studio Code with Docker and Azure App Service extensions&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open a browser window and then navigate to: &lt;a href="https://code.visualstudio.com" rel="noopener noreferrer"&gt;https://code.visualstudio.com&lt;/a&gt;.
In the browser window, select &lt;strong&gt;Download&lt;/strong&gt; for your operating system.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fofu1rl03o1cb6xdtihsg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fofu1rl03o1cb6xdtihsg.png" alt="hygy88" width="800" height="284"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;The Download page for Visual Studio Code automatically detects your operating system. It displays the version to download for your operating system, such as Linux, macOS, or Windows&lt;/em&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Wait for the installer file to finish downloading, and then use a file explorer application to navigate to your computer’s downloads folder.&lt;/li&gt;
&lt;li&gt;In your file explorer application, select and run the Visual Studio Code installer file.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;You can install Visual Studio Code using either the User Installer or System Installer. The User Installer installs Visual Studio Code just for the current user, while the System Installer installs Visual Studio Code for all users. The User Installer is the recommended option for most users&lt;/em&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;I accept the license agreement&lt;/strong&gt;, and then continue following the online instructions to complete the installation.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Accept the default options during the remainder of the installation.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Ensure that you have Visual Studio Code open.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Activity bar, select &lt;strong&gt;Extensions&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the Search Extensions in Marketplace textbox, enter &lt;strong&gt;C#&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Entering "C#" filters the list of extensions to show only the extensions that have something to do with C# coding.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;In the filtered list of available extensions, select the extension labeled "&lt;strong&gt;C# Dev Kit&lt;/strong&gt; - Official C# extension from Microsoft" that's published by Microsoft.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To install the extension, select &lt;strong&gt;Install&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Wait for the installation to complete.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftzc3aqj8nuytt68zasgp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftzc3aqj8nuytt68zasgp.png" alt="gf6t6y8y8u" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the &lt;strong&gt;EXTENSIONS&lt;/strong&gt; view, replace C# with &lt;strong&gt;docker&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the filtered list of available extensions, select the extension labeled Docker that's published by Microsoft.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To install the extension, select &lt;strong&gt;Install&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Wait for the installation to complete.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4fk2brqqkxzeavyylaxy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4fk2brqqkxzeavyylaxy.png" alt="ci7t6t77" width="800" height="501"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the &lt;strong&gt;EXTENSIONS&lt;/strong&gt; view, replace docker with &lt;strong&gt;azure app service&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the filtered list of available extensions, select the extension labeled Azure App Service that's published by Microsoft.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To install the extension, select &lt;strong&gt;Install&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Wait for the installation to complete.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5rcopul6mgf0go2rz0zs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5rcopul6mgf0go2rz0zs.png" alt="ytr6fytiiy" width="800" height="501"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Close Visual Studio Code.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Install Azure CLI and the containerapp extension&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Complete the following steps to install the Azure CLI and the containerapp extension.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Open a browser window, and then navigate to: /cli/azure/install-azure-cli.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the browser window, follow the instructions for installing/updating Azure CLI for your computer's operating system.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;The current version of the Azure CLI is 2.65.0. For information about the latest release, see the release notes. To find your installed version and see if you need to update, run az version. You can run az upgrade to install the latest version&lt;/em&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Open a command line or terminal application, such as Windows Command Prompt.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Sign in to Azure using the az login command.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Follow the prompts to complete the authentication process.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Install the Azure Container Apps extension using the &lt;strong&gt;az extension add --name containerapp --upgrade&lt;/strong&gt; command.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
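&lt;p&gt;The sign-in and extension steps above can be run as a single terminal sequence:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Sign in to Azure (opens a browser prompt)
az login

# Check the installed CLI version; run 'az upgrade' if it is outdated
az version

# Add (or update) the Azure Container Apps extension
az extension add --name containerapp --upgrade
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;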

&lt;p&gt;&lt;strong&gt;Install Microsoft PowerShell&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Complete the following steps to install Microsoft PowerShell.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Open a browser window, and then navigate to: /powershell/scripting/install/installing-powershell.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the browser window, follow the instructions for installing/updating PowerShell for your computer's operating system.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3c7w2mvn34ze5ppqfsj9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3c7w2mvn34ze5ppqfsj9.png" alt="uyg6tfr" width="800" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configure a Resource Group for your Azure resources&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Complete the following steps to configure a resource group for your Azure resources.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Open a browser window, and then navigate to the Azure portal: &lt;a href="https://portal.azure.com/" rel="noopener noreferrer"&gt;https://portal.azure.com/&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Ensure that your Azure account has permission to create resources and assign RBAC permissions. Check the RBAC role(s) assigned to your account before you continue.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The Contributor role isn't able to assign Azure RBAC permissions. We recommend using an account that has been assigned the Owner, Azure account administrator, or Azure co-administrator role for your Azure subscription.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the top search bar of the Azure portal, in the Search textbox, enter &lt;strong&gt;resource group&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the search results, select Resource groups, and then select &lt;strong&gt;+ Create&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Basics tab, configure the resource group as follows:&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Subscription: Specify the Azure subscription that you're using&lt;/p&gt;

&lt;p&gt;Resource group: Enter &lt;strong&gt;RG1&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Region: Select &lt;strong&gt;Central US&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Select &lt;strong&gt;Review + create&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F286e8pomtmgilp9zwxcs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F286e8pomtmgilp9zwxcs.png" alt="J98UH" width="800" height="400"&gt;&lt;/a&gt;&lt;br&gt;
Once validation has passed, select &lt;strong&gt;Create&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fke3kgn0upne811bhi7qy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fke3kgn0upne811bhi7qy.png" alt="n8u877" width="800" height="403"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuv3nfti7kyd9y8m7d4cm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuv3nfti7kyd9y8m7d4cm.png" alt="pliu8hh" width="800" height="203"&gt;&lt;/a&gt;&lt;/p&gt;
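&lt;p&gt;If you prefer the command line over the portal, the same resource group can be created with a single Azure CLI command, using the RG1 name and Central US region from the steps above:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Create the resource group in Central US
az group create --name RG1 --location centralus
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;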

&lt;p&gt;&lt;strong&gt;Configure a Virtual Network and subnets&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Complete the following steps to configure a Virtual Network and subnets.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Ensure that you have your Azure portal open in a browser window.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the top search bar of the Azure portal, in the Search textbox, enter &lt;strong&gt;virtual network&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the search results, select &lt;strong&gt;Virtual networks&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Create virtual network&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;On the Basics tab, configure your virtual network as follows:&lt;/p&gt;

&lt;p&gt;Subscription: Specify the Azure subscription that you're using&lt;br&gt;
Resource group name: Select &lt;strong&gt;RG1&lt;/strong&gt;&lt;br&gt;
Virtual network name: Enter &lt;strong&gt;VNET1&lt;/strong&gt;&lt;br&gt;
Region: Ensure that &lt;strong&gt;Central US&lt;/strong&gt; is selected.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select the IP addresses tab.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgvl8nt198gj3eccgrnf5.png" alt="JT6ftttgg" width="800" height="442"&gt;
&lt;/li&gt;
&lt;li&gt;On the IP addresses tab, under Subnets, select &lt;strong&gt;default&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;On the Edit subnet page, configure the subnet as follows:&lt;/p&gt;

&lt;p&gt;Name: Enter &lt;strong&gt;PESubnet&lt;/strong&gt;&lt;br&gt;
Starting address: Ensure that &lt;strong&gt;10.0.0.0&lt;/strong&gt; is specified.&lt;br&gt;
Subnet size: Ensure that &lt;strong&gt;/24 (256 addresses)&lt;/strong&gt; is specified.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Save&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fttrs7q1zoxxemzntz1yp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fttrs7q1zoxxemzntz1yp.png" alt="uy7ff5f" width="800" height="399"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the IP addresses tab, select &lt;strong&gt;+ Add a subnet&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;On the Add a subnet page, configure the subnet as follows:&lt;/p&gt;

&lt;p&gt;Name: Enter &lt;strong&gt;ACASubnet&lt;/strong&gt;&lt;br&gt;
Starting address: Ensure that &lt;strong&gt;10.0.4.0&lt;/strong&gt; is specified.&lt;br&gt;
Subnet size: Ensure that &lt;strong&gt;/23 (512 addresses)&lt;/strong&gt; is specified.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select &lt;strong&gt;Add&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fai1t685taommecrk3o22.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fai1t685taommecrk3o22.png" alt="6fr65ftrf" width="800" height="401"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select &lt;strong&gt;Review + create&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqehft54nexi1inl160u5.png" alt="ygygtcc" width="800" height="400"&gt;
&lt;/li&gt;
&lt;li&gt;Once validation has passed, select &lt;strong&gt;Create&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Wait for the deployment to complete.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftgps273hzncbwhdg4ro9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftgps273hzncbwhdg4ro9.png" alt="fgugyugig" width="800" height="439"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configure Service Bus&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Complete the following steps to configure a Service Bus instance.&lt;/p&gt;

&lt;p&gt;Ensure that you have your Azure portal open in a browser window.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the top search bar of the Azure portal, in the Search textbox, enter &lt;strong&gt;service bus&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the search results, select &lt;strong&gt;Service Bus&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Create service bus namespace&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Basics tab, configure your Service bus namespace as follows:&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Subscription: Ensure that the Azure subscription is selected.&lt;br&gt;
Resource group name: Select &lt;strong&gt;RG1&lt;/strong&gt;&lt;br&gt;
Namespace name: Enter &lt;strong&gt;sb-az2003-LT&lt;/strong&gt;.&lt;br&gt;
Location: Ensure that &lt;strong&gt;Central US&lt;/strong&gt; is selected.&lt;br&gt;
Pricing tier: Select &lt;strong&gt;Basic&lt;/strong&gt;.&lt;br&gt;
Select &lt;strong&gt;Review + create&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Once the Validation succeeded message appears, select &lt;strong&gt;Create&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb6d6k6g157kfzrnc0cle.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb6d6k6g157kfzrnc0cle.png" alt="IDFRE" width="800" height="358"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frxpcguub7sbcmrk6bvmv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frxpcguub7sbcmrk6bvmv.png" alt="eftc5c" width="800" height="398"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftukrrquq53maariioijs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftukrrquq53maariioijs.png" alt="dht7u7" width="800" height="396"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Wait for the deployment to complete.&lt;/p&gt;
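&lt;p&gt;The same namespace can also be created from the CLI. Service Bus namespace names are globally unique, so substitute your own name if sb-az2003-LT is already taken:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Create a Basic-tier Service Bus namespace in Central US
az servicebus namespace create --resource-group RG1 \
  --name sb-az2003-LT --location centralus --sku Basic
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;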

&lt;p&gt;&lt;strong&gt;Configure Azure Container Registry&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Complete the following steps to configure a Container Registry instance.&lt;/p&gt;

&lt;p&gt;Ensure that you have your Azure portal open in a browser window.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the top search bar of the Azure portal, in the Search textbox, enter &lt;strong&gt;container registry&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the search results, select &lt;strong&gt;Container registries&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Container registries page, select &lt;strong&gt;Create container registry&lt;/strong&gt; or &lt;strong&gt;+ Create&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Basics tab of the Create container registry page, specify the following information:&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Subscription: Ensure that the Azure subscription is selected.&lt;br&gt;
Resource group: Select &lt;strong&gt;RG1&lt;/strong&gt;.&lt;br&gt;
Registry name: Enter &lt;strong&gt;acraz2003LT2025&lt;/strong&gt; &lt;br&gt;
Location: Ensure that &lt;strong&gt;Central US&lt;/strong&gt; is selected.&lt;br&gt;
SKU: Select &lt;strong&gt;Premium&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;The name of your Registry must be unique. Also, the Premium tier is required for private link with private endpoints&lt;/em&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Review + create&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Once validation has passed, select &lt;strong&gt;Create&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjqrvkzvddwher8k5t53d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjqrvkzvddwher8k5t53d.png" alt="YTYUTT" width="800" height="398"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgpc9yyv0ohht5cf3ynxu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgpc9yyv0ohht5cf3ynxu.png" alt="5656ytf7" width="800" height="397"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1nalbemu3n2a8kvs0dga.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1nalbemu3n2a8kvs0dga.png" alt="tyrr6766gff" width="800" height="399"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;After the deployment has completed, open the deployed resource.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the left-side menu, under Settings, select &lt;strong&gt;Networking&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Networking page, on the &lt;strong&gt;Public access&lt;/strong&gt; tab, ensure that &lt;strong&gt;All networks&lt;/strong&gt; is selected.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiv3j2gwp1wddiek47pa6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiv3j2gwp1wddiek47pa6.png" alt="yt66ff" width="800" height="397"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the left-side menu, under Settings, select &lt;strong&gt;Properties&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Properties page, select &lt;strong&gt;Admin user&lt;/strong&gt;, and then select &lt;strong&gt;Save&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fduxcd2vld6jq649ra7ot.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fduxcd2vld6jq649ra7ot.png" alt="f5rrrf" width="800" height="398"&gt;&lt;/a&gt;&lt;/p&gt;
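&lt;p&gt;The registry configuration above maps to the following CLI commands. Registry names are globally unique and the CLI expects lowercase alphanumeric names, so treat acraz2003lt2025 as a placeholder for your own registry name:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Create a Premium-tier registry (Premium is required for private endpoints)
az acr create --resource-group RG1 --name acraz2003lt2025 \
  --sku Premium --location centralus

# Enable the admin user, matching the Properties step above
az acr update --name acraz2003lt2025 --admin-enabled true
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;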

&lt;p&gt;&lt;strong&gt;Create a WebAPI app and publish to a GitHub repository&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Complete the following steps to create a WebAPI app and publish to a GitHub repository.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Open Visual Studio Code.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the File menu, select &lt;strong&gt;Open Folder&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a new folder named &lt;strong&gt;AZ2003&lt;/strong&gt; in a location that is easy to find.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For example, create a folder named AZ2003 on the Windows Desktop.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the Terminal menu, select &lt;strong&gt;New Terminal&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;At the terminal command prompt, to create a new ASP.NET Web API project, enter the following command:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;       dotnet new webapi --no-https
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiwe0jpo8hwirxo6g9pfj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiwe0jpo8hwirxo6g9pfj.png" alt="hgygyuhjg" width="800" height="468"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At the terminal command prompt, run the following dotnet CLI command:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      dotnet build
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;On the View menu, select Command Palette, and then run the following command: &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;     .NET: Generate Assets for Build and Debug.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;em&gt;If the command generates an error message, select &lt;strong&gt;OK&lt;/strong&gt;, and then run the command again&lt;/em&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;In the root project folder, create a &lt;strong&gt;.gitignore&lt;/strong&gt; file that contains the following information:&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  [Bb]in/
  [Oo]bj/
&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the File menu, select &lt;strong&gt;Save All&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Open the Source Control view.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Publish to GitHub&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;If prompted, to enable the GitHub extension to sign in using GitHub, select Allow, and then provide authorization in GitHub.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In Visual Studio Code, select &lt;strong&gt;Publish to GitHub public repository&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Ensure that the bin and obj folders are not included in the repository.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr7rusa8bh3hc916tsm1r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr7rusa8bh3hc916tsm1r.png" alt="66t6ggg" width="800" height="304"&gt;&lt;/a&gt;&lt;/p&gt;
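&lt;p&gt;If you would rather publish from the terminal than from the Source Control view, a rough equivalent is sketched below. It assumes the GitHub CLI (gh) is installed and already authenticated:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Initialize the repository and commit the project
# (the .gitignore keeps bin/ and obj/ out of the commit)
git init
git add .
git commit -m "Initial commit"

# Create a public GitHub repository from the current folder and push
gh repo create AZ2003 --public --source . --push
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;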

&lt;p&gt;&lt;strong&gt;Create Docker image and push to Azure Container Registry&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Complete the following steps to create a Docker image and push the image to your Azure Container Registry.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Ensure that you have your AZ2003 code project open in Visual Studio Code.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To create a Dockerfile, run the following command in the Command Palette: &lt;strong&gt;Docker: Add Docker Files to Workspace&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;When prompted, specify the following information:&lt;/p&gt;

&lt;p&gt;Application Platform: &lt;strong&gt;.NET ASP.NET Core&lt;/strong&gt;.&lt;br&gt;
Operating System: &lt;strong&gt;Linux&lt;/strong&gt;.&lt;br&gt;
Ports: &lt;strong&gt;5000&lt;/strong&gt;.&lt;br&gt;
Docker Compose files: &lt;strong&gt;No&lt;/strong&gt;.&lt;/p&gt;
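&lt;p&gt;&lt;em&gt;For reference, the generated Dockerfile typically follows a multi-stage pattern like the sketch below. This is an approximation, not the exact file VS Code produces; the AZ2003 project name and .NET 8 base images are assumptions&lt;/em&gt;:&lt;/p&gt;

```dockerfile
# Build stage: restore and publish the app with the full SDK image.
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
WORKDIR /src
COPY ["AZ2003.csproj", "./"]
RUN dotnet restore "AZ2003.csproj"
COPY . .
RUN dotnet publish "AZ2003.csproj" -c Release -o /app/publish

# Runtime stage: copy only the published output onto the slim ASP.NET image.
FROM mcr.microsoft.com/dotnet/aspnet:8.0 AS final
WORKDIR /app
EXPOSE 5000
ENV ASPNETCORE_URLS=http://+:5000
COPY --from=build /app/publish .
ENTRYPOINT ["dotnet", "AZ2003.dll"]
```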

&lt;p&gt;At a terminal command prompt, run the following docker CLI command:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      docker build --tag aspnetcorecontainer:latest .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ul&gt;
&lt;li&gt;The syntax for the build command is: &lt;strong&gt;docker build --tag &amp;lt;image-name&amp;gt;:&amp;lt;image-tag&amp;gt; .&lt;/strong&gt; The trailing period sets the build context to the current directory.
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;This command builds a container image that is hosted by Docker and accessible using the Docker extension for VS Code&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Wait for the Docker Build command to complete.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open the Visual Studio Code Command Palette, and then run the following command: &lt;strong&gt;Docker Images: Push&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;When the command runs, enter the following information:&lt;/p&gt;

&lt;p&gt;Select the docker image name that you created: &lt;strong&gt;aspnetcorecontainer&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Select the image tag that you created: &lt;strong&gt;latest&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If you see a message stating that no registry is connected, select &lt;strong&gt;Connect Registry&lt;/strong&gt;, and then enter the following information:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Registry provider: Select &lt;strong&gt;Azure&lt;/strong&gt;. Follow the online instructions to verify your Azure account if needed.&lt;/p&gt;

&lt;p&gt;Azure subscription: Select the Azure Subscription for this project.&lt;/p&gt;

&lt;p&gt;Select your Azure Container Registry resource: &lt;strong&gt;acraz2003LT25oct&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;An image tag is generated: &lt;strong&gt;acraz2003LT25oct.azurecr.io/aspnetcorecontainer:latest&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To push the image to your Container Registry, press &lt;strong&gt;Enter&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;The following Docker command is executed:&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      docker image push acraz2003LT25oct.azurecr.io/aspnetcorecontainer:latest
&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Wait for the image to be pushed to your Azure Container Registry.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Open the Source Control view and then &lt;strong&gt;Commit&lt;/strong&gt; and Sync your file updates.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
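&lt;p&gt;&lt;em&gt;The Docker extension steps above map onto plain CLI commands. A sketch, assuming Docker and the Azure CLI are installed and that your registry is named acraz2003LT25oct; it can't run without an Azure subscription&lt;/em&gt;:&lt;/p&gt;

```shell
# Authenticate Docker against the Azure Container Registry.
az acr login --name acraz2003LT25oct

# Re-tag the local image with the registry's login server prefix.
docker tag aspnetcorecontainer:latest \
  acraz2003LT25oct.azurecr.io/aspnetcorecontainer:latest

# Push the tagged image to the registry.
docker image push acraz2003LT25oct.azurecr.io/aspnetcorecontainer:latest
```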

&lt;p&gt;&lt;strong&gt;Configure Azure DevOps and a starter Pipeline&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Complete the following steps to configure Azure DevOps and a starter Pipeline:&lt;/p&gt;

&lt;p&gt;Open the Azure portal.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the top search bar, in the Search textbox, enter &lt;strong&gt;devops&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the search results, select &lt;strong&gt;Azure DevOps organizations&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;My Azure DevOps Organizations&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the home page of your organization, in the lower-left corner of the page, select &lt;strong&gt;Organization settings&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the left side menu under Security, select &lt;strong&gt;Policies&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Ensure that the &lt;strong&gt;Allow public projects policy&lt;/strong&gt; is set to &lt;strong&gt;On&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Return to the home page of your organization.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To create a new project, select &lt;strong&gt;New project&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;If you created a new organization, you may see the Create a project to get started page.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;On the Create new project page, specify the following information:&lt;/p&gt;

&lt;p&gt;Project name: &lt;strong&gt;Project1&lt;/strong&gt;&lt;br&gt;
Description: &lt;strong&gt;AZ-2003 project&lt;/strong&gt;&lt;br&gt;
Visibility: &lt;strong&gt;Public&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the Create new project page, select &lt;strong&gt;Create project&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffqicnkdc49hv621t9hns.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffqicnkdc49hv621t9hns.png" alt="yt76rttt" width="800" height="400"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1euolz0jwhv1051k3sus.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1euolz0jwhv1051k3sus.png" alt="iuyur5gygt" width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the left-side menu, select &lt;strong&gt;Repos&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Under Import a repository, select &lt;strong&gt;Import&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Import a Git repository page, enter the &lt;strong&gt;URL&lt;/strong&gt; for the GitHub repository you created for your code project, and then select &lt;strong&gt;Import&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpv4z8duj5pzb5gw75091.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpv4z8duj5pzb5gw75091.png" alt="hgftyr655" width="800" height="425"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuaq6o4hzza7pabtrui8y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuaq6o4hzza7pabtrui8y.png" alt="hgft665" width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Your repository URL should be similar to the following example:&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;a href="https://github.com/your-account/AZ2003" rel="noopener noreferrer"&gt;https://github.com/your-account/AZ2003&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the left-side menu, select &lt;strong&gt;Pipelines&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Create Pipeline&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F33cgukr3fk4y6h7rke48.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F33cgukr3fk4y6h7rke48.png" alt="hgy77tyy" width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Azure Repos Git&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkdcdynoy8subyua6qceb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkdcdynoy8subyua6qceb.png" alt="nyuijkj" width="800" height="421"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Select a repository page, select &lt;strong&gt;Project1&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkfchebmsii7urhvrhiu0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkfchebmsii7urhvrhiu0.png" alt="brhrhrt" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Starter pipeline&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvwyp42yxmo0yrvswcycu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvwyp42yxmo0yrvswcycu.png" alt="fnhurhb" width="800" height="421"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Open the &lt;strong&gt;Save and run&lt;/strong&gt; dropdown, select &lt;strong&gt;Save&lt;/strong&gt;, and then in the commit dialog select &lt;strong&gt;Save&lt;/strong&gt; again.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F44cv9dl4qhm8serr6ali.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F44cv9dl4qhm8serr6ali.png" alt="Idferg" width="800" height="422"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmvflnsfis7ehlnqbl50v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmvflnsfis7ehlnqbl50v.png" alt="fgrye" width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To rename the pipeline to Pipeline1, complete the following steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the left-side menu, select &lt;strong&gt;Pipelines&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To the right of the Project1 pipeline, select &lt;strong&gt;More options&lt;/strong&gt;, and then select &lt;strong&gt;Rename/move&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the Rename/move pipeline dialog, under Name, enter &lt;strong&gt;Pipeline1&lt;/strong&gt; and then select &lt;strong&gt;Save&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsmvnbsqzb3phfgqp64gf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsmvnbsqzb3phfgqp64gf.png" alt="fg56yry" width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Don't run the pipeline now. You will configure this pipeline during the project exercise&lt;/em&gt;.&lt;/p&gt;
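&lt;p&gt;&lt;em&gt;For context, the starter pipeline that Azure DevOps generates is a minimal azure-pipelines.yml similar to the following sketch (the exact comments vary by version)&lt;/em&gt;:&lt;/p&gt;

```yaml
# Starter pipeline: runs on pushes to main.
trigger:
- main

# Microsoft-hosted agent; the project exercise later swaps this
# for the self-hosted Default pool.
pool:
  vmImage: ubuntu-latest

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
```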

&lt;p&gt;&lt;strong&gt;Deploy a self-hosted Windows agent&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For an Azure Pipeline to build and deploy Windows, Azure, and other Visual Studio solutions, you need at least one Windows agent in the host environment.&lt;/p&gt;

&lt;p&gt;Complete the following steps to deploy a self-hosted Windows agent:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Ensure that you're signed-in to Azure DevOps with the user account you're using for your Azure DevOps organization.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;From the home page of your organization, open your user settings, and then select &lt;strong&gt;Personal access tokens&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To create a personal access token, select &lt;strong&gt;+ New Token&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Under Name, enter &lt;strong&gt;AZ2003&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;At the bottom of the Create a new personal access token window, to see the complete list of scopes, select &lt;strong&gt;Show all scopes&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Under the custom defined scopes, select &lt;strong&gt;Agent Pools (Read &amp;amp; manage)&lt;/strong&gt; and &lt;strong&gt;Deployment Groups (Read &amp;amp; manage)&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Ensure that all the other boxes are cleared.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Create&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyr5iyenv8nl6mdozsg9n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyr5iyenv8nl6mdozsg9n.png" alt="4rfefd" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the Success page, to copy the token, select &lt;strong&gt;Copy to clipboard&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv1z5b3eo12hm866prfh2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv1z5b3eo12hm866prfh2.png" alt="Ibnytjrnt" width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;You will use this token when you configure the agent.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Ensure that you're signed into Azure DevOps as the Azure DevOps organization owner.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select your DevOps organization, and then select &lt;strong&gt;Organization settings&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhw6bj04qscbdl3a3dw22.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhw6bj04qscbdl3a3dw22.png" alt="rrer4" width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the left side menu under Pipelines, select &lt;strong&gt;Agent pools&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fin5suwroecobwwqjvdsq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fin5suwroecobwwqjvdsq.png" alt="r6eytrtr" width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;If the &lt;strong&gt;Get the agent&lt;/strong&gt; dialog box opens, skip to the next step.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If a list of Agent pools is displayed, complete the following steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;To select the default pool, select &lt;strong&gt;Default&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;If the Default pool doesn't exist, select Add pool, and then enter the following information&lt;/em&gt;:&lt;br&gt;
&lt;em&gt;Under Pool type, select &lt;strong&gt;Self-hosted&lt;/strong&gt;&lt;/em&gt;.&lt;br&gt;
&lt;em&gt;Under Name, enter &lt;strong&gt;default&lt;/strong&gt;&lt;/em&gt;&lt;br&gt;
&lt;em&gt;Select &lt;strong&gt;Create&lt;/strong&gt;&lt;/em&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;To open the pool that you just created, select &lt;strong&gt;default&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqlnyow83y3q3ciljl9we.png" alt="fdgtrthrt" width="800" height="424"&gt;
&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Agents&lt;/strong&gt;, and then select &lt;strong&gt;New agent&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2yr1jijtrba47wyr8oke.png" alt="tryhdh" width="800" height="426"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;On the Get the agent dialog box, complete the following steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Select the &lt;strong&gt;Windows&lt;/strong&gt; tab.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the left pane, select the processor architecture of the installed Windows OS version on your machine.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The x64 agent version is intended for 64-bit Windows, whereas the x86 version is intended for 32-bit Windows.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the right pane, select &lt;strong&gt;Download&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Follow the instructions to download the agent.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Use File Explorer to create the following folder location for the agent: &lt;strong&gt;C:\agents&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Unpack the agent zip file into the directory you created.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Open PowerShell as an Administrator.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Navigate to the "C:\agents" directory, and then enter the following PowerShell command:&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;       .\config.cmd
&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0hmcn67py409x3q3yex3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0hmcn67py409x3q3yex3.png" alt="dtgtg" width="800" height="430"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Respond to the configuration prompts as follows:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Enter server URL &amp;gt;: enter the URL for your &lt;strong&gt;DevOps organization&lt;/strong&gt;.&lt;br&gt;
Enter authentication type (press enter for PAT) &amp;gt;: press &lt;strong&gt;Enter&lt;/strong&gt;.&lt;br&gt;
Enter personal access token &amp;gt;: paste in the &lt;strong&gt;personal access token&lt;/strong&gt; that you copied to the clipboard earlier.&lt;br&gt;
Enter agent pool (press enter for default) &amp;gt;: press &lt;strong&gt;Enter&lt;/strong&gt;.&lt;br&gt;
Enter agent name (press enter for YOUR-PC-NAME) &amp;gt;: enter &lt;strong&gt;az2003-agent&lt;/strong&gt;.&lt;br&gt;
Enter work folder (press enter for _work) &amp;gt;: press &lt;strong&gt;Enter&lt;/strong&gt;.&lt;br&gt;
Enter run agent as service? (Y/N) (press enter for N) &amp;gt;: enter &lt;strong&gt;Y&lt;/strong&gt;.&lt;br&gt;
Enter enable SERVICE_SID_TYPE_UNRESTRICTED for agent service (Y/N) (press enter for N) &amp;gt;: enter &lt;strong&gt;Y&lt;/strong&gt;.&lt;br&gt;
Enter User account to use for the service (press enter for NT AUTHORITY\NETWORK SERVICE) &amp;gt;: press &lt;strong&gt;Enter&lt;/strong&gt;.&lt;br&gt;
Enter whether to prevent service starting immediately after configuration is finished? (Y/N) (press enter for N) &amp;gt;: press &lt;strong&gt;Enter&lt;/strong&gt;.&lt;/p&gt;
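&lt;p&gt;&lt;em&gt;The folder creation, unpacking, and interactive prompts can also be scripted with the agent's documented unattended flags. A hedged PowerShell sketch; the zip file name, organization URL, and YOUR-PAT value are placeholders you'd replace&lt;/em&gt;:&lt;/p&gt;

```powershell
# Create the agent folder and unpack the downloaded agent zip into it.
New-Item -ItemType Directory -Path C:\agents -Force
Expand-Archive -Path "$HOME\Downloads\vsts-agent-win-x64-3.x.x.zip" -DestinationPath C:\agents

# Answer the same configuration prompts unattended via flags.
Set-Location C:\agents
.\config.cmd --unattended `
  --url https://dev.azure.com/your-organization `
  --auth pat --token YOUR-PAT `
  --pool default --agent az2003-agent --work _work `
  --runAsService
```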

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fibhlsl8c4bqqrmfgjzei.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fibhlsl8c4bqqrmfgjzei.png" alt="ghtdfbfd" width="800" height="461"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A message informing you that the agent started successfully is displayed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnpxx1v287kiqpxkb172c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnpxx1v287kiqpxkb172c.png" alt="jjnknkuu" width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You're now ready to begin your project.&lt;/p&gt;

</description>
      <category>devops</category>
      <category>azure</category>
      <category>cloudnative</category>
      <category>cloudcomputing</category>
    </item>
    <item>
      <title>Manage revisions in Azure Container Apps</title>
      <dc:creator>lotanna obianefo</dc:creator>
      <pubDate>Mon, 01 Dec 2025 02:04:26 +0000</pubDate>
      <link>https://dev.to/lotanna_obianefo/manage-revisions-in-azure-container-apps-5938</link>
      <guid>https://dev.to/lotanna_obianefo/manage-revisions-in-azure-container-apps-5938</guid>
      <description>&lt;p&gt;Azure Container Apps includes a built-in revision management system that allows you to track, control, and roll back application updates with ease. Each time you deploy new code or update configuration settings, Azure Container Apps automatically generates a new immutable revision. These revisions enable precise traffic management, safe deployments, blue-green or canary release strategies, and quick rollback in case of failures. By leveraging revision modes, traffic-splitting, and version history, teams can maintain application stability while continuously delivering updates in a controlled and observable manner.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Set revision management to multiple&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the Azure portal, open your container app resource.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the left side menu, under Application, select &lt;strong&gt;Revisions and replicas&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;At the top of the Revisions and replicas page, select &lt;strong&gt;Choose revision mode&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To switch from single to &lt;strong&gt;multi-revision&lt;/strong&gt; mode, select &lt;strong&gt;Confirm&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Revisions and replicas page, wait for the Revision Mode setting to update.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc89dy814fn2e22o1m4r3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc89dy814fn2e22o1m4r3.png" alt="gftdrttd" width="800" height="390"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbg29tcmhygm6duws21y7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbg29tcmhygm6duws21y7.png" alt="tf5rydrdc" width="800" height="393"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;The Revision Mode will be set to Multiple after the update. Also, on the left-side menu, the section title changes from Application to Revisions&lt;/em&gt;.&lt;/p&gt;
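&lt;p&gt;&lt;em&gt;The same switch can be made with the Azure CLI. A sketch, using hypothetical app and resource group names (aca-az2003 and rg-az2003); substitute your own&lt;/em&gt;:&lt;/p&gt;

```shell
# Switch the container app from single- to multiple-revision mode.
az containerapp revision set-mode \
  --name aca-az2003 \
  --resource-group rg-az2003 \
  --mode multiple
```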

&lt;p&gt;&lt;strong&gt;Create a new revision with a v2 suffix&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the Azure portal, ensure that you have the Revisions and replicas page of your container app resource open.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;At the top of the page, select &lt;strong&gt;+ Create new revision&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;On the Create and deploy new revision page, complete the following steps:&lt;/p&gt;

&lt;p&gt;Name / suffix: Enter &lt;strong&gt;v2&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Under Container image, select your container image: &lt;strong&gt;aca-az2003&lt;/strong&gt;.
Select &lt;strong&gt;Create&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhwveaiyt1p53rbmrj4dh.png" alt="x5etfytt" width="800" height="397"&gt;
Wait for the deployment to complete.&lt;/li&gt;
&lt;/ul&gt;
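&lt;p&gt;&lt;em&gt;Equivalently, the v2 revision can be created from the CLI by updating the app with a revision suffix. A sketch with assumed names (app, resource group, and image); adjust to match your resources&lt;/em&gt;:&lt;/p&gt;

```shell
# Deploy the same image as a new revision with the -v2 suffix.
az containerapp update \
  --name aca-az2003 \
  --resource-group rg-az2003 \
  --image acraz2003LT25oct.azurecr.io/aspnetcorecontainer:latest \
  --revision-suffix v2
```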

&lt;p&gt;&lt;strong&gt;Configure labels on the revisions&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the left-side menu, under Settings, select &lt;strong&gt;Ingress&lt;/strong&gt;.&lt;br&gt;
If Ingress isn't enabled, select &lt;strong&gt;Enabled&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Ingress page, specify the following information:&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Ingress traffic: select &lt;strong&gt;Accepting traffic from anywhere&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Ingress type: select &lt;strong&gt;HTTP&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Client certificate mode: ensure that &lt;strong&gt;Ignore&lt;/strong&gt; is selected.&lt;/p&gt;

&lt;p&gt;Transport: ensure that &lt;strong&gt;Auto&lt;/strong&gt; is selected.&lt;/p&gt;

&lt;p&gt;Insecure connections: ensure that &lt;strong&gt;Allowed&lt;/strong&gt; is NOT checked.&lt;/p&gt;

&lt;p&gt;Target port: enter &lt;strong&gt;5000&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;IP Security Restrictions Mode: ensure that &lt;strong&gt;Allow all traffic&lt;/strong&gt; is selected.&lt;/p&gt;

&lt;p&gt;At the bottom of the Ingress page, select &lt;strong&gt;Save&lt;/strong&gt;, and then wait for the update to complete.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkbzfpnrs8wg2gcsc1cg5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkbzfpnrs8wg2gcsc1cg5.png" alt="7y76t6trr" width="800" height="360"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foshuw8dqmttfk5itq166.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foshuw8dqmttfk5itq166.png" alt="ygress" width="800" height="241"&gt;&lt;/a&gt;&lt;/p&gt;
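&lt;p&gt;&lt;em&gt;The same ingress settings can be applied from the CLI. A sketch with the same assumed app and resource group names&lt;/em&gt;:&lt;/p&gt;

```shell
# Enable external HTTP ingress on port 5000 (auto transport, HTTPS only).
az containerapp ingress enable \
  --name aca-az2003 \
  --resource-group rg-az2003 \
  --type external \
  --target-port 5000 \
  --transport auto
```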

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the left-side menu, under Revisions, select &lt;strong&gt;Revisions and replicas&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For the v2 revision, under Label, enter &lt;strong&gt;updated&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For the other revision, enter &lt;strong&gt;current&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;At the top of the page, select &lt;strong&gt;Save&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvsd2lwp7elh4kri5qwam.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvsd2lwp7elh4kri5qwam.png" alt="guyftfyuyg" width="800" height="368"&gt;&lt;/a&gt;&lt;/p&gt;
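&lt;p&gt;&lt;em&gt;Labels can also be attached from the CLI. A sketch with assumed names; the revision names shown are placeholders, since the original revision's suffix depends on your deployment&lt;/em&gt;:&lt;/p&gt;

```shell
# Attach a stable label to each revision (revision names are placeholders).
az containerapp revision label add \
  --name aca-az2003 --resource-group rg-az2003 \
  --label updated --revision aca-az2003--v2

az containerapp revision label add \
  --name aca-az2003 --resource-group rg-az2003 \
  --label current --revision aca-az2003--ORIGINAL-SUFFIX
```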

&lt;p&gt;&lt;strong&gt;Configure a traffic percentage on the revisions&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Ensure that you have the Revisions and replicas page open.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;For the v2 revision, under Traffic, enter &lt;strong&gt;25&lt;/strong&gt; as the percentage.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For the other revision, under Traffic, enter &lt;strong&gt;75&lt;/strong&gt; as the percentage.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;At the top of the page, select &lt;strong&gt;Save&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsry5e0ibb2464uyj4gh5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsry5e0ibb2464uyj4gh5.png" alt="Iijugt6t" width="800" height="390"&gt;&lt;/a&gt;&lt;/p&gt;
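&lt;p&gt;&lt;em&gt;The same 75/25 split can be set from the CLI against the labels configured earlier. A sketch with the same assumed names&lt;/em&gt;:&lt;/p&gt;

```shell
# Route 75% of traffic to the "current" label and 25% to "updated".
az containerapp ingress traffic set \
  --name aca-az2003 --resource-group rg-az2003 \
  --label-weight current=75 updated=25
```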

&lt;p&gt;&lt;strong&gt;Verify your work&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Ensure that you have your Container App open in the Azure portal.&lt;/p&gt;

&lt;p&gt;On the left-side menu, under Application, select &lt;strong&gt;Revisions and replicas&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Verify that your revisions are configured as follows:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F69lyym9i6hsk23oyw7zm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F69lyym9i6hsk23oyw7zm.png" alt="jhhjgytfryuyt" width="800" height="143"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Azure Container Apps uses a revision-based deployment model that creates a new immutable revision each time you update a container app’s configuration or container image. This enables safe, controlled rollouts without disrupting existing traffic. Revisions can be activated, deactivated, or rolled back as needed, allowing teams to test new versions, split traffic for gradual releases, and maintain high availability.&lt;/p&gt;
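&lt;p&gt;The 75/25 split configured above can also be expressed declaratively. As a rough sketch, assuming the container app is managed from a YAML specification (for example with the Azure CLI's az containerapp update --yaml option), the traffic section would look like the following; the revision names shown are placeholders for your app's actual revision names:&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;configuration:
  ingress:
    traffic:
    - revisionName: aca-az2003--v1   # placeholder: the existing revision serving 75%
      weight: 75
    - revisionName: aca-az2003--v2   # placeholder: the new revision serving 25%
      weight: 25
&lt;/code&gt;&lt;/pre&gt;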

</description>
      <category>devops</category>
      <category>azure</category>
      <category>cloud</category>
      <category>cloudnative</category>
    </item>
    <item>
      <title>Configure continuous integration by using Azure Pipelines</title>
      <dc:creator>lotanna obianefo</dc:creator>
      <pubDate>Thu, 27 Nov 2025 22:38:51 +0000</pubDate>
      <link>https://dev.to/lotanna_obianefo/configure-continuous-integration-by-using-azure-pipelines-16i2</link>
      <guid>https://dev.to/lotanna_obianefo/configure-continuous-integration-by-using-azure-pipelines-16i2</guid>
<description>&lt;p&gt;Continuous Integration (CI) enables development teams to automatically build, test, and validate application changes whenever code is committed to a repository. In Azure environments, Azure Pipelines provides a robust CI service that integrates seamlessly with platforms like GitHub and Azure Repos. By configuring CI with Azure Pipelines, teams can automate building container images, running tests, and securely pushing updated images to Azure Container Registry (ACR). This automated workflow ensures consistent deployments, reduces manual errors, and supports a scalable DevOps lifecycle for applications running on Azure Container Apps.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configure Pipeline1 to use the self-hosted agent pool&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Open a browser window, navigate to &lt;a href="https://dev.azure.com" rel="noopener noreferrer"&gt;https://dev.azure.com&lt;/a&gt;, and then open your Azure DevOps organization.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On your Azure DevOps page, to open your DevOps project, select Project1.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the left-side menu, select Pipelines.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Pipeline1&lt;/strong&gt;, and then select &lt;strong&gt;Edit&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdfntz6rpryd4s2e1pth5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdfntz6rpryd4s2e1pth5.png" alt="6ftrd6uyu" width="800" height="426"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhpltu5xzpr7o2qvlgvo9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhpltu5xzpr7o2qvlgvo9.png" alt="7ttfrd" width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;To use the self-hosted agent pool, update the azure-pipelines.yml file as shown in the following example:&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;           trigger:
            - main

           pool:
           name: default

           steps:
&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Recall that the &lt;strong&gt;pool&lt;/strong&gt; section specifies the agent pool to use for the pipeline. The &lt;strong&gt;name&lt;/strong&gt; property specifies the name of the agent pool. In this case, the name is &lt;strong&gt;default&lt;/strong&gt;, which is the pool you configured as a self-hosted agent pool&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmry286vvryncnufo4189.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmry286vvryncnufo4189.png" alt="fytkfyt66" width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Under Validate and save, select Save without validating.&lt;/li&gt;
&lt;li&gt;Enter a commit message, and then select Save.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw2m51fg1e8yolepbcqd8.png" alt="I87uhgyt" width="800" height="421"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp5bjq8yg5cblscx38q21.png" alt="ht6dxzd" width="800" height="424"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Configure Pipeline1 with an Azure Container Apps deployment task&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Ensure that you have Pipeline1 open for editing.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the right side under Tasks, in the Search tasks field, enter &lt;strong&gt;azure container&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;In the filtered list of tasks, select &lt;strong&gt;Azure Container Apps Deploy&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Under Azure Resource Manager connection, select the Subscription you're using, and then select &lt;strong&gt;Authorize&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwk9lkcfsta6b90uunrl6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwk9lkcfsta6b90uunrl6.png" alt="87tfdrt" width="800" height="422"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foqiebrsmon7qaw95yl3g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foqiebrsmon7qaw95yl3g.png" alt="87trdtryy" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the Azure portal tab, open your Container App resource, and then open the Containers page.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Use the information on the Containers page to configure the following Pipeline1 Task information:&lt;/p&gt;

&lt;p&gt;Docker Image to Deploy: &lt;strong&gt;Registry/Image:Image tag&lt;/strong&gt;&lt;br&gt;
Azure Container App name: &lt;strong&gt;Name&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Configure the following &lt;strong&gt;Pipeline1&lt;/strong&gt; Task information:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Azure Resource group name: &lt;strong&gt;RG1&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the Azure Container Apps Deploy page, select &lt;strong&gt;Add&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftgj20p1f9j6n3qvim2yp.png" alt="76ttr" width="800" height="424"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnk65senxrbgjnk6f8wje.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnk65senxrbgjnk6f8wje.png" alt="uyt76rrrrr" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The YAML file for your pipeline should now include the AzureContainerApps task as follows:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;                 trigger:
                   - main
                 pool:
                   name: default
                 steps:
                   - task: AzureContainerApps@1
                     inputs:
                       azureSubscription: '&amp;lt;Subscription&amp;gt;(&amp;lt;Subscription ID&amp;gt;)'
                       imageToDeploy: '&amp;lt;Registry&amp;gt;/&amp;lt;Image&amp;gt;:&amp;lt;Image tag&amp;gt;' from Container App resource
                       containerAppName: '&amp;lt;Name&amp;gt;' from Container App resource 
                       resourceGroup: '&amp;lt;resource group name&amp;gt;'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Select &lt;strong&gt;Validate and save&lt;/strong&gt;, and then select &lt;strong&gt;Save&lt;/strong&gt; again to commit directly to the main branch.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fowweq62jvi24tp4zbz4j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fowweq62jvi24tp4zbz4j.png" alt="rtcewwd" width="800" height="422"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyldg0k6nqp8723hjs0du.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyldg0k6nqp8723hjs0du.png" alt="4ctbeub" width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;If you encounter an error at this point, it is most likely an indentation issue. YAML is whitespace-sensitive, so review the file, correct the indentation, and save again&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Navigate back to the main page of your pipeline.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Run the Pipeline1 deployment task&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Ensure that you have Pipeline1 open in Azure DevOps.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the Runs tab of the Pipeline1 page, select Run pipeline.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A Run pipeline page opens to display the associated job.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Run&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhdzu086le7e6gb9s34v5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhdzu086le7e6gb9s34v5.png" alt="6tderdtrr" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;The Jobs section displays job status, which progresses from Queued to Waiting&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;It can take a couple of minutes for the status to transition from Queued to Waiting&lt;/em&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If a 'Permission needed' message is displayed ("This pipeline needs permission to access 2 resources before this run can continue"), select &lt;strong&gt;View&lt;/strong&gt; and then provide the required permissions.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3673mmcjkwbugvuelpij.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3673mmcjkwbugvuelpij.png" alt="6frdrtfft" width="800" height="426"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq3n3cw8ez1i7g6iptose.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq3n3cw8ez1i7g6iptose.png" alt="ftrrd55" width="800" height="424"&gt;&lt;/a&gt;&lt;br&gt;
Monitor the status of the run operation and verify that the run is successful.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdrbmjkl5qcxc6kvth5d7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdrbmjkl5qcxc6kvth5d7.png" alt="r6rdduy" width="800" height="142"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Verify your work&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this task, you examine your pipeline and container app to verify successful pipeline runs.&lt;/p&gt;

&lt;p&gt;Ensure that you have &lt;strong&gt;Project1&lt;/strong&gt; open in Azure DevOps.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the left side menu, select &lt;strong&gt;Pipelines&lt;/strong&gt;, and then select &lt;strong&gt;Pipeline1&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Runs tab displays individual runs that can be selected to review details.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1v95nlft16gn7a8ng339.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1v95nlft16gn7a8ng339.png" alt="f65detr" width="800" height="401"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Open your Azure portal, and then open your Container App.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the left side menu, select Activity Log.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Verify that a Create or Update Container App operation succeeded as a result of running your pipeline.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ferwb9z5k4gg9csx4ls08.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ferwb9z5k4gg9csx4ls08.png" alt="I87yrr" width="800" height="385"&gt;&lt;/a&gt;&lt;br&gt;
Notice that the Event initiated by column on the right shows your Project1 as the source.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In summary, configuring continuous integration (CI) with Azure Pipelines enables automated building, testing, and deployment of containerized applications to Azure Container Apps. By integrating your source code repository with Azure Pipelines, each commit or pull request can automatically trigger a pipeline that builds the container image, pushes it to Azure Container Registry (ACR), and deploys the updated image to Azure Container Apps. This setup ensures consistent application delivery, reduces manual overhead, and improves deployment reliability. Using YAML-based pipelines also provides version-controlled build definitions, promoting repeatability and alignment with DevOps best practices.&lt;/p&gt;
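&lt;p&gt;Putting the pieces together, a complete azure-pipelines.yml for this flow looks roughly like the sketch below. The Docker@2 build-and-push step is an assumed addition (this walkthrough deploys an image that already exists in ACR), and the service connection, repository, and tag values are placeholders for your own:&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;trigger:
- main

pool:
  name: default    # the self-hosted agent pool

steps:
# Assumed step: build the image and push it to ACR on each commit
- task: Docker@2
  inputs:
    containerRegistry: 'acr-service-connection'   # placeholder service connection name
    repository: 'aspnetcorecontainer'
    command: 'buildAndPush'
    Dockerfile: '**/Dockerfile'
    tags: 'latest'
# Deploy the updated image to the container app
- task: AzureContainerApps@1
  inputs:
    azureSubscription: '&amp;lt;Subscription&amp;gt; (&amp;lt;Subscription ID&amp;gt;)'
    imageToDeploy: '&amp;lt;Registry&amp;gt;/&amp;lt;Image&amp;gt;:&amp;lt;Image tag&amp;gt;'
    containerAppName: '&amp;lt;Name&amp;gt;'
    resourceGroup: 'RG1'
&lt;/code&gt;&lt;/pre&gt;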

</description>
      <category>azure</category>
      <category>cicd</category>
      <category>devops</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Create and configure a container app in Azure Container Apps</title>
      <dc:creator>lotanna obianefo</dc:creator>
      <pubDate>Thu, 27 Nov 2025 03:44:52 +0000</pubDate>
      <link>https://dev.to/lotanna_obianefo/create-and-configure-a-container-app-in-azure-container-apps-14hj</link>
      <guid>https://dev.to/lotanna_obianefo/create-and-configure-a-container-app-in-azure-container-apps-14hj</guid>
      <description>&lt;p&gt;Creating and configuring a container app in Azure Container Apps involves deploying a containerized application into a fully managed, serverless environment designed for cloud-native workloads. Azure Container Apps abstracts away the need to manage Kubernetes clusters while still providing key capabilities such as autoscaling, revisions, traffic splitting, and secure integration with supporting services like Azure Container Registry.&lt;/p&gt;

&lt;p&gt;In this process, you define the container image to run, configure environment settings such as ingress and scaling rules, and connect the app to the required resources and identities. This setup enables reliable, scalable, and efficient execution of microservices and event-driven applications without managing underlying infrastructure.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create a container app that uses an ACR image&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Open your Azure portal.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the portal menu, select &lt;strong&gt;+ Create&lt;/strong&gt; a resource.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the top search bar, in the Search textbox, enter &lt;strong&gt;container app&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the search results under Services, select &lt;strong&gt;Container Apps&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Create&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the &lt;strong&gt;Basics&lt;/strong&gt; tab, specify the following:&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Subscription: Specify the Azure subscription that you're using.&lt;br&gt;
Resource group: &lt;strong&gt;RG1&lt;/strong&gt;&lt;br&gt;
Container app name: &lt;strong&gt;aca-az2003&lt;/strong&gt;&lt;br&gt;
Region: Select the Region specified for VNET1 &lt;strong&gt;(Central US)&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;The container app needs to be in the same region/location as the virtual network so you can choose VNET1 for the managed environment. For this guided project, keep all of your resources in the region/location specified for your resource group&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Container Apps Environment: Select &lt;strong&gt;Create new&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhetzi34xdjb7xaq7ioa3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhetzi34xdjb7xaq7ioa3.png" alt="rtge54" width="800" height="397"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnxp97ktaecvbxvr03csg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnxp97ktaecvbxvr03csg.png" alt="YUDSSS" width="800" height="352"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk1xnowrjc219mq10ecm3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk1xnowrjc219mq10ecm3.png" alt="II8Y6TR6" width="800" height="194"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the Create Container Apps Environment page, select the &lt;strong&gt;Networking&lt;/strong&gt; tab, and then specify the following:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Use your own virtual network: Select &lt;strong&gt;Yes&lt;/strong&gt;.&lt;br&gt;
Virtual network: Select &lt;strong&gt;VNET1&lt;/strong&gt;.&lt;br&gt;
Infrastructure subnet: &lt;strong&gt;ACASubnet&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;If the ACASubnet subnet is not listed, open your virtual network resource, adjust the subnet address range to 10.0.2.0/23 and retry the steps to create the Container App&lt;/em&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the Create Container Apps Environment page, select &lt;strong&gt;Create&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqnig8uisew0bpwzlte95.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqnig8uisew0bpwzlte95.png" alt="UYEW3A3" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Create Container App page, select the &lt;strong&gt;Container&lt;/strong&gt; tab, and then specify the following:&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Use quickstart image: Ensure that this setting is not selected. If it is selected, &lt;strong&gt;uncheck&lt;/strong&gt; this setting.&lt;br&gt;
Name: Enter &lt;strong&gt;aca-az2003&lt;/strong&gt;&lt;br&gt;
Image source: Ensure that &lt;strong&gt;Azure Container Registry&lt;/strong&gt; is selected.&lt;br&gt;
Registry: Select your container registry: &lt;strong&gt;acraz2003cah.azurecr.io&lt;/strong&gt;&lt;br&gt;
Image: Select &lt;strong&gt;aspnetcorecontainer&lt;/strong&gt;&lt;br&gt;
Image tag: Select &lt;strong&gt;latest&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select &lt;strong&gt;Review + create&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F901y8ugutbridz2ise2w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F901y8ugutbridz2ise2w.png" alt="yut65rtfe" width="800" height="401"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Once verification has Passed, select &lt;strong&gt;Create&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Wait for the deployment to complete.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb9g2iqo05ogaatov8zxl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb9g2iqo05ogaatov8zxl.png" alt="ytr655" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configure the container app to authenticate using the user assigned identity&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the Azure portal, open the Container App that you created.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Under Security, select &lt;strong&gt;Identity&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select the tab for &lt;strong&gt;User assigned&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Add user assigned managed identity&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Add user assigned managed identity page, select &lt;strong&gt;uai-az2003&lt;/strong&gt;, and then select &lt;strong&gt;Add&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9wqbap8xaw2atnqh4egh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9wqbap8xaw2atnqh4egh.png" alt="tr5rtt" width="800" height="401"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjauv8bt4tomy7v1wbxqq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjauv8bt4tomy7v1wbxqq.png" alt="husdfdg" width="800" height="400"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkzow1fxbr0czywxqn32z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkzow1fxbr0czywxqn32z.png" alt="sgdevge" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configure a connection between the container app and Service Bus&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the Azure portal, ensure that you have your Container App open.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Under Settings, select &lt;strong&gt;Service Connector&lt;/strong&gt; (Preview).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select Connect to your Services.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Create connection page, specify the following:&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Service type: Select &lt;strong&gt;Service Bus&lt;/strong&gt;.&lt;br&gt;
Client type: Select &lt;strong&gt;.NET&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select &lt;strong&gt;Next&lt;/strong&gt;: Authentication.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpcrd2aiq75wid3ufe5y7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpcrd2aiq75wid3ufe5y7.png" alt="dr34fff" width="800" height="403"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3sewi4a7nzjczt4o35k4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3sewi4a7nzjczt4o35k4.png" alt="ert45egfe" width="800" height="402"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the Authentication tab, select &lt;strong&gt;User assigned managed identity&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Ensure that the correct subscription and user assigned managed identity are selected.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Subscription: The Azure subscription that you're using. &lt;br&gt;
User assigned managed identity: &lt;strong&gt;uai-az2003&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Select &lt;strong&gt;Review + Create&lt;/strong&gt; to move to the final tab.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv9nsk3agltbdjom93u4d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv9nsk3agltbdjom93u4d.png" alt="kjfdgqw" width="800" height="401"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Once the Validation passed message appears, select &lt;strong&gt;Create&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Wait for the connection to be created.&lt;/p&gt;

&lt;p&gt;It can take a minute before the Service Connector page updates with the new connection.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx66gs28zx9sm01qawqdo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx66gs28zx9sm01qawqdo.png" alt="trd45drd" width="800" height="405"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx0v5i2i7a63810t7dek8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx0v5i2i7a63810t7dek8.png" alt="yu5rrr" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configure HTTP scale rules&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Ensure that your Container App is open in the portal.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the left-side menu under &lt;strong&gt;Application&lt;/strong&gt;, select &lt;strong&gt;Revisions and replicas&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Notice the Name assigned to your active revision.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the left-side menu under Application, select &lt;strong&gt;Containers&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To the right of Based on revision, ensure that your active revision is selected.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;At the top of the page, select &lt;strong&gt;Edit and deploy&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;At the bottom of the page, select &lt;strong&gt;Next&lt;/strong&gt;: Scale.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Configure the Min / max replicas as follows:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Set Min replicas: &lt;strong&gt;0&lt;/strong&gt;&lt;br&gt;
Set Max replicas: &lt;strong&gt;2&lt;/strong&gt;&lt;br&gt;
Under Scale rule, select &lt;strong&gt;+ Add&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnjb35fpzhgq3p8m98gjx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnjb35fpzhgq3p8m98gjx.png" alt="iu8tfd4dc" width="800" height="402"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the Add scale rule page, specify the following:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Rule name: Enter &lt;strong&gt;scalerule-http&lt;/strong&gt;&lt;br&gt;
Type: Select &lt;strong&gt;HTTP scaling&lt;/strong&gt;.&lt;br&gt;
Concurrent requests: Set the value to &lt;strong&gt;10,000&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the Add scale rule page, select &lt;strong&gt;Add&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;On the Create and deploy new revision page, select &lt;strong&gt;Create&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Ensure that your new scale rule is displayed.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcn1xj3gmrfmvu5d0l8zn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcn1xj3gmrfmvu5d0l8zn.png" alt="je43cffw" width="800" height="402"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmj8w9tsc705y6r8f371b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmj8w9tsc705y6r8f371b.png" alt="hg6t5dr66" width="800" height="401"&gt;&lt;/a&gt;&lt;/p&gt;
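&lt;p&gt;If you prefer scripting over the portal, the same scaling configuration can be sketched with the Azure CLI. The app name &lt;em&gt;aca-az2003&lt;/em&gt; and resource group &lt;em&gt;RG1&lt;/em&gt; are assumptions based on this walkthrough; substitute your own values.&lt;/p&gt;

```shell
# Sketch: min/max replicas plus an HTTP scale rule, mirroring the portal steps.
# aca-az2003 and RG1 are assumed names from this walkthrough.
az containerapp update \
  --name aca-az2003 \
  --resource-group RG1 \
  --min-replicas 0 \
  --max-replicas 2 \
  --scale-rule-name scalerule-http \
  --scale-rule-type http \
  --scale-rule-http-concurrency 10000
```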

&lt;p&gt;&lt;strong&gt;Verify your work&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Next, verify that your configuration meets the specified requirements.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;In the Azure portal, ensure that your Container App resource is open.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the left-side menu, under Settings, select &lt;strong&gt;Deployment&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;At the top of the page, ensure that the &lt;strong&gt;Continuous deployment&lt;/strong&gt; tab is selected.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Verify that the expected Registry settings are reported:&lt;/p&gt;

&lt;p&gt;Repository source: &lt;strong&gt;Azure Container Registry&lt;/strong&gt;&lt;br&gt;
Registry: the name of your Container Registry, for example &lt;strong&gt;acraz2003LT2025&lt;/strong&gt;&lt;br&gt;
Image: &lt;strong&gt;aca-az2003&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2bv97szbw1hdja4yeci1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2bv97szbw1hdja4yeci1.png" alt="65rrdffy" width="800" height="518"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open your Container Apps Environment resource.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Verify that your Container App uses the proper subnet as follows:&lt;/p&gt;

&lt;p&gt;On the Overview page, verify that Virtual Network is set to &lt;strong&gt;VNET1&lt;/strong&gt;.&lt;br&gt;
On the &lt;strong&gt;Overview&lt;/strong&gt; page, verify that Infrastructure subnet is set to &lt;strong&gt;ACASubnet&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsknztih2ny82w7w0ykyz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsknztih2ny82w7w0ykyz.png" alt="c65rfcr" width="800" height="402"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Verify that the targetService properties match the specified configuration.&lt;/p&gt;

&lt;p&gt;To verify the HTTP scale rule itself, you would need a load-testing tool that can simulate 10,000 concurrent HTTP requests, and then confirm that additional container replicas are created.&lt;/p&gt;
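&lt;p&gt;As a rough mental model, Container Apps HTTP scaling is KEDA-based: roughly one replica per &lt;em&gt;Concurrent requests&lt;/em&gt; worth of traffic, clamped between the min and max replica counts. A small shell sketch of that arithmetic (the exact scaler behavior may differ):&lt;/p&gt;

```shell
# Approximate KEDA-style HTTP scaling: one replica per `threshold` concurrent
# requests, clamped to the configured min/max from the rule above.
threshold=10000
min_replicas=0
max_replicas=2

replicas() {
  local c=$1
  if [ "$c" -le 0 ]; then
    echo "$min_replicas"
    return
  fi
  # Ceiling division: how many replicas the load "wants".
  local d=$(( (c + threshold - 1) / threshold ))
  [ "$d" -gt "$max_replicas" ] && d=$max_replicas
  echo "$d"
}

replicas 500      # prints 1
replicas 25000    # prints 2 (clamped to max)
```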

</description>
      <category>containers</category>
      <category>azure</category>
      <category>devops</category>
      <category>cloudnative</category>
    </item>
    <item>
      <title>Configure Azure Container Registry for a secure connection with Azure Container Apps</title>
      <dc:creator>lotanna obianefo</dc:creator>
      <pubDate>Thu, 27 Nov 2025 00:57:38 +0000</pubDate>
      <link>https://dev.to/lotanna_obianefo/configure-azure-container-registry-for-a-secure-connection-with-azure-container-apps-4peo</link>
      <guid>https://dev.to/lotanna_obianefo/configure-azure-container-registry-for-a-secure-connection-with-azure-container-apps-4peo</guid>
      <description>&lt;p&gt;Configuring Azure Container Registry (ACR) for a secure connection with Azure Container Apps is a crucial step in ensuring that your containerized applications are deployed safely and efficiently. This process involves setting up permissions and authentication so Azure Container Apps can securely pull container images from ACR without exposing credentials. By integrating ACR with managed identities or workload identities, teams can streamline deployments, improve security, and maintain a clean, automated DevOps workflow.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configure a user-assigned managed identity&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Open your Azure portal.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the portal menu, select &lt;strong&gt;+ Create a resource&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Create a resource page, in the Search services and marketplace text box, enter &lt;strong&gt;managed identity&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the filtered list of resources, select &lt;strong&gt;User Assigned Managed Identity&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the User Assigned Managed Identity page, select &lt;strong&gt;Create&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Create User Assigned Managed Identity page, specify the following information:&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Subscription: Specify the Azure subscription that you're using for this guided project.&lt;br&gt;
Resource group: &lt;strong&gt;RG1&lt;/strong&gt;&lt;br&gt;
Region: &lt;strong&gt;Central US&lt;/strong&gt;&lt;br&gt;
Name: &lt;strong&gt;uai-az2003&lt;/strong&gt;&lt;br&gt;
Select &lt;strong&gt;Review + create&lt;/strong&gt;.&lt;br&gt;
Select &lt;strong&gt;Create&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnopgk8jaq0h1bsyozer1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnopgk8jaq0h1bsyozer1.png" alt="22" width="800" height="396"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkc16f6hi1uug15wh214r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkc16f6hi1uug15wh214r.png" alt="yutrr" width="800" height="397"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9u6vysh14lph2y1munnl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9u6vysh14lph2y1munnl.png" alt="ew4rw" width="800" height="397"&gt;&lt;/a&gt;&lt;/p&gt;
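&lt;p&gt;The same identity can also be created from the Azure CLI; a sketch using the names from this project:&lt;/p&gt;

```shell
# Sketch: create the user-assigned managed identity without the portal.
az identity create \
  --name uai-az2003 \
  --resource-group RG1 \
  --location centralus
```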

&lt;p&gt;&lt;strong&gt;Configure Container Registry with AcrPull permissions for the managed identity&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
In the Azure portal, open the Container Registry resource that was created earlier.&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the left-side menu, select &lt;strong&gt;Access Control (IAM)&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Access Control (IAM) page, select &lt;strong&gt;Add role assignment&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Search for the AcrPull role, and then select &lt;strong&gt;AcrPull&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Note: This configuration can also be applied when assigning the AcrPush role.&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Next&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Members tab, to the right of Assign access to, select &lt;strong&gt;Managed identity&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;+ Select members&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Select managed identities page, under Managed identity, select &lt;strong&gt;User-assigned managed identity&lt;/strong&gt;, and then select the user-assigned managed identity created for this project.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;For example: uai-az2003.&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the Select managed identities page, select &lt;strong&gt;Select&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Members tab of the Add role assignment page, select &lt;strong&gt;Review + assign&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Review + assign tab, select &lt;strong&gt;Review + assign&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Wait for the role assignment to be added.&lt;/p&gt;
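&lt;p&gt;The role assignment can also be scripted; a sketch with the Azure CLI, using the identity and registry names from this walkthrough:&lt;/p&gt;

```shell
# Sketch: look up the identity's principal ID and the registry's resource ID,
# then grant AcrPull at the registry scope.
PRINCIPAL_ID=$(az identity show --name uai-az2003 --resource-group RG1 \
  --query principalId --output tsv)
ACR_ID=$(az acr show --name acraz2003LT2025 --resource-group RG1 \
  --query id --output tsv)
az role assignment create \
  --assignee-object-id "$PRINCIPAL_ID" \
  --assignee-principal-type ServicePrincipal \
  --role AcrPull \
  --scope "$ACR_ID"
```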

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9m7c4mopcagtwomggte6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9m7c4mopcagtwomggte6.png" alt="ftr5d4" width="800" height="399"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F66hukchl3uu08zsz6hon.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F66hukchl3uu08zsz6hon.png" alt="frd5eses" width="800" height="399"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjtxutanmjjhocxzhhwr8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjtxutanmjjhocxzhhwr8.png" alt="gftfffc" width="800" height="400"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp83nrqbwmrhiu2de6hc1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp83nrqbwmrhiu2de6hc1.png" alt="Ihfdrdd" width="800" height="401"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Configure Container Registry with a private endpoint connection&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Ensure that your Container Registry resource is open in the portal.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
Under Settings, select &lt;strong&gt;Networking&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Private access tab, select &lt;strong&gt;+ Create a private endpoint connection&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmqex7zatkxbj1av77s5t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmqex7zatkxbj1av77s5t.png" alt="Ihyfrsest" width="800" height="399"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Basics tab, under Project details, specify the following information:&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Subscription: Specify the Azure subscription that you're using for this project.&lt;br&gt;
Resource group: &lt;strong&gt;RG1&lt;/strong&gt;&lt;br&gt;
Name: &lt;strong&gt;pe-acr-az2003&lt;/strong&gt;&lt;br&gt;
Region: Ensure that &lt;strong&gt;Central US&lt;/strong&gt; is selected.&lt;br&gt;
Select &lt;strong&gt;Next&lt;/strong&gt;: Resource.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9y9nqkbrbz9v14phj0wz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9y9nqkbrbz9v14phj0wz.png" alt="trdsffgdt" width="800" height="396"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
On the Resource tab, ensure the following information is displayed:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Subscription: Ensure that the Azure subscription that you're using for this project is selected.&lt;br&gt;
Resource type: Ensure that &lt;strong&gt;Microsoft.ContainerRegistry/registries&lt;/strong&gt; is selected.&lt;br&gt;
Resource: Ensure that the name of your &lt;strong&gt;registry&lt;/strong&gt; is selected.&lt;br&gt;
Target sub-resource: Ensure that &lt;strong&gt;registry&lt;/strong&gt; is selected.&lt;br&gt;
Select &lt;strong&gt;Next&lt;/strong&gt;: Virtual Network.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fscf5zso8w5997kjyzk2j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fscf5zso8w5997kjyzk2j.png" alt="Iyugtdrft" width="800" height="398"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
On the Virtual Network tab, under Networking, ensure the following information is displayed:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Virtual network: Ensure that &lt;strong&gt;VNET1&lt;/strong&gt; is selected&lt;br&gt;
Subnet: Ensure that &lt;strong&gt;PESubnet&lt;/strong&gt; is selected.&lt;br&gt;
Select &lt;strong&gt;Next&lt;/strong&gt;: DNS.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fggd3ktnilcqlg86u91y5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fggd3ktnilcqlg86u91y5.png" alt="jrtqwte" width="800" height="397"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
On the DNS tab, under Private DNS Integration, ensure the following information is displayed:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Integrate with private DNS zone: Ensure that &lt;strong&gt;Yes&lt;/strong&gt; is selected.&lt;br&gt;
Private DNS Zone: Notice that (new) &lt;strong&gt;privatelink.azurecr.io&lt;/strong&gt; is specified.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftfneinhetmxk250oyrjm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftfneinhetmxk250oyrjm.png" alt="t7tytfer" width="800" height="398"&gt;&lt;/a&gt;&lt;br&gt;
Select &lt;strong&gt;Next&lt;/strong&gt;: Tags, and then select &lt;strong&gt;Next&lt;/strong&gt;: Review + create.&lt;/p&gt;

&lt;p&gt;On the Review + create tab, when you see the Validation passed message, select &lt;strong&gt;Create&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnri2pfrisui72fdmptrv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnri2pfrisui72fdmptrv.png" alt="yu76ftfd" width="800" height="397"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp6uoqzgswucze7bwy5pc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp6uoqzgswucze7bwy5pc.png" alt="yut65dd" width="800" height="397"&gt;&lt;/a&gt;&lt;br&gt;
Wait for the deployment to complete.&lt;/p&gt;
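&lt;p&gt;For reference, the private endpoint and its DNS integration can be sketched with the Azure CLI as well (resource names are the ones used in this walkthrough):&lt;/p&gt;

```shell
# Sketch: private endpoint for the registry, plus the privatelink DNS zone.
ACR_ID=$(az acr show --name acraz2003LT2025 --resource-group RG1 \
  --query id --output tsv)
az network private-endpoint create \
  --name pe-acr-az2003 \
  --resource-group RG1 \
  --vnet-name VNET1 \
  --subnet PESubnet \
  --private-connection-resource-id "$ACR_ID" \
  --group-id registry \
  --connection-name pe-acr-az2003-conn
# Private DNS zone and VNet link, matching the portal's
# "Integrate with private DNS zone: Yes" option.
az network private-dns zone create \
  --resource-group RG1 \
  --name privatelink.azurecr.io
az network private-dns link vnet create \
  --resource-group RG1 \
  --zone-name privatelink.azurecr.io \
  --name acr-dns-link \
  --virtual-network VNET1 \
  --registration-enabled false
az network private-endpoint dns-zone-group create \
  --resource-group RG1 \
  --endpoint-name pe-acr-az2003 \
  --name default \
  --private-dns-zone privatelink.azurecr.io \
  --zone-name registry
```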

&lt;p&gt;&lt;strong&gt;Verify your work&lt;/strong&gt;&lt;br&gt;
In this task, you verify that your configuration meets the specified requirements.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the Azure portal, open your Container Registry resource.&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Access Control (IAM) page, select &lt;strong&gt;Role assignments&lt;/strong&gt;.&lt;br&gt;
Verify that the role assignments list shows the &lt;strong&gt;AcrPull role&lt;/strong&gt; assigned to the User-assigned Managed Identity resource.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the left-side menu, under Settings, select &lt;strong&gt;Networking&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the Networking page, select the &lt;strong&gt;Private access&lt;/strong&gt; tab.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Under Private endpoint, select the private endpoint that you created.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;For example, select &lt;strong&gt;pe-acr-az2003&lt;/strong&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;On the Private endpoint page, under Settings, select &lt;strong&gt;DNS configuration&lt;/strong&gt;.&lt;br&gt;
Verify the following DNS setting:&lt;br&gt;
Private DNS zone: set to &lt;strong&gt;privatelink.azurecr.io&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the left-side menu, select &lt;strong&gt;Overview&lt;/strong&gt;.&lt;br&gt;
Verify the following setting:&lt;br&gt;
Virtual network/subnet: set to &lt;strong&gt;VNET1/PESubnet&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To securely deploy containerized workloads in Azure Container Apps, you must establish a protected connection to Azure Container Registry (ACR), where container images are stored. This configuration ensures that only authorized resources can pull images from the registry.&lt;/p&gt;

</description>
      <category>containers</category>
      <category>resources</category>
      <category>azure</category>
      <category>devops</category>
    </item>
    <item>
      <title>Ingest data with a pipeline in Microsoft Fabric</title>
      <dc:creator>lotanna obianefo</dc:creator>
      <pubDate>Sat, 04 Oct 2025 20:52:07 +0000</pubDate>
      <link>https://dev.to/lotanna_obianefo/ingest-data-with-a-pipeline-in-microsoft-fabric-2cfo</link>
      <guid>https://dev.to/lotanna_obianefo/ingest-data-with-a-pipeline-in-microsoft-fabric-2cfo</guid>
      <description>&lt;p&gt;A data lakehouse is a common analytical data store for cloud-scale analytics solutions. One of the core tasks of a data engineer is to implement and manage the ingestion of data from multiple operational data sources into the lakehouse. In Microsoft Fabric, you can implement &lt;em&gt;extract&lt;/em&gt;, &lt;em&gt;transform&lt;/em&gt;, and &lt;em&gt;load&lt;/em&gt; (ETL) or &lt;em&gt;extract&lt;/em&gt;, &lt;em&gt;load&lt;/em&gt;, and &lt;em&gt;transform&lt;/em&gt; (ELT) solutions for data ingestion through the creation of pipelines.&lt;/p&gt;

&lt;p&gt;Fabric also supports Apache Spark, enabling you to write and run code to process data at scale. By combining the pipeline and Spark capabilities in Fabric, you can implement complex data ingestion logic that copies data from external sources into the OneLake storage on which the lakehouse is based, and then uses Spark code to perform custom data transformations before loading it into tables for analysis.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create a workspace&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before working with data in Fabric, create a workspace with the Fabric trial enabled.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Navigate to the Microsoft Fabric home page at &lt;a href="https://app.fabric.microsoft.com/home?experience=fabric-developer" rel="noopener noreferrer"&gt;https://app.fabric.microsoft.com/home?experience=fabric-developer&lt;/a&gt; in a browser and sign in with your Fabric credentials.&lt;/li&gt;
&lt;li&gt;In the menu bar on the left, select Workspaces (the icon looks similar to 🗇).&lt;/li&gt;
&lt;li&gt;Create a new workspace with a name of your choice, selecting a licensing mode in the Advanced section that includes Fabric capacity.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd6wxkhxvlke8o3oqfujx.png" alt="ijisjfif" width="800" height="423"&gt;
When your new workspace opens, it should be empty.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdeoat7sp6q2ahqsmz55c.png" alt="Ikadfdgf" width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Create a lakehouse&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now that you have a workspace, it’s time to create a data lakehouse into which you will ingest data.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the menu bar on the left, select &lt;strong&gt;Create&lt;/strong&gt;. In the New page, under the &lt;strong&gt;Data Engineering&lt;/strong&gt; section, select &lt;strong&gt;Lakehouse&lt;/strong&gt;. Give it a unique name of your choice.&lt;/li&gt;
&lt;li&gt;After a minute or so, a new lakehouse with no Tables or Files will be created.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F77y2wczrsvjt6znbe42z.png" alt="uygtrr" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzod2n7m7sa13f5iwg7vx.png" alt="kjhytg" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;On the &lt;strong&gt;Explorer&lt;/strong&gt; pane on the left, in the … menu for the &lt;strong&gt;Files&lt;/strong&gt; node, select &lt;strong&gt;New subfolder&lt;/strong&gt; and create a subfolder named &lt;strong&gt;new_data&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkqfmevgjux3vpykamr32.png" alt="hgygddr" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffle7p0xad2j7kipqas2d.png" alt="ygtrry" width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Create a pipeline&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A simple way to ingest data is to use a &lt;strong&gt;Copy Data&lt;/strong&gt; activity in a pipeline to extract the data from a source and copy it to a file in the lakehouse.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the &lt;strong&gt;Home&lt;/strong&gt; page for your lakehouse, select &lt;strong&gt;Get data&lt;/strong&gt;, then select &lt;strong&gt;New data pipeline&lt;/strong&gt;, and create a new data pipeline named &lt;strong&gt;Ingest Sales Data&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;If the &lt;strong&gt;Copy Data&lt;/strong&gt; wizard doesn’t open automatically, select &lt;strong&gt;Copy Data&lt;/strong&gt; &amp;gt; &lt;strong&gt;Use copy assistant&lt;/strong&gt; in the pipeline editor page.&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;Copy Data&lt;/strong&gt; wizard, on the &lt;strong&gt;Choose data source&lt;/strong&gt; page, type HTTP in the search bar and then select &lt;strong&gt;HTTP&lt;/strong&gt; in the &lt;strong&gt;New sources&lt;/strong&gt; section.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frm4h5159azeije9hntso.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frm4h5159azeije9hntso.png" alt="ufhsaag" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F32gt7vo9q491e3di1k0g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F32gt7vo9q491e3di1k0g.png" alt="IzsfFWE" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvrt5xfyd6xljb7kb0ili.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvrt5xfyd6xljb7kb0ili.png" alt="IEFSAFEFW" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the Connect to data source pane, enter the following settings for the connection to your data source:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;(i)&lt;/strong&gt; &lt;strong&gt;URL&lt;/strong&gt;: &lt;a href="https://raw.githubusercontent.com/MicrosoftLearning/dp-data/main/sales.csv" rel="noopener noreferrer"&gt;https://raw.githubusercontent.com/MicrosoftLearning/dp-data/main/sales.csv&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;(ii)&lt;/strong&gt; &lt;strong&gt;Connection&lt;/strong&gt;: Create new connection&lt;br&gt;
&lt;strong&gt;(iii)&lt;/strong&gt; &lt;strong&gt;Connection name&lt;/strong&gt;: Specify a unique name&lt;br&gt;
&lt;strong&gt;(iv)&lt;/strong&gt; &lt;strong&gt;Data gateway&lt;/strong&gt;: (none)&lt;br&gt;
&lt;strong&gt;(v)&lt;/strong&gt; &lt;strong&gt;Authentication kind&lt;/strong&gt;: Anonymous&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select &lt;strong&gt;Next&lt;/strong&gt;. Then ensure the following settings are selected:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;(i)&lt;/strong&gt; &lt;strong&gt;Relative URL&lt;/strong&gt;: Leave blank&lt;br&gt;
&lt;strong&gt;(ii)&lt;/strong&gt; &lt;strong&gt;Request method&lt;/strong&gt;: GET&lt;br&gt;
&lt;strong&gt;(iii)&lt;/strong&gt; &lt;strong&gt;Additional headers&lt;/strong&gt;: Leave blank&lt;br&gt;
&lt;strong&gt;(iv)&lt;/strong&gt; &lt;strong&gt;Binary copy&lt;/strong&gt;: Unselected&lt;br&gt;
&lt;strong&gt;(v)&lt;/strong&gt; &lt;strong&gt;Request timeout&lt;/strong&gt;: Leave blank&lt;br&gt;
&lt;strong&gt;(vi)&lt;/strong&gt; &lt;strong&gt;Max concurrent connections&lt;/strong&gt;: Leave blank&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select &lt;strong&gt;Next&lt;/strong&gt;, wait for the data to be sampled, and then ensure that the following settings are selected:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;(i)&lt;/strong&gt; &lt;strong&gt;File format&lt;/strong&gt;: DelimitedText&lt;br&gt;
&lt;strong&gt;(ii)&lt;/strong&gt; &lt;strong&gt;Column delimiter&lt;/strong&gt;: Comma (,)&lt;br&gt;
&lt;strong&gt;(iii)&lt;/strong&gt; &lt;strong&gt;Row delimiter&lt;/strong&gt;: Line feed (\n)&lt;br&gt;
&lt;strong&gt;(iv)&lt;/strong&gt; &lt;strong&gt;First row as header&lt;/strong&gt;: Selected&lt;br&gt;
&lt;strong&gt;(v)&lt;/strong&gt; &lt;strong&gt;Compression type&lt;/strong&gt;: None&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select &lt;strong&gt;Preview data&lt;/strong&gt; to see a sample of the data that will be ingested. Then close the data preview and select &lt;strong&gt;Next&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;On the Connect to data destination page, set the following data destination options, and then select &lt;strong&gt;Next&lt;/strong&gt;:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;(i)&lt;/strong&gt; &lt;strong&gt;Root folder&lt;/strong&gt;: Files&lt;br&gt;
&lt;strong&gt;(ii)&lt;/strong&gt; &lt;strong&gt;Folder path name&lt;/strong&gt;: new_data&lt;br&gt;
&lt;strong&gt;(iii)&lt;/strong&gt; &lt;strong&gt;File name&lt;/strong&gt;: sales.csv&lt;br&gt;
&lt;strong&gt;(iv)&lt;/strong&gt; &lt;strong&gt;Copy behavior&lt;/strong&gt;: None&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Set the following file format options and then select &lt;strong&gt;Next&lt;/strong&gt;:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;(i)&lt;/strong&gt; &lt;strong&gt;File format&lt;/strong&gt;: DelimitedText&lt;br&gt;
&lt;strong&gt;(ii)&lt;/strong&gt; &lt;strong&gt;Column delimiter&lt;/strong&gt;: Comma (,)&lt;br&gt;
&lt;strong&gt;(iii)&lt;/strong&gt; &lt;strong&gt;Row delimiter&lt;/strong&gt;: Line feed (\n)&lt;br&gt;
&lt;strong&gt;(iv)&lt;/strong&gt; &lt;strong&gt;Add header to file&lt;/strong&gt;: Selected&lt;br&gt;
&lt;strong&gt;(v)&lt;/strong&gt; &lt;strong&gt;Compression type&lt;/strong&gt;: None&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the Copy summary page, review the details of your copy operation and then select &lt;strong&gt;Save + Run&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo79jgzugg1i6t4mngxtr.png" alt="gatrtqefr" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyf86c34hnr18eszus03s.png" alt="sgrregae" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F44n4wlcf9hn6upb284fb.png" alt="rgagergtE" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxtq9oh54nqijwlm36a7i.png" alt="TFREEYY" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhhdam5z128duon7uen5d.png" alt="utdeserrr" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo4xf95bmay1lzeob4kde.png" alt="jfgfdga" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;A new pipeline containing a Copy Data activity is created, as shown here:
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fag9klo1vhmxkidsl767q.png" alt="kkcsfff" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;When the pipeline starts to run, you can monitor its status in the &lt;strong&gt;Output&lt;/strong&gt; pane under the pipeline designer. Use the ↻ (Refresh) icon to refresh the status, and wait until it has succeeded.&lt;/li&gt;
&lt;li&gt;In the menu bar on the left, select your &lt;strong&gt;lakehouse&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;On the Home page, in the Explorer pane, expand &lt;strong&gt;Files&lt;/strong&gt; and select the &lt;strong&gt;new_data&lt;/strong&gt; folder to verify that the &lt;strong&gt;sales.csv&lt;/strong&gt; file has been copied.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fygivqzgwo83emr8ziiuz.png" alt="hgyfrresy" width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Create a notebook&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the &lt;strong&gt;Home&lt;/strong&gt; page for your lakehouse, in the &lt;strong&gt;Open notebook&lt;/strong&gt; menu, select &lt;strong&gt;New notebook&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;After a few seconds, a new notebook containing a single cell will open. Notebooks are made up of one or more cells that can contain code or markdown (formatted text)&lt;/em&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select the existing cell in the notebook, which contains some simple code, and then replace the default code with the following variable declaration.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;table_name = "sales"&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the &lt;strong&gt;…&lt;/strong&gt; menu for the cell (at its top-right) select &lt;strong&gt;Toggle parameter cell&lt;/strong&gt;. This configures the cell so that the variables declared in it are treated as parameters when running the notebook from a pipeline.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz9m7l9fat7iggl1fj7jy.png" alt="gfffdfft" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F767nvfubroag9qm2ex85.png" alt="gdfsdf" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;Under the parameters cell, use the + Code button to add a new code cell. Then add the following code to it:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;from pyspark.sql.functions import *&lt;/p&gt;

&lt;p&gt;# Read the new sales data&lt;br&gt;
df = spark.read.format("csv").option("header","true").load("Files/new_data/*.csv")&lt;/p&gt;

&lt;p&gt;# Add month and year columns&lt;br&gt;
df = df.withColumn("Year", year(col("OrderDate"))).withColumn("Month", month(col("OrderDate")))&lt;/p&gt;

&lt;p&gt;# Derive FirstName and LastName columns&lt;br&gt;
df = df.withColumn("FirstName", split(col("CustomerName"), " ").getItem(0)).withColumn("LastName", split(col("CustomerName"), " ").getItem(1))&lt;/p&gt;

&lt;p&gt;# Filter and reorder columns&lt;br&gt;
df = df["SalesOrderNumber", "SalesOrderLineNumber", "OrderDate", "Year", "Month", "FirstName", "LastName", "EmailAddress", "Item", "Quantity", "UnitPrice", "TaxAmount"]&lt;/p&gt;

&lt;p&gt;# Load the data into a table&lt;br&gt;
df.write.format("delta").mode("append").saveAsTable(table_name)&lt;/p&gt;

&lt;p&gt;&lt;em&gt;This code loads the data from the sales.csv file that was ingested by the Copy Data activity, applies some transformation logic, and saves the transformed data as a table - appending the data if the table already exists&lt;/em&gt;.&lt;/p&gt;
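&lt;p&gt;The date and name transformations in that cell can be mimicked in plain Python (without Spark) to see what each step produces for a single record. &lt;em&gt;transform_row&lt;/em&gt; is a hypothetical helper for illustration only, not part of the lab:&lt;/p&gt;

```python
from datetime import datetime

def transform_row(order_date: str, customer_name: str) -> dict:
    # Derive Year/Month from OrderDate, as year(col(...)) and month(col(...)) do
    parsed = datetime.strptime(order_date, "%Y-%m-%d")
    # Split CustomerName on a space, as split(col(...), " ") does
    parts = customer_name.split(" ")
    return {
        "Year": parsed.year,
        "Month": parsed.month,
        "FirstName": parts[0],
        # getItem(1) yields null in Spark when there is no second name part
        "LastName": parts[1] if len(parts) > 1 else None,
    }

transform_row("2021-07-15", "Ada Lovelace")
# {'Year': 2021, 'Month': 7, 'FirstName': 'Ada', 'LastName': 'Lovelace'}
```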

&lt;p&gt;Verify that your notebook looks similar to this, and then use the &lt;strong&gt;▷ Run all&lt;/strong&gt; button on the toolbar to run all of the cells it contains.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fikzy3qrtugcd80npchwi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fikzy3qrtugcd80npchwi.png" alt="jdsrfw" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Since this is the first time you’ve run any Spark code in this session, the Spark pool must be started. This means that the first cell can take a minute or so to complete&lt;/em&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;When the notebook run has completed, in the &lt;strong&gt;Explorer&lt;/strong&gt; pane on the left, in the &lt;strong&gt;…&lt;/strong&gt; menu for Tables select &lt;strong&gt;Refresh&lt;/strong&gt; and verify that a sales table has been created.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdnyy7mqwwf4v5ipt22yw.png" alt="dssffdw" width="800" height="420"&gt;
&lt;/li&gt;
&lt;li&gt;In the notebook menu bar, use the ⚙️ &lt;strong&gt;Settings&lt;/strong&gt; icon to view the notebook settings. Then set the &lt;strong&gt;Name&lt;/strong&gt; of the notebook to &lt;strong&gt;Load Sales&lt;/strong&gt; and close the settings pane.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F72kcc5gl793l0fj5qeyt.png" alt="eserfrr" width="800" height="420"&gt;
&lt;/li&gt;
&lt;li&gt;In the hub menu bar on the left, select your lakehouse.&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;Explorer&lt;/strong&gt; pane, refresh the view. Then expand &lt;strong&gt;Tables&lt;/strong&gt;, and select the &lt;strong&gt;sales&lt;/strong&gt; table to see a preview of the data it contains.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnzctmdtw0xd4i9klilek.png" alt="Isdgdfgrs" width="800" height="420"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Modify the pipeline&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now that you’ve implemented a notebook to transform data and load it into a table, you can incorporate the notebook into a pipeline to create a reusable ETL process.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the hub menu bar on the left select the &lt;strong&gt;Ingest Sales Data&lt;/strong&gt; pipeline you created previously.&lt;/li&gt;
&lt;li&gt;On the &lt;strong&gt;Activities&lt;/strong&gt; tab, in the &lt;strong&gt;All activities&lt;/strong&gt; list, select &lt;strong&gt;Delete data&lt;/strong&gt;. Then position the new Delete data activity to the left of the &lt;strong&gt;Copy data&lt;/strong&gt; activity and connect its &lt;strong&gt;On completion&lt;/strong&gt; output to the &lt;strong&gt;Copy data&lt;/strong&gt; activity, as shown here:
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frbc9kbv72yyufvb357bl.png" alt="Ijdjjdhs" width="800" height="362"&gt;
&lt;/li&gt;
&lt;li&gt;Select the Delete data activity, and in the pane below the design canvas, set the following properties:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;General&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;(i)&lt;/strong&gt; &lt;strong&gt;Name&lt;/strong&gt;: Delete old files&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Source&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;(i)&lt;/strong&gt; &lt;strong&gt;Connection&lt;/strong&gt;: Your lakehouse&lt;br&gt;
&lt;strong&gt;(ii)&lt;/strong&gt; &lt;strong&gt;File path type&lt;/strong&gt;: Wildcard file path&lt;br&gt;
&lt;strong&gt;(iii)&lt;/strong&gt; &lt;strong&gt;Folder path&lt;/strong&gt;: Files / new_data&lt;br&gt;
&lt;strong&gt;(iv)&lt;/strong&gt; &lt;strong&gt;Wildcard file name&lt;/strong&gt;: *.csv&lt;br&gt;
&lt;strong&gt;(v)&lt;/strong&gt; &lt;strong&gt;Recursively&lt;/strong&gt;: Selected&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Logging settings&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;(i)&lt;/strong&gt; &lt;strong&gt;Enable logging&lt;/strong&gt;: Unselected&lt;/p&gt;

&lt;p&gt;&lt;em&gt;These settings will ensure that any existing .csv files are deleted before copying the sales.csv file&lt;/em&gt;.&lt;/p&gt;
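&lt;p&gt;The wildcard matching the Delete data activity performs can be sketched in plain Python. This is an illustration of the pattern semantics only; Fabric evaluates the wildcard server-side, and &lt;em&gt;files_to_delete&lt;/em&gt; is a hypothetical helper:&lt;/p&gt;

```python
import fnmatch

def files_to_delete(file_names, pattern="*.csv"):
    # Return the files a wildcard file name of *.csv would match in the folder
    return [name for name in file_names if fnmatch.fnmatch(name, pattern)]

files_to_delete(["sales.csv", "old_sales.csv", "readme.txt"])
# ['sales.csv', 'old_sales.csv']
```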

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frkv395pcmm42ujsgs2d1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frkv395pcmm42ujsgs2d1.png" alt="Idisudjdss" width="800" height="420"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa764qu9kitot4j3nsdi1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa764qu9kitot4j3nsdi1.png" alt="Iiedjdhjjusj" width="800" height="420"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmllopgu1h2fnqxjdmeui.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmllopgu1h2fnqxjdmeui.png" alt="judhsdudu" width="800" height="420"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3e92g6po1k72crdp3kk6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3e92g6po1k72crdp3kk6.png" alt="Iisdusn" width="800" height="420"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fss37j179hp5y56wjkdwz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fss37j179hp5y56wjkdwz.png" alt="Iudusudd" width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select the &lt;strong&gt;Copy data&lt;/strong&gt; activity and then connect its &lt;strong&gt;On Completion&lt;/strong&gt; output to the &lt;strong&gt;Notebook&lt;/strong&gt; activity as shown here.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr44xwnxorsy7tg1d9tbq.png" alt="jfdgsgs" width="800" height="420"&gt;
&lt;/li&gt;
&lt;li&gt;Select the Notebook activity, and then in the pane below the design canvas, set the following properties:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;General&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;(i)&lt;/strong&gt; &lt;strong&gt;Name&lt;/strong&gt;: Load Sales notebook&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Settings&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;(i)&lt;/strong&gt; &lt;strong&gt;Notebook&lt;/strong&gt;: Load Sales&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Base parameters&lt;/strong&gt;:&lt;br&gt;
Add a new parameter with the following properties:&lt;br&gt;
&lt;strong&gt;(i)&lt;/strong&gt; &lt;strong&gt;Name&lt;/strong&gt;: table_name&lt;br&gt;
&lt;strong&gt;(ii)&lt;/strong&gt; &lt;strong&gt;Type&lt;/strong&gt;: String&lt;br&gt;
&lt;strong&gt;(iii)&lt;/strong&gt; &lt;strong&gt;Value&lt;/strong&gt;: new_sales&lt;/p&gt;

&lt;p&gt;&lt;em&gt;The &lt;strong&gt;table_name&lt;/strong&gt; parameter will be passed to the notebook and override the default value assigned to the table_name variable in the parameters cell&lt;/em&gt;.&lt;/p&gt;
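&lt;p&gt;The override behavior amounts to the pipeline's base parameters taking precedence over the defaults declared in the parameters cell. A minimal sketch of that precedence (&lt;em&gt;resolve_parameters&lt;/em&gt; is illustrative, not Fabric's actual API):&lt;/p&gt;

```python
def resolve_parameters(notebook_defaults: dict, base_parameters: dict) -> dict:
    # Start from the values declared in the notebook's parameters cell,
    # then let any base parameters passed by the pipeline override them
    resolved = dict(notebook_defaults)
    resolved.update(base_parameters)
    return resolved

# The parameters cell declared table_name = "sales";
# the pipeline passes table_name = "new_sales", which wins.
resolve_parameters({"table_name": "sales"}, {"table_name": "new_sales"})
# {'table_name': 'new_sales'}
```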

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fshnx6b6i2jjltkzeetzd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fshnx6b6i2jjltkzeetzd.png" alt="IsfEKOA" width="800" height="420"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyxajhql7r2s9qymy2wib.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyxajhql7r2s9qymy2wib.png" alt="SDFAFAEW" width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the Home tab, use the 🖫 (Save) icon to save the pipeline. Then use the ▷ Run button to run the pipeline, and wait for all of the activities to complete.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbon1v5cmpegtvjkvwhib.png" alt="dfgadf" width="800" height="420"&gt;
&lt;/li&gt;
&lt;li&gt;In the hub menu bar on the left edge of the portal, select your lakehouse.&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;Explorer&lt;/strong&gt; pane, expand &lt;strong&gt;Tables&lt;/strong&gt; and select the &lt;strong&gt;new_sales&lt;/strong&gt; table to see a preview of the data it contains. This table was created by the notebook when it was run by the pipeline.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fulxo0tg4ibd4ov4kg5ft.png" alt="graijajga" width="800" height="420"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this project, you learned how to design and implement a data pipeline in Microsoft Fabric. The solution ingests data from an external source into a lakehouse using a pipeline, followed by a Spark notebook that transforms the ingested data and loads it into a structured table for further analysis.&lt;/p&gt;

</description>
      <category>microsoft</category>
      <category>cloudcomputing</category>
      <category>fabric</category>
      <category>performance</category>
    </item>
    <item>
      <title>Analyze data with Apache Spark in Fabric</title>
      <dc:creator>lotanna obianefo</dc:creator>
      <pubDate>Fri, 03 Oct 2025 13:03:15 +0000</pubDate>
      <link>https://dev.to/lotanna_obianefo/analyze-data-with-apache-spark-in-fabric-3hfj</link>
      <guid>https://dev.to/lotanna_obianefo/analyze-data-with-apache-spark-in-fabric-3hfj</guid>
      <description>&lt;p&gt;Data-driven organizations rely heavily on the ability to process, transform, and analyze large datasets efficiently. Microsoft Fabric provides a unified platform for analytics, and at its core is Apache Spark, a powerful distributed computing engine. Spark in Fabric enables developers, data engineers, and analysts to analyze massive volumes of structured and unstructured data in real time, all within an integrated environment.&lt;/p&gt;

&lt;p&gt;In this project you will ingest data into the Fabric lakehouse and use PySpark to read and analyze the data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create a workspace&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before working with data in Fabric, create a workspace in a tenant with the Fabric capacity enabled.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Navigate to the Microsoft Fabric home page at &lt;a href="https://app.fabric.microsoft.com/home?experience=fabric-developer" rel="noopener noreferrer"&gt;https://app.fabric.microsoft.com/home?experience=fabric-developer&lt;/a&gt; in a browser and sign in with your Fabric credentials.&lt;/li&gt;
&lt;li&gt;In the menu bar on the left, select &lt;strong&gt;Workspaces&lt;/strong&gt; (the icon looks similar to 🗇). You can also find it on the home page.&lt;/li&gt;
&lt;li&gt;Create a new &lt;strong&gt;workspace&lt;/strong&gt; with a name of your choice, selecting a licensing mode in the Advanced section that includes Fabric capacity.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhr2r4mp9paz5r6139xt1.png" alt="Ijjsdos" width="800" height="420"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb94zlnvfcn32hrv8lfi9.png" alt="hffg" width="800" height="420"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Create a lakehouse and upload files&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now that you have a workspace, it’s time to create a data lakehouse for your data.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the menu bar on the left, select Create. In the New page, under the &lt;strong&gt;Data Engineering&lt;/strong&gt; section, select &lt;strong&gt;Lakehouse&lt;/strong&gt;. Give it a unique name of your choice.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqipa966udr5masg1zj32.png" alt="usrri" width="800" height="420"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F493iameu7ca95gzrsas9.png" alt="lodsju" width="800" height="420"&gt;
After a minute or so, a new lakehouse will be created:
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp5oyu4nk6415o32nvqi0.png" alt="kjdekdw" width="800" height="420"&gt;
View the new lakehouse, and note that the &lt;strong&gt;Lakehouse explorer&lt;/strong&gt; pane on the left enables you to browse tables and files in the lakehouse:
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdame65n60cfx5zcyu321.png" alt="jhyfrd" width="800" height="420"&gt;
&lt;/li&gt;
&lt;li&gt;You can now ingest data into the lakehouse. There are several ways to do this, but for now you’ll download a folder of text files to your local computer (or VM if applicable) and then upload them to your lakehouse.&lt;/li&gt;
&lt;li&gt;Download the datafiles from &lt;a href="https://github.com/MicrosoftLearning/dp-data/raw/main/orders.zip" rel="noopener noreferrer"&gt;https://github.com/MicrosoftLearning/dp-data/raw/main/orders.zip&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Extract the zipped archive and verify that you have a folder named orders which contains three CSV files: 2019.csv, 2020.csv, and 2021.csv.&lt;/li&gt;
&lt;li&gt;Return to your new lakehouse. In the &lt;strong&gt;Explorer&lt;/strong&gt; pane, next to the Files folder select the &lt;strong&gt;…&lt;/strong&gt; menu, and select &lt;strong&gt;Upload&lt;/strong&gt; and &lt;strong&gt;Upload folder&lt;/strong&gt;. Navigate to the orders folder on your local computer (or lab VM if applicable) and select Upload.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4j0fqjuolnop9hkl9lbu.png" alt="iudesyt" width="800" height="420"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9ueqdjrovm4upvavhw5k.png" alt="ugtftft" width="800" height="420"&gt;
&lt;/li&gt;
&lt;li&gt;After the files have been uploaded, expand Files and select the orders folder. Check that the CSV files have been uploaded, as shown here:
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0stm2mf6qdaus6ji59mz.png" alt="utdrrs" width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Create a notebook&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can now create a Fabric notebook to work with your data. Notebooks provide an interactive environment where you can write and run code.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the menu bar on the left, select &lt;strong&gt;Create&lt;/strong&gt;. In the New page, under the &lt;strong&gt;Data Engineering&lt;/strong&gt; section, select &lt;strong&gt;Notebook&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Fabric assigns a name to each notebook you create, such as Notebook 1, Notebook 2, etc. Click the name panel above the Home tab on the menu to change the name to something more descriptive.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F27euyi28vhnl8v727201.png" alt="kdwdwd" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgpkv2gkgj1et45fgu8y2.png" alt="sdkdawf" width="800" height="423"&gt;
A new notebook named &lt;strong&gt;data_Notebook&lt;/strong&gt; is created and opened.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkcee90o0s8hd0ajopc6l.png" alt="jwwdwde" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;Select the first cell (which is currently a code cell), and then in the top-right tool bar, use the &lt;strong&gt;M&lt;/strong&gt;↓ button to convert it to a markdown cell. The text contained in the cell will then be displayed as formatted text.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd5rfclwokpvptg1mkaub.png" alt="Igtrrtr" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;Use the 🖉 (Edit) button to switch the cell to editing mode, then modify the markdown as shown below.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;# Sales order data exploration&lt;br&gt;
Use this notebook to explore sales order data&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;When you have finished, click anywhere in the notebook outside of the cell to stop editing it.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhidqfb7carpwjto1tijf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhidqfb7carpwjto1tijf.png" alt="gfrdrh" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create a DataFrame&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now that you have created a workspace, a lakehouse, and a notebook, you are ready to work with your data. You will use PySpark, the Spark-optimized variant of Python that is the default language for Fabric notebooks.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Fabric notebooks support multiple programming languages including Scala, R, and Spark SQL&lt;/em&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select your new workspace from the left bar. You will see a list of items contained in the workspace including your lakehouse and notebook.&lt;/li&gt;
&lt;li&gt;Select the lakehouse to display the Explorer pane, including the orders folder.&lt;/li&gt;
&lt;li&gt;From the top menu, select Open notebook, Existing notebook, and then open the notebook you created earlier. The notebook should now be open next to the Explorer pane. Expand &lt;strong&gt;Lakehouses&lt;/strong&gt;, expand the &lt;strong&gt;Files&lt;/strong&gt; list, and select the &lt;strong&gt;orders&lt;/strong&gt; folder. The CSV files that you uploaded are listed next to the notebook editor.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F54gvhvvn954yrog5ixsq.png" alt="fded" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4f95qcle541g9521zbdd.png" alt="hjwdiwd" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;From the &lt;strong&gt;…&lt;/strong&gt; menu for 2019.csv, select &lt;strong&gt;Load data&lt;/strong&gt; &amp;gt; &lt;strong&gt;Spark&lt;/strong&gt;. The following code is automatically generated in a new code cell:
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frb02s1isshy8kw8gz2r5.png" alt="Ihfwrwewi" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg5bimlboyr41yqxayw09.png" alt="igeetew" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;Select ▷ Run cell to the left of the cell to run the code.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;You can hide the Explorer panes on the left by using the « icons. This gives more space for the notebook&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;The first time you run Spark code, a Spark session is started. This can take a few seconds or longer. Subsequent runs within the same session will be quicker&lt;/em&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;When the cell code has completed, review the output below the cell, which should look like this:
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F88kf0tv6jlb45cggvba8.png" alt="Iiuegqt" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;The output shows data from the 2019.csv file displayed in columns and rows. Notice that the column headers contain the first line of the data. To correct this, you need to modify the first line of the code as follows:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;df = spark.read.format("csv").option("header","false").load("Files/orders/2019.csv")&lt;/strong&gt;&lt;/p&gt;
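&lt;p&gt;What the header option changes can be sketched in plain Python: with header enabled, Spark takes column names from the first line; with it disabled, it generates positional names such as _c0, _c1. &lt;em&gt;column_names&lt;/em&gt; below is an illustrative sketch of that behavior, not Spark code:&lt;/p&gt;

```python
def column_names(first_line_cells, header: bool):
    # header=True: the first line of the file supplies the column names
    if header:
        return list(first_line_cells)
    # header=False: positional names _c0, _c1, ... are generated instead
    return [f"_c{i}" for i in range(len(first_line_cells))]

column_names(["SO43701", "1", "2019-07-01"], header=False)
# ['_c0', '_c1', '_c2']
```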

&lt;ul&gt;
&lt;li&gt;Run the code again, so that the DataFrame correctly identifies the first row as data. Notice that the column names have now changed to _c0, _c1, etc.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzzpwn5jzg7hxk9lcyj9x.png" alt="fdhxe" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;Descriptive column names help you make sense of data. To create meaningful column names, you need to define the schema and data types. You also need to import a standard set of Spark SQL types to define the data types. Replace the existing code with the following:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;from pyspark.sql.types import *&lt;/p&gt;

&lt;p&gt;orderSchema = StructType([&lt;br&gt;
    StructField("SalesOrderNumber", StringType()),&lt;br&gt;
    StructField("SalesOrderLineNumber", IntegerType()),&lt;br&gt;
    StructField("OrderDate", DateType()),&lt;br&gt;
    StructField("CustomerName", StringType()),&lt;br&gt;
    StructField("Email", StringType()),&lt;br&gt;
    StructField("Item", StringType()),&lt;br&gt;
    StructField("Quantity", IntegerType()),&lt;br&gt;
    StructField("UnitPrice", FloatType()),&lt;br&gt;
    StructField("Tax", FloatType())&lt;br&gt;
])&lt;/p&gt;

&lt;p&gt;df = spark.read.format("csv").schema(orderSchema).load("Files/orders/2019.csv")&lt;/p&gt;

&lt;p&gt;display(df)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the cell and review the output
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbjnm46vtdj18626gm9dl.png" alt="gtfrdtgt" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;This DataFrame includes only the data from the 2019.csv file. Modify the code so that the file path uses a * wildcard to read all the files in the orders folder:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;from pyspark.sql.types import *&lt;/p&gt;

&lt;p&gt;orderSchema = StructType([&lt;br&gt;
    StructField("SalesOrderNumber", StringType()),&lt;br&gt;
    StructField("SalesOrderLineNumber", IntegerType()),&lt;br&gt;
    StructField("OrderDate", DateType()),&lt;br&gt;
    StructField("CustomerName", StringType()),&lt;br&gt;
    StructField("Email", StringType()),&lt;br&gt;
    StructField("Item", StringType()),&lt;br&gt;
    StructField("Quantity", IntegerType()),&lt;br&gt;
    StructField("UnitPrice", FloatType()),&lt;br&gt;
    StructField("Tax", FloatType())&lt;br&gt;
])&lt;/p&gt;

&lt;p&gt;df = spark.read.format("csv").schema(orderSchema).load("Files/orders/*.csv")&lt;/p&gt;

&lt;p&gt;display(df)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;When you run the modified code, you should see sales for 2019, 2020, and 2021. Only a subset of the rows is displayed, so you may not see rows for every year.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;You can hide or show the output of a cell by selecting … next to the result. This makes it easier to work in a notebook&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Explore data in a DataFrame&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The DataFrame object provides additional functionality such as the ability to filter, group, and manipulate data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Filter a DataFrame&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add a code cell by selecting &lt;strong&gt;+ Code&lt;/strong&gt; which appears when you hover the mouse above or below the current cell or its output. Alternatively, from the ribbon menu select &lt;strong&gt;Edit&lt;/strong&gt; and &lt;strong&gt;+ Add code cell below&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;The following code filters the data so that only two columns are returned. It also uses &lt;em&gt;count&lt;/em&gt; and &lt;em&gt;distinct&lt;/em&gt; to summarize the number of records:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;customers = df['CustomerName', 'Email']&lt;/p&gt;

&lt;p&gt;print(customers.count())&lt;br&gt;
print(customers.distinct().count())&lt;/p&gt;

&lt;p&gt;display(customers.distinct())&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the code, and examine the output:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;(i) The code creates a new DataFrame called &lt;strong&gt;customers&lt;/strong&gt; which contains a subset of columns from the original &lt;strong&gt;df&lt;/strong&gt; DataFrame. When performing a DataFrame transformation you do not modify the original DataFrame, but return a new one.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1uyy0rxu9s7fs1dhxbr0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1uyy0rxu9s7fs1dhxbr0.png" alt="jgyttfrrdr" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Another way of achieving the same result is to use the select method:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;customers = df.select("CustomerName", "Email")&lt;/p&gt;

&lt;p&gt;The DataFrame functions &lt;em&gt;count&lt;/em&gt; and &lt;em&gt;distinct&lt;/em&gt; are used to return the total number of customers and the number of unique customers.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Modify the first line of the code by using select with a where function as follows:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;customers = df.select("CustomerName", "Email").where(df['Item']=='Road-250 Red, 52')&lt;br&gt;
print(customers.count())&lt;br&gt;
print(customers.distinct().count())&lt;/p&gt;

&lt;p&gt;display(customers.distinct())&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the modified code to select only the customers who have purchased the Road-250 Red, 52 product. Note that you can “chain” multiple functions together so that the output of one function becomes the input for the next. In this case, the DataFrame created by the select method is the source DataFrame for the &lt;strong&gt;where&lt;/strong&gt; method that is used to apply filtering criteria.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc8xd9qo32dvudkzw28u5.png" alt="jfdvrf" width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Aggregate and group data in a DataFrame&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add a code cell, and enter the following code:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;productSales = df.select("Item", "Quantity").groupBy("Item").sum()&lt;/p&gt;

&lt;p&gt;display(productSales)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the code. You can see that the results show the sum of order quantities grouped by product. The &lt;em&gt;groupBy&lt;/em&gt; method groups the rows by Item, and the subsequent &lt;em&gt;sum&lt;/em&gt; aggregate function is applied to the remaining numeric columns - in this case, &lt;em&gt;Quantity&lt;/em&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffofcpvs6ht890yrb5gj1.png" alt="ihgfcfcg" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;Add another code cell to the notebook, and enter the following code:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;from pyspark.sql.functions import *&lt;/p&gt;

&lt;p&gt;yearlySales = df.select(year(col("OrderDate")).alias("Year")).groupBy("Year").count().orderBy("Year")&lt;/p&gt;

&lt;p&gt;display(yearlySales)&lt;/p&gt;

&lt;p&gt;Run the cell and examine the output. The results now show the number of sales orders per year:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;(i)&lt;/strong&gt; The import statement enables you to use the Spark SQL library.&lt;br&gt;
&lt;strong&gt;(ii)&lt;/strong&gt; The select method is used with a SQL year function to extract the year component of the OrderDate field.&lt;br&gt;
&lt;strong&gt;(iii)&lt;/strong&gt; The alias method is used to assign a column name to the extracted year value.&lt;br&gt;
&lt;strong&gt;(iv)&lt;/strong&gt; The groupBy method groups the data by the derived Year column.&lt;br&gt;
&lt;strong&gt;(v)&lt;/strong&gt; The count of rows in each group is calculated before the orderBy method is used to sort the resulting DataFrame.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgqsm6733qpr7gle8ivr1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgqsm6733qpr7gle8ivr1.png" alt="iytfddsery" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Use Spark to transform data files&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A common task for data engineers and data scientists is to transform data for further downstream processing or analysis.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Use DataFrame methods and functions to transform data&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add a code cell to the notebook, and enter the following:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;from pyspark.sql.functions import *&lt;/p&gt;

&lt;p&gt;# Create Year and Month columns&lt;br&gt;
transformed_df = df.withColumn("Year", year(col("OrderDate"))).withColumn("Month", month(col("OrderDate")))&lt;/p&gt;

&lt;p&gt;# Create the new FirstName and LastName fields&lt;br&gt;
transformed_df = transformed_df.withColumn("FirstName", split(col("CustomerName"), " ").getItem(0)).withColumn("LastName", split(col("CustomerName"), " ").getItem(1))&lt;/p&gt;

&lt;p&gt;# Filter and reorder columns&lt;br&gt;
transformed_df = transformed_df["SalesOrderNumber", "SalesOrderLineNumber", "OrderDate", "Year", "Month", "FirstName", "LastName", "Email", "Item", "Quantity", "UnitPrice", "Tax"]&lt;/p&gt;

&lt;p&gt;# Display the first five orders&lt;br&gt;
display(transformed_df.limit(5))&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the cell. A new DataFrame is created from the original order data with the following transformations:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;(i)&lt;/strong&gt; Year and Month columns added, based on the OrderDate column.&lt;br&gt;
&lt;strong&gt;(ii)&lt;/strong&gt; FirstName and LastName columns added, based on the CustomerName column.&lt;br&gt;
&lt;strong&gt;(iii)&lt;/strong&gt; The columns are filtered and reordered, and the CustomerName column removed.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frdspbm6iwtrjih6wfqiz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frdspbm6iwtrjih6wfqiz.png" alt="jgytggftfty" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Review the output and verify that the transformations have been made to the data.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;You can use the Spark SQL library to transform the data by filtering rows, deriving, removing, renaming columns, and applying other data modifications&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Save the transformed data&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;At this point you might want to save the transformed data so that it can be used for further analysis.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Parquet&lt;/em&gt; is a popular data storage format because it stores data efficiently and is supported by most large-scale data analytics systems. Indeed, sometimes the data transformation requirement is simply to convert data from one format, such as CSV, to Parquet.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;To save the transformed DataFrame in Parquet format, add a code cell and add the following code:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;transformed_df.write.mode("overwrite").parquet('Files/transformed_data/orders')&lt;/p&gt;

&lt;p&gt;print ("Transformed data saved!")&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the cell and wait for the message that the data has been saved. Then, in the Explorer pane on the left, in the &lt;strong&gt;…&lt;/strong&gt; menu for the &lt;strong&gt;Files&lt;/strong&gt; node, select &lt;strong&gt;Refresh&lt;/strong&gt;. Select the &lt;em&gt;transformed_data&lt;/em&gt; folder to verify that it contains a new folder named orders, which in turn contains one or more Parquet files.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbwvsqomfyp2c85pu3v3i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbwvsqomfyp2c85pu3v3i.png" alt="jdrdrddrt" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkep9rd3xp3vcniwsv6ko.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkep9rd3xp3vcniwsv6ko.png" alt="fyftfrr" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add a cell with the following code:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;orders_df = spark.read.format("parquet").load("Files/transformed_data/orders")&lt;br&gt;
display(orders_df)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the cell. A new DataFrame is created from the parquet files in the &lt;em&gt;transformed_data/orders&lt;/em&gt; folder. Verify that the results show the order data that has been loaded from the parquet files.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fytv5nfthgnf18ercadmg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fytv5nfthgnf18ercadmg.png" alt="jddgdrt" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp4zgexhvx9s3wc9azflj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp4zgexhvx9s3wc9azflj.png" alt="kjhgtft" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Save data in partitioned files&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;When dealing with large volumes of data, partitioning can significantly improve performance and make it easier to filter data.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add a cell with code to save the dataframe, partitioning the data by Year and Month:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;orders_df.write.partitionBy("Year","Month").mode("overwrite").parquet("Files/partitioned_data")&lt;/p&gt;

&lt;p&gt;print ("Transformed data saved!")&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the cell and wait for the message that the data has been saved. Then, in the Lakehouses pane on the left, in the … menu for the Files node, select &lt;strong&gt;Refresh&lt;/strong&gt; and expand the partitioned_data folder to verify that it contains a hierarchy of folders named &lt;em&gt;Year=xxxx&lt;/em&gt;, each containing folders named &lt;em&gt;Month=xxxx&lt;/em&gt;. Each month folder contains a parquet file with the orders for that month.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnnjrcyorf2lsnrraalmy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnnjrcyorf2lsnrraalmy.png" alt="khygttfty" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F77z72qq1jlo2mul43btn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F77z72qq1jlo2mul43btn.png" alt="jgtfrtt" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add a new cell with the following code to load a new DataFrame from the partitioned parquet files:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;orders_2021_df = spark.read.format("parquet").load("Files/partitioned_data/Year=2021/Month=*")&lt;/p&gt;

&lt;p&gt;display(orders_2021_df)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the cell and verify that the results show the order data for sales in 2021. Notice that the partitioning columns specified in the path (Year and Month) are not included in the DataFrame.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr21dkytf43a7kdm0snpu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr21dkytf43a7kdm0snpu.png" alt="dtrdrdtft" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Work with tables and SQL&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You’ve now seen how the native methods of the DataFrame object enable you to query and analyze data from a file. However, you may be more comfortable working with tables using SQL syntax. Spark provides a metastore in which you can define relational tables.&lt;/p&gt;

&lt;p&gt;The Spark SQL library supports the use of SQL statements to query tables in the metastore. This provides the flexibility of a data lake with the structured data schema and SQL-based queries of a relational data warehouse - hence the term “data lakehouse”.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create a table&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Tables in a Spark metastore are relational abstractions over files in the data lake. Tables can be &lt;em&gt;managed&lt;/em&gt; by the metastore, or &lt;em&gt;external&lt;/em&gt; and managed independently of the metastore.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add a code cell to the notebook and enter the following code, which saves the DataFrame of sales order data as a table named &lt;em&gt;salesorders&lt;/em&gt;:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;# Create a new table&lt;br&gt;
df.write.format("delta").saveAsTable("salesorders")&lt;/p&gt;

&lt;p&gt;# Get the table description&lt;br&gt;
spark.sql("DESCRIBE EXTENDED salesorders").show(truncate=False)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the code cell and review the output, which describes the definition of the new table.&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;Explorer&lt;/strong&gt; pane, in the … menu for the Tables folder, select &lt;strong&gt;Refresh&lt;/strong&gt;. Then expand the &lt;strong&gt;Tables&lt;/strong&gt; node and verify that the &lt;strong&gt;salesorders&lt;/strong&gt; table has been created.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbaheco0h6h7prtsxdyng.png" alt="IHGTFRH" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foclrmtyfyf833trae3ut.png" alt="yutttyu" width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;No explicit path is provided, so the files for the table will be managed by the metastore. Also, the table is saved in delta format which adds relational database capabilities to tables. This includes support for transactions, row versioning, and other useful features. Creating tables in delta format is preferred for data lakehouses in Fabric&lt;/em&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the … menu for the salesorders table, select &lt;strong&gt;Load data&lt;/strong&gt; &amp;gt; &lt;strong&gt;Spark&lt;/strong&gt;. A new code cell is added containing code similar to the following:&lt;/li&gt;
&lt;li&gt;Run the new code, which uses the Spark SQL library to embed a SQL query against the &lt;em&gt;salesorders&lt;/em&gt; table in PySpark code and load the results of the query into a DataFrame.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;df = spark.sql("SELECT * FROM [your_lakehouse].salesorders LIMIT 1000")&lt;/p&gt;

&lt;p&gt;display(df)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftjqy8ny3z7qd96nsb07a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftjqy8ny3z7qd96nsb07a.png" alt="Iyt7r5e45" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Run SQL code in a cell&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;While it’s useful to be able to embed SQL statements into a cell containing PySpark code, data analysts often just want to work directly in SQL.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add a new code cell to the notebook, and enter the following code:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;%%sql&lt;br&gt;
SELECT YEAR(OrderDate) AS OrderYear,&lt;br&gt;
       SUM((UnitPrice * Quantity) + Tax) AS GrossRevenue&lt;br&gt;
FROM salesorders&lt;br&gt;
GROUP BY YEAR(OrderDate)&lt;br&gt;
ORDER BY OrderYear;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the cell and review the results. Observe that:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;(i)&lt;/strong&gt; The %%sql command at the beginning of the cell (called a magic) changes the language to Spark SQL instead of PySpark.&lt;br&gt;
&lt;strong&gt;(ii)&lt;/strong&gt; The SQL code references the salesorders table that you created previously.&lt;br&gt;
&lt;strong&gt;(iii)&lt;/strong&gt; The output from the SQL query is automatically displayed as the result under the cell.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fozrq3kgd90t1mz63tpyp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fozrq3kgd90t1mz63tpyp.png" alt="ytrurrtr" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Visualize data with Spark&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Charts help you to see patterns and trends faster than would be possible by scanning thousands of rows of data. Fabric notebooks include a built-in chart view but it is not designed for complex charts. For more control over how charts are created from data in DataFrames, use Python graphics libraries like &lt;em&gt;matplotlib&lt;/em&gt; or &lt;em&gt;seaborn&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;View results as a chart&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add a new code cell, and enter the following code:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;%%sql&lt;br&gt;
SELECT * FROM salesorders&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the code to display data from the salesorders table you created previously. In the results section beneath the cell, select &lt;strong&gt;+ New chart&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fegcet9gmwws1sg14y0cm.png" alt="jgfrdr" width="800" height="423"&gt;
Use the &lt;strong&gt;Build my own&lt;/strong&gt; button at the bottom-right of the results section and set the chart settings:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;(i)&lt;/strong&gt; Chart type: Bar chart&lt;br&gt;
&lt;strong&gt;(ii)&lt;/strong&gt; X-axis: Item&lt;br&gt;
&lt;strong&gt;(iii)&lt;/strong&gt; Y-axis: Quantity&lt;br&gt;
&lt;strong&gt;(iv)&lt;/strong&gt; Series Group: leave blank&lt;br&gt;
&lt;strong&gt;(v)&lt;/strong&gt; Aggregation: Sum&lt;br&gt;
&lt;strong&gt;(vi)&lt;/strong&gt; Missing and NULL values: Display as 0&lt;br&gt;
&lt;strong&gt;(vii)&lt;/strong&gt; Stacked: Unselected&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsr20589cq2ogwdwa7552.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsr20589cq2ogwdwa7552.png" alt="ytfue" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
Your chart should look similar to this:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8htxqujl171j6u0tlaoh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8htxqujl171j6u0tlaoh.png" alt="tdrydr" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Get started with matplotlib&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add a new code cell, and enter the following code:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;sqlQuery = "SELECT CAST(YEAR(OrderDate) AS CHAR(4)) AS OrderYear, \&lt;br&gt;
                SUM((UnitPrice * Quantity) + Tax) AS GrossRevenue, \&lt;br&gt;
                COUNT(DISTINCT SalesOrderNumber) AS YearlyCounts \&lt;br&gt;
            FROM salesorders \&lt;br&gt;
            GROUP BY CAST(YEAR(OrderDate) AS CHAR(4)) \&lt;br&gt;
            ORDER BY OrderYear"&lt;br&gt;
df_spark = spark.sql(sqlQuery)&lt;br&gt;
df_spark.show()&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the code. It returns a Spark DataFrame containing the yearly revenue and number of orders. To visualize the data as a chart, we’ll first use the matplotlib Python library. This library is the core plotting library on which many others are based and provides a great deal of flexibility in creating charts.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3e7rq9fnzinddpy8uuex.png" alt="Ihgytfdtrd" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;Add a new code cell, and add the following code:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;from matplotlib import pyplot as plt&lt;/p&gt;

&lt;p&gt;# matplotlib requires a Pandas dataframe, not a Spark one&lt;br&gt;
df_sales = df_spark.toPandas()&lt;/p&gt;

&lt;p&gt;# Create a bar plot of revenue by year&lt;br&gt;
plt.bar(x=df_sales['OrderYear'], height=df_sales['GrossRevenue'])&lt;/p&gt;

&lt;p&gt;# Display the plot&lt;br&gt;
plt.show()&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the cell and review the results, which consist of a column chart with the total gross revenue for each year. Review the code, and notice the following:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;(i)&lt;/strong&gt; The matplotlib library requires a Pandas DataFrame, so you need to convert the Spark DataFrame returned by the Spark SQL query.&lt;br&gt;
&lt;strong&gt;(ii)&lt;/strong&gt; At the core of the matplotlib library is the pyplot object. This is the foundation for most plotting functionality.&lt;br&gt;
&lt;strong&gt;(iii)&lt;/strong&gt; The default settings result in a usable chart, but there’s considerable scope to customize it.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp0me5vnqr6nk6nnjc8a5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp0me5vnqr6nk6nnjc8a5.png" alt="Igytfdtrdty" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Modify the code to plot the chart as follows:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;from matplotlib import pyplot as plt&lt;/p&gt;

&lt;p&gt;# Clear the plot area&lt;br&gt;
plt.clf()&lt;/p&gt;

&lt;p&gt;# Create a bar plot of revenue by year&lt;br&gt;
plt.bar(x=df_sales['OrderYear'], height=df_sales['GrossRevenue'], color='orange')&lt;/p&gt;

&lt;p&gt;# Customize the chart&lt;br&gt;
plt.title('Revenue by Year')&lt;br&gt;
plt.xlabel('Year')&lt;br&gt;
plt.ylabel('Revenue')&lt;br&gt;
plt.grid(color='#95a5a6', linestyle='--', linewidth=2, axis='y', alpha=0.7)&lt;br&gt;
plt.xticks(rotation=45)&lt;/p&gt;

&lt;p&gt;# Show the figure&lt;br&gt;
plt.show()&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Re-run the code cell and view the results. The chart is now easier to understand.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3u3wzly2jvyoy1zs13gi.png" alt="kjhfrytft" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fts4jdwcvabb6pvpr26pj.png" alt="tyjutr" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;A plot is contained within a Figure. In the previous examples, the figure was created implicitly, but it can be created explicitly. Modify the code to plot the chart as follows:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;from matplotlib import pyplot as plt&lt;/p&gt;

&lt;p&gt;# Clear the plot area&lt;br&gt;
plt.clf()&lt;/p&gt;

&lt;p&gt;# Create a Figure&lt;br&gt;
fig = plt.figure(figsize=(8,3))&lt;/p&gt;

&lt;p&gt;# Create a bar plot of revenue by year&lt;br&gt;
plt.bar(x=df_sales['OrderYear'], height=df_sales['GrossRevenue'], color='orange')&lt;/p&gt;

&lt;p&gt;# Customize the chart&lt;br&gt;
plt.title('Revenue by Year')&lt;br&gt;
plt.xlabel('Year')&lt;br&gt;
plt.ylabel('Revenue')&lt;br&gt;
plt.grid(color='#95a5a6', linestyle='--', linewidth=2, axis='y', alpha=0.7)&lt;br&gt;
plt.xticks(rotation=45)&lt;/p&gt;

&lt;p&gt;# Show the figure&lt;br&gt;
plt.show()&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Re-run the code cell and view the results. The figure determines the shape and size of the plot.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqiwiqsy5scm6j9thmp6a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqiwiqsy5scm6j9thmp6a.png" alt="shthehrw" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9bt0thbhelwr3ksnsrku.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9bt0thbhelwr3ksnsrku.png" alt="rgrewer" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A figure can contain multiple subplots, each on its own axis. Modify the code to plot the chart as follows:&lt;/li&gt;
&lt;/ul&gt;

&lt;pre&gt;&lt;code&gt;from matplotlib import pyplot as plt

# Clear the plot area
plt.clf()

# Create a figure for 2 subplots (1 row, 2 columns)
fig, ax = plt.subplots(1, 2, figsize=(10, 4))

# Create a bar plot of revenue by year on the first axis
ax[0].bar(x=df_sales['OrderYear'], height=df_sales['GrossRevenue'], color='orange')
ax[0].set_title('Revenue by Year')

# Create a pie chart of yearly order counts on the second axis
ax[1].pie(df_sales['YearlyCounts'])
ax[1].set_title('Orders per Year')
ax[1].legend(df_sales['OrderYear'])

# Add a title to the figure
fig.suptitle('Sales Data')

# Show the figure
plt.show()
&lt;/code&gt;&lt;/pre&gt;

&lt;ul&gt;
&lt;li&gt;Re-run the code cell and view the results.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1jt0qae0nkuc8uvisf82.png" alt="sefagahe" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4tom9z4t40vufoc3alng.png" alt="rsegwhtrhq" width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Use the seaborn library&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;While &lt;em&gt;matplotlib&lt;/em&gt; enables you to create different chart types, it can require some complex code to achieve the best results. For this reason, new libraries have been built on matplotlib to abstract its complexity and enhance its capabilities. One such library is seaborn.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add a new code cell to the notebook, and enter the following code:&lt;/li&gt;
&lt;/ul&gt;

&lt;pre&gt;&lt;code&gt;import seaborn as sns

# Clear the plot area
plt.clf()

# Create a bar chart
ax = sns.barplot(x="OrderYear", y="GrossRevenue", data=df_sales)

plt.show()
&lt;/code&gt;&lt;/pre&gt;

&lt;ul&gt;
&lt;li&gt;Run the code to display a bar chart created using the seaborn library.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhoaglgx908wiywbbqfus.png" alt="gygftfuy" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;Modify the code again as follows:&lt;/li&gt;
&lt;/ul&gt;

&lt;pre&gt;&lt;code&gt;import seaborn as sns

# Clear the plot area
plt.clf()

# Create a line chart
ax = sns.lineplot(x="OrderYear", y="GrossRevenue", data=df_sales)

plt.show()
&lt;/code&gt;&lt;/pre&gt;

&lt;ul&gt;
&lt;li&gt;Run the modified code to view the yearly revenue as a line chart.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpm06jvg2csbbdvbbwpqz.png" alt="uytrtfuu" width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this project, you’ve learned how to use Spark to work with data in Microsoft Fabric.&lt;/p&gt;

</description>
      <category>datascience</category>
      <category>cloudcomputing</category>
      <category>fabric</category>
      <category>microsoft</category>
    </item>
    <item>
      <title>Create a Microsoft Fabric Lakehouse</title>
      <dc:creator>lotanna obianefo</dc:creator>
      <pubDate>Thu, 02 Oct 2025 01:00:06 +0000</pubDate>
      <link>https://dev.to/lotanna_obianefo/create-a-microsoft-fabric-lakehouse-gj3</link>
      <guid>https://dev.to/lotanna_obianefo/create-a-microsoft-fabric-lakehouse-gj3</guid>
<description>&lt;p&gt;Large-scale data analytics solutions have traditionally been built around a data warehouse, in which data is stored in relational tables and queried using SQL. The growth in “big data” (characterized by high volumes, variety, and velocity of new data assets), together with the availability of low-cost storage and cloud-scale distributed compute technologies, has led to an alternative approach to analytical data storage: the data lake. In a data lake, data is stored as files without imposing a fixed schema for storage. Increasingly, data engineers and analysts seek to benefit from the best features of both approaches by combining them in a data lakehouse, in which data is stored as files in a data lake and a relational schema is applied to them as a metadata layer so that they can be queried using traditional SQL semantics.&lt;/p&gt;

&lt;p&gt;In Microsoft Fabric, a lakehouse provides highly scalable file storage in a OneLake store (built on Azure Data Lake Store Gen2) with a metastore for relational objects such as tables and views based on the open source Delta Lake table format. Delta Lake enables you to define a schema of tables in your lakehouse that you can query using SQL.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create a workspace&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before working with data in Fabric, create a workspace with the Fabric trial enabled.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Navigate to the Microsoft Fabric home page at &lt;a href="https://app.fabric.microsoft.com/home?experience=fabric" rel="noopener noreferrer"&gt;https://app.fabric.microsoft.com/home?experience=fabric&lt;/a&gt; in a browser, and sign in with your Fabric credentials.&lt;/li&gt;
&lt;li&gt;In the menu bar on the left, select &lt;strong&gt;Workspaces&lt;/strong&gt; (the icon looks similar to 🗇).&lt;/li&gt;
&lt;li&gt;Create a new workspace with a name of your choice, selecting a licensing mode in the &lt;strong&gt;Advanced&lt;/strong&gt; section that includes Fabric capacity.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftuor8qo9s48vcyy8vdry.png" alt="GSDGE" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbx4rdm8k5f48cy2oyrw9.png" alt="LEREF" width="800" height="423"&gt;
When your new workspace opens, it should be empty.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwvqtb7jgvq7pjxlaj9ep.png" alt="ugftr" width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Create a lakehouse&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now that you have a workspace, it’s time to create a data lakehouse for your data files.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;On the menu bar on the left, select &lt;strong&gt;Create&lt;/strong&gt;. In the New page, under the &lt;em&gt;Data Engineering&lt;/em&gt; section, select &lt;strong&gt;Lakehouse&lt;/strong&gt;. Give it a unique name of your choice.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpt584fle5i2m0arrtj98.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpt584fle5i2m0arrtj98.png" alt="juhygy" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5zp94kzxhckovm2wi5mo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5zp94kzxhckovm2wi5mo.png" alt="jhgyft" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
View the new lakehouse, and note that the Lakehouse explorer pane on the left enables you to browse tables and files in the lakehouse:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The &lt;strong&gt;Tables&lt;/strong&gt; folder contains tables that you can query using SQL semantics. Tables in a Microsoft Fabric lakehouse are based on the open source &lt;em&gt;Delta Lake&lt;/em&gt; file format, commonly used in Apache Spark.&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;Files&lt;/strong&gt; folder contains data files in the OneLake storage for the lakehouse that aren’t associated with managed delta tables. You can also create shortcuts in this folder to reference data that is stored externally.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnhu5rohehvawipji82v5.png" alt="gfrtf" width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;Currently, there are no tables or files in the lakehouse.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Upload a file&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Fabric provides multiple ways to load data into the lakehouse, including built-in support for pipelines that copy data from external sources and dataflows (Gen2) that you can define using visual tools based on Power Query. However, one of the simplest ways to ingest small amounts of data is to upload files or folders from your local computer.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Download the sales.csv file from &lt;a href="https://raw.githubusercontent.com/MicrosoftLearning/dp-data/main/sales.csv" rel="noopener noreferrer"&gt;https://raw.githubusercontent.com/MicrosoftLearning/dp-data/main/sales.csv&lt;/a&gt;, saving it as &lt;strong&gt;sales.csv&lt;/strong&gt; on your local computer.&lt;/li&gt;
&lt;li&gt;Return to the web browser tab containing your lakehouse, and in the &lt;strong&gt;…&lt;/strong&gt; menu for the &lt;strong&gt;Files&lt;/strong&gt; folder in the &lt;strong&gt;Explorer&lt;/strong&gt; pane, select &lt;strong&gt;New subfolder&lt;/strong&gt;, and create a subfolder named &lt;strong&gt;data&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffaphwsp3y3pxdlhdgi8m.png" alt="gtfrdrdr" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgsahgzjtcop7scce8wxh.png" alt="ytfrdy" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;In the … menu for the new &lt;strong&gt;data&lt;/strong&gt; folder, select &lt;strong&gt;Upload&lt;/strong&gt; and &lt;strong&gt;Upload files&lt;/strong&gt;, and then upload the &lt;strong&gt;sales.csv&lt;/strong&gt; file from your local computer.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2syigzo3d4tytxe1dc1i.png" alt="Ijhhg" width="800" height="423"&gt;
After the file has been uploaded, select the Files/data folder and verify that the sales.csv file has been uploaded, as shown here:
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F35zi7nbduat1e160ka85.png" alt="gtfty" width="800" height="423"&gt;
Select the sales.csv file to see a preview of its contents.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Explore shortcuts&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In many scenarios, the data you need to work with in your lakehouse may be stored in some other location. While there are many ways to ingest data into the OneLake storage for your lakehouse, another option is to instead create a shortcut. Shortcuts enable you to include externally sourced data in your analytics solution without the overhead and risk of data inconsistency associated with copying it.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the … menu for the Files folder, select New shortcut.&lt;/li&gt;
&lt;li&gt;View the available data source types for shortcuts. Then close the New shortcut dialog box without creating a shortcut.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnvowuzn99ozstmkwjt84.png" alt="shortcut" width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Load file data into a table&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The sales data you uploaded is in a file, which data analysts and engineers can work with directly by using Apache Spark code. However, in many scenarios you may want to load the data from the file into a table so that you can query it using SQL.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the &lt;strong&gt;Explorer&lt;/strong&gt; pane, select the &lt;strong&gt;Files/data&lt;/strong&gt; folder so you can see the &lt;strong&gt;sales.csv&lt;/strong&gt; file it contains.&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;…&lt;/strong&gt; menu for the &lt;strong&gt;sales.csv&lt;/strong&gt; file, select &lt;strong&gt;Load to Tables &amp;gt; New table&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;Load to table&lt;/strong&gt; dialog box, set the table name to &lt;strong&gt;sales&lt;/strong&gt; and confirm the load operation. Then wait for the table to be created and loaded.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh4clslu3f2elqbxe9q03.png" alt="Iokerefq" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;In the Explorer pane, select the sales table that has been created to view the data.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;If the sales table does not automatically appear, in the … menu for the Tables folder, select Refresh.&lt;/em&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fllm9w54vufo7lbfv5jx1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fllm9w54vufo7lbfv5jx1.png" alt="Ijfqe" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the … menu for the &lt;strong&gt;sales&lt;/strong&gt; table, select &lt;strong&gt;View files&lt;/strong&gt; to see the underlying files for this table.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2c2zphmw2y0z2j1vyfnv.png" alt="Ihgtrd" width="800" height="423"&gt;
&lt;em&gt;Files for a delta table are stored in Parquet format, and include a subfolder named _delta_log in which details of transactions applied to the table are logged&lt;/em&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Use SQL to query tables&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;When you create a lakehouse and define tables in it, a SQL endpoint is automatically created through which the tables can be queried using SQL &lt;strong&gt;SELECT&lt;/strong&gt; statements.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;At the top-right of the Lakehouse page, switch from &lt;strong&gt;Lakehouse&lt;/strong&gt; to &lt;strong&gt;SQL analytics endpoint&lt;/strong&gt;. Then wait a short time until the SQL analytics endpoint for your lakehouse opens in a visual interface from which you can query its tables.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Use the &lt;strong&gt;New SQL query&lt;/strong&gt; button to open a new query editor, and enter the following SQL query:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;SELECT Item, SUM(Quantity * UnitPrice) AS Revenue
FROM sales
GROUP BY Item
ORDER BY Revenue DESC;
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnw5x2ch90kbs7m0b6g6s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnw5x2ch90kbs7m0b6g6s.png" alt="gftftygy" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use the ▷ Run button to run the query and view the results, which should show the total revenue for each product.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2kxbsoqp1mnqgxqxvql8.png" alt="Ihgyfrt" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbyfpbdg2s2hkmvijmmmo.png" alt="gftdrh" width="800" height="423"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fylmq2vmo2o00yrh29ir7.png" alt="jhdrdrt" width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;
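&lt;p&gt;If you want to experiment with the same aggregation logic outside Fabric, it can be sketched with Python’s built-in sqlite3 module, using a few hypothetical rows standing in for the sales table:&lt;/p&gt;

```python
import sqlite3

# In-memory database with a simplified sales table (hypothetical rows)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (Item TEXT, Quantity INTEGER, UnitPrice REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("Road Bike", 2, 1200.0), ("Helmet", 5, 35.0), ("Road Bike", 1, 1200.0)],
)

# Total revenue per item, highest first -- the same shape as the query above
rows = conn.execute(
    """SELECT Item, SUM(Quantity * UnitPrice) AS Revenue
       FROM sales
       GROUP BY Item
       ORDER BY Revenue DESC"""
).fetchall()
print(rows)  # [('Road Bike', 3600.0), ('Helmet', 175.0)]
conn.close()
```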

&lt;p&gt;&lt;strong&gt;Create a visual query&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;While many data professionals are familiar with SQL, data analysts with Power BI experience can apply their Power Query skills to create visual queries.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the toolbar, expand the &lt;strong&gt;New SQL query&lt;/strong&gt; option and select &lt;strong&gt;New visual query&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0j97aydss751duxqqmbi.png" alt="fdrdrf" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;Drag the &lt;strong&gt;sales&lt;/strong&gt; table to the new visual query editor pane that opens to create a Power Query as shown here:
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv4qdmu3ioobj12u5rlm0.png" alt="gtfrdr" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;Manage columns&lt;/strong&gt; menu, select &lt;strong&gt;Choose columns&lt;/strong&gt;. Then select only the &lt;strong&gt;SalesOrderNumber&lt;/strong&gt; and &lt;strong&gt;SalesOrderLineNumber&lt;/strong&gt; columns.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu2bac8j5k5yziti1qres.png" alt="tutrtrry" width="800" height="425"&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6gvh0uauzazzwsgvsuj5.png" alt="yutrery" width="800" height="423"&gt;
&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;Transform&lt;/strong&gt; menu, select &lt;strong&gt;Group by&lt;/strong&gt;. Then group the data by using the following &lt;strong&gt;Basic&lt;/strong&gt; settings:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Group by&lt;/strong&gt;: SalesOrderNumber&lt;br&gt;
&lt;strong&gt;New column name&lt;/strong&gt;: LineItems&lt;br&gt;
&lt;strong&gt;Operation&lt;/strong&gt;: Count distinct values&lt;br&gt;
&lt;strong&gt;Column&lt;/strong&gt;: SalesOrderLineNumber&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk2zq0ccyq002d55o7wqh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk2zq0ccyq002d55o7wqh.png" alt="FJYRTRHJ" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftd36hsim64skqcvmiavn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftd36hsim64skqcvmiavn.png" alt="GFTFYUUY" width="800" height="423"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo7vc7reaci1rs3d5g0xd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo7vc7reaci1rs3d5g0xd.png" alt="DSEUio" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;
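&lt;p&gt;For intuition, the “Count distinct values” grouping performed by the visual query can be sketched in plain Python with hypothetical order data (these sample order numbers are illustrative, not from the sales table):&lt;/p&gt;

```python
from collections import defaultdict

# Hypothetical (SalesOrderNumber, SalesOrderLineNumber) pairs standing in
# for the two columns chosen in the visual query
rows = [("SO100", 1), ("SO100", 2), ("SO100", 2), ("SO200", 1)]

# Group by order number, counting distinct line numbers -- the same
# operation the Group by step performs
lines_per_order = defaultdict(set)
for order, line in rows:
    lines_per_order[order].add(line)

line_items = {order: len(lines) for order, lines in lines_per_order.items()}
print(line_items)  # {'SO100': 2, 'SO200': 1}
```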

&lt;p&gt;In this project, you have created a lakehouse and imported data into it. You’ve seen how a lakehouse consists of files and tables stored in a OneLake data store. The managed tables can be queried using SQL, and are included in a default semantic model to support data visualizations.&lt;/p&gt;

</description>
      <category>database</category>
      <category>dataengineering</category>
      <category>cloudcomputing</category>
      <category>datascience</category>
    </item>
    <item>
      <title>Mastering Linux Commands: Essential Tools for Efficient System Management</title>
      <dc:creator>lotanna obianefo</dc:creator>
      <pubDate>Wed, 01 Oct 2025 19:05:01 +0000</pubDate>
      <link>https://dev.to/lotanna_obianefo/mastering-linux-commands-essential-tools-for-efficient-system-management-3013</link>
      <guid>https://dev.to/lotanna_obianefo/mastering-linux-commands-essential-tools-for-efficient-system-management-3013</guid>
      <description>&lt;p&gt;Linux, an open-source operating system that powers environments ranging from servers and supercomputers to embedded systems, depends extensively on its command-line interface (CLI) for management and automation. In contrast to graphical user interfaces (GUIs), the Linux terminal provides unmatched efficiency, adaptability, and accuracy. Understanding key Linux commands is crucial for navigating the filesystem, managing processes, and troubleshooting issues.&lt;/p&gt;

&lt;p&gt;This article explores several essential Linux commands, grouped by category, with their primary functions and practical examples, focusing on the commands that form the backbone of daily Linux operations. By the end, you'll see how to apply them in a real-world case study.&lt;/p&gt;

&lt;p&gt;Before initiating development in the Linux terminal, it is common to switch to the root user by running the &lt;strong&gt;sudo su&lt;/strong&gt; command. This provides the elevated privileges required to update system packages and install dependencies essential for application execution.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;sudo su&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Forrspqjbok7m5mlhpkse.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Forrspqjbok7m5mlhpkse.png" alt="sudo su" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
Execute the command &lt;strong&gt;apt-get update&lt;/strong&gt; to synchronize the local package index with the repositories. This operation refreshes metadata for all available packages, ensuring the system references the latest versions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;apt-get update&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faz4i09n6spbjeh8v4pz9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faz4i09n6spbjeh8v4pz9.png" alt="apt-get update" width="800" height="470"&gt;&lt;/a&gt;&lt;br&gt;
At this stage, essential software packages and utilities such as NGINX and Vim can be installed. The command &lt;strong&gt;apt install nginx&lt;/strong&gt; provisions NGINX, a high-performance web server that also functions as a reverse proxy, load balancer, and caching solution to optimize application delivery. &lt;br&gt;
Similarly, &lt;strong&gt;apt install vim&lt;/strong&gt; installs Vim, an advanced and extensible text editor derived from &lt;strong&gt;Vi&lt;/strong&gt;, widely used for efficient code and configuration file editing in Linux environments.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;apt install nginx&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frr4nhn9z5r9f1iohmnu4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frr4nhn9z5r9f1iohmnu4.png" alt="nginx" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;apt install vim&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1mcdbpg2ehexf7btyjut.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1mcdbpg2ehexf7btyjut.png" alt="vim" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
When the terminal output becomes cluttered with excessive command history, execute the &lt;strong&gt;clear&lt;/strong&gt; command to reset the display buffer and present a clean workspace.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Navigation and Directory Management&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;These commands facilitate navigation within the Linux filesystem, enabling directory creation, traversal, and hierarchical organization.&lt;/p&gt;

&lt;p&gt;Utilize the &lt;strong&gt;mkdir&lt;/strong&gt; command to provision three new directories within the filesystem.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;mkdir dir1 dir2 dir3&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwxp60g0s5synhit6d764.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwxp60g0s5synhit6d764.png" alt="mkdir" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
Execute the &lt;strong&gt;ls&lt;/strong&gt; command to display the directories that were recently created within the current filesystem path. &lt;br&gt;
Now you can see the three directories we just created: &lt;strong&gt;dir1&lt;/strong&gt;, &lt;strong&gt;dir2&lt;/strong&gt;, and &lt;strong&gt;dir3&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4udfv2cfkfcj4pk5ha12.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4udfv2cfkfcj4pk5ha12.png" alt="ls" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Initially, it is necessary to navigate to the target directory using the &lt;strong&gt;cd&lt;/strong&gt; command.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;cd dir1&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fflguakwnjg2bowd4frf4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fflguakwnjg2bowd4frf4.png" alt="cd" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
To inspect file system metadata, including both standard and hidden entries, the commands &lt;strong&gt;ls -l&lt;/strong&gt; and &lt;strong&gt;ls -la&lt;/strong&gt; are utilized. The &lt;strong&gt;ls -l&lt;/strong&gt; command outputs detailed attributes of visible files and directories, whereas &lt;strong&gt;ls -la&lt;/strong&gt; extends this listing to include hidden entries.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;ls -l&lt;/strong&gt; &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fch9ib97fipjsx4zvz5m6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fch9ib97fipjsx4zvz5m6.png" alt="lsl" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;ls -la&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj1kub1cz17waz0kc7gjo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj1kub1cz17waz0kc7gjo.png" alt="lsla" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
Notice the dotfiles listed alongside the directory contents, each with its associated metadata.&lt;/p&gt;

&lt;p&gt;When navigating the filesystem, the &lt;strong&gt;cd ..&lt;/strong&gt; command moves the user one level up to the parent directory of the current working path, whereas the &lt;strong&gt;cd ~&lt;/strong&gt; command resolves directly to the user’s home directory, independent of the current location within the filesystem.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;cd ..&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F96i3jbg66kkg8emu81lp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F96i3jbg66kkg8emu81lp.png" alt="cdcd" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
Notice we moved up one level, into the parent directory of the previous working directory.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;cd ~&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3r4bjr23ix20h8c22n48.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3r4bjr23ix20h8c22n48.png" alt="dcdc" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
To display the absolute path of the directory in which the user is currently working, use the &lt;strong&gt;pwd&lt;/strong&gt; command, which stands for “&lt;em&gt;print working directory&lt;/em&gt;.”&lt;/p&gt;
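&lt;p&gt;The navigation commands above can be combined into one short session (the directory names are illustrative):&lt;/p&gt;

```shell
mkdir -p demo/dir1   # sample directories to navigate into
cd demo/dir1
pwd                  # absolute path of the current directory
cd ..                # up one level, into demo
pwd
cd ~                 # straight to the home directory, from anywhere
pwd                  # now prints your home directory
```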

&lt;p&gt;&lt;strong&gt;Files and Directory Operations&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Since the newly created directories are empty, the next step is to create a file, duplicate and relocate it into one of the target directories, and then open it in a text editor for further modifications.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;touch filename&lt;/strong&gt; command is used to create an empty file. &lt;/p&gt;

&lt;p&gt;Notice that we run the &lt;strong&gt;ls&lt;/strong&gt; command afterwards to list the files and directories we created.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;touch datafile.txt&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzwys9dua0ldder4sw8wg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzwys9dua0ldder4sw8wg.png" alt="touch" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To duplicate the newly created file into another directory, the &lt;strong&gt;cp filename.txt dir2&lt;/strong&gt; command is used, which preserves the original file in its current location while placing a copy in the target directory. In contrast, the &lt;strong&gt;mv filename.txt dir2&lt;/strong&gt; command relocates the file entirely into dir2, removing it from the source path. Functionally, &lt;strong&gt;cp&lt;/strong&gt; behaves like a copy operation, whereas &lt;strong&gt;mv&lt;/strong&gt; is analogous to a cut-and-paste.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;cp datafile.txt dir2&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftsrz383jhv4y3x8prf0n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftsrz383jhv4y3x8prf0n.png" alt="cp" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;mv datafile.txt dir2&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F41e1ywceg8wbyn5t258v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F41e1ywceg8wbyn5t258v.png" alt="mv" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
After executing the ls command to verify the directory contents, you will observe that the file has been relocated to dir2.&lt;/p&gt;
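&lt;p&gt;As a quick sketch of the copy-versus-move distinction (the file and directory names mirror the walkthrough):&lt;/p&gt;

```shell
mkdir -p dir2
touch datafile.txt
cp datafile.txt dir2   # copy: the original stays in the source path
ls dir2                # dir2 now contains its own datafile.txt
mv datafile.txt dir2   # move: the original is removed from the source path
ls                     # datafile.txt is no longer listed here
```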

&lt;p&gt;Unlike the &lt;strong&gt;touch&lt;/strong&gt; command, the &lt;strong&gt;vim filename.txt&lt;/strong&gt; command not only creates the file (if it does not already exist) but also opens it in the Vim editor for further modifications.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;vim datafile.txt&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu1c9fi6727w4rm8nupoj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu1c9fi6727w4rm8nupoj.png" alt="vim" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
When the command is executed, it opens the file in the Vim editor. Press &lt;em&gt;i&lt;/em&gt; to enter Insert mode and make your changes. Once editing is complete, press &lt;em&gt;Esc&lt;/em&gt; to return to Normal mode, then type &lt;em&gt;:wq&lt;/em&gt; and press Enter to write the changes to disk and exit the editor.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhfc34hph8tdw87aumoq3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhfc34hph8tdw87aumoq3.png" alt="iii" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj4e9feqgz6wlal8k1p6n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj4e9feqgz6wlal8k1p6n.png" alt="wqwwq" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To display the contents of the recently edited file, use the &lt;strong&gt;cat filename&lt;/strong&gt; command.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;cat datafile.txt&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft339k8vi0ccvxje6mhk6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft339k8vi0ccvxje6mhk6.png" alt="cat" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If a directory is no longer required, it can be removed using the &lt;strong&gt;rmdir dir&lt;/strong&gt; command. However, &lt;strong&gt;rmdir&lt;/strong&gt; only deletes empty directories; to remove a directory along with its contents (files and subdirectories), the &lt;strong&gt;rm -r dir&lt;/strong&gt; command must be used. In contrast, the &lt;strong&gt;rm filename&lt;/strong&gt; command removes individual files.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;rm datafile&lt;/strong&gt;&lt;br&gt;
To delete an individual file&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frgksfqnu4480macevvjg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frgksfqnu4480macevvjg.png" alt="RM" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;rm -r dir1&lt;/strong&gt;&lt;br&gt;
To delete a directory along with its contents&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs954vhv2os1n38ajcbul.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs954vhv2os1n38ajcbul.png" alt="rmr" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;rmdir dir3&lt;/strong&gt;&lt;br&gt;
To delete an empty directory&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe8xzg8irmvf67yl74gk6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe8xzg8irmvf67yl74gk6.png" alt="rmdir" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, &lt;strong&gt;grep&lt;/strong&gt; is a robust command-line utility designed for pattern matching and text search within files or command outputs. It functions as the Linux terminal’s built-in search mechanism, enabling efficient filtering and extraction of relevant information from large datasets or command results. &lt;/p&gt;

&lt;p&gt;Using &lt;em&gt;delete&lt;/em&gt; as the search word.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;grep delete newfile.txt&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffvaw3bo1xymxtq9jutzq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffvaw3bo1xymxtq9jutzq.png" alt="grep" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;System Information and Management&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Among the vast array of Linux commands, those dedicated to system information and management stand out for their ability to deliver real-time insights and control over hardware, processes, and software.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;uname -a&lt;/strong&gt; command provides a comprehensive summary of system-level information, including kernel name, release, version, and hardware architecture details.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;uname -a&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2osqqh4oifa4yc2jh6ui.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2osqqh4oifa4yc2jh6ui.png" alt="uname" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To verify the storage capacity of the system, the &lt;strong&gt;df -h&lt;/strong&gt; command is executed. This command reports disk space usage for all mounted filesystems, presenting the information in a human-readable format with size units.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;df -h&lt;/strong&gt; &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6q0265zdbozhtox8r0ya.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6q0265zdbozhtox8r0ya.png" alt="df h" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;free -h&lt;/strong&gt; command provides a detailed summary of the system’s memory utilization, including both RAM and swap space, in a human-readable format. It displays the total, used, free, and available memory, along with buffer/cache usage.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;free -h&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flw6v94p0omhb9qhe2mu6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flw6v94p0omhb9qhe2mu6.png" alt="freeh" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;top&lt;/strong&gt; command is used to display active processes and monitor real-time system resource utilization. It provides a continuously updated, dynamic view of process activity, including CPU and memory consumption. Functionally, &lt;strong&gt;top&lt;/strong&gt; serves as a terminal-based equivalent of a task manager in Linux.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;top&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ogu5g9qn6oswnqbagkp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ogu5g9qn6oswnqbagkp.png" alt="top" width="800" height="427"&gt;&lt;/a&gt;&lt;br&gt;
Note: To terminate a running process in the terminal, use &lt;em&gt;Ctrl+C&lt;/em&gt;, which sends a SIGINT (interrupt signal) to stop execution. Alternatively, &lt;em&gt;Ctrl+Z&lt;/em&gt; suspends the process by sending a SIGTSTP signal, pausing its execution; the suspended job can later be resumed in the foreground with &lt;strong&gt;fg&lt;/strong&gt; or continued in the background with &lt;strong&gt;bg&lt;/strong&gt;.&lt;/p&gt;
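&lt;p&gt;The signals behind those key combinations can be inspected with the shell's &lt;strong&gt;kill&lt;/strong&gt; built-in; a small sketch (the numbers shown are the usual Linux values and can vary by platform):&lt;/p&gt;

```shell
kill -l      # list every signal name the shell knows
kill -l 2    # signal 2 is INT, the signal Ctrl+C sends
kill -l 20   # on Linux, signal 20 is TSTP, the signal Ctrl+Z sends
```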

&lt;p&gt;Mastering Linux commands is not just about memorization; it’s about building intuition for how the operating system interacts with files, processes, and resources. From navigating the filesystem to monitoring system performance and controlling processes, these commands form the foundation of efficient system administration.&lt;/p&gt;

&lt;p&gt;By consistently practicing and combining these tools, administrators and developers can transform Linux from a basic operating system into a powerful, precision-driven environment for managing modern workloads.&lt;/p&gt;

</description>
      <category>linux</category>
      <category>networking</category>
      <category>programming</category>
      <category>cloudcomputing</category>
    </item>
  </channel>
</rss>
