<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Mike Richter</title>
    <description>The latest articles on DEV Community by Mike Richter (@michaelsrichter).</description>
    <link>https://dev.to/michaelsrichter</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F406376%2F81e28a56-15ba-464f-8cc2-268f2e224b50.jpg</url>
      <title>DEV Community: Mike Richter</title>
      <link>https://dev.to/michaelsrichter</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/michaelsrichter"/>
    <language>en</language>
    <item>
      <title>ML to AI with ML.Net and Azure</title>
      <dc:creator>Mike Richter</dc:creator>
      <pubDate>Mon, 26 Apr 2021 23:59:15 +0000</pubDate>
      <link>https://dev.to/michaelsrichter/ml-to-ai-with-ml-net-and-azure-3b3e</link>
      <guid>https://dev.to/michaelsrichter/ml-to-ai-with-ml-net-and-azure-3b3e</guid>
      <description>&lt;p&gt;Use ML.Net to build a Sentiment Analysis API&lt;/p&gt;

&lt;h1&gt;
  
  
  Goal
&lt;/h1&gt;

&lt;p&gt;This post will show you how easy it is to build a Sentiment Analysis Prediction Model with ML.NET and Model Builder in Visual Studio.&lt;/p&gt;

&lt;p&gt;We will then deploy that prediction model as an API and consume it as AI from a web application.&lt;/p&gt;

&lt;p&gt;If you have all the requirements already, this should take about 30 minutes and be completely free.&lt;/p&gt;

&lt;h2&gt;
  
  
  ML to AI
&lt;/h2&gt;

&lt;p&gt;What's the difference between Machine Learning (ML) and AI? My perspective is that AI is &lt;em&gt;applied&lt;/em&gt; ML. When you build applications that use ML to improve the user experience and to make the applications feel more natural and intuitive, that is AI.  &lt;/p&gt;

&lt;p&gt;We will see this transformation from ML to AI as we go through this post.&lt;/p&gt;

&lt;h2&gt;
  
  
  Steps
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;We will build a Machine Learning model using an existing ML.Net tutorial. &lt;/li&gt;
&lt;li&gt;We will add a Web API to the project and update the code to wire it all together.&lt;/li&gt;
&lt;li&gt;We will deploy the Web API to the cloud.&lt;/li&gt;
&lt;li&gt;We will use a simple web-app to demonstrate making the call to the API.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;By the end of this post, you'll have a working Sentiment Analysis API and an AI-infused app that you can show to your friends and coworkers!&lt;/p&gt;

&lt;h1&gt;
  
  
  Build the ML Model
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Tutorial
&lt;/h2&gt;

&lt;p&gt;Follow the instructions for this &lt;a href="https://dotnet.microsoft.com/learn/ml-dotnet/get-started-tutorial"&gt;tutorial&lt;/a&gt; to get started. Note that you will need Visual Studio installed (the tutorial provides the link) to complete this exercise. You do NOT need Visual Studio to use ML.Net generally, but Visual Studio comes with a feature called &lt;a href="https://dotnet.microsoft.com/apps/machinelearning-ai/ml-dotnet/model-builder"&gt;Model Builder&lt;/a&gt; which provides a nice UI and writes a lot of the code for you.&lt;/p&gt;

&lt;p&gt;When you finish the tutorial, you should be able to run the project and see predictions. Congratulations, you're ready for the next step!&lt;/p&gt;

&lt;h1&gt;
  
  
  Add An API
&lt;/h1&gt;

&lt;p&gt;OK, you built an ML model and have a way to consume it, but it's not very usable right now. Your users or customers are not going to want to use the command line, and these predictions are not very helpful outside the context of an actual application. Let's fix that.&lt;/p&gt;

&lt;h2&gt;
  
  
  Update Some Code
&lt;/h2&gt;

&lt;p&gt;To make this work, we'll need to change some of the code that Model Builder generated for us. In Visual Studio, find the file called &lt;code&gt;ConsumeModel.cs&lt;/code&gt; and open it. Around line 30, you should see a line that looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;string modelPath = @"C:\Users\mrichter\AppData\Local\Temp\MLVSTools\BaruchMLDemo11ML\BaruchMLDemo11ML.Model\MLModel.zip"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The directory for the ML Model (&lt;code&gt;MLModel.zip&lt;/code&gt;) we produced is hard-coded. This is bad because when we deploy this API, we won't actually know what the directory structure will be. Also, the API will be hosted on a Linux server, so this Windows directory won't make any sense.&lt;/p&gt;

&lt;p&gt;We need to update the &lt;code&gt;modelPath&lt;/code&gt; variable from a hard-coded, absolute path to a relative one. Update that line with this code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;string modelPath = System.IO.Path.Combine(System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().Location), "MLModel.zip");
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now the code will look for the model (&lt;code&gt;MLModel.zip&lt;/code&gt;) in the same directory the application is executing from. &lt;/p&gt;

&lt;p&gt;Rebuild the solution (Build --&amp;gt; Rebuild Solution). You should see a message in the output that says "3 succeeded". That's because we have 3 projects in the solution.&lt;/p&gt;

&lt;h2&gt;
  
  
  Add the API Project
&lt;/h2&gt;

&lt;p&gt;In the Visual Studio solution explorer, right-click and choose &lt;strong&gt;Add a New Project&lt;/strong&gt;. See the screenshot below.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1Aq5LDCC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8rcwy7dsmbzozlke6ht5.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1Aq5LDCC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8rcwy7dsmbzozlke6ht5.PNG" alt="Add-New-Project"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Search for &lt;strong&gt;API&lt;/strong&gt; and choose the C# ASP.NET Core Web API project. Click Next.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--I9WD6-xO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/f04is8gk94nkwnx8i2kc.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--I9WD6-xO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/f04is8gk94nkwnx8i2kc.PNG" alt="Add-Web-Api"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Name the project something like MLDemo.Web. Leave all the other defaults and click Create. &lt;/p&gt;
&lt;h2&gt;
  
  
  Prepare the API Project
&lt;/h2&gt;

&lt;p&gt;This API project comes with some demo classes that we don't need, so let's remove them. In the root directory, delete &lt;code&gt;WeatherForecast.cs&lt;/code&gt;: just click it and press the Delete key. In the Controllers folder, delete &lt;code&gt;WeatherForecastController.cs&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Under Properties, you'll find &lt;code&gt;launchSettings.json&lt;/code&gt;. You'll see 2 entries that look like this.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"launchUrl":"weatherforecast"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Change them so they look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"launchUrl":"score"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now let's build the &lt;code&gt;score&lt;/code&gt; URL endpoint, which is how our API will be called! &lt;/p&gt;

&lt;h2&gt;
  
  
  Build the API
&lt;/h2&gt;

&lt;p&gt;Right-click on the &lt;strong&gt;Controllers&lt;/strong&gt; folder and choose Add --&amp;gt; Controller.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MjGY8hN9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/udgvv93st9zgzi63dikx.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MjGY8hN9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/udgvv93st9zgzi63dikx.PNG" alt="Add-Controller"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Choose "API Controller - Empty".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wt9NK2KO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y0caekpjgl8nrcjy4wdx.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wt9NK2KO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y0caekpjgl8nrcjy4wdx.PNG" alt="Add-API-File"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Name the file &lt;code&gt;ScoreController.cs&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TKRbo2j8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/egb40r2vhtuhkzalzafr.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TKRbo2j8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/egb40r2vhtuhkzalzafr.PNG" alt="Save-Name-As-Score-Controller"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Replace the contents of the generated file with the code below. It's not strictly necessary, but you may want to update the namespace to match the name of your project.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using Microsoft.AspNetCore.Mvc;
using System.Text.Json;

namespace UPDATEMEIFYOUWANT.Web.Controllers
{
    //This tells the dotnet framework that we are building a web api
    [ApiController]

    //This attribute tells dotnet that we want the URL route for the api to be
    //based on the controller's name.
    //In this case, the controller's name is "ScoreController" so the route will be /score
    [Route("[controller]")]
    public class ScoreController : ControllerBase
    {
        //This tells dotnet that our API will respond to HTTP GET verbs.
        //The "text" parameter gets passed in from the query string when a request to this API is made.
        [HttpGet]
        public string Get(string text)
        {
            var input = new ModelInput() { Col0 = text };
            //Let's call our ML model to predict the sentiment of our text
            ModelOutput result = ConsumeModel.Predict(input);
            //Return the prediction in the response.
            return JsonSerializer.Serialize(result);
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You will notice that some of the code has red lines beneath it. That's because we want our API to reference the Model project in our solution, but we haven't yet created that explicit reference between the projects. If you hover over any of those red-underlined words, you should see a little lightbulb pop up which reveals a menu. At the bottom of the menu there's an option to add a reference to our Model project. Click it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--P58TIwoa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k59llk6hf52gt7yrxaoz.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--P58TIwoa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k59llk6hf52gt7yrxaoz.PNG" alt="Add-A-Reference-to-Model"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;That's it! You've now built an API around the ML Model. Let's try it out! &lt;/p&gt;

&lt;h2&gt;
  
  
  Test the API
&lt;/h2&gt;

&lt;p&gt;Try rebuilding the entire solution. Go to Build -&amp;gt; Rebuild Solution. After a few seconds the output should tell you "4 succeeded" (because we now have 4 projects).&lt;/p&gt;

&lt;p&gt;Right-Click on the Web project and choose Debug --&amp;gt; Start New Instance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--FO2WQ-v6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bq4bw4tychv8elfzguuy.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FO2WQ-v6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bq4bw4tychv8elfzguuy.PNG" alt="debug-start-new-instance"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After a few seconds, your browser should start and point to a URL that looks like &lt;code&gt;https://localhost:44380/score&lt;/code&gt; and you should see some text in the browser window that looks like this.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{"Prediction":"1","Score":[0.4424079,0.5575921]}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
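
&lt;p&gt;The response is plain JSON: &lt;code&gt;Prediction&lt;/code&gt; is the predicted label ("1" for positive, "0" for negative) and &lt;code&gt;Score&lt;/code&gt; holds the model's probability for each class. A quick JavaScript sketch of reading it, using the example response above:&lt;/p&gt;

```javascript
// Example response body from the /score endpoint (copied from above).
const body = '{"Prediction":"1","Score":[0.4424079,0.5575921]}';

const result = JSON.parse(body);

// Prediction is the predicted label: "1" = positive, "0" = negative.
console.log(result.Prediction); // prints: 1

// Score[0] and Score[1] are the class probabilities; they sum to ~1.
const total = result.Score[0] + result.Score[1];
console.log(total.toFixed(2)); // prints: 1.00
```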



&lt;p&gt;You can pass in text via the URL and the API will use the model to predict the sentiment. Try updating the URL like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://localhost:44380/score?text=this%20is%20amazing!
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;(Note: you can type spaces; the browser will replace them with &lt;code&gt;%20&lt;/code&gt;.)&lt;/p&gt;

&lt;p&gt;The prediction should be "1". The model correctly predicted this is a positive statement. &lt;/p&gt;

&lt;p&gt;Now update the URL to say this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://localhost:44380/score?text=I%20think%20this%20is%20horrible
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The model should return "0" and predict that this is a negative statement.&lt;/p&gt;
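
&lt;p&gt;If you call the endpoint from a script instead of the browser's address bar, you need to encode the text yourself. A minimal JavaScript sketch (the port number is an example; yours will differ):&lt;/p&gt;

```javascript
// Build the /score request URL with the text safely URL-encoded.
// The base URL is an example; use whatever port Visual Studio assigned you.
const baseUrl = "https://localhost:44380/score";
const text = "I think this is horrible";

const url = baseUrl + "?text=" + encodeURIComponent(text);
console.log(url);
// prints: https://localhost:44380/score?text=I%20think%20this%20is%20horrible
```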

&lt;p&gt;Fantastic! Let's review what you've done so far.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You've built an ML model.&lt;/li&gt;
&lt;li&gt;You've consumed the model from the command line.&lt;/li&gt;
&lt;li&gt;You've built a Web API.&lt;/li&gt;
&lt;li&gt;You've consumed the model from the Web API.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;But we're still not done. Users aren't going to connect to your computer to talk to this API. You need to deploy it to the cloud! &lt;/p&gt;

&lt;h1&gt;
  
  
  Deploy the API to the Cloud
&lt;/h1&gt;

&lt;p&gt;To deploy the API, you'll need a few things.&lt;/p&gt;

&lt;h2&gt;
  
  
  Requirements
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;A Microsoft Azure Account - If you don't have one, get one for &lt;a href="https://azure.microsoft.com/en-us/free/"&gt;free&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Azure Command Line (CLI) &lt;a href="https://docs.microsoft.com/en-us/cli/azure/install-azure-cli"&gt;installed&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Git client &lt;a href="https://git-scm.com/book/en/v2/Getting-Started-Installing-Git"&gt;installed&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Add the solution to Git
&lt;/h2&gt;

&lt;p&gt;We need to add the solution to Git because we will be using Git as the deployment vehicle for getting this API into the cloud.&lt;/p&gt;

&lt;p&gt;In your command line/terminal, go to the root directory for your solution. You should see four sub directories at this level (one for each project that was created).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--g6650EaO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/psgex4uaqvt288r8rxyz.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--g6650EaO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/psgex4uaqvt288r8rxyz.PNG" alt="Solution-Directory"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Run these commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git init
git add -A
git commit -m "Initial Commit"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Log into Azure from the Command Line
&lt;/h2&gt;

&lt;p&gt;Now log into Azure with these commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az login
az account set -s "your subscription name"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Build a web app to host our API
&lt;/h2&gt;

&lt;p&gt;You're going to set some default values for Azure resources here. (Note: these commands set environment variables on Windows. Here's a &lt;a href="https://www.twilio.com/blog/2017/01/how-to-set-environment-variables.html"&gt;post&lt;/a&gt; on setting them in other operating systems.)&lt;/p&gt;

&lt;p&gt;The APP name needs to be globally unique and should only contain letters and numbers. Try using your name to make sure it is unique.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;setx RG "baruchmldemo11rg"
setx PLAN "baruchmldemo11plan" 
setx APP "baruchmldemo11api"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run this command to make sure the environment variables were set correctly: &lt;code&gt;echo %RG%&lt;/code&gt;. If you don't see the value you set, open a new terminal window; &lt;code&gt;setx&lt;/code&gt; sets variables for future sessions, not the current one. &lt;/p&gt;

&lt;p&gt;Once you've verified that the environment variables were set correctly, we're ready to start deploying Azure resources! &lt;/p&gt;

&lt;p&gt;Create a Resource Group where these resources will live:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az group create -n %RG% -l eastus
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create an App Service plan that will host our app. This creates a free (F1) Linux plan in the East US region.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az appservice plan create -l eastus -g %RG% -n %PLAN% --sku F1 --is-linux
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now we create the actual Web App that lives in the Free App Service plan and that will host our web API.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az webapp create -g %RG% -p %PLAN% -n %APP% --runtime "DOTNETCORE|3.1"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Our API will be called from web browser clients, so we need to update the app's CORS settings:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az webapp cors add -a "*" -n %APP% -g %RG% 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We will be using Git to deploy our web application, so we need to create credentials for it.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az webapp deployment user set --user-name %APP% --password m@gicDepl0y
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should choose your own password. Just remember what it is; you will need it later.&lt;/p&gt;

&lt;p&gt;Now let's tell the Web App that we will be deploying via Git.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az webapp deployment source config-local-git --name %APP% -g %RG%
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That should return a URL that looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://baruchmldemo11api@baruchmldemo11api.scm.azurewebsites.net/baruchmldemo11api.git
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Remove the part of the URL between "https://" and the @ symbol, including the @ itself, so it looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://baruchmldemo11api.scm.azurewebsites.net/baruchmldemo11api.git
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Use that URL as a "remote" for git by running this command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git remote add azure https://baruchmldemo11api.scm.azurewebsites.net/baruchmldemo11api.git
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Deploy the API
&lt;/h2&gt;

&lt;p&gt;If we've done everything right, we should be ready to deploy our Web API to the cloud. Run the command below. You will be asked for credentials: use the app name as the user name (in my case &lt;code&gt;baruchmldemo11api&lt;/code&gt;) and the password we defined earlier (&lt;code&gt;m@gicDepl0y&lt;/code&gt;).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git push azure master
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command can take a couple of minutes to run. &lt;/p&gt;

&lt;h2&gt;
  
  
  Test the Web API
&lt;/h2&gt;

&lt;p&gt;When the command completes successfully, we can test the API.&lt;/p&gt;

&lt;p&gt;Go to the &lt;a href="https://portal.azure.com"&gt;Azure Portal&lt;/a&gt; to find the URL for your new Web API. &lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bETvVTeh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wl1u27wbzbodlktgtc1b.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bETvVTeh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wl1u27wbzbodlktgtc1b.PNG" alt="Azure-Portal"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click the URL; it may take a minute for the app to start. Then update the URL so it looks something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://baruchmldemo11api.azurewebsites.net/score?text=azure%20is%20the%20best%20for%20APIs!
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That should return a positive sentiment prediction. Congratulations, your API is deployed to the cloud!  &lt;/p&gt;

&lt;p&gt;Cool, we are &lt;strong&gt;&lt;em&gt;almost&lt;/em&gt;&lt;/strong&gt; done! 😄&lt;/p&gt;

&lt;h1&gt;
  
  
  Add the API to an App
&lt;/h1&gt;

&lt;p&gt;The last step is the easiest and the most fun. Let's see how we can use this API in a web application. &lt;br&gt;
Go to this URL:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://baruchmldemo.z13.web.core.windows.net/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the &lt;strong&gt;Sentiment Analysis API&lt;/strong&gt; text box, enter the URL endpoint for your API. It should look something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://baruchmldemo11api.azurewebsites.net/score
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, in &lt;strong&gt;Text to analyze&lt;/strong&gt;, enter some text and click "Analyze!"&lt;/p&gt;

&lt;p&gt;Depending on what text you put in, you should see a smiley face appear that reflects the sentiment of the text you entered.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vHUFxHGo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/s0yd1pzebper9gsep8az.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vHUFxHGo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/s0yd1pzebper9gsep8az.PNG" alt="Sentiment-App"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  ML to AI
&lt;/h2&gt;

&lt;p&gt;This is a simple app, but it shows how we create AI by invoking ML models from within an application.&lt;/p&gt;

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;In this blog post, we built an ML Model from scratch and operationalized it with an API. We deployed that API to the cloud and used it inside an application. A decade ago this may have taken months to build and cost a lot of money. We did it today in 30 minutes for free. &lt;/p&gt;

&lt;h2&gt;
  
  
  What's next?
&lt;/h2&gt;

&lt;p&gt;This model can definitely be improved, and all the steps to retrain, build, and deploy it can be automated. We built a model that is about 80% accurate in a few minutes; to improve it, we need to train it with more data. We can start with this model, deploy it, learn from it, and improve it. All the tools and steps for operationalizing the model are there; we just went through them, and that was the easy part!&lt;/p&gt;

</description>
      <category>machinelearning</category>
      <category>ai</category>
      <category>azure</category>
      <category>dotnet</category>
    </item>
    <item>
      <title>Authorization Strategies with Azure Active Directory</title>
      <dc:creator>Mike Richter</dc:creator>
      <pubDate>Mon, 08 Mar 2021 01:20:15 +0000</pubDate>
      <link>https://dev.to/michaelsrichter/authorization-strategies-with-azure-active-directory-2l8m</link>
      <guid>https://dev.to/michaelsrichter/authorization-strategies-with-azure-active-directory-2l8m</guid>
      <description>&lt;p&gt;This article will explore four strategies for authorizing users who authenticate into your applications with Azure Active Directory.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Application Roles&lt;/li&gt;
&lt;li&gt;User Groups&lt;/li&gt;
&lt;li&gt;Azure AD B2C 3rd Party Claims&lt;/li&gt;
&lt;li&gt;Azure Web API&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;We all know the two main elements of identity in app development. There's authentication (this is who I am) and authorization (this is what I can do). Azure Active Directory handles authentication for you, but you can also use it for authorization.&lt;/p&gt;

&lt;h2&gt;
  
  
  Authentication with Azure Active Directory
&lt;/h2&gt;

&lt;p&gt;In the cloud, one of the most popular systems for authentication is Azure Active Directory (AAD). If you use Office 365, when you sign in to Outlook or use the web versions of Word or other Office apps, you are authenticating against AAD. AAD is the identity system for Office 365. If your company uses popular SAAS solutions like Salesforce, Workday, SAP or ADP and you sign in with your work Office 365 account, then you are using that same AAD to sign in to those applications. And, of course, you can build your own applications that leverage AAD to sign users in with their corporate accounts. There are &lt;a href="https://cobweb.com/latest-news/10-reasons-why-you-should-be-using-azure-active-directory1001" rel="noopener noreferrer"&gt;lots of reasons&lt;/a&gt; to enable SSO with AAD, and &lt;a href="https://azuremarketplace.microsoft.com/en-us/marketplace/apps/category/azure-active-directory-apps" rel="noopener noreferrer"&gt;thousands of SAAS providers&lt;/a&gt; do just that. &lt;/p&gt;

&lt;h2&gt;
  
  
  Authorization with Azure Active Directory
&lt;/h2&gt;

&lt;p&gt;Let's say you're building a SAAS app. You can delegate authentication to AAD, and that is well documented. But how do you authorize users? When you authenticate users from AAD using OIDC/OAUTH2, you'll get a JWT with a globally unique identifier for that user, composed of a Tenant ID (the AAD tenant the user belongs to) and an Object ID (the user's ID in that tenant). Once you have this information, you can use it to figure out what permissions (if any) the user has in your application. &lt;/p&gt;
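
&lt;p&gt;For illustration, here is a minimal Node.js sketch of reading those two identifiers from a token's payload. The &lt;code&gt;tid&lt;/code&gt; and &lt;code&gt;oid&lt;/code&gt; claim names are the standard AAD claims; the token below is fabricated, and a real application must validate the token's signature with a proper library rather than decode it blindly:&lt;/p&gt;

```javascript
// Decode the payload (middle segment) of a JWT to read its claims.
// This skips signature validation; real apps must validate the token first.
function decodeJwtPayload(token) {
  const payload = token.split(".")[1];
  // JWTs use base64url; convert to standard base64 before decoding.
  const base64 = payload.replace(/-/g, "+").replace(/_/g, "/");
  return JSON.parse(Buffer.from(base64, "base64").toString("utf8"));
}

// Fabricated example: tid = tenant ID, oid = the user's object ID.
const fakePayload = Buffer.from(
  JSON.stringify({ tid: "contoso-tenant-id", oid: "user-object-id" })
).toString("base64");

const claims = decodeJwtPayload("header." + fakePayload + ".signature");
console.log(claims.tid, claims.oid);
```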

&lt;p&gt;For the sake of simplicity, authorization or permissions information can live in one of two places: inside AAD, or somewhere else, like a bespoke permissions system you build for your application. You can use either one, or you can use both. How do you choose? A lot of decisions go into designing how you authorize users. By leveraging AAD, there are fewer things you need to worry about, but you may not get the granularity you need. &lt;/p&gt;

&lt;p&gt;Here are some things to think about when considering where to put your authorization information.&lt;/p&gt;

&lt;h3&gt;
  
  
  In Azure Active Directory
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Control&lt;/strong&gt;: The customer has a lot of control and the app developer has less.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Management&lt;/strong&gt;: Because your customer will be modifying objects in AAD to determine permissions, an AAD administrator may need to be actively engaged when changes are required. For example, if your SAAS application is for Marketing teams, your customer may need to submit requests to IT to change/add permissions in your system. This could add some friction.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Licensing&lt;/strong&gt;: if you're licensing your application per user, you'll need to be aware of how many users your customers are adding. You will need to monitor that and potentially implement a way to true-up.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Complexity&lt;/strong&gt;: There's less for you to build and maintain. Therefore, relying on AAD for permissions can be simpler. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Granularity&lt;/strong&gt;: You will use some built-in AAD features. Depending on your needs, this may suffice, or you may need a more granular system.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost&lt;/strong&gt;: Since you're leveraging your customer's AAD, there may not be a big cost impact storing authorization information there.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Bespoke System
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Control&lt;/strong&gt;: Customer only has the control that the bespoke system allows. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Management&lt;/strong&gt;: Generally, the AAD administrator does not need to be actively involved. They will need to greenlight your app, but then the team that's actually using the application can control permissions. If you're building a SAAS app for marketing teams, the marketing team may need to ask IT to allow your application one time, but from then on they can manage permissions themselves. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Licensing&lt;/strong&gt;: You can more easily control how you do per user licenses because you have access to all the transactions happening in the vendor system.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Complexity&lt;/strong&gt;: There's more for you to build and maintain. Therefore, relying on your own system can be more complex.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Granularity&lt;/strong&gt;: You can build as much granularity as you need and evolve it over time. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost&lt;/strong&gt;: You will need to operate a permissions system. If you use a commercial product to manage permissions there may be a cost associated with it. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let's look at four options for building an authorization system with AAD.&lt;/p&gt;

&lt;h2&gt;
  
  
  Authorization Strategies
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Strategies Leveraging Azure AD
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Enterprise Application Roles
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-add-app-roles-in-azure-ad-apps" rel="noopener noreferrer"&gt;Application Roles&lt;/a&gt; allow app developers to define roles in their application, such as Administrator, Contributor, and Reader. When customers add the application to AAD so they can SSO to it, the customers' AAD admins can add users to your application and assign them to roles. This is a very simple solution for you. It adds more complexity for your customers and provides the least granular set of permissions.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F77kv7nx9jegj5svrscjh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F77kv7nx9jegj5svrscjh.png" alt="Enterprise Application Roles"&gt;&lt;/a&gt;&lt;/p&gt;
Authorization with Enterprise Application Roles
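When AAD issues a token for a user assigned to one of your app roles, the role values appear in a `roles` claim. A minimal server-side check might look like the sketch below; the role names and payload shape here are hypothetical placeholders, not values from any real app registration.

```javascript
// Check whether a decoded AAD token payload carries a given app role.
// The `roles` claim is an array of the role values defined in the app manifest.
function hasAppRole(tokenPayload, requiredRole) {
  const roles = tokenPayload.roles || [];
  return roles.includes(requiredRole);
}

// Example decoded payload; claims abbreviated, role names hypothetical.
const payload = {
  sub: "user-object-id",
  roles: ["MyApp.Reader", "MyApp.Contributor"],
};

console.log(hasAppRole(payload, "MyApp.Contributor")); // true
console.log(hasAppRole(payload, "MyApp.Administrator")); // false
```

In a real app you would run this check after validating the token's signature and audience, typically in authorization middleware.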



&lt;h4&gt;
  
  
  Microsoft Groups
&lt;/h4&gt;

&lt;p&gt;Permissions in your application can be aligned to custom groups in your customers' AAD tenants. &lt;br&gt;
There are &lt;a href="https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/blob/master/5-WebApp-AuthZ/5-2-Groups/README.md" rel="noopener noreferrer"&gt;several ways to retrieve the group information&lt;/a&gt;. You can get the user's groups in the token, or you can query Microsoft Graph to get the groups the user is a member of. There are a few things to think about with groups:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Will your customer be OK with sending information about every group a user is in?&lt;/li&gt;
&lt;li&gt;If you want to query Microsoft Graph to see the members of a particular group, your application will request that permission and the customer's AAD admins will have to &lt;a href="https://docs.microsoft.com/en-us/azure/active-directory/develop/consent-framework" rel="noopener noreferrer"&gt;consent&lt;/a&gt; to that. &lt;/li&gt;
&lt;li&gt;In either case, you should tell customers to create new groups that are reserved just for the roles in your applications. For example, the groups can be called "My-App Administrators", "My-App Contributors", "My-App Readers".&lt;/li&gt;
&lt;li&gt;You won't get the actual group names back from AAD, just their Object ID GUIDs. So you will need a per-tenant lookup somewhere in your application to map the group IDs from your customers' tenants to roles.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdcqjywiyv2xbspudvr3m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdcqjywiyv2xbspudvr3m.png" alt="groups"&gt;&lt;/a&gt;&lt;/p&gt;
Authorization with User Groups
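That per-tenant lookup can be as simple as a map from tenant ID and group Object ID to an application role. The sketch below illustrates the idea; all the tenant and group identifiers are hypothetical placeholders.

```javascript
// Per-tenant map from AAD group Object IDs to application roles.
// All IDs here are hypothetical placeholders.
const groupRoleMap = {
  "contoso-tenant-id": {
    "aaaaaaaa-0000-0000-0000-000000000001": "Administrator",
    "aaaaaaaa-0000-0000-0000-000000000002": "Reader",
  },
};

// Resolve app roles for a user from the group IDs found in their token.
function rolesForUser(tenantId, groupIds) {
  const tenantMap = groupRoleMap[tenantId] || {};
  return groupIds.map((id) => tenantMap[id]).filter(Boolean);
}

console.log(
  rolesForUser("contoso-tenant-id", [
    "aaaaaaaa-0000-0000-0000-000000000002", // mapped to Reader
    "ffffffff-0000-0000-0000-000000000099", // unrelated group, ignored
  ])
); // [ 'Reader' ]
```

In production this map would live in a database keyed by tenant, populated during each customer's onboarding.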



&lt;p&gt;Which should you choose? There's a discussion about Roles vs Groups here: &lt;a href="https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-add-app-roles-in-azure-ad-apps#app-roles-vs-groups" rel="noopener noreferrer"&gt;https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-add-app-roles-in-azure-ad-apps#app-roles-vs-groups&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here's a Microsoft Learn course to get hands on with groups and roles: &lt;a href="https://docs.microsoft.com/en-us/learn/modules/identity-users-groups-approles/" rel="noopener noreferrer"&gt;https://docs.microsoft.com/en-us/learn/modules/identity-users-groups-approles/&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Strategies Leveraging a Bespoke System
&lt;/h3&gt;

&lt;p&gt;AAD gives you a couple of options for integrating with a Bespoke permissions system.&lt;br&gt;
 You may want to use this, rather than AAD, if you need more granular or advanced authorization features like ACLs, Policies and RBAC.&lt;/p&gt;

&lt;h4&gt;
  
  
  Azure AD B2C 3rd Party Claims
&lt;/h4&gt;

&lt;p&gt;You can add authentication with AAD into your application directly in your code, using the &lt;a href="https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-overview" rel="noopener noreferrer"&gt;MSAL library&lt;/a&gt;. You can also use a service that acts as a proxy to many identity providers. Let's say you have customers that use AAD and customers that use Google Workspace. There are various identity services out there that make it easy for you to talk to many identity providers. Microsoft's offering for this is &lt;a href="https://azure.microsoft.com/en-us/services/active-directory/external-identities/b2c/" rel="noopener noreferrer"&gt;Azure AD B2C&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;During the authentication flow, Azure AD B2C can &lt;a href="https://docs.microsoft.com/en-us/azure/active-directory-b2c/custom-policy-rest-api-claims-exchange" rel="noopener noreferrer"&gt;call out to a 3rd party API&lt;/a&gt; to get custom claims. You can build that 3rd party API around your Bespoke Permissions system. B2C can call that API and merge claims from your Bespoke Permissions service into the JWT token it returns to your application. The benefit of this approach is that your application only calls one service, Azure AD B2C, and gets back a consistent set of claims. Azure AD B2C does the hard work of calling multiple identity providers and 3rd party permissions services. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmvrq43zzrda8fej690f0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmvrq43zzrda8fej690f0.png" alt="b2c"&gt;&lt;/a&gt;&lt;/p&gt;
Authorization with Azure AD B2C
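The claims API itself can be small: B2C calls it with the signed-in user's details, and the API responds with a flat JSON object whose properties become output claims in the issued token. The sketch below shows only the claim-building logic; the permissions store, claim names, and default role are all hypothetical assumptions, not part of any B2C contract.

```javascript
// Hypothetical bespoke permissions store, keyed by the user's object ID.
const permissionsStore = {
  "user-1": { role: "Contributor", region: "us-east" },
};

// Build the flat JSON body a claims-exchange API would return on HTTP 200;
// each property is merged into the issued token as an output claim.
// The claim names (appRole, appRegion) are hypothetical.
function buildClaimsResponse(objectId) {
  const perms = permissionsStore[objectId] || { role: "Reader", region: "none" };
  return { appRole: perms.role, appRegion: perms.region };
}

console.log(buildClaimsResponse("user-1")); // { appRole: 'Contributor', appRegion: 'us-east' }
console.log(buildClaimsResponse("unknown-user")); // falls back to the default role
```

You would host logic like this behind an HTTPS endpoint that your B2C custom policy's REST technical profile points at.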


 

&lt;h4&gt;
  
  
  Azure AD Web API
&lt;/h4&gt;

&lt;p&gt;If you have a Bespoke Permissions system and expose it as an API, you can have your application call it directly. You can &lt;a href="https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-protected-web-api-overview" rel="noopener noreferrer"&gt;leverage AAD to make sure calls to the permissions API service are protected&lt;/a&gt;. In this case, you're using &lt;strong&gt;your own&lt;/strong&gt; AAD, not your customers'. It just takes a few steps to make this work.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;In AAD you'll give your application an identity.&lt;/li&gt;
&lt;li&gt;In AAD you'll give your permissions service API an identity.&lt;/li&gt;
&lt;li&gt;In AAD, you'll configure the permissions service API to allow the application to make requests to it.&lt;/li&gt;
&lt;li&gt;In your application, when you make requests to the permissions service, you'll present the identity, so the permissions service knows who's calling and will allow it.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fckcepqji9pyx0tfs3hai.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fckcepqji9pyx0tfs3hai.png" alt="webapi"&gt;&lt;/a&gt;&lt;/p&gt;
Authorization with AAD Web API
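Step 4 is the OAuth 2.0 client-credentials flow: the application posts its client ID and secret to your tenant's token endpoint and receives a token it presents to the permissions API. The sketch below only builds the request (no network call); every ID, the secret, and the scope are placeholders.

```javascript
// Build (but do not send) a client-credentials token request against the
// AAD v2.0 token endpoint. All IDs and the secret below are placeholders.
function buildTokenRequest(tenantId, clientId, clientSecret, scope) {
  const url = `https://login.microsoftonline.com/${tenantId}/oauth2/v2.0/token`;
  const body = new URLSearchParams({
    grant_type: "client_credentials",
    client_id: clientId,
    client_secret: clientSecret,
    scope, // e.g. "api://<permissions-api-app-id>/.default"
  });
  return { url, body: body.toString() };
}

const req = buildTokenRequest(
  "00000000-0000-0000-0000-000000000000", // your (the vendor's) tenant
  "33333333-3333-3333-3333-333333333333", // the application's client ID
  "client-secret-placeholder",
  "api://44444444-4444-4444-4444-444444444444/.default"
);
console.log(req.url);
console.log(req.body.includes("grant_type=client_credentials")); // true
```

In practice you would POST that body with a `Content-Type: application/x-www-form-urlencoded` header, then send the returned access token as a bearer token on calls to the permissions service.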



&lt;h2&gt;
  
  
  In Conclusion
&lt;/h2&gt;

&lt;p&gt;There are other ways to implement authorization and permissions in your application. In this article I discussed four strategies that leverage AAD. There are a lot of elements to consider when deciding which strategy to choose. If using roles and groups inside of AAD is not granular enough, there are many 3rd party authorization systems that you can integrate into your application, including open source projects like &lt;a href="https://casbin.org/" rel="noopener noreferrer"&gt;Casbin&lt;/a&gt;. I hope you found this discussion useful! &lt;/p&gt;

</description>
      <category>azure</category>
      <category>authorization</category>
      <category>authentication</category>
      <category>permissions</category>
    </item>
    <item>
      <title>Easily Create Allow (or Deny) Lists for Azure Resources</title>
      <dc:creator>Mike Richter</dc:creator>
      <pubDate>Mon, 22 Feb 2021 02:49:03 +0000</pubDate>
      <link>https://dev.to/michaelsrichter/easily-create-allow-or-deny-lists-for-azure-resources-pfb</link>
      <guid>https://dev.to/michaelsrichter/easily-create-allow-or-deny-lists-for-azure-resources-pfb</guid>
      <description>&lt;p&gt;In this post, we will use the Azure CLI and the Azure Resource Graph to quickly generate a list of endpoints for Allow or Deny lists. You can find the source code for this example here:&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--i3JOwpme--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/github-logo-ba8488d21cd8ee1fee097b8410db9deaa41d0ca30b004c0c63de0a479114156f.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/michaelsrichter"&gt;
        michaelsrichter
      &lt;/a&gt; / &lt;a href="https://github.com/michaelsrichter/azure-allow-list"&gt;
        azure-allow-list
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      Use node and the Azure CLI to quickly generate allow (or deny) lists.
    &lt;/h3&gt;
  &lt;/div&gt;
&lt;/div&gt;


&lt;h3&gt;
  
  
  What is an Allow List?
&lt;/h3&gt;

&lt;p&gt;An Allow List is a list of endpoints that you will &lt;em&gt;allow&lt;/em&gt; through your firewall. A Deny List is the exact opposite: endpoints you &lt;em&gt;deny&lt;/em&gt; through your firewall. You may be more familiar with the archaic terms, white list and black list. They are essentially the same thing. When I use the term &lt;strong&gt;endpoints&lt;/strong&gt;, I am using it as a generic term to refer to URLs or IP Addresses. &lt;/p&gt;

&lt;h3&gt;
  
  
  Why do I want an Allow List?
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Security
&lt;/h4&gt;

&lt;p&gt;Over the years I've worked with many Microsoft Partners who were helping their internal IT teams prepare to move to Azure. Often some services would live in Azure and some would remain on-premises. The internal IT teams needed a list of all the public endpoints in Azure so they could allow-list them or deny-list them.&lt;/p&gt;

&lt;p&gt;Building this list in the past was complicated. Imagine you have thousands of Azure resources and many of them have public endpoints; it used to be a pain to gather all that information. Almost every Azure service has a public endpoint. Popular services like Azure SQL DB, Azure Storage Accounts, and Azure Web Apps all have public endpoints, and there are many, many more. Some services, like storage accounts, even have multiple public endpoints! &lt;/p&gt;

&lt;h4&gt;
  
  
  Insights and Risks
&lt;/h4&gt;

&lt;p&gt;Even if you're not concerned with creating Allow Lists or Deny Lists, you might be interested to see all of the publicly available endpoints for your Azure services. You might want to track each of them and make sure you understand the risk and risk mitigation efforts for each. &lt;/p&gt;

&lt;h4&gt;
  
  
  Governance
&lt;/h4&gt;

&lt;p&gt;Just by running this tool you might be surprised or even shocked at what you'll find. If you're an administrator for large deployments in Azure, you may have had suspicions that there are a ton of externally available resources. Now you can easily find them all and implement a plan to address them. &lt;/p&gt;

&lt;h3&gt;
  
  
  How do I easily generate an Allow List?
&lt;/h3&gt;

&lt;p&gt;I built some sample code using node to show you how easy it is. Azure has a cross platform CLI and a couple of years ago the team built the &lt;a href="https://docs.microsoft.com/en-us/azure/governance/resource-graph/overview#:~:text=Azure%20Resource%20Graph%20is%20a,can%20effectively%20govern%20your%20environment"&gt;Azure Resource Graph&lt;/a&gt; (ARG). ARG lets you query your Azure Resources and return just the data you need, formatted the way you need. Before ARG came along, it was a real pain to pull all of this information together. &lt;/p&gt;

&lt;p&gt;ARG gets us most of the way there. I wrote some node code for the finishing touches. &lt;/p&gt;
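Those finishing touches amount to flattening the rows an ARG query returns into CSV. The sketch below illustrates the idea; the field names and sample rows are illustrative placeholders, not the exact shape the repo's query returns.

```javascript
// Flatten endpoint rows (as an ARG query might return them) into CSV text.
// Field names here are illustrative, not the repo's exact query output.
function toCsv(rows) {
  const escape = (v) => `"${String(v).replace(/"/g, '""')}"`;
  const header = "name,type,property,value";
  const lines = rows.map((r) =>
    [r.name, r.type, r.property, r.value].map(escape).join(",")
  );
  return [header, ...lines].join("\n");
}

const rows = [
  {
    name: "mystorage",
    type: "microsoft.storage/storageaccounts",
    property: "blobEndpoint",
    value: "https://mystorage.blob.core.windows.net/",
  },
  {
    name: "mywebapp",
    type: "microsoft.web/sites",
    property: "hostName",
    value: "mywebapp.azurewebsites.net",
  },
];
console.log(toCsv(rows));
```

Quoting every field keeps the CSV valid even when a value contains commas or quotes, which some endpoint URLs do.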

&lt;p&gt;To get a CSV file of all your URLs and IP addresses in an Azure subscription, you just need to run a command like this&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm install

node index.js c8faea8e-b5d3-4f31-bc58-f15f4390309a &amp;gt; azure-allow-list.csv
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can open that csv file in Excel and you'll get something that looks like this.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bPq7UxeP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/eriipb8qdjzy7vbzwqrs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bPq7UxeP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/eriipb8qdjzy7vbzwqrs.png" alt="AllowList"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Notice the &lt;strong&gt;value&lt;/strong&gt; column has the URL or the IP Address for your service. You can now take this list and import it into whatever Firewall or Network appliance you're using.&lt;/p&gt;

&lt;p&gt;You can find all the easy steps to get started on the GitHub repo: &lt;a href="https://github.com/michaelsrichter/azure-allow-list"&gt;https://github.com/michaelsrichter/azure-allow-list&lt;/a&gt;. It works great in the &lt;a href="http://shell.azure.com/"&gt;Azure Shell&lt;/a&gt; too! &lt;/p&gt;

&lt;p&gt;Of course, this tool can be improved in many ways. Feel free to make a suggestion or submit your own pull request. And please let me know if you have any thoughts or feedback. You can use the comments below or on GitHub too. Thanks! &lt;/p&gt;

</description>
      <category>azure</category>
      <category>security</category>
      <category>node</category>
      <category>architecture</category>
    </item>
    <item>
      <title>How to Deploy to Azure with Least Privilege</title>
      <dc:creator>Mike Richter</dc:creator>
      <pubDate>Sun, 14 Feb 2021 18:38:36 +0000</pubDate>
      <link>https://dev.to/michaelsrichter/how-to-deploy-to-azure-with-least-privilege-5cjc</link>
      <guid>https://dev.to/michaelsrichter/how-to-deploy-to-azure-with-least-privilege-5cjc</guid>
      <description>&lt;p&gt;In this post we'll walk through the steps you can take to give a  Service Principal a role with "Least Privilege" in Azure. After reading this article you will have a very practical method that you can use over and over again. You will be able to create roles for your Service Principals that will only allow them to deploy specific types of resources and only in the specified scopes.&lt;/p&gt;

&lt;h3&gt;
  
  
  Background
&lt;/h3&gt;

&lt;p&gt;If you build an ARM Template or you get one from a 3rd party software company or services company, you will need permissions to deploy it. If you want to automate the deployment of that ARM Template, you will want to create a Service Principal that will do the deployment for you. Ideally the Service Principal will only have enough permission to deploy that ARM Template and do nothing else. If the Service Principal has broad permissions, like contributor or owner of an entire subscription or resource group, the Service Principal can be exploited. &lt;/p&gt;

&lt;p&gt;For instance, the ARM Template can be altered and more services can be added to it. Or the Service Principal's credentials can be compromised or re-used in other automation pipelines. &lt;/p&gt;

&lt;p&gt;So, how do you build a "Least Privilege" Service Principal with only the permissions that it needs? Let's find out.&lt;/p&gt;

&lt;h3&gt;
  
  
  Concepts
&lt;/h3&gt;

&lt;p&gt;Here are the concepts I will discuss in this article.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://docs.microsoft.com/en-us/azure/active-directory/develop/app-objects-and-service-principals#service-principal-object"&gt;Service Principal&lt;/a&gt; - Essentially a Service Account that you can use to automate Azure.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/overview"&gt;ARM Template&lt;/a&gt; - A declarative json file used for deploying Infrastructure As Code in Azure.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/azure-subscription-service-limits"&gt;Azure Subscription&lt;/a&gt; and &lt;a href="https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/manage-resource-groups-portal"&gt;Resource Groups&lt;/a&gt; are the &lt;strong&gt;scopes&lt;/strong&gt; for where you can deploy Azure services. All Azure services live inside a Resource Group which lives inside a Subscription. You can scope permissions at the individual Resource level, the Resource Group level or for the whole subscription.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.microsoft.com/en-us/azure/role-based-access-control/built-in-roles"&gt;Azure Built-In Roles&lt;/a&gt; and &lt;a href="https://docs.microsoft.com/en-us/azure/role-based-access-control/custom-roles"&gt;Azure Custom Roles&lt;/a&gt;. Roles are what determine what an identity can do in Azure. A user or a service principal doesn't have permission to do anything in Azure until it is assigned a role. Identities can have more than one role. Roles determine what actions you can perform in Azure and within what scope. &lt;strong&gt;Roles are at the heart of what this article is about!&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.microsoft.com/en-us/azure/role-based-access-control/custom-roles#how-to-determine-the-permissions-you-need"&gt;Permissions&lt;/a&gt; - there are thousands of permissions that determine what an identity can do in Azure. Building a role composed of only the minimum required permissions and only within the minimum required scope is how we get to least privilege.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Set Up
&lt;/h3&gt;

&lt;p&gt;Here's what you'll need to follow along. &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;An Azure Subscription with a Resource Group that YOU are the owner of. We are going to create a Service Principal that will be scoped to this Resource Group and this requires that you are an owner of it because you are delegating access to the Resource Group. Note that to create a Resource Group you need to be a Contributor or an Owner of a Subscription. Otherwise an Owner or Contributor will need to create a Resource Group for you and make you an owner.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.microsoft.com/en-us/cli/azure/get-started-with-azure-cli"&gt;The Azure CLI&lt;/a&gt;. We will need two instances of the CLI running. In one instance you will sign in with YOUR credentials to create Service Principals and Roles. In the other instance you will sign in as the Least Privilege Service Principal.&lt;/li&gt;
&lt;li&gt;A simple text editor. I'm using &lt;a href="https://code.visualstudio.com/"&gt;VS Code&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;An ARM Template to deploy. You can find 100s of examples on the &lt;a href="https://azure.microsoft.com/en-us/resources/templates"&gt;Azure Quickstart Templates&lt;/a&gt; site. I am going to use the &lt;a href="https://azure.microsoft.com/en-us/resources/templates/umbraco-webapp-simple/"&gt;Simple Umbraco CMS Web App&lt;/a&gt; Template. It uses various services like Azure App Service, Azure Storage, Azure SQL DB and Application Insights. Our Service Principal should ONLY be able to deploy those services in our chosen Resource Group. If we tried to use the same Service Principal to deploy Virtual Machines or Container Instances, it should fail.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Let's Start
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Create a Service Principal
&lt;/h4&gt;

&lt;p&gt;Sign in to the Azure CLI with your credentials and create a service principal. I will refer to this as your &lt;strong&gt;User CLI&lt;/strong&gt; instance.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az ad sp create-for-rbac -n "leastsp" --skip-assignment
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will return something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "appId": "55555555-5555-5555-5555-555555555555",
  "displayName": "leastsp",
  "name": "http://leastsp",
  "password": "SuPerSecretP@ssw0rd",
  "tenant": "00000000-0000-0000-0000-000000000000"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Make sure to copy and paste these details somewhere; you won't be able to see the password again! &lt;/p&gt;

&lt;p&gt;Right now we have a Service Principal that has no permissions to do anything. It's just an identity in your Azure AD. Let's try to log in to the Azure CLI. &lt;/p&gt;

&lt;p&gt;Start another CLI instance. I will refer to this as the &lt;strong&gt;SP CLI&lt;/strong&gt;. Sign in with this command (of course update the values with &lt;em&gt;YOUR&lt;/em&gt; values):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az login --service-principal -u http://leastsp --tenant 00000000-0000-0000-0000-000000000000 -p SuPerSecretP@ssw0rd
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should get a message back that says &lt;code&gt;No subscriptions found for http://leastsp.&lt;/code&gt; That makes sense; this principal has no permissions to do anything yet! &lt;/p&gt;

&lt;p&gt;We add permissions to a principal by assigning it a &lt;strong&gt;role&lt;/strong&gt;. A role is essentially a container of permissions. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Principals have roles. &lt;/li&gt;
&lt;li&gt;Roles have permissions. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let's look at what roles are assigned to our Service Principal. &lt;/p&gt;

&lt;p&gt;In the &lt;strong&gt;User CLI&lt;/strong&gt; run this command (remember to use &lt;em&gt;YOUR&lt;/em&gt; values):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az role assignment list --assignee 55555555-5555-5555-5555-555555555555
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This returns an empty array &lt;code&gt;[]&lt;/code&gt;. No roles assigned. 😦&lt;/p&gt;

&lt;p&gt;To let the Service Principal log in to your Azure subscription, let's give it a &lt;strong&gt;Reader&lt;/strong&gt; role on the Resource Group that you own. If the Resource Group is called &lt;code&gt;Least&lt;/code&gt;, the ID for that Resource Group will be &lt;code&gt;/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Go back to your 1st CLI (the &lt;strong&gt;User CLI&lt;/strong&gt;), the one where YOU are logged in. Type this in (last reminder to use &lt;em&gt;YOUR&lt;/em&gt; values):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az role assignment create --role Reader --assignee 55555555-5555-5555-5555-555555555555 --scope "/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the &lt;strong&gt;SP CLI&lt;/strong&gt; try logging in again:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az login --service-principal -u http://leastsp --tenant 00000000-0000-0000-0000-000000000000 -p SuPerSecretP@ssw0rd
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This time you should be successful and see something like this get returned:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[
  {
    "cloudName": "AzureCloud",
    "homeTenantId": "00000000-0000-0000-0000-000000000000",
    "id": "11111111-1111-1111-1111-111111111111",
    "isDefault": true,
    "managedByTenants": [],
    "name": "Subscription Name",
    "state": "Enabled",
    "tenantId": "00000000-0000-0000-0000-000000000000",
    "user": {
      "name": "http://leastsp",
      "type": "servicePrincipal"
    }
  }
]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And we should see the one Resource Group that our SP is now a Reader of:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az group list -o table

Name    Location    Status
------  ----------  ---------
least   eastus      Succeeded
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The SP should ONLY be able to see the Resource Group that it is scoped to and nothing else in the subscription or any other subscriptions.&lt;/p&gt;

&lt;p&gt;We can say that &lt;strong&gt;leastsp&lt;/strong&gt; now has the Reader role, scoped to the &lt;strong&gt;Least&lt;/strong&gt; Resource Group.&lt;/p&gt;

&lt;p&gt;Now, let's try to deploy the &lt;a href="https://azure.microsoft.com/en-us/resources/templates/umbraco-webapp-simple/"&gt;Simple Umbraco&lt;/a&gt; template. We'll use the CLI command on that page to do the deployment. Remember to do this in your &lt;strong&gt;SP CLI&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az group deployment create --resource-group least --template-uri https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/umbraco-webapp-simple/azuredeploy.json --parameters '@parameters.json'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As you might expect, we get an error:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{"error":{"code":"AuthorizationFailed","message":"The client '22222222-2222-2222-2222-222222222222' with object id '22222222-2222-2222-2222-222222222222' does not have authorization to perform action 'Microsoft.Resources/deployments/validate/action' over scope '/subscriptions/11111111-1111-1111-1111-111111111111/resourcegroups/least/providers/Microsoft.Resources/deployments/azuredeploy' or the scope is invalid. If access was recently granted, please refresh your credentials."}}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This makes complete sense. Your SP only has a &lt;strong&gt;Reader&lt;/strong&gt; role. We shouldn't expect a Reader to be able to deploy services! That would require the "write" kind of permissions. 😄&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;To deploy this template with least privilege, we will need to create a Role with only the permissions that are required.&lt;/em&gt;&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;Let's begin.&lt;/p&gt;

&lt;h3&gt;
  
  
  Building the Role
&lt;/h3&gt;

&lt;p&gt;This is the main part of the article. Stay with me! 😅&lt;/p&gt;

&lt;p&gt;We will create a role and add all the permissions we need to it. When we try to deploy the ARM Template, we will get an error message (like the one above) telling us about additional permissions that we need. We will add the permissions to the role and try to do the deployment again. We may need to repeat this several times as each step of the deployment reveals new permissions that are needed.&lt;/p&gt;
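Each failed attempt names the missing actions inside its error message, and a few lines of code can pull them out rather than reading the wall of text by hand. A sketch (the sample message below is an abbreviated stand-in for the real output):

```javascript
// Extract the distinct actions named in an Azure authorization error message.
function missingActions(errorMessage) {
  const re = /perform action '([^']+)'/g;
  const actions = new Set();
  let match;
  while ((match = re.exec(errorMessage)) !== null) actions.add(match[1]);
  return [...actions].sort();
}

// Abbreviated sample of the kind of message a failed deployment returns.
const sample =
  "does not have permission to perform action 'Microsoft.Sql/servers/write' at scope 's1'. " +
  "does not have permission to perform action 'Microsoft.Storage/storageAccounts/write' at scope 's2'. " +
  "does not have permission to perform action 'Microsoft.Sql/servers/write' at scope 's3'.";

console.log(missingActions(sample));
// [ 'Microsoft.Sql/servers/write', 'Microsoft.Storage/storageAccounts/write' ]
```

The deduplicated list can then be pasted into the role definition's `actions` array before the next deployment attempt.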

&lt;p&gt;Look at the error message above. It says our Service Principal doesn't have permission to perform &lt;code&gt;Microsoft.Resources/deployments/validate/action&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Let's create a role that has this permission and assign it to our SP. Start by creating a role definition file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "type": "Microsoft.Authorization/roleDefinitions",
    "roleName": "leastprivilegeappdeployer",
    "description": "Least Privilege App Deployer",
    "assignableScopes": [
        "/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least"
    ],
    "name": "leastprivilegeappdeployer",
    "roleType": "CustomRole",
    "permissions": [
        {
            "actions": [
                "Microsoft.Resources/deployments/validate/action"
            ],
            "notActions": [],
            "dataActions": [],
            "notDataActions": []
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Save this file as &lt;strong&gt;role.json&lt;/strong&gt;. You can see that the role is called "leastprivilegeappdeployer" and its assignable scope is our &lt;strong&gt;least&lt;/strong&gt; resource group. The only permission it has is to perform the deployment validation action.&lt;/p&gt;

&lt;p&gt;Let's create this role in your &lt;strong&gt;User CLI&lt;/strong&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az role definition create --role-definition role.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And, again in our &lt;strong&gt;User CLI&lt;/strong&gt;, assign this role to our Service Principal&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az role assignment create --role leastprivilegeappdeployer --assignee 55555555-5555-5555-5555-555555555555 --scope "/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We have now added a role to our SP that allows it to validate deployments. Let's try our deployment again. Switch over to the &lt;strong&gt;SP CLI&lt;/strong&gt;. Before we redeploy, we should log out and log back in to make sure the CLI is updated with the SP's role.&lt;/p&gt;

&lt;p&gt;In the &lt;strong&gt;SP CLI&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az logout

az login --service-principal -u http://leastsp --tenant 00000000-0000-0000-0000-000000000000 -p SuPerSecretP@ssw0rd
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now try the deployment again in the &lt;strong&gt;SP CLI&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az group deployment create --resource-group least --template-uri https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/umbraco-webapp-simple/azuredeploy.json --parameters '@parameters.json'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We should now get this &lt;strong&gt;&lt;em&gt;REALLY LONG&lt;/em&gt;&lt;/strong&gt; error:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{"error":{"code":"InvalidTemplateDeployment","message":"Deployment failed with multiple errors: 'Authorization failed for template resource 'umbracolzybcxduxe526' of type 'Microsoft.Sql/servers'. The client '22222222-2222-2222-2222-222222222222' with object id '22222222-2222-2222-2222-222222222222' does not have permission to perform action 'Microsoft.Sql/servers/write' at scope '/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least/providers/Microsoft.Sql/servers/umbracolzybcxduxe526'.:Authorization failed for template resource 'umbracolzybcxduxe526/umbraco-db' of type 'Microsoft.Sql/servers/databases'. The client '22222222-2222-2222-2222-222222222222' with object id '22222222-2222-2222-2222-222222222222' does not have permission to perform action 'Microsoft.Sql/servers/databases/write' at scope '/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least/providers/Microsoft.Sql/servers/umbracolzybcxduxe526/databases/umbraco-db'.:Authorization failed for template resource 'umbracolzybcxduxe526/AllowAllWindowsAzureIps' of type 'Microsoft.Sql/servers/firewallrules'. The client '22222222-2222-2222-2222-222222222222' with object id '22222222-2222-2222-2222-222222222222' does not have permission to perform action 'Microsoft.Sql/servers/firewallrules/write' at scope '/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least/providers/Microsoft.Sql/servers/umbracolzybcxduxe526/firewallrules/AllowAllWindowsAzureIps'.:Authorization failed for template resource 'lzybcxduxe526standardsa' of type 'Microsoft.Storage/storageAccounts'. 
The client '22222222-2222-2222-2222-222222222222' with object id '22222222-2222-2222-2222-222222222222' does not have permission to perform action 'Microsoft.Storage/storageAccounts/write' at scope '/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least/providers/Microsoft.Storage/storageAccounts/lzybcxduxe526standardsa'.:Authorization failed for template resource 'umbracolzybcxduxe526serviceplan' of type 'Microsoft.Web/serverFarms'. The client '22222222-2222-2222-2222-222222222222' with object id '22222222-2222-2222-2222-222222222222' does not have permission to perform action 'Microsoft.Web/serverFarms/write' at scope '/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least/providers/Microsoft.Web/serverFarms/umbracolzybcxduxe526serviceplan'.:Authorization failed for template resource 'umbracolzybcxduxe526' of type 'Microsoft.Web/Sites'. The client '22222222-2222-2222-2222-222222222222' with object id '22222222-2222-2222-2222-222222222222' does not have permission to perform action 'Microsoft.Web/Sites/write' at scope '/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least/providers/Microsoft.Web/Sites/umbracolzybcxduxe526'.:Authorization failed for template resource 'umbracolzybcxduxe526/MSDeploy' of type 'Microsoft.Web/Sites/Extensions'. The client '22222222-2222-2222-2222-222222222222' with object id '22222222-2222-2222-2222-222222222222' does not have permission to perform action 'Microsoft.Web/Sites/Extensions/write' at scope '/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least/providers/Microsoft.Web/Sites/umbracolzybcxduxe526/Extensions/MSDeploy'.:Authorization failed for template resource 'umbracolzybcxduxe526/connectionstrings' of type 'Microsoft.Web/Sites/config'. 
The client '22222222-2222-2222-2222-222222222222' with object id '22222222-2222-2222-2222-222222222222' does not have permission to perform action 'Microsoft.Web/Sites/config/write' at scope '/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least/providers/Microsoft.Web/Sites/umbracolzybcxduxe526/config/connectionstrings'.:Authorization failed for template resource 'umbracolzybcxduxe526/web' of type 'Microsoft.Web/Sites/config'. The client '22222222-2222-2222-2222-222222222222' with object id '22222222-2222-2222-2222-222222222222' does not have permission to perform action 'Microsoft.Web/Sites/config/write' at scope '/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least/providers/Microsoft.Web/Sites/umbracolzybcxduxe526/config/web'.:Authorization failed for template resource 'umbracolzybcxduxe526serviceplan-scaleset' of type 'microsoft.insights/autoscalesettings'. The client '22222222-2222-2222-2222-222222222222' with object id '22222222-2222-2222-2222-222222222222' does not have permission to perform action 'microsoft.insights/autoscalesettings/write' at scope '/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least/providers/microsoft.insights/autoscalesettings/umbracolzybcxduxe526serviceplan-scaleset'.:Authorization failed for template resource 'umbracolzybcxduxe526-appin' of type 'microsoft.insights/components'. The client '22222222-2222-2222-2222-222222222222' with object id '22222222-2222-2222-2222-222222222222' does not have permission to perform action 'microsoft.insights/components/write' at scope '/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least/providers/microsoft.insights/components/umbracolzybcxduxe526-appin'.'"}}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is scary, but it's also great because it gives us everything we need to fix it! Our SP can now validate the deployment, but the validation shows that we need more permissions. We can pull each of these permissions out of the error message and update our custom role. Let's add them to our &lt;strong&gt;role.json&lt;/strong&gt; file.&lt;br&gt;
&lt;/p&gt;
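&lt;p&gt;If you'd rather not pick the actions out of that wall of text by hand, they can be scraped programmatically. Here's a rough Python sketch (the &lt;code&gt;missing_actions&lt;/code&gt; helper is just an illustration, not part of any Azure SDK):&lt;/p&gt;

```python
import re

def missing_actions(error_text):
    """Collect the unique actions named in the "does not have permission
    to perform action '...'" fragments of an Azure authorization error."""
    return sorted(set(re.findall(r"perform action '([^']+)'", error_text)))
```

&lt;p&gt;Each action it returns goes into the &lt;code&gt;actions&lt;/code&gt; array of &lt;strong&gt;role.json&lt;/strong&gt;.&lt;/p&gt;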

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "type": "Microsoft.Authorization/roleDefinitions",
    "roleName": "leastprivilegeappdeployer",
    "description": "Least Privilege App Deployer",
    "assignableScopes": [
        "/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least"
    ],

    "id": "/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.Authorization/roleDefinitions/33333333-3333-3333-3333-333333333333",
    "name": "33333333-3333-3333-3333-333333333333",
    "roleType": "CustomRole",
    "permissions": [
        {
            "actions": [
                "Microsoft.Resources/deployments/validate/action",
                "Microsoft.Sql/servers/write",
                "Microsoft.Sql/servers/databases/write",
                "Microsoft.Sql/servers/firewallrules/write",
                "Microsoft.Storage/storageAccounts/write",
                "Microsoft.Web/serverFarms/write",
                "Microsoft.Web/Sites/write",
                "Microsoft.Web/Sites/Extensions/write",
                "Microsoft.Web/Sites/config/write",
                "microsoft.insights/autoscalesettings/write",
                "microsoft.insights/components/write"
            ],
            "notActions": [],
            "dataActions": [],
            "notDataActions": []
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You'll see that I also included the &lt;strong&gt;id&lt;/strong&gt; field in the JSON. Azure generated this value for us, and since we are updating an existing role rather than creating a new one, we need to include it going forward. You can get the id via the command &lt;code&gt;az role definition list&lt;/code&gt; in the &lt;strong&gt;User CLI&lt;/strong&gt;. Here's an example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az role definition list --scope /subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least --custom-role-only -n leastprivilegeappdeployer --query [0].id
Result
------------------------------------------------------------------------------------------------------------------------------------------
/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.Authorization/roleDefinitions/33333333-3333-3333-3333-333333333333
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the &lt;strong&gt;User CLI&lt;/strong&gt;, run the role update command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az role definition update --role-definition role.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's try the deployment again.&lt;br&gt;
Back in the &lt;strong&gt;SP CLI&lt;/strong&gt;, log out and log back in so the credentials refresh, then retry the deployment.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az logout

az login --service-principal -u http://leastsp --tenant 00000000-0000-0000-0000-000000000000 -p SuPerSecretP@ssw0rd

az group deployment create --resource-group least --template-uri https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/umbraco-webapp-simple/azuredeploy.json --parameters '@parameters.json'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We are SO close! 😉&lt;/p&gt;

&lt;p&gt;There's another error, but I promise this is the LAST one.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Azure Error: AuthorizationFailed
Message: The client '22222222-2222-2222-2222-222222222222' with object id '22222222-2222-2222-2222-222222222222' does not have authorization to perform action 'Microsoft.Resources/deployments/write' over scope '/subscriptions/11111111-1111-1111-1111-111111111111/resourcegroups/least/providers/Microsoft.Resources/deployments/azuredeploy' or the scope is invalid. If access was recently granted, please refresh your credentials.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Our role just needs the permission to actually write the deployments. Let's update our &lt;strong&gt;role.json&lt;/strong&gt; one more time to give it this permission.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "type": "Microsoft.Authorization/roleDefinitions",
    "roleName": "leastprivilegeappdeployer",
    "description": "Least Privilege App Deployer",
    "assignableScopes": [
        "/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least"
    ],

    "id": "/subscriptions/11111111-1111-1111-1111-111111111111/providers/Microsoft.Authorization/roleDefinitions/33333333-3333-3333-3333-333333333333",
    "name": "33333333-3333-3333-3333-333333333333",
    "roleType": "CustomRole",
    "permissions": [
        {
            "actions": [
                "Microsoft.Resources/deployments/validate/action",
                "Microsoft.Resources/deployments/write",
                "Microsoft.Sql/servers/write",
                "Microsoft.Sql/servers/databases/write",
                "Microsoft.Sql/servers/firewallrules/write",
                "Microsoft.Storage/storageAccounts/write",
                "Microsoft.Web/serverFarms/write",
                "Microsoft.Web/Sites/write",
                "Microsoft.Web/Sites/Extensions/write",
                "Microsoft.Web/Sites/config/write",
                "microsoft.insights/autoscalesettings/write",
                "microsoft.insights/components/write"
            ],
            "notActions": [],
            "dataActions": [],
            "notDataActions": []
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
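&lt;p&gt;If you find yourself repeating this add-permissions-and-update loop, a small script can patch the file for you. A minimal Python sketch (the &lt;code&gt;add_actions&lt;/code&gt; helper is hypothetical, assuming &lt;strong&gt;role.json&lt;/strong&gt; has the shape shown above):&lt;/p&gt;

```python
import json

def add_actions(role_file, new_actions):
    """Append any actions not already present to the role definition's
    first permissions block, preserving the rest of the file."""
    with open(role_file) as f:
        role = json.load(f)
    actions = role["permissions"][0]["actions"]
    for action in new_actions:
        if action not in actions:
            actions.append(action)
    with open(role_file, "w") as f:
        json.dump(role, f, indent=4)
```

&lt;p&gt;After patching the file, run &lt;code&gt;az role definition update --role-definition role.json&lt;/code&gt; as before.&lt;/p&gt;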



&lt;p&gt;In the &lt;strong&gt;User CLI&lt;/strong&gt;, run the role update command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az role definition update --role-definition role.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And now, in the &lt;strong&gt;SP CLI&lt;/strong&gt;, log out, log back in and run the deployment.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az logout

az login --service-principal -u http://leastsp --tenant 00000000-0000-0000-0000-000000000000 -p SuPerSecretP@ssw0rd

az group deployment create --resource-group least --template-uri https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/umbraco-webapp-simple/azuredeploy.json --parameters '@parameters.json'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If it worked, congratulations! 👏 You built a role scoped to a single Resource Group with the fewest permissions required to deploy this ARM template. You assigned the role to a Service Principal and ran the deployment. You can visit the web app that was created and start configuring Umbraco. Awesome!&lt;/p&gt;

&lt;p&gt;Remember, you can follow these steps with any other ARM Template you want to use.&lt;/p&gt;

&lt;p&gt;If it didn't work, let me know what went wrong and I'd be happy to help you figure it out.&lt;/p&gt;

&lt;h3&gt;
  
  
  Clean Up
&lt;/h3&gt;

&lt;p&gt;When you're done trying this out, remember to delete your Service Principal, your Custom Role and your Resource Group with all the resources from this template. In the &lt;strong&gt;User CLI&lt;/strong&gt; you can use these commands with the appropriate flags:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az ad sp delete 

az role definition delete

az group delete
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Final Thoughts
&lt;/h3&gt;

&lt;p&gt;Providing your operations team, your devops pipeline or your customers with an ARM Template is only a part of a secure automated deployment process. You should also provide a role that has the least amount of privileges to deploy that ARM Template. If a new service gets added to the ARM Template, the role should also be updated to reflect that change. Adopting this process helps reduce risk and exposure, especially from a security, compliance and cost control perspective.&lt;/p&gt;

&lt;p&gt;If you're building roles that will be doing automated deployments, you can assume that you'll need the &lt;code&gt;Microsoft.Resources/deployments/validate/action&lt;/code&gt; and the &lt;code&gt;Microsoft.Resources/deployments/write&lt;/code&gt; permissions, so those are a safe starting point for any deployment role.&lt;/p&gt;
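&lt;p&gt;As a starting point, a baseline &lt;strong&gt;role.json&lt;/strong&gt; for a brand-new deployment role might look like this sketch (the subscription id and resource group mirror the placeholder values used throughout this post; swap in your own):&lt;/p&gt;

```json
{
    "roleName": "starterdeployer",
    "description": "Baseline role for automated ARM template deployments",
    "assignableScopes": [
        "/subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/least"
    ],
    "permissions": [
        {
            "actions": [
                "Microsoft.Resources/deployments/validate/action",
                "Microsoft.Resources/deployments/write"
            ],
            "notActions": [],
            "dataActions": [],
            "notDataActions": []
        }
    ]
}
```

&lt;p&gt;From there, validate, read the authorization errors and add only the actions your template actually needs.&lt;/p&gt;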

&lt;p&gt;I hope this was a helpful tutorial and that you'll be able to use this to secure your automated deployments in the future! If you have any feedback or suggestions please share them or leave them in the comments. &lt;/p&gt;

&lt;p&gt;Thanks! &lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>security</category>
      <category>cloud</category>
    </item>
    <item>
      <title>The Cloud Solution Architect Checklist</title>
      <dc:creator>Mike Richter</dc:creator>
      <pubDate>Mon, 08 Feb 2021 02:12:29 +0000</pubDate>
      <link>https://dev.to/michaelsrichter/the-cloud-solution-architect-checklist-2fm6</link>
      <guid>https://dev.to/michaelsrichter/the-cloud-solution-architect-checklist-2fm6</guid>
      <description>&lt;p&gt;As a Cloud Solution Architect, I help Microsoft's partners build solutions on Azure. Often the way forward is not clear and using a checklist can help point us in the right direction. Here's a discussion of what checklists are and why they're useful.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why do we need a checklist?
&lt;/h3&gt;

&lt;p&gt;Here are some interesting health care stats that will set the stage.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Hospital doctors evaluate &lt;strong&gt;250&lt;/strong&gt; different diseases and conditions per year.&lt;/li&gt;
&lt;li&gt;Intensive Care Unit (ICU) patients require, on average, &lt;strong&gt;178&lt;/strong&gt; daily tasks, and about half of these patients experience a complication.&lt;/li&gt;
&lt;li&gt;In one study of &lt;strong&gt;41,000&lt;/strong&gt; trauma patients:

&lt;ul&gt;
&lt;li&gt;Unique injury diagnoses: &lt;strong&gt;1,224&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Combinations of diagnoses: &lt;strong&gt;32,261&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;In 2006, there were &lt;strong&gt;230 million&lt;/strong&gt; surgeries globally:

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;3% - 17%&lt;/strong&gt; had complications.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;7 million&lt;/strong&gt; people died.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;What do these stats have to do with checklists and cloud architecture?&lt;/p&gt;

&lt;p&gt;First, all of these statistics come from the excellent book "&lt;a href="https://www.amazon.com/Checklist-Manifesto-How-Things-Right/dp/0312430000" rel="noopener noreferrer"&gt;The Checklist Manifesto&lt;/a&gt;" by Atul Gawande.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgduuzz8jq6tbffjb8ld4.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgduuzz8jq6tbffjb8ld4.PNG" alt="Checklist Manifest Book Cover"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When you look at these stats, you can see how complicated it is to diagnose and treat healthcare issues. &lt;/p&gt;

&lt;p&gt;It's also very complicated to design cloud architectures. &lt;/p&gt;

&lt;p&gt;Now, to be clear, &lt;em&gt;&lt;strong&gt;I AM NOT&lt;/strong&gt;&lt;/em&gt; suggesting that cloud architecture is as complicated as diagnosing and treating healthcare issues. Nor am I suggesting the problems of cloud architecture are as serious or important as saving people's lives. I am merely suggesting that when things get complicated a checklist can come in handy.&lt;/p&gt;

&lt;p&gt;Here's an eye chart of the Cloud Native ecosystem from the &lt;a href="https://landscape.cncf.io/" rel="noopener noreferrer"&gt;CNCF&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fp42fhxepw7sy3kjbnv0i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fp42fhxepw7sy3kjbnv0i.png" alt="Cloud Native Ecosystem"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Think about this eye chart and then think about every other technology domain – security, data, identity, monitoring, etc. They have their own eye charts. There are thousands of options in near infinite combinations. An architecture, especially an existing/legacy one, is going to be a Frankenstein of all these different services. And, as an architect, it’s your job to design and/or validate these architectures for your cloud platform. &lt;/p&gt;

&lt;p&gt;And just think about how complicated that cloud platform is on its own! I work at Microsoft, and Azure itself has hundreds of different services. Check out this eye chart from &lt;a href="https://azurecharts.com/" rel="noopener noreferrer"&gt;Azure Charts&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fsyb662b4v0fgommf82hg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fsyb662b4v0fgommf82hg.png" alt="Azure Eye Chart"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  How can a checklist help?
&lt;/h3&gt;

&lt;p&gt;So, how can a checklist help with all this complexity? &lt;/p&gt;

&lt;p&gt;Let's go back to those healthcare examples and look at what happened when simple checklists were implemented.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In 2001, Johns Hopkins Hospital implemented a &lt;strong&gt;5-question&lt;/strong&gt; checklist for ICU doctors. After 15 months, they had prevented &lt;strong&gt;43&lt;/strong&gt; infections and &lt;strong&gt;8&lt;/strong&gt; deaths, and saved &lt;strong&gt;$2 million&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;In 2009, 8 hospitals piloted a 2-minute, 19-question surgery checklist. After 3 months, surgery complications fell &lt;strong&gt;36%&lt;/strong&gt; and deaths fell &lt;strong&gt;47%&lt;/strong&gt;.
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let's be clear: 5 questions saved a hospital two million dollars over 15 months. Spending 2 minutes at the beginning of a surgery cut the number of deaths in half! &lt;/p&gt;

&lt;p&gt;Again, these impressive stats come from "The Checklist Manifesto." A checklist is just a few questions asked before starting a process, and it forces you to check the same things every time. Without one, you rely on memory alone, and forgetting a check just once can have a tremendously bad impact. Checklists are used in plenty of places besides healthcare: airplane pilots use them, and so do &lt;em&gt;non-Cloud&lt;/em&gt; architects and investors. &lt;/p&gt;

&lt;p&gt;Checklists are a great tool for avoiding errors and for lowering stress. There's a great &lt;a href="https://freakonomics.com/podcast/atul-gawande/" rel="noopener noreferrer"&gt;Freakonomics podcast episode&lt;/a&gt; featuring Atul Gawande, the author of The Checklist Manifesto, if you want to learn more about the power of checklists. The book is also a great quick read! &lt;/p&gt;

&lt;h3&gt;
  
  
  My Checklist
&lt;/h3&gt;

&lt;p&gt;Last year I worked with my colleagues at Microsoft, who are among some of the best partner architects, to compile the checklists we use when working with Microsoft Partners. We shared our &lt;a href="https://github.com/Azure/Solution-Architecture-Questions" rel="noopener noreferrer"&gt;Solution Architecture Questions&lt;/a&gt; (SAQ) on Github. Anyone is free to use this list, share it and contribute to it. Some of the questions are Azure partnership specific, but they can be used when designing architectures in any cloud environment. &lt;/p&gt;

&lt;p&gt;I also started working on a web app for finding these questions, asking them and recording the answers. You can find it at &lt;a href="https://www.saq.monster/" rel="noopener noreferrer"&gt;https://saq.monster&lt;/a&gt;. Once you create an account, this web app runs completely in the browser. It leverages &lt;a href="https://docs.microsoft.com/en-us/aspnet/core/blazor/host-and-deploy/webassembly?view=aspnetcore-5.0" rel="noopener noreferrer"&gt;Blazor Web Assembly&lt;/a&gt; and &lt;a href="https://azure.microsoft.com/en-us/services/app-service/static/" rel="noopener noreferrer"&gt;Azure Static Web Apps&lt;/a&gt;. So whatever data you put into the app remains in your browser only. Eventually I may build a cloud sync service behind it, but for now it remains completely client-only. &lt;/p&gt;

&lt;p&gt;What are your thoughts on checklists and the list we built? Are there strong opinions out there about using checklists in the field of cloud architecture? Do individuals have their own lists? I would love to hear your thoughts on this!&lt;/p&gt;

</description>
      <category>architecture</category>
      <category>cloud</category>
      <category>azure</category>
    </item>
    <item>
      <title>When Their Stuff Doesn't Work With Your Stuff</title>
      <dc:creator>Mike Richter</dc:creator>
      <pubDate>Sun, 31 Jan 2021 21:08:14 +0000</pubDate>
      <link>https://dev.to/michaelsrichter/when-their-stuff-doesn-t-work-with-your-stuff-58gm</link>
      <guid>https://dev.to/michaelsrichter/when-their-stuff-doesn-t-work-with-your-stuff-58gm</guid>
      <description>&lt;p&gt;Here's the situation: There is an open source &lt;strong&gt;library&lt;/strong&gt; that you love to use that works with an open source &lt;strong&gt;platform&lt;/strong&gt; that your solution is built on. You use this &lt;strong&gt;library&lt;/strong&gt; all the time and it's part of your standard set of tools for operating and managing the &lt;strong&gt;platform&lt;/strong&gt;. Now you move the solution to a new cloud vendor which provides you the &lt;strong&gt;platform&lt;/strong&gt; as a managed service. But, unfortunately the &lt;strong&gt;library&lt;/strong&gt; you know and love no longer works in this new environment! &lt;/p&gt;

&lt;p&gt;Let's explore this very real situation that unfortunately comes up too regularly. At the heart of this common problem is the risk that is part of any non-trivial software system. &lt;/p&gt;

&lt;p&gt;There are 3 parts to this puzzle.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The Open Source Software or Platform itself.&lt;/li&gt;
&lt;li&gt;The Cloud Flavor of this Open Source Software or Platform.&lt;/li&gt;
&lt;li&gt;The Ecosystem of tools and libraries around the Open Source Software or Platform.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Note that I won't cite any specific examples, to protect the names of the innocent :). &lt;/p&gt;

&lt;h3&gt;
  
  
  Cloud Flavors of Open Source
&lt;/h3&gt;

&lt;p&gt;Modern software architecture means using lots of different solutions from a lot of different sources. We use cloud services, 3rd party applications and open-source software. Making things more complicated, many cloud providers have their own flavors of popular open-source software. For instance: AWS MySQL vs GCP MySQL vs Azure MySQL. You may be thinking: 'So what? MySQL is MySQL is MySQL.' And, likewise, 'Kubernetes is just Kubernetes.' Yes, but (and there is a big BUT): each platform brings its own features and integrates them into its own offerings. &lt;/p&gt;

&lt;p&gt;These different &lt;em&gt;Cloud Flavors&lt;/em&gt; often require specific adjustments, tweaking and testing. Usually it is not the core workload that varies among these Cloud Flavors. Rather, it is in the operationalization (observability, security, fault tolerance, etc.) where there can be lots of differences.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--d2Jy59PJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/74txpw9i503mj9tmpt2g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--d2Jy59PJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/74txpw9i503mj9tmpt2g.png" alt="Open Source Flavors on Different Cloud Vendors"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Open Source Ecosystem around Open Source
&lt;/h3&gt;

&lt;p&gt;Popular Open Source software, especially platforms, often has an ecosystem of other Open Source tools around it. Libraries, plugins and connectors that provide additional features and integrations are widely used by the community. Think of WordPress and the &lt;a href="https://wordpress.org/plugins/"&gt;tens of thousands of plugins&lt;/a&gt; available for it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Ae2rMBS_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/zkkltdzczzar4avww75n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Ae2rMBS_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/zkkltdzczzar4avww75n.png" alt="Open Source Ecosystem"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So cloud vendors have their own implementations of open source software. That open source software has its own ecosystem of open source tools. Everything should all work together, right? Ideally, yes, but that is not always the case. And when things don't work as expected who is responsible for fixing it?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0zVhc0H_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/00winnbqfycjfji24bjz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0zVhc0H_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/00winnbqfycjfji24bjz.png" alt="Open Source Ecosystem and the Cloud"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Look at the image above. Suppose you want to use "Tool 1" with Open Source Software on Cloud Vendor C and it doesn't work as expected. &lt;/p&gt;

&lt;h3&gt;
  
  
  Who is Responsible?
&lt;/h3&gt;

&lt;p&gt;Who is responsible for making sure the tool works?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The Open Source Tool 1 maintainers?&lt;/li&gt;
&lt;li&gt;The Open Source Software maintainers?&lt;/li&gt;
&lt;li&gt;The Cloud Vendor C engineers?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We know it's a problem for you when you want to get your solution working. Who are you likely to expect to fix it? &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;My experience tells me that customers believe it is the Cloud Vendor who should be responsible.&lt;/strong&gt; After all, the tool works with the platform in the non-managed service. The only thing different is the cloud vendor's implementation. But it is not that simple.&lt;/p&gt;

&lt;p&gt;Usually the cloud vendor is building a specific implementation to simplify management of the software, and that can come with some trade-offs or lack of support for some integrations. Like WordPress, popular open source software and platforms can have an ecosystem of dozens, hundreds or thousands of libraries and plugins. How is the Cloud Vendor supposed to test them all, let alone support them?&lt;/p&gt;

&lt;h3&gt;
  
  
  Look to the Docs!
&lt;/h3&gt;

&lt;p&gt;Often you will find documentation and support in various places, and that can help us figure out who should be responsible.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The tool maintainers may provide guidance for installing and running the tool on different cloud vendors. If they do that, then they should encourage the community to open bugs when problems are discovered.&lt;/li&gt;
&lt;li&gt;The cloud vendor may also provide specific documentation on how to use its service with popular 3rd party plugins and tools. If they provide this documentation, you can assume it's supported and customers should be able to submit support tickets when things don't work as expected.&lt;/li&gt;
&lt;li&gt;Rarely would the Open Source platform itself be responsible for fixing how 3rd party solutions integrate or operationalize the software. &lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Where is the Risk?
&lt;/h3&gt;

&lt;p&gt;If the cloud vendor doesn't provide specific documentation, you may be using that tool at your own risk. How much risk is there?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Does the tool project website say they specifically support the cloud vendor? That's good.&lt;/li&gt;
&lt;li&gt;Does the cloud vendor say they support the tool and provide documentation? That's even better.&lt;/li&gt;
&lt;li&gt;If the cloud vendor doesn't support it, things may work today, but may not work tomorrow. When the cloud vendor makes a change, they won't test it against unsupported 3rd party tools. If a change breaks the tool, you'll have to wait for the tool maintainers to update it (or submit a pull request with your own fix!).&lt;/li&gt;
&lt;li&gt;If the tool is focused more on the core workload of the platform, there is generally less risk.&lt;/li&gt;
&lt;li&gt;If the tool is focused on the operationalization of the platform, there is generally more risk.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Final Thoughts
&lt;/h3&gt;

&lt;p&gt;Always try to use supported libraries and tools when building solutions. The open source ecosystem combined with open source platforms and a cloud flavor, can create infinite variations. It may be hard to always build a solution with tools that are 100% supported in all these environments. Make explicit decisions and document what tools you use and for what reasons. &lt;a href="https://www.castsoftware.com/research-labs/risk-management-in-software-development-and-software-engineering-projects"&gt;Calculate the risk and have a mitigation plan&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;Finally, as a Cloud Architect, I look at the world of opinionated software development to get a perspective on this. Think of &lt;a href="https://spring.io/projects/spring-boot"&gt;Java Spring Boot&lt;/a&gt; and how it lets you get productive with "minimal fuss." If you choose a Cloud Flavor of an open source platform, why not also choose the built-in capabilities and supported tools? You're choosing a managed service to reduce complexity and ultimately save costs, so try to embrace its "opinion" on what tools you should use with it.&lt;/p&gt;

&lt;p&gt;Let me know what you think! &lt;/p&gt;

</description>
      <category>architecture</category>
      <category>opensource</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Parameters vs Architecture</title>
      <dc:creator>Mike Richter</dc:creator>
      <pubDate>Mon, 25 Jan 2021 02:29:28 +0000</pubDate>
      <link>https://dev.to/michaelsrichter/parameters-vs-architecture-1kfo</link>
      <guid>https://dev.to/michaelsrichter/parameters-vs-architecture-1kfo</guid>
      <description>&lt;p&gt;As an architect that works with Microsoft partners, I often get direct perspectives and opinions about Azure. These insights are sometimes obvious to me but are completely new to partners. Here's something that happened recently and I thought I'd share to help all architects everywhere on their cloud journey.&lt;/p&gt;

&lt;h3&gt;
  
  
  Automation
&lt;/h3&gt;

&lt;p&gt;We all know that the cloud makes it easy to spin up some service, try it out and shut it down when we're done.  The cloud also makes it easy to automate doing these things. If we build some parameterization into our automation then it's easy to spin up services for different kinds of environments. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Basic/cheap tiers for dev and test, and &lt;/li&gt;
&lt;li&gt;Premium/expensive tiers for production.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Easy, right? Not so fast!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The reality is that many times there are architecture implications based on what SKU or tier of a service you are going to deploy.&lt;/strong&gt; The basic tier usually does not support all the features and integrations of a premium tier, which makes building automation a little more complicated. You can't just swap out parameter values; you will need to adjust other architecture components.&lt;/p&gt;

&lt;h3&gt;
  
  
  An Example
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://azure.microsoft.com/en-us/services/mysql/"&gt;Azure Database for MySQL&lt;/a&gt; is a PAAS database service from Azure. It comes in &lt;a href="https://docs.microsoft.com/en-us/azure/mysql/concepts-pricing-tiers"&gt;several tiers&lt;/a&gt; - basic, general purpose and memory optimized.  The basic tier does not allow for a feature called Service Endpoints.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.microsoft.com/en-us/azure/virtual-network/virtual-network-service-endpoints-overview"&gt;Service Endpoints&lt;/a&gt; are a way for you to lock down who can access the service. With a service endpoint, you can say, &lt;em&gt;'only allow traffic from this subnet in this network to access my service.'&lt;/em&gt; That's a useful feature in a production scenario when you want to really restrict who can access the service. There are other ways to lock down access but that is beyond the scope of this article.&lt;/p&gt;

&lt;p&gt;If you set up a Basic MySQL database and try to connect from a VNET that has a Service Endpoint enabled, you will get an error.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hcSvTNvf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/96f52pm3xsgtl9b8notf.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hcSvTNvf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/96f52pm3xsgtl9b8notf.PNG" alt="Service Endpoint Error"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;But if you set up a General Purpose MySQL database and try to connect from a VNET that has a Service Endpoint enabled, it works.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lbdI1iqd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/puov7h3w2tw3z9shcv8z.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lbdI1iqd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/puov7h3w2tw3z9shcv8z.PNG" alt="Service Endpoint Works"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can build an &lt;a href="https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/overview"&gt;Azure Resource Manager (ARM) Template&lt;/a&gt; to automate deploying these services. But when you want to deploy to a real production environment with a Service Endpoint, you will need to add additional components. See the example below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;        {
            "type": "Microsoft.DBforMySQL/servers/virtualNetworkRules",
            "apiVersion": "2017-12-01",
            "name": "[concat(parameters('servers_mrichtermysqlgeneral_name'), '/serviceendpoint')]",
            "dependsOn": [
                "[resourceId('Microsoft.DBforMySQL/servers', parameters('servers_mrichtermysqlgeneral_name'))]"
            ],
            "properties": {
                "virtualNetworkSubnetId": "[concat(parameters('virtualNetworks_hcm_perf_vnet_eastus_externalid'), '/subnets/serviceendpoint')]",
                "ignoreMissingVnetServiceEndpoint": false
            }
        }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You need a way to build automation that includes this component for production environments and excludes it for non-production ones. That's a little more complicated than just swapping parameters when deploying this ARM template.&lt;/p&gt;

&lt;p&gt;Obviously this is a simplistic example. You can imagine how a solution with many different services can get much more complicated.&lt;/p&gt;

&lt;h3&gt;
  
  
  Just Be Aware
&lt;/h3&gt;

&lt;p&gt;As you can see, targeting different environments can have an impact on your automation. Azure does give you some ways to address this. ARM templates support conditions, logical functions, and nested templates. You can build a template that says, &lt;em&gt;'if MySQL is the Basic tier, use the dev template; otherwise use the prod template.'&lt;/em&gt; You can learn more about &lt;a href="https://azure.microsoft.com/en-us/blog/create-flexible-arm-templates-using-conditions-and-logical-functions/"&gt;flexible ARM templates from this blog article&lt;/a&gt;. &lt;/p&gt;
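&lt;p&gt;As a sketch of what that could look like (the parameter names here are hypothetical, not taken from the template above): adding a &lt;code&gt;condition&lt;/code&gt; element to the virtual network rule resource tells ARM to deploy it only when an environment parameter says so.&lt;/p&gt;

```json
{
    "condition": "[equals(parameters('environment'), 'prod')]",
    "type": "Microsoft.DBforMySQL/servers/virtualNetworkRules",
    "apiVersion": "2017-12-01",
    "name": "[concat(parameters('serverName'), '/serviceendpoint')]",
    "dependsOn": [
        "[resourceId('Microsoft.DBforMySQL/servers', parameters('serverName'))]"
    ],
    "properties": {
        "virtualNetworkSubnetId": "[parameters('subnetId')]",
        "ignoreMissingVnetServiceEndpoint": false
    }
}
```

&lt;p&gt;With &lt;code&gt;environment&lt;/code&gt; set to anything other than &lt;code&gt;prod&lt;/code&gt;, the service endpoint rule is simply skipped.&lt;/p&gt;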

&lt;p&gt;So, yes: it is possible to create a single set of automation for a solution that targets different service tiers, utilizing different features for different environments. It's just a little bit more complicated than swapping parameter values. Be aware of this when planning your Azure cloud journey.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>architecture</category>
      <category>configuration</category>
      <category>automation</category>
    </item>
    <item>
      <title>Azure Files Transactions Experiments</title>
      <dc:creator>Mike Richter</dc:creator>
      <pubDate>Mon, 18 Jan 2021 21:29:24 +0000</pubDate>
      <link>https://dev.to/michaelsrichter/azure-files-transactions-experiments-2mfl</link>
      <guid>https://dev.to/michaelsrichter/azure-files-transactions-experiments-2mfl</guid>
      <description>&lt;p&gt;What are Azure File transactions? Can we measure them? Are all transactions equal? Let's find out! &lt;/p&gt;

&lt;h3&gt;
  
  
  Background
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://docs.microsoft.com/en-us/azure/storage/files/storage-files-introduction" rel="noopener noreferrer"&gt;Azure Files&lt;/a&gt; is a file share service from Azure. It comes in several tiers: Premium, Transaction Optimized, Hot and Cool. All tiers charge you based on the amount of data in the file share. If you use any tier besides Premium, &lt;strong&gt;you are charged for transactions as well&lt;/strong&gt;. You can find more details on the &lt;a href="https://azure.microsoft.com/en-us/pricing/details/storage/files/" rel="noopener noreferrer"&gt;Azure Files pricing page&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;For Transaction Optimized (previously known as the Standard tier), &lt;strong&gt;Write&lt;/strong&gt; transactions cost $0.015 per 10,000.  Straightforward, right? &lt;/p&gt;

&lt;p&gt;But, &lt;strong&gt;what is a transaction&lt;/strong&gt; and how do you know when they happen and what causes them? A penny-and-a-half per 10 thousand transactions sounds reasonable, but do you know how many transactions your workload is going to need? Are all transactions equal? If I write a 1 MB file, is that the same as writing a 1 GB file when it comes to transactions? &lt;/p&gt;
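&lt;p&gt;To put that rate in perspective, here's a quick back-of-the-envelope calculation (a sketch: only the $0.015 per 10,000 rate comes from the pricing page; the workload size is made up):&lt;/p&gt;

```python
# Rough cost of Azure Files write transactions at the
# Transaction Optimized rate of $0.015 per 10,000 writes.
PRICE_PER_10K_WRITES = 0.015  # USD

def write_transaction_cost(num_writes: int) -> float:
    """Return the USD cost for a given number of write transactions."""
    return num_writes / 10_000 * PRICE_PER_10K_WRITES

# A hypothetical workload writing a million times a month:
print(f"${write_transaction_cost(1_000_000):.2f}")  # $1.50
```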

&lt;h3&gt;
  
  
  Existing Documentation
&lt;/h3&gt;

&lt;p&gt;There is some documentation on what &lt;a href="https://docs.microsoft.com/en-us/azure/storage/files/understanding-billing#what-are-transactions" rel="noopener noreferrer"&gt;storage transactions are&lt;/a&gt;, but this doesn't give you a complete picture. To be fair to the Azure team, it can be difficult to give a complete picture. The Azure Files service can be used in multiple ways, each utilizing different transactions in different ways. For example, you can connect to Azure Files over SMB, NFS, or the REST API, and each will have an impact on which transactions happen and how often.&lt;/p&gt;

&lt;h3&gt;
  
  
  My Questions
&lt;/h3&gt;

&lt;p&gt;I wanted to see if there's a way to reduce transactions and optimize costs. Would writing fewer, larger files reduce transactions and be cheaper than writing many smaller files? &lt;/p&gt;

&lt;p&gt;I did some of my own experiments. Here's what I was looking for: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;I wanted to see what the transaction impact is when I add files of different sizes at different times. &lt;/li&gt;
&lt;li&gt;I also wanted to see if files with different sizes have an impact on transactions.&lt;/li&gt;
&lt;li&gt;How could I see transactions per file share within a storage account? In the Azure Portal, you can only see transaction metrics aggregated at the account level, but you can have many file shares per storage account. Is there a way to see transactions per file share?&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Set Up
&lt;/h3&gt;

&lt;p&gt;I created a General Purpose v2 Storage Account in Azure and called it &lt;code&gt;transactiontests&lt;/code&gt;. Within the storage account I created a Transaction Optimized File Share and called it &lt;code&gt;fileshare1&lt;/code&gt;. The URI for the fileshare is &lt;code&gt;\\transactiontests.file.core.windows.net\fileshare1\&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Azure Storage accounts collect diagnostic logs and allow you to send them somewhere you can inspect them. &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fuktqsuzajvo7y25f85yf.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fuktqsuzajvo7y25f85yf.PNG" alt="diagnosticMenuItem"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I enabled Diagnostic Settings to write the logs to an Azure Log Analytics account. I used Log Analytics to query for transactions that were happening in specific time frames. &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fygkboopd4l1d02gkxef3.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fygkboopd4l1d02gkxef3.PNG" alt="diagnosticSettings"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I created a Windows VM in the same region as my storage account and I mounted the file share to the Windows VM via SMB. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fjp466vpitrf3sc1m7z3m.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fjp466vpitrf3sc1m7z3m.PNG" alt="SMBMount"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I created dummy files of various sizes using this &lt;a href="https://pinetools.com/random-file-generator" rel="noopener noreferrer"&gt;Online Random File Generator&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Now I had everything I needed to run my experiments.&lt;/p&gt;
&lt;h3&gt;
  
  
  The Experiments
&lt;/h3&gt;

&lt;p&gt;I added different files of different sizes to the file share over time, starting one by one: I added a file, then waited a minute for Diagnostic Settings to ship the logs to the Log Analytics workspace. Then, in the Log Analytics portal experience, I would run a query to see what transactions were logged.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;StorageFileLogs
| where TimeGenerated &amp;gt; now(-3m)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here's an example of the output&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fvzhwkzqjgqrk4om4l439.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fvzhwkzqjgqrk4om4l439.PNG" alt="loganalyticssampleoutput"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I kept track of &lt;strong&gt;Write&lt;/strong&gt; transactions since I was adding files to the File share. Any other transactions I just considered 'overhead.'&lt;/p&gt;

&lt;p&gt;I added a 1 KB file, a 100 KB file, 1 MB file and so on. You can see the table below in the &lt;strong&gt;Results&lt;/strong&gt; section for all the file sizes and the resulting transaction values.&lt;/p&gt;

&lt;p&gt;After adding each file one by one, I added three copies of files of the same size to the file share. I was curious to see if there was any decrease in overhead when doing multiple operations at the same time. &lt;/p&gt;

&lt;h3&gt;
  
  
  Results
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Transactions Per File Count and Size
&lt;/h4&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;# of Files&lt;/th&gt;
&lt;th&gt;File Size (MB)&lt;/th&gt;
&lt;th&gt;# of Transactions&lt;/th&gt;
&lt;th&gt;# of Writes&lt;/th&gt;
&lt;th&gt;Writes per MB&lt;/th&gt;
&lt;th&gt;Transaction Overhead&lt;/th&gt;
&lt;th&gt;Overhead / File&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;0.001&lt;/td&gt;
&lt;td&gt;34&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;33&lt;/td&gt;
&lt;td&gt;33.0&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;0.100&lt;/td&gt;
&lt;td&gt;48&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;47&lt;/td&gt;
&lt;td&gt;47.0&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;1.000&lt;/td&gt;
&lt;td&gt;38&lt;/td&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;34&lt;/td&gt;
&lt;td&gt;34.0&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;5.000&lt;/td&gt;
&lt;td&gt;48&lt;/td&gt;
&lt;td&gt;10&lt;/td&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;38&lt;/td&gt;
&lt;td&gt;38.0&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;100.000&lt;/td&gt;
&lt;td&gt;136&lt;/td&gt;
&lt;td&gt;100&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;36&lt;/td&gt;
&lt;td&gt;36.0&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;1024.000&lt;/td&gt;
&lt;td&gt;1057&lt;/td&gt;
&lt;td&gt;1024&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;33&lt;/td&gt;
&lt;td&gt;33.0&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;0.001&lt;/td&gt;
&lt;td&gt;53&lt;/td&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;50&lt;/td&gt;
&lt;td&gt;16.7&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;0.100&lt;/td&gt;
&lt;td&gt;58&lt;/td&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;55&lt;/td&gt;
&lt;td&gt;18.3&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;1.000&lt;/td&gt;
&lt;td&gt;62&lt;/td&gt;
&lt;td&gt;12&lt;/td&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;50&lt;/td&gt;
&lt;td&gt;16.7&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;5.000&lt;/td&gt;
&lt;td&gt;85&lt;/td&gt;
&lt;td&gt;30&lt;/td&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;55&lt;/td&gt;
&lt;td&gt;18.3&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;100.000&lt;/td&gt;
&lt;td&gt;364&lt;/td&gt;
&lt;td&gt;300&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;64&lt;/td&gt;
&lt;td&gt;21.3&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;1024.000&lt;/td&gt;
&lt;td&gt;3142&lt;/td&gt;
&lt;td&gt;3072&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;70&lt;/td&gt;
&lt;td&gt;23.3&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h4&gt;
  
  
  Breakdown of Transaction Types
&lt;/h4&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;OperationName&lt;/th&gt;
&lt;th&gt;Count&lt;/th&gt;
&lt;th&gt;% of all&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;QueryInfo&lt;/td&gt;
&lt;td&gt;116&lt;/td&gt;
&lt;td&gt;2%&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;QueryDirectory&lt;/td&gt;
&lt;td&gt;46&lt;/td&gt;
&lt;td&gt;1%&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Create&lt;/td&gt;
&lt;td&gt;212&lt;/td&gt;
&lt;td&gt;4%&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;TreeConnect&lt;/td&gt;
&lt;td&gt;21&lt;/td&gt;
&lt;td&gt;0%&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Ioctl&lt;/td&gt;
&lt;td&gt;81&lt;/td&gt;
&lt;td&gt;2%&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Close&lt;/td&gt;
&lt;td&gt;103&lt;/td&gt;
&lt;td&gt;2%&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;SetInfo&lt;/td&gt;
&lt;td&gt;48&lt;/td&gt;
&lt;td&gt;1%&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Write&lt;/td&gt;
&lt;td&gt;4560&lt;/td&gt;
&lt;td&gt;88%&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;TreeDisconnect&lt;/td&gt;
&lt;td&gt;19&lt;/td&gt;
&lt;td&gt;0%&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Thanks to &lt;a href="https://thisdavej.com/copy-table-in-excel-and-paste-as-a-markdown-table/" rel="noopener noreferrer"&gt;this tool&lt;/a&gt; for converting Excel to Markdown!&lt;/p&gt;

&lt;h3&gt;
  
  
  Observations
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;As I suspected, writing files of different sizes does not result in the same number of Write transactions. &lt;/li&gt;
&lt;li&gt;It looks like writing 1 MB counts as 1 Write transaction. &lt;/li&gt;
&lt;li&gt;For some reason, adding an exactly 1 MB file results in four Write transactions, while writing a 5 MB file results in two Write transactions per MB (10 total). I saw this consistently whether I added 1 file at a time or 3.&lt;/li&gt;
&lt;li&gt;Files smaller than 1 MB result in a minimum of 1 Write transaction.&lt;/li&gt;
&lt;li&gt;Some overhead could be saved by adding multiple files at a time.&lt;/li&gt;
&lt;li&gt;The bulk of transactions were Writes. Considering many of the overhead transactions like &lt;code&gt;QueryDirectory&lt;/code&gt; or &lt;code&gt;TreeConnect&lt;/code&gt; cost even less than Writes, there may not be much cost savings in trying to reduce that overhead even more.
&lt;/li&gt;
&lt;/ul&gt;
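&lt;p&gt;Putting those observations into a rough rule of thumb (my own approximation from the table above, not an official billing formula; note that it deliberately ignores the odd 1 MB and 5 MB per-MB anomalies):&lt;/p&gt;

```python
import math

def estimated_write_transactions(file_size_mb: float) -> int:
    """Approximate Write transactions to upload one file over SMB:
    roughly one Write per MB, with a minimum of one."""
    return max(1, math.ceil(file_size_mb))

# Matches the measurements for most sizes:
print(estimated_write_transactions(100))    # ~100 Writes for a 100 MB file
print(estimated_write_transactions(0.001))  # 1 Write even for a 1 KB file
```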

&lt;p&gt;So to answer my original questions at the top of this post, here is what I learned: &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Larger files result in more Write transactions. Overhead transactions cost less than Writes and are a small over-all percentage of the transactions. Therefore, I do not believe I will save much money by adding fewer larger files as opposed to many smaller files.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I also learned that by viewing the diagnostic logs, I can see transactions per file share.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fivd2d2lfhbm42u3pmmvw.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fivd2d2lfhbm42u3pmmvw.PNG" alt="transactionsperfileshare"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;I feel better now about understanding how Azure Files transactions happen. Obviously there are more experiments I can do, like reading, modifying, and deleting files. I am reasonably confident that the transaction counts will be similar in those situations as well; deleting large files will incur more transactions than deleting small ones, and so on.&lt;/p&gt;

&lt;p&gt;I hope this has been helpful! &lt;/p&gt;

</description>
      <category>azure</category>
      <category>storage</category>
    </item>
    <item>
      <title>Azure Hybrid Connection Manager Latency</title>
      <dc:creator>Mike Richter</dc:creator>
      <pubDate>Mon, 11 Jan 2021 02:45:51 +0000</pubDate>
      <link>https://dev.to/michaelsrichter/azure-hybrid-connection-manager-latency-54mi</link>
      <guid>https://dev.to/michaelsrichter/azure-hybrid-connection-manager-latency-54mi</guid>
      <description>&lt;p&gt;In this post we'll look at some non-scientific experiments I ran to measure the performance latency of the Hybrid Connection Manager (HCM) in Azure.&lt;/p&gt;

&lt;h1&gt;
  
  
  What is the Hybrid Connection Manager (HCM)?
&lt;/h1&gt;

&lt;p&gt;Do you need to connect an on-premises resource to a service in Azure? Does the idea of setting up VPNs or Express Routes, asking IT for dedicated IP addresses, and opening up in-bound ports make you queasy? Then Hybrid Connection Manager is a potential solution. You can read all about HCM in the &lt;a href="https://docs.microsoft.com/en-us/azure/app-service/app-service-hybrid-connections"&gt;Azure Documentation&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;HCM is built specifically for App Service, Azure's Platform-as-a-Service offering for web apps and APIs. You can use HCM to connect to an on-premises database or other service. HCM integrates tightly with App Service, making it very easy to set up.&lt;/p&gt;

&lt;p&gt;HCM is built on top of the &lt;a href="https://docs.microsoft.com/en-us/azure/azure-relay/relay-hybrid-connections-protocol"&gt;Azure Relay Hybrid Connection&lt;/a&gt; capability. You can use Relays to build your own hybrid connections between other Azure services and your services running locally; HCM itself, though, is specifically for App Service.&lt;/p&gt;

&lt;h1&gt;
  
  
  The Experiments
&lt;/h1&gt;

&lt;p&gt;I wanted to see what kind of latency was introduced when using HCM. Theoretically it should not introduce much. HCM communicates via Azure Service Bus. As long as the App Service and Service Bus are in the same region, there should be virtually zero latency for that communication. Of course, there will be latency over the internet to the on-premises location. Optimizing for that is outside the scope of this post.&lt;/p&gt;

&lt;p&gt;I ran two experiments:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;What was latency like when connecting to servers in different parts of the world?&lt;/li&gt;
&lt;li&gt;What was latency like when connecting to servers in the same region as the app service?&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In these experiments I set up VMs in different Azure regions to effectively "mock" on-premises environments. I don't have hardware for testing lying around all over the world. ;)&lt;/p&gt;

&lt;h3&gt;
  
  
  Experiment 1 Set Up
&lt;/h3&gt;

&lt;p&gt;I created 3 VMs (SQL 2019, Win 2019) all the same SKU in Azure (A4v2, all the same configuration, using spot instances) and installed the Northwind sample database. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;One in Europe, with HCM installed.&lt;/li&gt;
&lt;li&gt;One in East US.&lt;/li&gt;
&lt;li&gt;One in Australia, with HCM installed.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I created a simple Web API that will send the same query to any of these databases. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The Web API is deployed to Azure App Service in East US.&lt;/li&gt;
&lt;li&gt;The App Service uses HCM to talk to Australia and Europe, the SQL Servers are not exposed to the internet.&lt;/li&gt;
&lt;li&gt;The App Service is connected to the VNET where the East US SQL Server is, so that call goes over the network via a private IP address. The database is not exposed to the internet.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--M4nrDBK3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/6xu9v9tnz9j9208fb2ee.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--M4nrDBK3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/6xu9v9tnz9j9208fb2ee.png" alt="HCM Architecture Experiment 1"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Experiment 1 Test
&lt;/h3&gt;

&lt;p&gt;I set up a JMeter Test Plan with 100 users, 1 second ramp up, 100 Loops per SQL Server – so 10,000 requests per SQL Server. &lt;strong&gt;All running the same query on each database.&lt;/strong&gt; It took a few minutes to run.&lt;/p&gt;

&lt;h3&gt;
  
  
  Experiment 1 Results
&lt;/h3&gt;

&lt;p&gt;Here’s the application dependency map. Europe is on top, US in the middle, Australia on the bottom. You can see Australia took the longest; that makes sense, as it's the furthest from East US.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--nFxKnNL5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ws243grrx6ivrrjt5h43.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--nFxKnNL5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ws243grrx6ivrrjt5h43.png" alt="Application Map, Europe, US East, Australia"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I also exported the SQL Statistics results to make sure I was only comparing network time and not latency introduced by compute time.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;customMetrics 
| where name == 'SQLStatistics'
| extend networkTime = toint(customDimensions["NetworkServerTime"]), database = tostring(customDimensions["Database"])
| project timestamp, networkTime, ['database']
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here is the result which is very similar to the application dependency map above.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ReFjfXrX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/5vma4eyae4fb3810tq1r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ReFjfXrX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/5vma4eyae4fb3810tq1r.png" alt="SQL Statistics"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Experiment 2 Set Up
&lt;/h3&gt;

&lt;p&gt;For experiment 2, I compared the latency between 2 VMs running in the same region as the App Service (US East). &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;One VM is connected through a virtual network integration&lt;/li&gt;
&lt;li&gt;The other VM is connected via the Hybrid Connection Manager (HCM)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Remember that these are both the same size VM SKUs with the same hardware, same OS and SQL versions, running the same query in the same Azure region. So “everything else being equal”, the request time difference should be isolated to just network latency.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IZhDnENs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/hc8h0c0hmwtiwzlcj825.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IZhDnENs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/hc8h0c0hmwtiwzlcj825.png" alt="HCM Architecture Experiment 2"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Experiment 2 Test
&lt;/h3&gt;

&lt;p&gt;Similar to Test 1. 10,000 runs (100 users, 1 second ramp up, 100 loops).&lt;/p&gt;

&lt;h3&gt;
  
  
  Experiment 2 Results
&lt;/h3&gt;

&lt;p&gt;It looks like HCM introduced an average of about &lt;strong&gt;9ms of latency&lt;/strong&gt; &lt;code&gt;(13.1ms - 4.4ms)&lt;/code&gt; over connecting directly through a virtual network. In the image below, SQL connected via HCM is on top and SQL connected via the virtual network is on the bottom.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ZEYeXelw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/rv3krxxtboxdgsubuuc2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ZEYeXelw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/rv3krxxtboxdgsubuuc2.png" alt="Application Map, East US Only"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Results and Final Thoughts
&lt;/h1&gt;

&lt;p&gt;Overall, I found that HCM does not introduce much overhead: about 9ms of additional latency within the same region. It costs about $10/month per listener, and there is a charge for data transfer beyond the first 5 GB per month. More pricing information here: &lt;a href="https://azure.microsoft.com/en-us/pricing/details/service-bus/"&gt;Azure Service Bus Pricing&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;You could build something similar to HCM yourself. For instance, you could use &lt;a href="https://github.com/inlets/inlets"&gt;Inlets&lt;/a&gt;, but you'd have to maintain and pay for the server yourself. Plus, HCM is a breeze to set up and can be &lt;a href="https://docs.microsoft.com/en-us/azure/app-service/app-service-hybrid-connections#adding-a-hybrid-connection-to-your-app-programmatically"&gt;automated with the Azure CLI&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;If anyone is interested in seeing a latency comparison between Inlets and HCM, let me know!  &lt;/p&gt;

</description>
      <category>azure</category>
      <category>cloud</category>
      <category>networking</category>
      <category>hybrid</category>
    </item>
  </channel>
</rss>
