<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: JFK</title>
    <description>The latest articles on DEV Community by JFK (@jkewley).</description>
    <link>https://dev.to/jkewley</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F282991%2F9e357dde-e6d2-4394-a764-e11f303e0e4f.jpg</url>
      <title>DEV Community: JFK</title>
      <link>https://dev.to/jkewley</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/jkewley"/>
    <language>en</language>
    <item>
      <title>Migrate Azure DevOps work item queries to a new organization</title>
      <dc:creator>JFK</dc:creator>
      <pubDate>Wed, 18 Dec 2019 05:07:55 +0000</pubDate>
      <link>https://dev.to/jkewley/migrate-azure-devops-work-item-queries-to-a-new-organization-2ogb</link>
      <guid>https://dev.to/jkewley/migrate-azure-devops-work-item-queries-to-a-new-organization-2ogb</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;Note: I originally published this article on Medium on 12/6/2019&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;At &lt;a href="https://www.bluechip-llc.com/"&gt;Blue Chip Consulting Group&lt;/a&gt;, we are often brought in to help companies with mergers and acquisitions. Sometimes this means moving &lt;a href="https://azure.microsoft.com/en-us/services/devops/?nav=min"&gt;Azure DevOps&lt;/a&gt; projects from one organization to another.&lt;/p&gt;

&lt;p&gt;There are several reliable tools to help with this type of work. The &lt;a href="https://github.com/nkdAgility/azure-devops-migration-tools"&gt;Azure DevOps Migration Tools&lt;/a&gt; project is a decent open source option which handles several key aspects of migration very well. Unfortunately, it's not well documented and &lt;a href="https://nkdagility.github.io/azure-devops-migration-tools/"&gt;requires a deep understanding&lt;/a&gt; of how to configure each task. &lt;a href="https://www.opshub.com/products/opshub-visual-studio-migration-utility/"&gt;OpsHub has a nice commercial offering&lt;/a&gt; which we've used with great success, but it focuses mostly on the migration of work items and expects core pieces such as areas, iterations, and user accounts to be set up ahead of time.&lt;/p&gt;

&lt;p&gt;A recent engagement had a project with dozens of custom shared queries which needed to be migrated. I tried using the DevOps Migration Tool to accomplish this, but was unsuccessful after several tries, so I went to the &lt;a href="https://docs.microsoft.com/en-us/rest/api/azure/devops/wit/queries?view=azure-devops-rest-5.1"&gt;Azure DevOps REST API documentation&lt;/a&gt; to see what it would take to write something myself. It turns out that the query and create payloads looked very similar, so I decided to roll my own solution.&lt;/p&gt;

&lt;p&gt;I typically work in C#, but decided to go with PowerShell because of the &lt;a href="https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/convertfrom-json?view=powershell-6"&gt;dynamic handling of JSON payloads&lt;/a&gt; that it offers. It was apparent from the documentation and some sample requests I tested in &lt;a href="https://www.getpostman.com/"&gt;Postman&lt;/a&gt; that a recursive enumeration of the &lt;a href="https://docs.microsoft.com/en-us/rest/api/azure/devops/wit/queries/get?view=azure-devops-rest-5.1"&gt;Get result&lt;/a&gt; would be easy to pipe into the target DevOps instance to &lt;a href="https://docs.microsoft.com/en-us/rest/api/azure/devops/wit/queries/create?view=azure-devops-rest-5.1"&gt;create the queries&lt;/a&gt; in the target organization.&lt;/p&gt;
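&lt;p&gt;To make that shape concrete, here's a minimal sketch of the idea, not the full script: the organization URLs, project name, and header variables are placeholders, and the real script recurses through nested folders rather than walking a single level.&lt;/p&gt;

```powershell
# Sketch only: org URLs, project, and $sourceHeaders/$targetHeaders are placeholders
$sourceOrg = "https://dev.azure.com/source-org"
$targetOrg = "https://dev.azure.com/target-org"
$project   = "MyProject"

# Read the Shared Queries tree from the source organization
$tree = Invoke-RestMethod -Headers $sourceHeaders -Uri ("$sourceOrg/$project/_apis/wit/queries/Shared%20Queries" +
    "?`$depth=2&amp;`$expand=all&amp;api-version=5.1")

foreach ($child in $tree.children) {
    if (-not $child.isFolder) {
        # The Get payload is close enough to the Create payload to pass through
        $body = @{ name = $child.name; wiql = $child.wiql } | ConvertTo-Json
        Invoke-RestMethod -Method Post -ContentType "application/json" -Headers $targetHeaders `
            -Uri "$targetOrg/$project/_apis/wit/queries/Shared%20Queries?api-version=5.1" -Body $body
    }
}
```

&lt;p&gt;A real run also has to recreate folder nodes (where &lt;code&gt;isFolder&lt;/code&gt; is true) before creating the queries inside them.&lt;/p&gt;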

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--GWT4Lp8a--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/7bnvrcyuftnyar9q5i9l.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GWT4Lp8a--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/7bnvrcyuftnyar9q5i9l.jpeg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The script requires a few items to be in place before you run it:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;The project name&lt;/strong&gt;: the script assumes you're moving the queries to a project with the same name in the destination DevOps organization. If you're targeting a different project name, it's an easy tweak to the PowerShell script to introduce one.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The source and target organization names&lt;/strong&gt;: these are the custom URL segments that identify the organization and follow dev.azure.com in the DevOps base URL. The script will work even if you are using the old URL format of .visualstudio.com.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Personal Access Tokens for each organization&lt;/strong&gt;: the source organization requires Work Item Read and the target organization requires Work Item Read/Write. &lt;a href="https://docs.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate?view=azure-devops&amp;amp;tabs=preview-page"&gt;Here's an article that shows how to create a PAT in your DevOps instance&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
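&lt;p&gt;For reference, a PAT is passed to the Azure DevOps REST API as the password half of a Basic authorization header with an empty username. Something like this (the token value is a placeholder):&lt;/p&gt;

```powershell
# Build an Authorization header from a PAT (value shown is a placeholder)
$sourcePat = "YOUR_SOURCE_PAT"
$token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$sourcePat"))
$sourceHeaders = @{ Authorization = "Basic $token" }
```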

&lt;p&gt;You'll want to set up &lt;a href="https://docs.microsoft.com/en-us/azure/devops/organizations/settings/about-areas-iterations?view=azure-devops"&gt;areas and iterations&lt;/a&gt; in the target organization before running this script if your queries depend on them. DevOps Migration Tools handles this well if you enable the NodeStructuresMigrationConfig processor. Doing so is as easy as using these settings for the &lt;a href="https://github.com/nkdAgility/azure-devops-migration-tools/blob/master/docs/configuration.md"&gt;configuration.json&lt;/a&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"Processors"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
 &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"ObjectType"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"VstsSyncMigrator.Engine.Configuration.Processing.NodeStructuresMigrationConfig"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"PrefixProjectToNodes"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"Enabled"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="w"&gt;
 &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
 &lt;/span&gt;&lt;span class="err"&gt;…&lt;/span&gt;&lt;span class="w"&gt;
 &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And that's it. The script is safe to run multiple times: it skips queries that already exist with the same name and folder hierarchy location in the destination organization.&lt;/p&gt;

&lt;p&gt;The script can be found in &lt;a href="https://github.com/jkewley/AzureDevOpsQueryMigration"&gt;this GitHub repository&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>devops</category>
      <category>azure</category>
      <category>api</category>
      <category>rest</category>
    </item>
    <item>
      <title>Use Fiddler to live test JavaScript changes to your web site</title>
      <dc:creator>JFK</dc:creator>
      <pubDate>Wed, 18 Dec 2019 04:53:47 +0000</pubDate>
      <link>https://dev.to/jkewley/use-fiddler-to-live-test-javascript-changes-to-your-web-site-1eeb</link>
      <guid>https://dev.to/jkewley/use-fiddler-to-live-test-javascript-changes-to-your-web-site-1eeb</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;Note: I originally published this article on Medium on 12/3/2019&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;There are times when I want to quickly test changes to JavaScript code running on a live web site, but going through a full publish process is too slow or cumbersome. Some development platforms such as SharePoint can make it impossible to run a local instance and iterate on a development machine.&lt;/p&gt;

&lt;p&gt;For these scenarios, I've found that using &lt;a href="https://www.telerik.com/fiddler"&gt;Fiddler&lt;/a&gt; to serve local file content in place of a live script is a quick, easy solution. Here's how I set it up using the Lutron web site as an example. I chose Lutron because, like the situation I found myself in, they use SharePoint for their public web site.&lt;/p&gt;

&lt;p&gt;Load up the site in your browser of choice and hit F12 to open the developer tools and see the content scripts. Lutron has a file called sp.init.js that looks like a good candidate.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--OERhpit4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/6rcy0p0kb8ajfzi4ta99.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--OERhpit4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/6rcy0p0kb8ajfzi4ta99.JPG" alt="sp.init.js in the source tree"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Like most sites these days, the Lutron site is serving a minified version of this file, so we'll format it using the pretty print button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dJvqySPT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/6v0xi2ks3ysk8nw8v3w1.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dJvqySPT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/6v0xi2ks3ysk8nw8v3w1.JPG" alt="Pretty print button"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And then save a copy off to the hard drive.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--iAMZwELA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/72okmiqvmkrkyttq79ho.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--iAMZwELA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/72okmiqvmkrkyttq79ho.JPG" alt="Save a copy of the formatted script file"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Before moving on, let's disable the browser cache so that a request for the file is sent with every page refresh.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PqGpMbWq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/c976evth6zh4obhm9f1k.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PqGpMbWq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/c976evth6zh4obhm9f1k.JPG" alt="Disable cache option"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now load up Fiddler and refresh the page. Fiddler acts as a proxy and captures all the requests being sent to the server. Note that you will need to &lt;a href="https://docs.telerik.com/fiddler/Configure-Fiddler/Tasks/DecryptHTTPS"&gt;enable SSL capture in Fiddler&lt;/a&gt; if the traffic is encrypted. After the page loads, search (Ctrl+F) for 'init' in Fiddler to highlight the request. Click on the appropriate line to select it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1Ys8pTSB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/0kj9y52e1xrlzvedobj1.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1Ys8pTSB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/0kj9y52e1xrlzvedobj1.JPG" alt="Searching for init. We want the second one."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now click the AutoResponder tab and the Add Rule button. The current request URL is copied into the Rule Editor.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MtVqq6uA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/j7lwoxh8e4q8z8mfvj5r.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MtVqq6uA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/j7lwoxh8e4q8z8mfvj5r.JPG" alt="New rule"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As you can see, I've provided the local path to the file I saved earlier (which doesn't need to have a matching file name). I've also checked the boxes for &lt;strong&gt;Enable rules&lt;/strong&gt; and &lt;strong&gt;Unmatched requests passthrough&lt;/strong&gt;. By default Fiddler opts for an exact match, but in this case there is a cache buster on the URL, so I will remove that part of the URL and the EXACT specifier, and then click the Save button to update my rule.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--XstBIp86--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/cayyt35k14es4s8t6kpd.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XstBIp86--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/cayyt35k14es4s8t6kpd.JPG" alt="Remove EXACT rule"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There are a number of &lt;a href="https://docs.telerik.com/fiddler/knowledgebase/autoresponder"&gt;rule matching options for AutoResponder&lt;/a&gt; should you require them.&lt;/p&gt;
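&lt;p&gt;If the plain substring match isn't flexible enough, a regex rule works too. A rule along these lines (the paths are illustrative, not the exact Lutron URL) matches the script regardless of the cache buster:&lt;/p&gt;

```
# Match condition (left) and action (right) as entered in the AutoResponder rule editor
regex:.*\/scripts\/sp\.init\.js.*        C:\temp\sp.init.js
```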

&lt;p&gt;Now I'll edit the code and begin my iterative testing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--eYVG9FSr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/x9biftxj915egp5lt5jc.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--eYVG9FSr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/x9biftxj915egp5lt5jc.JPG" alt="Adding an alert"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With Fiddler enabled and the browser cache disabled, a refresh will serve my local file instead of the one from the Lutron site.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--GlP619Cf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/eqnu53qvvvylizbicb1b.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GlP619Cf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/eqnu53qvvvylizbicb1b.JPG" alt="Hello world popup"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This technique can also be used to test changes to other web site content, such as HTML and CSS.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>fiddler</category>
      <category>testing</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Demo: restrict access to Azure key vault using service endpoints</title>
      <dc:creator>JFK</dc:creator>
      <pubDate>Wed, 18 Dec 2019 03:18:50 +0000</pubDate>
      <link>https://dev.to/jkewley/demo-restrict-access-to-azure-key-vault-using-service-endpoints-2ehj</link>
      <guid>https://dev.to/jkewley/demo-restrict-access-to-azure-key-vault-using-service-endpoints-2ehj</guid>
      <description>&lt;p&gt;Azure Key Vault is a service which provides secure storage and management of the secrets, encryption keys, and certificates an application requires at run time. It improves upon legacy secret management approaches such as storing values in configuration or source code while providing a consistent access experience across Azure PaaS, IaaS, and serverless platforms.  &lt;/p&gt;

&lt;p&gt;In this post I'll show how to lock down access to a key vault so that these secrets can only be accessed via the Microsoft Azure backbone network. To do so, I'll restrict access at the network level using a service endpoint bound to a subnet in a virtual network, and apply firewall settings to deny Internet traffic. The provisioned Azure function will retrieve secrets at run time from the key vault via the service endpoint. &lt;/p&gt;

&lt;p&gt;The entire solution is scripted in PowerShell and can be found in the &lt;a href="https://github.com/jkewley/KeyVaultServiceEndpoint"&gt;linked GitHub repository&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Azure Function
&lt;/h2&gt;

&lt;p&gt;I'll be using an Azure function hosted on a dedicated App Service plan as the client application which will be accessing the key vault. &lt;a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-networking-options#matrix-of-networking-features"&gt;At the time of this writing service endpoint access from consumption-based functions is not available&lt;/a&gt; which is why I'm using an app service plan.&lt;/p&gt;

&lt;p&gt;Like just about every other Azure service offering available, App Service plans have continued to grow and evolve over time. One of the recent additions is &lt;a href="https://docs.microsoft.com/en-us/azure/app-service/web-sites-integrate-with-vnet#regional-vnet-integration"&gt;regional vnet integration&lt;/a&gt; which allows you to connect your app service plan to a non-internet routable virtual network. This is an evolution of the previous implementation of vnet connectivity which used a point-to-site VPN and virtual gateway to establish this network flow. As of this writing, both of the approaches are still supported. The new approach makes sense when your Azure assets are in the same region, whereas the old approach is required if you're trying to connect to services in another region.&lt;/p&gt;

&lt;h2&gt;
  
  
  Key Vault
&lt;/h2&gt;

&lt;p&gt;When working with key vault in Azure you interact with two &lt;a href="https://docs.microsoft.com/en-us/azure/key-vault/key-vault-secure-your-key-vault#resource-endpoints"&gt;resource endpoints&lt;/a&gt; with distinct access planes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The &lt;strong&gt;data plane&lt;/strong&gt; is where the secured data is stored. This plane can only be reached when an access policy is in place which explicitly provides the appropriate permission (read, list, update, etc.) for the specified resource (secret, key, certificate) for the identified caller. This is the plane which applications access when they require keys, secrets, and certificates.&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;management plane&lt;/strong&gt; is where you perform administrative activities on the key vault, including configuration of role-based access control (access to the management plane) and access policies (access to the data plane).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Requests to either plane require proof of authentication by a trusted Azure Active Directory. This is a very common approach to securing access to Azure platforms and APIs, and provides for strong governance.&lt;/p&gt;
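&lt;p&gt;The split is easy to see in the Azure CLI: management-plane operations go through Azure Resource Manager, while reading a secret is a data-plane call that only succeeds once an access policy allows it. A sketch with placeholder names:&lt;/p&gt;

```powershell
# Management plane: create the vault and grant a data-plane access policy
# (resource names and the object id variable are placeholders)
az keyvault create --name myDemoVault --resource-group myDemoRg
az keyvault set-policy --name myDemoVault --object-id $functionIdentityId --secret-permissions get

# Data plane: the secret itself; denied until the policy above is in place
az keyvault secret show --vault-name myDemoVault --name MySecret
```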

&lt;p&gt;Our team typically provisions a vault early in the development life cycle for each application or integration we deploy to Azure. This helps us reinforce the practice of not including secure data in code which might get committed to source control.&lt;/p&gt;

&lt;h2&gt;
  
  
  Architecture
&lt;/h2&gt;

&lt;p&gt;The PowerShell script will produce the following architecture:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TWPyPMDb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/0bgektxnsdn010fupaxw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TWPyPMDb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/0bgektxnsdn010fupaxw.png" alt="Key vault architecture"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Script settings
&lt;/h2&gt;

&lt;p&gt;The script has a few variables which you'll want to update:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="c"&gt;# used to create uniquely named assets to avoid naming collisions&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nv"&gt;$yourUniquePrefix&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"myPrefix"&lt;/span&gt;&lt;span class="w"&gt;          

&lt;/span&gt;&lt;span class="c"&gt;# storage account name. append prefix manually due to storage naming constraints (lowercase, alpha, max 24)&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nv"&gt;$storageName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"myprefixtestpaasstorage"&lt;/span&gt;&lt;span class="w"&gt;   

&lt;/span&gt;&lt;span class="c"&gt;# the name value from 'az account list' for the subscription where you want to deploy the demo&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nv"&gt;$subscriptionName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"YOUR SUBSCRIPTION NAME"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The script leverages the &lt;a href="https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest"&gt;latest Azure CLI&lt;/a&gt; (version 2.0.77), which introduced support for &lt;code&gt;az functionapp vnet-integration add&lt;/code&gt;. &lt;/p&gt;
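&lt;p&gt;The networking portion boils down to three CLI steps: enable the Microsoft.KeyVault service endpoint on the subnet, connect the function app to the vnet, and restrict the vault to that subnet. Roughly (resource names are placeholders):&lt;/p&gt;

```powershell
# Enable the Key Vault service endpoint on the subnet
az network vnet subnet update --resource-group myDemoRg --vnet-name myDemoVnet `
    --name myDemoSubnet --service-endpoints Microsoft.KeyVault

# Connect the function app's App Service plan to the vnet
az functionapp vnet-integration add --resource-group myDemoRg --name myDemoFunc `
    --vnet myDemoVnet --subnet myDemoSubnet

# Allow the subnet through the vault firewall, then deny everything else
az keyvault network-rule add --resource-group myDemoRg --name myDemoVault `
    --vnet-name myDemoVnet --subnet myDemoSubnet
az keyvault update --resource-group myDemoRg --name myDemoVault --default-action Deny
```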

&lt;h2&gt;
  
  
  What the script does
&lt;/h2&gt;

&lt;p&gt;This script will deploy the following assets in the East US 2 region:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;a resource group&lt;/li&gt;
&lt;li&gt;a vnet and subnet&lt;/li&gt;
&lt;li&gt;a key vault&lt;/li&gt;
&lt;li&gt;a storage account&lt;/li&gt;
&lt;li&gt;an app service plan (Premium!) and function app with a managed identity&lt;/li&gt;
&lt;li&gt;a deployed PowerShell function app which retrieves and returns the key vault secret&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It connects the app service plan to the vnet and sets up a service endpoint for the subnet. Finally, it provides the function app's identity &lt;code&gt;get&lt;/code&gt; rights to the key vault's secrets.&lt;/p&gt;

&lt;p&gt;The function uses PowerShell and the local managed identity endpoint to demonstrate that it is able to retrieve the secrets. As of this writing, &lt;a href="https://github.com/Azure/azure-webjobs-sdk/issues/746"&gt;key vault configuration references are not supported with service endpoints&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The script walks through four validation tests.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;First it shows that it can read the secret using the Azure CLI over the Internet with a firewall rule in place which allows the caller's IP address.&lt;/li&gt;
&lt;li&gt;Next it invokes the function, which returns the secret in the HTTP response payload.&lt;/li&gt;
&lt;li&gt;It then removes the firewall rule and demonstrates that it can no longer read the secret from the Internet.&lt;/li&gt;
&lt;li&gt;It makes a final call to the function app to show that it is still able to access the vault through the subnet's service endpoint.&lt;/li&gt;
&lt;/ul&gt;
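&lt;p&gt;In CLI terms the four checks look roughly like this (the vault name, function URL, and IP variable are placeholders):&lt;/p&gt;

```powershell
az keyvault network-rule add --name myDemoVault --ip-address $myPublicIp
az keyvault secret show --vault-name myDemoVault --name MySecret       # 1: succeeds over the Internet

Invoke-RestMethod "https://myDemoFunc.azurewebsites.net/api/GetSecret" # 2: function returns the secret

az keyvault network-rule remove --name myDemoVault --ip-address $myPublicIp
az keyvault secret show --vault-name myDemoVault --name MySecret       # 3: now blocked by the vault firewall

Invoke-RestMethod "https://myDemoFunc.azurewebsites.net/api/GetSecret" # 4: still works via the service endpoint
```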

&lt;p&gt;&lt;a href="https://github.com/jkewley/KeyVaultServiceEndpoint"&gt;The code is heavily commented to make it easy to follow along&lt;/a&gt;. Enjoy!&lt;/p&gt;

</description>
      <category>azure</category>
      <category>powershell</category>
      <category>keyvault</category>
      <category>security</category>
    </item>
  </channel>
</rss>
