<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Igor Bertnyk</title>
    <description>The latest articles on DEV Community by Igor Bertnyk (@ib1).</description>
    <link>https://dev.to/ib1</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F235243%2F26b90c03-945f-4924-9d7e-29ed2d32a2d5.png</url>
      <title>DEV Community: Igor Bertnyk</title>
      <link>https://dev.to/ib1</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/ib1"/>
    <language>en</language>
    <item>
      <title>Docusaurus authentication with Entra ID and MSAL</title>
      <dc:creator>Igor Bertnyk</dc:creator>
      <pubDate>Wed, 02 Oct 2024 16:10:23 +0000</pubDate>
      <link>https://dev.to/ib1/docusaurus-authentication-with-entra-id-and-msal-417b</link>
      <guid>https://dev.to/ib1/docusaurus-authentication-with-entra-id-and-msal-417b</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Docusaurus (&lt;a href="https://docusaurus.io" rel="noopener noreferrer"&gt;https://docusaurus.io&lt;/a&gt;)  is a well-regarded open-source tool for building documentation websites. It is a static-site generator that builds a single-page application leveraging the full power of React. However, it does not provide any kind of authentication out of the box. Adding authentication is crucial for securing access to your documentation. &lt;/p&gt;

&lt;h2&gt;
  
  
  Adding Entra ID Authentication
&lt;/h2&gt;

&lt;p&gt;Let’s try to add Entra ID authentication to the static website generated by Docusaurus. At its heart, it is still a React application, so we can use the relevant React packages to help with this task. However, there are some Docusaurus-specific techniques, which are described below. With that in mind, let’s start! &lt;/p&gt;

&lt;h3&gt;
  
  
  Registering a Single Page Application
&lt;/h3&gt;

&lt;p&gt;First, you will need to register a Single Page Application in Entra ID (previously known as Azure AD; the Microsoft marketing department strikes again). The process is well documented, and you can find step-by-step instructions here: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://learn.microsoft.com/en-us/entra/identity-platform/scenario-spa-app-registration" rel="noopener noreferrer"&gt;Single-page application: App registration&lt;/a&gt;  &lt;/p&gt;

&lt;p&gt;It is recommended to use MSAL.js 2.0 with the authorization code flow (with PKCE) for enhanced security. Do not forget to add your redirect URIs, including &lt;a href="http://localhost:3000/" rel="noopener noreferrer"&gt;http://localhost:3000/&lt;/a&gt; for local debugging. &lt;/p&gt;

&lt;h2&gt;
  
  
  Installing MSAL Libraries
&lt;/h2&gt;

&lt;p&gt;Next, install the MSAL libraries for React:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm install react react-dom 
npm install @azure/msal-react @azure/msal-browser 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Configuring Entra ID Registration Details
&lt;/h2&gt;

&lt;p&gt;In the root of the application, add an authConfig.js file where you configure your Entra ID registration details. &lt;/p&gt;

&lt;p&gt;authConfig.js content&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;LogLevel&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;@azure/msal-browser&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="cm"&gt;/**
 * Configuration object to be passed to MSAL instance on creation. 
 * For a full list of MSAL.js configuration parameters, visit:
 * https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/configuration.md 
 */&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;msalConfig&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;auth&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;clientId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;YOUR CLIENT ID&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;authority&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;https://login.microsoftonline.com/YOUR_TENANT_ID&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;redirectUri&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;http://localhost:3000&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;postLogoutRedirectUri&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="na"&gt;cache&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;cacheLocation&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sessionStorage&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="c1"&gt;// This configures where your cache will be stored&lt;/span&gt;
        &lt;span class="na"&gt;storeAuthStateInCookie&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="c1"&gt;// Set this to "true" if you are having issues on IE11 or Edge&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="na"&gt;system&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;   
        &lt;span class="na"&gt;loggerOptions&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;    
            &lt;span class="na"&gt;loggerCallback&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;level&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;containsPii&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;  
                &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;containsPii&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;      
                    &lt;span class="k"&gt;return&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;     
                &lt;span class="p"&gt;}&lt;/span&gt;       
                &lt;span class="k"&gt;switch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;level&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                    &lt;span class="k"&gt;case&lt;/span&gt; &lt;span class="nx"&gt;LogLevel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="na"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
                        &lt;span class="k"&gt;return&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
                    &lt;span class="k"&gt;case&lt;/span&gt; &lt;span class="nx"&gt;LogLevel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="na"&gt;Info&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
                        &lt;span class="k"&gt;return&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
                    &lt;span class="k"&gt;case&lt;/span&gt; &lt;span class="nx"&gt;LogLevel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="na"&gt;Verbose&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;debug&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
                        &lt;span class="k"&gt;return&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
                    &lt;span class="k"&gt;case&lt;/span&gt; &lt;span class="nx"&gt;LogLevel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="na"&gt;Warning&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;warn&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
                        &lt;span class="k"&gt;return&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
                    &lt;span class="nl"&gt;default&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                        &lt;span class="k"&gt;return&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
                &lt;span class="p"&gt;}&lt;/span&gt;   
            &lt;span class="p"&gt;}&lt;/span&gt;   
        &lt;span class="p"&gt;}&lt;/span&gt;   
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="cm"&gt;/**
 * Scopes you add here will be prompted for user consent during sign-in.
 * By default, MSAL.js will add OIDC scopes (openid, profile, email) to any login request.
 */&lt;/span&gt;
&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;loginRequest&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;scopes&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Adding Login Functionality
&lt;/h2&gt;

&lt;p&gt;Finally, we need to add the actual login functionality. That is where things become more interesting. At first glance, there is no place to insert a custom login component and enforce authentication. However, Docusaurus supports a technique called swizzling. &lt;/p&gt;

&lt;p&gt;For Docusaurus, component swizzling means providing an alternative component that takes precedence over the one provided by the theme. You can think of it as monkey patching for React components, enabling you to override the default implementation. We need to find and override the very top component in the hierarchical tree, and Docusaurus provides just that: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://docusaurus.io/docs/swizzling#wrapper-your-site-with-root" rel="noopener noreferrer"&gt;Swizzling: Wrapping your site with the &amp;lt;Root&amp;gt; component&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;The &amp;lt;Root&amp;gt; component is rendered at the very top of the React tree, above the theme &amp;lt;Layout&amp;gt;, and never unmounts. It is the perfect place to add stateful logic that should not be re-initialized across navigations, such as user authentication status. Swizzle it manually by creating a file at src/theme/Root.js. The code is provided below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;React&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;useState&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;PublicClientApplication&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;EventType&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@azure/msal-browser&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;MsalProvider&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;AuthenticatedTemplate&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;useMsal&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;UnauthenticatedTemplate&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;@azure/msal-react&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;msalConfig&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@site/authConfig&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="cm"&gt;/**
 * MSAL should be instantiated outside of the component tree to prevent it from being re-instantiated on re-renders.
 */&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;msalInstance&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;PublicClientApplication&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;msalConfig&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;// Default to using the first account if no account is active on page load&lt;/span&gt;
&lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;msalInstance&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getActiveAccount&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nx"&gt;msalInstance&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getAllAccounts&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Account selection logic is app dependent. Adjust as needed for different use cases.&lt;/span&gt;
    &lt;span class="nx"&gt;msalInstance&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;setActiveAccount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;msalInstance&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getActiveAccount&lt;/span&gt;&lt;span class="p"&gt;()[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;// Listen for sign-in event and set active account&lt;/span&gt;
&lt;span class="nx"&gt;msalInstance&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;addEventCallback&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;eventType&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="nx"&gt;EventType&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;LOGIN_SUCCESS&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;account&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;account&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;account&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="nx"&gt;msalInstance&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;setActiveAccount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;account&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="c1"&gt;// Default implementation, that you can customize&lt;/span&gt;
&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;Root&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="nx"&gt;children&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;activeAccount&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;msalInstance&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getActiveAccount&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;claims&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;activeAccount&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="nx"&gt;activeAccount&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;idTokenClaims&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;handleRedirect&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="c1"&gt;//instance.loginRedirect()&lt;/span&gt;
        &lt;span class="nx"&gt;msalInstance&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;loginPopup&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
                &lt;span class="p"&gt;...&lt;/span&gt;&lt;span class="nx"&gt;msalConfig&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="na"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;create&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="p"&gt;})&lt;/span&gt;
            &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
    &lt;span class="p"&gt;};&lt;/span&gt;

    &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;MsalProvider&lt;/span&gt; &lt;span class="nx"&gt;instance&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;msalInstance&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;            
            &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;AuthenticatedTemplate&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;
                &lt;span class="o"&gt;&amp;lt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;children&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="sr"&gt;/&lt;/span&gt;&lt;span class="err"&gt;&amp;gt;
&lt;/span&gt;            &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="sr"&gt;/AuthenticatedTemplate&lt;/span&gt;&lt;span class="err"&gt;&amp;gt;
&lt;/span&gt;            &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;UnauthenticatedTemplate&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;
                &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;div&lt;/span&gt; &lt;span class="nx"&gt;style&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{{&lt;/span&gt;&lt;span class="na"&gt;margin&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;auto&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;}}&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;
                    &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;button&lt;/span&gt; &lt;span class="nx"&gt;onClick&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;handleRedirect&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;
                        &lt;span class="nx"&gt;Sign&lt;/span&gt; &lt;span class="k"&gt;in&lt;/span&gt;
                    &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="sr"&gt;/button&lt;/span&gt;&lt;span class="err"&gt;&amp;gt;
&lt;/span&gt;                &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="sr"&gt;/div&lt;/span&gt;&lt;span class="err"&gt;&amp;gt;
&lt;/span&gt;            &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="sr"&gt;/UnauthenticatedTemplate&lt;/span&gt;&lt;span class="err"&gt;&amp;gt;
&lt;/span&gt;        &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="sr"&gt;/MsalProvider&lt;/span&gt;&lt;span class="err"&gt;&amp;gt;
&lt;/span&gt;    &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Code Implementation
&lt;/h2&gt;

&lt;p&gt;Let’s step into the code.  &lt;/p&gt;

&lt;p&gt;The MSAL instance is created outside of the component tree so that it is not re-instantiated on every re-render.  &lt;/p&gt;

&lt;p&gt;You can listen to authentication events and set an active account, or perform other actions depending on the event type. &lt;/p&gt;

&lt;p&gt;From the active account you can retrieve token claims, which you can later use for role-based access control if desired. &lt;/p&gt;
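
&lt;p&gt;For example, if app roles are configured for the app registration, a simple role check based on the ID token claims could look like the sketch below (the &lt;code&gt;Docs.Admin&lt;/code&gt; role name is a hypothetical placeholder): &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Sketch: role-based check using ID token claims
// (assumes app roles are configured in the Entra ID app registration)
const claims = activeAccount ? activeAccount.idTokenClaims : null;
const roles = (claims &amp;amp;&amp;amp; claims.roles) || [];
const isAdmin = roles.includes('Docs.Admin'); // hypothetical app role name
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;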

&lt;p&gt;MSAL supports two types of authentication interfaces: redirect (loginRedirect) and popup (loginPopup). In the end, the result is the same. &lt;/p&gt;
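
&lt;p&gt;For reference, a redirect-based variant of the sign-in handler could look like the sketch below, reusing the same &lt;code&gt;msalInstance&lt;/code&gt; and the &lt;code&gt;loginRequest&lt;/code&gt; object from authConfig.js: &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Sketch: redirect flow. handleRedirectPromise() must run on page load
// to process the response when Entra ID redirects back to the app.
msalInstance.handleRedirectPromise().then((response) =&amp;gt; {
    if (response) {
        msalInstance.setActiveAccount(response.account);
    }
});

const handleRedirect = () =&amp;gt; {
    msalInstance.loginRedirect(loginRequest).catch((error) =&amp;gt; console.error(error));
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;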

&lt;p&gt;Use the MsalProvider component with the MSAL instance, and the AuthenticatedTemplate/UnauthenticatedTemplate components to show or hide content depending on the user's authentication status. “&amp;lt;&amp;gt;{children}&amp;lt;/&amp;gt;” will render the Docusaurus-generated site. &lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;And just like that, we added Entra ID authentication to our documentation site. This setup enhances security and provides a seamless user experience. &lt;br&gt;
Everyone loves Docusaurus!&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fig1o515tgp9gbn6h9rh8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fig1o515tgp9gbn6h9rh8.png" alt="Everyone loves Docusaurus"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>docusaurus</category>
      <category>entraid</category>
      <category>msal</category>
      <category>authentication</category>
    </item>
    <item>
      <title>Azure Container Apps - the future of K8s in the cloud?</title>
      <dc:creator>Igor Bertnyk</dc:creator>
      <pubDate>Wed, 17 Nov 2021 19:23:21 +0000</pubDate>
      <link>https://dev.to/ib1/azure-container-apps-the-future-of-k8s-in-the-cloud-5cm1</link>
      <guid>https://dev.to/ib1/azure-container-apps-the-future-of-k8s-in-the-cloud-5cm1</guid>
      <description>&lt;p&gt;At Ignite-2021 conference, Microsoft announced the preview launch of Azure Container Apps, a new fully managed serverless container service.&lt;br&gt;
Azure Container Apps (ACA) provide a serverless hosting service that sits on top of an AKS service, allowing you to deploy multiple containers without dealing with the underlying infrastructure. &lt;/p&gt;

&lt;p&gt;Let's look at the current Kubernetes landscape in Azure:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Service&lt;/th&gt;
&lt;th&gt;Azure Container Apps&lt;/th&gt;
&lt;th&gt;Azure Kubernetes Service (AKS)&lt;/th&gt;
&lt;th&gt;Azure Container Instances&lt;/th&gt;
&lt;th&gt;Web App for Containers&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Sample usage&lt;/td&gt;
&lt;td&gt;Build and deploy modern apps and microservices&lt;/td&gt;
&lt;td&gt;Deploy and scale containers on managed Kubernetes&lt;/td&gt;
&lt;td&gt;Launch containers with hypervisor isolation&lt;/td&gt;
&lt;td&gt;Run containerized web apps&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Autoscale&lt;/td&gt;
&lt;td&gt;Yes (KEDA)&lt;/td&gt;
&lt;td&gt;Yes (config)&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;Yes (App Service plan)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;HTTPS ingress&lt;/td&gt;
&lt;td&gt;Yes (Envoy)&lt;/td&gt;
&lt;td&gt;Yes (may need additional infrastructure)&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Split traffic&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Manual&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;Yes (deployment slots)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Dapr&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Manual&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Long-running jobs&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h2&gt;
  
  
  What does the new service bring to the table?
&lt;/h2&gt;

&lt;p&gt;Kubernetes is hard. That is true even in managed K8s environments like AKS and Google Kubernetes Engine (GKE), and it is especially true for newcomers, or for developers who want to focus on solving business requirements instead of tackling and mastering the underlying application platform.&lt;br&gt;
ACA offers worry-free deployment and auto-management of many aspects of containerized applications, including service discovery, traffic splitting, support for long-running processes, and open-source technologies like Dapr, KEDA, and Envoy. It is optimized for running applications that span many microservices deployed in containers.&lt;/p&gt;
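
&lt;p&gt;As an illustration, an HTTP scale rule in a Container App configuration looks roughly like this sketch of the scale section (property names follow the preview schema and may change): &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"scale": {
  "minReplicas": 0,
  "maxReplicas": 10,
  "rules": [{
    "name": "http-rule",
    "http": { "metadata": { "concurrentRequests": "50" } }
  }]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;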

&lt;h2&gt;
  
  
  Azure Portal
&lt;/h2&gt;

&lt;p&gt;The Azure Portal at the moment looks pretty bare. You can create an ACA, configure monitoring and secrets, and view revisions, but auto-scaling and other details are managed via configuration.&lt;br&gt;
Currently, the maximum available container size is 2 cores / 4 GiB of memory (while other services support up to 16 GiB).&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzejavwwrct98rxijphc5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzejavwwrct98rxijphc5.png" alt="Create ACA"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can also configure internal or external ingress:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fagokw65769hsk5vjmf44.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fagokw65769hsk5vjmf44.png" alt="ACA Ingress"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Microservices
&lt;/h2&gt;

&lt;p&gt;Multiple container apps can be deployed into a single environment. You can think of an environment as roughly analogous to a Kubernetes namespace rather than a Pod. &lt;br&gt;
Apps deployed to the same environment are placed in the same virtual network and isolated from the outside world. Each environment also has its own Log Analytics workspace to provide monitoring. &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F156hqruggq01krryx60s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F156hqruggq01krryx60s.png" alt="Microservices"&gt;&lt;/a&gt;&lt;br&gt;
When you deploy your container app a new &lt;em&gt;revision&lt;/em&gt; is created. A revision is an immutable snapshot of a container app. Revisions allow you to have the old and new versions running simultaneously and use the &lt;em&gt;traffic splitting&lt;/em&gt; functionality to direct traffic to old or new versions of the application.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5fr3gs5qa4nje40jwvi1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5fr3gs5qa4nje40jwvi1.png" alt="Revisions"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Dapr
&lt;/h2&gt;

&lt;p&gt;Dapr (Distributed Application Runtime) is a runtime that helps build resilient, stateless, and stateful microservices.&lt;br&gt;
Azure Container Apps offers a fully managed version of the Dapr APIs when building microservices. When you use Dapr in Azure Container Apps, you can enable sidecars to run next to your microservices that provide a rich set of capabilities. Available Dapr APIs include Service to Service calls, Pub/Sub, Event Bindings, State Stores, and Actors.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwgivdgt3bfm6q7uie7th.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwgivdgt3bfm6q7uie7th.png" alt="Dapr"&gt;&lt;/a&gt;&lt;br&gt;
There is an excellent example created by Jeff Hollan showing how&lt;br&gt;
&lt;a href="https://github.com/Azure-Samples/container-apps-connect-multiple-apps" rel="noopener noreferrer"&gt;multi-container communication&lt;/a&gt; can use direct calls as well as be proxied by Dapr. Dapr can provide mTLS, auto-retries, and additional telemetry.&lt;br&gt;
This example also shows how to deploy the app using GitHub Actions and Azure BICEP.&lt;/p&gt;
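&lt;p&gt;For example, with the Dapr sidecar enabled, one container app can call another through the sidecar's HTTP API instead of addressing it directly (the port is Dapr's default; the app ID "orders" and method "status" are illustrative):&lt;/p&gt;

```shell
# Service-to-service invocation through the local Dapr sidecar:
# the sidecar resolves the target app ID and handles mTLS and retries.
curl http://localhost:3500/v1.0/invoke/orders/method/status
```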

&lt;h2&gt;
  
  
  Google Cloud
&lt;/h2&gt;

&lt;p&gt;GCP's comparable technology, GKE Autopilot mode, is intended to be a hands-off, fully managed Kubernetes experience that lets users focus more on their workloads and less on managing cluster infrastructure. The difference is that the Azure team chose a very opinionated route, supporting selected open-source technologies at the expense of choice. &lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Azure Container Apps is a perfect addition to Azure's container landscape. By using ACA, you don't spend time managing Kubernetes clusters and can focus on building and deploying your application. &lt;br&gt;
The biggest pros of Container Apps are serverless container orchestration, the ability to scale to zero, and usage-based pricing. Autoscaling, HTTPS ingress, and Dapr take away major headaches of running your application on Kubernetes.&lt;br&gt;
The possible downside is that when deploying to ACA, you are buying into the way Microsoft built this service: Dapr for service mesh, KEDA for scaling, Envoy for ingress, and Log Analytics for monitoring. Deployment is also performed using Azure tools, so you may not be able to bring in your beloved Helm charts. And, as of now, Windows containers are not supported.&lt;br&gt;
However, teams facing modernization and planning a move to Azure Kubernetes Service may be very interested in Azure Container Apps due to its simplicity and its abstraction of the complexity of Kubernetes.&lt;br&gt;
Thanks for reading; here is a containerized cat for you, the next step in containerization :)&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frqesigom66lh18m85xli.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frqesigom66lh18m85xli.png" alt="Container cat"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>azure</category>
      <category>kubernetes</category>
      <category>containers</category>
      <category>k8s</category>
    </item>
    <item>
      <title>The ultimate guide to hosting multilingual Angular application on Google App Engine</title>
      <dc:creator>Igor Bertnyk</dc:creator>
      <pubDate>Thu, 26 Aug 2021 22:56:18 +0000</pubDate>
      <link>https://dev.to/ib1/the-ultimate-guide-to-hosting-multilingual-angular-application-on-google-app-engine-o7k</link>
      <guid>https://dev.to/ib1/the-ultimate-guide-to-hosting-multilingual-angular-application-on-google-app-engine-o7k</guid>
      <description>&lt;p&gt;Google App Engine is a good option to serve SPA applications, such as Angular. It requires little maintenance, can be easily auto-scaled if necessary, and you can extend it with CDN or load balancer. However, an Angular application requires several tricks to make it work properly. For example, to support deep links and browser refresh, routed apps must fallback to index.html. But navigation to the index page should happen only for routes, not for existing files and assets, so there should be exception for such things.&lt;br&gt;
For localized, multilingual application that becomes even more complicated. Let's see why.&lt;br&gt;
Internationalization (aka i18n) of the applications is often mandated for countries other than USA, and useful even there, if you want to make them accessible world-wide. To localize my app I have followed the official guide:&lt;br&gt;
&lt;a href="https://angular.io/guide/i18n"&gt;Localizing your Angular app&lt;/a&gt;&lt;br&gt;
This is a fairly involved process that I will not describe here. Sufficient to say that after all preparations, translation of the resource files and changing of the build process, in your "dist" folder you will get a copy of your application for every language supported. So, for example, if you support English and French, in the "dist' folder there will be "en" and "fr" subfolders, each containing a full translated application.&lt;br&gt;
In a typical multilingual application navigation to the different language is facilitated by URL prefix, e.g. "&lt;a href="https://myapp.com/fr/login"&gt;https://myapp.com/fr/login&lt;/a&gt;" will show the login page in French, so we need to map these URL patterns to the underlying application assets.&lt;br&gt;
Let's quickly go through the build and deployment process and then we will see how to do that mapping.&lt;/p&gt;

&lt;p&gt;As I mentioned, the build process is slightly changed:&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;npm run build -- --configuration=production --localize&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;To deploy to Google Cloud you need the "gcloud CLI". If you are using CI/CD pipelines, it is often preinstalled on the major DevOps platforms and agents.&lt;br&gt;
You'll need to authorize the CLI session, either manually or with the help of a secret credentials file, which you can download from the GCP console (navigate to APIs &amp;amp; Services/Credentials).&lt;br&gt;
This is a sample command:&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;gcloud auth activate-service-account --key-file=$(gcloud.secureFilePath)&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;Finally, to deploy your app to the App Engine, you can execute the following command:&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;gcloud app deploy --verbosity=info &amp;lt;path to your dist folder&amp;gt;/app.yaml --promote --stop-previous-version --quiet --project=&amp;lt;your GCP project ID&amp;gt;&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;Replace the values in angle brackets accordingly.&lt;/p&gt;

&lt;p&gt;Now we get to the most important part: how to make our Angular app work on App Engine.&lt;br&gt;
The magic resides in the "app.yaml" file, which lets us overcome all of the obstacles listed above.&lt;br&gt;
The full reference can be found &lt;a href="https://cloud.google.com/appengine/docs/standard/python/config/appref"&gt;here&lt;/a&gt;, but for Angular we need just a subset of the configuration options. &lt;br&gt;
A few points to mention before I attach the full app.yaml file for your reference.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Remember that app.yaml should be in the root (dist) folder.&lt;/li&gt;
&lt;li&gt;Python instance in Standard Environment is the simplest hosting option to set up.&lt;/li&gt;
&lt;li&gt;Affinity is not required, as the Angular bundle is downloaded to the client, and sessions are handled by the browser.&lt;/li&gt;
&lt;li&gt;Downloaded JavaScript bundles are often cached on the client side, so we usually should not expect extremely high load on our server.&lt;/li&gt;
&lt;li&gt;We need to provide a fallback for all routes to the index.html page in the corresponding language.&lt;/li&gt;
&lt;li&gt;Images, fonts, and other static assets should be exempted from the above rule.&lt;/li&gt;
&lt;li&gt;It might be a good option to have a "landing page" from which the user can navigate to the language of choice, or it can automatically redirect the user based on the browser language settings.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And behold! A sample app.yaml file that you can freely reuse to host your Angular application on App Engine:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
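&lt;p&gt;A minimal sketch of such an app.yaml, assuming a Python standard-environment runtime and English and French builds in "en" and "fr" subfolders (adjust the language list and runtime version to your app):&lt;/p&gt;

```yaml
# Minimal sketch: the runtime version and the "en"/"fr" folder names are
# assumptions matching the build output described above.
runtime: python39

handlers:
  # Serve existing static assets (scripts, styles, images, fonts) directly.
  - url: /((?:en|fr)/.+\.(?:js|css|map|png|jpg|gif|svg|ico|json|txt|woff2?|ttf))$
    static_files: \1
    upload: (?:en|fr)/.+
    secure: always

  # Fall back to the language-specific index.html for all other routes,
  # so deep links and browser refresh keep working.
  - url: /en(?:/.*)?$
    static_files: en/index.html
    upload: en/index\.html
    secure: always

  - url: /fr(?:/.*)?$
    static_files: fr/index.html
    upload: fr/index\.html
    secure: always

  # "Landing page" behaviour: send the bare root to the default language.
  - url: /.*
    static_files: en/index.html
    upload: en/index\.html
    secure: always
```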


&lt;p&gt;I hope that was helpful. Any comments on how to make this config even better are welcome. Here is a multilingual, French-speaking cat for you:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--d9x2pKZz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jq0ig6b1fph9ibbeejpm.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--d9x2pKZz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jq0ig6b1fph9ibbeejpm.jpg" alt="French-speaking cat"&gt;&lt;/a&gt;&lt;br&gt;
&lt;sup&gt;(Saved from etsy.com)&lt;/sup&gt;&lt;/p&gt;

</description>
      <category>appengine</category>
      <category>angular</category>
      <category>i18n</category>
      <category>yaml</category>
    </item>
    <item>
      <title>Google Secret Manager Configuration Provider for ASP.NET Core</title>
      <dc:creator>Igor Bertnyk</dc:creator>
      <pubDate>Wed, 18 Aug 2021 21:12:51 +0000</pubDate>
      <link>https://dev.to/ib1/google-secret-manager-configuration-provider-for-asp-net-core-3gpk</link>
      <guid>https://dev.to/ib1/google-secret-manager-configuration-provider-for-asp-net-core-3gpk</guid>
      <description>&lt;p&gt;If you are brave enough to run ASP.NET Core applications on Google Cloud Platform, sooner or later you will encounter a question on how to store your application settings in a secure way. In Azure we have Key Vault for that purpose, which is nicely integrated with ASP.NET Core thanks to the built-in Configuration Provider. In GCP, the corresponding service is called &lt;a href="https://cloud.google.com/secret-manager" rel="noopener noreferrer"&gt;Google Secret Manager&lt;/a&gt;.&lt;br&gt;
Secret Manager is a secure and convenient storage system for API keys, passwords, certificates, and other sensitive data, and it is certainly a good place to store application settings as well. However, there is no integration with ASP.NET Core out of the box.&lt;br&gt;
Fear not! ASP.NET Core supports &lt;a href="https://docs.microsoft.com/en-us/aspnet/core/fundamentals/configuration/?view=aspnetcore-5.0#custom-configuration-provider" rel="noopener noreferrer"&gt;Custom configuration providers&lt;/a&gt;, and we will try to develop one for the Google Secret Manager now.&lt;/p&gt;
&lt;h2&gt;
  
  
  SecretManagerConfigurationSource
&lt;/h2&gt;

&lt;p&gt;First of all, we need to create a class that implements IConfigurationSource. Its only purpose is to return an instance of our custom provider, which we will develop shortly.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;

&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;SecretManagerConfigurationSource&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;IConfigurationSource&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="n"&gt;IConfigurationProvider&lt;/span&gt; &lt;span class="nf"&gt;Build&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;IConfigurationBuilder&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;SecretManagerConfigurationProvider&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h2&gt;
  
  
  SecretManagerConfigurationProvider
&lt;/h2&gt;

&lt;p&gt;Next, let's start creating our main component, SecretManagerConfigurationProvider, which needs to inherit from the ConfigurationProvider abstract class:&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;

&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;SecretManagerConfigurationProvider&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;ConfigurationProvider&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;p&gt;To retrieve secrets from the Secret Manager, we can use SecretManagerServiceClient from Google.Cloud.SecretManager.V1 Nuget package. We will also need a Google Project ID where the Secret Manager is enabled and your application is running. &lt;/p&gt;

&lt;p&gt;How can we get a Project ID? We could, for example, put it into application settings. But in fact, we can easily obtain it at runtime using the Google API:&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;

&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;static&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="nf"&gt;GetProjectId&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;instance&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Google&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Api&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Gax&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Platform&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Instance&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;projectId&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;instance&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="n"&gt;ProjectId&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;string&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;IsNullOrEmpty&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;projectId&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="k"&gt;null&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;projectId&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;p&gt;Returning to our Configuration Provider, we are ready to initialize it in the constructor:&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;

&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="nf"&gt;SecretManagerConfigurationProvider&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;_client&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;SecretManagerServiceClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Create&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="n"&gt;_projectId&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;GoogleProject&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetProjectId&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;p&gt;The next step is to override the "Load" method and populate the dictionary of key/value pairs that is exposed as the Data property of the parent class.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;

&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;secrets&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ListSecrets&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;ProjectName&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;_projectId&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
&lt;span class="k"&gt;foreach&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;secret&lt;/span&gt; &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="n"&gt;secrets&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
   &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;secretVersionName&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;SecretVersionName&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;secret&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SecretName&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ProjectId&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;secret&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SecretName&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SecretId&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"latest"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
   &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;secretVersion&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AccessSecretVersion&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;secretVersionName&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
   &lt;span class="nf"&gt;Set&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;secret&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SecretName&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SecretId&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;secretVersion&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ToStringUtf8&lt;/span&gt;&lt;span class="p"&gt;());&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
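&lt;p&gt;For a quick end-to-end check, a secret can be created from the gcloud CLI (the secret name, value, and project below are placeholders); after a restart, the provider above will pick it up as a configuration key:&lt;/p&gt;

```shell
# Create a secret and add an initial version (names are placeholders).
gcloud secrets create MySetting --replication-policy="automatic" --project=my-project
printf 'super-secret-value' | gcloud secrets versions add MySetting \
  --data-file=- --project=my-project
```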
&lt;h2&gt;
  
  
  SecretManagerConfigurationExtensions
&lt;/h2&gt;

&lt;p&gt;Finally, let's add a convenience extension method to easily add our new configuration provider to the ASP.NET Core application.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;

&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;static&lt;/span&gt; &lt;span class="n"&gt;IConfigurationBuilder&lt;/span&gt; &lt;span class="nf"&gt;AddGoogleSecretsManager&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;this&lt;/span&gt; &lt;span class="n"&gt;IConfigurationBuilder&lt;/span&gt; &lt;span class="n"&gt;configurationBuilder&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;configurationBuilder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Add&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;SecretManagerConfigurationSource&lt;/span&gt;&lt;span class="p"&gt;());&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;configurationBuilder&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;p&gt;Typically, you'd use this method in the Program.cs file:&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;

&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;static&lt;/span&gt; &lt;span class="n"&gt;IHostBuilder&lt;/span&gt; &lt;span class="nf"&gt;CreateHostBuilder&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;string&lt;/span&gt;&lt;span class="p"&gt;[]&lt;/span&gt; &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
    &lt;span class="n"&gt;Host&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;CreateDefaultBuilder&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ConfigureAppConfiguration&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="n"&gt;_&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;config&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;config&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddGoogleSecretsManager&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
        &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ConfigureWebHostDefaults&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;webBuilder&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;webBuilder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;UseStartup&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;Startup&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;());&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;p&gt;The full source code of the Google Secret Manager Configuration Provider, with more error handling, comments, and tests, can be found in this GitHub repo:&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev.to%2Fassets%2Fgithub-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/i-b1" rel="noopener noreferrer"&gt;
        i-b1
      &lt;/a&gt; / &lt;a href="https://github.com/i-b1/GoogleSecretManagerConfigurationProvider" rel="noopener noreferrer"&gt;
        GoogleSecretManagerConfigurationProvider
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      Google Secret Manager Configuration Provider
    &lt;/h3&gt;
  &lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;More than that, you can download the &lt;a href="https://www.nuget.org/packages/IB.Google.SecretManager.ConfigurationProvider/" rel="noopener noreferrer"&gt;IB.Google.SecretManager.ConfigurationProvider NuGet package&lt;/a&gt; and use it freely in your projects.&lt;/p&gt;

&lt;p&gt;And that is essentially everything needed to implement a custom configuration provider. Now you can start using Google Secret Manager to store your application settings more securely, simplify your CD pipelines by removing those variables from different stages, and stop worrying about those pesky cat hackers.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ljfwvfv43kekas0qdqi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ljfwvfv43kekas0qdqi.png" alt="cat hacker"&gt;&lt;/a&gt;&lt;sup&gt;(Image credit: iridi/Getty Images)&lt;/sup&gt;&lt;/p&gt;

</description>
      <category>secretmanager</category>
      <category>configurationprovider</category>
      <category>aspnetcore</category>
      <category>appengine</category>
    </item>
    <item>
      <title>A brief history of application architecture (of the 21st century)</title>
      <dc:creator>Igor Bertnyk</dc:creator>
      <pubDate>Tue, 22 Jun 2021 18:06:44 +0000</pubDate>
      <link>https://dev.to/ib1/a-brief-history-of-application-architecture-of-the-21st-century-23d4</link>
      <guid>https://dev.to/ib1/a-brief-history-of-application-architecture-of-the-21st-century-23d4</guid>
      <description>&lt;h2&gt;
  
  
  Table of contents
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;N-Tier Architecture&lt;/li&gt;
&lt;li&gt;Ports &amp;amp; Adapters&lt;/li&gt;
&lt;li&gt;Onion Architecture&lt;/li&gt;
&lt;li&gt;CQRS&lt;/li&gt;
&lt;li&gt;Beyond&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  N-Tier Architecture &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;Admittedly created long before the year 2000, this type of architecture became popular in web applications around that time.&lt;/p&gt;

&lt;h4&gt;
  
  
  Tiers and Layers
&lt;/h4&gt;

&lt;p&gt;Layers are a way to separate responsibilities and manage dependencies. Each layer has a specific responsibility. A higher layer can use services in a lower layer, but not the other way around.&lt;br&gt;
Tiers are physically separated, running on separate machines. A tier can call another tier directly, or use asynchronous messaging (a message queue).&lt;/p&gt;

&lt;p&gt;A typical app consists of three layers:&lt;br&gt;
The Presentation Layer holds the part the user interacts with, e.g. a Web API.&lt;br&gt;
The Business Logic Layer holds all the logic related to the business requirements.&lt;br&gt;
The Data Access Layer usually holds ORMs to access the database.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ipavjnrjyp4arfza37u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ipavjnrjyp4arfza37u.png" alt="3 Tiers"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  More complex implementation
&lt;/h4&gt;

&lt;p&gt;An application is not restricted to three tiers. &lt;br&gt;
Although each layer might be hosted in its own tier, that's not required. Several layers might be hosted on the same tier. &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7elf3et2vbrobchatos6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7elf3et2vbrobchatos6.png" alt="many tiers"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Benefits and Disadvantages
&lt;/h4&gt;

&lt;p&gt;When to use&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Simple web applications.&lt;/li&gt;
&lt;li&gt;Migrating an on-premises application to Azure with minimal refactoring.&lt;/li&gt;
&lt;li&gt;Unified development of on-premises and cloud applications.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Challenges&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;It's easy to end up with a middle tier that just does CRUD operations on the database, adding extra latency without doing any useful work.&lt;/li&gt;
&lt;li&gt;Monolithic design prevents independent deployment of features.&lt;/li&gt;
&lt;li&gt;Makes it difficult for developers to change an application and for operations teams to scale the application up and down to match demand.&lt;/li&gt;
&lt;li&gt;The database is usually the core of the entire application, i.e. it is the only layer that doesn’t have to depend on anything else. Like in a Jenga tower, any small change in the Business Logic layer or Data Access layer may prove dangerous to the integrity of the entire application. &lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Ports &amp;amp; Adapters &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;h4&gt;
  
  
  History
&lt;/h4&gt;

&lt;p&gt;Ports and Adapters, also known as Hexagonal Architecture, is a popular architecture invented by Alistair Cockburn in 2005.&lt;br&gt;
&lt;em&gt;"Allow an application to equally be driven by users, programs, automated test or batch scripts, and to be developed and tested in isolation from its eventual run-time devices and databases."&lt;/em&gt; &lt;br&gt;
This is one of the many forms of DDD (Domain-Driven Design) architecture.&lt;/p&gt;

&lt;h4&gt;
  
  
  Motivation
&lt;/h4&gt;

&lt;p&gt;The idea of Ports and Adapters is that the application is central to your system. All the inputs and outputs reach or leave the core of the application through a port. This port isolates the application from external technologies, tools, and delivery mechanisms.&lt;/p&gt;

&lt;h4&gt;
  
  
  Benefits
&lt;/h4&gt;

&lt;p&gt;Using this port/adapter design, with our application in the centre of the system, allows us to keep the application isolated from the implementation details like ephemeral technologies, tools and delivery mechanisms, making it easier and faster to test and to replace implementations.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2qwi2fdkqqhzi1511ro1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2qwi2fdkqqhzi1511ro1.png" alt="Ports and Adapters"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Components
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Core
A place where the business logic of the application happens&lt;/li&gt;
&lt;li&gt;Ports
Ports represent the boundaries of the application (usually interfaces)&lt;/li&gt;
&lt;li&gt;Inbound Ports
Communication points between the outside world and the application core&lt;/li&gt;
&lt;li&gt;Outbound Ports
These are the interfaces the core needs to communicate with the outside world&lt;/li&gt;
&lt;li&gt;Adapters
Implementations of the ports&lt;/li&gt;
&lt;/ul&gt;
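&lt;p&gt;A minimal sketch of these components in C# (the names are illustrative, not from any specific framework): the core service depends only on port interfaces, and an adapter supplies the implementation at the edge.&lt;/p&gt;

```csharp
// Outbound port: an interface the core needs from the outside world.
public interface IOrderRepository
{
    void Save(string orderId);
}

// Inbound port: how the outside world drives the core.
public interface IPlaceOrder
{
    void Place(string orderId);
}

// Core: pure business logic, no references to databases or frameworks.
public class OrderService : IPlaceOrder
{
    private readonly IOrderRepository _repository;

    public OrderService(IOrderRepository repository)
    {
        _repository = repository;
    }

    public void Place(string orderId)
    {
        // Business rules would live here; persistence goes through the port.
        _repository.Save(orderId);
    }
}

// Adapter: a concrete implementation of the outbound port. In production
// this might wrap a database; in tests, an in-memory store.
public class ConsoleOrderRepository : IOrderRepository
{
    public void Save(string orderId)
    {
        System.Console.WriteLine("Saved order " + orderId);
    }
}
```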

&lt;h2&gt;
  
  
  Onion Architecture &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;The Onion Architecture was introduced by Jeffrey Palermo in 2008:&lt;br&gt;
&lt;a href="https://jeffreypalermo.com/2008/07/the-onion-architecture-part-1/" rel="noopener noreferrer"&gt;https://jeffreypalermo.com/2008/07/the-onion-architecture-part-1/&lt;/a&gt;&lt;br&gt;
This is an evolution of Ports &amp;amp; Adapters Architecture.&lt;/p&gt;

&lt;p&gt;The idea of the Onion Architecture is to place the Domain and Services Layers at the center of your application, and externalize the Presentation and Infrastructure.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7whzyteakg09tpc1ptc7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7whzyteakg09tpc1ptc7.png" alt="Onion Architecture"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Layers (rings) : Core
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Core
The Domain and Application Layers form the core of the application. These layers do not depend on any other layers&lt;/li&gt;
&lt;li&gt;Domain Model
The center of the architecture. It contains all the domain entities, which have no dependencies of any kind&lt;/li&gt;
&lt;li&gt;Domain Services
Responsible for defining the interfaces needed to store and retrieve domain objects&lt;/li&gt;
&lt;li&gt;Application Services
This layer opens the door to the core of the onion. It can also hold business logic for an entity. Service interfaces are kept separate from their implementations, with loose coupling and separation of concerns in mind.
The core layers should never depend on any other layer.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Outer Layers
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;UI / Test Layer
The outermost layer, holding peripheral concerns such as the UI and tests. For a web application, it is the Web API or unit test project&lt;/li&gt;
&lt;li&gt;Infrastructure
This ring holds the implementations of the core interfaces, wired up through dependency injection: the architecture keeps the interfaces in the center, while their implementations sit at the edge of the application services ring.
Infrastructure can be anything: an Entity Framework Core layer for database access, an email notification sender, a messaging system, etc.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Benefits and Drawbacks
&lt;/h4&gt;

&lt;p&gt;Benefits&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Layers are connected through interfaces; implementations are provided at run time.&lt;/li&gt;
&lt;li&gt;The application architecture is built on top of a domain model.&lt;/li&gt;
&lt;li&gt;All external dependencies, such as database access and service calls, live in the outer layers.&lt;/li&gt;
&lt;li&gt;Inner layers have no dependencies on outer layers.&lt;/li&gt;
&lt;li&gt;Couplings point towards the center.&lt;/li&gt;
&lt;li&gt;Flexible, sustainable, and portable architecture.&lt;/li&gt;
&lt;li&gt;Quick to test, because the application core does not depend on anything.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Drawbacks&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Not easy for beginners to understand&lt;/li&gt;
&lt;li&gt;It is not always clear how to split responsibilities between layers&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Command Query Responsibility Segregation (CQRS)
&lt;/h2&gt;

&lt;h4&gt;
  
  
  Context and problem
&lt;/h4&gt;

&lt;p&gt;In traditional architectures, the same data model is used to query and update a database. That’s simple and works well for basic CRUD operations. In more complex applications, however, this approach can become unwieldy. For example, on the read side, the application may perform many different queries, returning data transfer objects (DTOs) with different shapes. Object mapping can become complicated. On the write side, the model may implement complex validation and business logic. As a result, you can end up with an overly complex model that does too much.&lt;br&gt;
Read and write workloads are often asymmetrical, with very different performance and scale requirements.&lt;br&gt;
There is often a mismatch between the read and write representations of the data, such as additional columns or properties.&lt;/p&gt;

&lt;h4&gt;
  
  
  Solution
&lt;/h4&gt;

&lt;p&gt;CQRS separates reads and writes into different models, using commands to update data and queries to read data.&lt;br&gt;
Commands should be task-based, rather than data-centric. (“Book hotel room,” not “set ReservationStatus to Reserved.”) Commands may be placed on a queue for asynchronous processing, rather than being processed synchronously.&lt;br&gt;
Queries never modify the database. A query returns a DTO that does not encapsulate any domain knowledge.&lt;/p&gt;
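&lt;p&gt;As a small illustration (the type names are hypothetical), the "Book hotel room" example above could be modeled as a task-based command, while the read side returns a plain DTO:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;using System;

// Task-based command: expresses intent ("book a room"),
// not a data update ("set ReservationStatus to Reserved").
public record BookHotelRoomCommand(int RoomId, DateTime CheckIn, DateTime CheckOut);

// Read-side DTO: carries data only, with no domain knowledge or behavior.
public record RoomAvailabilityDto(int RoomId, DateTime Date, bool IsAvailable);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;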

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Typical&lt;/th&gt;
&lt;th&gt;CQRS&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd45a87sjc2ln4oqx1mml.png" alt="typical"&gt;&lt;/td&gt;
&lt;td&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiyg4rcz2tvd3egjuyyjl.png" alt="CQRS"&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;sup&gt;&lt;a href="https://docs.microsoft.com/en-us/azure/architecture/patterns/cqrs" rel="noopener noreferrer"&gt;https://docs.microsoft.com/en-us/azure/architecture/patterns/cqrs&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0f6la53p5h6t7i61mi86.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0f6la53p5h6t7i61mi86.png" alt="CQRS write side"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  MediatR
&lt;/h4&gt;

&lt;p&gt;MediatR is an open-source implementation of the mediator pattern. It allows you to compose messages and to create and listen for events, using synchronous or asynchronous patterns.&lt;/p&gt;

&lt;p&gt;See how a CQRS implementation with MediatR in ASP.NET Core significantly simplifies the controller's code.&lt;br&gt;
&lt;strong&gt;Before&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;

&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;HttpPost&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="n"&gt;Task&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;IHttpActionResult&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;Register&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;RegisterBindingModel&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
   &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;user&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="n"&gt;User&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
       &lt;span class="n"&gt;UserName&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Email&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
       &lt;span class="n"&gt;Email&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Email&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
       &lt;span class="n"&gt;DateOfBirth&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;DateTime&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Parse&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;DateOfBirth&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ToString&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;CultureInfo&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;InvariantCulture&lt;/span&gt;&lt;span class="p"&gt;)),&lt;/span&gt;
       &lt;span class="n"&gt;PhoneNumber&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PhoneNumber&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
       &lt;span class="n"&gt;Name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Name&lt;/span&gt;
   &lt;span class="p"&gt;};&lt;/span&gt;
   &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;UserManager&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;CreateAsync&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;user&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Password&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
   &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(!&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Succeeded&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;GetErrorResult&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
   &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;Ok&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;After&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;

&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;HttpPost&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="n"&gt;Task&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;IHttpActionResult&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;Register&lt;/span&gt; &lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="n"&gt;FromBody&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="n"&gt;CreateUserCommand&lt;/span&gt; &lt;span class="n"&gt;command&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
   &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;Ok&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;Mediator&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Send&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;command&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
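&lt;p&gt;The &lt;code&gt;Mediator.Send&lt;/code&gt; call dispatches the command to a matching handler registered with MediatR. A sketch of such a handler might look like this (the handler body is illustrative; the actual implementation lives in the demo repository):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;// Hypothetical handler: MediatR routes CreateUserCommand here,
// keeping the user-creation logic out of the controller.
public class CreateUserCommandHandler : IRequestHandler&amp;lt;CreateUserCommand, IdentityResult&amp;gt;
{
    private readonly UserManager&amp;lt;User&amp;gt; _userManager;

    public CreateUserCommandHandler(UserManager&amp;lt;User&amp;gt; userManager)
    {
        _userManager = userManager;
    }

    public Task&amp;lt;IdentityResult&amp;gt; Handle(CreateUserCommand command, CancellationToken cancellationToken)
    {
        var user = new User { UserName = command.Email, Email = command.Email };
        return _userManager.CreateAsync(user, command.Password);
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;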

&lt;p&gt;The source of the demo can be downloaded from  &lt;a href="https://github.com/i-b1/OnionArchitectureDemo" rel="noopener noreferrer"&gt;https://github.com/i-b1/OnionArchitectureDemo&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Next Steps
&lt;/h2&gt;

&lt;h4&gt;
  
  
  Event Sourcing
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://docs.microsoft.com/en-us/azure/architecture/patterns/event-sourcing" rel="noopener noreferrer"&gt;https://docs.microsoft.com/en-us/azure/architecture/patterns/event-sourcing&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Designing Microservices:
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://docs.microsoft.com/en-us/dotnet/architecture/microservices/" rel="noopener noreferrer"&gt;https://docs.microsoft.com/en-us/dotnet/architecture/microservices/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Thank you for reading, next step would be to consider how Onion stacks up against CATs (Cloud Application Technologies???) and microservices.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fflodvy9rxg4r4ecfqeef.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fflodvy9rxg4r4ecfqeef.png" alt="Onion vs Cat"&gt;&lt;/a&gt;&lt;br&gt;
&lt;sup&gt;Photo by &lt;a href="https://unsplash.com/@halacious?utm_source=unsplash&amp;amp;utm_medium=referral&amp;amp;utm_content=creditCopyText" rel="noopener noreferrer"&gt;HalGatewood.com&lt;/a&gt; on &lt;a href="https://unsplash.com/s/photos/diagram?utm_source=unsplash&amp;amp;utm_medium=referral&amp;amp;utm_content=creditCopyText" rel="noopener noreferrer"&gt;Unsplash&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;

</description>
      <category>architecture</category>
      <category>onion</category>
      <category>cqrs</category>
      <category>ddd</category>
    </item>
    <item>
      <title>How to pass Google Cloud Architect Certification exam</title>
      <dc:creator>Igor Bertnyk</dc:creator>
      <pubDate>Wed, 15 Jul 2020 19:07:16 +0000</pubDate>
      <link>https://dev.to/ib1/how-i-passed-google-cloud-architect-certification-exam-3i4a</link>
      <guid>https://dev.to/ib1/how-i-passed-google-cloud-architect-certification-exam-3i4a</guid>
      <description>&lt;p&gt;There are multiple articles out there about &lt;a href="https://cloud.google.com/certification/cloud-architect"&gt;Google Cloud Professional Architect Certification&lt;/a&gt;, so I'll just describe my personal experience and impressions from this exam and my preparations to passing this certification.&lt;/p&gt;

&lt;h2&gt;
  
  
  GCP Architect Exam
&lt;/h2&gt;

&lt;p&gt;On the outside, this exam is a typical multiple-choice test, with some questions having more than one answer. There is no lab environment where you need to perform actions, which some other providers, such as Cloudera and Microsoft, have started to incorporate into their tests. The test has 50 questions and you get 2 hours to complete them.&lt;br&gt;
The biggest difficulty is that many questions revolve around three &lt;strong&gt;case studies&lt;/strong&gt;, which represent fictitious scenarios of companies migrating to GCP. The intent, I think, is to evaluate how you would perform in the Cloud Architect role, rather than your knowledge of the Google Cloud platform itself. That makes you balance functional and non-functional requirements and consider cost, existing infrastructure and processes, regional vs global presence, scalability, and latency, among others. The answers could be different depending on what is emphasized in the particular question.&lt;br&gt;
Perhaps the most peculiar thing about the exam is that it does not provide any feedback, so it is difficult to judge which areas need improvement.&lt;br&gt;
Previously, you could only take this exam in one of the test centers, but now it is available online too, "thanks" to the COVID-19 situation in the world. Many test centers are starting to reopen though, so you can choose what is most suitable for you. Be warned that if you are taking it from home there are requirements for your home environment, webcam, and computer that you need to check, and a dedicated proctor will watch over you during the exam. Logistically, it is simpler to take it at a test center. &lt;/p&gt;

&lt;h2&gt;
  
  
  Exam preparation
&lt;/h2&gt;

&lt;p&gt;Coursera's &lt;a href="https://www.coursera.org/learn/preparing-cloud-professional-cloud-architect-exam/home/welcome"&gt;Preparing for the Google Cloud Professional Cloud Architect Exam&lt;/a&gt; gives a good overview of the certification and provides some good points for your preparation. It is also free. This course is a part of a larger "Cloud Architecture with Google Cloud Professional Certificate" program, however I found it too bloated and there are better paid options from other learning providers.&lt;br&gt;
The one that I liked is &lt;a href="https://linuxacademy.com/course/google-cloud-certified-professional-cloud-architect"&gt;Google Cloud Certified Professional Cloud Architect&lt;/a&gt; by Matthew Ulasien. It is just over 30 hours long and includes access to 15 labs that will help cement your knowledge of GCP. There is also a sample test you can try multiple times with different questions, and a Lucidchart multi-page diagram to help you navigate the lessons and GCP resources. Overall, this course provides excellent value for the money.&lt;br&gt;
Speaking of labs, &lt;a href="https://www.qwiklabs.com"&gt;Qwiklabs&lt;/a&gt; has a vast catalog of hands-on labs on almost all aspects of GCP and this resource is invaluable if you want to dive into more practical usage of the Google Cloud platform.&lt;br&gt;
Finally, I found that the &lt;a href="https://www.amazon.com/Professional-Cloud-Architect-Certification-enterprise-grade-ebook/dp/B07VDKQPNT"&gt;Professional Cloud Architect – Google Cloud Certification Guide&lt;/a&gt; book is very well structured and, despite several insignificant errors, provides complete information necessary to pass the exam and will help to fill the gaps in online lessons.&lt;br&gt;
Pay special attention to VMs, images and managed instance groups, storage options, Kubernetes, &lt;em&gt;gcloud&lt;/em&gt; and &lt;em&gt;kubectl&lt;/em&gt; commands, networking options and connection to on-premises data and, of course, the case studies.&lt;/p&gt;

&lt;h2&gt;
  
  
  In conclusion
&lt;/h2&gt;

&lt;p&gt;A word of encouragement to all candidates: do not be afraid, the exam is not as difficult as it seems, and with about 2 months of studying, following the guide above, it is possible to prepare for the test. Of course, practical experience helps immensely, and I recommend getting a GCP free tier account to explore the platform in detail, as well as going through the related Qwiklabs.&lt;br&gt;
Lastly, don’t forget to take the &lt;a href="https://cloud.google.com/certification/practice-exam/cloud-architect"&gt;practice exam&lt;/a&gt;!&lt;/p&gt;

&lt;p&gt;This is &lt;a href="https://www.credential.net/39f5a712-a875-46de-a66b-5f3df6daec92#gs.aw0nga"&gt;my certificate&lt;/a&gt; and I wish all the best to everyone preparing for this exam. Don't worry, here is a cat for you:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mu_gdSCk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/8ns559iesol5r7t0ym39.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mu_gdSCk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/8ns559iesol5r7t0ym39.jpeg" alt="fabio lamanna - iStock - Getty Images"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>gcp</category>
      <category>googlecloud</category>
      <category>architecture</category>
      <category>certification</category>
    </item>
    <item>
      <title>Azure DevOps Multi-Stage YAML Pipelines  for Google Firebase Functions</title>
      <dc:creator>Igor Bertnyk</dc:creator>
      <pubDate>Mon, 15 Jun 2020 20:29:31 +0000</pubDate>
      <link>https://dev.to/ib1/azure-devops-multi-stage-yaml-pipelines-for-google-firebase-functions-2oi9</link>
      <guid>https://dev.to/ib1/azure-devops-multi-stage-yaml-pipelines-for-google-firebase-functions-2oi9</guid>
      <description>&lt;p&gt;In one of my previous posts I've discussed &lt;a href="https://dev.to/ib1/azure-devops-recipe-deploying-google-cloud-function-to-gcp-22l3"&gt;how to deploy Google Cloud functions&lt;/a&gt; using Azure Pipelines. However many of the developers are using Firebase Functions, which are the same Google Cloud Functions in disguise, but deployed using Firebase CLI. I'd like to see how we can do it in the context of Azure DevOps. But this time let's make it closer to reality.&lt;br&gt;
Usually we'd have many environments: Development, Staging, Production, and deployment to each one is governed by multiple rules.  UAT tests need to be passed, various approvals received etc. In Azure DevOps this problem is solved using "stages". For a long time "classic" Release Pipelines supported stages that could target deployments to different environments. A fairly new feature is &lt;a href="https://docs.microsoft.com/en-us/azure/devops/pipelines/process/stages?view=azure-devops&amp;amp;tabs=yaml" rel="noopener noreferrer"&gt;multi-stage YAML pipelines&lt;/a&gt;. I wanted to evaluate this feature from several standpoints: whether it supports approvals for stages, can I see where the current build is deployed, can I re-deploy a previous build, and if unit and coverage tests results can be displayed. &lt;br&gt;
With that in mind, let's build a multi-stage pipeline to deploy a Node.js-based Firebase function to GCP.&lt;/p&gt;
&lt;h2&gt;
  
  
  Deploying Firebase functions
&lt;/h2&gt;

&lt;p&gt;In comparison to pure Google Cloud Functions, deploying Firebase functions is a breeze. We only need to install the &lt;strong&gt;firebase-tools&lt;/strong&gt; NPM package globally and issue one command, as you'll see below in the pipeline source. The only things we need are a GCP project name and a Firebase authentication token.&lt;/p&gt;
&lt;h3&gt;
  
  
  Firebase Token
&lt;/h3&gt;

&lt;p&gt;This command generates a token that can be used in CI/CD pipelines:&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
firebase login:ci

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
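&lt;p&gt;With the token in hand, a typical deployment command from a CI agent looks like this (the project name and the token variable are placeholders):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm install -g firebase-tools
firebase deploy --only functions --project my-gcp-project --token "$FIREBASE_TOKEN"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;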
&lt;p&gt;With that information we are ready to move to the Azure DevOps.&lt;/p&gt;
&lt;h2&gt;
  
  
  Azure Multi-Stage Pipeline
&lt;/h2&gt;

&lt;p&gt;Here is the full pipeline source, and then we can discuss its different parts.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
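&lt;p&gt;In case the embedded gist does not render in your feed reader, here is a condensed sketch of such a multi-stage pipeline (the stage names, artifact name, and variable names are illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;trigger:
- master

stages:
- stage: Build
  jobs:
  - job: BuildAndTest
    steps:
    - script: npm install
    - script: npm test
    - publish: $(System.DefaultWorkingDirectory)/functions
      artifact: drop

- stage: DeployDev
  dependsOn: Build
  jobs:
  - deployment: DeployFirebase
    environment: firebase-dev      # matches a name on the "Environments" page
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current      # fetch the exact package the Build stage produced
            artifact: drop
          - script: npm install -g firebase-tools
          - script: firebase deploy --only functions --project $(firebase-project) --token $(firebase-token)
            workingDirectory: $(Pipeline.Workspace)/drop
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;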


&lt;p&gt;As you can see, the pipeline hierarchy is Stage/Job/Steps/Task.&lt;br&gt;
The first stage builds and tests the project and produces a deployment artifact. There is not much of interest here, other than that this stage represents a typical Continuous Integration pipeline in its entirety. The next stages correspond to the Continuous Deployment part.&lt;br&gt;
Here things become more exciting. We can see that the "deployment job" can have several additional parameters, such as &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;environment name - corresponds to the names on the "Environments" page. We'll talk about it later in the "Approvals" section.&lt;/li&gt;
&lt;li&gt;strategy - e.g. &lt;em&gt;runOnce&lt;/em&gt; or &lt;em&gt;matrix&lt;/em&gt;, useful in single- or multi-platform deployments.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For every environment we can create a separate stage and provide different variables and deployment options. One of the first steps is to download the artifact that was published by the "build" stage. That way we ensure that each of the deployment stages uses exactly the same build package.&lt;/p&gt;

&lt;h3&gt;
  
  
  Variables
&lt;/h3&gt;

&lt;p&gt;Very often we don't want to hard-code environment-specific parameters in the pipeline. That is where variables come into play.&lt;br&gt;
Variables can be specified in several ways. One of them is "pipeline" variables. When editing the pipeline there is a "Variables" tab, as you can see in the screenshot below, where you can add and edit variables. In the pipeline you can refer to them via a special syntax, for example &lt;strong&gt;$(firebase-project)&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fjdnsg8rqbv183vr1gjv3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fjdnsg8rqbv183vr1gjv3.png" alt="Pipeline variables"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Another way is specifying variables in the "Variables Groups" on the "Library" page. The advantage of this method is that we can secure the access to different groups, so, for example, developers would not be able to see the Production variables.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fjd2bnnibttcxhjwrjnmv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fjd2bnnibttcxhjwrjnmv.png" alt="Variables Groups"&gt;&lt;/a&gt;&lt;/p&gt;
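&lt;p&gt;In YAML, a variable group from the Library is referenced per pipeline or per stage; the group name below is illustrative:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;- stage: DeployProduction
  variables:
  - group: firebase-prod      # defined on the "Library" page
  - name: environment-name    # an inline variable for this stage only
    value: production
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;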

&lt;h3&gt;
  
  
  Tackling Approvals
&lt;/h3&gt;

&lt;p&gt;While in "Classic" Release pipelines we are able to add approvals to the stage itself, for YAML pipelines they are handled via the "Environments" page. &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fmptiup0y3fb84xc06f69.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fmptiup0y3fb84xc06f69.png" alt="Environments"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can add approvers and set a minimum number of approvers here.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Frd58a0qbtzsro3vjobgy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Frd58a0qbtzsro3vjobgy.png" alt="Approvers"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Viewing deployments
&lt;/h3&gt;

&lt;p&gt;On the pipeline page we can visually see how many stages were successfully executed, as well as view test results and generated artifacts.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fjt2c85qts8uobldvhq9n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fjt2c85qts8uobldvhq9n.png" alt="Build results"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Final words
&lt;/h2&gt;

&lt;p&gt;In this post we were able to use new YAML-based multi-stage pipelines to produce a combined CI/CD pipeline for Google Firebase functions deployment. We achieved all of the goals:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;unit testing&lt;/li&gt;
&lt;li&gt;test results&lt;/li&gt;
&lt;li&gt;reproducible builds&lt;/li&gt;
&lt;li&gt;environment-specific variables&lt;/li&gt;
&lt;li&gt;deployment approvals&lt;/li&gt;
&lt;li&gt;viewing where the build is deployed&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;However, YAML pipelines still do not have parity with the "classic" release pipelines. For example, the pipeline is stuck in "Pending" status until the final stage is approved and deployed, which feels awkward and leads to a ridiculous displayed build time of one, two, or more days. Also, at a glance, we can only see which stages were completed; we need to drill down to view the environment.&lt;br&gt;
Hopefully, the multi-stage pipelines will grow and mature as the cat on the image below did.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F8an4i5uqsy4k6ifj3zga.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F8an4i5uqsy4k6ifj3zga.jpg" alt="Cat stages"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>firebase</category>
      <category>gcp</category>
    </item>
    <item>
      <title>Protecting Azure Functions with API Management Service</title>
      <dc:creator>Igor Bertnyk</dc:creator>
      <pubDate>Tue, 12 May 2020 03:31:45 +0000</pubDate>
      <link>https://dev.to/ib1/protecting-azure-functions-with-api-management-service-53el</link>
      <guid>https://dev.to/ib1/protecting-azure-functions-with-api-management-service-53el</guid>
      <description>&lt;p&gt;Cloud functions are great. Who does not like automatic scalability and hands-off infrastructure management? HTTP-triggered Azure functions however are exposed to the public Internet. So if we want to use them, for example, as a part of a microservice-based application, we'd want to take steps to secure those functions from malicious attacks. &lt;br&gt;
One way would be to bake authentication and authorization into the function itself. However, this does not prevent public access to the HTTP endpoint. And even though the function would be secured, Azure will still charge for the execution, making it a target of a "denial-of-wallet" attack.&lt;br&gt;
I wish that there was an option to create a function in something like Google Cloud Virtual Private Network, or even Azure's own VNet, exposing only internal IPs by default. Well, you can use a dedicated Azure App Service Environment (ASE), but the pricing makes it prohibitive, and it greatly reduces the flexibility.&lt;br&gt;
One of the solutions, as it often is in software engineering, is to insert another level of indirection. Azure API Management Service can help to secure your APIs. &lt;br&gt;
In a high-level view, as seen in the cover image, APIM stands between the client and the function endpoint, managing authentication and access, while the function itself is protected by Azure Active Directory. In addition, API Management comes with a vast number of features, like quota limits, API documentation, integration with payment services, and many others, which is impossible to fit into one article. And that is, anyway, not what the title says.&lt;br&gt;
Let's get to the subject and do step-by-step configuration of the API Management and Azure Function, as there are several tricky steps that better not to be skipped.&lt;/p&gt;
&lt;h2&gt;
  
  
  Enable APIM Managed Identity
&lt;/h2&gt;

&lt;p&gt;The first thing that we need to do is to enable the APIM Managed Identity. Well, the first thing is to create an instance of the API Management Service, but it can be easily provisioned in the Azure Portal. Beware though that provisioning takes up to an hour. This and subsequent steps we will perform in the Azure Portal. Navigate to your APIM instance, select the "Managed Identity" menu, and enable the checkbox.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fccylmypt97cbzsgl6suq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fccylmypt97cbzsgl6suq.png" alt="APIM Management Identity"&gt;&lt;/a&gt;&lt;/p&gt;
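&lt;p&gt;Once the managed identity exists, APIM can use it to acquire an AAD token for calls to the backend function, via an inbound policy along these lines (the &lt;em&gt;resource&lt;/em&gt; value is a placeholder for the Application ID of the AAD app we will create in the next step):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&amp;lt;policies&amp;gt;
    &amp;lt;inbound&amp;gt;
        &amp;lt;base /&amp;gt;
        &amp;lt;!-- Attach a token obtained with APIM's managed identity --&amp;gt;
        &amp;lt;authentication-managed-identity resource="{aad-application-id}" /&amp;gt;
    &amp;lt;/inbound&amp;gt;
&amp;lt;/policies&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;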
&lt;h2&gt;
  
  
  Secure Azure Functions with Azure Active Directory
&lt;/h2&gt;

&lt;p&gt;Having finished the first step, let's go to the Azure Function you want to secure. (You've already created one, right?) We can safely disable existing levels of protection, like function keys, and make the function anonymous. Now we need to integrate it with Azure AD. For that, click on the (1) "Authentication/Authorization" link on the Platform Features tab. It will open a new page. We will return to the previous one and use link number (2), "API Management", when we finish with the Active Directory steps. &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fs5irskals0udleqieg3d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fs5irskals0udleqieg3d.png" alt="Azure Function Settings"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Enable App Service Authentication and select "Log in using AAD" from the dropdown.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F0fzmlx7066sjz4ser20z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F0fzmlx7066sjz4ser20z.png" alt="Authentication"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's choose the "Express" setting to create a new Active Directory app. Give it a name, and remember it, as we'll need it in the next step.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F7u8zk8b4l4lvp5wd9pb2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F7u8zk8b4l4lvp5wd9pb2.png" alt="Active Directory"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Finding AAD Application ID
&lt;/h2&gt;

&lt;p&gt;Once an Azure AD application is created, we need to find its Application ID.&lt;br&gt;
In the top search bar, search for "Azure Active Directory" and select "Enterprise Applications" from the left menu. Now look up your newly created AD app (you can filter by name) and copy its Application ID.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ftefpd6l2gdjni6mu2ulo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ftefpd6l2gdjni6mu2ulo.png" alt="AAD Application ID"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Importing the API into the API Management Service
&lt;/h2&gt;

&lt;p&gt;We are done with Active Directory. At this point you can check that your function is no longer callable at its direct URL; you should get an Unauthorized response. Now we need to expose the function again, and the easiest way to do that is from the Function's Platform features page (link #2, "API Management", from the image above that we promised to return to). &lt;br&gt;
Select an existing instance of APIM and "Create New API", then click the "Link API" button. This imports your function's endpoints into APIM, and you'll be redirected to the corresponding APIM page.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fw4y6spwfc6mlw8b02nz2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fw4y6spwfc6mlw8b02nz2.png" alt="Importing API"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Applying Inbound Policy to the API
&lt;/h2&gt;

&lt;p&gt;Finally we are approaching one of the most important steps: applying an inbound policy to the API that we imported from the Azure Function.&lt;br&gt;
To call the function successfully via API Management, an inbound policy rule must insert an authorization token (obtained through the APIM managed identity) that can be verified against our Active Directory app. &lt;br&gt;
To modify the inbound policy, select the API and click the Policies "&amp;lt;/&amp;gt;" link. It will open an XML editor. Insert the lines below after the "base" policy, replace [Azure AD Application ID] with the value we found earlier, and save.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fc5bjiqgny9vxbbk2psb5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fc5bjiqgny9vxbbk2psb5.png" alt="API Policy"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Flqb17n4hszfzer605fr1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Flqb17n4hszfzer605fr1.png" alt="Edit Policy"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  &amp;lt;authentication-managed-identity resource="[Paste Azure AD Application ID]" ignore-error="false" /&amp;gt;
  &amp;lt;set-header name="Ocp-Apim-Subscription-Key" exists-action="delete" /&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Checking Azure Active Directory Integration
&lt;/h2&gt;

&lt;p&gt;Let's test our integration. We can do it right in the APIM console.&lt;br&gt;
Select an API endpoint and go to the Test tab. Here we can enter parameters, if your function requires any, and run the test. Hopefully we get a successful result; if not, open the Trace tab and make sure you see the "Managed Identity token is added to the Authorization header" message.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgc22uyafqlpgdyn0d7kl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgc22uyafqlpgdyn0d7kl.png" alt="Trace"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We are done! Or are we? Our function's direct URL is secured, but we have now exposed it via the API Management URL, and it is publicly accessible again. The point is that APIM now gives you the flexibility to apply an authentication method of your choice, be it subscription keys, JWT tokens, OAuth 2.0, OpenID Connect, or integration with third-party providers like Okta or Auth0. But that is a different story and possibly a subject for another post.&lt;br&gt;
One hint though: remember that policies are evaluated in order. So, for example, if you decide to authenticate your API with JWT tokens, a "validate-jwt" policy should come before the "authentication-managed-identity" policy that we implemented here. Otherwise the "Authorization" header will be replaced before it has a chance to be validated.&lt;/p&gt;
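As a sketch of that ordering (the tenant placeholder and audience are illustrative values, not part of this walkthrough):

```xml
<inbound>
    <base />
    <!-- Validate the caller's token FIRST, while the Authorization header
         still holds the client's JWT ({tenant} and the audience are placeholders): -->
    <validate-jwt header-name="Authorization" failed-validation-httpcode="401">
        <openid-config url="https://login.microsoftonline.com/{tenant}/v2.0/.well-known/openid-configuration" />
        <audiences>
            <audience>[Your API audience]</audience>
        </audiences>
    </validate-jwt>
    <!-- Only then overwrite the header with the managed identity token: -->
    <authentication-managed-identity resource="[Azure AD Application ID]" ignore-error="false" />
</inbound>
```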

&lt;h2&gt;
  
  
  Final thoughts
&lt;/h2&gt;

&lt;p&gt;In this article we described a way to secure publicly accessible HTTP Azure Functions with the API Management service and Azure Active Directory. Now you can relax and manage your API with style, like the cat manages his money in the picture below.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Felny77astq1x2k82ehml.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Felny77astq1x2k82ehml.jpg" alt="Cat in Management"&gt;&lt;/a&gt;&lt;br&gt;
&lt;sup&gt;© APTYP_KOK / GETTY IMAGES&lt;/sup&gt;&lt;/p&gt;

</description>
      <category>azure</category>
      <category>api</category>
      <category>azurefunctions</category>
      <category>security</category>
    </item>
    <item>
      <title>Azure DevOps Recipe: Deploying Google Cloud function to GCP</title>
      <dc:creator>Igor Bertnyk</dc:creator>
      <pubDate>Tue, 07 Apr 2020 20:24:34 +0000</pubDate>
      <link>https://dev.to/ib1/azure-devops-recipe-deploying-google-cloud-function-to-gcp-22l3</link>
      <guid>https://dev.to/ib1/azure-devops-recipe-deploying-google-cloud-function-to-gcp-22l3</guid>
      <description>&lt;ul&gt;
&lt;li&gt;Deploying Using Google Cloud Build&lt;/li&gt;
&lt;li&gt;
Deploying Using Azure Pipelines

&lt;ul&gt;
&lt;li&gt;Setting up Google Service Account&lt;/li&gt;
&lt;li&gt;Storing Security Key&lt;/li&gt;
&lt;li&gt;Creating CI/CD pipeline&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Final thoughts&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;It goes without saying that Azure DevOps pipelines and the Azure cloud are a natural fit. A deployment to Azure is streamlined, with many ready-to-use templates and the Azure CLI installed by default on managed agents. However, the reality is that a lot of companies have to deal with a multi-cloud environment. It would be beneficial if we could manage our builds and deployments in one place, no matter which cloud provider is used.&lt;br&gt;
In this recipe we will consider the deployment options for a Google Cloud Function on GCP and walk through the detailed steps of creating a CI/CD pipeline in Azure DevOps. For the purpose of this exercise we assume that you have already developed a function. If not, you can use one of the many tutorials, &lt;a href="https://cloud.google.com/functions/docs/tutorials/http" rel="noopener noreferrer"&gt;like this one&lt;/a&gt;, to create one. &lt;/p&gt;
&lt;h2&gt;
  
  
  &lt;a&gt;&lt;/a&gt;Using Google Cloud Build
&lt;/h2&gt;

&lt;p&gt;One way to deploy a cloud function is to use the native Google Cloud Build. We can set up a &lt;em&gt;connected external&lt;/em&gt; Cloud repository at &lt;a href="https://source.cloud.google.com" rel="noopener noreferrer"&gt;https://source.cloud.google.com&lt;/a&gt; that is automatically synchronized with our main repo. Then we can create a &lt;em&gt;Cloud Build Trigger&lt;/em&gt; that runs a YAML pipeline not dissimilar to Azure's. When a new change is pushed to the Git repo, it is synced to the Google repository and triggers the build and deployment.&lt;br&gt;
There are, however, several issues with that. First of all, an external repository can only be hosted on GitHub or Bitbucket, as seen on the screenshot below. So if your source code is in Azure Repos or anywhere else, you are out of luck.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ff3k9xy412s5flwmx4uch.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ff3k9xy412s5flwmx4uch.png" alt="Google External Repo" width="499" height="251"&gt;&lt;/a&gt;&lt;br&gt;
But most importantly, it moves control and auditing out of Azure DevOps, and that contradicts our goal of keeping everything under one roof.&lt;/p&gt;
&lt;h2&gt;
  
  
  &lt;a&gt;&lt;/a&gt;Using Azure Pipelines
&lt;/h2&gt;

&lt;p&gt;Fortunately, Azure Pipelines is flexible enough to deploy to practically any environment. We will outline the basic steps to do that for Google Cloud Functions and GCP.&lt;/p&gt;
&lt;h3&gt;
  
  
  &lt;a&gt;&lt;/a&gt;Setting up a Service Account
&lt;/h3&gt;

&lt;p&gt;We will need a Google service account to secure the communication between Azure Pipelines and GCP. Following the &lt;a href="https://cloud.google.com/solutions/creating-cicd-pipeline-vsts-kubernetes-engine" rel="noopener noreferrer"&gt;Google documentation&lt;/a&gt;, here is how to do it using Google Cloud Shell. &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Log in to the &lt;a href="https://console.cloud.google.com/" rel="noopener noreferrer"&gt;GCP Console&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Select the project where your function is deployed.&lt;/li&gt;
&lt;li&gt;Activate Cloud Shell.&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Set default configuration values to save some typing. Replace [PROJECT_ID] and [ZONE] with appropriate values.&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight shell"&gt;&lt;code&gt;gcloud config &lt;span class="nb"&gt;set &lt;/span&gt;project &lt;span class="o"&gt;[&lt;/span&gt;PROJECT_ID]
gcloud config &lt;span class="nb"&gt;set &lt;/span&gt;compute/zone &lt;span class="o"&gt;[&lt;/span&gt;ZONE]
&lt;/code&gt;&lt;/pre&gt;




&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Create a Service Account:&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight shell"&gt;&lt;code&gt;gcloud iam service-accounts create azure-pipelines-publisher &lt;span class="nt"&gt;--display-name&lt;/span&gt; 
&lt;span class="s2"&gt;"Azure Pipelines Publisher"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;




&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Assign the Storage Admin IAM role to the service account:&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;PROJECT_NUMBER&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;gcloud projects describe &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="si"&gt;$(&lt;/span&gt;gcloud config get-value core/project&lt;span class="si"&gt;)&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--format&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;'value(projectNumber)'&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;

&lt;span class="nv"&gt;AZURE_PIPELINES_PUBLISHER&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;gcloud iam service-accounts list &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--filter&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"displayName:Azure Pipelines Publisher"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--format&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;'value(email)'&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;

gcloud projects add-iam-policy-binding &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="si"&gt;$(&lt;/span&gt;gcloud config get-value core/project&lt;span class="si"&gt;)&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--member&lt;/span&gt; serviceAccount:&lt;span class="nv"&gt;$AZURE_PIPELINES_PUBLISHER&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--role&lt;/span&gt; roles/storage.admin
&lt;/code&gt;&lt;/pre&gt;




&lt;/li&gt;

&lt;/ol&gt;

&lt;p&gt;We also need to generate and download a service account key to use later in the Azure pipeline. The easiest way is to navigate to the IAM &amp;amp; Admin / Service Accounts menu and select "Edit" on the &lt;em&gt;azure-pipelines-publisher@[PROJECT_ID].iam.gserviceaccount.com&lt;/em&gt; account that we just created. Then create a key as on the screenshot below.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fmd801un1c9d915ynycll.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fmd801un1c9d915ynycll.png" alt="Create Google Service Account Key" width="691" height="434"&gt;&lt;/a&gt;&lt;br&gt;
Keep the file as we are going to use it in a moment.&lt;/p&gt;
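If you prefer to stay in Cloud Shell rather than the portal, the key can also be generated from the command line. The file name here is a placeholder chosen to match the secure file referenced later in the pipeline; [PROJECT_ID] is the same placeholder as above.

```shell
# Create and download a JSON key for the service account we just set up.
gcloud iam service-accounts keys create GoogleServiceAccountKey.json \
  --iam-account azure-pipelines-publisher@[PROJECT_ID].iam.gserviceaccount.com
```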
&lt;h3&gt;
  
  
  &lt;a&gt;&lt;/a&gt;Storing service account key in Azure DevOps
&lt;/h3&gt;

&lt;p&gt;We are pretty much finished with the Google platform; let's switch to Azure DevOps and continue there.&lt;br&gt;
To upload the JSON file, go to the Library page under the Pipelines navigation panel and select the "Secure Files" tab. Here we can add our key to the library.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fepjfxi97kwv6g8zu9f2e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fepjfxi97kwv6g8zu9f2e.png" alt="Upload Key File" width="784" height="329"&gt;&lt;/a&gt;&lt;br&gt;
After the key is uploaded, edit it and toggle "Authorize for all pipelines" so that it can be used in our pipeline.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fa7hz5cfrkuo6mcyk74bq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fa7hz5cfrkuo6mcyk74bq.png" alt="Save Key" width="422" height="342"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  &lt;a&gt;&lt;/a&gt;Creating CI/CD pipeline
&lt;/h3&gt;

&lt;p&gt;Our simple pipeline will deploy one cloud function to GCP. Of course, it can be extended with unit tests and other functions, but we want to show the bare minimum.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;First, let's use the secure key that we uploaded earlier:
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DownloadSecureFile@1&lt;/span&gt;
  &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;authkey&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Download&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Service&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Account&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Key'&lt;/span&gt;
  &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;secureFile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;GoogleServiceAccountKey.json'&lt;/span&gt;
    &lt;span class="na"&gt;retryCount&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;2'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Next, we need a Google Cloud SDK to deploy our function.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;!!!UPDATE 2020-05: Good news: on "ubuntu-latest" hosts the Google Cloud SDK (292.0.0) is installed by default, so you can probably skip this step. See this link for more details: &lt;a href="https://github.com/actions/virtual-environments/blob/master/images/linux/Ubuntu1804-README.md" rel="noopener noreferrer"&gt;Ubuntu1804&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The biggest challenge is that the Google Cloud SDK is not installed on Microsoft-hosted agents, understandably so. There are a couple of ways to install it.&lt;br&gt;
The &lt;a href="https://cloud.google.com/sdk/docs/downloads-apt-get" rel="noopener noreferrer"&gt;official Google documentation&lt;/a&gt; did not work for me right out of the gate, but you can follow the link if you'd like to install the SDK using apt-get. Alternatively, we can get the package directly from the Google download site.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
    &lt;span class="s"&gt;wget https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz&lt;/span&gt;
    &lt;span class="s"&gt;tar zxvf google-cloud-sdk.tar.gz &amp;amp;&amp;amp; ./google-cloud-sdk/install.sh --quiet --usage-reporting=false --path-update=true&lt;/span&gt;
    &lt;span class="s"&gt;PATH="google-cloud-sdk/bin:${PATH}"&lt;/span&gt;
    &lt;span class="s"&gt;gcloud --quiet components update&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;install&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;gcloud&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;SDK'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Finally, we are ready to deploy the function:
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="s"&gt;gcloud auth activate-service-account --key-file $(authkey.secureFilePath)&lt;/span&gt;
    &lt;span class="s"&gt;gcloud functions deploy [FUNCTION_NAME] --runtime nodejs8 --trigger-http --region=[REGION] --project=[PROJECT_ID]&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;deploy&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;cloud&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;function'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;As usual, here is the full source code of the YAML pipeline:&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
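The embedded gist may not render in this feed. Assembled from the snippets earlier in this post (and keeping the bracketed placeholders), a minimal end-to-end pipeline might look like this:

```yaml
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: DownloadSecureFile@1
  name: authkey
  displayName: 'Download Service Account Key'
  inputs:
    secureFile: 'GoogleServiceAccountKey.json'
    retryCount: '2'

# Skip this step if the hosted image already ships the Cloud SDK.
- script: |
    wget https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz
    tar zxvf google-cloud-sdk.tar.gz && ./google-cloud-sdk/install.sh --quiet --usage-reporting=false --path-update=true
    PATH="google-cloud-sdk/bin:${PATH}"
    gcloud --quiet components update
  displayName: 'install gcloud SDK'

- script: |
    gcloud auth activate-service-account --key-file $(authkey.secureFilePath)
    gcloud functions deploy [FUNCTION_NAME] --runtime nodejs8 --trigger-http --region=[REGION] --project=[PROJECT_ID]
  displayName: 'deploy cloud function'
```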



&lt;h2&gt;
  
  
  &lt;a&gt;&lt;/a&gt;Final thoughts
&lt;/h2&gt;

&lt;p&gt;When you work in a multi-cloud environment, it is especially important to consolidate DevOps operations for better control, monitoring, and auditing. Azure DevOps can be one of the answers, as it allows deployment to multiple platforms and integrates many aspects of the software development life cycle into a cohesive, easy-to-use product. As an example, in this recipe we created a sample CI/CD pipeline to deploy a Google Cloud Function to GCP.&lt;/p&gt;

&lt;p&gt;I hope that was useful, here is a cat (in a cloud!) for you.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1hf6f15h8utu72ne1fn6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1hf6f15h8utu72ne1fn6.png" alt="Cat in a cloud" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>gcp</category>
      <category>googlecloudfunction</category>
    </item>
    <item>
      <title>Azure DevOps Recipe: Deploying Azure Logic App using Powershell Script</title>
      <dc:creator>Igor Bertnyk</dc:creator>
      <pubDate>Tue, 31 Mar 2020 17:43:12 +0000</pubDate>
      <link>https://dev.to/ib1/azure-devops-recipe-deploying-azure-logic-app-using-powershell-script-3pma</link>
      <guid>https://dev.to/ib1/azure-devops-recipe-deploying-azure-logic-app-using-powershell-script-3pma</guid>
      <description>&lt;p&gt;Azure Logic Apps is a somewhat unique cloud service that allows to connect your business-critical apps and services, automating your workflows without writing a single line of code.&lt;br&gt;
There are numerous articles out there about how to deploy Azure Logic App with Azure Resource Manager templates, including official Microsoft documentation:&lt;br&gt;
&lt;a href="https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-azure-resource-manager-templates-overview"&gt;Automate deployment for Azure Logic Apps by using Azure Resource Manager templates&lt;/a&gt;&lt;br&gt;
But in a corporate environment these kinds of resources, especially in production, are predefined and secured. Developers often do not have permission to overwrite existing resources, and deploying with ARM templates does exactly that: the template defines the infrastructure, resources, parameters, and other information for provisioning and deploying your logic app. What we need instead is to leave the infrastructure alone and deploy just our logic app and its parameters.&lt;br&gt;
So what do you do when you are dealing with this kind of restricted environment but still want to leverage the power of CI/CD pipelines?&lt;br&gt;
Fortunately, when you create a Logic App using Visual Studio, a sample deployment PowerShell script named "Deploy-AzureResourceGroup.ps1" is created for us too. It uses the Azure PowerShell module, which simplifies the management of Azure cloud services.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--j00y9aAh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/073uov0gbvwebqhm0vs1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--j00y9aAh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/073uov0gbvwebqhm0vs1.png" alt="Deploy-AzureResourceGroup.ps1"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here is the beginning of this script; pay attention to the parameters that we can provide to customize it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="kr"&gt;Param&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;string&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;Parameter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Mandatory&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;$true&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$ResourceGroupLocation&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;string&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$ResourceGroupName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'&amp;lt;YOUR RESOURCE GROUP NAME&amp;gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;switch&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$UploadArtifacts&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;string&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$StorageAccountName&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;string&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$StorageContainerName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$ResourceGroupName&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ToLowerInvariant&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;+&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'-stageartifacts'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;string&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$TemplateFile&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'LogicApp.json'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;string&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$TemplateParametersFile&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'LogicApp.parameters.json'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;string&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$ArtifactStagingDirectory&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'.'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;string&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$DSCSourceFolder&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'DSC'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;switch&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$ValidateOnly&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="o"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
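Based on the parameter list above, a hypothetical invocation of the generated script might look like the sketch below. Every value shown is a placeholder, and the -ValidateOnly switch presumably validates the template without deploying, which is a sensible first run.

```powershell
# Hypothetical dry run; drop -ValidateOnly to deploy for real.
.\Deploy-AzureResourceGroup.ps1 `
    -ResourceGroupLocation 'canadacentral' `
    -ResourceGroupName 'rg-logicapp-demo' `
    -TemplateFile 'LogicApp.json' `
    -TemplateParametersFile 'LogicApp.parameters.json' `
    -ValidateOnly
```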


&lt;p&gt;The full code of the deployment script is provided at the end of the article for those who want to use VS Code or another editor to develop a Logic App.&lt;br&gt;
Let's leverage this script and create CI/CD pipelines in Azure DevOps.&lt;/p&gt;
&lt;h2&gt;
  
  
  Creating a build artifact in Azure DevOps pipeline
&lt;/h2&gt;

&lt;p&gt;The only things we need to create a deployable artifact are the files located in the project's folder: a JSON file containing the workflow, a parameters file, and the script. So the build pipeline can be extremely simple, consisting of just one step, like the one below.&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;task&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PublishBuildArtifacts@1&lt;/span&gt;
  &lt;span class="na"&gt;displayName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Publish&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Artifact:&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;LogicApp'&lt;/span&gt;
  &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;PathtoPublish&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;src/LOGIC_APP_FOLDER'&lt;/span&gt;
    &lt;span class="na"&gt;ArtifactName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;LogicApp'&lt;/span&gt;
    &lt;span class="na"&gt;publishLocation&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Container'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Execute the build pipeline to create an artifact.&lt;/p&gt;
&lt;h2&gt;
  
  
  Release pipeline
&lt;/h2&gt;

&lt;p&gt;We can start with an empty Release pipeline and add the artifact produced by the Build pipeline.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HR5Kcn_Q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/utnegdwqrergaz99tywd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HR5Kcn_Q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/utnegdwqrergaz99tywd.png" alt="Release pipeline"&gt;&lt;/a&gt;&lt;br&gt;
Next, we add a stage and a single Azure PowerShell task. It is very important to set the &lt;em&gt;Task Version&lt;/em&gt; to "2.*" and, correspondingly, the &lt;em&gt;Azure PowerShell Version&lt;/em&gt; to "Specify Other Version" and the &lt;em&gt;Preferred Azure PowerShell Version&lt;/em&gt; to "2.1.0", as that is the version the deployment script uses.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--QKo-XGtJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/igr6h6xe3fi5jjvgj47p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--QKo-XGtJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/igr6h6xe3fi5jjvgj47p.png" alt="Azure Powershell task"&gt;&lt;/a&gt;&lt;br&gt;
Another important parameter is &lt;em&gt;Script Path&lt;/em&gt;: you can navigate to your artifact and select the deployment script.&lt;br&gt;
&lt;em&gt;Script Arguments&lt;/em&gt; is a field where we can pass parameters to the script itself. Those arguments correspond to the parameters declared at the beginning of the script. &lt;br&gt;
A good thing about the Azure PowerShell task is that it automatically authenticates to Azure via the service connection, so you do not have to worry about that.&lt;/p&gt;
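As a hypothetical illustration, the Script Arguments value could look like the line below; only -DSCSourceFolder and -ValidateOnly come from the script excerpt above, while the resource group parameter name is an assumption.

```shell
# Hypothetical "Script Arguments" value for the Azure PowerShell task.
# -DSCSourceFolder and -ValidateOnly appear in the script excerpt above;
# -ResourceGroupName is an assumed additional parameter.
SCRIPT_ARGS='-ResourceGroupName my-logicapp-rg -DSCSourceFolder DSC -ValidateOnly'
echo "$SCRIPT_ARGS"
```

The task passes this string to the script invocation, so it has to match the script's param() block.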
&lt;h2&gt;
  
  
  Powershell Script
&lt;/h2&gt;

&lt;p&gt;Here is a gist with the complete script code:&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;



&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;An alternative method of deploying an Azure Logic App is an Azure PowerShell script: it is relatively straightforward, does not require elevated permissions, and is applicable in enterprise and restricted environments.&lt;/p&gt;

&lt;p&gt;Well, that is all for this simple recipe. Good luck, and when developing your Logic App, be sure not to use &lt;strong&gt;cat logic&lt;/strong&gt;!&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--M7GhkpMo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/43iubneyzcru5zx3skgz.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--M7GhkpMo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/43iubneyzcru5zx3skgz.jpg" alt="Cat Logic"&gt;&lt;/a&gt;&lt;br&gt;
&lt;sup&gt;Photo credit: &lt;a href="https://www.reddit.com/r/AnimalsBeingDerps/comments/a6yufu/these_cats/"&gt;Ryozuo&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>logicapps</category>
      <category>powershell</category>
    </item>
    <item>
      <title>A missing step: Backup Azure DevOps Repositories</title>
      <dc:creator>Igor Bertnyk</dc:creator>
      <pubDate>Fri, 13 Mar 2020 14:36:58 +0000</pubDate>
      <link>https://dev.to/ib1/a-missing-step-backup-azure-devops-repositories-16p7</link>
      <guid>https://dev.to/ib1/a-missing-step-backup-azure-devops-repositories-16p7</guid>
      <description>&lt;h3&gt;
  
  
  Table of content
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;To backup or not to backup?&lt;/li&gt;
&lt;li&gt;Method 1: Using Git&lt;/li&gt;
&lt;li&gt;Method 2: Using Azure DevOps API&lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;a&gt;&lt;/a&gt;To backup or not to backup?
&lt;/h3&gt;

&lt;p&gt;Don't get me wrong, I like Azure DevOps. There are some frustrations here and there, for example in managing permissions and caching build resources. And each of the Azure DevOps modules (Dashboards/Wiki, Boards, Repos, Pipelines, Test Plans, Artifacts) might not be THE BEST on the market. But integration and ease of use make it greater than the sum of its parts, especially for small and medium-size projects.&lt;br&gt;
Still, there is one thing that puzzles me. Backing up your Git repositories seems to me like common sense and good practice. It can also be a policy in some companies. However, there is currently no way to do it, either manually or on a schedule. Of course, Microsoft is &lt;a href="https://docs.microsoft.com/en-us/azure/devops/organizations/security/data-protection?view=azure-devops" rel="noopener noreferrer"&gt;committed to keeping the data safe&lt;/a&gt;, including periodic backups and geo-replication, but we do not have any control over it. And it does not protect against unintentional or malicious actions leading to data loss. &lt;br&gt;
Microsoft's response to such requests, and I &lt;a href="https://developercommunity.visualstudio.com/content/problem/609097/backup-azure-devops-data.html" rel="noopener noreferrer"&gt;quote&lt;/a&gt;: &lt;em&gt;"In current Azure DevOps, there is no out of the box solution to this, you could backup your projects by downloading them as zip to save it on your local and then upload it to restore them. And you also could backup your work items by open them with Excel to save in your local machine."&lt;/em&gt;&lt;br&gt;
I mean, what? LOL. Excel as a backup tool is possibly a new high in data safety. Anyway, are there ways to twist control back into our hands? &lt;br&gt;
Of course there are, and today we explore two of them.&lt;/p&gt;
&lt;h3&gt;
  
  
  &lt;a&gt;&lt;/a&gt;Backup a repository using a plain old git bash script
&lt;/h3&gt;

&lt;p&gt;One method is to use a bash script to get a complete copy of the repository. Let's not run it from our laptop, but rather spin up a small VM in the cloud.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Plan of attack:&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a cheap Linux virtual machine in Azure&lt;/li&gt;
&lt;li&gt;Generate new SSH Key Pair&lt;/li&gt;
&lt;li&gt;Add SSH Public key to Azure DevOps&lt;/li&gt;
&lt;li&gt;Create bash script to mirror Git Repo&lt;/li&gt;
&lt;li&gt;Execute that script on schedule&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Without diving into too much detail, it is quite easy to create a Linux VM in Azure. It already comes with everything we need: Git and shell scripting. Then we can SSH into it and create a bash script, which I named "devopsbackup.sh". &lt;br&gt;
The script is rather primitive, but it gets the job done. Essentially, it deletes the previous backup and creates a mirror copy of the Git repo. Don't forget to replace the variables in angle brackets with your own values.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;
error_exit&lt;span class="o"&gt;()&lt;/span&gt;
&lt;span class="o"&gt;{&lt;/span&gt;
        &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;PROGNAME&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;: &lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;1&lt;/span&gt;&lt;span class="k"&gt;:-&lt;/span&gt;&lt;span class="s2"&gt;"Unknown Error"&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; 1&amp;gt;&amp;amp;2
        &lt;span class="nb"&gt;exit &lt;/span&gt;1
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;span class="c"&gt;#&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Executing Azure DevOps Repos backup"&lt;/span&gt;
&lt;span class="nb"&gt;cd&lt;/span&gt; /home/devopsadmin
&lt;span class="nb"&gt;rm&lt;/span&gt; &lt;span class="nt"&gt;-rf&lt;/span&gt; repos/
&lt;span class="nb"&gt;mkdir&lt;/span&gt; &lt;span class="nt"&gt;-p&lt;/span&gt; repos
&lt;span class="nb"&gt;cd &lt;/span&gt;repos/
git clone &lt;span class="nt"&gt;--mirror&lt;/span&gt; git@ssh.dev.azure.com:v3/&amp;lt;organization&amp;gt;/&amp;lt;project&amp;gt;/&amp;lt;repo&amp;gt; &lt;span class="o"&gt;||&lt;/span&gt; error_exit &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$LINENO&lt;/span&gt;&lt;span class="s2"&gt;: "&lt;/span&gt;

&lt;span class="nb"&gt;cd&lt;/span&gt; ..
&lt;span class="nb"&gt;exit &lt;/span&gt;0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Allow script execution:&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;chmod 0755 devopsbackup.sh&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;We also need to generate an SSH key pair using the command&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;ssh-keygen -C "devopsbackup"&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;By default, the keys will be generated in the "~/.ssh" folder. We need to copy the public key "id_rsa.pub" from there and paste it into Azure DevOps. Go to the profile settings at the top right and add a new key from there:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F10fatjz3ihjh60ga1yx5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F10fatjz3ihjh60ga1yx5.png" alt="Azure DevOps SSH Key"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can easily schedule the execution of our script. Go ahead, type "crontab -e" on the command line and add something like this to the cron config:&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;20 1 * * * /home/devopsadmin/bin/devopsbackup.sh &amp;gt;/dev/null 2&amp;gt;&amp;amp;1&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;The next step could be to extend this script with the Azure CLI and upload the backup into Azure Blob Storage or Data Lake.&lt;br&gt;
Alternatively, Azure has a great feature that allows you to create a daily/weekly backup of your VM. So you can just store a snapshot of the whole VM and not bother with Blob Storage, if you like.&lt;/p&gt;
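A minimal sketch of that extension, assuming the Azure CLI is installed and logged in; the storage account and container names are placeholders, and the upload command is left commented out:

```shell
#!/bin/bash
# Sketch: compress the mirrored repos and (optionally) push the archive
# to Azure Blob Storage. Account/container names are placeholder assumptions.
set -e
STAMP=$(date +%Y%m%d)
ARCHIVE="devops-backup-$STAMP.tar.gz"
mkdir -p repos                 # folder populated by the mirror clone above
tar -czf "$ARCHIVE" repos      # timestamped archive of the whole backup
echo "created $ARCHIVE"
# Requires the Azure CLI and a prior 'az login':
# az storage blob upload --account-name mystorageacct --container-name backups \
#     --name "$ARCHIVE" --file "$ARCHIVE"
```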
&lt;h3&gt;
  
  
  &lt;a&gt;&lt;/a&gt;Backup the default branch using the Azure DevOps API
&lt;/h3&gt;

&lt;p&gt;That's all well and good, but is there a more modern way that does not require a dedicated VM and shell scripts/cron? The &lt;a href="https://docs.microsoft.com/en-us/rest/api/azure/devops/?view=azure-devops-rest-5.1" rel="noopener noreferrer"&gt;Azure DevOps REST API&lt;/a&gt; looks promising and allows you to manipulate Azure DevOps data, including work items and repositories. Unfortunately, this API does not have parity with Git, and the full code history cannot be preserved using this method.&lt;br&gt;
However, if all you require is a periodic snapshot of the master branch, then it can be used to create a simple backup solution. One advantage over the previous solution is that we can automatically retrieve information about all our projects and repos and do not need to hardcode them. So if you add a new project, no modification is required.&lt;/p&gt;
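The API authenticates a PAT as HTTP Basic credentials with an empty user name, so the request header can be built as below; the organization name and token are placeholders, and the curl call is left commented out.

```shell
# Build the Basic-auth header for the Azure DevOps REST API from a PAT.
# ORG and PAT are placeholder values.
ORG="myorganization"
PAT="xxxxxxxxxxxxxxxx"
# PAT auth is HTTP Basic with an empty user name: base64 of ":PAT".
AUTH="Authorization: Basic $(printf ':%s' "$PAT" | base64)"
echo "$AUTH"
# List the projects in the organization (not executed here):
# curl -s -H "$AUTH" "https://dev.azure.com/$ORG/_apis/projects?api-version=5.1"
```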

&lt;p&gt;&lt;em&gt;Approach:&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use the REST API to retrieve the hierarchy of projects, repositories, items, and blobs&lt;/li&gt;
&lt;li&gt;Use an Azure DevOps personal access token (PAT) for API authentication&lt;/li&gt;
&lt;li&gt;Use an Azure Function with a timer trigger to run this on a schedule&lt;/li&gt;
&lt;li&gt;Use Azure Blob Storage to keep the archive.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Without further ado, here is a gist for the Azure Function. It requires the following parameters, which you can set up in Application Settings:&lt;br&gt;
"storageAccountKey", "storageName", "token", "organization"&lt;/p&gt;
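The hierarchy the function walks can be sketched by the endpoint URLs involved (api-version 5.1, matching the API link above); the organization and project names are placeholders.

```shell
# Endpoint hierarchy used for the snapshot: projects, then each project's
# repositories, then each repository's items/blobs. Names are placeholders.
ORG="myorganization"
BASE="https://dev.azure.com/$ORG"
PROJECTS_URL="$BASE/_apis/projects?api-version=5.1"
echo "$PROJECTS_URL"
# For each project returned, list its repositories:
PROJECT="MyProject"
REPOS_URL="$BASE/$PROJECT/_apis/git/repositories?api-version=5.1"
echo "$REPOS_URL"
# Per repository, items and blobs are fetched from:
#   $BASE/$PROJECT/_apis/git/repositories/REPO_ID/items?api-version=5.1
```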


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;a&gt;&lt;/a&gt;Conclusion
&lt;/h3&gt;

&lt;p&gt;Comparing these two approaches, we can see that newer is not always better. With the help of a simple shell script, we can produce a full copy of the repository that can easily be restored or imported into a new project. On the other hand, if all you want is a periodic repo snapshot, the Azure DevOps REST API and a scheduled Azure Function make it effortless.&lt;/p&gt;

&lt;p&gt;That is all for today. Remember that you always have to protect your work, like the cat protecting its spoils from a dog in the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fm1q6me0sreisgzzf4qvt.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fm1q6me0sreisgzzf4qvt.jpg" alt="A Cat Protecting Spoils from a Dog"&gt;&lt;/a&gt;&lt;br&gt;
&lt;sup&gt;Dirk Valckenburg, A Cat Protecting Spoils from a Dog, 1717&lt;/sup&gt;&lt;br&gt;
&lt;sup&gt;Cover image by &lt;a href="https://pixabay.com/users/422737-422737/?utm_source=link-attribution&amp;amp;utm_medium=referral&amp;amp;utm_campaign=image&amp;amp;utm_content=445155" rel="noopener noreferrer"&gt;Hebi B.&lt;/a&gt; from &lt;a href="https://pixabay.com/?utm_source=link-attribution&amp;amp;utm_medium=referral&amp;amp;utm_campaign=image&amp;amp;utm_content=445155" rel="noopener noreferrer"&gt;Pixabay&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>backup</category>
      <category>git</category>
    </item>
    <item>
      <title>Azure DevOps integration with Google Hangouts Chat</title>
      <dc:creator>Igor Bertnyk</dc:creator>
      <pubDate>Tue, 03 Mar 2020 19:36:29 +0000</pubDate>
      <link>https://dev.to/ib1/azure-devops-integration-with-google-hangouts-chat-3imn</link>
      <guid>https://dev.to/ib1/azure-devops-integration-with-google-hangouts-chat-3imn</guid>
      <description>&lt;p&gt;In my previous post: &lt;/p&gt;
&lt;div class="ltag__link"&gt;
  &lt;a href="/ib1" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__pic"&gt;
      &lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F235243%2F26b90c03-945f-4924-9d7e-29ed2d32a2d5.png" alt="ib1"&gt;
    &lt;/div&gt;
  &lt;/a&gt;
  &lt;a href="/ib1/user-experience-how-to-design-a-train-station-not-4bb9" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__content"&gt;
      &lt;h2&gt;User Experience: How to design a train station... NOT!&lt;/h2&gt;
      &lt;h3&gt;Igor Bertnyk ・ Jan 28 '20&lt;/h3&gt;
      &lt;div class="ltag__link__taglist"&gt;
        &lt;span class="ltag__link__tag"&gt;#ux&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#design&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#productivity&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#architecture&lt;/span&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/a&gt;
&lt;/div&gt;
 we veered away into the topic of UX design. Let's return to Azure DevOps and discuss various ways of integrating it with third-party services and, in particular, with Google Hangouts Chat.

&lt;h2&gt;
  
  
  Azure DevOps Service Hooks
&lt;/h2&gt;

&lt;p&gt;Let's imagine a situation where you want to receive notifications about new User Stories, Pull Requests, or Build completions in your favorite team communication tool, such as Slack or Microsoft Teams. Or you want to trigger some workflow based on an Azure DevOps event. Well, you can! Service hooks let you run tasks on other services when events happen in your Azure DevOps Services projects. The service hooks concept is based on a pub/sub model, where a producer publishes events to a topic and a consumer subscribes to and handles those events.&lt;br&gt;
If you open the Azure DevOps/Project Settings/Service Hooks menu, you will see that there are a lot of predefined integration points, including Jenkins, Trello, and others. There is also a hook for the excellent workflow automation tool &lt;a href="https://zapier.com" rel="noopener noreferrer"&gt;Zapier&lt;/a&gt;. It allows you to easily integrate various services and APIs, and I highly recommend it, although it is a subject for another post. Surprisingly, there is no predefined integration with Microsoft's own &lt;a href="https://flow.microsoft.com" rel="noopener noreferrer"&gt;Microsoft Power Automate&lt;/a&gt; (formerly Microsoft Flow). &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F9rhvfmzgnlq9l7gos0xi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F9rhvfmzgnlq9l7gos0xi.png" alt="Azure DevOps Service Hooks"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Azure DevOps Web Hooks
&lt;/h2&gt;

&lt;p&gt;So what do you do if your favorite service is not on the list? What if we want to send a message to Google Chat? Fear not, Web Hooks to the rescue!&lt;br&gt;
Web Hooks provide a way to send a JSON representation of an event to any public endpoint (HTTP or HTTPS), and that opens up a multitude of possibilities. &lt;br&gt;
Let's create one. When we select the web hook option, we can also select an event trigger. There are all kinds of triggers related to Pull Requests, Builds, Work Items, etc. For the purpose of this post, let's choose the "Work Item commented on" option, which triggers an event for every new comment posted on a User Story or a Task.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Foo70ss6iuhp0eh745p6n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Foo70ss6iuhp0eh745p6n.png" alt="azure devops web hook"&gt;&lt;/a&gt;&lt;br&gt;
Next, we need to provide the URL of our public endpoint. We also have some options for authentication, and we can customize, to some extent, the level of detail passed in the event. I will show you later in this post how to set up the endpoint; let's pretend that we already have it at this point and enter it into the form.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1ojewkurwofu521bi7n4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1ojewkurwofu521bi7n4.png" alt="azure devops web hook"&gt;&lt;/a&gt;&lt;br&gt;
There is a convenient "Test" button that sends a mock event to the configured URL and lets you see the JSON message format that we will later transform into a format Google Chat can understand.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F2s6toszvvttilfmrc57v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F2s6toszvvttilfmrc57v.png" alt="azure devops web hook"&gt;&lt;/a&gt;&lt;br&gt;
Great! We finished Azure DevOps configuration. Now to the Google Hangouts Chat.&lt;/p&gt;
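For orientation, here is an abbreviated sketch of what such a payload looks like; only the message.text field is used later, and the authoritative shape is whatever the "Test" button shows for your project.

```shell
# Abbreviated, illustrative shape of a "Work Item commented on" event payload.
# Field values are made up; rely on the "Test" button output for the real shape.
PAYLOAD='{
  "eventType": "workitem.commented",
  "message": { "text": "Someone commented on a work item" },
  "resource": { "id": 5 }
}'
printf '%s\n' "$PAYLOAD"
```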

&lt;h2&gt;
  
  
  Google Chat Webhooks
&lt;/h2&gt;

&lt;p&gt;Incoming webhooks let you send asynchronous messages into Hangouts Chat from applications, and they are fairly simple to configure. Let's create a new "DevOps" room and, in the members' menu, select "Configure Webhooks".&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F4sbt3uevcjz2osxm8uva.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F4sbt3uevcjz2osxm8uva.png" alt="google chat webhook"&gt;&lt;/a&gt;&lt;br&gt;
Add a new webhook. We can also configure a custom bot image.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fspoptlomyvfm5dbi6b9g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fspoptlomyvfm5dbi6b9g.png" alt="google chat webhook"&gt;&lt;/a&gt;&lt;br&gt;
Once we've finished adding it, a dedicated URL is assigned to the webhook. Please make a note of it, as we'll need it in a moment.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fwmvcga2mgqod9n8jlh97.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fwmvcga2mgqod9n8jlh97.png" alt="google chat webhook"&gt;&lt;/a&gt;&lt;br&gt;
Let's move to the final step and tie all of this together.&lt;/p&gt;
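Before wiring things up, the webhook can be exercised directly: Hangouts Chat incoming webhooks accept a JSON body with a "text" field. A minimal sketch, where the URL is a truncated placeholder and the curl call is left commented out:

```shell
# Minimal Hangouts Chat webhook message: a JSON object with a "text" field.
# WEBHOOK_URL is a truncated placeholder; use the URL copied from the room.
WEBHOOK_URL="https://chat.googleapis.com/v1/spaces/SPACE_ID/messages?key=KEY"
DATA='{"text": "Hello from Azure DevOps!"}'
echo "$DATA"
# Not executed here; posting requires the real webhook URL:
# curl -s -X POST -H 'Content-Type: application/json' -d "$DATA" "$WEBHOOK_URL"
```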

&lt;h2&gt;
  
  
  Azure Function
&lt;/h2&gt;

&lt;p&gt;So now we have configured both the Azure DevOps service hook and the Google Chat webhook, but they still do not talk to each other. DevOps sends a JSON message in a format that Chat is not able to understand. We need a way to transform the message and tie the two services together. There are a lot of options for that. We could use Zapier, mentioned above. Azure Logic Apps has an integration with Google Chat out of the box. But why use a ready-made solution when we can develop our own, right?&lt;br&gt;
One option is an Azure Function. It is simple, cheap, can be secured at the function or host level, and supports JavaScript, among other languages.&lt;br&gt;
I will not describe here how to create a function, as there are many tutorials that go into excruciating detail, &lt;a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-function-app-portal" rel="noopener noreferrer"&gt;for example here&lt;/a&gt;. On the screenshot below, I've created a "WorkItemUpdate" function with an HTTP trigger.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fe7ird5oj56vybbo315on.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fe7ird5oj56vybbo315on.png" alt="azure function"&gt;&lt;/a&gt;&lt;br&gt;
The code for the function is below:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;https&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;https&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="nx"&gt;module&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;exports&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;function &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;JavaScript HTTP trigger function processed a request.&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;text&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;

        &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;res&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;status&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;OK&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
        &lt;span class="p"&gt;};&lt;/span&gt;

        &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
            &lt;span class="na"&gt;text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;text&lt;/span&gt;
        &lt;span class="p"&gt;})&lt;/span&gt;

        &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;options&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;hostname&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;chat.googleapis.com&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;port&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;443&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;path&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/v1/spaces/&amp;lt;space&amp;gt;/messages?key=&amp;lt;key&amp;gt;&amp;amp;token=&amp;lt;token&amp;gt;&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;method&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;POST&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Content-Type&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;application/json&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Content-Length&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;

        &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;botReq&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;https&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;request&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;options&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`statusCode: &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;statusCode&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

            &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;data&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;d&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;d&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="p"&gt;})&lt;/span&gt;



            &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;end&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;

                &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;done&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="p"&gt;});&lt;/span&gt;

            &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;done&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="p"&gt;})&lt;/span&gt;

        &lt;span class="nx"&gt;botReq&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;error&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;res&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="na"&gt;status&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;500&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;error&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
            &lt;span class="p"&gt;};&lt;/span&gt;
            &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;done&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="p"&gt;})&lt;/span&gt;

        &lt;span class="nx"&gt;botReq&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;write&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nx"&gt;botReq&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;end&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="c1"&gt;//context.done()&lt;/span&gt;

    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;res&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;status&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;400&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Please pass a JSON body.message.text&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;};&lt;/span&gt;
        &lt;span class="nx"&gt;context&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;done&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Note that the "path" parameter in the code is the URL we obtained from the Google Chat webhook. The purpose of the code is to transform the JSON received from the DevOps event into the { 'text': 'custom message' } format accepted by Chat, and to submit it to the Chat webhook.&lt;/p&gt;
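&lt;p&gt;The transformation itself is small, so here is a minimal sketch of just the mapping step. The helper name &lt;code&gt;toChatPayload&lt;/code&gt; is hypothetical; the &lt;code&gt;body.message.text&lt;/code&gt; path mirrors the 400-error check in the function above, and the &lt;code&gt;{ text: ... }&lt;/code&gt; shape is what the Chat webhook accepts:&lt;/p&gt;

```javascript
// Build the { text: ... } payload that the Google Chat webhook expects
// from the JSON body posted by the Azure DevOps service hook.
// The body.message.text path matches the validation in the function above;
// the exact DevOps payload shape can vary by event type.
function toChatPayload(body) {
    const text = body && body.message && body.message.text;
    if (!text) {
        // Same condition that returns the 400 response in the function.
        throw new Error('Please pass a JSON body.message.text');
    }
    return JSON.stringify({ text: text });
}
```

&lt;p&gt;The resulting string is what gets passed to &lt;code&gt;botReq.write(data)&lt;/code&gt; in the function body.&lt;/p&gt;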

&lt;p&gt;In the Azure portal we again have a convenient "Test" section where we can test our function in isolation. We can also see the "Get Function URL" link.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fkb25ymxozx6hvxb83042.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fkb25ymxozx6hvxb83042.png" alt="function url"&gt;&lt;/a&gt;&lt;br&gt;
Copy it, we need it now! Remember the URL configuration for the Azure DevOps service hook that we glossed over above? This is the URL that we need to enter there.&lt;/p&gt;

&lt;p&gt;Finally, all the configuration is done. Let's enter a new comment in a User Story, and within a second it will appear as a new message in our Google Chat room:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F2z1tyq710d2e5jeuq25d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F2z1tyq710d2e5jeuq25d.png" alt="google chat message"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;We have seen that Azure DevOps can be integrated with a number of services, and we were able to create a custom integration with Google Hangouts Chat using an Azure Function.&lt;br&gt;
I hope it was useful. Here is a cat (and a dog) for you as proof that different species, and services, can live peacefully with each other.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fnla1bk4dtc87pyj65cei.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fnla1bk4dtc87pyj65cei.png" alt="cat and dog together"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>chat</category>
      <category>integration</category>
    </item>
  </channel>
</rss>
