<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Janne Pasanen</title>
    <description>The latest articles on DEV Community by Janne Pasanen (@jannepasanen).</description>
    <link>https://dev.to/jannepasanen</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F976694%2F8fb36db8-6d4d-461a-9c99-04c709f60784.jpg</url>
      <title>DEV Community: Janne Pasanen</title>
      <link>https://dev.to/jannepasanen</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/jannepasanen"/>
    <language>en</language>
    <item>
      <title>Automated testing of Synapse Pipelines using xUnit and Azure DevOps</title>
      <dc:creator>Janne Pasanen</dc:creator>
      <pubDate>Thu, 04 May 2023 04:38:56 +0000</pubDate>
      <link>https://dev.to/jannepasanen/automated-testing-of-synapse-pipelines-using-xunit-and-azure-devops-5b5e</link>
      <guid>https://dev.to/jannepasanen/automated-testing-of-synapse-pipelines-using-xunit-and-azure-devops-5b5e</guid>
      <description>&lt;h1&gt;
  
  
  Introduction
&lt;/h1&gt;

&lt;p&gt;Synapse Analytics Pipelines is a powerful integration and ETL/ELT tool in Azure. It offers code-free orchestration with over 90 data source connectors, plus data flow capabilities for more advanced scenarios.&lt;/p&gt;

&lt;p&gt;As with any other integration or data pipeline solution involving code, applications and deployments, Synapse Pipelines solutions should be included in your integration test plan, and those tests should be automated.&lt;/p&gt;

&lt;p&gt;This post demonstrates the steps required to get your fully automated Synapse Pipelines tests running from Azure DevOps.&lt;br&gt;&lt;br&gt;&lt;/p&gt;



&lt;h1&gt;
  
  
  Use case
&lt;/h1&gt;

&lt;p&gt;I've prepared a pipeline for testing purposes which imports movie data from TheMovieDatabase.org and converts it from JSON to Parquet format using Synapse Pipelines. The JSON data import part is implemented using an Azure Function. Below is a high-level diagram of the data ingestion and transformation process.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo13lhyav63u5awkllsfl.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo13lhyav63u5awkllsfl.PNG" alt="High-level solution architecture"&gt;&lt;/a&gt;High-level solution architecture.&lt;/p&gt;

&lt;p&gt;The data transformation part is done using a data flow executed from the data pipeline.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frerriljqhzdw35ao0m6z.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frerriljqhzdw35ao0m6z.PNG" alt="Data flow in Synapse Pipelines"&gt;&lt;/a&gt;Data flow in Synapse Pipelines.&lt;/p&gt;

&lt;p&gt;The data flow performs a couple of extra tasks before saving the data in Parquet format. It converts the release date to a specific date format and flattens the movie genre information from the JSON result into a separate data structure.&lt;/p&gt;

&lt;p&gt;The following picture illustrates the end result as external tables in Synapse Serverless SQL Pool.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fastrhhwbdky0fwjiainz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fastrhhwbdky0fwjiainz.png" alt="Synapse Serverless SQL Pool tables"&gt;&lt;/a&gt;Synapse Serverless SQL Pool tables.&lt;/p&gt;

&lt;p&gt;The test scenario includes executing the data pipeline automatically from a release pipeline in Azure DevOps and verifying that the pipeline has executed successfully. This is only the bare minimum of what should be tested in real life, but it should be enough to explain the core concept and give you ideas on how the test project could be developed further.&lt;br&gt;&lt;br&gt;&lt;/p&gt;



&lt;h1&gt;
  
  
  Test project setup
&lt;/h1&gt;

&lt;p&gt;The test solution is built using C# and xUnit, and it consists of two test projects: one that tests the backend solution and is included here solely as a placeholder, and another for the Synapse Pipelines tests.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzolh1kas3phe21memtug.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzolh1kas3phe21memtug.png" alt="Test project structure"&gt;&lt;/a&gt;Test project structure.&lt;/p&gt;

&lt;p&gt;The Synapse.Tests project uses the &lt;a href="https://learn.microsoft.com/en-us/dotnet/api/overview/azure/synapse-analytics?view=azure-dotnet" rel="noopener noreferrer"&gt;Synapse Analytics SDK&lt;/a&gt; for connecting to the Synapse Analytics instance. You'll need to install the &lt;em&gt;Azure.Analytics.Synapse.Artifacts&lt;/em&gt; NuGet package to be able to communicate with the workspace.&lt;/p&gt;
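
&lt;p&gt;For reference, the package can be added with the .NET CLI, for example &lt;code&gt;dotnet add package Azure.Analytics.Synapse.Artifacts&lt;/code&gt;.&lt;/p&gt;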

&lt;p&gt;Here is the complete implementation for executing the desired pipeline and waiting for it to complete. Notice that two different client classes are required: &lt;em&gt;PipelineClient&lt;/em&gt; for executing the pipeline and &lt;em&gt;PipelineRunClient&lt;/em&gt; for monitoring its progress. The &lt;em&gt;SynapseClient&lt;/em&gt; implementation requires the Synapse Workspace name and the Azure tenant ID as constructor parameters.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using Azure.Analytics.Synapse.Artifacts;
using Azure.Analytics.Synapse.Artifacts.Models;
using Azure.Identity;

namespace Synapse.Tests;

public class SynapseClient
{
    private const int SleepDurationInMs = 15000;
    private readonly string _workspaceName;
    private readonly PipelineClient _pipelineClient;
    private readonly PipelineRunClient _pipelineRunClient;
    private readonly string _tenantId;

    public SynapseClient(string workspaceName, string tenantId)
    {
        _workspaceName = workspaceName;
        _tenantId = tenantId;

        var credentials = new DefaultAzureCredential(new DefaultAzureCredentialOptions
        {
            VisualStudioTenantId = _tenantId
        });
        _pipelineClient = new PipelineClient(new Uri($"https://{_workspaceName}.dev.azuresynapse.net"), credentials);
        _pipelineRunClient = new PipelineRunClient(new Uri($"https://{_workspaceName}.dev.azuresynapse.net"), credentials);
    }

    public async Task&amp;lt;bool&amp;gt; ExecutePipelineAsync(string pipelineName)
    {
        if (string.IsNullOrWhiteSpace(pipelineName))
        {
            throw new ArgumentException("Pipeline name cannot be null or empty", nameof(pipelineName));   
        }

        var executionResult = await _pipelineClient.CreatePipelineRunAsync(pipelineName);

        PipelineRun pipelineRun;
        while (true)
        {
            pipelineRun = await _pipelineRunClient.GetPipelineRunAsync(executionResult.Value.RunId);

            Console.WriteLine("Status: " + pipelineRun.Status);
            // Wait before polling again; Task.Delay keeps the async method from blocking a thread
            if (pipelineRun.Status == "InProgress" || pipelineRun.Status == "Queued")
                await Task.Delay(SleepDurationInMs);
            else
                return pipelineRun.Status == "Succeeded";
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We can now write the tests which utilize the &lt;em&gt;SynapseClient&lt;/em&gt; class.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public class MoviesPipelineTests
{
    private const string TenantId = "[]";
    private const string SynapseWorkspaceName = "[]";
    private readonly SynapseClient _synapseClient;

    public MoviesPipelineTests()
    {
        _synapseClient = new SynapseClient(SynapseWorkspaceName, TenantId);
    }

    [Fact]
    public async Task ExecutePipeline_Valid_ReturnTrue()
    {
        Assert.True(await _synapseClient.ExecutePipelineAsync("Movies JSON to Parquet"));
    }

    [Fact]
    public async Task ExecutePipeline_Invalid_ReturnFalse()
    {
        await Assert.ThrowsAsync&amp;lt;ArgumentException&amp;gt;(async () =&amp;gt; await _synapseClient.ExecutePipelineAsync(""));
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As I mentioned earlier, this is just a minimal set of tests which only verifies that the data pipeline in Synapse Analytics has executed successfully. In a production environment, additional tests should be implemented to verify the outcome of the pipeline execution for data freshness and quality, such as checking the number of affected rows and looking for null values in unexpected places.&lt;br&gt;&lt;br&gt;&lt;/p&gt;
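
&lt;p&gt;As an illustration, below is a minimal sketch of one such data quality test run against the Serverless SQL Pool shown earlier. It is not part of the test project above: it assumes the &lt;em&gt;Microsoft.Data.SqlClient&lt;/em&gt; NuGet package and a hypothetical external table &lt;em&gt;dbo.Movies&lt;/em&gt;, so adjust the connection string and query to match your own workspace.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using Microsoft.Data.SqlClient;
using Xunit;

namespace Synapse.Tests;

public class MoviesDataQualityTests
{
    // Hypothetical Serverless SQL Pool endpoint and database - replace with your own values
    private const string ConnectionString =
        "Server=[];Database=[];Authentication=Active Directory Default;";

    [Fact]
    public async Task MoviesTable_HasRows_AndNoMissingTitles()
    {
        await using var connection = new SqlConnection(ConnectionString);
        await connection.OpenAsync();

        // A single query returns both the row count and the number of missing titles
        await using var command = new SqlCommand(
            "SELECT COUNT(*), SUM(CASE WHEN title IS NULL THEN 1 ELSE 0 END) FROM dbo.Movies",
            connection);

        await using var reader = await command.ExecuteReaderAsync();
        Assert.True(await reader.ReadAsync());

        Assert.True(reader.GetInt32(0) &amp;gt; 0);  // freshness: the pipeline produced rows
        Assert.Equal(0, reader.GetInt32(1));      // quality: no unexpected null titles
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;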

&lt;h1&gt;
  
  
  DevOps pipeline configuration
&lt;/h1&gt;

&lt;p&gt;Now that the test project is properly configured, we can move on to the automated test execution part. &lt;/p&gt;

&lt;p&gt;The tests are executed using a single YAML pipeline which builds the test projects and runs the correct tests depending on which release pipeline triggered the test pipeline. Tests are triggered whenever there's a release to the Staging environment stage, so if the tests fail we can still react and fix potential issues before deploying to production.&lt;/p&gt;

&lt;p&gt;Tests are executed using the AzureCLI task instead of DotNetCoreCLI to enable authentication from the test project using &lt;em&gt;DefaultAzureCredential&lt;/em&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Disable triggering from code updates to repo
trigger: none

# Set up pipeline to trigger on completion of "release_Staging" stage
resources:
  pipelines:
    - pipeline: release_api
      source: Release-Api
      trigger:
        branches:
          - release/*
        stages:
          - release_Staging

    - pipeline: release_synapse
      source: Release-Synapse
      trigger:
        branches:
          - release/*
        stages:
          - release_Staging

variables:
  - template: Variables/variables.yaml
  - name: BuildConfiguration
    value: Release

jobs:
  - job: release_integrationtests
    displayName: "Execute integration tests"
    pool:
      name: Azure Pipelines
      vmImage: windows-2022

    steps:
      - task: UseDotNet@2
        displayName: "Use .NET SDK 7.0.x"
        inputs:
          version: "7.0.x"

      - task: DotNetCoreCLI@2
        displayName: "Restore project dependencies"
        inputs:
          command: "restore"
          projects: "$(Build.SourcesDirectory)/Tests/IntegrationTests/src/**/*Tests*/*.csproj"
          feedsToUse: "select"

      - task: DotNetCoreCLI@2
        displayName: "Build the project"
        inputs:
          command: "build"
          arguments: "--no-restore --configuration $(BuildConfiguration)"
          projects: "$(Build.SourcesDirectory)/Tests/IntegrationTests/src/**/*Tests*/*.csproj"

      - task: AzureCLI@2
        displayName: "dotnet test Api project"
        condition: eq(variables['resources.triggeringalias'], 'release_api')
        inputs:
          azureSubscription: $(azureSubscriptionName)
          scriptType: pscore
          scriptLocation: inlineScript
          inlineScript: |
            dotnet test $(Build.SourcesDirectory)\Tests\IntegrationTests\src\Api.Tests\ --configuration $(BuildConfiguration) --logger:"trx;LogFileName=TestResultsApi.trx"

      - task: AzureCLI@2
        displayName: "dotnet test Synapse pipelines project"
        condition: eq(variables['resources.triggeringalias'], 'release_synapse')
        inputs:
          azureSubscription: $(azureSubscriptionName)
          scriptType: pscore
          scriptLocation: inlineScript
          inlineScript: |
            dotnet test $(Build.SourcesDirectory)\Tests\IntegrationTests\src\Synapse.Tests\ --configuration $(BuildConfiguration) --logger:"trx;LogFileName=TestResultsSynapse.trx"

      - task: AzureCLI@2
        displayName: "dotnet test all projects"
        condition: eq(variables['resources.triggeringalias'], '')
        inputs:
          azureSubscription: $(azureSubscriptionName)
          scriptType: pscore
          scriptLocation: inlineScript
          inlineScript: |
            dotnet test $(Build.SourcesDirectory)\Tests\IntegrationTests\src\ --configuration $(BuildConfiguration) --logger:"trx;LogFileName=TestResultsAll.trx"

      - task: PublishTestResults@2
        displayName: "Publish Test results"
        inputs:
          testResultsFormat: "VSTest"
          testResultsFiles: "$(Build.SourcesDirectory)/Tests/IntegrationTests/src/*.Tests/**/TestResults*.trx"
          mergeTestResults: true
          failTaskOnFailedTests: true
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Conditions are used to execute only the tests that correspond to the triggering pipeline. The test pipeline can also be executed manually when all tests from both test projects need to be run.&lt;/p&gt;

&lt;p&gt;The release_api pipeline is included in the test pipeline resources to demonstrate conditional test execution from a single pipeline.&lt;/p&gt;

&lt;p&gt;Running the test pipeline executes the tests and publishes the results on the "Tests" tab of the current release.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3j0py62d458d6mcd6dh2.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3j0py62d458d6mcd6dh2.PNG" alt="Tests summary"&gt;&lt;/a&gt;Test execution summary.&lt;/p&gt;



&lt;h1&gt;
  
  
  Summary
&lt;/h1&gt;

&lt;p&gt;Testing Synapse Pipelines using the Synapse Analytics SDK is really as simple as executing any other integration tests. The only tricky part is monitoring the pipeline execution, which might require a more advanced test setup if the pipeline execution time is long, such as splitting your data pipeline or preparing a test-specific dataset.&lt;/p&gt;

</description>
      <category>devops</category>
      <category>testing</category>
      <category>synapse</category>
      <category>xunit</category>
    </item>
    <item>
      <title>Why you probably shouldn't use Logic Apps for enterprise integrations</title>
      <dc:creator>Janne Pasanen</dc:creator>
      <pubDate>Thu, 05 Jan 2023 10:11:31 +0000</pubDate>
      <link>https://dev.to/jannepasanen/why-you-probably-shouldnt-use-logic-apps-for-enterprise-integrations-1j6c</link>
      <guid>https://dev.to/jannepasanen/why-you-probably-shouldnt-use-logic-apps-for-enterprise-integrations-1j6c</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Logic Apps are Azure's low-code approach to creating simple, repeatable workflows for executing various types of tasks. There are lots of use cases where such workflows are a good fit for purpose, but an enterprise-scale integration platform probably isn't one of them. I'll use my recent real-life experience from working with Logic Apps, and longer experience with more resilient and versatile integration patterns, as the basis for this post.&lt;/p&gt;

&lt;p&gt;With that being said, you should take this with a grain of salt, as it's only my humble opinion based on the aforementioned experience.&lt;/p&gt;

&lt;p&gt;But why would I say something like the title claims? Nobody's doing it, right? Turns out, companies are doing it, and here I'm offering my opinion on why that may not be the best idea.&lt;/p&gt;

&lt;p&gt;Modern integration platforms should provide resiliency and scalability and offer a rich set of features such as efficient monitoring, queueing and state management. Implementing such features with Logic Apps seems counter-productive, although other Azure services could be used to help to some extent, e.g. Data Factory for orchestration.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F08qynp4rp2xu17vnh3cf.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F08qynp4rp2xu17vnh3cf.PNG" alt="Main" width="800" height="282"&gt;&lt;/a&gt;Consumption Logic App designer welcome screen.&lt;br&gt;
&lt;br&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Issues with using Logic Apps&lt;br&gt;&lt;br&gt;
&lt;/h2&gt;



&lt;h3&gt;
  
  
  Non-informative workflow failures
&lt;/h3&gt;

&lt;p&gt;Ever experienced arbitrary failures with workflow integrations to other Azure resources which might require you to redeploy the whole solution? Or a built-in connector breaking as a result of an update? &lt;/p&gt;

&lt;p&gt;I've faced these types of annoying issues even with relatively short experience with Logic Apps. And I'm not saying these things wouldn't happen if you chose some other service or approach, but they're just the kind of things that'll really ruin your day.&lt;/p&gt;

&lt;p&gt;One special type of non-informative failure is when your Logic App hits memory- and/or networking-related boundaries with message/object sizes. There are some options you can configure for Standard Logic Apps but, in my experience, they don't help much, and there could still be that one undocumented boundary that just makes your Logic App workflow drop the ball on its execution.&lt;br&gt;&lt;br&gt;&lt;/p&gt;



&lt;h3&gt;
  
  
  Infrastructure-as-code support
&lt;/h3&gt;

&lt;p&gt;Both Bicep and ARM Templates can be used to deploy and manage Logic Apps in Azure. However, I've faced some odd behavior and missing functionality when trying to integrate Logic Apps workflows with other Azure resources.&lt;/p&gt;

&lt;p&gt;Together with some missing key features, these issues can make implementing your solution a lot harder and/or less safe.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Unreachable Logic Apps webhook&lt;br&gt;&lt;br&gt;
Logic Apps creates a webhook endpoint whenever you add an HTTP Trigger-type action to your workflow. That endpoint can then be utilized when integrating with other Azure or on-premises resources. I've noticed that the endpoint seems to work quite unpredictably. Creating a webhook connection to the Logic Apps endpoint can fail intermittently due to errors in the SSL handshake. There are workarounds for this, but they require extra work, which might undermine the point of using Logic Apps in the first place if one of the main reasons for going with Logic Apps was to simplify and speed up development.&lt;br&gt;&lt;br&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Generic integration identity support&lt;br&gt;&lt;br&gt;
For connectors that don't support managed identities, it makes sense to use a generic integration identity for authentication. Generally, this account wouldn't have MFA enabled, but it still needs to log in once to create the connection. There is currently no support for this in the Logic Apps deployment framework, and the login process needs to be performed manually for every instance/environment.&lt;br&gt;&lt;br&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Missing Logic App Standard API Management support&lt;br&gt;&lt;br&gt;
Standard Logic Apps cannot be imported into API Management. If your APIs are hosted in API Management, you need to create a proxy solution for your Standard Logic App or use Azure Functions as your API backend. Consumption Logic Apps, on the other hand, can be imported into API Management.&lt;br&gt;&lt;br&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;



&lt;h3&gt;
  
  
  Monitoring
&lt;/h3&gt;

&lt;p&gt;Monitoring Logic Apps is far from straightforward. Consumption Logic Apps can't be connected to an Application Insights instance at all. Logs are generated and can be viewed from the Logic App and sent to a Log Analytics workspace, but creating a centralized logging solution can become tricky. Application Insights connectivity exists for Standard Logic Apps only.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fufj09a5s1k1genpm2124.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fufj09a5s1k1genpm2124.PNG" alt="Diagnostics events" width="800" height="439"&gt;&lt;/a&gt;Consumption Logic App diagnostics can be sent to Log Analytics Workspace.&lt;/p&gt;

&lt;p&gt;Furthermore, reading and interpreting Standard Logic Apps telemetry data from Application Insights requires quite an effort. The telemetry data is buried in a way that makes building informative dashboards on top of it a task for at least intermediate-level KQL enthusiasts.&lt;/p&gt;

&lt;p&gt;If your solution requires more than just the bare minimum logging features, you just might want to use Azure Functions for that.&lt;br&gt;&lt;br&gt;&lt;/p&gt;



&lt;h3&gt;
  
  
  Limited authentication support for incoming requests
&lt;/h3&gt;

&lt;p&gt;Logic Apps expose their endpoint URIs publicly. Although the endpoint is protected with a SAS token, anyone can access that URI unless the Logic App is hosted in an App Service Environment (Standard) in an isolated network. Authentication can be performed with limited OAuth support by configuring the required claims in authentication policies, but you can only use a predefined set of basic claims for authorization.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Febib7ylxgo3ai8v68yts.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Febib7ylxgo3ai8v68yts.PNG" alt="Authentication policies" width="800" height="369"&gt;&lt;/a&gt;Logic App authentication policy editor.&lt;/p&gt;

&lt;p&gt;More information on OAuth support: &lt;a href="https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-securing-a-logic-app?tabs=azure-portal#enable-azure-active-directory-open-authentication-azure-ad-oauth" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-securing-a-logic-app?tabs=azure-portal#enable-azure-active-directory-open-authentication-azure-ad-oauth&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;If you need more authentication options and/or more fine-grained control over the authentication/authorization process, consider hosting the Logic App endpoint in API Management which supports additional authentication mechanisms such as Basic Authentication and Client Certificates.&lt;br&gt;&lt;br&gt;&lt;/p&gt;



&lt;h3&gt;
  
  
  Pricing
&lt;/h3&gt;

&lt;p&gt;As with all other Azure consumption and pricing estimates, the numbers depend on the required capabilities and the exact use case, and more specifically on data usage and the amount of network traffic.&lt;/p&gt;

&lt;p&gt;Comparing Standard Logic Apps to Azure Functions, there is not much difference in hosting plan options and pricing.&lt;/p&gt;

&lt;p&gt;More detailed information on Logic Apps billing: &lt;a href="https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-pricing" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-pricing&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The real difference comes from using connectors. Consumption Logic Apps pricing is based on Logic App triggers and actions. For Standard Logic Apps, only actions incur extra costs.&lt;/p&gt;

&lt;p&gt;Additional pricing applies for each connector execution in a Logic App.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;th&gt;Price per execution&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Action&lt;/td&gt;
&lt;td&gt;€0.000024 (first 4000 free)&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Connector type&lt;/th&gt;
&lt;th&gt;Price per execution&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Standard&lt;/td&gt;
&lt;td&gt;€0.000118&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Enterprise&lt;/td&gt;
&lt;td&gt;€0.000942&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Using the information above, it's easy to conclude that the higher the number of connector executions in a Logic App, the higher the costs. Also, with consumption-based Logic Apps, triggers generate extra costs.&lt;/p&gt;

&lt;p&gt;So basically, if you compare a Consumption Logic App with a single connector and three million executions in a month to a Function App on the Consumption plan with the same number of requests, the pricing is roughly as follows, with a short breakdown of the Logic App figure after the table:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;th&gt;Free executions&lt;/th&gt;
&lt;th&gt;Price estimate&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Function App&lt;/td&gt;
&lt;td&gt;1,000,000&lt;/td&gt;
&lt;td&gt;~17€&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Logic App&lt;/td&gt;
&lt;td&gt;4,000&lt;/td&gt;
&lt;td&gt;~430€&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
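
&lt;p&gt;To put rough numbers behind the Logic App estimate: assuming one billed action and one Standard connector call per run, each run costs about €0.000024 + €0.000118 = €0.000142, so three million runs come to roughly €426 a month (the 4,000 free actions barely move the total), which is where the ~430€ figure comes from. The Function App estimate, in turn, mostly reflects execution-time (GB-s) charges rather than the per-execution fee.&lt;/p&gt;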

&lt;p&gt;And this is just with a single standard connector.&lt;br&gt;&lt;br&gt;&lt;/p&gt;



&lt;h2&gt;
  
  
  Final words
&lt;/h2&gt;

&lt;p&gt;Don't get me wrong, I really like the idea of a no-code/low-code approach in general. In addition to the points mentioned earlier, there's just this overall feeling of the service not being quite there yet, with the mixed capabilities and feature sets between Consumption and Standard Logic Apps.&lt;/p&gt;

&lt;p&gt;And as with all other Azure services, the decision on whether or not to use a certain service for your scenario depends on multiple factors such as transaction volume, available budget, and team composition and experience.&lt;/p&gt;

&lt;p&gt;I'd still use Logic Apps for simple, repeatable workflows such as importing/exporting files, sending notifications and so on, but using Logic Apps as an enterprise integration platform is an idea you might want to revisit.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Access private blobs with user credentials using Azure Function and App Service authentication</title>
      <dc:creator>Janne Pasanen</dc:creator>
      <pubDate>Wed, 23 Nov 2022 07:18:16 +0000</pubDate>
      <link>https://dev.to/jannepasanen/access-private-blobs-with-user-credentials-using-azure-function-and-app-service-authentication-4h23</link>
      <guid>https://dev.to/jannepasanen/access-private-blobs-with-user-credentials-using-azure-function-and-app-service-authentication-4h23</guid>
      <description>&lt;p&gt;I﻿ recently came across a requirement of an easy way of downloading blobs from Azure Storage Account and performing authorization as a user on the Storage Account level. Having read about App Service authentication, I wanted to try it out in this scenario. This enables the capability of creating downloadable links to blobs that are authorized as the requesting user on Storage Account level.&lt;/p&gt;

&lt;p&gt;App Service authentication greatly simplifies the authentication flow as the feature is integrated into the platform and takes care of authentication with federated identities, supporting multiple providers. The configuration is really simple and can be done using the Azure Portal.&lt;/p&gt;

&lt;p&gt;When the user opens the link to the Function App endpoint, the user is authenticated for the application. The Function App then acts on behalf of the user to request access to the Storage Account and generates a SAS URI, which is returned in the response headers to automatically redirect the user to it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--FbZGWz8G--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4j3qz9rz86xs7n6sq5lw.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FbZGWz8G--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4j3qz9rz86xs7n6sq5lw.PNG" alt="Image description" width="880" height="359"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;These are the bits and pieces used for building the solution:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Function App for hosting the client request endpoint and for performing user authentication and authorization.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Azure AD application for App Service authentication configuration.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Storage Account for storing blob files.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is the flow for requesting a single blob using App Service authentication:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Authenticate the client request in App Service.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Get access token for the application as the requesting user.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Request access to Storage Account using on-behalf-of credentials and access token.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Get blob data.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create blob SAS URI using user delegation key.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Azure AD Application
&lt;/h2&gt;

&lt;p&gt;The easiest way to create the application is to use the App Service authentication wizard from Azure Function App Authentication section.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wlKFFS-d--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jovitdo0ftb6h1zgh1kz.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wlKFFS-d--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jovitdo0ftb6h1zgh1kz.PNG" alt="Image description" width="880" height="353"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We start by selecting Authentication settings from Azure Portal and clicking Add identity provider.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xvm6tiN6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nxbzfezob03esir0os2n.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xvm6tiN6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nxbzfezob03esir0os2n.PNG" alt="Image description" width="880" height="1206"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We need to configure the application to use the Microsoft identity provider, which basically means Azure Active Directory. In this specific scenario I chose to require authentication only from users in the current tenant. The wizard configures the application in Azure AD for the authentication flow, including redirect URIs.&lt;/p&gt;

&lt;p&gt;Authentication settings can also be deployed using Bicep templates. There's an example of a simple configuration below. In this case, the Azure AD application would need to be created manually.&lt;/p&gt;

&lt;p&gt;We won't go into the details here, but feel free to experiment with it yourself. More information is available in the official Microsoft documentation &lt;a href="https://learn.microsoft.com/en-us/azure/templates/microsoft.web/sites/config-authsettingsv2?pivots=deployment-language-bicep"&gt;here&lt;/a&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource symbolicname 'Microsoft.Web/sites/config@2022-03-01' = {
  name: 'authsettingsV2'
  kind: 'string'
  parent: functionApp
  properties: {
    globalValidation: {
      requireAuthentication: true
      redirectToProvider: 'Microsoft'
    }
    identityProviders: {
      azureActiveDirectory: {
        enabled: true
        isAutoProvisioned: false
        registration: {
          clientId: clientPrincipalId
          clientSecretSettingName: 'AzureAdOptions__ClientSecret'
          openIdIssuer: 'https://sts.windows.net/${tenant().tenantId}/v2.0'
        }
        validation: {
          allowedAudiences: [
            'api://${clientPrincipalId}'
          ]
        }
        login: {
          loginParameters: ['scope=openid profile email offline_access']
        }
      }
    }
    login: {
      tokenStore: {
        enabled: true
        tokenRefreshExtensionHours: 72
      }      
    }
    httpSettings: {
      requireHttps: true
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In addition to creating the app in Azure AD, you need to create a client secret for the app to be able to request access tokens. Client secrets are also created in Azure AD.&lt;/p&gt;

&lt;h2&gt;
  
  
  Function App
&lt;/h2&gt;

&lt;p&gt;I used the configuration below for creating the Function App. I wanted to deploy my function in isolated mode, but you can also choose to use the "dotnet" runtime.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KCd8Ec8k--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/eselqu1kb4otk6sutfok.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KCd8Ec8k--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/eselqu1kb4otk6sutfok.PNG" alt="Image description" width="880" height="1207"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The function endpoint uses a wildcard route to be able to access any blob regardless of the folder structure. I implemented a safety mechanism to only allow downloading blobs from preconfigured containers, but it's up to you whether such a feature is needed.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[Function("BlobFunction")]
public async Task&amp;lt;HttpResponseData&amp;gt; Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "download/{container}/{*path}")] HttpRequestData req, string container, string path)
{
    _logger.LogInformation($"requesting SAS URI for {path} in {container}");

    if (!ValidateConfiguration())
    {
        _logger.LogError($"Invalid configuration!");
        return req.CreateResponse(HttpStatusCode.BadRequest);
    }

    string[] apiScopes = { _azureAdOptions.Scope };

    // Create confidential client application for requesting access tokens
    var app = ConfidentialClientApplicationBuilder.Create(_azureAdOptions.ClientId)
           .WithTenantId(_azureAdOptions.Tenant)
           .WithClientSecret(_azureAdOptions.ClientSecret)
           .Build();

    var headers = req.Headers;
    if (!headers.TryGetValues("X-MS-TOKEN-AAD-ID-TOKEN", out var token))
    {
        _logger.LogError($"Missing auth header");
        return req.CreateResponse(HttpStatusCode.Unauthorized);
    }

    // UserAssertion object is required to act on behalf of the user
    var userAssertion = new UserAssertion(token.First());
    var result = await app.AcquireTokenOnBehalfOf(apiScopes, userAssertion).ExecuteAsync();

    var accessToken = result.AccessToken;
    if (accessToken == null)
    {
        _logger.LogError("Access Token could not be acquired");
        return req.CreateResponse(HttpStatusCode.Unauthorized);
    }

    var client = GetBlobServiceClient(accessToken);

    // We need user delegation key to generate blob SAS uri
    var userDelegationKey = await client.GetUserDelegationKeyAsync(DateTimeOffset.UtcNow, DateTimeOffset.UtcNow.AddMinutes(15));
    var allowedContainers = Environment.GetEnvironmentVariable("AllowedContainers").Split(",");

    if (!allowedContainers.Contains(container))
    {
        _logger.LogError($"Container {container} not allowed");
        return req.CreateResponse(HttpStatusCode.Unauthorized);
    }

    var containerClient = client.GetBlobContainerClient(container);
    if (!await containerClient.ExistsAsync())
    {
        _logger.LogError($"Container {container} not found");
        return req.CreateResponse(HttpStatusCode.NotFound);
    }

    var blobClient = containerClient.GetBlobClient(path);
    if (!await blobClient.ExistsAsync())
    {
        _logger.LogError($"Blob {path} not found in container {container}");
        return req.CreateResponse(HttpStatusCode.NotFound);
    }

    // Generate SAS uri using user delegation key
    var sasUri = blobClient.GetSasUri(userDelegationKey);

    // Return the generated uri in response headers for automatic redirection
    var response = req.CreateResponse(HttpStatusCode.Redirect);
    response.Headers.Add("Location", sasUri.ToString());

    return response;
}

private BlobServiceClient GetBlobServiceClient(string accessToken)
{
    // Create on behalf of credentials from user's access token
    var onBehalfOfCredential = new OnBehalfOfCredential(_azureAdOptions.Tenant, _azureAdOptions.ClientId, _azureAdOptions.ClientSecret, accessToken);
    return new BlobServiceClient(new Uri(_blobServiceOptions.Url), onBehalfOfCredential);
}

private bool ValidateConfiguration()
{
    return !string.IsNullOrWhiteSpace(_azureAdOptions.Tenant) &amp;amp;&amp;amp;
        !string.IsNullOrWhiteSpace(_azureAdOptions.ClientId) &amp;amp;&amp;amp;
        !string.IsNullOrWhiteSpace(_azureAdOptions.ClientSecret) &amp;amp;&amp;amp;
        !string.IsNullOrWhiteSpace(_azureAdOptions.Scope) &amp;amp;&amp;amp;
        !string.IsNullOrWhiteSpace(_blobServiceOptions.Url);
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;App Service authentication uses specific headers to pass the tokens along with the request. In this case, we need to extract the user's ID token from the X-MS-TOKEN-AAD-ID-TOKEN header. We use that token to request an access token on behalf of the user, as the initial request does not contain a valid one for our application. The access token is then used to authenticate against the Storage Account.&lt;/p&gt;

&lt;p&gt;Generating the SAS URI requires a user delegation key for which the validity period was set to 15 minutes.&lt;/p&gt;

&lt;p&gt;The extension class for generating the SAS URI is shown below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;internal static class BlobClientExtensions
{
    internal static Uri GetSasUri(this BlobClient client, UserDelegationKey userDelegationKey)
    {
        var sasBuilder = new BlobSasBuilder()
        {
            BlobContainerName = client.BlobContainerName,
            BlobName = client.Name,
            Resource = "b",
            StartsOn = DateTimeOffset.UtcNow,
            ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(15),
        };

        sasBuilder.SetPermissions(BlobSasPermissions.Read);

        var blobUriBuilder = new BlobUriBuilder(client.Uri)
        {
            Sas = sasBuilder.ToSasQueryParameters(userDelegationKey, client.AccountName),
        };

        return blobUriBuilder.ToUri();
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We also need a configuration file (local.settings.json) for local development/testing. Replace the values with those from the created app registration.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "APPLICATIONINSIGHTS_CONNECTION_STRING": "***************************************",
    "AllowedContainers": "***",
    "AzureAdOptions:Tenant": "***********************************",
    "AzureAdOptions:ClientId": "************************************",
    "AzureAdOptions:ClientSecret": "****************************************",
    "AzureAdOptions:Scope": "api://*************************************/user_impersonation",
    "BlobServiceOptions:Url": "https://**********.blob.core.windows.net/"
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Storage Account
&lt;/h2&gt;

&lt;p&gt;The final thing is to grant the requesting users access to the Storage Account files. To enable reading blobs and creating SAS URIs for them, we need the &lt;strong&gt;Storage Blob Data Contributor&lt;/strong&gt; role on the Storage Account. This is enough for us to test our scenario, although a better practice would be to assign the role to AD groups rather than individual users.&lt;/p&gt;

&lt;h2&gt;
  
  
  Testing
&lt;/h2&gt;

&lt;p&gt;That’s it, we’re all set! To test the solution, upload a file to the Storage Account and point the link to the Function App endpoint with the matching container and path.&lt;/p&gt;
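
&lt;p&gt;With the wildcard route defined earlier and the default &lt;em&gt;api&lt;/em&gt; route prefix, the request URL should look roughly like &lt;code&gt;https://{yourfunctionapp}.azurewebsites.net/api/download/{container}/{path/to/blob}&lt;/code&gt; (the placeholders here are illustrative, not actual values).&lt;/p&gt;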

</description>
      <category>serverless</category>
      <category>c</category>
      <category>azure</category>
      <category>dotnet</category>
    </item>
  </channel>
</rss>
