<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Jon Stjernegaard Vöge</title>
    <description>The latest articles on DEV Community by Jon Stjernegaard Vöge (@jonvoge).</description>
    <link>https://dev.to/jonvoge</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1269365%2F1b86dfd0-f776-44dd-a9bc-3b1e6c9b59e4.jpg</url>
      <title>DEV Community: Jon Stjernegaard Vöge</title>
      <link>https://dev.to/jonvoge</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/jonvoge"/>
    <language>en</language>
    <item>
      <title>Guide: How to add Write Back capabilities to your Power BI reports with Power Apps — Part 3</title>
      <dc:creator>Jon Stjernegaard Vöge</dc:creator>
      <pubDate>Wed, 25 Sep 2024 07:00:00 +0000</pubDate>
      <link>https://dev.to/jonvoge/guide-how-to-add-write-back-capabilities-to-your-power-bi-reports-with-power-apps-part-3-kak</link>
      <guid>https://dev.to/jonvoge/guide-how-to-add-write-back-capabilities-to-your-power-bi-reports-with-power-apps-part-3-kak</guid>
      <description>&lt;h2&gt;
  
  
  Guide: How to add Write Back capabilities to your Power BI reports with Power Apps — Part 3
&lt;/h2&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;This article is part of a series on integrations between Power BI and Power Apps.&lt;/p&gt;

&lt;p&gt;Part 1 of the series showed how to get started with creating and embedding a basic Power App in a Power BI report for data write-back. Find the article here: &lt;a href="https://www.linkedin.com/pulse/guide-adding-write-back-capabilities-your-power-bi-stjernegaard-v%C3%B6ge" rel="noopener noreferrer"&gt;Link to Part 1!&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Part 2 taught you how to choose an appropriate Data Source for your write-back needs, and how to display the data live in Power BI with a DirectQuery connection. Find the article here: &lt;a href="https://www.linkedin.com/pulse/guide-add-write-back-power-bi-reports-apps-part-2-jon/" rel="noopener noreferrer"&gt;Link to Part 2!&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this Part 3 of the series, I’ll teach you a few quick tips &amp;amp; tricks for laying out your embedded Power Apps in Power BI, and show how to overcome the limitations of the Phone Layout that is forced upon Power Apps created from Power BI.&lt;/p&gt;

&lt;p&gt;In short, this article will teach you:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;How to change the sizing dimensions of Power Apps with Phone Layouts.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;How to identify the right sizing dimensions for your embedded Power App.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  1. Manipulating the size of the canvas in Power Apps made for Phone Layout
&lt;/h2&gt;

&lt;p&gt;If you have followed the first parts of this article series, or have tried creating a new Power App from within the Power Apps for Power BI visualization type in Power BI (which is necessary to enable the PowerBIIntegration.Refresh() function described in Part 2), you will inevitably realize that your Power App has no dimension properties available in its settings. You can only choose between ‘Portrait’ and ‘Landscape’ orientation…:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2448%2F0%2A7AuUr8cyp-n1j96J" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2448%2F0%2A7AuUr8cyp-n1j96J"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;… as opposed to the full dimension settings of a Power App with Tablet Layout:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2AygofZcoOlHmDy7dd" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2AygofZcoOlHmDy7dd"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The inability to alter the sizing of the app may result in the dreaded “grey borders” that often haunt the design of Power Apps embedded in Power BI reports:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2832%2F0%2AEUQ6-VpHr9Raa1sy" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2832%2F0%2AEUQ6-VpHr9Raa1sy"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;However, there is a workaround: instead of resizing your Power App from within the App Settings, you can simply change the Height and Width properties of the Screens in your Power App to whatever fits your needs.&lt;/p&gt;

&lt;p&gt;Before:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2At1wf92CnAq0cq3iJ" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2At1wf92CnAq0cq3iJ"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After changing the Screen Height and Width:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3472%2F0%2AcXpwFj-3vCQ4xwkc" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3472%2F0%2AcXpwFj-3vCQ4xwkc"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A bit of rearranging, and we’re left with a Power App that fits the dimensions of our visualisation in Power BI much better:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2976%2F0%2A1NHIrG5JPq7ktNHC" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2976%2F0%2A1NHIrG5JPq7ktNHC"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Use this workaround with caution, however, as it may produce unexpected end-user experiences. Make sure that all of your screens are sized identically, and that the Power App is only used in a setting where the chosen sizing makes sense.&lt;/p&gt;
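&lt;p&gt;As a minimal sketch, assuming a target canvas of 1200x600 (the numbers are placeholders for whatever fits your report), the sizing properties of every screen could be set like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Width property of each screen (the default formula is Max(App.Width, App.MinScreenWidth))
1200

// Height property of each screen (the default formula is Max(App.Height, App.MinScreenHeight))
600
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Setting the same fixed values on every screen overrides the Phone Layout sizing consistently across the app.&lt;/p&gt;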

&lt;h2&gt;
  
  
  2. Identifying the proper sizing dimensions of your embedded Power Apps
&lt;/h2&gt;

&lt;p&gt;But what size should you then choose for your Screen Height and Width?&lt;/p&gt;

&lt;p&gt;Trial and error can be cumbersome. A quicker way to identify the correct Height and Width is to open your Power BI report and resize the Power Apps for Power BI visualisation to the size you need. Then, in the formatting options of the visualization, make note of its height and width, and use those same dimensions inside your Power App to make it fit snugly within the frame in Power BI.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2840%2F0%2AFMLzZYexgcKlGsie" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2840%2F0%2AFMLzZYexgcKlGsie"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;My personal experience is that content inside embedded Power Apps often appears slightly too large. If you experience this too, a bonus tip is to scale up the Power App’s size dimensions, which makes everything appear smaller without you having to resize each individual element. Just make sure to keep the height-to-width ratio constant as you adjust (e.g. to make elements in your 1000x500 Power App appear smaller, try adjusting the size of the App to 1200x600 within Power Apps).&lt;/p&gt;

&lt;p&gt;Here I made small adjustments to make the content appear slightly smaller:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2976%2F0%2AWjOUkGaltvn65pkk" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2976%2F0%2AWjOUkGaltvn65pkk"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;Power Apps created from within Power BI will always be in Phone Layout. Use the Width and Height properties of the individual screens in your Power App to manually alter its size. For sizing guidance, use the dimensions of the Power Apps visualization in Power BI as a starting point.&lt;/p&gt;

</description>
      <category>powerplatform</category>
      <category>powerfuldevs</category>
      <category>microsoft</category>
    </item>
    <item>
      <title>Guide: How to add Write-Back capabilities to Power BI reports with Power Apps — Part 2</title>
      <dc:creator>Jon Stjernegaard Vöge</dc:creator>
      <pubDate>Tue, 24 Sep 2024 07:00:00 +0000</pubDate>
      <link>https://dev.to/jonvoge/guide-how-to-add-write-back-capabilities-to-power-bi-reports-with-power-apps-part-2-5064</link>
      <guid>https://dev.to/jonvoge/guide-how-to-add-write-back-capabilities-to-power-bi-reports-with-power-apps-part-2-5064</guid>
      <description>&lt;h2&gt;
  
  
  Guide: How to add Write-Back capabilities to Power BI reports with Power Apps — Part 2
&lt;/h2&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;This article is part of a series on integrations between Power BI and Power Apps.&lt;/p&gt;

&lt;p&gt;Part 1 showed how to get started with creating and embedding a basic Power App in a Power BI report for data write-back.&lt;/p&gt;

&lt;p&gt;Part 2, this article, will teach you:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;How to pick a data source for your Power Apps output data&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;How to output data to the selected source&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;How to make the data appear Live in your Power BI report, without the user having to refresh anything.&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  1. Choosing the right Data Source for your Power Apps output
&lt;/h2&gt;

&lt;p&gt;Before we can show any kind of data created from Power Apps in a Power BI report, we need to store it somewhere. Choosing the right Data Source for hosting your data is an important decision. Most often, the choice is between SharePoint, a SQL Server and Dataverse.&lt;/p&gt;

&lt;p&gt;Two main tradeoffs may influence your choice:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Timeliness of data (update frequency) vs. License Costs&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Security &amp;amp; Governance vs. Availability&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;First, you need to ask yourself (or your users) how frequently the data needs to be updated. If the answer is fewer than 8 times a day, you may be just fine using SharePoint as a Data Source. If you need more than that, perhaps even &lt;em&gt;live&lt;/em&gt; data, you will need either a SQL Server or Dataverse connection. The kicker, however, is that a Power App using Dataverse or SQL Server data connections requires Power Apps Premium licensing, as opposed to the standard license required for SharePoint. Refer to Microsoft’s own website for guidance on licenses: &lt;a href="https://powerapps.microsoft.com/en-us/pricing/" rel="noopener noreferrer"&gt;Link to Power Apps Licensing&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Second, you may want to consider how sensitive your data is, and how accessible it should be. SharePoint is easy to get started with and accessible for users, but has limited governance and security options. Dataverse and SQL Server allow much more comprehensive security setups. If you are picking between those two, it is often a matter of preference, and the easiest route may be to piggyback on whatever solution already exists in your organization.&lt;/p&gt;

&lt;p&gt;In short:&lt;/p&gt;

&lt;p&gt;Need low cost, low security, infrequent data refresh or high accessibility? &lt;strong&gt;Pick SharePoint.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Need high security or frequent/live data refreshes? &lt;strong&gt;Pick Dataverse or SQL Server.&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  2. Outputting data to your data source
&lt;/h2&gt;

&lt;p&gt;After selecting your data source, prepare an empty table at the destination with the columns and data types you wish to input. I am going to input Sales Budgets by Product Category, and have created a table in a SQL database with three columns. Make sure that you define a Primary Key on the table if you are using SQL Server, as that is required to update records in the table from Power Apps:&lt;/p&gt;
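&lt;p&gt;As a rough sketch, such a table could be created with T-SQL along these lines (the table and column names match my example; the column types are assumptions, so adjust them to your own needs):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;-- Hypothetical DDL for the write-back target table.
-- The PRIMARY KEY is what allows Power Apps to update existing records.
CREATE TABLE dbo.SalesBudget (
    InputID         INT            NOT NULL PRIMARY KEY,
    ProductCategory NVARCHAR(100)  NOT NULL,
    SalesAmount     DECIMAL(18, 2) NOT NULL
);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;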

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2AdpIITAtM8elQ5a1k" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2AdpIITAtM8elQ5a1k"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, open up your Power BI report with your embedded Power App. The best way to work with embedded Power Apps is to open the App from inside Power BI, as that sends data from Power BI over to Power Apps for you to work with. To do so, open your report from app.powerbi.com, enter Edit Mode for the report, open the menu of the Power Apps visualization, and select “Edit”. This should prompt your Power App to open in edit mode in another browser window, with the data from your Power BI report available.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3180%2F0%2ALqg63eVkwLwdgvWf" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F3180%2F0%2ALqg63eVkwLwdgvWf"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With the Power Apps studio open, find the Data tab on the left hand side, add a new data connection to your data source of choice, and connect to the table you created.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2A0NAsn-PxcXfel_Su" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2A0NAsn-PxcXfel_Su"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now you need to create a screen in your App for submitting data to your new data connection. I have created a simple input interface with dropdown and text input controls that let users enter budget amounts per Product Category, as well as a button control which submits the data to the data source:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2976%2F0%2A84boJb_35PmRWhyc" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2976%2F0%2A84boJb_35PmRWhyc"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Note that I am &lt;em&gt;not&lt;/em&gt; using a Form control here, but instead built my own form with simple controls, as seen in the Tree View. While Forms may seem simpler to set up initially, they are difficult to customize and handle submitting multiple rows of data poorly, if you ever run into that requirement. Instead, I configured my Submit Budget button with the following code:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;With(
    {
        // Look up any existing budget row for the selected category
        BudgetExists: LookUp(
            SalesBudget,
            ProductCategory = dd_Category.Selected.Category
        )
    },
    If(
        // No row yet: create a new record
        IsBlank(BudgetExists),
        Patch(
            SalesBudget,
            Defaults(SalesBudget),
            {
                InputID: Last(SalesBudget).InputID + 1,
                ProductCategory: dd_Category.Selected.Category,
                SalesAmount: Value(ti_BudgetValue.Text)
            }
        ),
        // Row exists: update its amount
        Patch(
            SalesBudget,
            BudgetExists,
            {
                SalesAmount: Value(ti_BudgetValue.Text)
            }
        )
    )
);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The code uses two of my favourite functions: With(), which looks up whether an input already exists for the selected ProductCategory and makes the result reusable throughout the formula, and Patch(), which writes new records and updates existing records in the data source, instead of using a Form.&lt;/p&gt;

&lt;p&gt;Further, I decided to populate the Text Input control with the existing value for the selected ProductCategory, if one exists:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;LookUp(SalesBudget, ProductCategory = dd_Category.Selected.Category).SalesAmount
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;With your custom form set up, save and publish your Power App, reload your Power BI report, and test that you are able to create rows of data in your data source using the embedded Power App.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2976%2F0%2AexFkW-Lk9_8RVmGO" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2976%2F0%2AexFkW-Lk9_8RVmGO"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  3. Retrieving the data with Power BI, and displaying it live for users, without the need for refreshes
&lt;/h2&gt;

&lt;p&gt;Now wouldn’t it be nice if the values we input with our App just showed up in the same report? I would say so!&lt;/p&gt;

&lt;p&gt;In Power BI, add a new Data Source to your report, and connect to the data source and table to which you output your Power Apps data. During this setup, you have an important choice to make: to DirectQuery or not to DirectQuery.&lt;/p&gt;

&lt;p&gt;If you went with the low-cost/low-security route of SharePoint, you are unfortunately limited to Import mode, which means your data only updates when your Power BI dataset is refreshed.&lt;/p&gt;

&lt;p&gt;If you are using SQL Server or Dataverse however, choosing DirectQuery will enable you to query data from the data source live, and new/updated data coming from the Power App may be reflected immediately in your report. However, DirectQuery comes with a number of limitations you should be aware of. See the Microsoft documentation for more details: &lt;a href="https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-directquery-about" rel="noopener noreferrer"&gt;Link to documentation.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I will select DirectQuery, as I wish to provide my users with live feedback on the impact of their inputs. After setting up the new Data Source, create any relevant relationships to other tables in your Data Model. You can now build a visualization using the new data coming from the embedded Power App, or you may use the data in existing visualisations as well! Here I have chosen to show the Budget Amounts inputted in my Power App, together with the Sales Amounts from my original dataset.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2AslmuemUmba4WpJIW" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2AslmuemUmba4WpJIW"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you save and test your report and app now, you will most likely be disappointed to notice that when you submit data in your Power App, nothing happens in Power BI until you refresh your browser. What is actually happening is that DirectQuery is not truly “live”: the source is only re-queried when a visual refresh takes place. Luckily, we can force this to happen automatically!&lt;/p&gt;

&lt;p&gt;Return to Power Apps Studio, and find the button control that submits your data to the data source. You simply need to add the function PowerBIIntegration.Refresh() at the end of the code in your button’s OnSelect property, to trigger a visual refresh of the entire Power BI report after you are done patching data to the data source. This effectively makes the data show up immediately in the Power BI report.&lt;/p&gt;
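&lt;p&gt;As a minimal sketch, the OnSelect formula from earlier would then end like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;With(
    { BudgetExists: LookUp(SalesBudget, ProductCategory = dd_Category.Selected.Category) },
    If(
        IsBlank(BudgetExists),
        Patch(SalesBudget, Defaults(SalesBudget), { InputID: Last(SalesBudget).InputID + 1, ProductCategory: dd_Category.Selected.Category, SalesAmount: Value(ti_BudgetValue.Text) }),
        Patch(SalesBudget, BudgetExists, { SalesAmount: Value(ti_BudgetValue.Text) })
    )
);
// After patching, trigger a visual refresh of the Power BI report
PowerBIIntegration.Refresh()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;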

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2976%2F0%2AHeQaLf-xvygEu63Y" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2976%2F0%2AHeQaLf-xvygEu63Y"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There is one catch to the PowerBIIntegration.Refresh function: It is only available within Power Apps which have originally been created from within the Power Apps for Power BI visualisation type in Power BI, by clicking on “Create New” inside the visualisation. The function is not available in a Power App created ‘normally’ from make.powerapps.com, and connected to a Power BI report at a later stage. See Part 1 of this article series for further details.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.linkedin.com/embeds/publishingEmbed.html?articleId=8322741095634091622&amp;amp;li_theme=light" rel="noopener noreferrer"&gt;https://www.linkedin.com/embeds/publishingEmbed.html?articleId=8322741095634091622&amp;amp;li_theme=light&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Congratulations! You are now able to write data back to your data source and display it live in your report!&lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;In this article you learned how to make your Power Apps write-back entries show up in Power BI, and how to trigger visual refreshes of your Power BI reports from Power Apps, so that DirectQuery data is displayed live without the need for manual refreshes.&lt;/p&gt;

&lt;p&gt;Do you have any interesting use cases for this feature, or ideas you want sparring on? Feel free to reach out.&lt;/p&gt;

</description>
      <category>powerplatform</category>
      <category>powerfuldevs</category>
      <category>powerbi</category>
      <category>microsoft</category>
    </item>
    <item>
      <title>Guide: How to add Write-Back capabilities to your Power BI reports with Power Apps - Part 1</title>
      <dc:creator>Jon Stjernegaard Vöge</dc:creator>
      <pubDate>Mon, 23 Sep 2024 07:00:00 +0000</pubDate>
      <link>https://dev.to/jonvoge/guide-how-to-add-write-back-capabilities-to-your-power-bi-reports-with-power-apps-part-1-49ek</link>
      <guid>https://dev.to/jonvoge/guide-how-to-add-write-back-capabilities-to-your-power-bi-reports-with-power-apps-part-1-49ek</guid>
      <description>&lt;h2&gt;
  
  
  Guide: How to add Write-Back capabilities to your Power BI reports with Power Apps — Part 1
&lt;/h2&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Power Apps work great as a standalone tool for replacing your old spreadsheet with a more structured and cloud native data entry solution. However, what if I told you that:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Power Apps can be embedded into your Power BI Reports&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Power Apps can display, and let users interact with, data coming from your Power BI report&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Power Apps integrated in Power BI reports can be filtered by your users’ selections in Power BI&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Power Apps can write data back to your Power BI data source, and create, edit and delete rows of data. These changes can be shown instantly in the report!&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In this article, I will cover how to get started with embedding Power Apps into your Power BI Reports, as well as provide a few examples of common use cases.&lt;/p&gt;

&lt;h2&gt;
  
  
  Getting Started with Embedded Power Apps
&lt;/h2&gt;

&lt;p&gt;To get started with Embedded Power Apps, create a new Power BI Report in Power BI Desktop, or on the Power BI Service. Connect the report to your dataset, and create a simple visualisation. In this case, I’ve created a simple table.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2ACW_btjTyxIlxewgu" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2ACW_btjTyxIlxewgu"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2AYGTY7eWqjLVFoqhr" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2AYGTY7eWqjLVFoqhr"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;TIP: If you started in Power BI Desktop, now is the time to Publish the file to your workspace, and open it in Edit Mode in the Power BI Service, as that will make it easier to author your Power Apps.&lt;/p&gt;

&lt;p&gt;Next, insert a new Power Apps for Power BI visualization. This is a standard visualization type in Power BI. In order to send data from Power BI to Power Apps, drag any relevant fields from your Data Source onto the Fields list of your Power Apps visualization, just like you would with any other visualization in Power BI. You may choose to include Fields that you want to display and interact with, or any fields that you wish to be able to filter the App by. New fields can be added at any point, so don’t worry about forgetting any at this stage.&lt;/p&gt;

&lt;p&gt;Once you start adding Fields to the visualisation, its design changes, giving you two options:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2Ad88nvGB1DP9fpP8J" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2Ad88nvGB1DP9fpP8J"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Since we are creating a new embedded Power App, select the “Create New” option. This prompts your browser to open a new tab with the Power Apps editor, showing a simple app with a gallery that contains some of the data from your Power BI report. This data is available in the App through the PowerBIIntegration element that you can see in the Tree View of the app. This element can be called and treated like any other data source you would use in your Power Apps:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2Awp-vdUCfaFKVVZX2" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2Awp-vdUCfaFKVVZX2"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To test that all the building blocks are working as intended, create a new screen in your app. You can use a template, or create a blank screen. Just ensure that your new screen has a Gallery, and that the ‘Items’ property of the gallery is set to PowerBIIntegration.Data (without quotes). This is the formula used to call the data coming from Power BI. Then add a new Text Label to your Gallery, and use ThisItem.&lt;em&gt;YourFieldName&lt;/em&gt; as its Text property, substituting one of the Fields you added to the Power Apps visualization inside Power BI for &lt;em&gt;‘YourFieldName’&lt;/em&gt;.&lt;/p&gt;
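&lt;p&gt;As a minimal sketch (the field name is a placeholder for one of your own fields):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Items property of the gallery: the rows pushed over from Power BI
PowerBIIntegration.Data

// Text property of a label inside the gallery (replace YourFieldName with a real field)
ThisItem.YourFieldName
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;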

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2AMMmuNUd31EFAvmF7" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2AMMmuNUd31EFAvmF7"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, to test our very basic app, save and publish the Power App, and jump back into Power BI. If you created your report in the Power BI Service, make sure to save the report before refreshing your browser page; otherwise your changes are lost and you will need to recreate the report and/or the Power Apps visualization. If you created your report in Power BI Desktop, publish the report to your workspace to make testing the Power App more seamless. After saving and publishing both the Power App and your Power BI report, refresh the report page and you should see your Power App render inside the report. Try out the interaction between Power BI and the Power App by selecting a data point in your initial visualization, and check that the Power App is filtered correctly.&lt;/p&gt;

&lt;p&gt;Now you have all the basic building blocks for building Embedded Power Apps which interact with your Power BI Report. From here, you might modify your App to write new data to a data source, or even play around with having your App edit the data that your Power BI Report is based on.&lt;/p&gt;

&lt;p&gt;Continue reading below for a few examples of Embedded Power Apps in practice.&lt;/p&gt;

&lt;h2&gt;
  
  
  Embedded Power Apps to power up report comment functionality
&lt;/h2&gt;

&lt;p&gt;The Power BI Service does include comment functionality, but it is limited, and the comment data can’t be extracted and used elsewhere. A very simple Power App can be configured to write new comments to a data source, taking into account the filter context of your Power BI report, in order to better filter, display and use the comments for future analysis and decisions.&lt;/p&gt;

&lt;p&gt;In this example, the Power App is used for creating, editing and displaying comments on Financial Statements, which are filtered by Department and YearMonth, to allow more granular commenting and analysis:&lt;/p&gt;
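&lt;p&gt;As a rough sketch, the submit button of such a comment app might run something like the following (the Comments table, the control name ti_Comment and the field names are assumptions for illustration):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Write a new comment, stamped with the filter context pushed from Power BI
Patch(
    Comments,
    Defaults(Comments),
    {
        CommentText: ti_Comment.Text,
        Department: First(PowerBIIntegration.Data).Department,
        YearMonth: First(PowerBIIntegration.Data).YearMonth,
        Author: User().Email
    }
)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;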

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2AuW6tay435iTmHQ5t" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2AuW6tay435iTmHQ5t"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Embedded Power Apps with Direct Query connection to Power BI for live feedback
&lt;/h2&gt;

&lt;p&gt;Another common use case is having Power Apps write data back to a data source like Dataverse or a SQL database, which allows a DirectQuery connection to Power BI. This makes it possible to visualize the data entered through the embedded Power App live in the same Power BI report, increasing the potential for interactive reporting solutions.&lt;/p&gt;

&lt;p&gt;Use case examples include Budget Engines, in which analysts dynamically adjust input values that are picked up by complex calculations in Power BI, in order to test different forecasting scenarios. Or checklist scenarios, in which analysts use the DirectQuery functionality to check off items or assignments as they go along, dramatically enhancing the interactive potential of Power BI.&lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;Data Entry requirements are notoriously difficult to satisfy with Self Service BI tools alone, but Power Apps integrated in Power BI reports provide a powerful and yet approachable way to get started. If you have questions or comments, you are always welcome to reach out.&lt;/p&gt;

</description>
      <category>powerplatform</category>
      <category>powerfuldevs</category>
      <category>microsoft</category>
      <category>database</category>
    </item>
    <item>
      <title>Ingesting API data to Microsoft Fabric with low-code Dataflows: A Practical Example using the Strava API</title>
      <dc:creator>Jon Stjernegaard Vöge</dc:creator>
      <pubDate>Wed, 18 Sep 2024 07:00:00 +0000</pubDate>
      <link>https://dev.to/jonvoge/ingesting-api-data-to-microsoft-fabric-with-low-code-dataflows-a-practical-example-using-the-strava-api-2i5a</link>
      <guid>https://dev.to/jonvoge/ingesting-api-data-to-microsoft-fabric-with-low-code-dataflows-a-practical-example-using-the-strava-api-2i5a</guid>
      <description>&lt;h2&gt;
  
  
  Ingesting API data to Microsoft Fabric with low-code Dataflows: A Practical Example using the Strava API
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq2j5hvlo7ppep6hal4mz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq2j5hvlo7ppep6hal4mz.png" width="800" height="240"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Working with API data is often a task reserved for pro-coders. But in Microsoft Fabric, we can leverage the low-code Dataflows Gen2 to connect to any API and output the data to our Fabric Data Lakehouse or Data Warehouse.&lt;/p&gt;

&lt;p&gt;The article below takes you through the entire process, step by step: connecting to Strava’s API, setting up automatic refresh of authorization tokens, downloading and transforming all athlete activities, and outputting them to a Microsoft Fabric data destination.&lt;/p&gt;

&lt;h2&gt;
  
  
  Strava prerequisites
&lt;/h2&gt;

&lt;p&gt;For this guide, I am using the Strava API, as it contains data that I as a hobby cyclist have a personal interest in extracting and visualizing. However, many of the principles described will apply to other APIs from applications you use. I may do a Spotify example in a future blogpost.&lt;/p&gt;

&lt;p&gt;To get started with the Strava API, simply register an account or log in at &lt;a href="http://Strava.com" rel="noopener noreferrer"&gt;Strava.com&lt;/a&gt;. Then go to &lt;a href="https://www.strava.com/settings/api" rel="noopener noreferrer"&gt;Strava | API&lt;/a&gt; and create an App. Then extract the following pieces of information and keep them somewhere safe:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Client ID&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Client Secret&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Don’t bother with the Authorization Token and Refresh Token, as we will get these programmatically.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6BibPUvi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Apo1L_5Nod973PcPb" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6BibPUvi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Apo1L_5Nod973PcPb" width="800" height="614"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  User Authorization
&lt;/h2&gt;

&lt;p&gt;Next, you’ll need to Authorize your newly created app to access the data of one or more Strava users.&lt;/p&gt;

&lt;p&gt;This is done by sharing the following link with the person(s) you wish to extract data from, having them log in with their Strava account, and press Authorize:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://www.strava.com/oauth/authorize?client_id={InsertYourClientIDHere}&amp;amp;response_type=code&amp;amp;redirect_uri=http://localhost/exchange_token&amp;amp;approval_prompt=force&amp;amp;scope=read,activity:read
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After clicking ‘Authorize’, you will be redirected to a localhost URL, and you should now copy the ‘Code’ part of the URL displayed:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Example: http://localhost/exchange_token?state=&amp;amp;code=f97ce75c0113eba8926629369e5843242cc8e9be&amp;amp;scope=read,activity:read

Copy this part: f97ce75c0113eba8926629369e5843242cc8e9be
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Note that this code is short-lived, and will need to be used within a few hours for proper API authorization. Luckily, we will do that right away with Postman.&lt;/p&gt;

&lt;h2&gt;
  
  
  Initial Token Exchange using Postman
&lt;/h2&gt;

&lt;p&gt;Next, we need to do the initial Token Exchange by using the code we just retrieved. I prefer doing this with &lt;a href="https://www.postman.com/downloads/" rel="noopener noreferrer"&gt;Postman&lt;/a&gt;, but you can use whichever tool you wish.&lt;/p&gt;

&lt;p&gt;In Postman, create a new POST request with &lt;a href="https://www.strava.com/oauth/token" rel="noopener noreferrer"&gt;https://www.strava.com/oauth/token&lt;/a&gt; as the URL. Then add the following four parameters:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;client_id: Add the Client ID you retrieved from your Strava API app&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;client_secret: Add the Client Secret you retrieved from your Strava API app&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;code: Add the code you got from the user authorization URL in the previous step&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;grant_type: Add the value “authorization_code”&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In Postman, these parameters will automatically be added to your POST request URL.&lt;/p&gt;

&lt;p&gt;With all the details entered, press Send, and you should receive a Status Code 200 “OK” response, with a JSON body that includes a Bearer token and a Refresh Token valid for the user in question.&lt;/p&gt;
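&lt;p&gt;If you prefer the command line, a roughly equivalent request with curl would look like this (replace the placeholders with your own values):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -X POST "https://www.strava.com/oauth/token" \
  -d client_id=YourClientID \
  -d client_secret=YourClientSecret \
  -d code=YourAuthorizationCode \
  -d grant_type=authorization_code
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;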

&lt;p&gt;Below you can see my sample request + result, retrieving a Bearer + Refresh token for my own Strava User Profile:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--p5XjaBDT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A-CtM2DKn7xwD4aIV" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--p5XjaBDT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A-CtM2DKn7xwD4aIV" width="800" height="481"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Write down the Refresh_Token and Access_Token in the same place as your Client ID and Secret. We will need them in the following steps.&lt;/p&gt;

&lt;p&gt;Also write down the Athlete ID, as we will need that as well.&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating our Dataflow + Setting up automatic Token Refresh
&lt;/h2&gt;

&lt;p&gt;Now it’s finally time to jump into Microsoft Fabric.&lt;/p&gt;

&lt;p&gt;First, find a suitable Workspace, and create a Warehouse or Lakehouse to be used as Data Destination for the Strava Data. Then, create a new Dataflow Gen2 to be used for the Strava API:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fvMHmlf2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AllexNguhqI3WBB2L" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fvMHmlf2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AllexNguhqI3WBB2L" width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, we need to create a query in our Dataflow to refresh our Access Token, so that we don’t accidentally try to download data with an expired token.&lt;/p&gt;

&lt;p&gt;In your Dataflow create a new blank query, and paste the following into the Advanced Editor:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let
    // Request a fresh access token using the long-lived refresh token
    URL = "https://www.strava.com/api/v3/oauth/token",
    Headers = [#"Content-Type" = "application/json"],
    Body = "{""client_id"": ""InsertYourClientIDHere"", ""client_secret"": ""InsertYourClientSecretHere"", ""grant_type"": ""refresh_token"", ""refresh_token"": ""InsertYourRefreshTokenHere""}",

    // POST the request and parse the JSON response
    Source = Json.Document(Web.Contents(URL, [Headers = Headers, Content = Text.ToBinary(Body)])),

    // Keep only the access_token value from the response record
    #"Converted to table" = Record.ToTable(Source),
    #"Filtered rows" = Table.SelectRows(#"Converted to table", each ([Name] = "access_token")),
    #"Drill down" = #"Filtered rows"{0}[Value]
in
    #"Drill down"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The code above creates a query that makes a POST request to the &lt;a href="https://www.strava.com/api/v3/oauth/token" rel="noopener noreferrer"&gt;https://www.strava.com/api/v3/oauth/token&lt;/a&gt; URL just like we did in Postman, but this time the grant_type is “refresh_token”, meaning we wish to refresh our access and receive a new Access Token.&lt;/p&gt;

&lt;p&gt;POST requests are notoriously difficult to work with in Power Query / Dataflows, but a few pointers if you find yourself working with them:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The difference between a GET and POST request in Dataflows / Power Query, is that a POST request will include a [Content = Text.ToBinary(InsertBodyHere)] parameter.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Body of the Content parameter almost always needs to be wrapped in a Text.ToBinary() formula to be sent correctly to the API.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Body should be formatted as JSON, which is also fiddly to get right. For this request, the syntax used is: "{""parameterkey1"": ""parametervalue1"", ""parameterkey2"": ""parametervalue2""}". Notice that every key and value is wrapped in doubled double quotes, and that the whole JSON string, curly brackets included, is itself enclosed in quotes. This is not a mistake but very intentional, as it is the only format that works.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you did everything correctly, the result should be a query which returns a new Refresh and Access Token, exactly like the Postman request. The difference is that we no longer need the code from the User Authorization URL! The refresh token alone now gets us new access tokens.&lt;/p&gt;

&lt;p&gt;Hence, we can call this query every time we wish to extract new data, to ensure we have the latest token:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mIcYri_9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2At1O6fj6AtQ4jSK7h" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mIcYri_9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2At1O6fj6AtQ4jSK7h" width="800" height="431"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Extracting Strava Data
&lt;/h2&gt;

&lt;p&gt;It is finally time to extract some data from the API.&lt;/p&gt;

&lt;p&gt;In my case, I want to retrieve Activity data for my user profile, as documented in the Strava API: &lt;a href="https://developers.strava.com/docs/reference/#api-Activities-getLoggedInAthleteActivities" rel="noopener noreferrer"&gt;Strava Developers&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Downloading all my data involves some complexity, as the API is paginated and returns at most 100 activities per call. Hence, I first need to create one query with an API call that identifies the number of activities (and therefore pages) to download, and then another query that requests each page of items from the API (e.g. 950 activities means 10 pages).&lt;/p&gt;

&lt;h3&gt;
  
  
  Number of Pages to download
&lt;/h3&gt;

&lt;p&gt;First, create a new Text parameter, “AthleteID”, and set its default value to the ID of the Athlete you authenticated earlier, which you wrote down after the initial token exchange.&lt;/p&gt;

&lt;p&gt;Then, create a new blank query, “fxGetNumberOfPages”, and paste the following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;(Token as text) =&amp;gt;

let
    // Call the athlete stats endpoint, authenticating with the supplied token
    RelativePath = "/" &amp;amp; AthleteID &amp;amp; "/stats",
    Payload = [RelativePath = RelativePath, Headers = [Authorization = Token]],
    Source = Web.Contents("https://www.strava.com/api/v3/athletes", Payload),
    JSON = Json.Document(Source)
in
    JSON
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This query is a reusable function that takes an Authorization Token as input; we will use it in the next query.&lt;/p&gt;

&lt;p&gt;Create another new query, “GetPages”, and paste the following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
let
    // Fetch athlete stats using the refreshed access token
    Source = fxGetNumberOfPages("Bearer " &amp;amp; GetRefreshToken),
    #"Converted to table" = Record.ToTable(Source),

    // Sum the all-time ride, run and swim counts to get the total number of activities
    #"Filtered rows" = Table.SelectRows(#"Converted to table", each ([Name] = "all_ride_totals" or [Name] = "all_run_totals" or [Name] = "all_swim_totals")),
    #"Expanded Value" = Table.ExpandRecordColumn(#"Filtered rows", "Value", {"count"}, {"Value.count"}),
    #"Changed column type" = Table.TransformColumnTypes(#"Expanded Value", {{"Value.count", Int64.Type}}),
    #"Calculated sum" = List.Sum(#"Changed column type"[Value.count]),

    // At 100 activities per page, the page count is the total divided by 100, rounded up
    #"Divided value" = #"Calculated sum" / 100,
    #"Rounded up" = Number.RoundUp(#"Divided value"),

    // Build a one-column table with one row per page number (1..n)
    #"Converted to table 1" = #table(1, {{#"Rounded up"}}),
    #"Added custom" = Table.AddColumn(#"Converted to table 1", "StartPage", each 1),
    #"Added custom 1" = Table.AddColumn(#"Added custom", "Pages", each {[StartPage]..[Column1]}),
    #"Expanded Custom" = Table.ExpandListColumn(#"Added custom 1", "Pages"),
    #"Removed columns" = Table.RemoveColumns(#"Expanded Custom", {"Column1", "StartPage"}),
    #"Changed column type 1" = Table.TransformColumnTypes(#"Removed columns", {{"Pages", Text.Type}})
in
    #"Changed column type 1"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This code runs the function we just defined, using the refreshed Access Token from ‘GetRefreshToken’ as the token input. The result is a JSON response from the API, which the subsequent steps in the query format into a simple one-column table, with one row for each page of items to be downloaded:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fwe-QsWi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AF9TGfWQgsabKlVbl" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fwe-QsWi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AF9TGfWQgsabKlVbl" width="800" height="391"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--w664F81f--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AVu8RwvIxnHsDh4_s" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--w664F81f--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AVu8RwvIxnHsDh4_s" width="800" height="391"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Downloading the pages
&lt;/h3&gt;

&lt;p&gt;Create a new blank query, and paste in the following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
let
    Source = GetPages,

    // Add the bearer token and the relative URL for each page
    #"Added custom" = Table.TransformColumnTypes(Table.AddColumn(Source, "Token", each "Bearer " &amp;amp; GetRefreshToken), {{"Token", type text}}),
    #"Added custom 1" = Table.AddColumn(#"Added custom", "RelativePath", each "/activities?page=" &amp;amp; [Pages] &amp;amp; "&amp;amp;per_page=100"),
    #"Changed column type" = Table.TransformColumnTypes(#"Added custom 1", {{"RelativePath", type text}}),

    // Perform the GET request for each page
    #"Added custom 3" = Table.AddColumn(#"Changed column type", "Custom", each Web.Contents("https://www.strava.com/api/v3/athlete", [RelativePath = [RelativePath], Headers = [Authorization = [Token]]])),

    // Parse each binary response as JSON and return the combined table
    CombineBinaries = Table.AddColumn(#"Added custom 3", "Custom2", each Json.Document([Custom]))
in
    CombineBinaries
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The query references the “GetPages” query to start from a one-column table of pages to download. We then add a column holding the Access Token, as well as a column constructing the relative part of the URL.&lt;/p&gt;

&lt;p&gt;Finally, the API GET request is performed with Web.Contents(), and the responses are parsed and combined into one table:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--iEw0MGaP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AyBwsp7jE7EGjJ6At" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--iEw0MGaP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AyBwsp7jE7EGjJ6At" width="800" height="431"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I struggled quite a bit getting this to work well in Fabric. I was often able to get something working inside the Dataflow editor, but most of the time the automatic refresh would fail. However, the above code, with its separation of base URL and relative path, works for me.&lt;/p&gt;

&lt;p&gt;After a bit of transformation and filtering, you end up with a final table of all the athlete’s activities, and you can add your data destination of choice (in my case a Warehouse) before publishing your dataflow:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--iU3HSZGW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A3-F00qki0sq8FGRh" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--iU3HSZGW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A3-F00qki0sq8FGRh" width="800" height="430"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;My Strava data now lives in my Fabric Data Warehouse, is automatically updated on a schedule, and can be used for any number of things. I can use it in my Semantic Models, create new Views and Queries, and use the data in Power BI reports as I please.&lt;/p&gt;

&lt;p&gt;Working with APIs in Dataflows is not the easiest, and because the editor can sometimes produce in-view results that a scheduled refresh cannot reproduce, they can even be hard to debug.&lt;/p&gt;

&lt;p&gt;However, with a bit of persistence, trial and error, it is possible to use Dataflows to get data from any API into your Microsoft Fabric Data Platform.&lt;/p&gt;

</description>
      <category>microsoft</category>
      <category>microsoftfabric</category>
      <category>api</category>
      <category>dataflows</category>
    </item>
    <item>
      <title>4 ways to access data from Microsoft Fabric OneLake in Dataverse / Power Platform solutions</title>
      <dc:creator>Jon Stjernegaard Vöge</dc:creator>
      <pubDate>Mon, 16 Sep 2024 07:00:00 +0000</pubDate>
      <link>https://dev.to/jonvoge/4-ways-to-access-data-from-microsoft-fabric-onelake-in-dataverse-power-platform-solutions-464m</link>
      <guid>https://dev.to/jonvoge/4-ways-to-access-data-from-microsoft-fabric-onelake-in-dataverse-power-platform-solutions-464m</guid>
      <description>&lt;h2&gt;
  
  
  4 ways to access data from Microsoft Fabric OneLake in Dataverse / Power Platform solutions
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxfk5l6n77q76ho33p3wm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxfk5l6n77q76ho33p3wm.png" width="800" height="240"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;I have written about accessing Dataverse Data from Microsoft Fabric, outlining how it is possible using:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Azure Synapse Link for Dataverse&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Dataverse Link for Fabric&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Dataflows / Dataflows Gen2&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Power BI semantic models created in Power BI Desktop&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;But what if we wanted to go the other way around? What if we wanted to use data stored in OneLake as part of a Power Platform Solution?&lt;/p&gt;

&lt;p&gt;There are several potential solutions to this, and this article will outline four different options, each with their pros and cons:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Using Virtual Tables in Dataverse to reference OneLake data&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Using Dataflows in Dataverse to copy OneLake data&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Using Data Pipelines in Microsoft Fabric to copy OneLake data to Dataverse&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Using direct connectors in Power Apps / Power Automate to access OneLake data&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Using Virtual Tables in Dataverse to reference OneLake data
&lt;/h2&gt;

&lt;p&gt;The first option for bringing Fabric / OneLake data into Dataverse and your Power Platform solutions is to reference the data using Dataverse virtual tables that query Fabric Lakehouses.&lt;/p&gt;

&lt;p&gt;First, go to the Tables section of the Dataverse Environment that should access the data:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ayOAp2ar--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2Aioe6CrxNwRlHFPqZ" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ayOAp2ar--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2Aioe6CrxNwRlHFPqZ" width="800" height="390"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Create a New Table, select Virtual Table, and create a new Microsoft Fabric connection if you have not used the connection before (it will ask you to authenticate with your Windows Credentials):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CSF_OcPL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AcyposrOCUSw7oQln" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CSF_OcPL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AcyposrOCUSw7oQln" width="800" height="464"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then select from the list of available workspaces in Fabric, and point to the Lakehouse that holds your tables:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--iYFcZSFv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AIlItpSSX06iCvpRB" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--iYFcZSFv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AIlItpSSX06iCvpRB" width="800" height="406"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select the table you wish to query:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ktLUcylE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AAMMT6yZU6H4k0o-U" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ktLUcylE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AAMMT6yZU6H4k0o-U" width="800" height="406"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Configure the Primary Key column (your table must have a unique column that can be used for this), and save the table:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MiFyrd0t--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2A8qpqVCxO37uymh3n" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MiFyrd0t--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2A8qpqVCxO37uymh3n" width="800" height="407"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The result is a Virtual Table in Dataverse which can be imported to your Power Platform solutions like any other Dataverse table:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8Bmis_Gx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2Avy_tdniFMW7BQloW" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8Bmis_Gx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2Avy_tdniFMW7BQloW" width="800" height="392"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Q4pAQfjc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2A78hj2Ec0QeXrfU0K" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Q4pAQfjc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2A78hj2Ec0QeXrfU0K" width="800" height="857"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bvWRKa8A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Ad7rKpIfARr_sLrc0" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bvWRKa8A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Ad7rKpIfARr_sLrc0" width="800" height="595"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pros:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Data is not replicated, but queried.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Cons:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Warehouses are not supported.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Tables must have a unique column to be used as Primary Key.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A general list of Virtual Table considerations and limitations can be found in the Microsoft Documentation: &lt;a href="https://learn.microsoft.com/en-us/power-apps/maker/data-platform/limits-tshoot-virtual-tables?tabs=sql" rel="noopener noreferrer"&gt;Limitations and troubleshooting virtual tables with Power Apps — Power Apps | Microsoft Learn&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Using Dataflows in Dataverse to copy OneLake data
&lt;/h2&gt;

&lt;p&gt;Another option is to use Dataflows in Dataverse to ingest data from Fabric Lakehouses and Warehouses.&lt;/p&gt;

&lt;p&gt;First create a new dataflow:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YrCgHf5Y--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2Al6KK_3Vyi8GFSz7y" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YrCgHf5Y--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2Al6KK_3Vyi8GFSz7y" width="800" height="645"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Choose SQL Server as your data source, input the Connection String of your Fabric Warehouse or Lakehouse (you can get this from inside Microsoft Fabric) in the Server details, and enter the exact name of the Warehouse or Lakehouse as the Database name:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MsstqssV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2042/0%2An2IU5sFKBMUzyluc" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MsstqssV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2042/0%2An2IU5sFKBMUzyluc" width="800" height="271"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zcI9PWG---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2A5emUaxH8NP7HQtkj" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zcI9PWG---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2A5emUaxH8NP7HQtkj" width="800" height="437"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now select the table you wish to import, apply any transformations and map the output to the destination table in Dataverse:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SbcCsshK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AqIZjdix6TGt7DYUU" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SbcCsshK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AqIZjdix6TGt7DYUU" width="800" height="430"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lgb1YJOB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2ArXY5nXhDc8Btw1Hr" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lgb1YJOB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2ArXY5nXhDc8Btw1Hr" width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3FJdydg1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AQtOAUHxtJr0NbsnK" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3FJdydg1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AQtOAUHxtJr0NbsnK" width="800" height="437"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Publish the dataflow, and wait for the refresh to go through. Now you can use the data in your Power Platform solutions just like any other Dataverse table!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pros:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Allows transformation of data before landing in Dataverse&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Supports both Warehouses and Lakehouses in Fabric&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Cons:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Data is replicated and will take up storage in both Fabric and Dataverse&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Data is not live and depends on refresh schedule&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Using Data Pipelines in Microsoft Fabric to copy OneLake data to Dataverse
&lt;/h2&gt;

&lt;p&gt;The third way is to use a Copy Activity in a Microsoft Fabric Data Pipeline to copy data from a Lakehouse or Warehouse in Fabric into a table in Dataverse.&lt;/p&gt;

&lt;p&gt;In order for this integration to work, you first need to set up a Service Principal and grant it access to your Dataverse Environment, so that your Data Pipeline can use it for authentication.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Creating the Service Principal in Azure&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;First, go to portal.azure.com, search for ‘App Registrations’, and create a new Registration (this is your Service Principal / SPN).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--n1noI3pX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2ADbEa0Q74D1j9Gu5k" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--n1noI3pX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2ADbEa0Q74D1j9Gu5k" width="800" height="640"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Give it a name, stick with the default Account Type, and hit Create.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ciabYoID--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A8TeDmjGUUElf56ed" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ciabYoID--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A8TeDmjGUUElf56ed" width="800" height="723"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After creation, go to the Overview Page of your App Registration, and note down the Application/Client ID and the Tenant ID. You will need those later.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8dcT7VLs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AXVdM7Tkn-P_AkQLY" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8dcT7VLs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AXVdM7Tkn-P_AkQLY" width="800" height="924"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then jump to the API Permissions page, and add a new Permission.&lt;/p&gt;

&lt;p&gt;Select / Search for Dynamics CRM, select the ‘user_impersonation’ permission and add it to your permissions.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Us95z-wV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AIPrFKbqiqoyzzuDg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Us95z-wV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AIPrFKbqiqoyzzuDg" width="800" height="460"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--o4vpqBez--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2042/0%2As4JB6VQZ7JTARxwo" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--o4vpqBez--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2042/0%2As4JB6VQZ7JTARxwo" width="800" height="263"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now go to Certificates and Secrets, and create a new Secret. Note down the Key in the same place as your Application ID and Tenant ID, as you will need this later as well. Note that once you exit the screen with the Key displayed, you will not be able to view the key again, and will have to create a new one to continue setup.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--50I8EtDU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2A2WWEm1NEbEdoHgUw" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--50I8EtDU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2A2WWEm1NEbEdoHgUw" width="800" height="384"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Adding the Service Principal to your Power Platform Environment&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Next, we need to grant the Service Principal access to the Power Platform Environment that we need it to authenticate against.&lt;/p&gt;

&lt;p&gt;Go to the &lt;a href="https://admin.powerplatform.microsoft.com/" rel="noopener noreferrer"&gt;Power Platform admin center (microsoft.com)&lt;/a&gt;, and under Environments, select the one you intend to use. From this screen, copy the Environment URL and store it with the IDs and Keys you copied from Azure. You will also need this later.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PWfVUrD1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AfG9qXAZjXvabpuMh" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PWfVUrD1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AfG9qXAZjXvabpuMh" width="800" height="327"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Under Access, select the button under ‘Users’, and on the Users page find the button to go to the App Users List.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gG7tBBkQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AR70aXjsaKHJnvCPX" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gG7tBBkQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AR70aXjsaKHJnvCPX" width="800" height="224"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In here, create a new app user, and select the Service Principal you created in Azure.&lt;/p&gt;

&lt;p&gt;Assign it the required Security Roles. You may use custom roles, or, for testing purposes, select the System Customizer role, which ensures it has the necessary privileges.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xOZj2EnQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2ALdX9KGuN7zYgw54o" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xOZj2EnQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2ALdX9KGuN7zYgw54o" width="800" height="401"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3: Creating the Pipeline and setting up the Copy Activity&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Finally, it’s time to set up the actual data copy. Create a new Data Pipeline in a workspace of your choice.&lt;/p&gt;

&lt;p&gt;Add a Copy data Activity, and configure its source as the table you wish to copy into Dataverse.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8VcS8sDs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Aj5BtvJWV2MdRxN-n" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8VcS8sDs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Aj5BtvJWV2MdRxN-n" width="800" height="608"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Set up the destination of the Activity to point to Dataverse by choosing External and selecting Dataverse.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Uc3LiTrY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AyGfbRR0L_AI0gZmO" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Uc3LiTrY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AyGfbRR0L_AI0gZmO" width="800" height="294"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vtu6tmf1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2ARYqYugBYtRUP5P79" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vtu6tmf1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2ARYqYugBYtRUP5P79" width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;While creating the connection, you will need all four values we copied during Steps 1 and 2.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Insert your Environment URL as the Environment Domain.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Insert your Tenant ID as the Tenant ID.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Insert the Application / Client ID as the Service Principal Client ID.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Insert the Secret Key as the Service Principal Key.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dCdDdN8s--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2ALE1QyoSQWko1eWJy" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dCdDdN8s--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2ALE1QyoSQWko1eWJy" width="800" height="411"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select the destination table in Dataverse you wish to write to. If it is not created already, you will need to jump into Dataverse and create that now.&lt;/p&gt;

&lt;p&gt;After pointing to the table, feel free to preview the data and test the connection, and select the Write Behaviour you want under Advanced options.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--c7lDNRD8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AavpYD0EaZxSt3nqR" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--c7lDNRD8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AavpYD0EaZxSt3nqR" width="800" height="510"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, we need to map the Source Columns against the Destination Columns, under the Mapping tab.&lt;/p&gt;

&lt;p&gt;Selecting Import Schema will attempt to do this automatically for you, but you may remove any columns you do not wish to include in the copy, as well as make any manual adjustments.&lt;/p&gt;

&lt;p&gt;Note that if your table includes complex columns like Lookup columns, you may need some additional steps. In that case, I recommend following Scott Sewell’s excellent video on the topic: &lt;a href="https://www.youtube.com/watch?v=Ktcidjw4e5A" rel="noopener noreferrer"&gt;Fabric Pipelines for Dataverse Part 5: Populate the Lookups (youtube.com)&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hsYOBHmN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Ak1W1hSyOEN2aFxqK" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hsYOBHmN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Ak1W1hSyOEN2aFxqK" width="800" height="512"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After mapping the activity, you can Validate, Save and Run your pipeline, and hopefully it succeeds:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ygAO-zYv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AyyXoRnGfWEmRjirf" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ygAO-zYv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AyyXoRnGfWEmRjirf" width="800" height="419"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The data then shows up in your Dataverse table, marked as Created/Modified by your SPN:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yZlSjjrp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AHWfdDRL4u4bHES9g" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yZlSjjrp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AHWfdDRL4u4bHES9g" width="800" height="417"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pros:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;May be easily included with the rest of your Fabric Data Pipelines.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Good Performance.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Cons:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Requires access to Azure to configure the SPN, which not everybody has.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;May be difficult to set up if your table includes complex columns, e.g. Lookup columns.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Using direct connectors in Power Apps / Power Automate to access OneLake data
&lt;/h2&gt;

&lt;p&gt;The last option is to skip populating a table in Dataverse, as the previous methods do, and instead connect directly to the SQL endpoint of your Fabric Warehouse or Lakehouse from within Power Apps / Power Automate.&lt;/p&gt;

&lt;p&gt;In Power Apps, simply add a SQL Source Connection:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IpnzTw9T--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AvLm0KOYv6sDYvqRV" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IpnzTw9T--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AvLm0KOYv6sDYvqRV" width="800" height="965"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then point it towards the same SQL connection string and database name mentioned in the Dataflows option above, before choosing the tables you wish to add:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6akGLq0h--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2042/0%2AqnX8zKzou9gtnnm4" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6akGLq0h--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2042/0%2AqnX8zKzou9gtnnm4" width="800" height="271"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--VX3WEEfN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AZsIV4W_xcphmmnVs" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--VX3WEEfN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AZsIV4W_xcphmmnVs" width="647" height="1023"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--oT7m2lw2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2ACNtvpEOKYi7PIvlW" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--oT7m2lw2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2ACNtvpEOKYi7PIvlW" width="800" height="821"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The data can now be used in Power Apps controls, but do note that Power Apps does not allow you to write data directly to a Fabric Data Warehouse, despite the endpoint being both Read and Write enabled. This is due to the lack of enforced primary keys in the Warehouse. (Read more in my blog about write-back to Microsoft Fabric: &lt;a href="https://downhill-data.com/2024/05/07/live-data-write-back-to-fabric-data-warehouse-from-power-bi-w-power-apps-and-power-automate/" rel="noopener noreferrer"&gt;https://downhill-data.com/2024/05/07/live-data-write-back-to-fabric-data-warehouse-from-power-bi-w-power-apps-and-power-automate/&lt;/a&gt;):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--K9R8Lg3P--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2A1tDpQAOqVBbmPnCV" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--K9R8Lg3P--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2A1tDpQAOqVBbmPnCV" width="800" height="362"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In Power Automate it’s a similar story. Simply add an action that leverages the SQL connector, and connect using the Server connection string and Database name. Here I am using my write-back workaround with the SQL connector against a Fabric Warehouse:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YFYvH5QK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AshzY0ymjjiRIJogW" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YFYvH5QK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AshzY0ymjjiRIJogW" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pros:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Data is always queried live from the Fabric data source.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Data is not replicated.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Supports both Warehouses and Lakehouses in Fabric&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Cons:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Power Apps functionality is limited due to lack of enforced Primary Keys (while Power Automate has no issues with that).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Data never lands in Dataverse, so you cannot leverage Dataverse for any additional functionality; any transformations/logic must be built locally in the solution you are developing.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;I hope this post has given you inspiration for different ways to access data from Fabric and OneLake in your Power Platform solutions.&lt;/p&gt;

&lt;p&gt;Here I outlined four possibilities, and I am sure that there are more out there. Do you think I missed an obvious one? Please drop me a comment or message — I would love to hear from you.&lt;/p&gt;

</description>
      <category>powerplatform</category>
      <category>powerfuldevs</category>
      <category>microsoftfabric</category>
      <category>onelake</category>
    </item>
    <item>
      <title>5 ways to get your Dataverse Data into Microsoft Fabric / OneLake</title>
      <dc:creator>Jon Stjernegaard Vöge</dc:creator>
      <pubDate>Fri, 13 Sep 2024 07:00:00 +0000</pubDate>
      <link>https://dev.to/jonvoge/5-ways-to-get-your-dataverse-data-into-microsoft-fabric-onelake-2h91</link>
      <guid>https://dev.to/jonvoge/5-ways-to-get-your-dataverse-data-into-microsoft-fabric-onelake-2h91</guid>
      <description>&lt;h2&gt;
  
  
  5 ways to get your Dataverse Data into Microsoft Fabric / OneLake
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9UVWtHCl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AttdBhYxt5mI4YHgv" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9UVWtHCl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AttdBhYxt5mI4YHgv" width="800" height="240"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Microsoft Fabric is top of mind everywhere, and its promises for improving the analytic capabilities of Power BI users are huge. However, integrations and compatibility between Fabric and the wider Power Platform (e.g. Dataverse, Power Apps) are not seeing the focus they warrant.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;For starters:&lt;/em&gt; &lt;strong&gt;How can I leverage data currently stored in Dataverse in my Microsoft Fabric setup?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;When Fabric released in 2023, the existing ‘&lt;strong&gt;Azure Synapse Link&lt;/strong&gt;’ was our only solution for automatically syncing data. Later in 2023, however, came the simpler ‘&lt;strong&gt;Dataverse Link to Fabric&lt;/strong&gt;’, and most recently, in 2024, it became possible to create ‘&lt;strong&gt;Shortcuts to Dataverse&lt;/strong&gt;’ from Lakehouses in Fabric.&lt;/p&gt;

&lt;p&gt;In addition, we have approaches such as using Dataflow Gen2 to do scheduled loads of data from Dataverse to a Fabric destination. And finally, the Semantic Model integration to OneLake, which I covered &lt;a href="https://downhill-data.com/2024/05/14/crazy-or-genius-ingesting-data-to-microsoft-fabric-onelake-with-import-semantic-models-instead-of-dataflow-gen2/" rel="noopener noreferrer"&gt;last week on this blog&lt;/a&gt;, also allows for a scheduled load from Dataverse to OneLake.&lt;/p&gt;

&lt;p&gt;But what are the strengths and weaknesses of each approach?&lt;/p&gt;

&lt;p&gt;This article shares and compares the step-by-step process for each of the five approaches, hopefully aiding you in selecting the right method for your case.&lt;/p&gt;

&lt;p&gt;Note that each section is pretty lengthy and full of screenshots. Feel free to jump to the final section for a quick comparison of all five.&lt;/p&gt;

&lt;h2&gt;
  
  
  Solution 1: Azure Synapse Link
&lt;/h2&gt;

&lt;p&gt;Dataverse has long supported &lt;a href="https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-synapse" rel="noopener noreferrer"&gt;Azure Synapse Link for Dataverse&lt;/a&gt;. However, by leveraging the new Fabric OneLake Shortcuts, we can use the old Synapse Link for Dataverse to transport data into an Azure Data Lake Gen2, and subsequently the OneLake shortcut for ADLS Gen2 to make said data available in OneLake / Fabric.&lt;/p&gt;

&lt;p&gt;The steps we need to take to make this happen are as follows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Setup the necessary Azure Resources for the Synapse Link (Subscription, Resource Group, Storage Account/ADLS Gen2)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Prepare Dataverse Tables and establish Azure Synapse Link between the Tables and ADLS Gen2&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create Shortcut in your Fabric/OneLake Data Lakehouse to the Dataverse Tables&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The following sections will take you through each step.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Azure Resources
&lt;/h3&gt;

&lt;p&gt;In this step we will create the following Azure Resources, and note down a couple of important links and values:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;A Subscription&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A Resource Group&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A Storage Account (ADLS Gen2) including its URL Endpoint.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;An App Registration with correct roles and permissions on the Storage Account (+ a Client ID &amp;amp; Secret for authentication).&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Follow the steps below to create the resources:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;First head to &lt;a href="https://portal.azure.com/#home" rel="noopener noreferrer"&gt;portal.azure.com&lt;/a&gt;, and if you do not have a subscription already, create one and connect to a billing account.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Second, create a Resource Group, linking it to your Subscription.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Third, create a Storage Account, linking it to your Subscription and Resource Group. Also make sure to find the Advanced Options, and toggle on ‘Enable Hierarchical Namespace’. &lt;em&gt;This ensures that the Data Lake being created is Gen 2!&lt;/em&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1d6b1cxnmuvq3qluyndh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1d6b1cxnmuvq3qluyndh.png" width="800" height="174"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Fourth, go to your Azure Active Directory, open App Registrations, and create a new registration, making sure “Accounts in this organizational directory only” is selected as supported account type.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2qgzh3v1o84wphnwev02.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2qgzh3v1o84wphnwev02.png" width="800" height="316"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;After creating the App Registration, open it, and &lt;strong&gt;from the Overview Page, write down the Directory (tenant) ID and the Application (client) ID for later use.&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fumnj4sj5aahuuneco8oy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fumnj4sj5aahuuneco8oy.png" width="538" height="342"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Then go to Certificates &amp;amp; Secrets of the App Registration to create a new Secret for authentication. &lt;strong&gt;Write down the Secret Value, for use later on.&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now go back to your Storage Account, and into ‘Access Control (IAM)’. Here, add a new Role Assignment for your newly created App Registration, &lt;strong&gt;giving it the Roles “Storage Blob Data Contributor” and “Storage Blob Data Owner”.&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;And finally, open up the “Endpoints” menu of your Storage Account, and &lt;strong&gt;write down the Primary Endpoint for later use.&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1tpiu7ssjpvyagq7x1kj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1tpiu7ssjpvyagq7x1kj.png" width="800" height="189"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Prepare Dataverse Tables and Create Azure Synapse Link
&lt;/h3&gt;

&lt;p&gt;After creating all the necessary Azure Resources, we can prepare our Dataverse tables and setup the Azure Synapse Link.&lt;/p&gt;

&lt;p&gt;First go to &lt;a href="http://make.powerapps.com/" rel="noopener noreferrer"&gt;make.powerapps.com&lt;/a&gt; and identify the Tables which you want to sync to your Data Lake.&lt;/p&gt;

&lt;p&gt;For each table, open up the table settings, go to properties, expand the Advanced Options, and make sure “Track Changes” has been enabled. &lt;strong&gt;If you do not do this, your table will not show up in the Synapse Link.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh4hhoqg2r17d7zn3fd4v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh4hhoqg2r17d7zn3fd4v.png" width="800" height="317"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now go to the Azure Synapse Link page, by selecting it from the navigation panel on the left hand side. If the option is not there, select More &amp;gt; Discover All and find the option in the bottom right side of the page under Data Management.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpp2x7qjoz1p4wnrmuk44.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpp2x7qjoz1p4wnrmuk44.png" width="800" height="155"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Create a New Link, and select the Subscription, Resource Group and Storage Account that you created during Step 1.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F085l93gpw9x2p16963ys.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F085l93gpw9x2p16963ys.png" width="800" height="437"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, Add the Tables which you want to sync to your Data Lake. Note that only tables which have “Track Changes” enabled will show up here.&lt;/p&gt;

&lt;p&gt;Also pay attention to the “Advanced” tab, in which you can customize the refresh frequency, down to a minimum of 15 minutes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa571ei4jvhxw6i9fhgjb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa571ei4jvhxw6i9fhgjb.png" width="800" height="442"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And you should now be able to see an overview of your Synapse Link and the tables being synced to the Data Lake:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjbzmz5vjvamk9nnaajeg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjbzmz5vjvamk9nnaajeg.png" width="800" height="294"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Similarly, you should be able to go to your Storage Account in Azure, find the Storage Browser, and see that the tables are now being synced and are available within your Data Lake.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3snovwu7v0ujft2ru0ht.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3snovwu7v0ujft2ru0ht.png" width="800" height="488"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As a final act in Step 2, &lt;strong&gt;please write down the string of the subfolder in which your Dataverse Tables have been uploaded&lt;/strong&gt;. In my case, it is the “dataverse-60xtthdefault-unq…….” string.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftc86sez1ft9zmwi82390.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftc86sez1ft9zmwi82390.png" width="800" height="204"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Create Shortcut in your Fabric/OneLake Data Lakehouse to the Dataverse files in the Data Lake
&lt;/h3&gt;

&lt;p&gt;Our final step on the journey is to create a shortcut from our OneLake Data Lakehouse to the ADLS Gen2 folder containing our Dataverse data.&lt;/p&gt;

&lt;p&gt;First, go to &lt;a href="http://app.powerbi.com/" rel="noopener noreferrer"&gt;app.powerbi.com&lt;/a&gt;, and if you haven’t already, create a Fabric-Enabled workspace, and create a Data Lakehouse within the Workspace. &lt;a href="https://learn.microsoft.com/en-us/power-bi/fundamentals/fabric-get-started" rel="noopener noreferrer"&gt;Microsoft has published an excellent guide to get you started on this.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now open up the Data Lakehouse and click Files &amp;gt; New Shortcut&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faxisue111llw7m9i1vft.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faxisue111llw7m9i1vft.png" width="800" height="405"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From here, choose Azure Data Lake Storage Gen2&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe0ilshtq4i6rpfpuoaj6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe0ilshtq4i6rpfpuoaj6.png" width="800" height="459"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the Connection Settings, point the URL to the ADLS Gen2 ‘Primary Endpoint’, which you wrote down as part of step 1.&lt;/p&gt;

&lt;p&gt;For Connection Credentials, you may, depending on your settings, be able to connect with your Organizational Account, but otherwise select Service Principal as ‘Authentication Kind’, and fill in Tenant ID, Client ID and then your Secret Value as the Service Principal Key, all of which are values you wrote down during Step 1.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgspr9jzbq2fsuqv2v5p8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgspr9jzbq2fsuqv2v5p8.png" width="800" height="505"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, give the Shortcut an appropriate name, and then provide it with the Subfolder string which you wrote down as the very last thing in Step 2.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqdyf2dglnewwchrvehdu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqdyf2dglnewwchrvehdu.png" width="800" height="508"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You should now be able to use the Shortcut from the navigation menu of your Lakehouse, and query the copies of your Dataverse data directly from within your Notebooks.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcctpljx3kqj9jk0y20tz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcctpljx3kqj9jk0y20tz.png" width="800" height="444"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Solution 2: Dataverse Link to Fabric
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;

&lt;p&gt;The first thing you need to do is check the following off your list:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;You must have the System Administrator security role in the Dataverse environment.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;You must have a Fabric Capacity, Fabric Trial Capacity or a Power BI Premium Capacity available.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;You must be an administrator of the Power BI workspace you want to link to, and the workspace should be allocated to one of the capacity types mentioned above.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Then, in addition, you may or may not need to be aware of some Azure Region details regarding your Fabric and Dataverse Environments.&lt;/p&gt;

&lt;p&gt;Up until a few weeks ago, it was a hard requirement that the regions of your Fabric Workspace and Dataverse environments were the same, else the integration would not work. However, rumour has it that this restriction has been lifted:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mj1r9qSe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2026/0%2Awin1LJsKkA1XCABd" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mj1r9qSe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2026/0%2Awin1LJsKkA1XCABd" width="800" height="808"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If for some reason the feature has not been rolled out for your region yet, I would be patient and wait a few weeks.&lt;/p&gt;

&lt;h3&gt;
  
  
  Setting up Dataverse Link for Fabric
&lt;/h3&gt;

&lt;p&gt;With the prerequisites out of the way, it is time to set up the actual Dataverse Link for Fabric.&lt;/p&gt;

&lt;p&gt;First, go to the Tables area of your Dataverse environment, click Analyze at the top, and Link to Microsoft Fabric:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WtOK7UlT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AbrWs3a9tSPnek5V8" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WtOK7UlT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AbrWs3a9tSPnek5V8" width="800" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Set up your Connection by inputting your credentials, and choose the workspace (or create a new one) that you wish to use as the destination for the Lakehouse that will be created.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WN5ygQ3p--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A2n2KUFJwbyPB7zvb" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WN5ygQ3p--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A2n2KUFJwbyPB7zvb" width="800" height="544"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lrAnvCNY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2A6RdjXB_tnhg09GHZ" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lrAnvCNY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2A6RdjXB_tnhg09GHZ" width="800" height="540"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After finalizing the setup, your Lakehouse and/or Workspace will be created and opened for you:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--nJHHvuq8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Asy56PQKULHOY6A88" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--nJHHvuq8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Asy56PQKULHOY6A88" width="800" height="342"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And the Dataverse Link for Fabric will be accessible from the Azure Synapse Link page in your Dataverse Environment:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--s0inKkJo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AOFrK1OVzzePmAVC1" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--s0inKkJo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AOFrK1OVzzePmAVC1" width="800" height="483"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At first the Lakehouse will have to perform an initial load:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--u0Y72CZa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2A6QErc2CCzco5hQoE" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--u0Y72CZa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2A6QErc2CCzco5hQoE" width="800" height="402"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If, even after the initial load, some or all of your tables end up in a weird “Unidentified” folder, you may open the Dataverse Link for Fabric on the Azure Synapse Link page and click “Refresh Fabric Tables”, which will refresh the metadata in the Link. On this page, you can also monitor the status of the tables being synced:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cCAPCg0U--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2Af5afbJWzObVj1HSl" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cCAPCg0U--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2Af5afbJWzObVj1HSl" width="800" height="381"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, with a bit of wait time, you should be able to see actual delta tables appear in your Lakehouse, which you can then use in Fabric as you would any other Lakehouse tables. Use them in a Semantic Model. Shortcut them into other Lakehouses. Query them from a notebook. Whatever you please!:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xta6LEG6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AdTjxNVIKeNdKCsV6" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xta6LEG6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AdTjxNVIKeNdKCsV6" width="800" height="337"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Note that what has happened under the hood is that new Delta Tables have been created in a Dataverse Managed Lake, which reference the original Dataverse data. That means this solution consumes Dataverse Storage as well!&lt;/p&gt;

&lt;h2&gt;
  
  
  Solution 3: Shortcut to Dataverse
&lt;/h2&gt;

&lt;p&gt;The Mad March of 2024 brought the announcement that Lakehouses in Fabric would be able to create shortcuts directly to Dataverse: &lt;a href="https://blog.fabric.microsoft.com/en-us/blog/analyze-dataverse-tables-from-microsoft-fabric/" rel="noopener noreferrer"&gt;Analyze Dataverse tables from Microsoft Fabric | Microsoft Fabric Blog | Microsoft Fabric&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;However, to my disappointment, this seems to be more or less a small additional functionality that builds on top of Solution 2, more so than offering a true alternative.&lt;/p&gt;

&lt;p&gt;What is new is that, when creating a Shortcut from the Lakehouse, you are now able to select Dataverse as an option:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WI_rpDva--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AUTxmIwmzXsPOrvtI" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WI_rpDva--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AUTxmIwmzXsPOrvtI" width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Dqran6il--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AOS5PpcprVO3gDlT7" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Dqran6il--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AOS5PpcprVO3gDlT7" width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;However, as described in the &lt;a href="https://learn.microsoft.com/en-us/fabric/onelake/onelake-shortcuts" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;, in order for you to create Shortcuts to Dataverse from a Lakehouse in Fabric, you need to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Have System Administrator privileges on a Dataverse Environment that resides in the same Azure geographic region as your Fabric Capacity Workspace&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enable Link to Fabric from said Dataverse Environment&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;But wait… Wasn’t that essentially the same as Solution 2? &lt;em&gt;Yes, indeed.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;And won’t that process already automatically populate a Lakehouse in Fabric with all my Dataverse tables, making them available for me to Shortcut into other Fabric Lakehouses? &lt;em&gt;Yes. Indeed.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The new thing here is really that, after setting up the Link to Fabric on an environment, we can now access the tables from the underlying Dataverse Managed Lake directly through the Shortcut wizard in the Lakehouse, and we can do that freely from any Lakehouse in any Workspace.&lt;/p&gt;

&lt;p&gt;Instead of having to work with the auto-generated Lakehouse from the ‘Link to Fabric’ functionality, we can treat that setup as a one-time administrative task. Once the initial setup is done, any new Shortcuts you make to Dataverse can be made &lt;em&gt;much cleaner&lt;/em&gt; than previously possible. You can pick and choose which tables to select from your environments:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--k3jDF5zL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AjSYAlIUPTNA8zNHG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--k3jDF5zL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AjSYAlIUPTNA8zNHG" width="800" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And you can even Shortcut multiple tables from &lt;em&gt;separate&lt;/em&gt; environments into the same Lakehouse:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mCJJxGKE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AQ58TeovFSnx4Tbpw" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mCJJxGKE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AQ58TeovFSnx4Tbpw" width="795" height="396"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;AND you can even cross-query between these shortcutted tables with ease:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0GFgJTby--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AsS1ujjdrGxyVnUNB" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0GFgJTby--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AsS1ujjdrGxyVnUNB" width="800" height="474"&gt;&lt;/a&gt;&lt;/p&gt;
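
&lt;p&gt;As a sketch of what such a cross-environment query could look like from a notebook: the table names below are hypothetical, standing in for two shortcuts pointing at the ‘account’ table of two separate environments.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;from pyspark.sql import functions as F

# Hypothetical shortcut names, standing in for the same Dataverse
# table shortcut in from two separate environments.
env1 = spark.read.table("account_env1").withColumn("environment", F.lit("env1"))
env2 = spark.read.table("account_env2").withColumn("environment", F.lit("env2"))

# Cross-environment query: stack the two tables (tolerating small
# schema differences) and compare row counts per environment.
combined = env1.unionByName(env2, allowMissingColumns=True)
combined.groupBy("environment").count().show()
&lt;/code&gt;&lt;/pre&gt;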

&lt;p&gt;In short, Shortcuts to Dataverse don’t make the initial setup any easier than before, but once that is done, it’s smooth sailing, and very easy to perform cross-environment Dataverse analysis in Fabric.&lt;/p&gt;

&lt;h2&gt;
  
  
  Solution 4: Dataflow Gen2
&lt;/h2&gt;

&lt;p&gt;Dataflow Gen2 is the low-code tool of choice for Data Ingestion in Microsoft Fabric. The process for ingesting Dataverse data with Dataflows is straightforward:&lt;/p&gt;

&lt;p&gt;You first create a new dataflow, and use the Dataverse Connector to load data from your Environment, selecting which tables to load:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Lt5oY75j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AGChTgXfa24N968k4" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Lt5oY75j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AGChTgXfa24N968k4" width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you wish, you may transform the data before selecting a data destination, with both Fabric Lakehouses and Warehouses among the options. You can either Append to or Overwrite destination tables, which are persisted as Delta Tables in OneLake:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BC0p8vta--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AYdPXQWffFZXAohvL" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BC0p8vta--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AYdPXQWffFZXAohvL" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, you may set up a refresh schedule on the Dataflow to automate the process:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yl5jCfiL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Ak92TZjGBKeuLiAQN" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yl5jCfiL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Ak92TZjGBKeuLiAQN" width="800" height="543"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Solution 5: Semantic Model Integration to Fabric
&lt;/h2&gt;

&lt;p&gt;The process below is the same one I walked through in last week’s blog. You first need to enable two Tenant Settings in your Fabric Admin Portal:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--eaWUjVhN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2APUUDTHK-f2O4DIIf" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--eaWUjVhN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2APUUDTHK-f2O4DIIf" width="800" height="258"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, you can connect to a Dataverse table from Power BI Desktop, and if you wish, you can use the Power Query Editor to transform the incoming data. After you are done, you publish the model to your Fabric-enabled workspace:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HRejW1qp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AvQssdmsdRN0ev6uY" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HRejW1qp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AvQssdmsdRN0ev6uY" width="800" height="557"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---HxVP4Uc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2APTO7r0XMijvhdzUj" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---HxVP4Uc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2APTO7r0XMijvhdzUj" width="800" height="534"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Opening the Model from the Workspace, you go into the settings, and enable the OneLake Integration setting:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qEsHfxPC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2ATqoVLsAsY0kKYeC6" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qEsHfxPC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2ATqoVLsAsY0kKYeC6" width="800" height="387"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--izHvyFGn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AqM9kv1v9TldWy6xN" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--izHvyFGn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AqM9kv1v9TldWy6xN" width="800" height="385"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After applying the new setting, data will automatically be ingested into your OneLake whenever your Semantic Model is refreshed, whether manually or on a schedule.&lt;/p&gt;

&lt;p&gt;The data can subsequently be accessed by creating a Shortcut within a Lakehouse:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--76eYdW1e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Af2Lldv3edkdbxgx7" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--76eYdW1e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Af2Lldv3edkdbxgx7" width="800" height="769"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WRxD9edM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AA7A_U7w80Gz83SxD" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WRxD9edM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AA7A_U7w80Gz83SxD" width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ITsihbI6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AQ6ePswUOCi2XBagN" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ITsihbI6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AQ6ePswUOCi2XBagN" width="800" height="440"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_4T6Xxx4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2ABzRcVJZydnC401mQ" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_4T6Xxx4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2ABzRcVJZydnC401mQ" width="800" height="404"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Or even by browsing the files using the OneLake explorer on your desktop:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yaw9msBG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AUYqMTdLnEFrlfFUk" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yaw9msBG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AUYqMTdLnEFrlfFUk" width="800" height="429"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If I go ahead and alter a few values in the table in Dataverse (in this case, deleting one value and editing another):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HF5iWt04--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AHCjvJnZNYZmkZ_qr" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HF5iWt04--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AHCjvJnZNYZmkZ_qr" width="800" height="217"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The updated values show up when I query the shortcutted Lakehouse Table, but only after a refresh of the Semantic Model has occurred:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--diB8rIKh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AkYh87HzXqNIw47K1" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--diB8rIKh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AkYh87HzXqNIw47K1" width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Under the hood, the update is simply performed by creating new Delta Tables and deleting the old ones. There is no time travel or retention of old files with this integration:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--iXOXqgIl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Ap6HIXdv8wnW2uIcG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--iXOXqgIl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Ap6HIXdv8wnW2uIcG" width="800" height="396"&gt;&lt;/a&gt;&lt;/p&gt;
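
&lt;p&gt;You can verify this yourself from a notebook by inspecting the Delta history of the shortcutted table. A minimal sketch, with a hypothetical table name; since old files are not retained, expect to see only the most recent rewrite rather than a long version history:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;from delta.tables import DeltaTable

# "account" is a placeholder for the shortcutted Lakehouse table.
history = DeltaTable.forName(spark, "account").history()

# Because each refresh rewrites the table instead of appending a new
# version, expect to see only the most recent operation(s) here.
history.select("version", "timestamp", "operation").show(truncate=False)
&lt;/code&gt;&lt;/pre&gt;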

&lt;p&gt;The last thing you need to do to automate the process is to schedule the automatic refresh of your Semantic Model, which is done from the Semantic Model settings. After that, your OneLake data will be overwritten each time the model is refreshed.&lt;/p&gt;

&lt;p&gt;A super simple and clean setup for those who are used to working with Power BI and Semantic Models.&lt;/p&gt;

&lt;h2&gt;
  
  
  Comparison remarks and considerations
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Storage and Cost:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Solutions 2 and 3 do not move any data from Dataverse into Microsoft Fabric. Instead, Delta Tables are created &lt;em&gt;in&lt;/em&gt; a Dataverse Managed Lake, which counts towards Dataverse Storage.&lt;/p&gt;

&lt;p&gt;This also means that you will pay Dataverse Storage costs for these additional Delta Tables, which can be a costly affair. If this is a concern (e.g. you wish to move large quantities of data), consider the Azure Synapse Link for Dataverse approach instead, which moves your data to a Data Lake Gen 2 in Azure that you manage yourself. There you have more control over all costs incurred during the process.&lt;/p&gt;

&lt;p&gt;For the Dataflow and Semantic Model, you will “pay” in terms of Capacity Units for your consumption, which I dive into the details of &lt;a href="https://downhill-data.com/2024/05/14/crazy-or-genius-ingesting-data-to-microsoft-fabric-onelake-with-import-semantic-models-instead-of-dataflow-gen2/" rel="noopener noreferrer"&gt;in last week’s blog&lt;/a&gt;. In general though, in my limited testing, Dataflows seem more expensive than Semantic Models in terms of Capacity Units.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Flexibility of process:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;With regard to how flexible the process is, Azure Synapse Link, Dataflow Gen2 and Semantic Model Integration provide the greatest amount of flexibility. You can choose for yourself which data to sync, and even perform transformations along the way if you desire.&lt;/p&gt;

&lt;p&gt;On the opposite side, you have no say in which tables are synced from Dataverse to Fabric with Solutions 2 and 3.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Update Frequency:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The Dataverse Link for Fabric has no option for customizing refresh frequency and instead seems to run on a fixed 1-hour schedule, meaning that with Solutions 2 and 3 the data in Fabric/OneLake can be up to an hour behind Dataverse. For Dataflows and Semantic Models, you can schedule a refresh up to 48 times a day, bringing that delay down to 30 minutes.&lt;/p&gt;

&lt;p&gt;Now, this is obviously enough for many reporting scenarios, but it will not work as a way to push data live from Power Apps / Automate to Fabric. In those cases, you will need to write data directly to Fabric instead.&lt;/p&gt;

&lt;p&gt;If you need even more frequent refreshes than that, again consider the Azure Synapse Link for Dataverse from Solution 1, which through advanced options may be scheduled to run every 15 minutes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Complexity:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Not all of these approaches are equally straightforward.&lt;/p&gt;

&lt;p&gt;The high flexibility of Azure Synapse Link comes at the expense of a complex setup if you want full control over your sync. You’ll need to dive into the world of Azure — and that can be a complex place to be. To a lesser extent, the same is true of Dataflows and Semantic Models, which require some setup, maintenance and administration, and familiarity with Power BI / Power Query and the Service.&lt;/p&gt;

&lt;p&gt;Conversely, the Dataverse Link and Shortcut to Dataverse are painless once you get the few requirements out of the way. You click, click, click, and you’re done, able to see data being synced hourly.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;And with that overview, I would like to close this way-too-long blog post, which has been a long time in the making, as the Fabric team kept inventing new ways to integrate Dataverse and OneLake. Let’s see how long it takes before this blog post is outdated.&lt;/p&gt;

&lt;p&gt;Did you like the tutorial &amp;amp; comparison? Do you have any other cool ideas for integrating Power Platform and Fabric? Please reach out in the comments, or directly to me.&lt;/p&gt;

</description>
      <category>powerplatform</category>
      <category>powerfuldevs</category>
      <category>microsoftfabric</category>
      <category>onelake</category>
    </item>
    <item>
      <title>My Experience and top tips for the Fabric Analytics Engineer Associate Certification (DP-600)</title>
      <dc:creator>Jon Stjernegaard Vöge</dc:creator>
      <pubDate>Wed, 11 Sep 2024 07:00:00 +0000</pubDate>
      <link>https://dev.to/jonvoge/my-experience-and-top-tips-for-the-fabric-analytics-engineer-associate-certification-dp-600-p8a</link>
      <guid>https://dev.to/jonvoge/my-experience-and-top-tips-for-the-fabric-analytics-engineer-associate-certification-dp-600-p8a</guid>
      <description>&lt;h2&gt;
  
  
  My Experience and top tips for the Fabric Analytics Engineer Associate Certification (DP-600)
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4lj4p6szg34vbu97kxt9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4lj4p6szg34vbu97kxt9.png" width="800" height="527"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you want to showcase your Microsoft Fabric prowess, &lt;a href="https://learn.microsoft.com/en-us/credentials/certifications/exams/dp-600/" rel="noopener noreferrer"&gt;DP-600&lt;/a&gt; is the certification you are looking for. With a &lt;a href="https://learn.microsoft.com/en-us/credentials/certifications/resources/study-guides/dp-600" rel="noopener noreferrer"&gt;syllabus&lt;/a&gt; covering data analytics, data engineering, platform management and enterprise development best practices, it is indeed a test worthy of Analytics Engineers.&lt;/p&gt;

&lt;p&gt;Personally, I felt this was the hardest Microsoft Exam I’ve taken to date, due to the wide range of topics covered, but with a few good learning resources and tips &amp;amp; tricks, I managed to walk out of the exam with a score of 921 out of 1000. Plenty more than the 700 needed to pass.&lt;/p&gt;

&lt;p&gt;Below, I’ve put together a small cheat sheet of learning resources that I used, tips &amp;amp; tricks for the exam, as well as some of the topics which I was surprised to face.&lt;/p&gt;

&lt;h3&gt;
  
  
  Learning Resources before the Exam
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Learning Path — Your point of departure should be the official Microsoft Learning Path. It is especially great at introducing general concepts of Microsoft Fabric, while it is perhaps less good at guiding you towards specific knowledge that may help you answer exam questions: &lt;a href="https://learn.microsoft.com/en-us/credentials/certifications/exams/dp-600/#two-ways-to-prepare" rel="noopener noreferrer"&gt;Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric — Certifications | Microsoft Learn&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Microsoft Lab Exercises — Nothing beats Hands On experience. And while I’d always recommend true project experience over any other resource, the online Lab Exercises created by Microsoft are frankly a great substitute. They are scattered around the Learning Path, but worth visiting on their own. Check them out here: &lt;a href="https://microsoftlearning.github.io/mslearn-fabric/" rel="noopener noreferrer"&gt;mslearn-fabric (microsoftlearning.github.io)&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Microsoft Practice Assessment — Microsoft have their own practice test which allows you to try your best on 50 sample questions (which are not 1:1 equal to actual exam questions, but come pretty close!). Also, the pool of sample questions is greater than 50, so be sure to retake the test once or twice to see more of them. Check it out here: &lt;a href="https://learn.microsoft.com/en-us/credentials/certifications/exams/dp-600/practice/assessment?assessment-type=practice&amp;amp;assessmentId=90" rel="noopener noreferrer"&gt;Practice Assessment | Microsoft Learn&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Pragmatic Works — I am by no means affiliated with Pragmatic Works but found that their CertXP platform had an incredible range of questions which were very similar to the ones on the exam itself. However, the most impressive part of the platform is the thorough explanations that follow each answer. I can only recommend you try it yourself.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  During the exam
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Look for patterns in the answers — Multiple Choice tests will always possess some degree of predictability, and that goes for DP-600 too. In many cases, especially those that ask you to pick the correct line of code, I found that of the four different choices, there would usually be two options which would be almost identical, a third which was quite different, and a fourth which was completely different. In my experience, the correct answer was always one of the two options which were almost identical.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use Microsoft Learn during the exam! — &lt;a href="https://techcommunity.microsoft.com/t5/microsoft-learn-blog/introducing-a-new-resource-for-all-role-based-microsoft/ba-p/3500870" rel="noopener noreferrer"&gt;As of September last year&lt;/a&gt;, all role based exams have access to Microsoft Learn from the exam environment. I used it to search for definitions I had no clue about (DMVs — I’m looking at you!), I used it to validate answers where I was only 80% certain, and I used it to disqualify some answer options for those questions where I knew I had to submit a complete guess anyway.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Read Case Questions before reading the Case — The exam, like all other role based exams from Microsoft, includes a case study that requires you to read quite a bit of text material. However, only a fraction of the text in the case is actually related to the questions that you receive (the pool of questions is much larger than the 6–8 questions you will see). Read the questions first, and then scan the case for the relevant information afterwards.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Questions are not fully up to date — Updates to Microsoft Fabric are coming out faster than a horse can run (as we say in Danish). And some of the questions on the exam are already outdated. They may not be plain wrong, but may be missing an otherwise obvious choice as the answer. As an example, the recent addition that allows RLS editing directly in the semantic model web authoring experience was not among the answer options for RLS questions on the test.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Topics which took up a larger part of the exam than I expected
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;DMVs (Dynamic Management Views) and how each view could be used.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Query Folding, and how to tell which steps are folding and which are not. Especially using the Dataflow web experience, not the Power Query desktop experience.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Data Profiling tools in dataflows (again using the web experience UI, not the desktop Power Query UI), with very specific distinctions between Column Quality, Distribution and Profile.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;External tools, including Tabular Editor, ALM Toolkit, DAX Studio, Best Practice Analyzer and Vertipaq Analyzer.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;PySpark code snippets. Both “fill in the blanks” type questions and ‘pick the correct syntax from these examples’ type questions. I was especially surprised to see that not only data transformation but also data exploration with PySpark received quite a bit of emphasis during the test (see the sketch after this list).&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
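
&lt;p&gt;Since these caught me off guard, here is a small, hedged PySpark sketch of the kind of exploration and transformation patterns the questions probe (the table and column names below are made up):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;from pyspark.sql import functions as F

# Hypothetical Lakehouse table; the exam snippets look much like this.
df = spark.read.table("sales")

# Exploration: schema, summary statistics, and a quick look at rows.
df.printSchema()
df.describe().show()
df.show(10)

# Transformation: filter, derive a column, then aggregate.
(
    df.filter(F.col("Quantity") &gt; 0)
      .withColumn("Revenue", F.col("Quantity") * F.col("UnitPrice"))
      .groupBy("Category")
      .agg(F.sum("Revenue").alias("TotalRevenue"))
      .show()
)
&lt;/code&gt;&lt;/pre&gt;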

&lt;p&gt;Armed with the above, and a bit of studying, I’m sure the certification is within reach for you as well!&lt;/p&gt;

&lt;p&gt;Do you have any other tips &amp;amp; tricks to share with fellow Microsoft Data Professionals? Feel free to share them in the comments.&lt;/p&gt;

</description>
      <category>microsoftfabric</category>
      <category>powerbi</category>
      <category>microsoft</category>
      <category>certification</category>
    </item>
    <item>
      <title>Row Level Security in Microsoft Fabric — Update: No need for Tabular Editor!</title>
      <dc:creator>Jon Stjernegaard Vöge</dc:creator>
      <pubDate>Mon, 09 Sep 2024 07:00:00 +0000</pubDate>
      <link>https://dev.to/jonvoge/row-level-security-in-microsoft-fabric-update-no-need-for-tabular-editor-1d0o</link>
      <guid>https://dev.to/jonvoge/row-level-security-in-microsoft-fabric-update-no-need-for-tabular-editor-1d0o</guid>
      <description>&lt;h2&gt;
  
  
  Row Level Security in Microsoft Fabric — Update: No need for Tabular Editor!
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2dciy4zd1t0y17te2r8e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2dciy4zd1t0y17te2r8e.png" width="800" height="240"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In March, I wrote about &lt;a href="https://downhill-data.com/2024/03/19/how-to-setup-rls-in-microsoft-fabric-power-bi-without-breaking-directlake-functionality/" rel="noopener noreferrer"&gt;How to setup RLS in Microsoft Fabric / Power BI without breaking DirectLake functionality — Downhill Data (downhill-data.com)&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Back then, RLS could not be edited through the Web Authoring experience of Semantic Models in Fabric. Instead, you had to go the roundabout way of using Tabular Editor to add RLS rules to your model.&lt;/p&gt;

&lt;p&gt;However, just a few weeks after my post, an update to the Web Authoring experience added the missing RLS functionality, eliminating the need for Tabular Editor in the solution.&lt;/p&gt;

&lt;p&gt;Therefore, I have revamped the RLS guide to leverage the new interface. Check it out below!&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating a Semantic Model with an RLS Table
&lt;/h2&gt;

&lt;p&gt;The first step is to create a new Semantic Model on top of your Data Lakehouse or Data Warehouse, in order to have a Semantic Model that uses Direct Lake.&lt;/p&gt;

&lt;p&gt;Below I have done just that, on top of some dummy Taxi data. First I created a Data Warehouse, then I imported the Taxi Data Sample, and finally I created a Usertable with a SQL statement, populating it with a few rows of data to use in RLS:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zzC4jqu0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AdQGfleFtRL2MXsj6" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zzC4jqu0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AdQGfleFtRL2MXsj6" width="800" height="731"&gt;&lt;/a&gt;&lt;/p&gt;
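
&lt;p&gt;I created the Usertable above with T-SQL in the Warehouse. As a sketch of an equivalent, in case you build on a Lakehouse instead, the same kind of RLS user table could be created from a notebook (the emails and states below are dummy values):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;from pyspark.sql import Row

# Dummy RLS user table: each email is mapped to the State it may see.
users = spark.createDataFrame([
    Row(Email="user1@contoso.com", State="Pennsylvania"),
    Row(Email="user2@contoso.com", State="New York"),
])

# Persist as a Delta table so the Semantic Model can pick it up.
users.write.format("delta").mode("overwrite").saveAsTable("Usertable")
&lt;/code&gt;&lt;/pre&gt;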

&lt;p&gt;Subsequently, I created a new Semantic Model, including all the tables:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--QSVUMPVz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AC7Vu_A8mulpuDlij" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--QSVUMPVz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AC7Vu_A8mulpuDlij" width="800" height="570"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Notice how the “Manage Roles” button is no longer greyed out at the top? That is the recent addition to the web authoring experience that makes setting up RLS easier.&lt;/p&gt;

&lt;p&gt;But before we get that far, we need an additional step.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up the Fixed Identity
&lt;/h2&gt;

&lt;p&gt;Before setting up RLS rules, we need to set up a Fixed Identity to be used instead of the end users’ credentials when authenticating against the data model.&lt;/p&gt;

&lt;p&gt;Open the settings of your newly created Semantic Model:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qfRaZlxd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AOvjznzn43X1Fq-MG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qfRaZlxd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AOvjznzn43X1Fq-MG" width="800" height="538"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Find the Gateway and cloud connections tab, and create a new connection for your data source:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NJvZ-kc5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AgebDUOfL8hAtihXN" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NJvZ-kc5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AgebDUOfL8hAtihXN" width="800" height="300"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Create the cloud connection (the Server and Database should be filled out automatically), and select OAuth 2.0 as authentication method:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--e3tU7Jlr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AwhqXTofdmSSrtGQs" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--e3tU7Jlr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AwhqXTofdmSSrtGQs" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Back in the semantic model settings, map the data source to the newly created connection:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--RKIF4qtb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AImj1tqRxZ2NCzLVA" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RKIF4qtb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AImj1tqRxZ2NCzLVA" width="800" height="392"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Congratulations! You have mapped your semantic model to a fixed identity, which will now be used when users access reports built on your model!&lt;/p&gt;

&lt;p&gt;However… This Fixed Identity, using your OAuth credentials, will have access to all of the data in your model that you have access to. Hence, we need to set up our RLS rules.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up RLS on the Semantic Model
&lt;/h2&gt;

&lt;p&gt;Once again, open up your data model in the web authoring experience, and this time, select the “Manage Roles” button at the top. This brings out an interface familiar to any Power BI Desktop user:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zJnhwlm6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A0Hsb0BJS6MsLWNx5" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zJnhwlm6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A0Hsb0BJS6MsLWNx5" width="800" height="493"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Simply use this interface to set up your Security Roles, adding any filters to the Tables required (for instance, a dynamic rule such as [Email] = USERPRINCIPALNAME() on a user table). If you prefer the old DAX editor in place of the modern conditional data filtering interface, it is easy to swap between the two.&lt;/p&gt;

&lt;p&gt;After adding your Roles, you can then go to the ‘Assign’ tab at the top to add users and AD groups to your roles:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vJZal1NB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AQvnz1gSzeK87eCRN" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vJZal1NB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AQvnz1gSzeK87eCRN" width="800" height="514"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Press the Save button, and you’re done! That’s all there is to it! No more need for Tabular Editor for this case.&lt;/p&gt;

&lt;p&gt;If you wish, you can still open the Security Settings of your Semantic Model to test the rules you’ve made. Just right-click the role in the menu and select Test As Role to check that you set up everything correctly:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4rcm7kQS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2ApmxRbR342QKFamoe" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4rcm7kQS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2ApmxRbR342QKFamoe" width="800" height="405"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And in my case, my report went from this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7TygofwQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AGwHT4Z4SyvP_SwAC" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7TygofwQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AGwHT4Z4SyvP_SwAC" width="800" height="542"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To a filtered view showing only data for Pennsylvania:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1A8AocmE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A27blUpBbKhXw_U28" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1A8AocmE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A27blUpBbKhXw_U28" width="800" height="491"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I hope you enjoyed this small update. I for one welcome any implementation that makes development in the browser easier.&lt;/p&gt;

</description>
      <category>microsoftfabric</category>
      <category>powerbi</category>
      <category>data</category>
    </item>
    <item>
      <title>Exploring the new Dynamic Subscriptions for Power BI Reports</title>
      <dc:creator>Jon Stjernegaard Vöge</dc:creator>
      <pubDate>Fri, 06 Sep 2024 07:00:00 +0000</pubDate>
      <link>https://dev.to/jonvoge/exploring-the-new-dynamic-subscriptions-for-power-bi-reports-3558</link>
      <guid>https://dev.to/jonvoge/exploring-the-new-dynamic-subscriptions-for-power-bi-reports-3558</guid>
      <description>&lt;h2&gt;
  
  
  Exploring the new Dynamic Subscriptions for Power BI Reports
&lt;/h2&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Dynamic Subscriptions for Power BI were recently released as a Preview Feature in the Power BI Service, allowing for distribution of personalized, per-recipient reports.&lt;/p&gt;

&lt;p&gt;The feature is explained in some depth in the official &lt;a href="https://learn.microsoft.com/en-gb/power-bi/collaborate-share/power-bi-dynamic-report-subscriptions" rel="noopener noreferrer"&gt;Documentation&lt;/a&gt; &amp;amp; &lt;a href="https://powerbi.microsoft.com/en-us/blog/dynamic-subscriptions-are-now-available-for-power-bi-reports/" rel="noopener noreferrer"&gt;Power BI Blog&lt;/a&gt;, but when playing around with it I came across both a few easily overlooked features and some potential use cases that I failed to set up correctly.&lt;/p&gt;

&lt;p&gt;The article below summarizes my exploration of the feature, in terms of both what works and what does not work in the current Preview state.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up Dynamic Subscriptions
&lt;/h2&gt;

&lt;p&gt;Dynamic Subscriptions are set up from the same place as the regular subscriptions — by opening a Power BI Report either from within its Workspace, or from its Workspace App, and clicking the “Subscribe to Report” button:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2ASnws-MbIwhuCZtHi" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2ASnws-MbIwhuCZtHi"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Opening Dynamic per Recipient allows you to connect to a published Semantic Model, from which you can select fields to be used for dynamic content. Currently, you can dynamically control:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Email Address of the recipients&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Email Subject for each recipient&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Report Filters for each recipient&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Attachment Filetype for each recipient&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2A4FAaqmzOISkstgjm" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2A4FAaqmzOISkstgjm"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This means that you can, for example, distribute the same report to 10 different recipient emails, each with customized Report Filters, potentially letting some recipients receive PDFs and others .pptx files. Note that there is no way to manipulate the data that you pull into the Dynamic Report; everything needs to be modeled beforehand. You are, however, not limited to connecting to the same Semantic Model as the one used for your report.&lt;/p&gt;

&lt;p&gt;In my example I’ve populated a barebones UserTable in my Data Warehouse, with two recipients each belonging to a “State” value in the Geography Dimension of my data model, but this could of course be your regular User or dynamic RLS table:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2AiLFbczrJOEhFbOD5" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2AiLFbczrJOEhFbOD5"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Afterwards, you set up the basic email details, mapping the recipient to the email field you imported:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2AWzepR3RkpoXcg98k" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2AWzepR3RkpoXcg98k"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As well as adding your dynamic filters and setting up a schedule for the subscription. Here I am filtering so that each User will only see data relevant to the State that they have access to in the Usertable:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2044%2F0%2ARO-IyyjJG0RfhBUd" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2044%2F0%2ARO-IyyjJG0RfhBUd"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2044%2F0%2AA6mCeylw7DTQSTAx" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2044%2F0%2AA6mCeylw7DTQSTAx"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After reviewing, the Subscription can easily be edited, activated/deactivated, or tested from the subscription overview:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2Ajn210-3DhiGY0-Xe" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F0%2Ajn210-3DhiGY0-Xe"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;An overview of Subscriptions may also be found in your personal settings in the Service:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2AZgtcEwN498sXbNGT" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2AZgtcEwN498sXbNGT"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Testing the Subscription shows how emails and attachments may look:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2044%2F0%2AFPfFmAyoeURakMNC" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2044%2F0%2AFPfFmAyoeURakMNC"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2AgWgXlVbu5Kzuj6Ct" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2AgWgXlVbu5Kzuj6Ct"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And in this case, only Pennsylvania data is shown to the recipient of the report.&lt;/p&gt;

&lt;h2&gt;
  
  
  Exploring Filter possibilities and limitations
&lt;/h2&gt;

&lt;p&gt;So that was the basics out of the way. Next up, I ran a couple of experiments to test the limits of the filter functionality.&lt;/p&gt;

&lt;h3&gt;
  
  
  Measure Filters
&lt;/h3&gt;

&lt;p&gt;First, I tried creating a Measure in my Semantic Model and using it as a filter condition, starting with a measure for Today() to dynamically filter by today’s date:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2ARvTGV227Amfh9adC" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2ARvTGV227Amfh9adC"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2048%2F0%2A6C0zLbzO75roVkdD" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2048%2F0%2A6C0zLbzO75roVkdD"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Unfortunately, my test failed. I subsequently tried creating a Measure to filter for only “Friday” records, and also tried a numeric measure filter, both of which failed as well.&lt;/p&gt;

&lt;p&gt;It appears that it is not currently possible to use Measures in filters, despite the ability to select them in the data selection menu. If you have figured out a way to do so, please do drop me a message! I suspect they may be usable in dynamic headers in the subscriptions, or something similar.&lt;/p&gt;

&lt;h3&gt;
  
  
  Column Filters
&lt;/h3&gt;

&lt;p&gt;Column filters on the other hand. Now they work great! In my test, I tried creating both a date column in the Data Warehouse for today’s date and a Text column to filter for State. These tests succeeded, and the result is reports which filter correctly for the supplied values:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2ASEUpBQNzX-bkLo5r" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2ASEUpBQNzX-bkLo5r"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2044%2F0%2A0OtiVrJjWWWW9OAg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2044%2F0%2A0OtiVrJjWWWW9OAg"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Report Filters
&lt;/h3&gt;

&lt;p&gt;Applying Report filters was also a success. In this example I added a Page Level filter for Month = October as the only filter, and the results were indeed only October values:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2Aa41TyORwX6azJbBU" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2Aa41TyORwX6azJbBU"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Visual Filters
&lt;/h2&gt;

&lt;p&gt;And finally, it is possible to take the visual interactions between visualisations and apply those to the subscription: filters set in slicers, or values highlighted in visuals, will trigger a new option, ‘Pre-Existing filters’, in the ‘Map your data’ menu:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2AWeJjQMlEr5pLK2d-" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2AWeJjQMlEr5pLK2d-"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2Af2p_ExkVSiZxeSTq" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2Af2p_ExkVSiZxeSTq"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2048%2F0%2ANH_lNfsQ0i1nOE8m" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2048%2F0%2ANH_lNfsQ0i1nOE8m"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Limitations and Oddities
&lt;/h2&gt;

&lt;p&gt;The feature is still in preview, and there are some limitations to the functionality. The Measures as Filters issue described above is one. Another is the current cap of 50 separate recipients, listed in the official documentation.&lt;/p&gt;

&lt;p&gt;Hence, if you need more than 50 recipients, the good old Power Automate Export To File is still your only option: &lt;a href="https://learn.microsoft.com/en-us/power-bi/collaborate-share/service-automate-power-bi-report-export" rel="noopener noreferrer"&gt;Export and email a report with Power Automate — Power BI | Microsoft Learn&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Another weird bug I encountered is that some of my Dynamic Subscriptions started sending twice at the same time, seemingly without consistency. I couldn’t quite isolate when it would happen:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2ALSA3xvdx7BkIu4s3" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2046%2F0%2ALSA3xvdx7BkIu4s3"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;Dynamic Subscriptions for Power BI reports are, to me, a welcome feature, and may in time (once the recipient limit is lifted) be a worthy substitute for current Power Automate subscription flows.&lt;/p&gt;

&lt;p&gt;The ability to allow the reports to propagate visual level filters to the attachment reports in the subscription is a cool feature, which may help highlight specific insights in key reports.&lt;/p&gt;

&lt;p&gt;I’m curious to see if people brighter than me will be able to make even better use of the dynamic filtering options, and potentially find a way to leverage Measures as well.&lt;/p&gt;

</description>
      <category>powerfuldevs</category>
      <category>powerplatform</category>
      <category>powerbi</category>
      <category>microsoftfabric</category>
    </item>
    <item>
      <title>How to setup RLS in Microsoft Fabric / Power BI without breaking DirectLake functionality</title>
      <dc:creator>Jon Stjernegaard Vöge</dc:creator>
      <pubDate>Wed, 04 Sep 2024 07:00:00 +0000</pubDate>
      <link>https://dev.to/jonvoge/how-to-setup-rls-in-microsoft-fabric-power-bi-without-breaking-directlake-functionality-11me</link>
      <guid>https://dev.to/jonvoge/how-to-setup-rls-in-microsoft-fabric-power-bi-without-breaking-directlake-functionality-11me</guid>
      <description>&lt;h2&gt;
  
  
  How to setup RLS in Microsoft Fabric / Power BI without breaking DirectLake functionality
&lt;/h2&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;RLS in Microsoft Fabric is a convoluted topic which, at the time of writing, is not very clearly spelled out for users.&lt;/p&gt;

&lt;p&gt;Most documentation refers to setting up RLS on Data Warehouses or Data Lakehouses by applying permissions directly on the tables: &lt;a href="https://learn.microsoft.com/en-us/fabric/data-warehouse/row-level-security" rel="noopener noreferrer"&gt;Row-level security in Fabric data warehousing — Microsoft Fabric | Microsoft Learn&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;However, to the frustration of many Power BI developers, this breaks the coveted DirectLake functionality, making the storage mode fall back to DirectQuery and leaving us unable to leverage those sweet performance gains.&lt;/p&gt;

&lt;p&gt;However, what if I told you that there is a way to implement RLS without breaking DirectLake? Actually, you just need to do it the old-fashioned way: apply RLS directly on the Semantic Model, but with a twist involving fixed identities, a process which can only be done using external tools.&lt;/p&gt;

&lt;p&gt;Read on to learn how!&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating a Semantic Model with an RLS Table
&lt;/h2&gt;

&lt;p&gt;The first step is to create a new Semantic Model on top of your Data Lakehouse or Data Warehouse, in order to have a Semantic Model that uses Direct Lake.&lt;/p&gt;

&lt;p&gt;Below I have done just that, on top of some dummy Taxi data. First I created a Data Warehouse, then I imported the Taxi Data Sample, and finally I created a Usertable with a SQL statement, populating it with a few rows of data to use in RLS:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--nwsAK40V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2A-OHODCzcLrTDi90F" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--nwsAK40V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2A-OHODCzcLrTDi90F" width="800" height="731"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Subsequently, I created a new Semantic Model, including all the tables:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CyyCHsGG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AIzQNAF-KFASvoU3R" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CyyCHsGG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AIzQNAF-KFASvoU3R" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Notice how the “Manage Roles” button is greyed out at the top? That is the cardinal issue most people face when setting up RLS on Semantic Models in Fabric. They simply get to this point and then stop.&lt;/p&gt;

&lt;p&gt;But there is a way!&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up the Fixed Identity
&lt;/h2&gt;

&lt;p&gt;The first thing we need to do is set up a Fixed Identity to be used instead of the end users’ credentials when authenticating against the data model.&lt;/p&gt;

&lt;p&gt;Open the settings of your newly created Semantic Model:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HAklHWPs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AW-3EVmXo8ljM7wA6" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HAklHWPs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AW-3EVmXo8ljM7wA6" width="800" height="538"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Find the Gateway and cloud connections tab, and create a new connection for your data source:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--q68fzRRt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AHq3Pt5OsLBjff76C" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--q68fzRRt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AHq3Pt5OsLBjff76C" width="800" height="300"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Create the cloud connection (the Server and Database should be filled out automatically), and select OAuth 2.0 as the authentication method:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WNfSWVZH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AWVBa1kabOEoFvUh5" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WNfSWVZH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AWVBa1kabOEoFvUh5" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Back in the semantic model settings, map the data source to the newly created connection:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Lfn0yg2z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AUFT2lm8lqSgSsKUV" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Lfn0yg2z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AUFT2lm8lqSgSsKUV" width="800" height="392"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Congratulations! You have mapped your semantic model to a fixed identity, which will now be used when users access reports built on your model!&lt;/p&gt;

&lt;p&gt;However… this Fixed Identity, using your OAuth credentials, will have access to all of the data in your model that you have access to. Hence, we need to set up our RLS rules.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up RLS on the Semantic Model
&lt;/h2&gt;

&lt;p&gt;As you saw previously, we are unable to create security roles directly in the browser. And unfortunately, creating a report in the browser based on the semantic model, downloading the .pbix, and trying to set up roles in there does not offer better results:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qOwnEJec--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AuYDO13MJ_2H8VBXQ" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qOwnEJec--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AuYDO13MJ_2H8VBXQ" width="800" height="497"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Instead, what we need to do is use an external tool against the XMLA endpoint of the workspace to manipulate the semantic model. In this case, we will use the free tool Tabular Editor 2: &lt;a href="https://tabulareditor.github.io/TabularEditor/" rel="noopener noreferrer"&gt;Tabular Editor 2.x | TabularEditor&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Go to the Premium tab of your workspace settings, and retrieve the connection string shown there. This is the XMLA endpoint for the workspace in question (it takes the form powerbi://api.powerbi.com/v1.0/myorg/YourWorkspaceName):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--r-W23kMc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AFPwQVdLJymVS9RAH" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--r-W23kMc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AFPwQVdLJymVS9RAH" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, open Tabular Editor and connect to a tabular server, providing the copied workspace connection string as the server:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4kQPt-jb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AOyw30gWoRiAPaf_3" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4kQPt-jb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AOyw30gWoRiAPaf_3" width="800" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After authenticating, select your semantic model:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lj6jkCE_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AIjY-_cqXmp98wDVN" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lj6jkCE_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AIjY-_cqXmp98wDVN" width="800" height="421"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And finally, we can set up our RLS roles. Right-click the Roles folder and create a new role. Scroll down to Row Level Security, and apply your RLS rule of choice. In this case, I am implementing a simple DAX filter on my Geography dimension, limiting the data to Pennsylvania. (If you want user-specific filtering rather than a static rule, this is where the Usertable from earlier comes in: a table filter such as [UserPrincipalName] = USERPRINCIPALNAME() stays dynamic, because RLS on the semantic model is still evaluated against the report viewer, not the fixed identity.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mt7Cz22i--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AUJ-e2Qs5n9ZIVM1b" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mt7Cz22i--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AUJ-e2Qs5n9ZIVM1b" width="800" height="505"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After setting up your rules, scroll further down and make sure to change the Model Permission value from None to Read. Otherwise, you will not be able to publish your changes to the semantic model.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TxtwYjRx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AveQ-g4B7Cv5GP5vn" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TxtwYjRx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AveQ-g4B7Cv5GP5vn" width="800" height="489"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Another issue you may encounter when saving your changes is that Tabular Editor complains about Read/Write permissions on your tenant. In order to save and publish the changes to the model, you need to ensure that your Capacity Settings in the Fabric Admin Portal allow both Read AND Write on XMLA Endpoints. In my case, I had to go and make this adjustment:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--sZG1fhY_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2Ao1eJf-6FfBgDTNIF" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--sZG1fhY_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2Ao1eJf-6FfBgDTNIF" width="750" height="809"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now you can finally save and publish your model changes to the service. The next step is to open the Security Settings of your semantic model, and add members to your newly created roles:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--f9DiWp-E--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2ACB7AP0_aGqGP8Np5" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--f9DiWp-E--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2ACB7AP0_aGqGP8Np5" width="800" height="347"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Right-click the role in the menu and select Test As Role, to verify that you set everything up correctly:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Lm7OK4ko--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2A_C-Gha6Gcbtl8w5v" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Lm7OK4ko--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2A_C-Gha6Gcbtl8w5v" width="800" height="405"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And in my case, my report went from this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2Y1ohnw---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2Aya81lM-H_rtN_FdX" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2Y1ohnw---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2Aya81lM-H_rtN_FdX" width="800" height="542"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To a filtered view showing data only for Pennsylvania:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CswlcPCh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AntLcCnhG_9r1nWQH" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CswlcPCh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AntLcCnhG_9r1nWQH" width="800" height="491"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And that’s all there is to it!&lt;/p&gt;

&lt;p&gt;With this RLS setup, your report will always use your credentials as a fixed identity for accessing the Lakehouse/Warehouse underneath your semantic model. That, in turn, requires you to set up RLS on the semantic model via Tabular Editor, to ensure that no security issues occur, but it has the added benefit that Direct Lake functionality remains intact when consuming the report.&lt;/p&gt;

</description>
      <category>microsoft</category>
      <category>lakehouse</category>
      <category>database</category>
      <category>microsoftfabric</category>
    </item>
    <item>
      <title>How to create Measure Tables in Microsoft Fabric</title>
      <dc:creator>Jon Stjernegaard Vöge</dc:creator>
      <pubDate>Mon, 02 Sep 2024 07:00:00 +0000</pubDate>
      <link>https://dev.to/jonvoge/how-to-create-measure-tables-in-microsoft-fabric-4j6o</link>
      <guid>https://dev.to/jonvoge/how-to-create-measure-tables-in-microsoft-fabric-4j6o</guid>
      <description>&lt;h2&gt;
  
  
  How to create Measure Tables in Microsoft Fabric
&lt;/h2&gt;

&lt;h2&gt;
  
  
  Creating Measure Tables in Microsoft Fabric Semantic Models
&lt;/h2&gt;

&lt;p&gt;Measure Tables have long been used to store Measures in Power BI Semantic Models.&lt;/p&gt;

&lt;p&gt;However, in Microsoft Fabric, where most of the data modeling takes place in the browser (at least if we wish to reap the benefits of the Direct Lake storage mode), the process of creating Measure Tables is slightly different.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 1: Creating an empty table
&lt;/h2&gt;

&lt;p&gt;First you need to create an empty table. Or rather, a table with a placeholder column or two (a table can’t have zero columns in Fabric); you should not put any real data in it.&lt;/p&gt;

&lt;p&gt;If the Data Lakehouse is your weapon of choice, you may use a Notebook to do so with Python:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;data = [(1, 'MeasureTable')]
columns = ['ID', 'Col1']

measure_df = spark.createDataFrame(data, columns)
measure_df.show()

spark.sql("DROP TABLE IF EXISTS MeasureTable")
measure_df.write.format("delta").saveAsTable('MeasureTable')
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UJnz9JGU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2As8AelKiUrYPd3QMR" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UJnz9JGU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2As8AelKiUrYPd3QMR" width="800" height="465"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you are using the Lakehouse SQL Endpoint, you can create an empty view (for example, CREATE VIEW MeasureTable AS SELECT 1 AS Col1), or, if you are using a Data Warehouse, create the table with SQL:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--XzegLwMm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AY_q6KdHdIUjM8CHa" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XzegLwMm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AY_q6KdHdIUjM8CHa" width="800" height="625"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 2: Pulling the table into your Semantic Model
&lt;/h2&gt;

&lt;p&gt;The next step is to create a Semantic Model on top of your Warehouse or Lakehouse:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vp1RTfEv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AKIQXpeB_WHW9AcXe" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vp1RTfEv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AKIQXpeB_WHW9AcXe" width="800" height="529"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And subsequently, add a couple of measures, before finally hiding the placeholder column(s) in the table to convert it into a Measure Table:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3JkddsMW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AChwRwiUNEAGmrE6s" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3JkddsMW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AChwRwiUNEAGmrE6s" width="703" height="698"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And there you have it! It is that simple to create centralized measure tables in your Microsoft Fabric semantic models. Go build!&lt;/p&gt;

</description>
      <category>microsoftfabric</category>
      <category>microsoft</category>
      <category>data</category>
      <category>database</category>
    </item>
    <item>
      <title>What is using all of my Microsoft Fabric Capacity? A quick guide to installing and using the Fabric Capacity Metrics App</title>
      <dc:creator>Jon Stjernegaard Vöge</dc:creator>
      <pubDate>Sun, 01 Sep 2024 20:33:02 +0000</pubDate>
      <link>https://dev.to/jonvoge/what-is-using-all-of-my-microsoft-fabric-capacity-a-quick-guide-to-installing-and-using-the-fabric-capacity-metrics-app-38ie</link>
      <guid>https://dev.to/jonvoge/what-is-using-all-of-my-microsoft-fabric-capacity-a-quick-guide-to-installing-and-using-the-fabric-capacity-metrics-app-38ie</guid>
      <description>&lt;h2&gt;
  
  
  What is using all of my Microsoft Fabric Capacity? A quick guide to installing and using the Fabric Capacity Metrics App
&lt;/h2&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Are you experiencing throttling, or perhaps even outages, in your Microsoft Fabric Capacity? Are you wondering why your Capacity is maxed out, or perhaps wishing to explore your usage more deeply?&lt;/p&gt;

&lt;p&gt;Then you have come to the right place.&lt;/p&gt;

&lt;p&gt;This brief article will show you how to install the Fabric Capacity Metrics App, and give you some quick pointers for navigating it and understanding the insights it provides.&lt;/p&gt;

&lt;h2&gt;
  
  
  Installing the Fabric Capacity Metrics App
&lt;/h2&gt;

&lt;p&gt;To install the Capacity Metrics App, you need to follow a few steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Start installing the App by going to app.powerbi.com, finding the Capacity Settings in the Admin Portal, and clicking “See usage report”:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mtxnfCYL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2042/0%2AoEm6I1dLhowc-Apq" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mtxnfCYL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2042/0%2AoEm6I1dLhowc-Apq" width="800" height="233"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;From the App page, select ‘Get It Now’:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Y8W1xBhD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AoRMil9DQj1eCSp3U" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Y8W1xBhD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2AoRMil9DQj1eCSp3U" width="800" height="279"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt;After installing, open the report and click ‘Connect’:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--x9E4WuqT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AfAEEEeNeDdRW_Fm_" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--x9E4WuqT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AfAEEEeNeDdRW_Fm_" width="800" height="410"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="4"&gt;
&lt;li&gt;Find the Capacity ID in your Capacity Settings, and input it in the Connection menu:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UQFO3T9c--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2ANVXrPMMG_wZYr-cw" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UQFO3T9c--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2044/0%2ANVXrPMMG_wZYr-cw" width="800" height="405"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--H4YaRBgd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AHgx5Lw1sL8dCm3HW" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--H4YaRBgd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2000/0%2AHgx5Lw1sL8dCm3HW" width="800" height="807"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, add data source credentials to the installed app’s data model, and let the report refresh:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6rSnvtCS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Aln3h_IwaJvjk9-aY" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6rSnvtCS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2Aln3h_IwaJvjk9-aY" width="800" height="546"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Using the Capacity Metrics App
&lt;/h2&gt;

&lt;p&gt;With the app installed, it’s time to have a look at the insights it can assist you with. The App is split into multiple pages. From the main navigation, you’ll have access to the Compute and Storage pages, but I find that some of the most impactful insights are the ones hidden in the drill-through pages. Read on below for a small tour.&lt;/p&gt;

&lt;h3&gt;
  
  
  Compute Page
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--oN6z5lfL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AFXHrA5TOPv-yYVqD" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--oN6z5lfL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AFXHrA5TOPv-yYVqD" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Compute page provides an overwhelming amount of information. The way I use the page is as follows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Am I leveraging the full extent of my Capacity, and what types of processes are driving the usage?&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;The top-right ‘CU over time’ chart in this case shows that I am only using about 1% of my Trial F64 SKU. I even had to switch the diagram to logarithmic Y-axis scaling to see anything in the visualisation.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--GRplUVub--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AMFleyTht8UaiqDGp" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GRplUVub--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AMFleyTht8UaiqDGp" width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The chart also clues me in on whether my usage has been driven by interactive or background processes. In my case, mostly background (no wonder, seeing as this is my developer account tenant, used for testing purposes).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Had my usage been higher, the Throttling and Overages views would have shown more significant insights, allowing me to see Delays, Rejections and Overages. In my case, there are only very limited delays:&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--02laNGSj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AWcFiAo1soHCFzSy0" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--02laNGSj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AWcFiAo1soHCFzSy0" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;&lt;strong&gt;Which artifact types and specific items are spending most of my Capacity?&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;The Ribbon chart allows for a quick overview of which artifact types are eating my capacity. I find that I only really use the CU option here, as that is also the billing metric. In my case, I clearly have something going on with some expensive Warehouse artifacts:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xXWxk6Db--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2028/0%2A_K3bINrIBqafJ1dR" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xXWxk6Db--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2028/0%2A_K3bINrIBqafJ1dR" width="800" height="421"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Don’t forget to click a data point to filter the other visuals. In my case, the culprit is the DataflowsStagingWarehouse, as shown in the filtered Items table:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Y6xRo9y7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AN5maffjuTT0Jghax" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Y6xRo9y7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AN5maffjuTT0Jghax" width="800" height="501"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For more details on the Compute page, have a look at the official documentation: &lt;a href="https://learn.microsoft.com/en-us/fabric/enterprise/metrics-app-compute-page" rel="noopener noreferrer"&gt;Understand the metrics app compute page — Microsoft Fabric | Microsoft Learn&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Timepoint Details Page
&lt;/h3&gt;

&lt;p&gt;To me, the hidden gem of the Fabric Capacity Metrics App is the Timepoint Details page, which many people don’t even realize exists.&lt;/p&gt;

&lt;p&gt;To find it, right-click or select a specific time point in the top-right visualisation, and drill through to the Timepoint Details page:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fJdRIjjp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AUSJPrEItD25w4y5A" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fJdRIjjp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AUSJPrEItD25w4y5A" width="800" height="336"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Which specific actions are driving my usage?&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;On the Timepoint Details page I can see that there are both Interactive and Background operations going on during the selected 30-second timepoint, many of which are indeed related to the Warehouse identified before.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zMFH2enh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2ALy-0yCcLv2ZI4UdK" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zMFH2enh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2ALy-0yCcLv2ZI4UdK" width="800" height="574"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Notice how the Interactive Operations show timepoints that start right around the selected Timepoint? This is because interactive operations are smoothed over a 5-minute time interval, in terms of how the usage affects your capacity.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Notice how the Background Operations display activities which are not taking place in the selected Timepoint? This is because background operations are smoothed over 24 hours! (See the small arithmetic sketch after this list for what smoothing means in CU terms.)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Nonetheless, I can see that at the selected point in time, my capacity was used interactively for some write-back to the Warehouse using the XMLA endpoint (more about live data write-back in this blogpost: &lt;a href="https://downhill-data.com/2024/05/07/live-data-write-back-to-fabric-data-warehouse-from-power-bi-w-power-apps-and-power-automate/" rel="noopener noreferrer"&gt;Link to blog about Write-Back to Fabric&lt;/a&gt;), as well as some background activities which, due to smoothing, were also consuming CUs at this time.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
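
&lt;p&gt;To make the smoothing intuition concrete, here is a minimal arithmetic sketch. It assumes the smoothing windows described above (5 minutes for interactive, 24 hours for background) and 30-second timepoints; the operation cost is invented for illustration, not taken from my tenant:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Illustrative numbers only: how smoothing spreads an operation's cost
# across 30-second timepoints.
TIMEPOINT_SECONDS = 30

def cu_per_timepoint(total_cu_seconds, smoothing_window_seconds):
    # CU-seconds the operation contributes to each timepoint in its window.
    timepoints = smoothing_window_seconds / TIMEPOINT_SECONDS
    return total_cu_seconds / timepoints

# A hypothetical background job costing 86,400 CU-seconds is spread over
# 24 hours = 2,880 timepoints -&gt; 30 CU-seconds per timepoint.
print(cu_per_timepoint(86_400, 24 * 60 * 60))  # 30.0

# The same cost as an interactive operation is spread over 5 minutes
# = 10 timepoints -&gt; 8,640 CU-seconds per timepoint.
print(cu_per_timepoint(86_400, 5 * 60))        # 8640.0

# For scale: an F64 capacity provides 64 CUs, i.e. 64 * 30 = 1,920
# CU-seconds per 30-second timepoint.
print(64 * TIMEPOINT_SECONDS)                  # 1920
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The takeaway: the same total cost barely registers per timepoint when smoothed as a background operation, but lands hard when it is interactive, which is exactly why the two are split out on this page.&lt;/p&gt;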

&lt;ol start="2"&gt;
&lt;li&gt;&lt;strong&gt;Are there any specific Users or Operations which stand out?&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Another thing I like to do is examine whether any specific Users or actions stand out in the report. In this case, I can see that it is me, using the jonAdmin account, who is driving most of the usage.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Especially if you are diving into an overage scenario, it is most helpful to know if specific users are misusing Fabric Items, so that you can help them find a better solution.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As another example, in a different tenant, I was curious about how expensive it was to perform Copilot queries, and I could observe the following in the Timepoint Details page:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--b4pDBvIt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AlOPkwDohuIbgItCk" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--b4pDBvIt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2AlOPkwDohuIbgItCk" width="800" height="571"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There I can see that Copilot queries are most definitely not “free” to execute in terms of Capacity Units. If you are curious specifically about the CU consumption of Copilot queries, have a look at Chris Webb’s blog: &lt;a href="https://blog.crossjoin.co.uk/2024/03/17/how-much-does-copilot-cost-in-microsoft-fabric/" rel="noopener noreferrer"&gt;How much does Copilot cost in Microsoft Fabric? (crossjoin.co.uk)&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For more information about the Timepoints Detail page, here is a link to the official documentation: &lt;a href="https://learn.microsoft.com/en-us/fabric/enterprise/metrics-app-timepoint-page" rel="noopener noreferrer"&gt;Understand the metrics app timepoint page — Microsoft Fabric | Microsoft Learn&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  OneLake Storage page
&lt;/h3&gt;

&lt;p&gt;Finally, we have the OneLake Storage page:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Mwu2uplm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AIvuyHwblK0O15rPn" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Mwu2uplm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2048/0%2AIvuyHwblK0O15rPn" width="800" height="566"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In Microsoft Fabric, we pay for storage in addition to our Capacity, so another relevant question for any Fabric Administrator might be:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;How much billable storage are we using, and what are the primary contributors?&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Luckily, this page is easy to interpret. My Demo workspace takes first place in billable storage by far, and by clicking it I get a filtered view of the workspace-specific Total Storage and Billable Storage.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Clicking the Export Data button in the top right further lets you download this storage data if you wish:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BvdVGxyJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A9E4rK6cHsCjD1HIK" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BvdVGxyJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn-images-1.medium.com/max/2046/0%2A9E4rK6cHsCjD1HIK" width="800" height="567"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;More details about the storage page can be found on this documentation page: &lt;a href="https://learn.microsoft.com/en-us/fabric/enterprise/metrics-app-storage-page" rel="noopener noreferrer"&gt;Understand the metrics app storage page — Microsoft Fabric | Microsoft Learn&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;The Capacity Metrics App is an excellent tool for understanding your Capacity Usage and Storage, as well as pinpointing exact operations which might be the cause of overages.&lt;/p&gt;

&lt;p&gt;Use the app to monitor your whole capacity and/or specific workspaces, and to understand if the users of your tenant are leveraging their Fabric Capabilities correctly.&lt;/p&gt;

&lt;p&gt;I hope you learned a thing or two from these sample questions and the approaches to answering them. What are you using the Metrics App for? Please let me know if you have other cool use cases!&lt;/p&gt;

</description>
      <category>microsoftfabric</category>
      <category>capacitymetrics</category>
      <category>microsoft</category>
      <category>data</category>
    </item>
  </channel>
</rss>
