<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Tommaso Stocchi</title>
    <description>The latest articles on DEV Community by Tommaso Stocchi (@tommasodotnet).</description>
    <link>https://dev.to/tommasodotnet</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1113272%2F534d1ecf-2507-4c86-aaf5-31439e7e4b97.jpg</url>
      <title>DEV Community: Tommaso Stocchi</title>
      <link>https://dev.to/tommasodotnet</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/tommasodotnet"/>
    <language>en</language>
    <item>
      <title>Aspire Multilanguage with .NET Aspire Community Toolkit</title>
      <dc:creator>Tommaso Stocchi</dc:creator>
      <pubDate>Mon, 25 Nov 2024 12:10:26 +0000</pubDate>
      <link>https://dev.to/tommasodotnet/aspire-multilanguage-with-net-aspire-community-toolkit-467a</link>
      <guid>https://dev.to/tommasodotnet/aspire-multilanguage-with-net-aspire-community-toolkit-467a</guid>
      <description>&lt;p&gt;Not long ago, I posted about how to use .NET Aspire even when our distributed application involves &lt;a href="https://dev.to/tommasodotnet/net-aspire-multilanguage-57l3"&gt;different languages&lt;/a&gt;. In that article, I used the ability of .NET Aspire to support different languages natively or via containers to build a distributed calculator. In this article, I will show how this has improved thanks to the &lt;a href="https://devblogs.microsoft.com/dotnet/introducing-the-dotnet-aspire-community-toolkit/" rel="noopener noreferrer"&gt;.NET Aspire Community Toolkit&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is the .NET Aspire Community Toolkit?
&lt;/h2&gt;

&lt;p&gt;It's an open-source repository where third parties can contribute their own integrations and extensions to the .NET Aspire ecosystem. Writing such integrations is easy, as demonstrated by &lt;a href="https://www.youtube.com/watch?v=XMrtdRaxnOM" rel="noopener noreferrer"&gt;Aaron Powell during .NET Conf&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The .NET Aspire Community Toolkit already offers all the integrations we need to move the distributed calculator away from local containers and run the apps as local executables instead.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://learn.microsoft.com/en-us/dotnet/aspire/community-toolkit/hosting-golang?tabs=dotnet-cli" rel="noopener noreferrer"&gt;Golang&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://learn.microsoft.com/en-us/dotnet/aspire/community-toolkit/hosting-nodejs-extensions?tabs=dotnet-cli%2Cyarn" rel="noopener noreferrer"&gt;NodeJs Extensions&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  How to use the .NET Aspire Community Toolkit
&lt;/h2&gt;

&lt;p&gt;To use the Golang integration, simply add the appropriate NuGet package to your project:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;dotnet add package CommunityToolkit.Aspire.Hosting.Golang
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, use the following code to configure the Golang app:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="k"&gt;add&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddGolangApp&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"addapp"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"../../go-adder"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithHttpEndpoint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;env&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"APP_PORT"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;PublishAsDockerFile&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is a significant improvement over our previous setup:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="k"&gt;add&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ExecutionContext&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;IsPublishMode&lt;/span&gt;
    &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddContainer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"addapp"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"acrt6xtihl2b3uxe.azurecr.io/addapp"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddContainer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"addapp"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"addapp"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
        &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithHttpEndpoint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;targetPort&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;6000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;env&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"APP_PORT"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"http"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithOtlpExporter&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithEnvironment&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"OTEL_SERVICE_NAME"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"addapp"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;PublishAsContainer&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Additionally, we can leverage the Node.js extensions to automatically install the npm packages required by our Node.js divider app and frontend:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;divide&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddNodeApp&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"divideapp"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;scriptPath&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"app.js"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;workingDirectory&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"../../node-divider"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithNpmPackageInstallation&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="c1"&gt;// other config&lt;/span&gt;

&lt;span class="c1"&gt;//...&lt;/span&gt;

&lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddViteApp&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"calculator-front-end"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;workingDirectory&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"../../react-calculator"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;packageManager&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"yarn"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithYarnPackageInstallation&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="c1"&gt;// other config&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;WithNpmPackageInstallation&lt;/code&gt; and &lt;code&gt;WithYarnPackageInstallation&lt;/code&gt; methods automatically install the required packages for the Node.js and React apps, respectively.&lt;/p&gt;

&lt;h2&gt;
  
  
  More than just other languages
&lt;/h2&gt;

&lt;p&gt;The .NET Aspire Community Toolkit is not just about languages. It also offers integrations with other services, such as &lt;a href="https://learn.microsoft.com/en-us/azure/data-api-builder/overview" rel="noopener noreferrer"&gt;Data API Builder&lt;/a&gt;. In &lt;a href="https://dev.to/tommasodotnet/net-aspire-for-data-chatting-3ghe"&gt;a previous post&lt;/a&gt;, I demonstrated how to use Data API Builder to create a chat-with-your-database application. Back then, integrating DAB with .NET Aspire looked like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;dabService&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddContainer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"dab"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"mcr.microsoft.com/azure-databases/data-api-builder"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithHttpEndpoint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;targetPort&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;5000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"http"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithBindMount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;@"D:\src\distributed-dab-sk\aw-data-api-builder\dab-config.json"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"/App/dab-bm0/dab-config.json"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;true&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithArgs&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"--ConfigFileName"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"./dab-bm0/dab-config.json"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithReference&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sql&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithOtlpExporter&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;PublishAsContainer&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Using the &lt;a href="https://learn.microsoft.com/en-us/dotnet/aspire/community-toolkit/hosting-data-api-builder?tabs=dotnet-cli" rel="noopener noreferrer"&gt;Data API Builder integration&lt;/a&gt;, we can simplify it to:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;dabService&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddDataAPIBuilder&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"dab"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;@"..\aw-data-api-builder\dab-config.json"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Useful Links
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/tommasodotNET/aspire-multilanguage" rel="noopener noreferrer"&gt;Updated Distributed Application Repository&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://learn.microsoft.com/en-us/dotnet/aspire/community-toolkit/overview?tabs=dotnet-cli" rel="noopener noreferrer"&gt;.NET Aspire Community Toolkit&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/CommunityToolkit/Aspire/issues" rel="noopener noreferrer"&gt;Propose your own integration!&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
    </item>
    <item>
      <title>.NET Aspire and Dapr. Overlap or complementary?</title>
      <dc:creator>Tommaso Stocchi</dc:creator>
      <pubDate>Sun, 29 Sep 2024 10:53:23 +0000</pubDate>
      <link>https://dev.to/tommasodotnet/net-aspire-and-dapr-overlap-or-complementary-1beb</link>
      <guid>https://dev.to/tommasodotnet/net-aspire-and-dapr-overlap-or-complementary-1beb</guid>
      <description>&lt;p&gt;Although .NET Aspire and Dapr are two different projects, they have some overlap in their goals. Both projects aim to simplify the development of distributed applications by providing a set of building blocks that developers can use. However, there are some key differences between the two projects that make them complementary rather than competing.&lt;/p&gt;

&lt;p&gt;Let's take a closer look at the goals of each project and how they can work together to help developers build better distributed applications.&lt;/p&gt;

&lt;h2&gt;
  
  
  .NET Aspire
&lt;/h2&gt;

&lt;p&gt;.NET Aspire aims to simplify the development of distributed applications, including architectural components such as cache, databases, and queues. These components - now called &lt;a href="https://devblogs.microsoft.com/dotnet/announcing-dotnet-aspire-8-2/" rel="noopener noreferrer"&gt;integrations&lt;/a&gt; - are designed to be easy to use and configure, so developers can focus on building their applications rather than worrying about the underlying infrastructure.&lt;/p&gt;

&lt;p&gt;On top of that, Aspire can inject the connection strings of both the integrations and the different services wherever they are needed. Let's take a look at the basic .NET Aspire template by running the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;dotnet new aspire-starter &lt;span class="nt"&gt;--use-redis-cache&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the &lt;code&gt;&amp;lt;Project_Name.AppHost&amp;gt;&lt;/code&gt; project that has been created for us, we can find the following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;DistributedApplication&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;CreateBuilder&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;cache&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddRedis&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"cache"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;apiService&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AddProject&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;Projects&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;DistributedDAB_ApiService&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;(&lt;/span&gt;&lt;span class="s"&gt;"apiservice"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AddProject&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;Projects&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;DistributedDAB_Web&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;(&lt;/span&gt;&lt;span class="s"&gt;"webfrontend"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithExternalHttpEndpoints&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithReference&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cache&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithReference&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;apiService&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Build&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;Run&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As you can see, we are adding a Redis cache and a project called ApiService. We are also adding a project called WebFrontend that exposes external HTTP endpoints and references both the cache and the ApiService. Behind the scenes, .NET Aspire injects the cache's connection string and the ApiService's endpoint into the web frontend. In .NET, we can consume the endpoint like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Services&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AddHttpClient&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;WeatherApiClient&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;(&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;BaseAddress&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"https+http://apiservice"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In other languages, we would read that endpoint from an environment variable:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;services__apiservice__http__0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
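For example, a Node.js service running under the same AppHost could read that variable at startup. This is a minimal sketch, not code from the repository; the function name and fallback URL are illustrative:

```javascript
// Resolve a service endpoint the way a Node.js app would read the value
// that .NET Aspire injects at startup. Aspire names the variable
// services__{serviceName}__{endpointName}__{index}.
function resolveServiceUrl(serviceName, fallback) {
  const key = `services__${serviceName}__http__0`;
  return process.env[key] ?? fallback;
}

// When run under the Aspire AppHost the variable is set;
// otherwise we fall back to a local default.
const apiServiceUrl = resolveServiceUrl("apiservice", "http://localhost:5000");
```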



&lt;p&gt;On top of that, Aspire offers observability via the &lt;a href="https://learn.microsoft.com/en-us/dotnet/aspire/fundamentals/dashboard/overview?tabs=bash" rel="noopener noreferrer"&gt;Aspire Dashboard&lt;/a&gt; leveraging OpenTelemetry. Using the dashboard, developers can monitor the performance of their applications, read logs and traces, and troubleshoot issues in real-time. As a cherry on top, we can leverage the &lt;a href="https://github.com/Azure/azure-dev" rel="noopener noreferrer"&gt;Azure Developer CLI&lt;/a&gt; to &lt;a href="https://learn.microsoft.com/en-us/dotnet/aspire/deployment/azure/aca-deployment" rel="noopener noreferrer"&gt;easily deploy&lt;/a&gt; our applications to Azure.&lt;/p&gt;

&lt;p&gt;.NET Aspire effectively accompanies a distributed application all the way from the IDE to the cloud.&lt;/p&gt;

&lt;h2&gt;
  
  
  Dapr
&lt;/h2&gt;

&lt;p&gt;Dapr is a portable, event-driven runtime that makes it easy for developers to build resilient, microservices-based applications. Dapr provides a set of building blocks that developers can use to build their applications, including state management, pub/sub messaging, and service invocation. These building blocks are designed to be language-agnostic, so developers can use them with any programming language.&lt;/p&gt;

&lt;p&gt;Dapr uses the &lt;a href="https://kubernetes.io/docs/concepts/workloads/pods/sidecar-containers/" rel="noopener noreferrer"&gt;sidecar&lt;/a&gt; pattern. Basically, our microservice doesn't talk to the Redis cache or the API service directly. Instead, it talks to the Dapr sidecar, which in turn talks to the cache or the API service. This allows Dapr to provide a consistent interface for all the building blocks, regardless of the underlying implementation.&lt;/p&gt;
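Concretely, talking to the sidecar is just an HTTP call to localhost. Here is a small JavaScript sketch of building the URL for Dapr's publish API; the component and topic names are illustrative, and 3500 is Dapr's default sidecar HTTP port:

```javascript
// Build the Dapr publish URL: the app only ever talks to its local sidecar,
// which forwards the event to whatever pub/sub component the name maps to.
function daprPublishUrl(sidecarPort, pubsubName, topic) {
  return `http://localhost:${sidecarPort}/v1.0/publish/${pubsubName}/${topic}`;
}

// POST the event payload to this URL with fetch or any HTTP client.
const url = daprPublishUrl(3500, "queue", "topic");
```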

&lt;p&gt;What's awesome about Dapr is the ability to define a building block, say a pub/sub component, in a YAML file and then use it from any programming language. This is possible because Dapr runs as a sidecar alongside the application, providing the building blocks as a service. These YAML files can be swapped per environment, making it easy to switch from a Redis queue to an Azure Service Bus &lt;strong&gt;without changing any code!&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Take a look at this example of an API that subscribes to a topic, and guess which pub/sub component I'm using:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Services&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddDaprClient&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

&lt;span class="p"&gt;...&lt;/span&gt;

&lt;span class="n"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;UseCloudEvents&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="n"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;UseEndpoints&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;endpoints&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;endpoints&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;MapSubscribeHandler&lt;/span&gt;&lt;span class="p"&gt;());&lt;/span&gt;

&lt;span class="n"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;MapPost&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"/"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nf"&gt;Topic&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"queue"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"topic"&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;DaprClient&lt;/span&gt; &lt;span class="n"&gt;daprClient&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; 
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="p"&gt;...&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The answer is: it depends. Locally, it is using a Redis pub/sub running in a container, but when it's deployed to the cloud, it's using Azure Service Bus. The only thing that changes is the YAML file that defines the pub/sub component.&lt;/p&gt;
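To make the swap concrete, here is what such a component file could look like. This is an illustrative sketch following Dapr's component schema, not a file from the article's repository; the host value is a placeholder, and the component name matches the `queue` name used in the `[Topic]` attribute above:

```yaml
# Illustrative Dapr pub/sub component for local development.
# Swapping spec.type to pubsub.azure.servicebus.topics (plus its
# connection metadata) switches to Azure Service Bus with no code changes.
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: queue
spec:
  type: pubsub.redis
  version: v1
  metadata:
  - name: redisHost
    value: localhost:6379
```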

&lt;p&gt;Moreover, Dapr offers the ability to implement service-to-service invocation via HTTP. If we wanted to make the earlier WeatherApiClient call to the ApiService through Dapr, we would do it like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Using generic http client&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;HttpClient&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"http://localhost:&amp;lt;DAPR_SIDECAR_PORT&amp;gt;/v1.0/invoke/apiservice/method/weatherforecast"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;// using Dapr client (SDK specific)&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;_daprClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;InvokeMethodAsync&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;WeatherForecast&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="k"&gt;]&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"apiservice"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"weatherforecast"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Overlap or complementary?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://learn.microsoft.com/en-us/dotnet/aspire/frameworks/dapr?tabs=dotnet-cli" rel="noopener noreferrer"&gt;Dapr can be integrated into .NET Aspire&lt;/a&gt;. We can define a Dapr building block in a YAML file and then use it in our .NET Aspire application.&lt;/p&gt;

&lt;p&gt;Here's an example of how to add Dapr components to a .NET Aspire application, but for the full picture please refer to the &lt;a href="https://learn.microsoft.com/en-us/dotnet/aspire/frameworks/dapr?tabs=dotnet-cli" rel="noopener noreferrer"&gt;official documentation&lt;/a&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;secretstore&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddDaprComponent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"secretstore"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"secretstores.azure.keyvault"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="n"&gt;DaprComponentOptions&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="n"&gt;LocalPath&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"..\\.dapr\\components\\secretstore.yml"&lt;/span&gt;&lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;pubsub&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddDaprPubSub&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"pubsub"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="n"&gt;DaprComponentOptions&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="n"&gt;LocalPath&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;".\\.dapr\\components\\pubsub.yml"&lt;/span&gt;&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Cloud component -&amp;gt; complementary
&lt;/h3&gt;

&lt;p&gt;Aspire can define resources, inject their connection strings where needed, and also create those resources when deploying to the cloud via the Azure Developer CLI. Dapr can reference resources already created, defining endpoints and configurations in YAML files that can easily be swapped.&lt;/p&gt;

&lt;p&gt;In situations where we are testing our application locally and don't want to leverage cloud resources, we could easily reference the local Redis cache container as a pub/sub handler and then switch to Azure Service Bus when deploying to the cloud. When we do this, we can still leverage the Aspire integration with the Azure Developer CLI to create all of our resources during deployment.&lt;/p&gt;

&lt;h3&gt;
  
  
  Service to service invocation -&amp;gt; overlap (?)
&lt;/h3&gt;

&lt;p&gt;As I mentioned, .NET Aspire will provide us with the correct connection string for whichever service we need to invoke. Dapr can do the same thing, &lt;strong&gt;but it will do it via the sidecar&lt;/strong&gt;. This is something we need to keep in mind when making our decision.&lt;/p&gt;

&lt;p&gt;Here's a schema of how Dapr implements service-to-service invocation:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw25oec4xfyjsqsfw7zc2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw25oec4xfyjsqsfw7zc2.png" alt="Dapr service-to-service invocation" width="800" height="342"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Service A makes an HTTP or gRPC call targeting Service B. The call goes to the local Dapr sidecar.&lt;/li&gt;
&lt;li&gt;Dapr discovers Service B’s location using the name resolution component, which is running on the given hosting platform.&lt;/li&gt;
&lt;li&gt;Dapr forwards the message to Service B’s Dapr sidecar. Note: All calls between Dapr sidecars go over gRPC for performance. Only calls between services and Dapr sidecars can be either HTTP or gRPC.&lt;/li&gt;
&lt;li&gt;Service B’s Dapr sidecar forwards the request to the specified endpoint (or method) on Service B. Service B then runs its business logic code.&lt;/li&gt;
&lt;li&gt;Service B sends a response to Service A. The response goes to Service B’s sidecar.&lt;/li&gt;
&lt;li&gt;Dapr forwards the response to Service A’s Dapr sidecar.&lt;/li&gt;
&lt;li&gt;Service A receives the response.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;These extra hops can take a toll on performance, but the approach is platform-agnostic: no matter which platform our distributed application runs on, service-to-service invocation doesn't need to take it into consideration. Also, all these HTTP/gRPC requests are performed on localhost.&lt;/p&gt;

&lt;p&gt;On the other hand, if we leverage the .NET Aspire and Azure Developer CLI integration, we can deploy our application to Azure Container Apps and always have the correct connection string injected.&lt;br&gt;
When running the &lt;code&gt;&amp;lt;Project_Name.AppHost&amp;gt;&lt;/code&gt; locally, all these endpoints will be &lt;code&gt;http://localhost:&amp;lt;PORT&amp;gt;&lt;/code&gt;. When deploying to Azure Container Apps, all these endpoints will be those of the correct Container App instances. Our code won't change because we are referencing an injected environment variable and we don't need to know the actual value of the endpoints.&lt;/p&gt;
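
&lt;p&gt;From a service's point of view this is transparent: the endpoint is read from the environment, so the same code works locally and on Azure Container Apps. A small sketch, assuming a referenced resource named &lt;code&gt;apiservice&lt;/code&gt; (the variable follows the &lt;code&gt;services__*&lt;/code&gt; naming convention Aspire uses for injected endpoints):&lt;/p&gt;

```shell
# Simulate the endpoint variable .NET Aspire injects for a referenced service.
# Locally it resolves to a localhost URL; on Azure Container Apps, to the app's ingress.
export services__apiservice__http__0="http://localhost:5223"   # simulated injected value

API_ENDPOINT="${services__apiservice__http__0}"
echo "Calling the backend at: $API_ENDPOINT"
```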

&lt;p&gt;In conclusion, .NET Aspire and Dapr are two different projects that complement each other. .NET Aspire streamlines the path from local development to the cloud, providing the tools needed to build modern, resilient distributed applications, while Dapr helps keep those applications flexible and as cloud-agnostic as possible. By using them together, developers can take advantage of the strengths of each.&lt;/p&gt;

</description>
      <category>dapr</category>
      <category>microservices</category>
      <category>architecture</category>
      <category>cloud</category>
    </item>
    <item>
      <title>.NET Aspire for Data Chatting</title>
      <dc:creator>Tommaso Stocchi</dc:creator>
      <pubDate>Thu, 19 Sep 2024 13:26:53 +0000</pubDate>
      <link>https://dev.to/tommasodotnet/net-aspire-for-data-chatting-3ghe</link>
      <guid>https://dev.to/tommasodotnet/net-aspire-for-data-chatting-3ghe</guid>
      <description>&lt;p&gt;.NET Aspire is a great way to orchestrate our distributed applications. It helps from the local development process right to the cloud deployment. And it works with &lt;a href="https://dev.to/tommasodotnet/net-aspire-multilanguage-57l3"&gt;different coding&lt;/a&gt; languages as well! Let's use it to build something a little bit more complex than a distributed application as we've seen the last time. Let's build a chat application! Specifically, I want to be able to chat with my database.&lt;/p&gt;

&lt;h2&gt;
  
  
  The idea
&lt;/h2&gt;

&lt;p&gt;With the growth of AI, we can see more and more chatbots around. They are used in many different scenarios, from customer service to personal assistants. But what if we could chat with our database? We could ask it to retrieve some data, update some records, or even create new ones. This is the idea behind this project. We will build a chat application that allows us to interact with our database using natural language.&lt;/p&gt;

&lt;p&gt;In order to achieve this, we will use the following technologies:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://learn.microsoft.com/en-us/dotnet/aspire/get-started/aspire-overview" rel="noopener noreferrer"&gt;.NET Aspire&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://learn.microsoft.com/en-us/semantic-kernel/overview/" rel="noopener noreferrer"&gt;Semantic Kernel&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://learn.microsoft.com/en-us/azure/data-api-builder/overview" rel="noopener noreferrer"&gt;Data API Builder (DAB)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I've already discussed .NET Aspire in a previous article, so I won't go into details here. Semantic Kernel is a lightweight, open-source development kit that lets you easily build AI agents and integrate the latest AI models into your C#, Python, or Java codebase.&lt;/p&gt;

&lt;p&gt;Data API Builder is a tool that allows you to create RESTful and GraphQL APIs for your database without writing any code.&lt;/p&gt;
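
&lt;p&gt;As a quick taste of the "no code" part: once an entity is configured, DAB serves it over REST right away. A sketch, assuming DAB's default &lt;code&gt;/api&lt;/code&gt; base path and a &lt;code&gt;Customer&lt;/code&gt; entity like the one configured below:&lt;/p&gt;

```shell
# Each configured entity is exposed over REST under the default /api base path
ENTITY="Customer"                               # entity name from the DAB configuration
REST_URL="http://localhost:5000/api/${ENTITY}"  # port 5000 matches the DAB endpoint used later
echo "$REST_URL"

# REST query options such as $first, $filter and $orderby mirror the GraphQL ones:
# curl -s "$REST_URL" --get --data-urlencode '$first=5'
```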

&lt;p&gt;As I mentioned, I want to be able to talk to my database using natural language. To do this, we will use Semantic Kernel to process the text input and generate a query that can be executed by DAB. The response from DAB will be sent back to the user, completing the loop. Of course, we will need to leverage Semantic Kernel's capability to use plugins and the GraphQL API provided by DAB. &lt;/p&gt;

&lt;p&gt;GraphQL is a query language for your API that lets you request only the data you need; therefore, Semantic Kernel can generate a query tailored exactly to the user's input.&lt;/p&gt;
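
&lt;p&gt;For instance, a request for just a handful of customer fields is a single small POST body. A sketch of the kind of query Semantic Kernel might produce - the &lt;code&gt;customers&lt;/code&gt; field name is an assumption based on how DAB pluralizes entities, while the payload shape is standard GraphQL over HTTP:&lt;/p&gt;

```shell
# A GraphQL POST body that asks only for the fields we need
QUERY='query { customers(first: 3) { items { CustomerID FirstName LastName } } }'
PAYLOAD="{\"query\": \"$QUERY\"}"
echo "$PAYLOAD"

# With DAB running, this is a single call:
# curl -s -X POST http://localhost:5000/graphql -H 'Content-Type: application/json' -d "$PAYLOAD"
```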

&lt;h2&gt;
  
  
  Data API Builder
&lt;/h2&gt;

&lt;p&gt;Let's start with the core of our application: Data API Builder. DAB is cross-platform, open-source, and independent of language, technology, and frameworks. It's free, with no premium tier, and can run in any cloud. Best of all, it doesn't require any coding! We can either use the &lt;a href="https://learn.microsoft.com/en-us/azure/data-api-builder/how-to-install-cli" rel="noopener noreferrer"&gt;dab-cli&lt;/a&gt; or write the JSON configuration file ourselves to have the API up and running in no time.&lt;/p&gt;

&lt;p&gt;For this project, I have created a database from the &lt;a href="https://learn.microsoft.com/en-us/sql/samples/adventureworks-install-configure?view=sql-server-ver16&amp;amp;tabs=ssms" rel="noopener noreferrer"&gt;AdventureWorks template&lt;/a&gt;. Using the dab-cli, I can simply run a few commands to initialize the configuration file&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;dab init &lt;span class="nt"&gt;--database-type&lt;/span&gt; &lt;span class="s2"&gt;"mssql"&lt;/span&gt; &lt;span class="nt"&gt;--connection-string&lt;/span&gt; &lt;span class="s2"&gt;"@env('ConnectionStrings__sql')"&lt;/span&gt;

dab add &lt;span class="s2"&gt;"Customer"&lt;/span&gt; &lt;span class="nt"&gt;--source&lt;/span&gt; &lt;span class="s2"&gt;"SalesLT.Customer"&lt;/span&gt; &lt;span class="nt"&gt;--permissions&lt;/span&gt; &lt;span class="s2"&gt;"anonymous:*"&lt;/span&gt;
dab add &lt;span class="s2"&gt;"ProductModel"&lt;/span&gt; &lt;span class="nt"&gt;--source&lt;/span&gt; &lt;span class="s2"&gt;"SalesLT.ProductModel"&lt;/span&gt; &lt;span class="nt"&gt;--permissions&lt;/span&gt; &lt;span class="s2"&gt;"anonymous:*"&lt;/span&gt;
...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and some others to define the relationships between the tables&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;dab update Address &lt;span class="nt"&gt;--relationship&lt;/span&gt; customerAddresses &lt;span class="nt"&gt;--target&lt;/span&gt;.entity CustomerAddress &lt;span class="nt"&gt;--cardinality&lt;/span&gt; many &lt;span class="nt"&gt;--relationship&lt;/span&gt;.fields &lt;span class="s2"&gt;"AddressID:AddressID"&lt;/span&gt;
dab update Customer &lt;span class="nt"&gt;--relationship&lt;/span&gt; customerAddresses &lt;span class="nt"&gt;--target&lt;/span&gt;.entity CustomerAddress &lt;span class="nt"&gt;--cardinality&lt;/span&gt; many &lt;span class="nt"&gt;--relationship&lt;/span&gt;.fields &lt;span class="s2"&gt;"CustomerID:CustomerID"&lt;/span&gt;
...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will generate a JSON file that looks something like this&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"$schema"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://github.com/Azure/data-api-builder/releases/download/v1.1.7/dab.draft.schema.json"&lt;/span&gt;&lt;span class="err"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"data-source"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"database-type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"mssql"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"connection-string"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"@env('ConnectionStrings__sqldb')"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"options"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"set-session-context"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="err"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"runtime"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"graphql"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"enabled"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"path"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"/graphql"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"allow-introspection"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="err"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"entities"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Customer"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"source"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"object"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"SalesLT.Customer"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"table"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"graphql"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"enabled"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"singular"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Customer"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"plural"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Customers"&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"permissions"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"role"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"anonymous"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"actions"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
              &lt;/span&gt;&lt;span class="nl"&gt;"action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"*"&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"relationships"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"customerAddresses"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"cardinality"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"many"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"target.entity"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"CustomerAddress"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"source.fields"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="s2"&gt;"CustomerID"&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"target.fields"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="s2"&gt;"CustomerID"&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Using DAB we can define everything: authentication, authorization, data access roles, row-level security, and more.&lt;/p&gt;

&lt;h2&gt;
  
  
  Semantic Kernel
&lt;/h2&gt;

&lt;p&gt;Because we want to chat with our database, we need to process the text input and generate a query that DAB can execute. This is where Semantic Kernel comes in: we can add an OpenAPI plugin to Semantic Kernel so that it can invoke an API. Earlier, we defined a GraphQL API using DAB; we can describe that API in an OpenAPI file written in YAML so that Semantic Kernel can consume it.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;openapi&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;3.0.1&lt;/span&gt;
&lt;span class="na"&gt;info&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;title&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Data API builder - Graphql Endpoint&lt;/span&gt;
  &lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;0.12.0&lt;/span&gt;
&lt;span class="na"&gt;servers&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;url&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;http://localhost:5000&lt;/span&gt;
&lt;span class="na"&gt;paths&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;/graphql&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;post&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;a graphql endpoint.&lt;/span&gt;
      &lt;span class="na"&gt;operationId&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;graphql_post&lt;/span&gt;
      &lt;span class="s"&gt;...&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now the only thing left is importing the plugin into Semantic Kernel&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;kernel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ImportPluginFromOpenApiAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;pluginName&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"GQLPlugin"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;filePath&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Path&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Combine&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;AppDomain&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CurrentDomain&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;BaseDirectory&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;@"Plugins/DAB_GraphQL/openapi.yml"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="n"&gt;executionParameters&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;OpenApiFunctionExecutionParameters&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="n"&gt;ServerUrlOverride&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;Uri&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Environment&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetEnvironmentVariable&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"services__dab__http__0"&lt;/span&gt;&lt;span class="p"&gt;)!),&lt;/span&gt;
        &lt;span class="n"&gt;EnableDynamicPayload&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;IgnoreNonCompliantErrors&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;EnablePayloadNamespacing&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and of course a good system prompt to make sure it will use the plugin correctly. &lt;/p&gt;

&lt;p&gt;The most important thing is that the Kernel doesn't try to guess the GraphQL schema, but rather uses introspection to understand it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="nn"&gt;...&lt;/span&gt;
&lt;span class="c1"&gt;### on the graphql_post tool:&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;the grapghql_post tool provides access to data useful to answer user question.&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;the graphql_post tool is a graphql endpoint.&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="err"&gt;*&lt;/span&gt;&lt;span class="nv"&gt;*Do&lt;/span&gt; &lt;span class="s"&gt;not** guess schema type name, relationships or data types.&lt;/span&gt; 
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;You **must** use introspection to understand the whole schema before planning the query you will execute.&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="err"&gt;*&lt;/span&gt;&lt;span class="nv"&gt;*Remember&lt;/span&gt;&lt;span class="err"&gt;**&lt;/span&gt; &lt;span class="s"&gt;names, including field names and type names, are case sensitive in grapql.&lt;/span&gt;   
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;when composing the graphql request you can use **only** the following parameters&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;filter, orderBy, first and after .&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;when composing the graphql request you can use **only** the following operators&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;eq,neq,gt,gte,lt,lte,isnull,contains,notcontains,startsWith,endsWith,and,or .&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;mutation operations are not allowed.&lt;/span&gt;
&lt;span class="nn"&gt;...&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
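
&lt;p&gt;Introspection itself is just another GraphQL query, so the model can discover type and field names before planning anything. A minimal sketch of such a request (this is standard GraphQL introspection, not DAB-specific):&lt;/p&gt;

```shell
# The __schema meta-field lists every type and its fields
INTROSPECTION='query { __schema { types { name fields { name } } } }'
PAYLOAD="{\"query\": \"$INTROSPECTION\"}"
echo "$PAYLOAD"

# curl -s -X POST http://localhost:5000/graphql -H 'Content-Type: application/json' -d "$PAYLOAD"
```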



&lt;p&gt;Here's the result of a simple request:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvcn5he4cmzz6fgao04ae.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvcn5he4cmzz6fgao04ae.png" alt="Chat Screenshot" width="800" height="562"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As you can see, I let the agent outline its reasoning and give a somewhat formatted output. Of course, for business scenarios, we will need to cut out the reasoning and show the results in a more user-friendly way.&lt;/p&gt;

&lt;h2&gt;
  
  
  Putting it all together
&lt;/h2&gt;

&lt;p&gt;I mentioned at the start that this whole project relies on .NET Aspire. It's the orchestrator that makes sure the correct services are running and the correct endpoints are injected where needed.&lt;/p&gt;

&lt;p&gt;Of course, Data API Builder can be run in a container, so our App Host becomes pretty simple:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;DistributedApplication&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;CreateBuilder&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;sql&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddConnectionString&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"sqldb"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;dabService&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddContainer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"dab"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"mcr.microsoft.com/azure-databases/data-api-builder"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithHttpEndpoint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;targetPort&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;5000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"http"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithBindMount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;@"D:\src\distributed-dab-sk\aw-data-api-builder\dab-config.json"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"/App/dab-bm0/dab-config.json"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;true&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithArgs&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"--ConfigFileName"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"./dab-bm0/dab-config.json"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithReference&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sql&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithOtlpExporter&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;PublishAsContainer&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;dabServiceEndpoint&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;dabService&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetEndpoint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"http"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AddProject&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;Projects&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;DataAPIBuilder_AI_Web&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;(&lt;/span&gt;&lt;span class="s"&gt;"webfrontend"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithExternalHttpEndpoints&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithReference&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;dabServiceEndpoint&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Build&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;Run&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Notice how the connection string for the database is injected into the DAB container, and the endpoint of the DAB container is injected into the web frontend. That injected variable overrides the server URL in the OpenAPI plugin file mentioned above.&lt;/p&gt;

&lt;p&gt;Having this, we can leverage the .NET Aspire manifest to deploy the whole application to the cloud, hosting it on Azure Container Apps. &lt;br&gt;
We now have a chatbot for our database, that requires no coding when new tables are added and that runs in a serverless and scalable environment!&lt;/p&gt;

&lt;p&gt;You can find the full code along with some more technical details in the &lt;a href="https://github.com/tommasodotNET/distributed-dab-sk" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>dotnet</category>
      <category>database</category>
      <category>chatgpt</category>
      <category>aspire</category>
    </item>
    <item>
      <title>.NET Aspire Multilanguage</title>
      <dc:creator>Tommaso Stocchi</dc:creator>
      <pubDate>Sun, 08 Sep 2024 12:40:54 +0000</pubDate>
      <link>https://dev.to/tommasodotnet/net-aspire-multilanguage-57l3</link>
      <guid>https://dev.to/tommasodotnet/net-aspire-multilanguage-57l3</guid>
      <description>&lt;p&gt;In this article I want to explore how we can leverage a tool such as .NET Aspire to improve the way we build and deploy distributed applications, even when we work with languages and frameworks other than .NET.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is .NET Aspire
&lt;/h2&gt;

&lt;p&gt;.NET Aspire is an opinionated, &lt;strong&gt;cloud-ready stack&lt;/strong&gt; designed to improve the experience of building observable, production-ready, distributed applications. It is delivered through a collection of NuGet packages that handle specific cloud-native concerns. &lt;/p&gt;

&lt;p&gt;.NET Aspire is designed to help you with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Orchestration&lt;/strong&gt;: .NET Aspire provides features for running and connecting multi-project applications and their dependencies for local development environments.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integrations&lt;/strong&gt;: .NET Aspire integrations are NuGet packages for commonly used services, such as Redis or Postgres, with standardized interfaces ensuring they connect consistently and seamlessly with your app.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tooling&lt;/strong&gt;: .NET Aspire comes with project templates and tooling experiences for Visual Studio, Visual Studio Code, and the dotnet CLI to help you create and interact with .NET Aspire projects.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The core of .NET Aspire can be found in the AppHost project that's created when you start working with Aspire from the .NET template. The AppHost project defines the components of our distributed application, from services - such as a backend or frontend - to resources - such as databases or caches.&lt;br&gt;
This project can generate a manifest that describes our application from an infrastructure perspective. The Azure Developer CLI can therefore interpret that manifest to deploy the application to Azure, without the need to write any Infrastructure as Code.&lt;/p&gt;
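
&lt;p&gt;Generating that manifest is a single command against the AppHost - the project path below is hypothetical, while the publisher flags are the documented ones:&lt;/p&gt;

```shell
# The AppHost ships with a built-in "manifest" publisher; compose the command without running it here
APP_HOST="./MyApp.AppHost/MyApp.AppHost.csproj"   # hypothetical project path
MANIFEST_CMD="dotnet run --project $APP_HOST -- --publisher manifest --output-path ../aspire-manifest.json"
echo "$MANIFEST_CMD"
```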
&lt;h2&gt;
  
  
  Work with different languages
&lt;/h2&gt;

&lt;p&gt;Of course, when we work with distributed applications and microservices, we might encounter scenarios in which different teams prefer to write code in different languages. And of course .NET Aspire belongs to the .NET ecosystem. Nonetheless, it is designed to integrate with different languages and frameworks.&lt;/p&gt;
&lt;h3&gt;
  
  
  Distributed Calculator
&lt;/h3&gt;

&lt;p&gt;Let's consider a simple example. We have a distributed calculator that is composed of different services: a frontend service and a backend for each operation.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;frontend&lt;/strong&gt;: react&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;subtraction&lt;/strong&gt;: dotnet&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;addition&lt;/strong&gt;: go&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;multiplication&lt;/strong&gt;: python&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;division&lt;/strong&gt;: nodejs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We also have a Redis cache to store the state of the calculator.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq220hjw6vezpxzgtxj5m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq220hjw6vezpxzgtxj5m.png" alt="Architecture" width="242" height="280"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this scenario we can use .NET Aspire to build the distributed calculator, leveraging all the tools and integrations that come with it. Aspire offers native support for certain languages, but we can also extend it to support other languages using containers.&lt;/p&gt;

&lt;p&gt;At the time of writing, .NET Aspire supports the following languages:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://learn.microsoft.com/en-us/dotnet/aspire/get-started/build-aspire-apps-with-nodejs" rel="noopener noreferrer"&gt;Multiple JS frameworks (React, Angular, Vue, NodeJS)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://learn.microsoft.com/en-us/dotnet/aspire/get-started/build-aspire-apps-with-python?tabs=bash" rel="noopener noreferrer"&gt;Python&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here's how we can configure all the backend services in the AppHost project:&lt;/p&gt;
&lt;h4&gt;
  
  
  Golang
&lt;/h4&gt;

&lt;p&gt;Golang isn't natively supported, so we will add it as a container. Note that we can use different images depending on whether the AppHost is running to publish the manifest or for local development.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Configure Adder in Go&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="k"&gt;add&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ExecutionContext&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;IsPublishMode&lt;/span&gt;
    &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddContainer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"addapp"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"acrt6xtihl2b3uxe.azurecr.io/addapp"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddContainer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"addapp"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"addapp"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
        &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithHttpEndpoint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;targetPort&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;6000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;env&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"APP_PORT"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"http"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithOtlpExporter&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithEnvironment&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"OTEL_SERVICE_NAME"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"addapp"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;PublishAsContainer&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;addEnpoint&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;add&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetEndpoint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"http"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Python
&lt;/h4&gt;

&lt;p&gt;Python is natively supported, so we can use the AddPythonProject method to configure the multiplier service. Please follow &lt;a href="https://learn.microsoft.com/en-us/dotnet/aspire/get-started/build-aspire-apps-with-python?tabs=powershell#initialize-the-python-virtual-environment" rel="noopener noreferrer"&gt;this tutorial&lt;/a&gt; to correctly configure the Python project.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Configure Multiplier in Python&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;multiply&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddPythonProject&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"multiplyapp"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"../../python-multiplier"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"app.py"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithHttpEndpoint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;targetPort&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;5001&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;env&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"APP_PORT"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"http"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithEnvironment&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"OTEL_SERVICE_NAME"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"multiplyapp"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;PublishAsDockerFile&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  NodeJS
&lt;/h4&gt;

&lt;p&gt;NodeJS is natively supported, so we can use the AddNodeApp method to configure the divider service.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Configure Divider in NodeJS&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;divide&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddNodeApp&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"divideapp"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;scriptPath&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"app.js"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;workingDirectory&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"../../node-divider"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithHttpEndpoint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;targetPort&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;4000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;env&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"APP_PORT"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"http"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithEnvironment&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"OTEL_SERVICE_NAME"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"divideapp"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;PublishAsDockerFile&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  .NET
&lt;/h4&gt;

&lt;p&gt;No surprise here, .NET Aspire natively supports .NET, so we can use the AddProject method to configure the subtractor service.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Configure Subtractor in .NET&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;subtract&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AddProject&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;Projects&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;dotnet_subtractor&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;(&lt;/span&gt;&lt;span class="s"&gt;"subtractapp"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithReference&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;insights&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithEnvironment&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"OTEL_SERVICE_NAME"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"subtractapp"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Redis
&lt;/h4&gt;

&lt;p&gt;The Redis cache can be easily configured using the AddRedis method or, in this scenario, via Dapr with the AddDaprStateStore method.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Configure Dapr State Store&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;stateStore&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddDaprStateStore&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"statestore"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Dapr is not the focus of this article, but it is worth mentioning that .NET Aspire can be used in conjunction with Dapr to build distributed applications. For further information regarding Dapr, please reference the &lt;a href="https://docs.dapr.io/?_gl=1*rh6dgv*_ga*MjEwMjM4NDMyMS4xNzI1MDEyOTU2*_ga_60C6Q1ETC1*MTcyNTAxMjk1Ni4xLjAuMTcyNTAxMjk1Ni4wLjAuMA.." rel="noopener noreferrer"&gt;official documentation&lt;/a&gt;.&lt;/p&gt;

&lt;h4&gt;
  
  
  React
&lt;/h4&gt;

&lt;p&gt;Lastly, we can configure the frontend service using the AddNpmApp method.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Configure Frontend in React&lt;/span&gt;
&lt;span class="n"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;AddNpmApp&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"calculator-front-end"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;workingDirectory&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"../../react-calculator"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithDaprSidecar&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="n"&gt;DaprSidecarOptions&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="n"&gt;AppPort&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="m"&gt;3000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;AppProtocol&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"http"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;DaprHttpPort&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="m"&gt;3500&lt;/span&gt;
    &lt;span class="p"&gt;})&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithEnvironment&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"DAPR_HTTP_PORT"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"3500"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithReference&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;addEnpoint&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithReference&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;multiply&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithReference&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;divide&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithReference&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;subtract&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithReference&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;stateStore&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithReference&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;insights&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithHttpEndpoint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;targetPort&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;3000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;env&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"PORT"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithExternalHttpEndpoints&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WithEnvironment&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"OTEL_SERVICE_NAME"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"calculator-front-end"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;PublishAsDockerFile&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Since we are referencing all the previously configured services, we can easily connect them to the frontend service. When we need to invoke the adder from the frontend, we can do so by using the environment variable that Aspire has injected:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/calculate/add&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;serviceUrl&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;services__addapp__http__0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;appResponse&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;axios&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;serviceUrl&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/add`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

      &lt;span class="c1"&gt;// Return expected string result to client&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;send&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;appResponse&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; 
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;err&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;err&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
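The injected variable name follows the pattern `services__{name}__{endpoint}__0`. As a sketch, a small helper (hypothetical, with an illustrative local fallback for running outside of Aspire) can make these lookups explicit:

```javascript
// Hypothetical helper: resolve the URL Aspire injects for a named
// service endpoint. The fallback URL is illustrative, for when the
// app runs outside of the Aspire AppHost.
function serviceUrl(name, endpoint = 'http', fallback = 'http://localhost:6000') {
  const key = `services__${name}__${endpoint}__0`;
  return process.env[key] ?? fallback;
}

// e.g. returns the value of services__addapp__http__0 when set
console.log(serviceUrl('addapp'));
```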



&lt;h3&gt;
  
  
  Further considerations
&lt;/h3&gt;

&lt;p&gt;The entire application can be deployed to Azure using the Azure Developer CLI. The CLI will read the manifest generated by the AppHost project and deploy the application to Azure, creating all the necessary resources. To learn how to integrate Aspire with the Azure Developer CLI, please reference the &lt;a href="https://learn.microsoft.com/en-us/dotnet/aspire/deployment/azure/aca-deployment-azd-in-depth?tabs=windows" rel="noopener noreferrer"&gt;official tutorial&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;All the code for the distributed calculator can be found in the &lt;a href="https://github.com/tommasodotNET/aspire-multilanguage" rel="noopener noreferrer"&gt;Aspire Multilanguage repository&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>cloud</category>
      <category>dotnet</category>
      <category>python</category>
    </item>
    <item>
      <title>GitHub Copilot vs ChatGPT</title>
      <dc:creator>Tommaso Stocchi</dc:creator>
      <pubDate>Sat, 25 May 2024 06:30:00 +0000</pubDate>
      <link>https://dev.to/tommasodotnet/github-copilot-vs-chatgptgithub-4kb5</link>
      <guid>https://dev.to/tommasodotnet/github-copilot-vs-chatgptgithub-4kb5</guid>
<description>&lt;p&gt;In this article I would like to analyze the differences between two tools that seem to overlap: GitHub Copilot and ChatGPT. What are the fundamental differences between the two? Which one should you choose? And do you really have to choose?&lt;/p&gt;

&lt;h2&gt;
  
  
  ChatGPT
&lt;/h2&gt;

&lt;p&gt;Let's start by analyzing ChatGPT. It is a web portal where you can start a chat with a Large Language Model (LLM). There are several &lt;a href="https://openai.com/chatgpt/pricing" rel="noopener noreferrer"&gt;ChatGPT tiers&lt;/a&gt;:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv863rha4edbt71k6btuw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv863rha4edbt71k6btuw.png" alt="Screenshot of the ChatGPT pricing page" width="602" height="300"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With the free tier, we have a fairly minimal experience that allows us to interact with the GPT-3.5 model.&lt;/p&gt;

&lt;p&gt;Things start to get interesting with the Plus tier, which offers the possibility to interact with the GPT-4 model and also gives us access to the web. Access to the web is important because, when we ask the model a question, it answers based on the "native knowledge" derived from its training. Say the model was trained on web data up to 2021 and we ask it who the Prime Minister of the United Kingdom is: it would answer Boris Johnson (the Prime Minister in office at that time). If we gave the same model access to the web, it would give us the correct answer: Rishi Sunak (the Prime Minister in office at the time of writing this article).&lt;/p&gt;

&lt;p&gt;The third tier, Team, in addition to allowing interaction with other models such as DALL-E, guarantees that the data sent through requests will not be used to retrain the model.&lt;/p&gt;

&lt;h2&gt;
  
  
  GitHub Copilot
&lt;/h2&gt;

&lt;p&gt;GitHub Copilot is a fine-tuning of the GPT-4 model for code. Fine-tuning refers to the ability to train a model by specializing it for a specific scenario, in this case working on code. The basic capabilities are therefore the same as GPT-4, which is already highly capable of working on code, with a specific specialization on this feature.&lt;/p&gt;

&lt;p&gt;Just like ChatGPT, GitHub Copilot also offers &lt;a href="https://github.com/features/copilot/plans" rel="noopener noreferrer"&gt;different pricing tiers&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs0bdh7p0fsxgxsv9v81h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs0bdh7p0fsxgxsv9v81h.png" alt="Screenshot of the Copilot pricing page" width="602" height="784"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For the Individual and Business tiers, the difference in features is mostly related to "Management and policies". The Individual tier is aimed at individual users, while the Business tier targets corporate scenarios, where centralized user and policy management provides a significant advantage for tool administrators.&lt;/p&gt;

&lt;p&gt;I will dedicate a separate paragraph to the Enterprise tier later in this article.&lt;/p&gt;

&lt;h2&gt;
  
  
  Terms and Conditions
&lt;/h2&gt;

&lt;p&gt;Another fundamental difference between the two tools can be found in the Terms &amp;amp; Conditions (T&amp;amp;C). GitHub Copilot's terms and conditions of use ensure that the underlying model will not be retrained using user-inputted data. Essentially, even in the Individual tier, when GitHub Copilot analyzes your code to provide you with answers and suggestions, it does not use the portions of code analyzed to retrain its algorithm, thereby preserving intellectual property.&lt;/p&gt;

&lt;p&gt;Regarding ChatGPT, this applies starting from the Team tier.&lt;/p&gt;

&lt;p&gt;From the perspective of Copilot in Edge, however, Commercial Data Protection is guaranteed for the types of accounts listed in the first paragraph &lt;a href="https://learn.microsoft.com/en-us/copilot/manage" rel="noopener noreferrer"&gt;of this link&lt;/a&gt;, and only when accessing with the company account and not the personal account.&lt;/p&gt;

&lt;p&gt;Due to data protection concerns, for professional use I would never recommend a tier that does not offer data protection features. For this reason, from now on we will compare the different tiers of GitHub Copilot with ChatGPT Team.&lt;/p&gt;

&lt;h2&gt;
  
  
  IDE Integration
&lt;/h2&gt;

&lt;p&gt;The main advantage of GitHub Copilot is its integration with the IDE: it was born as a tool that suggests code to the developer in real time as he or she writes. It infers from the context of what has already been written, and what is being written, to proactively suggest entire portions of code.&lt;/p&gt;

&lt;p&gt;Over time, GitHub Copilot has evolved by adding several features, in addition to the Code Completion we just talked about: Chat and Smart Actions.&lt;/p&gt;

&lt;p&gt;We can imagine the Chat feature as an implementation of a ChatGPT-like scenario. However, since the model is specialized in code, its field is restricted: if we ask GitHub Copilot who the Prime Minister of the United Kingdom is, it will respond:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F69mqyxi4wz7ihh1ti044.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F69mqyxi4wz7ihh1ti044.png" alt="Copilot responds " width="602" height="195"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If ChatGPT can answer both code and general questions, what's the advantage of using GitHub Copilot over ChatGPT?&lt;/p&gt;

&lt;p&gt;Keeping in mind that this question compares a feature of a larger product (GitHub Copilot is not just its chat) to a complete product (ChatGPT is a chat service), the strengths of GitHub Copilot lie in its integration with the IDE.&lt;/p&gt;

&lt;p&gt;Without leaving the Visual Studio Code screen (or Visual Studio or JetBrains), we can select portions of code and ask direct questions to our pair programming assistant. From the backend perspective, the question posed to the model will therefore contain:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;our context&lt;/strong&gt;: the selected code&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;our question&lt;/strong&gt;, for example "explain to me what this portion of code does"&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;the system prompt&lt;/strong&gt;: a base prompt defined and written on the backend that surrounds the question we have asked; it contains the basic instructions. In the simplest cases, we can think of the system prompt as a series of indications such as "You are a very helpful virtual assistant. You always try to explain things in a complete but never verbose way, and you are able to schematize complex concepts to make them easier to understand". This is a remarkably simple system prompt; GitHub Copilot's will clearly be more complex and will contain instructions such as "Only respond to questions related to the programming world", which generates responses like the one in the screenshot above.&lt;/li&gt;
&lt;/ul&gt;
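Conceptually, these three pieces are combined into a single request to the model. A minimal sketch follows; the message shape mirrors common chat-completion APIs, and all names here are illustrative rather than GitHub Copilot's actual backend:

```javascript
// Illustrative only: combining the system prompt, the selected code
// (context), and the user's question into one chat-style payload.
function buildChatRequest(systemPrompt, selectedCode, question) {
  return {
    messages: [
      { role: 'system', content: systemPrompt },
      { role: 'user', content: `Context:\n${selectedCode}\n\nQuestion: ${question}` },
    ],
  };
}

const request = buildChatRequest(
  'Only respond to questions related to the programming world.',
  'const add = (a, b) => a + b;',
  'Explain what this portion of code does.'
);
console.log(request.messages[0].role); // "system"
```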

&lt;p&gt;This system prompt is one of the strengths of GitHub Copilot: the code generated by the tool is not directly passed on to the end user, but is filtered and double-checked in order to avoid scenarios of &lt;a href="https://owasp.org/www-project-top-10-for-large-language-model-applications/Archive/0_1_vulns/Prompt_Injection.html" rel="noopener noreferrer"&gt;prompt-injection&lt;/a&gt; (a concept similar to SQL injection, but which applies to prompt engineering scenarios).&lt;/p&gt;

&lt;p&gt;Even more important than the system prompt is GitHub Copilot's ability to access the extended context of our IDE. The context is formed by the files open in tabs and the files present in the open folder. In fact, there is the possibility, through the &lt;a class="mentioned-user" href="https://dev.to/workspace"&gt;@workspace&lt;/a&gt; keyword, to ask broader questions about the entire repository that is open.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbwi2myj5xvbg32hwr6do.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbwi2myj5xvbg32hwr6do.png" alt="Copilot responds correctly to the question based on the context of the solution" width="800" height="485"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8k4izlln69gg1s316rcu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8k4izlln69gg1s316rcu.png" alt="Copilot responds correctly to the question based on the context of the individual project" width="800" height="606"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the two screenshots above, we can see how GitHub Copilot Chat is able to analyze the entire structure of the folders, without having to specifically select portions of code, and provide me with the exact answers. In the second case, it is even able to understand the intended usage of certain services that I have described and how the APIs that I have defined work. It can also generate links to files so that they can be accessed directly without having to navigate the structure of my repository.&lt;/p&gt;

&lt;p&gt;Taking other Visual Studio Code extensions that integrate a GPT model with our IDE as an example, functions are inherently more limited: &lt;a href="https://marketplace.visualstudio.com/items?itemName=jefferson-pires.VisualChatGPTStudio" rel="noopener noreferrer"&gt;Visual chatGPT Studio - Visual Studio Marketplace&lt;/a&gt;. As we can see in this case, the offered features only cover a subset of the functionalities related to the selection of code sections and the ability to ask questions about them.&lt;/p&gt;

&lt;p&gt;But let's analyze an even more complex scenario than what we have seen so far: suppose we have two microservices communicating with each other through a queue. In addition to the "@workspace" tag, I can also use the "#file:" tag to enrich the chat context by adding another file alongside the selected code. This way, I can ask how the event handled by microservice 2 is formatted inside microservice 1: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcagdwpwey90tmnd328dq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcagdwpwey90tmnd328dq.png" alt="Copilot responds correctly by cross-referencing data from the selected code and another file" width="800" height="590"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What is meant by "programming questions"?
&lt;/h2&gt;

&lt;p&gt;It is worth focusing on GitHub Copilot's deliberately narrow operational context. When we say the tool can answer programming questions, we should not think only of code in the narrowest sense.&lt;/p&gt;

&lt;p&gt;We are also able to ask questions about frameworks, third-party tools we use in the code, and the hosting architecture of an application, such as "How can I create a secure AKS cluster?"&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fibnke6w4qab41eo67t0k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fibnke6w4qab41eo67t0k.png" alt="Copilot provides architectural information on how to secure an AKS cluster" width="800" height="831"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;General questions such as "What are the general principles of quantum mechanics?" are therefore excluded: questions that do not pertain in any way, even from an architectural point of view, to writing code.&lt;/p&gt;

&lt;p&gt;However, if we need answers on such topics in order to conduct an analysis for the code we are writing, do we have alternatives? We can safely use more general tools such as ChatGPT, or tools that natively have access to the web, such as Copilot in Edge.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsrzlepjbhb4fppeair9l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsrzlepjbhb4fppeair9l.png" alt="Screenshot of the Copilot for 365 page" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  GitHub Copilot Enterprise
&lt;/h2&gt;

&lt;p&gt;In the Enterprise tier, the feature gap grows. In addition to the regular Chat and code completion (also present in the first two tiers), some truly interesting features are added:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Knowledge of my organization's knowledge base&lt;/strong&gt;: GitHub Copilot is able to index the repositories in my organization and answer questions about all the files contained within them. This is a RAG (or Chat With Your Data) scenario focused on code, which makes it possible to ask the chat about projects other than the one I am currently working on (for example, a shared function library used within my organization).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pull Request Diff Analysis&lt;/strong&gt;: GitHub Copilot can analyze the individual commits that make up a Pull Request, highlighting the fundamental changes it introduces.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Web Search Powered by Bing (beta)&lt;/strong&gt;: we can ask questions about the latest updates to a framework, and the answer will be generated by searching online content. Note that here, too, we are talking about code-related questions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fine-Tuned Models (coming soon)&lt;/strong&gt;: this feature is not yet generally available, but it will allow fine-tuning of the GitHub Copilot model on our repositories. How does it differ from indexing? Simply put: the model will not have to search an indexed knowledge base for answers to user questions; the answers will be built into the model itself. Just as ChatGPT in the free tier can tell us who the Prime Minister of the United Kingdom is without searching for the answer, GitHub Copilot will natively know which shared libraries exist within our organization. Suppose we have a library used to interact with OpenAI APIs: when GitHub Copilot suggests (even proactively!) code to call the OpenAI APIs, it will not suggest a generic HTTP request, but rather the appropriate library invocation.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;While, from a proactive standpoint, GitHub Copilot appears to be the only tool that suggests entire portions of code without explicit requests, the native integration of the Chat with the IDE also makes it significantly easier to use than ChatGPT-based services, which require more manual work to build context.&lt;/p&gt;

&lt;p&gt;From a cost standpoint, ChatGPT Team has a higher cost than GitHub Copilot Business, which offers more advanced programming features. What is lost by not having built-in online search (replaceable, as we have seen, with other tools) is gained in proactivity and ease of use. GitHub Copilot Enterprise, on the other hand, costs more than ChatGPT Team but offers a series of truly interesting additional features for Enterprise scenarios.&lt;/p&gt;

</description>
      <category>openai</category>
      <category>githubcopilot</category>
      <category>chatgpt</category>
    </item>
    <item>
      <title>Serverless or AKS in an OpenAI solution: let's go against the tide</title>
      <dc:creator>Tommaso Stocchi</dc:creator>
      <pubDate>Tue, 21 May 2024 18:00:00 +0000</pubDate>
      <link>https://dev.to/tommasodotnet/serverless-or-aks-in-an-openai-solution-lets-go-against-the-tide-22oj</link>
      <guid>https://dev.to/tommasodotnet/serverless-or-aks-in-an-openai-solution-lets-go-against-the-tide-22oj</guid>
<description>&lt;p&gt;Working as a Cloud Solution Architect, one of the most common scenarios I encounter is the migration of serverless workloads to the enterprise. Usually, a serverless architecture is built to support a PoC or pilot project, and is then migrated to enterprise-level environments such as Azure App Service or &lt;a href="https://learn.microsoft.com/en-gb/azure/aks/intro-kubernetes" rel="noopener noreferrer"&gt;Azure Kubernetes Service&lt;/a&gt;.&lt;br&gt;
Does it make sense to consider a reverse path? As always, when it comes to architectures, the answer depends on the scenario we are considering.&lt;/p&gt;

&lt;h2&gt;
  
  
  Retrieval Augmented Generation
&lt;/h2&gt;

&lt;p&gt;Let's consider an architecture based on OpenAI as a scenario, specifically an implementation of &lt;a href="https://help.openai.com/en/articles/8868588-retrieval-augmented-generation-rag-and-semantic-search-for-gpts" rel="noopener noreferrer"&gt;Retrieval Augmented Generation&lt;/a&gt;. By Retrieval Augmented Generation (or RAG) we refer to an architecture that allows us to implement a "chat with your data" project. We use a &lt;a href="https://cookbook.openai.com/articles/how_to_work_with_large_language_models" rel="noopener noreferrer"&gt;Large Language Model (LLM)&lt;/a&gt; to generate responses based on our specific data.&lt;br&gt;
The steps to create this architecture are simple: we first need to upload our data (simple pdf files, for example), divide them into smaller text portions, and for each portion, calculate a numerical vector (embedding) and save it in a vector database. This is to ensure that based on the question asked by the user, we can conduct vector search starting from the question, understand in which indexed text portion the answer to the question is located, and provide that text portion to the LLM, in our case &lt;a href="https://help.openai.com/en/articles/7842364-how-chatgpt-and-our-language-models-are-developed" rel="noopener noreferrer"&gt;ChatGPT&lt;/a&gt;, along with the question itself so that the model can give us the answer.&lt;br&gt;
Semantic search is important in this case because we need the user interaction to be as "human-like" as possible. We need to retrieve one or more paragraphs that may contain an answer to a question, so a simple word-by-word check of the paragraphs would not be sufficient. For example, my paragraph might say, "to obtain an identity card at the Como municipality, you need to apply at the police station, bringing with you [...]”, and the question might be asked as, "How to renew an identity document."  The difference is minimal, but a word-by-word search would not give me a positive result. Vector search, on the other hand, is based on the semantic distance between words: "identity card" and "identity document" are very close together.&lt;/p&gt;
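&lt;p&gt;The semantic-distance idea above can be sketched with a toy cosine-similarity computation. This is a minimal illustration with made-up four-dimensional vectors; real embedding models, such as those exposed through Azure OpenAI, return vectors with hundreds or thousands of dimensions.&lt;/p&gt;

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings (made up for illustration): semantically close phrases
# get vectors pointing in nearly the same direction.
identity_card = [0.8, 0.6, 0.1, 0.0]
identity_document = [0.7, 0.7, 0.2, 0.0]
quantum_physics = [0.0, 0.1, 0.2, 0.9]

print(cosine_similarity(identity_card, identity_document))  # ~0.98, very close
print(cosine_similarity(identity_card, quantum_physics))    # ~0.09, far apart
```

A vector search simply ranks every indexed paragraph by this similarity against the question's embedding and returns the top matches.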

&lt;p&gt;On Azure, we have different options to implement the various components of this architecture. In our case, we will use &lt;a href="https://learn.microsoft.com/en-us/azure/ai-services/document-intelligence/overview?view=doc-intel-4.0.0#document-analysis-models" rel="noopener noreferrer"&gt;Azure Form Recognizer&lt;/a&gt; for dividing the PDF into paragraphs, &lt;a href="https://learn.microsoft.com/en-us/azure/search/search-what-is-azure-search" rel="noopener noreferrer"&gt;Azure Search&lt;/a&gt; for saving the paragraphs and performing vector searches, &lt;a href="https://learn.microsoft.com/en-us/azure/ai-services/openai/overview" rel="noopener noreferrer"&gt;Azure OpenAI&lt;/a&gt; for computing paragraph vectors and formulating responses, and finally an Azure Blob Storage for storing our files. As a code hosting service, if we want to build an architecture capable of scaling to multiple files uploaded simultaneously, we will have four microservices on Azure Kubernetes Service:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Frontend: a simple Blazor application that allows you to upload PDFs and ask questions&lt;/li&gt;
&lt;li&gt;Document Processing: API that uses Form Recognizer to divide the uploaded PDF into its paragraphs&lt;/li&gt;
&lt;li&gt;Knowledge Processing: API that takes a paragraph and calculates its embedding, to be saved in the vector store&lt;/li&gt;
&lt;li&gt;Search Knowledge: API that receives a user's question, calculates its embedding and searches the vector database, and then invokes Azure OpenAI to generate the answer&lt;/li&gt;
&lt;/ul&gt;
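&lt;p&gt;To make the flow across these services concrete, here is a minimal sketch of the ingestion and query paths. Every name and service call here is a hypothetical in-memory stub: the character-frequency "embedding" and the list-based "vector store" stand in for Azure OpenAI and Azure Search respectively. It illustrates only the data flow, not the real APIs.&lt;/p&gt;

```python
# Stubbed RAG flow: ingest paragraphs, then answer by nearest embedding.

def split_into_paragraphs(document: str) -> list[str]:
    """Stub for Form Recognizer: naive blank-line paragraph splitting."""
    return [p.strip() for p in document.split("\n\n") if p.strip()]

def embed(text: str) -> list[float]:
    """Stub for the embedding model: a toy vowel-frequency vector."""
    return [text.count(c) / max(len(text), 1) for c in "aeiou"]

vector_store: list[tuple[list[float], str]] = []  # stub for Azure Search

def ingest(document: str) -> None:
    """Document Processing + Knowledge Processing: split, embed, save."""
    for paragraph in split_into_paragraphs(document):
        vector_store.append((embed(paragraph), paragraph))

def answer(question: str) -> str:
    """Search Knowledge: embed the question and find the closest paragraph."""
    q = embed(question)
    _, best = max(
        vector_store,
        key=lambda item: -sum((a - b) ** 2 for a, b in zip(item[0], q)),
    )
    # A real system would now send `best` plus the question to the LLM.
    return f"Answer based on: {best}"

ingest("Identity cards are issued at the police station.\n\nDogs are mammals.")
print(answer("How do I renew an identity document?"))
```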

&lt;p&gt;Document Processing and Knowledge Processing are the two microservices that will need to scale up to cope with multiple files uploaded together and the resulting number of paragraphs to save. To achieve this, the two services are not invoked directly through API calls, but are subscribed to a queue service, in our case Azure Service Bus.&lt;/p&gt;
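&lt;p&gt;The decoupling described above can be sketched with an in-process queue. This is only an illustrative stand-in: &lt;code&gt;queue.Queue&lt;/code&gt; plays the role of the Service Bus queue and the worker thread plays the role of a Knowledge Processing replica; in the real architecture Dapr delivers the messages over the network, and replicas scale with queue depth.&lt;/p&gt;

```python
import queue
import threading

# Stand-in for Azure Service Bus: Document Processing publishes paragraph
# messages, Knowledge Processing consumes them. Producers never call
# consumers directly, so each side can scale independently.
paragraph_queue = queue.Queue()

def knowledge_processing_worker(results):
    """Consume paragraphs from the queue and 'embed' them (stubbed)."""
    while True:
        paragraph = paragraph_queue.get()
        if paragraph is None:  # sentinel: no more work
            break
        results.append(f"embedded:{paragraph}")

results = []
worker = threading.Thread(target=knowledge_processing_worker, args=(results,))
worker.start()

for p in ["paragraph-1", "paragraph-2", "paragraph-3"]:
    paragraph_queue.put(p)  # publish to the queue

paragraph_queue.put(None)  # signal shutdown
worker.join()
print(results)
```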

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbuaqbsw6egj1l4igu1r1.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbuaqbsw6egj1l4igu1r1.jpg" alt="Architectural design of a RAG infrastructure. The application runs on AKS leveraging Form Recognizer, Service Bus, Azure Blob Storage, Azure Search, and Azure OpenAI." width="602" height="416"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Why do we use AKS in this architecture? There could be several reasons behind this choice: corporate culture, the need for high scalability, the possibility of customizing the cluster, or simply the assurance of being able to migrate from one installation to another with ease.&lt;/p&gt;

&lt;p&gt;Communication among microservices, subscription to Service Bus, and reading secrets from Azure Key Vault are entrusted to &lt;a href="https://docs.dapr.io/concepts/overview/" rel="noopener noreferrer"&gt;Dapr&lt;/a&gt;. The authorization for the various resources involved is instead managed, where possible, through Azure Identity, leveraging &lt;a href="https://learn.microsoft.com/en-us/entra/identity/managed-identities-azure-resources/how-manage-user-assigned-managed-identities?pivots=identity-mi-methods-azp" rel="noopener noreferrer"&gt;“User-assigned Identities”&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The repository of reference for this architecture is available &lt;a href="https://github.com/tommasodotNET/semantic-kernel-rag-demo" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  How much does it cost me?
&lt;/h2&gt;

&lt;p&gt;So now we have our active solution and we can chat with our data. But how much does it cost to keep it active?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F32mubjim7r4e46fzoxhp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F32mubjim7r4e46fzoxhp.png" alt="Screenshot of the web application, where a question is asked and the answer is generated based on the documents saved in Blob Storage, shown on the right." width="602" height="234"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Assuming a cluster consisting of three B2alsv2 virtual machines (2 vCPUs, 4GB RAM), the monthly cost would be around $94.&lt;/p&gt;

&lt;p&gt;After the initial upload of our so-called knowledge base (the set of files we want to query, uploaded before releasing the solution), as we move forward in time we will only need to upload a few files per month. Regardless of the reasons that prompted us to adopt AKS as a starting point, it might now make sense to migrate to a serverless environment.&lt;/p&gt;

&lt;h2&gt;
  
  
  Azure Container Apps to the rescue!
&lt;/h2&gt;

&lt;p&gt;This is where &lt;a href="https://learn.microsoft.com/en-us/azure/container-apps/overview" rel="noopener noreferrer"&gt;Azure Container Apps (ACA)&lt;/a&gt; comes into play. It is an environment fully managed by Azure: no node, infrastructure, or network management is required. In addition to the significant advantage of an infrastructure fully managed by the cloud provider, ACA also offers a consumption-based, i.e. serverless, billing plan. To make a comparison: while the cost plan for AKS is similar to that of a Virtual Machine Scale Set, ACA's plan is similar to that of the &lt;a href="https://learn.microsoft.com/en-us/azure/azure-functions/consumption-plan" rel="noopener noreferrer"&gt;Azure Functions consumption plan&lt;/a&gt;. Instances are not paid for the time they remain active, but for the number of requests and the execution time used to process each request.&lt;/p&gt;

&lt;p&gt;Assuming 1 million monthly requests with 20 concurrent requests managed for each Container App and a total execution time of 2 seconds per request, the monthly cost for a Container App (4vCPUs, 8GB RAM) would be around $6.60.&lt;/p&gt;
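&lt;p&gt;A back-of-the-envelope version of that estimate can be written down explicitly. The per-second rates below are invented placeholders, not Azure's actual prices (which also include free grants and per-request charges); the point is only to show how concurrency divides the billable execution time.&lt;/p&gt;

```python
# Consumption-cost sketch for one Container App. The rates are ASSUMED
# placeholder values; check the official Azure pricing page for real ones.
requests_per_month = 1_000_000
concurrent_requests = 20        # requests one replica serves at a time
seconds_per_request = 2
vcpus, memory_gib = 4, 8

# Concurrency amortizes replica time across simultaneous requests:
active_seconds = requests_per_month * seconds_per_request / concurrent_requests

vcpu_rate = 0.000024            # assumed $/vCPU-second
mem_rate = 0.000003             # assumed $/GiB-second
cost = active_seconds * (vcpus * vcpu_rate + memory_gib * mem_rate)
print(round(cost, 2))           # 12.0 with these made-up rates
```

Plugging in the real published rates (and subtracting the monthly free grant) is what brings the figure down to the order of magnitude quoted above.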

&lt;p&gt;For details on the billing plan of Azure Container Apps please refer to  &lt;a href="https://azure.microsoft.com/en-us/pricing/details/container-apps/" rel="noopener noreferrer"&gt;this link&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;How do Azure Container Apps work from a resource perspective? The first thing to do is create a Container Apps Environment. This resource has two main functions: managing resource allocation (and therefore billing) for our apps, and configuring DNS, networking, Dapr components, and certificates. You can think of the Container Apps Environment as the App Service Plan for Web Apps: every Web App must have an App Service Plan that manages scaling based on available resources, and each App Service Plan can have one or more Web Apps to allocate those resources to. The more the base resources assigned to the plan grow, the higher the cost. The two resources differ in one way, however: while an App Service Plan has a fixed cost that depends on how long it remains on and the chosen tier, a Container Apps Environment has a cost that depends on the chosen tier multiplied by the number of requests processed per month.&lt;br&gt;
To make a comparison with AKS, we can see the Container Apps Environment as a Kubernetes namespace and the individual Azure Container Apps as the deployments.&lt;/p&gt;

&lt;p&gt;In a scenario like the one described, where after the initial intensive use we expect to have a low number of monthly requests for our product, the savings generated by a migration choice of this kind are significant.&lt;/p&gt;

&lt;p&gt;What are the operational implications of this migration? If we think about moving a REST API from a Web App to Azure Functions, we know we will have to rewrite part of the code. When it comes to migrating a workload from AKS to ACA, however, none of this is required: we just need to create an Azure Container App and point it at the Docker image to be hosted.&lt;/p&gt;

&lt;p&gt;Dapr is natively supported in Azure Container Apps, and the same User-Assigned Identities that were already in use on AKS can also be assigned to the Container Apps.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fob9nij8j715ieglsxpq7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fob9nij8j715ieglsxpq7.png" alt="Screenshot of the Dapr components configuration page in Azure Container Apps Environment, showing two components: keyvault and servicebus, with the details of the service bus. The details show how servicebus uses the connection string retrieved via the keyvault component and is assigned to four Container Apps." width="602" height="366"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2tiwlgw6tkrhcpogfvde.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2tiwlgw6tkrhcpogfvde.png" alt="Screenshot of the Identity configuration page of an Azure Container App. The details show the assignment of a " width="602" height="362"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To present one of our Apps externally, we just need to check the box in the Ingress section.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa4r1zvaktt1uv94jmir4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa4r1zvaktt1uv94jmir4.png" alt="Screenshot of the Ingress configuration page of an Azure Container App. The Ingress option is marked as " width="602" height="364"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusions
&lt;/h2&gt;

&lt;p&gt;We have seen how to achieve the equivalent of an internal Kubernetes namespace on a completely serverless resource, greatly reducing costs, and how this direction of migration can make sense. We can also go further and ask ourselves whether, in certain scenarios, it makes sense to start with one hosting environment rather than another. The motivations that lead us to choose one hosting mode over another are various and complex, but when we talk about microservices we are often driven by habit to think immediately and exclusively of Kubernetes. However, adopting Kubernetes often requires investing in training for both developers and operations teams. Considering the many serverless solutions on the market can instead bring great advantages, so it is worth knowing them and taking them into consideration.&lt;/p&gt;

</description>
      <category>architecture</category>
      <category>serverless</category>
      <category>openai</category>
      <category>rag</category>
    </item>
  </channel>
</rss>
