<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: c-arnab</title>
    <description>The latest articles on DEV Community by c-arnab (@c_arnab).</description>
    <link>https://dev.to/c_arnab</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F597992%2F3e2ab07e-bc85-47e8-bc53-9e9768578891.jpg</url>
      <title>DEV Community: c-arnab</title>
      <link>https://dev.to/c_arnab</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/c_arnab"/>
    <language>en</language>
    <item>
      <title>Serverless in Azure using Static Web Apps, Functions and Cosmos DB</title>
      <dc:creator>c-arnab</dc:creator>
      <pubDate>Mon, 16 Jan 2023 06:20:51 +0000</pubDate>
      <link>https://dev.to/c_arnab/serverless-in-azure-using-static-web-apps-functions-and-cosmos-db-506j</link>
      <guid>https://dev.to/c_arnab/serverless-in-azure-using-static-web-apps-functions-and-cosmos-db-506j</guid>
      <description>&lt;h4&gt;
  
  
  &lt;em&gt;In this post, we look at serverless development on Azure. First, we cover the tools and packages needed for local development. We then create a static site using Azure Static Web Apps (SWA) and use SWA's built-in authentication support to build the authentication layer, ensuring users can access their data securely. Next, we use SWA's built-in support for HTTP-triggered functions to create and integrate APIs with the static site using .NET 6 and C#. We then create the data layer in Cosmos DB and connect it to the functions built earlier. Finally, we look at the CI/CD support, where changes pushed to the GitHub repository trigger builds and deploy the solution to Azure.&lt;/em&gt;
&lt;/h4&gt;

&lt;h2&gt;
  
  
  Problem Statement
&lt;/h2&gt;

&lt;p&gt;The people at the helm, at the insistence of the HR department, have decided to build a calendar system. Though the full solution is supposed to have plenty of features, the decision is to build it iteratively, starting with a basic Proof of Concept (POC). A person at the helm attended a conference where they heard about "serverless", which allows on-demand scaling and brings down the TCO of a solution with "pay-as-you-go" pricing, so one of the requirements for the POC is to build the entire solution using serverless technologies.&lt;/p&gt;

&lt;h2&gt;
  
  
  Solution
&lt;/h2&gt;

&lt;p&gt;Azure offers a variety of serverless compute options: serverless containerized microservices using Azure Container Apps, serverless Kubernetes using AKS virtual nodes, serverless functions using Azure Functions, or the newest entrant, Azure Static Web Apps.&lt;/p&gt;

&lt;p&gt;Azure Static Web Apps provides static content hosting, APIs powered by Azure Functions, a local development experience, CI/CD workflows, global availability, dynamic scale, and preview environments, all without the need to manage servers, create &amp;amp; assign SSL certificates, or set up reverse proxies.&lt;/p&gt;

&lt;p&gt;Among Azure's serverless services, there are two database options: the relational Azure SQL Database serverless and the non-relational Azure Cosmos DB.&lt;/p&gt;

&lt;p&gt;Azure Cosmos DB is a fully managed NoSQL database that offers features such as Change Data Capture and multiple database APIs (NoSQL, MongoDB, Cassandra, Gremlin, &amp;amp; Table), enabling one to model real-world data using document, column-family, graph, and key-value data models.&lt;/p&gt;

&lt;p&gt;For the POC, the calendar system will enable users to authenticate themselves, view their events, and add events to the calendar.&lt;br&gt;
Static Web Apps with Cosmos DB will be used to implement these use cases. Visual Studio Code will be used as the IDE.&lt;/p&gt;

&lt;h3&gt;
  
  
  Architecture Diagram with Application Development Lifecycle
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4hjullo04y1ujo67nw8s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4hjullo04y1ujo67nw8s.png" alt="architecture"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites for local development
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;.NET 6 SDK - &lt;a href="https://dotnet.microsoft.com/en-us/download/dotnet/6.0" rel="noopener noreferrer"&gt;https://dotnet.microsoft.com/en-us/download/dotnet/6.0&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Azure Functions Core Tools v4.x - &lt;a href="https://go.microsoft.com/fwlink/?linkid=2174087" rel="noopener noreferrer"&gt;https://go.microsoft.com/fwlink/?linkid=2174087&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Static Web Apps CLI - &lt;a href="https://azure.github.io/static-web-apps-cli/" rel="noopener noreferrer"&gt;https://azure.github.io/static-web-apps-cli/&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Azure Cosmos DB Emulator - &lt;a href="https://aka.ms/cosmosdb-emulator" rel="noopener noreferrer"&gt;https://aka.ms/cosmosdb-emulator&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Install Azure Functions Extension from Visual Studio Code Extensions Tab - &lt;a href="https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions" rel="noopener noreferrer"&gt;https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Install Azure Static Web Apps Extension from Visual Studio Code Extensions Tab - &lt;a href="https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurestaticwebapps" rel="noopener noreferrer"&gt;https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurestaticwebapps&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fotj83ex7kpgknc43enx5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fotj83ex7kpgknc43enx5.png" alt="Visual Studio Code Extensions"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h6&gt;
  
  
  Ensure you have the .NET 6 SDK (check by running &lt;code&gt;dotnet --list-sdks&lt;/code&gt;), even if you have other .NET SDKs installed, including higher ones such as .NET 7. This is because Azure Functions has two hosting models, in-process and isolated worker process. This article's steps use the in-process model, whereas .NET 7 is only supported in the isolated worker process model.
&lt;/h6&gt;

&lt;h2&gt;
  
  
  Build Static Site
&lt;/h2&gt;

&lt;p&gt;To get started, a template on GitHub can be used. Go to &lt;a href="https://github.com/staticwebdev/vanilla-basic/generate" rel="noopener noreferrer"&gt;https://github.com/staticwebdev/vanilla-basic/generate&lt;/a&gt;, enter &lt;em&gt;mycalendar&lt;/em&gt; in the Repository name field, and click the &lt;em&gt;Create repository from template&lt;/em&gt; button to create the repository.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmfw4o0ep2m9kdal40yap.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmfw4o0ep2m9kdal40yap.png" alt="Template"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h6&gt;
  
  
  A plain vanilla JavaScript template is used here. More templates, including Angular, React, Vue, and Blazor, are available at &lt;a href="https://github.com/staticwebdev" rel="noopener noreferrer"&gt;https://github.com/staticwebdev&lt;/a&gt;
&lt;/h6&gt;

&lt;p&gt;Open Visual Studio Code and open a new bash terminal.&lt;br&gt;
Go to the folder where you wish to do your development and run the following command to clone the GitHub repository to your local machine.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

git clone https://github.com/&amp;lt;your_github_account&amp;gt;/mycalendar.git


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In VS Code, select File &amp;gt; Open Folder to open the cloned &lt;em&gt;mycalendar&lt;/em&gt; repository.&lt;/p&gt;

&lt;p&gt;Delete the files &lt;em&gt;package.json&lt;/em&gt;, &lt;em&gt;package-lock.json&lt;/em&gt;, and &lt;em&gt;playwright.config.ts&lt;/em&gt;; the entire &lt;em&gt;tests&lt;/em&gt; folder; the entire &lt;em&gt;.devcontainer&lt;/em&gt; folder; and both files (&lt;em&gt;playwright-onDemand.yml&lt;/em&gt; and &lt;em&gt;playwright-scheduled.yml&lt;/em&gt;) in the &lt;em&gt;.github/workflows&lt;/em&gt; folder (but do not delete the folder itself).&lt;/p&gt;

&lt;h3&gt;
  
  
  Update &lt;em&gt;Github&lt;/em&gt; Repository
&lt;/h3&gt;

&lt;p&gt;At the command prompt in the terminal, run the following command to stage all changes in the workspace.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

git add --all


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Check the changes to be committed.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

git status


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Commit changes from staging to the local repository.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

git commit -m "Initial commit to create base repository"


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Push the code to GitHub.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

git push -u origin main


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvd1rf296xsswyn2hgu8y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvd1rf296xsswyn2hgu8y.png" alt="Update Github repository"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Deploy to Azure
&lt;/h3&gt;

&lt;p&gt;In Visual Studio Code, press &lt;em&gt;F1&lt;/em&gt; or &lt;em&gt;Ctrl+Shift+P&lt;/em&gt; to open the Command Palette.&lt;br&gt;
Search for and select &lt;em&gt;Azure Static Web Apps: Create Static Web App...&lt;/em&gt;.&lt;br&gt;
In the ensuing screens, select your subscription,&lt;br&gt;
select an existing resource group or create a new one (if the screen is shown; not in the image below),&lt;br&gt;
select a free or standard plan (if the screen is shown; not in the image below),&lt;br&gt;
provide a name for the web app,&lt;br&gt;
and then select the region to deploy to.&lt;br&gt;
The next screen provides a list of frontend frameworks. As the application is a vanilla JavaScript application, choose &lt;em&gt;Custom&lt;/em&gt;.&lt;br&gt;
Next, provide the location of the application code, &lt;em&gt;/src&lt;/em&gt;, as this is where index.html resides.&lt;br&gt;
Leave the API location blank for now (if the screen is shown; not in the image below),&lt;br&gt;
and finally provide the location of the build output, also &lt;em&gt;/src&lt;/em&gt; (this is primarily useful when a framework such as Angular, React, Svelte, etc. is used; in such cases the build folder location goes here).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fur6c7czf6o7au4ewneij.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fur6c7czf6o7au4ewneij.png" alt="Static Web App creation steps"&gt;&lt;/a&gt;&lt;br&gt;
Finally, a message is shown in the Azure activity log stating that the Azure static web site has been created, and a git pull is executed to download a workflow file from GitHub into the &lt;em&gt;.github/workflows&lt;/em&gt; folder. One of the best things about Static Web Apps is that CI/CD is integrated using GitHub Actions workflows: anything that is now added to the workspace and pushed to the GitHub repository will be deployed to Azure automatically.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsl6321n9rkous8g5tdcj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsl6321n9rkous8g5tdcj.png" alt="Static Web App creation completed"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h6&gt;
  
  
  Azure and Github credentials will have to be provided in the above process.
&lt;/h6&gt;

&lt;p&gt;Check the All resources screen in the Azure Portal to find the &lt;em&gt;mycalendar&lt;/em&gt; static web app. Selecting it loads a page showing the static web app URL, the source code, the action run history, and the workflow that was pulled into the workspace (the last three are GitHub links).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fasumsmwfbu7s0yexkb36.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fasumsmwfbu7s0yexkb36.png" alt="Azure Static Web App"&gt;&lt;/a&gt;&lt;br&gt;
Confirm that the static website URL loads.&lt;/p&gt;

&lt;h3&gt;
  
  
  Create Application Interface
&lt;/h3&gt;

&lt;p&gt;To create the calendar interface, the free and open-source &lt;a href="https://javascript.daypilot.org/download/" rel="noopener noreferrer"&gt;DayPilot Lite library&lt;/a&gt; is used.&lt;/p&gt;

&lt;p&gt;Download the library and add the &lt;em&gt;daypilot-all.min.js&lt;/em&gt; to the &lt;em&gt;src&lt;/em&gt; folder.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpataohd50kw0ejdeu2x7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpataohd50kw0ejdeu2x7.png" alt="src folder with html and js"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Update index.html code to the one below.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

&amp;lt;!DOCTYPE html&amp;gt;
&amp;lt;html lang="en"&amp;gt;

&amp;lt;head&amp;gt;
  &amp;lt;meta charset="UTF-8"&amp;gt;
  &amp;lt;meta name="viewport" content="width=device-width, initial-scale=1.0"&amp;gt;
  &amp;lt;script src="daypilot-all.min.js"&amp;gt;&amp;lt;/script&amp;gt;
  &amp;lt;title&amp;gt;My Calendar&amp;lt;/title&amp;gt;
&amp;lt;/head&amp;gt;

&amp;lt;body&amp;gt;
  &amp;lt;main&amp;gt;
    &amp;lt;h1&amp;gt;My Calendar&amp;lt;/h1&amp;gt;
  &amp;lt;/main&amp;gt;

  &amp;lt;div id="mycalendar"&amp;gt;&amp;lt;/div&amp;gt;

&amp;lt;script type="text/javascript"&amp;gt;
    const mycalendar = new DayPilot.Month("mycalendar", {
        startDate: "2023-01-01",
        onTimeRangeSelected: async function (args) {

            const colors = [
                {name: "Blue", id: "#3c78d8"},
                {name: "Green", id: "#6aa84f"},
                {name: "Yellow", id: "#f1c232"},
                {name: "Red", id: "#cc0000"},
            ];

            const form = [
                {name: "Text", id: "text"},
                {name: "Start", id: "start", type: "datetime"},
                {name: "End", id: "end", type: "datetime"},
                {name: "Color", id: "barColor", options: colors}
            ];

            const data = {
                text: "Event",
                start: args.start,
                end: args.end,
                barColor: "#6aa84f"
            };

            const modal = await DayPilot.Modal.form(form, data);

            mycalendar.clearSelection();

            if (modal.canceled) {
                return;
            }

            mycalendar.events.add({
                start: modal.result.start,
                end: modal.result.end,
                id: DayPilot.guid(),
                text: modal.result.text,
                barColor: modal.result.barColor
            });
        }
    });

    mycalendar.events.list = [
    {
      "start": "2023-01-12T10:30:00",
      "end": "2023-01-12T15:30:00",
      "id": "225eb40f-5f78-b53b-0447-a885c8e92233",
      "text": "React Interview with Shirish Kumar",
      "barColor":"#cc0000"
    },
    {
      "start": "2023-01-16T12:30:00",
      "end": "2023-01-18T17:00:00",
      "id": "1f67def5-e1dd-57fc-2d39-eb7a5f8e789a",
      "text": "Kubernetes Interview with Ramesh Bhat",
      "barColor":"#3c78d8"
    },
    {
      "start": "2023-01-25T10:30:00",
      "end": "2023-01-25T16:00:00",
      "id": "aba78fd9-09d0-642e-612d-0e7e002c29f5",
      "text": "AAD Interview with Girish C",
      "barColor":"#cc0000"
    }
  ];

    mycalendar.init();


&amp;lt;/script&amp;gt;
&amp;lt;/body&amp;gt;

&amp;lt;/html&amp;gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the above code, a div with id &lt;code&gt;mycalendar&lt;/code&gt; is added, which is referenced in the JavaScript. &lt;code&gt;DayPilot.Month&lt;/code&gt; ensures that the view is month-based, and the starting date of the calendar is set using the &lt;code&gt;startDate&lt;/code&gt; attribute.&lt;br&gt;
The &lt;code&gt;onTimeRangeSelected&lt;/code&gt; handler allows the user to add an event by either clicking on a single date or selecting multiple dates by dragging on screen. &lt;code&gt;DayPilot.Modal.form&lt;/code&gt; provides a form to fill in values and save the event.&lt;br&gt;
&lt;code&gt;mycalendar.events.list&lt;/code&gt; adds existing event data as an array in the format the library expects.&lt;/p&gt;

&lt;p&gt;In the bash terminal, go to the &lt;em&gt;mycalendar&lt;/em&gt; folder and run the following command.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

swa start src


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h6&gt;
  
  
  This command will be available if the Static Web Apps CLI is installed. Testing and debugging are among the primary challenges with serverless, as the application is broken into smaller pieces and replicating the environment locally is hard. The Static Web Apps CLI solves this problem, as we will see later. &lt;em&gt;src&lt;/em&gt; is the folder where static content, including HTML, images, JavaScript, and stylesheets, is kept.
&lt;/h6&gt;

&lt;p&gt;On running the command, the Azure Static Web Apps emulator starts, and the calendar application can be accessed at &lt;a href="http://localhost:4280" rel="noopener noreferrer"&gt;http://localhost:4280&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Confirm that events added in code can be viewed as well as new events can be added by clicking on a date.&lt;/p&gt;
&lt;h2&gt;
  
  
  Implement Authentication
&lt;/h2&gt;

&lt;p&gt;Azure Static Web Apps supports &lt;em&gt;GitHub&lt;/em&gt;, &lt;em&gt;Twitter&lt;/em&gt;, and &lt;em&gt;Azure Active Directory&lt;/em&gt; authentication by default. Moreover, the Static Web Apps CLI provides an authentication emulator to mock responses from the three providers mentioned.&lt;/p&gt;

&lt;p&gt;To enable login using &lt;em&gt;GitHub&lt;/em&gt;, update the HTML main content area (between the &lt;code&gt;&amp;lt;main&amp;gt;&lt;/code&gt; tags) to the content below.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

&amp;lt;main&amp;gt;
    &amp;lt;h1&amp;gt;My Calendar&amp;lt;/h1&amp;gt;
    &amp;lt;p&amp;gt;
    &amp;lt;div id="login" style="display: flex; justify-content: end;"&amp;gt;&amp;lt;a href="/.auth/login/github"&amp;gt;Login&amp;lt;/a&amp;gt;&amp;lt;/div&amp;gt;
&amp;lt;/main&amp;gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After authentication, Azure Static Web Apps provides an API endpoint to access user information such as the user id or email. This means developers do not have to implement and maintain any OAuth-related code, and the endpoint does not face serverless architecture challenges like cold-start delays.&lt;/p&gt;

&lt;p&gt;Update the &lt;code&gt;&amp;lt;script&amp;gt;&lt;/code&gt; area with the code below, placed under &lt;code&gt;mycalendar.init()&lt;/code&gt;.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

    mycalendar.init(); //Add the code below

    const app = {
      getUserInfo() {
      return fetch('/.auth/me')
        .then(response =&amp;gt;{
            return response.json();
          }).then(data =&amp;gt;{
            const { clientPrincipal } = data;
            console.log(clientPrincipal);
            if (clientPrincipal !=null){
              const userDetails= clientPrincipal.userDetails;
              return userDetails;
            }
            return null;
          })
      },
      init(){
        app.getUserInfo()
            .then(user =&amp;gt;{
             console.log(user); 
            })
      }
    };
    app.init();
  &amp;lt;/script&amp;gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the above code, the &lt;code&gt;init&lt;/code&gt; function calls the &lt;code&gt;getUserInfo&lt;/code&gt; function, which in turn calls the &lt;em&gt;direct-access endpoint&lt;/em&gt; &lt;code&gt;/.auth/me&lt;/code&gt; and, from the response, gets the &lt;em&gt;GitHub user id&lt;/em&gt; (provided the user is authenticated by GitHub; otherwise null), which is logged in the browser console.&lt;/p&gt;
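For reference, the payload returned by the /.auth/me endpoint has roughly the shape sketched below, and the extraction mirrors what getUserInfo does. The userId value here is an illustrative placeholder, not a real one.

```javascript
// Sketch of the clientPrincipal payload /.auth/me returns after a GitHub
// login; the userId below is a made-up placeholder.
const sampleResponse = {
  clientPrincipal: {
    identityProvider: "github",
    userId: "d75b260a64504067bfc5b2905e3b8182",
    userDetails: "arnab",
    userRoles: ["anonymous", "authenticated"]
  }
};

// Same logic as getUserInfo: return userDetails when a clientPrincipal
// exists, null for unauthenticated users.
function extractUserDetails(data) {
  const { clientPrincipal } = data;
  return clientPrincipal != null ? clientPrincipal.userDetails : null;
}

console.log(extractUserDetails(sampleResponse));            // "arnab"
console.log(extractUserDetails({ clientPrincipal: null })); // null
```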

&lt;p&gt;In the bash terminal, run the following command again.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

swa start src


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Load the web page in a browser in incognito mode. After clicking on the login button, the emulator mock screen comes up.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4ouaqpzdi1vw0d78b91f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4ouaqpzdi1vw0d78b91f.png" alt="SWA Auth emulator"&gt;&lt;/a&gt;&lt;br&gt;
In the mock screen, add your first name in the Username field as shown in the image (&lt;em&gt;arnab&lt;/em&gt; in this case) and select Login.&lt;/p&gt;

&lt;p&gt;The calendar interface shows up as index.html loads. Open the developer tools and go to the Console. The clientPrincipal data as well as the username are shown, as in the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffabjbloy4ol0febmrmh3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffabjbloy4ol0febmrmh3.png" alt="Developer tools console"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Create API and integrate with Static Site
&lt;/h2&gt;

&lt;p&gt;Running logic in the browser has certain limitations, notably the inability to connect to data stores / databases to persist and retrieve data. That is where the need to run some of the code server side comes in.&lt;br&gt;
Azure Static Web Apps supports serverless API endpoints powered by Azure Functions, where an HTTP request triggers the function. The API route is fixed at /api, and the Azure Static Web Apps extension for Visual Studio Code creates the Function templates in the api folder by default. In the local development environment, the API runs on port 7071, not port 4280 where the static site runs. This would normally lead to Cross-Origin Resource Sharing (CORS) errors like &lt;em&gt;Access to XMLHttpRequest at "http://localhost:7071/api/events" from origin "http://localhost:4280" has been blocked by CORS policy&lt;/em&gt;. But Azure Static Web Apps (using a reverse proxy), as well as the CLI for the local development scenario, takes care of this challenge by making the static web app and API appear to come from the same domain.&lt;/p&gt;

&lt;p&gt;The calendar application allows users to view their existing events as well as add new events to the calendar.&lt;/p&gt;

&lt;p&gt;For the view-existing-events use case, a &lt;em&gt;GET&lt;/em&gt; method at the route &lt;em&gt;events&lt;/em&gt; will be used, which means the full API endpoint on the static site will be &lt;em&gt;api/events&lt;/em&gt;.&lt;br&gt;
For the add-new-events use case, a POST method at the same endpoint can be used.&lt;/p&gt;
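The add-event call from the frontend could be sketched as below. The POST function itself is not built in this article, so the request shape is an assumption, and buildEventRequest is a hypothetical helper used only to make the payload easy to inspect.

```javascript
// Hypothetical helper that builds the request for a POST to /api/events;
// separating construction from fetch() keeps the payload testable.
function buildEventRequest(user, event) {
  return {
    url: "/api/events?" + new URLSearchParams({ u: user }).toString(),
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(event) // event uses the DayPilot field names
    }
  };
}

const { url, options } = buildEventRequest("arnab", {
  start: "2023-01-30T10:00:00",
  end: "2023-01-30T11:00:00",
  id: "00000000-0000-0000-0000-000000000000", // placeholder id
  text: "Planning meeting",
  barColor: "#6aa84f"
});

console.log(url); // "/api/events?u=arnab"
// In the browser this would then be: fetch(url, options)
```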

&lt;h3&gt;
  
  
  View existing events use case
&lt;/h3&gt;

&lt;p&gt;In Visual Studio Code, press &lt;em&gt;F1&lt;/em&gt; or &lt;em&gt;Ctrl+Shift+P&lt;/em&gt; to open the Command Palette. Search for and select &lt;em&gt;Azure Static Web Apps: Create HTTP Function&lt;/em&gt;.&lt;br&gt;
In the ensuing screens, select &lt;em&gt;C#&lt;/em&gt; as the language,&lt;br&gt;
add &lt;em&gt;GetEvents&lt;/em&gt; as the function name,&lt;br&gt;
add &lt;code&gt;&amp;lt;Your_First_Name&amp;gt;.MyCalendar&lt;/code&gt; as the namespace (&lt;em&gt;Arnab.MyCalendar&lt;/em&gt; in my case),&lt;br&gt;
and &lt;em&gt;Anonymous&lt;/em&gt; as the access rights (good enough for a POC, but never use this setting in production).&lt;/p&gt;

&lt;h6&gt;
  
  
  Security is important. Do check out &lt;a href="https://learn.microsoft.com/en-us/azure/architecture/serverless-quest/functions-app-security" rel="noopener noreferrer"&gt;Azure Architecture - Serverless Functions security&lt;/a&gt;
&lt;/h6&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb53jkrfy8bjavccw3kqb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb53jkrfy8bjavccw3kqb.png" alt="Create HTTP Function steps"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A new folder named &lt;em&gt;api&lt;/em&gt; is created in the workspace, containing a Functions project.&lt;/p&gt;

&lt;h6&gt;
  
  
  A simple piece of code is written next, just sufficient to test the integration of the API with the static site.
&lt;/h6&gt;

&lt;p&gt;In the GetEvents.cs file, update the contents to the code below.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

using System;
using System.IO;
using System.Threading.Tasks;
using System.Collections.Generic;
using System.Text.Json;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;


namespace Arnab.MyCalendar
{
    public static class GetEvents
    {
        [FunctionName("GetEvents")]
        public static async Task&amp;lt;IActionResult&amp;gt; Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "events")] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("Get Events list");
            string username = req.Query["u"];
            log.LogInformation(username);
            string json = @"[
                        {
                        ""start"": ""2023-01-12T10:30:00"",
                        ""end"": ""2023-01-12T15:30:00"",
                        ""id"": ""225eb40f-5f78-b53b-0447-a885c8e92233"",
                        ""text"": ""React Interview with Shirish Kumar"",
                        ""barColor"":""#cc0000""
                        },
                        {
                        ""start"": ""2023-01-16T12:30:00"",
                        ""end"": ""2023-01-18T17:00:00"",
                        ""id"": ""1f67def5-e1dd-57fc-2d39-eb7a5f8e789a"",
                        ""text"": ""Kubernetes Interview with Ramesh Bhat"",
                        ""barColor"":""#3c78d8""
                        },
                        {
                        ""start"": ""2023-01-25T10:30:00"",
                        ""end"": ""2023-01-25T16:00:00"",
                        ""id"": ""aba78fd9-09d0-642e-612d-0e7e002c29f5"",
                        ""text"": ""AAD Interview with Girish C"",
                        ""barColor"":""#cc0000""
                        }
                    ]";
            List&amp;lt;Dictionary&amp;lt;string, string&amp;gt;&amp;gt; results=null;
            if (username == "arnab"){
            results =JsonSerializer.Deserialize&amp;lt;List&amp;lt;Dictionary&amp;lt;string, string&amp;gt;&amp;gt;&amp;gt;(json);
            }
            return new OkObjectResult(results);
        }
    }
}



&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the above code, we first specify the HTTP method (GET) and route (events).&lt;br&gt;
The API accepts a query string parameter 'u', which contains the username.&lt;br&gt;
A hardcoded JSON string, along the same lines as in index.html, is deserialized and returned provided the username equals a hardcoded value.&lt;/p&gt;

&lt;h6&gt;
  
  
  The hardcoded value here is &lt;em&gt;arnab&lt;/em&gt;, but you should update it to your first name, provided you are going to use that as the username in the mock authentication screen. Also remember to change the namespace to &lt;code&gt;&amp;lt;Your_First_Name&amp;gt;.MyCalendar&lt;/code&gt;.
&lt;/h6&gt;

&lt;p&gt;To call this API from the frontend, update &lt;em&gt;index.html&lt;/em&gt;. Comment out or remove the &lt;code&gt;mycalendar.events.list&lt;/code&gt; code, add a &lt;code&gt;loadEvents&lt;/code&gt; function that calls the API, and call this function from &lt;code&gt;init&lt;/code&gt;, as in the code below.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

/* mycalendar.events.list = [
      {
        "start": "2023-01-12T10:30:00",
        "end": "2023-01-12T15:30:00",
        "id": "225eb40f-5f78-b53b-0447-a885c8e92233",
        "text": "React Training",
        "barColor":"#cc0000"
      },
      //more data below
    ]; */

    mycalendar.init();

    const app = {
      loadEvents(user) {
        console.log(user);
        var url = new URL('/api/events', window.location.origin) // a base is required for relative paths
        var params = {u:user}
        url.search = new URLSearchParams(params).toString();
        //console.log(url);
        fetch(url)
          .then(response =&amp;gt;{
            return response.json();
          }).then(data =&amp;gt;{
            //console.log(data);
            mycalendar.update({
              events: data
            });
          })                       
      }, //next there will be getUserInfo()
      init(){
        app.getUserInfo()
            .then(user =&amp;gt;{
             console.log(user); 
             if (user !=null){
              app.loadEvents(user);  
              }
            })
      }
    };
    app.init();


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the code above the &lt;code&gt;getUserInfo&lt;/code&gt; method is called and, if the username is not null, the &lt;code&gt;loadEvents&lt;/code&gt; method is called, where the username is added to the API endpoint as a query string / search parameter and the returned data is used to update the calendar.&lt;/p&gt;
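&lt;p&gt;As a side note, the &lt;code&gt;URL&lt;/code&gt; constructor needs an absolute base when given a relative path (in the browser, &lt;code&gt;window.location.origin&lt;/code&gt; serves as that base). The URL building can be sketched as below; &lt;code&gt;buildEventsUrl&lt;/code&gt; is an illustrative helper name, not part of the project code:&lt;/p&gt;

```javascript
// Sketch of how loadEvents assembles the request URL. The URL
// constructor throws on a bare relative path, so a base is passed in
// explicitly here (window.location.origin when running in a browser).
function buildEventsUrl(user, base) {
  const url = new URL('/api/events', base);
  // URLSearchParams handles encoding of the username value.
  url.search = new URLSearchParams({ u: user }).toString();
  return url.toString();
}
```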

&lt;p&gt;To test the API and its integration with the static site, in the bash terminal, run the following command.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

swa start src --api-location api


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;When the web site loads, login with the hardcoded username (&lt;em&gt;arnab&lt;/em&gt;, or your first name if you updated it) in the ensuing emulator mock screen. The calendar gets updated with data, but this time the data is served by the API.&lt;/p&gt;

&lt;h6&gt;
  
  
  In case you face any challenges / errors in running the above SWA CLI command, open a new bash terminal, ensure you are in the &lt;em&gt;api&lt;/em&gt; folder, update the API URLs in index.html from '&lt;code&gt;/api/events&lt;/code&gt;' to '&lt;code&gt;http://localhost:7071/api/events&lt;/code&gt;', run the following command to run just the Azure function, and in the first terminal run &lt;em&gt;swa start src&lt;/em&gt; as before. This style of running is also useful in debugging, as it bifurcates the running of the frontend site and the backend API. But do remember to switch the URLs back to their earlier form before publishing the application to Azure.
&lt;/h6&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

func start --cors http://localhost:4280 --port 7071


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h3&gt;
  
  
  Add new events use case
&lt;/h3&gt;

&lt;p&gt;Create a new HTTP Function as before and name it PostEvents.&lt;/p&gt;

&lt;p&gt;In the PostEvents.cs file, update the contents to the code below.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace Arnab.MyCalendar
{
    public static class PostEvents
    {
        [FunctionName("PostEvents")]
        public static async Task&amp;lt;IActionResult&amp;gt; Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "events")] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("Post Event");

            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            dynamic data = JsonConvert.DeserializeObject(requestBody);
            string cuser = data?.cuser;
            dynamic cevent=data?.cevent;
            log.LogInformation(cuser);
            string eguid =Guid.NewGuid().ToString(); 
            cevent.id=eguid;
            return new OkObjectResult(cevent);
        }
    }
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the above code, the username and event data are retrieved from the request body, a Guid is generated and added to the event data, and the event data is returned with the Guid.&lt;/p&gt;

&lt;p&gt;To call this API from the frontend, update &lt;em&gt;index.html&lt;/em&gt;. Add an &lt;code&gt;addEvents&lt;/code&gt; function and update the mycalendar &lt;code&gt;onTimeRangeSelected&lt;/code&gt; function. The final script section in index.html is shown in the code below.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

  &amp;lt;script type="text/javascript"&amp;gt;
      const mycalendar = new DayPilot.Month("mycalendar", {
          startDate: "2023-01-01",
          onTimeRangeSelected: async function (args) {

              const colors = [
                  {name: "Blue", id: "#3c78d8"},
                  {name: "Green", id: "#6aa84f"},
                  {name: "Yellow", id: "#f1c232"},
                  {name: "Red", id: "#cc0000"},
              ];

              const form = [
                  {name: "Text", id: "text"},
                  {name: "Start", id: "start", type: "datetime"},
                  {name: "End", id: "end", type: "datetime"},
                  {name: "Color", id: "barColor", options: colors}
              ];

              const data = {
                  text: "Event",
                  start: args.start,
                  end: args.end,
                  barColor: "#6aa84f"
              };
              const modal = await DayPilot.Modal.form(form, data);

              mycalendar.clearSelection();

              if (modal.canceled) {
                  return;
              }
              app.getUserInfo()
              .then(user =&amp;gt;{
                console.log(user); 
                if (user !=null){
                  const event = {
                    start: modal.result.start,
                    end: modal.result.end,
                    text: modal.result.text,
                    barColor: modal.result.barColor
                  };
                  app.addEvents(user,event);  
                }
              })

          }
      });


    mycalendar.init();

    const app = {
      loadEvents(user) {
        console.log(user);
        var url = new URL('/api/events', window.location.origin) // a base is required for relative paths
        var params = {u:user}
        url.search = new URLSearchParams(params).toString();
        //console.log(url);
        fetch(url)
          .then(response =&amp;gt;{
            return response.json();
          }).then(data =&amp;gt;{
            //console.log(data);
            mycalendar.update({
              events: data
            });
          })                       
      },
      addEvents(user,event){
        fetch('/api/events', {
                method: 'POST',
                headers: {
                    'Accept': 'application/json',
                    'Content-Type': 'application/json'
                },
                body: JSON.stringify({
                  cuser:user,
                  cevent:event,
                }),
            })
            .then(response =&amp;gt;{
              console.log(response);
            return response.json();
            }).then(data =&amp;gt;{
              console.log(data);
              mycalendar.events.add(data);
            }) 
      },
      getUserInfo() {
      return fetch('/.auth/me')
        .then(response =&amp;gt;{
            return response.json();
          }).then(data =&amp;gt;{
            const { clientPrincipal } = data;
            console.log(clientPrincipal);
            if (clientPrincipal !=null){
              console.log("inside clientprincipal not null");
              const userDetails= clientPrincipal.userDetails;
              return userDetails;
            }
            return null;
          })
      },
      init(){
        app.getUserInfo()
            .then(user =&amp;gt;{
             console.log(user); 
             if (user !=null){
              document.getElementById("login").style.display = "none";
              app.loadEvents(user);  
              }
            })
      }
    };
    app.init();
  &amp;lt;/script&amp;gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The new code in &lt;code&gt;onTimeRangeSelected&lt;/code&gt; calls &lt;code&gt;getUserInfo&lt;/code&gt; and, if the username is not null, sends the username and event data to the &lt;code&gt;addEvents&lt;/code&gt; function, which makes the POST call to the API and updates the calendar with the returned data.&lt;/p&gt;

&lt;p&gt;Once again test the API and its integration as before to confirm that the code works.&lt;/p&gt;

&lt;h2&gt;
  
  
  Implement Persistence layer with Cosmos DB
&lt;/h2&gt;

&lt;p&gt;Run the Cosmos DB emulator. Right click and select &lt;em&gt;Open Data Explorer&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg82m6lq1nhawsbnld61n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg82m6lq1nhawsbnld61n.png" alt="Cosmos DB emulator"&gt;&lt;/a&gt; &lt;br&gt;
Run the following command in the bash terminal to configure the connection string (available from &lt;em&gt;Primary Connection String&lt;/em&gt; in Explorer screen) in the function project settings in the &lt;em&gt;local.settings.json&lt;/em&gt; file.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

func settings add CosmosDBConnection "AccountEndpoint=https://localhost:8081/;AccountKey=C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==" --connectionString


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The connection string should now be added to the &lt;em&gt;local.settings.json&lt;/em&gt; file as below.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

  "ConnectionStrings": {
    "CosmosDBConnection": {
      "ConnectionString": "AccountEndpoint=https://localhost:8081/;AccountKey=C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==",
      "ProviderName": "System.Data.SqlClient"
    }
  }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h6&gt;
  
  
  The emulator connection string is always the same unless changed manually. Also, &lt;em&gt;local.settings.json&lt;/em&gt; is not pushed to GitHub, and the connection string there is only used for local development. The command / process to add the connection string in Azure is different, as we will see later.
&lt;/h6&gt;

&lt;p&gt;In the emulator screen, select &lt;em&gt;Explorer&lt;/em&gt; and then &lt;em&gt;New Container&lt;/em&gt;.&lt;br&gt;
Add a new database named &lt;em&gt;myCalendar&lt;/em&gt;, set the Container Name to &lt;em&gt;eventsCollection&lt;/em&gt; and the Partition Key to &lt;em&gt;/userName&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk835s01tkdak5ijb8ixp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk835s01tkdak5ijb8ixp.png" alt="Cosmos DB -Create Database and Container"&gt;&lt;/a&gt;&lt;br&gt;
The database and Container can be viewed now in Explorer.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frcjdbt1tdh94577o96ue.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frcjdbt1tdh94577o96ue.png" alt="Cosmos DB emulator explorer"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;An easy, declarative way to connect Azure services, including Cosmos DB, to Azure Functions is using bindings. Bindings are implemented in extension packages. Run the following dotnet add package command in the terminal to install the Cosmos DB extension package.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

dotnet add package Microsoft.Azure.WebJobs.Extensions.CosmosDB --version 4.0.0


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Add a new file in the &lt;em&gt;api&lt;/em&gt; folder and name it &lt;em&gt;event.cs&lt;/em&gt;. This file will hold the classes describing the format in which data arrives from the frontend and the format of the JSON documents that will be persisted in Cosmos DB.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

using System;
namespace Arnab.MyCalendar
{
    public class CalendarEvent
    {
        public string id { get; set; }
        public string userName{ get; set; }
        public string startsAt { get; set; }
        public string endsAt { get; set; }
    #nullable enable
        public string? eventTitle { get; set; }
        public string? barColor { get; set; }
        public DateTime eventCreateDate{ get; set; }
    }

    public class ClientPostEvent
    {
        #nullable enable
        public string? id { get; set; }
        public string? start { get; set; }
        public string? end { get; set; }
        public string? text { get; set; }
        public string? barColor { get; set; }
    }

    public class ClientData
    {
        public string cuser { get; set; }
        public ClientPostEvent cevent { get; set; }
    }
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the code above &lt;code&gt;CalendarEvent&lt;/code&gt; will be used to persist data in Cosmos DB.&lt;/p&gt;
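&lt;p&gt;The mapping PostEvents will perform from the incoming &lt;code&gt;ClientPostEvent&lt;/code&gt; to the persisted &lt;code&gt;CalendarEvent&lt;/code&gt; document can be sketched as below (a JavaScript sketch; &lt;code&gt;toCalendarEvent&lt;/code&gt; is an illustrative helper name):&lt;/p&gt;

```javascript
// Sketch of the ClientPostEvent -> CalendarEvent mapping that PostEvents
// performs before persisting to Cosmos DB. Field names follow the C#
// classes defined in event.cs.
function toCalendarEvent(cuser, cevent, now = new Date()) {
  return {
    id: cevent.id,
    userName: cuser,           // also the Cosmos DB partition key (/userName)
    startsAt: cevent.start,
    endsAt: cevent.end,
    eventTitle: cevent.text,
    barColor: cevent.barColor,
    eventCreateDate: now.toISOString()
  };
}
```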

&lt;p&gt;Update the contents of PostEvents.cs to the code below.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

using System;
using System.IO;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;

namespace Arnab.MyCalendar
{
    public static class PostEvents
    {
        [FunctionName("PostEvents")]
        public static async Task&amp;lt;IActionResult&amp;gt; Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "events")] HttpRequest req,
            [CosmosDB(
            databaseName: "myCalendar",
            containerName: "eventsCollection",
            Connection = "CosmosDBConnection")]
            IAsyncCollector&amp;lt;CalendarEvent&amp;gt; eventsOut,ILogger log)
        {
            log.LogInformation("Post Event");

            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            ClientData data = JsonSerializer.Deserialize&amp;lt;ClientData&amp;gt;(requestBody);
            string cuser = data?.cuser;
            ClientPostEvent cevent=data?.cevent;
            log.LogInformation(cuser);
            string eguid =Guid.NewGuid().ToString(); 
            cevent.id=eguid;
            CalendarEvent nevent = new CalendarEvent() { 
                id = cevent.id,
                userName= cuser,
                startsAt=cevent.start,
                endsAt=cevent.end,
                eventTitle=cevent.text,
                barColor=cevent.barColor,
                eventCreateDate=DateTime.Now
                };

            await eventsOut.AddAsync(nevent);
            return new OkObjectResult(cevent);
        }
    }
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the code above, the &lt;code&gt;CosmosDB&lt;/code&gt; binding attribute enables connecting to the database seamlessly. The additional parameter &lt;code&gt;eventsOut&lt;/code&gt; is of type &lt;code&gt;IAsyncCollector&amp;lt;CalendarEvent&amp;gt;&lt;/code&gt;; every &lt;code&gt;CalendarEvent&lt;/code&gt; instance passed to its &lt;code&gt;AddAsync&lt;/code&gt; method gets persisted in the database.&lt;/p&gt;

&lt;p&gt;Update the contents of GetEvents.cs to the code below.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

using System;
using System.IO;
using System.Threading.Tasks;
using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.Cosmos;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;

namespace Arnab.MyCalendar
{
     public static class GetEvents
    {
        [FunctionName("GetEvents")]
        public static async Task&amp;lt;IActionResult&amp;gt; Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "events")] HttpRequest req,
            [CosmosDB(
                databaseName: "myCalendar",
                containerName: "eventsCollection",
                Connection = "CosmosDBConnection")] CosmosClient client,
            ILogger log)
        {
            log.LogInformation("Get Events list");
            string name = req.Query["u"];

            List&amp;lt;Dictionary&amp;lt;string, string&amp;gt;&amp;gt; results=new  List&amp;lt;Dictionary&amp;lt;string, string&amp;gt;&amp;gt;();
            Container myContainer = client.GetDatabase("myCalendar").GetContainer("eventsCollection");
            QueryDefinition queryDefinition = new QueryDefinition(
                "SELECT * FROM items i WHERE (i.userName = @searchterm)")
                .WithParameter("@searchterm", name);
            string continuationToken = null;
            do
            {
            FeedIterator&amp;lt;CalendarEvent&amp;gt; feedIterator = 
                myContainer.GetItemQueryIterator&amp;lt;CalendarEvent&amp;gt;(
                        queryDefinition, 
                        continuationToken: continuationToken);

                while (feedIterator.HasMoreResults)
                {
                    FeedResponse&amp;lt;CalendarEvent&amp;gt; feedResponse = await feedIterator.ReadNextAsync();
                    continuationToken = feedResponse.ContinuationToken;
                    foreach (CalendarEvent item in feedResponse)
                    {
                        results.Add(new Dictionary&amp;lt;string, string&amp;gt;(){
                                    {"start", item.startsAt},
                                    {"end", item.endsAt},
                                    {"id", item.id},
                                    {"text", item.eventTitle},
                                    {"barColor", item.barColor}
                        });
                    }
                }
            } while (continuationToken != null);

            return new OkObjectResult(results);

        }
    } 
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the above code, a &lt;code&gt;CosmosClient&lt;/code&gt; instance provided by the Azure Cosmos DB binding, available in extension version 4.x, is used to read a list of documents. With access to a &lt;code&gt;CosmosClient&lt;/code&gt; instance, one can do complex stuff with Cosmos DB from Static Web Apps / Functions. &lt;br&gt;
Here, the &lt;code&gt;CosmosClient&lt;/code&gt; object is used to configure and execute requests against the Azure Cosmos DB service. &lt;code&gt;Database&lt;/code&gt; is a reference to the database and &lt;code&gt;Container&lt;/code&gt; a reference to the container, and both are validated server-side.&lt;br&gt;
&lt;code&gt;QueryDefinition&lt;/code&gt; holds the query and its parameters. &lt;code&gt;FeedIterator&amp;lt;&amp;gt;&lt;/code&gt; tracks the current page of results and fetches the next one, while &lt;code&gt;FeedResponse&amp;lt;&amp;gt;&lt;/code&gt; represents a single page of results, which is iterated over using a foreach loop.&lt;br&gt;
Also notice the usage of &lt;code&gt;continuationToken&lt;/code&gt; and the &lt;code&gt;.WithParameter&lt;/code&gt; goodness. &lt;/p&gt;
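&lt;p&gt;The continuation-token loop can be sketched in JavaScript as below (a minimal sketch; &lt;code&gt;fetchPage&lt;/code&gt; and &lt;code&gt;readAllPages&lt;/code&gt; are illustrative stand-ins for &lt;code&gt;FeedIterator&lt;/code&gt; / &lt;code&gt;ReadNextAsync&lt;/code&gt;, not real SDK calls):&lt;/p&gt;

```javascript
// Sketch of the continuation-token pattern GetEvents uses: keep
// requesting pages until the service stops returning a token.
async function readAllPages(fetchPage) {
  const results = [];
  let continuationToken = null;
  do {
    // fetchPage stands in for FeedIterator/ReadNextAsync: it takes the
    // token of the previous page and returns { items, token }.
    const { items, token } = await fetchPage(continuationToken);
    results.push(...items);
    continuationToken = token; // null once the last page has been read
  } while (continuationToken);
  return results;
}
```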

&lt;h6&gt;
  
  
  Though this code gets all events of a user, in production scenarios you would not want to do that: along with the username, also send a start date and end date in the query string to limit the amount of data accessed or retrieved.
&lt;/h6&gt;
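&lt;p&gt;Such a date-bounded query could be sketched as a parameterized query spec, as below (a sketch only; &lt;code&gt;buildEventsQuery&lt;/code&gt; is an illustrative helper, and the field names assume the &lt;code&gt;startsAt&lt;/code&gt; property of &lt;code&gt;CalendarEvent&lt;/code&gt;):&lt;/p&gt;

```javascript
// Sketch of a date-bounded, parameterized query: alongside the username,
// pass a start and end date so only a window of events is fetched.
// The object mirrors what QueryDefinition/WithParameter build in C#.
function buildEventsQuery(userName, from, to) {
  return {
    query: "SELECT * FROM items i WHERE i.userName = @searchterm " +
           "AND i.startsAt >= @from AND i.startsAt <= @to",
    parameters: [
      { name: "@searchterm", value: userName },
      { name: "@from", value: from },
      { name: "@to", value: to }
    ]
  };
}
```

Keeping the values in &lt;code&gt;parameters&lt;/code&gt; rather than concatenating them into the query string also avoids NoSQL injection, just as &lt;code&gt;.WithParameter&lt;/code&gt; does.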

&lt;p&gt;To test, open a new browser in incognito / InPrivate mode, login, and confirm that you are able to add new events.&lt;br&gt;
To confirm that the events are being persisted, check the Cosmos DB emulator explorer screen. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9pt5gpta4posusksgele.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9pt5gpta4posusksgele.png" alt="Cosmos DB emulator- Data Persisted"&gt;&lt;/a&gt;&lt;br&gt;
Next close the browser and open another browser again in incognito / inPrivate mode, login and confirm that you are able to view the events added earlier and persisted in Cosmos DB.&lt;/p&gt;

&lt;h2&gt;
  
  
  Deploy application to Azure
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Provision Database
&lt;/h3&gt;

&lt;p&gt;Log in to the Azure Portal, search for Cosmos DB and select the top result.&lt;br&gt;
Next, create a Cosmos DB account by selecting &lt;em&gt;Create&lt;/em&gt; under the &lt;em&gt;Azure Cosmos DB For NoSQL&lt;/em&gt; box. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhksvdrkix0ozho84r53q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhksvdrkix0ozho84r53q.png" alt="Cosmos DB Account API selection"&gt;&lt;/a&gt;&lt;br&gt;
In the next screen, select a subscription, select an existing Resource Group or create a new one, add an account name, choose the nearest location and choose &lt;em&gt;Serverless&lt;/em&gt; as the capacity mode.&lt;br&gt;
The backup policy can be changed as well, with locally redundant backups selected (sufficient for a POC). Selecting the &lt;em&gt;Review + Create&lt;/em&gt; button provisions the database.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi1jlveki19upmayuq8z9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi1jlveki19upmayuq8z9.png" alt="Cosmos DB account creation"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h6&gt;
  
  
  One could also try Cosmos DB by going to &lt;a href="https://cosmos.azure.com/try" rel="noopener noreferrer"&gt;cosmos.azure.com/try&lt;/a&gt;. Selecting the account type - &lt;em&gt;Azure Cosmos DB For NoSQL&lt;/em&gt; would create a trial account for 30 days which would open in Azure Portal.
&lt;/h6&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffbleuybe1j9meqc4y9an.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffbleuybe1j9meqc4y9an.png" alt="Try cosmos db"&gt;&lt;/a&gt;&lt;br&gt;
Once Azure Cosmos DB account page opens up, select the Data Explorer tab and a similar interface as the Data Explorer emulator opens up. Select &lt;em&gt;New Container&lt;/em&gt; box, Add a new Database with name &lt;em&gt;myCalendar&lt;/em&gt;, state the Container Name as &lt;em&gt;eventsCollection&lt;/em&gt; and Partition Key &lt;em&gt;/userName&lt;/em&gt;.&lt;br&gt;
Once the database is created, select the &lt;em&gt;Connect&lt;/em&gt; box to view the connection string (or select &lt;code&gt;Keys&lt;/code&gt; under &lt;code&gt;Settings&lt;/code&gt; in the left bar).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fici4rb9eknn448p0pyz4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fici4rb9eknn448p0pyz4.png" alt="Azure Cosmos DB Data Explorer"&gt;&lt;/a&gt;&lt;br&gt;
Copy the same and update the &lt;em&gt;local.settings.json&lt;/em&gt; connection string in Visual Studio Code.&lt;br&gt;
Run the application and open it in a browser, add new events and confirm that they persisted in Azure Cosmos DB by viewing them in Azure Cosmos DB data explorer.&lt;/p&gt;

&lt;h3&gt;
  
  
  The CI / CD Magic
&lt;/h3&gt;

&lt;p&gt;The workflow file in the &lt;em&gt;.github/workflows&lt;/em&gt; folder needs to be updated to specify the location of the API code.&lt;br&gt;
Search for &lt;code&gt;api_location&lt;/code&gt; in the &lt;em&gt;Repository/Build Configurations&lt;/em&gt; section and update the value to &lt;code&gt;/api&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1a0291pguoektr1re1b9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1a0291pguoektr1re1b9.png" alt="Update workflow file"&gt;&lt;/a&gt;&lt;br&gt;
The Cosmos DB connection string was configured in the &lt;em&gt;local.settings.json&lt;/em&gt; file for local development. But this file is not available in the production environment.&lt;br&gt;
To configure the connection string in production, go to the &lt;em&gt;mycalendar&lt;/em&gt; Static Web App page in the Azure Portal and select &lt;em&gt;Configuration&lt;/em&gt; from the left-hand bar. Click &lt;em&gt;Add&lt;/em&gt;, and in the next screen add &lt;em&gt;CosmosDBConnection&lt;/em&gt; as the Name and the Azure Cosmos DB connection string as the Value. Click &lt;em&gt;OK&lt;/em&gt; and then &lt;em&gt;Save&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1mtf92gp0mq1229hjwg2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1mtf92gp0mq1229hjwg2.png" alt="Update Application Setting - Cosmos DB Connectionstring"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the Visual Studio Code terminal, open a command prompt, and from the &lt;em&gt;mycalendar&lt;/em&gt; folder run the following commands to commit all code changes and push them to GitHub.&lt;/p&gt;

&lt;h6&gt;
  
  
  Though in this article / post this step is done at the very end, in the application development lifecycle it ought to be done after each small change (after interface design, after authentication implementation, after API addition, etc.).
&lt;/h6&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

git add -A 
git status
git commit -m "Final Code with GET, POST, Auth and Azure Cosmos DB"
git push -u origin main


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnxlacta0v5fdvpjcvwye.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnxlacta0v5fdvpjcvwye.png" alt="Push to Github"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And then the magic happens in the &lt;em&gt;Actions&lt;/em&gt; tab of the GitHub repository.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpwuexepk0aj9mnkuob2b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpwuexepk0aj9mnkuob2b.png" alt="Github Actions"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h6&gt;
  
  
  I would strongly urge all to check the steps (&lt;em&gt;Build Azure&lt;/em&gt; and &lt;em&gt;Build and Deploy&lt;/em&gt;)  to understand all that happens automatically.
&lt;/h6&gt;

&lt;p&gt;Once this step completes, the application is built on Azure and is ready for users.&lt;/p&gt;

&lt;h3&gt;
  
  
  Run Application
&lt;/h3&gt;

&lt;p&gt;Copy the URL from the &lt;em&gt;mycalendar&lt;/em&gt; Static Web Apps page in the Azure Portal and paste it into a new browser window / tab.&lt;br&gt;
On clicking Login, the application no longer shows the mock screen and instead goes to GitHub to authenticate.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuvipw3m4kn2hv46tc4xu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuvipw3m4kn2hv46tc4xu.png" alt="Github Auth Screen"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then there is a consent screen from Microsoft, after which the login steps are complete.&lt;br&gt;
Add new events, confirm that they persisted (Azure Cosmos DB data explorer), and then open the page in another tab to check the loading of existing events.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7kkbgeq4f5nrwzb1vkit.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7kkbgeq4f5nrwzb1vkit.png" alt="Calendar Application"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Challenges
&lt;/h2&gt;

&lt;p&gt;Deploying a modern web application ain't easy.&lt;/p&gt;

&lt;p&gt;Even though vanilla JavaScript is used in this article / post (to ensure it is useful to developers with any framework skills, such as Angular, React, Svelte, etc.), in production use cases some framework would be used. That application would have to be built and bundles generated.&lt;/p&gt;

&lt;p&gt;While building the application, all routes must be carefully configured to ensure users do not receive 404 errors.&lt;/p&gt;

&lt;p&gt;APIs would have to be built, and it is quite possible that different APIs are built using different technologies / languages.&lt;/p&gt;

&lt;p&gt;User authentication will have to be implemented as well as APIs secured.&lt;/p&gt;

&lt;p&gt;So, multiple servers or even a cluster of servers would have to be set up (to ensure reliability and availability) to host these bundles and APIs, and if the bundles and APIs are hosted on different servers, CORS would have to be implemented or a reverse proxy configured. A global CDN for the frontend bundles would also be needed.&lt;/p&gt;

&lt;p&gt;SSL would have to be configured, along with the ability to add custom domains.&lt;/p&gt;

&lt;p&gt;When maintainability of the application over time is taken into account, a staging environment (similar to production) and an automated build process are necessary as well.&lt;/p&gt;

&lt;p&gt;Earlier, even teams using cloud services had to do all of this themselves.&lt;/p&gt;

&lt;p&gt;Azure Static Web Apps takes care of all the above challenges and ensures that the development team can focus on business requirements.&lt;/p&gt;
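
&lt;p&gt;As an example of the routing concern above: with Azure Static Web Apps, a single-page application's fallback route can be declared in a &lt;code&gt;staticwebapp.config.json&lt;/code&gt; file at the root of the app (a minimal sketch; the excluded paths are placeholders to adjust for your own app):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

{
    "navigationFallback": {
        "rewrite": "/index.html",
        "exclude": ["/images/*.{png,jpg,gif}", "/css/*"]
    }
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;With this in place, unknown client-side routes are rewritten to &lt;code&gt;index.html&lt;/code&gt; instead of returning 404 errors.&lt;/p&gt;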

&lt;h2&gt;
  
  
  Business Benefit
&lt;/h2&gt;

&lt;p&gt;The application developed can be used as a base for any scheduling application. More broadly, any application built along the lines of this article / post using serverless technologies like Static Web Apps and Cosmos DB benefits from no server management, charges for consumed storage only, better scalability, lower latency, quick updates, IDE support (VS Code and the SWA CLI), and more, all of which lead to accelerated innovation.&lt;/p&gt;

&lt;h2&gt;
  
  
  Final Notes
&lt;/h2&gt;

&lt;p&gt;Serverless on Azure - &lt;a href="https://azure.microsoft.com/en-in/solutions/serverless/" rel="noopener noreferrer"&gt;https://azure.microsoft.com/en-in/solutions/serverless/&lt;/a&gt;&lt;br&gt;
Azure Static Web Apps - &lt;a href="https://azure.microsoft.com/en-gb/products/app-service/static" rel="noopener noreferrer"&gt;https://azure.microsoft.com/en-gb/products/app-service/static&lt;/a&gt;&lt;br&gt;
Azure Cosmos DB - &lt;a href="https://azure.microsoft.com/en-in/products/cosmos-db/" rel="noopener noreferrer"&gt;https://azure.microsoft.com/en-in/products/cosmos-db/&lt;/a&gt;&lt;br&gt;
SWA Templates - &lt;a href="https://github.com/staticwebdev" rel="noopener noreferrer"&gt;https://github.com/staticwebdev&lt;/a&gt;&lt;br&gt;
DayPilot Lite library - &lt;a href="https://javascript.daypilot.org/download/" rel="noopener noreferrer"&gt;https://javascript.daypilot.org/download/&lt;/a&gt;&lt;br&gt;
Source Code Repository - &lt;a href="https://github.com/c-arnab/mycalendar" rel="noopener noreferrer"&gt;https://github.com/c-arnab/mycalendar&lt;/a&gt;&lt;/p&gt;

</description>
      <category>azure</category>
      <category>azurefunctions</category>
      <category>staticwebapps</category>
      <category>cosmosdb</category>
    </item>
    <item>
      <title>Real time analytics using .Net &amp; Redis</title>
      <dc:creator>c-arnab</dc:creator>
      <pubDate>Tue, 26 Jul 2022 11:43:00 +0000</pubDate>
      <link>https://dev.to/c_arnab/real-time-analytics-using-net-redis-4c5d</link>
      <guid>https://dev.to/c_arnab/real-time-analytics-using-net-redis-4c5d</guid>
      <description>&lt;h4&gt;
  
  
  &lt;em&gt;In this post, we look at RedisTimeSeries module and build a custom web analytic tool with Redis &amp;amp; .NET 5 using C#&lt;/em&gt;
&lt;/h4&gt;

&lt;h2&gt;
  
  
  Application architecture Overview
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Why use RedisTimeSeries module
&lt;/h3&gt;

&lt;p&gt;Redis has long been used for analytics applications, and a search on platforms like GitHub will surface multiple projects along similar lines.&lt;br&gt;
The last couple of years have seen Redis extended with modern data models and data processing tools (Document, Graph, Search, and Time Series).&lt;br&gt;
These extensions make developing certain kinds of applications, including near real-time analytics applications, much easier and simpler.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;a href="https://redis.io/docs/stack/timeseries/" rel="noopener noreferrer"&gt;RedisTimeSeries&lt;/a&gt; &lt;em&gt;is a Redis module that adds a time series data structure to Redis with features such as...&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;em&gt;High volume inserts, low latency reads&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Query by start time and end-time&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Aggregated queries (min, max, avg, sum, range, count, first, last, STD.P, STD.S, Var.P, Var.S, twa) for any time bucket&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Configurable maximum retention period&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Downsampling / compaction for automatically updated aggregated timeseries&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Secondary indexing for time series entries. Each time series has labels (field value pairs) which will allows to query by labels&lt;/em&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;

&lt;p&gt;You will see most of these features in the demo application being developed.&lt;br&gt;
I strongly advise studying the &lt;a href="https://redis.io/commands/?group=timeseries" rel="noopener noreferrer"&gt;Commands&lt;/a&gt; available, which can be run from the Redis CLI. All these commands map to the methods available in C# via &lt;a href="https://github.com/RedisTimeSeries/NRedisTimeSeries/" rel="noopener noreferrer"&gt;&lt;code&gt;NRedisTimeSeries&lt;/code&gt;&lt;/a&gt;, the .Net client for RedisTimeSeries.&lt;/p&gt;

&lt;p&gt;At a basic level, one creates a new time series using &lt;code&gt;TS.CREATE&lt;/code&gt; and provides the retention period for each data point that goes into this time series.&lt;br&gt;
You might want to delete the key prior to using &lt;code&gt;TS.CREATE&lt;/code&gt; to ensure you do not receive an error in case the key already exists.&lt;/p&gt;

&lt;p&gt;Append a sample (data point) to the time series using &lt;code&gt;TS.ADD&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Access the last value using &lt;code&gt;TS.GET&lt;/code&gt; or do aggregated queries using different methods.&lt;/p&gt;
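
&lt;p&gt;As a quick illustration, the basic flow can be tried from the Redis CLI (the key name &lt;code&gt;ts_demo&lt;/code&gt; here is just a placeholder):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

DEL ts_demo
TS.CREATE ts_demo RETENTION 600000
TS.ADD ts_demo * 1
TS.ADD ts_demo * 1
TS.GET ts_demo
TS.RANGE ts_demo - + AGGREGATION sum 60000


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;code&gt;RETENTION 600000&lt;/code&gt; keeps each sample for 10 minutes, &lt;code&gt;*&lt;/code&gt; uses the current server timestamp for the sample, and the &lt;code&gt;TS.RANGE&lt;/code&gt; query sums the sample values in 60-second buckets.&lt;/p&gt;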

&lt;h3&gt;
  
  
  Application architecture
&lt;/h3&gt;

&lt;p&gt;In web analytics applications, at a high level, there are trackers / loggers, either JavaScript (in the browser), C#/Java/Python/NodeJS (server-side) or mobile app SDKs, which generate data. This data passes to a messaging layer such as Kafka, Pulsar or even Redis and is consumed by applications that collect, validate, enrich and finally store it in data stores where it can be analyzed.&lt;/p&gt;

&lt;h6&gt;
  
  
  If one is developing an IoT analytics application, the process would be similar, but the data would in this case be generated by various sensors.
&lt;/h6&gt;

&lt;p&gt;These trackers generate data from user events and send many event attributes to the collector per event. For the demo application, to keep things simple, mock data is generated for a very limited number of attributes.&lt;/p&gt;

&lt;p&gt;I call this application Redmetrix.&lt;br&gt;&lt;br&gt;
For the demo, this mock data is ingested by a .Net console application, Redmetrix Processor, which saves the mock data in Redis (in a time series data structure).&lt;br&gt;
The log files are in a format where every line is valid JSON.&lt;br&gt;
The processor reads the files line by line and adds the data to Redis.&lt;/p&gt;

&lt;h6&gt;
  
  
  In real-world applications, one would preferably have this logic in an Azure Function / AWS Lambda, which would scale up / down as needed.
&lt;/h6&gt;

&lt;p&gt;Finally, we have an ASP.NET Core 5.0 web application using Razor Pages, Redmetrix Webapp, which reads this data from Redis and displays charts in a dashboard.&lt;/p&gt;

&lt;h6&gt;
  
  
  Though in this article we fetch the data in the page's code-behind file, in real-world scenarios one would have a background worker that calls a service at specific intervals to get the necessary data from Redis and sends it to the clients via SignalR, using WebSockets or server-sent events.
&lt;/h6&gt;

&lt;h3&gt;
  
  
  Charts in Dashboard
&lt;/h3&gt;

&lt;p&gt;The final article and the corresponding code in the &lt;a href="https://github.com/c-arnab/Redmetrix" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt; will have the following charts:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Total PageViews (in last min) &amp;amp; comparison with a min prior&lt;/li&gt;
&lt;li&gt;Total Transactions&lt;/li&gt;
&lt;li&gt;Total Order Value (in Rupees)&lt;/li&gt;
&lt;li&gt;Page performance for Home page and Product page (in last min)&lt;/li&gt;
&lt;li&gt;Conversion Funnel (in last min)&lt;/li&gt;
&lt;li&gt;Orders by Payment Method (in last min)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Demo Infrastructure
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Create Database on Redis Cloud
&lt;/h3&gt;

&lt;p&gt;Sign up at &lt;a href="https://redis.com/try-free/" rel="noopener noreferrer"&gt;Redis Cloud&lt;/a&gt; and / or Sign In&lt;/p&gt;

&lt;p&gt;Select the New Database button.&lt;/p&gt;

&lt;p&gt;Name the new database &lt;code&gt;redmetrix&lt;/code&gt; and select the &lt;code&gt;Redis Stack&lt;/code&gt; radio button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgktbdsowmsji59t1p33y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgktbdsowmsji59t1p33y.png" alt="New DB"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the ensuing page showing the database configuration, copy the &lt;code&gt;public endpoint&lt;/code&gt; and &lt;code&gt;password&lt;/code&gt; from the &lt;code&gt;General&lt;/code&gt; and &lt;code&gt;Security&lt;/code&gt; sections and keep them safe.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcknl6puucmil6qlxumfo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcknl6puucmil6qlxumfo.png" alt="Public endpoint"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fayydzv1bv1qhm18fksid.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fayydzv1bv1qhm18fksid.png" alt="Password"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Generate Mock Data
&lt;/h3&gt;

&lt;p&gt;One can use applications like Mockaroo to create dummy data.&lt;br&gt;
Check out &lt;a href="https://www.mockaroo.com/7d448f50" rel="noopener noreferrer"&gt;https://www.mockaroo.com/7d448f50&lt;/a&gt; to see the schema used here.&lt;/p&gt;

&lt;p&gt;The attributes used are &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;application.app_id - Application Id; can be a string or even a number&lt;/li&gt;
&lt;li&gt;application.platform - Denotes whether a website or mobile app&lt;/li&gt;
&lt;li&gt;date_time.collector_tstamp_dt - DateTime when the event reached the messaging layer or was enriched&lt;/li&gt;
&lt;li&gt;event_detail.event_type - Page View, Form Change/Submit or Ecommerce Transaction event&lt;/li&gt;
&lt;li&gt;event_detail.event_id - Event Id (Guid)&lt;/li&gt;
&lt;li&gt;contexts.page.type - Home/Category/Product, etc.&lt;/li&gt;
&lt;li&gt;contexts.performance_timing.domContentLoadedEventEnd - Page load end timing&lt;/li&gt;
&lt;li&gt;contexts.performance_timing.requestStart - Page load start timing&lt;/li&gt;
&lt;li&gt;contexts.transaction.amount_to_pay - Transaction Amount&lt;/li&gt;
&lt;li&gt;contexts.transaction.payment_method - Payment Method&lt;/li&gt;
&lt;/ol&gt;

&lt;h6&gt;
  
  
  For simplicity, all event types are Page Views, but in real-world applications the lion's share of them are not going to be Page Views.
&lt;/h6&gt;

&lt;h6&gt;
  
  
  Also, Amount to Pay and Payment Method should be part of the event if and only if the Page type is 'Success Confirmation'. But in the dummy data you will find these fields present in all events.
&lt;/h6&gt;

&lt;h6&gt;
  
  
  The date field is randomly generated, but in the real world it is going to be sequential. As we are not going to do anything directly with this field, this will not create any problems.
&lt;/h6&gt;

&lt;p&gt;After downloading the mock data a couple of times, rename the files to &lt;code&gt;.txt&lt;/code&gt; and save them in a folder: although each line has a JSON structure, the whole file does not (the file is not an array of JSON objects).&lt;/p&gt;
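
&lt;p&gt;For reference, each line of the mock data is a standalone JSON object along these lines (the values below are illustrative, not actual Mockaroo output):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

{"application": {"app_id": "redmetrix-web", "platform": "web"}, "date_time": {"collector_tstamp_dt": "2022-07-01 10:15:42"}, "event_detail": {"event_type": "Page view", "event_id": "1b4e28ba-2fa1-11d2-883f-b9a761bde3fb"}, "contexts": {"page": {"type": "Home"}, "performance_timing": {"domContentLoadedEventEnd": 1850, "requestStart": 120}, "transaction": {"amount_to_pay": 1500, "payment_method": "Card"}}}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;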

&lt;h2&gt;
  
  
  Developing Redmetrix
&lt;/h2&gt;

&lt;p&gt;Visual Studio Code will be used for development in this article. &lt;/p&gt;

&lt;h3&gt;
  
  
  Processor
&lt;/h3&gt;

&lt;p&gt;Create a base Console Application in the Terminal&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

dotnet new console -n "RedMetrixProcessor"


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Move inside the RedMetrixProcessor directory&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

cd RedMetrixProcessor


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Build and Run the application to ensure everything is working fine.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

dotnet build
dotnet run


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Open the folder &lt;code&gt;RedMetrixProcessor&lt;/code&gt; in VSCode and add a New file named &lt;code&gt;appsettings.json&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Now copy the saved public endpoint and password from the database configuration into the file, as shown in the code block.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

{
    "RedisOptions":{
        "EndPoints":"redis-14475.c280.us-central1-2.gce.cloud.redislabs.com:14475",
        "Password":"BVDwMO6Qj1tnDZalcxmfdH2mL1c1G5iA"
    }
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The next step is to install the required NuGet packages.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

dotnet add package Microsoft.Extensions.Hosting
dotnet add package Microsoft.Extensions.Configuration.Json


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Normally, console applications are not visually appealing. So, I will use the &lt;code&gt;Spectre.Console&lt;/code&gt; package to make this one colorful.&lt;/p&gt;

&lt;h6&gt;
  
  
  I'll not be getting into the features or the code parts of Spectre in this article, but the code is available on GitHub. Do check it out.
&lt;/h6&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

dotnet add package Spectre.Console


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Finally, the packages required for Redis and RedisTimeSeries&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

dotnet add package StackExchange.Redis
dotnet add package NRedisTimeSeries


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The above steps will add the code block below to your &lt;code&gt;RedMetrixProcessor.csproj&lt;/code&gt; file.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

&amp;lt;ItemGroup&amp;gt;
    &amp;lt;PackageReference Include="Microsoft.Extensions.Configuration.Json" Version="6.0.0" /&amp;gt;
    &amp;lt;PackageReference Include="Microsoft.Extensions.Hosting" Version="6.0.1" /&amp;gt;
    &amp;lt;PackageReference Include="NRedisTimeSeries" Version="1.4.0" /&amp;gt;
    &amp;lt;PackageReference Include="Spectre.Console" Version="0.44.0" /&amp;gt;
    &amp;lt;PackageReference Include="StackExchange.Redis" Version="2.6.48" /&amp;gt;
  &amp;lt;/ItemGroup&amp;gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Add the following code to ensure that &lt;code&gt;appsettings.json&lt;/code&gt; gets copied during build/run.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

  &amp;lt;ItemGroup&amp;gt;
   &amp;lt;None Update="appsettings.json"&amp;gt;
     &amp;lt;CopyToOutputDirectory&amp;gt;PreserveNewest&amp;lt;/CopyToOutputDirectory&amp;gt;
   &amp;lt;/None&amp;gt;
  &amp;lt;/ItemGroup&amp;gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Update the &lt;code&gt;Program.cs&lt;/code&gt; code to allow access to &lt;code&gt;appsettings.json&lt;/code&gt; and connect to Redis DB.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

using System;
using System.IO;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Spectre.Console;
using NRedisTimeSeries;
using NRedisTimeSeries.DataTypes;
using StackExchange.Redis;
namespace RedMetrixProcessor
{
    class Program
    {
        public static IConfigurationRoot configuration;
        static void Main(string[] args)
        {
            ServiceCollection serviceDescriptors = new ServiceCollection();

            ConfigureServices(serviceDescriptors);
            IServiceProvider serviceProvider = serviceDescriptors.BuildServiceProvider();

            var options = new ConfigurationOptions
                {
                    EndPoints = { configuration.GetSection("RedisOptions:EndPoints").Value },
                    Password = configuration.GetSection("RedisOptions:Password").Value,
                    Ssl = false
                };
            // Multiplexer is intended to be reused
            ConnectionMultiplexer redisMultiplexer = ConnectionMultiplexer.Connect(options);

            // The database reference is a lightweight passthrough object intended to be used and discarded
            IDatabase db = redisMultiplexer.GetDatabase();
            Console.WriteLine("Hello World!");
        }
        private static void ConfigureServices(IServiceCollection serviceCollection)
        {
            configuration = new ConfigurationBuilder()
                .SetBasePath(Directory.GetParent(AppContext.BaseDirectory).FullName)
                .AddJsonFile("appsettings.json")
                .Build();

            serviceCollection.AddSingleton&amp;lt;IConfigurationRoot&amp;gt;(configuration);


        }
    }
}




&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Create a new file &lt;code&gt;ConfigureInitializationServices.cs&lt;/code&gt; with two methods: &lt;code&gt;DeleteKeys&lt;/code&gt;, which deletes all data from the database (if any data is present), and &lt;code&gt;InitializeTimeSeriesTotalPageViews&lt;/code&gt;, which creates a time series for page views with a retention period of 10 minutes for each data point / sample.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

using System;
using System.Collections.Generic;
using Microsoft.Extensions.Configuration;
using Spectre.Console;
using NRedisTimeSeries;
using NRedisTimeSeries.DataTypes;
using NRedisTimeSeries.Commands;
using StackExchange.Redis;
namespace RedMetrixProcessor
{
    public class ConfigureInitializationServices
    {  
        private  readonly IConfigurationRoot _config;

        public ConfigureInitializationServices(IConfigurationRoot config)
        {
            _config = config;
        }

        public void DeleteKeys(IDatabase db)
        {         


                db.KeyDelete("ts_pv:t"); //TimeSeries-PageView-Total

        } 


        public void InitializeTimeSeriesTotalPageViews(IDatabase db)
        {

                db.TimeSeriesCreate("ts_pv:t", retentionTime: 600000);


        }

    }
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h6&gt;
  
  
  The methods in the GitHub code repository are a lot more extensive, with &lt;code&gt;try catch&lt;/code&gt; blocks and &lt;code&gt;Spectre.Console&lt;/code&gt; code.
&lt;/h6&gt;

&lt;p&gt;Back in &lt;code&gt;Program.cs&lt;/code&gt;, update the &lt;code&gt;ConfigureServices&lt;/code&gt; method to ensure the methods in &lt;code&gt;ConfigureInitializationServices&lt;/code&gt; can be accessed from the &lt;code&gt;Main&lt;/code&gt; method.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

using System;
using System.IO;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Spectre.Console;
using NRedisTimeSeries;
using NRedisTimeSeries.DataTypes;
using StackExchange.Redis;
namespace RedMetrixProcessor
{
    class Program
    {
        public static IConfigurationRoot configuration;
        static void Main(string[] args)
        {
            ServiceCollection serviceDescriptors = new ServiceCollection();

            ConfigureServices(serviceDescriptors);
            IServiceProvider serviceProvider = serviceDescriptors.BuildServiceProvider();

            var options = new ConfigurationOptions
                {
                    EndPoints = { configuration.GetSection("RedisOptions:EndPoints").Value },
                    Password = configuration.GetSection("RedisOptions:Password").Value,
                    Ssl = false
                };
            // Multiplexer is intended to be reused
            ConnectionMultiplexer redisMultiplexer = ConnectionMultiplexer.Connect(options);

            // The database reference is a lightweight passthrough object intended to be used and discarded
            IDatabase db = redisMultiplexer.GetDatabase();

            AnsiConsole.Write(new FigletText("RedMetrix").LeftAligned().Color(Color.Red));
            AnsiConsole.Write(new Markup("[bold red]Copyright(C)[/] [teal]2021 Arnab Choudhuri - Xanadu[/]"));
            Console.WriteLine("");
            var rule = new Rule("[red]Welcome to RedMetrix[/]");
            AnsiConsole.Write(rule);

            var selectedoption = AnsiConsole.Prompt(
                                    new SelectionPrompt&amp;lt;string&amp;gt;()
                                        .Title("[bold yellow]Initialize Application[/] [red]OR[/] [green]Process Data[/]?")
                                        .PageSize(5)
                                        .AddChoices(new[]
                                        {
                                            "Initialize Application", "Process Data", "Exit"
                                        }));
            if (selectedoption.ToString()=="Exit")
            {
                return;
            }else{
                if (!AnsiConsole.Confirm(selectedoption.ToString()))
                {
                    return;
                }
                else{
                     if (selectedoption.ToString()=="Initialize Application")
                     {
                        serviceProvider.GetService&amp;lt;ConfigureInitializationServices&amp;gt;().DeleteKeys(db);
                        serviceProvider.GetService&amp;lt;ConfigureInitializationServices&amp;gt;().InitializeTimeSeriesTotalPageViews(db);

                     }

                }
            }
            redisMultiplexer.Close();
            AnsiConsole.Write(new Markup("[bold yellow]Press any key to [/] [red]Exit![/]"));
            Console.ReadKey(false);
        }
        private static void ConfigureServices(IServiceCollection serviceCollection)
        {
            configuration = new ConfigurationBuilder()
                .SetBasePath(Directory.GetParent(AppContext.BaseDirectory).FullName)
                .AddJsonFile("appsettings.json")
                .Build();

            serviceCollection.AddSingleton&amp;lt;IConfigurationRoot&amp;gt;(configuration);
            serviceCollection.AddTransient&amp;lt;ConfigureInitializationServices&amp;gt;();


        }
    }
}




&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The application must be able to access the mock data files downloaded earlier on the development machine where this console application will run. To ensure that, it's time to add the folder location for the mock data files in &lt;code&gt;appsettings.json&lt;/code&gt;.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

{
    "RedisOptions":{
        "EndPoints":"redis-14475.c280.us-central1-2.gce.cloud.redislabs.com:14475",
        "Password":"BVDwMO6Qj1tnDZalcxmfdH2mL1c1G5iA"
    },
    "FolderPath":"D:\\redis\\article\\data"
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Create a new file &lt;code&gt;DataServices.cs&lt;/code&gt;, which will traverse the folder for mock data file/s, read each line, and, if the event type is 'Page view', add a data point with the current timestamp and value 1 (one). Do note the &lt;code&gt;TSPageViews&lt;/code&gt; method.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

using System;
using System.IO;
using System.Text;
using System.Text.Json;
using Microsoft.Extensions.Configuration;
using Spectre.Console;
using NRedisTimeSeries;
using StackExchange.Redis;
namespace RedMetrixProcessor
{
    public class DataServices
    {
        private  readonly IConfigurationRoot _config;

        public DataServices(IConfigurationRoot config)
        {
            _config = config;
        }

        public void ProcessData(IDatabase db)
        {
            string folderPath = _config.GetSection("FolderPath").Value;
            DirectoryInfo startDir = new DirectoryInfo(folderPath);
            var files = startDir.EnumerateFiles();
            AnsiConsole.Status()
                .AutoRefresh(true)
                .Spinner(Spinner.Known.Default)
                .Start("[yellow]Initializing ...[/]", ctx =&amp;gt;
                {
                    foreach (var file in files)
                    {
                        ctx.Status("[bold blue]Started Processing..[/]");
                        HandleFile(file,db);
                        AnsiConsole.MarkupLine($"[grey]LOG:[/] Done[grey]...[/]");
                    }
                });

            // Done
            AnsiConsole.MarkupLine("[bold green]Finished[/]");
        }

        private static void HandleFile(FileInfo file,IDatabase db)
        {

               Console.WriteLine(file.FullName);

              using var fs = new FileStream(file.FullName, FileMode.Open, FileAccess.Read);
              using var sr = new StreamReader(fs, Encoding.UTF8);
               // int count = 0;
                //int total =0;

                string line = String.Empty;

                while ((line = sr.ReadLine()) != null)
                {
                    try{

                        using JsonDocument doc = JsonDocument.Parse(line);
                        if(doc.RootElement.GetProperty("event_detail").GetProperty("event_type").GetString()=="Page view")
                        {
                                //total++;
                                TSPageViews(db);


                        }


                    }catch(Exception ex){
                       AnsiConsole.WriteException(ex);
                     Console.WriteLine(line); 

                    }finally{

                    }

                }
           // Console.WriteLine($"{total}");
        }   

        private static void TSPageViews( IDatabase db)
        {
            db.TimeSeriesAdd("ts_pv:t", "*", 1);
        }

    }
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Once again in &lt;code&gt;Program.cs&lt;/code&gt;, update the &lt;code&gt;ConfigureServices&lt;/code&gt; method to ensure the methods in &lt;code&gt;DataServices&lt;/code&gt; can be accessed from the &lt;code&gt;Main&lt;/code&gt; method.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

serviceCollection.AddTransient&amp;lt;ConfigureInitializationServices&amp;gt;();
serviceCollection.AddTransient&amp;lt;DataServices&amp;gt;();


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

if (selectedoption.ToString()=="Initialize Application")
    {
        serviceProvider.GetService&amp;lt;ConfigureInitializationServices&amp;gt;().DeleteKeys(db);
        serviceProvider.GetService&amp;lt;ConfigureInitializationServices&amp;gt;().InitializeTimeSeriesTotalPageViews(db);

    }
if (selectedoption.ToString()=="Process Data")
    {
        serviceProvider.GetService&amp;lt;DataServices&amp;gt;().ProcessData(db);
    }           


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now try out what has been built by building and then running the application.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

dotnet build
dotnet run


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Select &lt;code&gt;Initialize Application&lt;/code&gt; from the choices and confirm the same.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frwr34chwm4qgtbn4vx4e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frwr34chwm4qgtbn4vx4e.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now check whether the application did what it was supposed to do, using RedisInsight.&lt;/p&gt;

&lt;p&gt;Install and Run RedisInsight&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6g9kf5ja35j3fqe2b6pq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6g9kf5ja35j3fqe2b6pq.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select the option of &lt;code&gt;Add Database manually&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzxd57h5vzjtlf8j1k37s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzxd57h5vzjtlf8j1k37s.png" alt="Image description"&gt;&lt;/a&gt;&lt;br&gt;
Redis Database gets listed in Redis Insight&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8aufusp2cn0j14j1dqaz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8aufusp2cn0j14j1dqaz.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select the &lt;code&gt;Database Alias&lt;/code&gt; value, redmetrix, to view the key just added.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F250225uvdtxuyp3epkpg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F250225uvdtxuyp3epkpg.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Run the application once again, and this time select &lt;code&gt;Process Data&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhf79awkqxbx9is459t2u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhf79awkqxbx9is459t2u.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Confirm that the mock data files have been processed by running the query below, which checks the number of data points/samples in the database.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

TS.RANGE ts_pv:t - + AGGREGATION sum 6000000


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Later, the same query will be run via the .NET client in the web application.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxgocs2qvvryiep9p5jv3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxgocs2qvvryiep9p5jv3.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Web Application
&lt;/h3&gt;

&lt;p&gt;Create a base Razor Pages web application from the terminal.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

dotnet new webapp -o RedMetrixWebApp


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Move into the directory (&lt;code&gt;cd RedMetrixWebApp&lt;/code&gt;), then &lt;code&gt;dotnet build&lt;/code&gt; and &lt;code&gt;dotnet run&lt;/code&gt; to confirm everything works.&lt;/p&gt;

&lt;p&gt;In &lt;code&gt;appsettings.json&lt;/code&gt;, add the &lt;code&gt;RedisOptions&lt;/code&gt; section exactly as it is in RedMetrixProcessor.&lt;/p&gt;

&lt;p&gt;Add the required NuGet packages.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

dotnet add package StackExchange.Redis.Extensions.AspNetCore
dotnet add package StackExchange.Redis.Extensions.System.Text.Json
dotnet add package NRedisTimeSeries


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Update the &lt;code&gt;Program.cs&lt;/code&gt; code to allow access to &lt;code&gt;appsettings.json&lt;/code&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

namespace RedMetrixWebApp
{
    public class Program
    {
        private static IConfigurationRoot Configuration { get; set; }
        public static void Main(string[] args)
        {
            CreateHostBuilder(args).Build().Run();
        }

        public static IHostBuilder CreateHostBuilder(string[] args) =&amp;gt;
            Host.CreateDefaultBuilder(args)
                .ConfigureAppConfiguration((hostContext, config) =&amp;gt;
               {
                   config.AddJsonFile("appsettings.json");
               })
                .ConfigureWebHostDefaults(webBuilder =&amp;gt;
                {
                    webBuilder.UseStartup&amp;lt;Startup&amp;gt;();
                    webBuilder.UseUrls("http://localhost:5003");
                });


    }
}



&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the &lt;code&gt;Startup.cs&lt;/code&gt; file, comment out the line &lt;code&gt;app.UseHttpsRedirection();&lt;/code&gt;, as HTTPS is not going to be used.&lt;/p&gt;

&lt;p&gt;Now &lt;a href="https://github.com/redis-developer/nredi2read-preview/blob/main/Providers/RedisProvider.cs" rel="noopener noreferrer"&gt;RedisProvider&lt;/a&gt; is added to the solution in the folder &lt;code&gt;Provider&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Update the &lt;code&gt;ConfigureServices&lt;/code&gt; method in &lt;code&gt;Startup.cs&lt;/code&gt; and add the line&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

services.AddSingleton&amp;lt;RedisProvider&amp;gt;();


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h6&gt;
  
  
  Do remember to add the namespace of &lt;code&gt;RedisProvider&lt;/code&gt; to &lt;code&gt;Startup.cs&lt;/code&gt;
&lt;/h6&gt;

&lt;p&gt;Next, it's time to get data for the PageViews widget.&lt;br&gt;
This will show total page views (in the last minute) and a comparison with the minute prior, so two values.&lt;/p&gt;

&lt;p&gt;So, we need a model to hold two values.&lt;br&gt;
Create a new folder named &lt;code&gt;Model&lt;/code&gt; and, in it, create a file &lt;code&gt;RealTimeChart.cs&lt;/code&gt;.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

using System;
using System.Collections.Generic;
namespace RedMetrixWebApp
{
    public class RealtimeChart
    {
        public PageView PageViews { get; set; }

        public DateTime Updated { get; set; }
    }

    public class PageView   
    {
        public PageView(long pv, long prev_pv)
        {
            this.pv=pv;
            this.prev_pv=prev_pv;
        }
        public long pv { get; set; }
        public long prev_pv{ get; set; }
    }
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Next, create a new folder named &lt;code&gt;Services&lt;/code&gt; and, in it, create a file &lt;code&gt;RealTimeChartService.cs&lt;/code&gt;.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

using System;
using System.Collections.Generic;
using System.Text.Json;
using System.Linq;
using dotnetredis.Providers;
using NRedisTimeSeries;
using NRedisTimeSeries.Commands;
using NRedisTimeSeries.DataTypes;
using StackExchange.Redis;
using Microsoft.Extensions.Logging;


namespace RedMetrixWebApp
{
    public class RealtimeChartService
    {
        private readonly ILogger&amp;lt;RealtimeChartService&amp;gt; _logger;
        private readonly RedisProvider _redisProvider;

        public RealtimeChartService(ILogger&amp;lt;RealtimeChartService&amp;gt; logger, RedisProvider redisProvider)
        {
            _logger = logger;
            _redisProvider = redisProvider;
        }

        public RealtimeChart GetChartData()
        {
            return new RealtimeChart()
            {
                PageViews=GetPageViews(),
                Updated= DateTime.Now.ToLocalTime()

            };
        }


        public PageView GetPageViews()
        {
            PageView views = new PageView(0,0);
            try{
                var db = _redisProvider.Database();
                IReadOnlyList&amp;lt;TimeSeriesTuple&amp;gt; results =  db.TimeSeriesRevRange("ts_pv:t", "-", "+", aggregation: TsAggregation.Sum, timeBucket: 60000, count:2); 

              views= new PageView(Convert.ToInt64(results[0].Val), Convert.ToInt64(results[1].Val));

            }catch(Exception ex){
                _logger.LogError(ex.Message);
            }
            return views;
        }

    }    

}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h6&gt;
  
  
  Check the method &lt;code&gt;GetPageViews&lt;/code&gt; to understand how the .NET API method &lt;code&gt;TimeSeriesRevRange&lt;/code&gt; gets the data.
&lt;/h6&gt;

&lt;p&gt;Update the &lt;code&gt;ConfigureServices&lt;/code&gt; method in &lt;code&gt;Startup.cs&lt;/code&gt; and add the line&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

services.AddTransient&amp;lt;RealtimeChartService&amp;gt;();


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Finally, it's time to show the data received from the Redis database in the web pages.&lt;br&gt;
Update the &lt;code&gt;Index.cshtml.cs&lt;/code&gt; code-behind file to get data using &lt;code&gt;RealtimeChartService&lt;/code&gt;.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.RazorPages;
using Microsoft.Extensions.Logging;
using RedMetrixWebApp;

namespace RedMetrixWebApp.Pages
{
    public class IndexModel : PageModel
    {
        private readonly ILogger&amp;lt;IndexModel&amp;gt; _logger;
        private readonly RealtimeChartService _service;

        public RealtimeChart chartdata { get; set; }

        public IndexModel(ILogger&amp;lt;IndexModel&amp;gt; logger, RealtimeChartService service)
        {
            _logger = logger;
            _service=service;
        }

        public void OnGet()
        {
           chartdata  =  _service.GetChartData();
            _logger.LogInformation("Worker running at: {Time} ", chartdata.Updated);
        }

    }
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Finally, it's time to create the view.&lt;br&gt;
The data from the &lt;code&gt;.cshtml.cs&lt;/code&gt; file can be accessed via &lt;code&gt;Model&lt;/code&gt; in &lt;code&gt;.cshtml&lt;/code&gt; using &lt;code&gt;var chartData = @Html.Raw(Json.Serialize(Model.chartdata));&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;The dashboard layout is built with &lt;a href="https://github.com/gridstack/gridstack.js" rel="noopener noreferrer"&gt;gridstack.js&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The widgets are rendered with &lt;a href="https://github.com/handlebars-lang/handlebars.js" rel="noopener noreferrer"&gt;handlebars.js&lt;/a&gt;.&lt;/p&gt;

&lt;h6&gt;
  
  
  In the grid cell where the page views widget will go, note the div with id &lt;code&gt;wpageviews&lt;/code&gt;. This is where the Handlebars template output will be injected.
&lt;/h6&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

@page
@model IndexModel
@{Layout = null;
}
&amp;lt;!doctype html&amp;gt;
&amp;lt;html lang="en"&amp;gt;
&amp;lt;head&amp;gt;
    &amp;lt;meta charset="utf-8"&amp;gt;
    &amp;lt;meta name="viewport" content="width=device-width, initial-scale=1.0"&amp;gt;
    &amp;lt;meta name="description" content="RedMetrix - RedisTimeSeries ECOM Web analytics Dashboard"&amp;gt;
    &amp;lt;title&amp;gt;RedMetrix - A RedisTimeSeries Project&amp;lt;/title&amp;gt;
    &amp;lt;link rel="stylesheet" href="~/lib/bootstrap/dist/css/bootstrap.min.css" /&amp;gt;
    &amp;lt;link rel="stylesheet" href="~/css/gridstack.css" /&amp;gt;

    &amp;lt;style&amp;gt;
    body {
    background: #222222;
    color: #465665;
      }
    .grid-stack-item {
        border: 0;
      }
    .grid-stack-item-content {
        background-color: white;
        text-align: center;
      }
    #blue{
        background: #0000ff;
      }
    &amp;lt;/style&amp;gt;
&amp;lt;/head&amp;gt;
&amp;lt;body&amp;gt;

  &amp;lt;div class="grid-stack" id="simple-grid"&amp;gt;&amp;lt;/div&amp;gt;
  &amp;lt;script src="~/lib/jquery/dist/jquery.min.js"&amp;gt;&amp;lt;/script&amp;gt;
  &amp;lt;script src="~/lib/bootstrap/dist/js/bootstrap.bundle.min.js"&amp;gt;&amp;lt;/script&amp;gt;
  &amp;lt;script src="//cdn.jsdelivr.net/npm/handlebars@latest/dist/handlebars.js"&amp;gt;&amp;lt;/script&amp;gt;
  &amp;lt;script src="~/js/gridstack-jq.js" &amp;gt;&amp;lt;/script&amp;gt;
  &amp;lt;script type="text/javascript"&amp;gt;
    var simple = [
      {x: 0, y: 0, w: 4, h: 2, noMove: true, noResize: true, locked: true, id:'logo', bgcolor:'#9c4274',content: '&amp;lt;img src="widgets/logo/logo.png" alt="logo RedMetrix" style="width:100%;margin-top: auto;"&amp;gt; &amp;lt;h2 style="color:white;"&amp;gt;&amp;lt;strong&amp;gt; -- A RedisTimeSeries Project &amp;lt;/strong&amp;gt; &amp;lt;/h2&amp;gt;'},
      {x: 4, y: 0, w: 4, h: 6, noMove: true, noResize: true, locked: true, id:'pageperf', bgcolor:'#e83e8c',content: '&amp;lt;h5 style="color:whitesmoke"&amp;gt;Page Performance &amp;lt;br/&amp;gt; (in ms) &amp;lt;/h5&amp;gt; &amp;lt;div&amp;gt;&amp;lt;h6 style="color:white"&amp;gt;Home &amp;lt;/h6&amp;gt;&amp;lt;p/&amp;gt;&amp;lt;canvas id ="wpageperfh"&amp;gt;&amp;lt;/canvas&amp;gt; &amp;lt;/div&amp;gt;&amp;lt;p/&amp;gt;&amp;lt;div&amp;gt;&amp;lt;h6 style="color:white"&amp;gt;Product &amp;lt;/h6&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;canvas id ="wpageperfp"&amp;gt;&amp;lt;/canvas&amp;gt; &amp;lt;/div&amp;gt;'},
      {x: 8, y: 0, w: 2, h: 2,noMove: true, noResize: true, locked: true, id:'pageviews', bgcolor:'#12b0c5', content: '&amp;lt;div id ="wpageviews" style="width: 100%; height: 100%"&amp;gt;&amp;lt;/div&amp;gt;'},
      {x: 10, y: 0, w: 2, h: 2,noMove: true, noResize: true, locked: true, id:'totalorders', bgcolor:'#ff9618', content: '&amp;lt;div id ="wtotalorders" style="width: 100%; height: 100%"&amp;gt;&amp;lt;/div&amp;gt;'},
      {x: 0, y: 2, w: 4, h: 4, noMove: true, noResize: true, locked: true, id:'conversions', bgcolor:'#ec663c', content: '&amp;lt;h5 style="color:whitesmoke"&amp;gt;Conversions &amp;lt;/h5&amp;gt; &amp;lt;div&amp;gt;&amp;lt;canvas id ="wconversions"&amp;gt;&amp;lt;/canvas&amp;gt; &amp;lt;/div&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/p&amp;gt; &amp;lt;h5 style="color:whitesmoke"&amp;gt;Conversion Rate &amp;lt;/h5&amp;gt;&amp;lt;div id="wconversionrate" style="font-size: 400%; color:lightgreen;"&amp;gt;&amp;lt;/div&amp;gt;'},
      {x: 8, y: 2, w: 4, h: 1,noMove: true, noResize: true, locked: true, id:'ordervalue', bgcolor:'#dc5945', content: '&amp;lt;div id ="wordervalue" style="width: 100%; height: 100%"&amp;gt;&amp;lt;/div&amp;gt;'},
      {x: 8, y: 3, w: 4, h: 3,noMove: true, noResize: true, locked: true, id:'ordbypaymeth', bgcolor:'#47bbb3', content: '&amp;lt;h5 style="color:whitesmoke"&amp;gt;Orders By Payment Method &amp;lt;/h5&amp;gt; &amp;lt;div&amp;gt;&amp;lt;canvas id ="wordbypaymeth"&amp;gt;&amp;lt;/canvas&amp;gt; &amp;lt;/div&amp;gt;'}
    ];

    var simpleGrid = GridStack.init({

    }, '#simple-grid');
    simpleGrid.on('added', function(e, items) {
      items.forEach(function(item) { 
        item.el.firstElementChild.style.backgroundColor=item.bgcolor;
      });
    });
    simpleGrid.load(simple);
  &amp;lt;/script&amp;gt;

  &amp;lt;script&amp;gt;
  Handlebars.registerHelper('isdefined', function (value) {
    return value !== undefined;
    });
  Handlebars.registerHelper('isnumber', function (value) {
    return !isNaN(value);
    });
  Handlebars.registerHelper('showchange', function (value) {
    // show the change only when it is a defined, numeric, non-zero value
    return value !== undefined &amp;amp;&amp;amp; !isNaN(value) &amp;amp;&amp;amp; value !== 0;
    });

  &amp;lt;/script&amp;gt;
  &amp;lt;script id="pageview_template" type="text/x-handlebars-template"&amp;gt;
    &amp;lt;div class="entry"&amp;gt;
      &amp;lt;h5 style="color:whitesmoke"&amp;gt;{{pageview.title}}&amp;lt;/h5&amp;gt;
      &amp;lt;div class="pageview" style="width: 100%; height: 83%"&amp;gt;
        &amp;lt;div class= "views" style="font-size: 400%; color:#ff9618;"&amp;gt;{{pageview.views}}&amp;lt;/div&amp;gt;
        &amp;lt;div class= "views"&amp;gt;&amp;lt;h5 style="color:#ff9618"&amp;gt;Last value : {{pageview.last}}&amp;lt;/h5&amp;gt;&amp;lt;/div&amp;gt;
        &amp;lt;div class="change" style="color:white"&amp;gt;&amp;lt;h3&amp;gt;{{#if (showchange pageview.diff)}} &amp;lt;span&amp;gt;({{pageview.change}}) {{pageview.diff}}%&amp;lt;/span&amp;gt;{{/if}} &amp;lt;/h3&amp;gt;&amp;lt;/div&amp;gt;
      &amp;lt;/div&amp;gt;
    &amp;lt;/div&amp;gt;
  &amp;lt;/script&amp;gt;
  &amp;lt;script&amp;gt;
  var pageview_source = document.getElementById("pageview_template").innerHTML;
  var pageviewTemplate = Handlebars.compile(pageview_source);

  &amp;lt;/script&amp;gt; 
  &amp;lt;script&amp;gt;
    function percentage(n, total) {
    return Math.round(n * 100 / total);
  }
  function shortnumber(num) {
    if (isNaN(num)) {
      return num;
    }
    if (num &amp;gt;= 1000000000) {
      return (num / 1000000000).toFixed(1) + 'B';
    } else if (num &amp;gt;= 1000000) {
      return (num / 1000000).toFixed(1) + 'M';
    } else if (num &amp;gt;= 1000) {
      return (num / 1000).toFixed(1) + 'K';
    } else {
      return num;
    }
  };
  $(document).ready(function () {
  var chartData = @Html.Raw(Json.Serialize(Model.chartdata));
  console.log(chartData);
  var pageviews=0;
  var lastpv=0;
  var pageviewdiff=0;
  var pageviewchange="";
      if (!isNaN(chartData.pageViews.pv)){
            pageviews=chartData.pageViews.pv;
            if (!isNaN(chartData.pageViews.prev_pv)){
              lastpv=chartData.pageViews.prev_pv;
                var diff=0;
                if(chartData.pageViews.prev_pv&amp;gt;chartData.pageViews.pv){
                    diff=chartData.pageViews.prev_pv-chartData.pageViews.pv;
                    pageviewchange="-"
                }else{
                    diff=chartData.pageViews.pv-chartData.pageViews.prev_pv;
                    pageviewchange="+";
                }
                pageviewdiff=percentage(diff,pageviews);

              }
          }

        var context={
            pageview: {
            title:"Page Views",  
            views: shortnumber(pageviews),
            last:shortnumber(lastpv),
            change: pageviewchange,
            diff:pageviewdiff
            }
        };
        document.getElementById("wpageviews").innerHTML = pageviewTemplate(context);

  });      

  &amp;lt;/script&amp;gt;  
&amp;lt;/body&amp;gt;
&amp;lt;/html&amp;gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;And the result is&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi5l3wpxmjmwbk72df7jg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi5l3wpxmjmwbk72df7jg.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Rest of the Widgets
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Total number of Orders / Transactions and Total Order Value (in Rupees)
&lt;/h3&gt;

&lt;p&gt;Unlike the page views use case, where only the views in the last minute were shown, there are also use cases where a running total from the start of business, or from the current financial year, needs to be shown.&lt;/p&gt;

&lt;h4&gt;
  
  
  Processor
&lt;/h4&gt;

&lt;p&gt;Add this section to the Processor &lt;code&gt;appsettings.json&lt;/code&gt; file. It holds the values of this data as of the time the application runs.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

"InitializationOptions":{
        "TotalOrders":0,
        "TotalOrderValue":0.00
    }    


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h6&gt;
  
  
  In real-world scenarios, you would make an API call to your enterprise systems to get these values.
&lt;/h6&gt;

&lt;p&gt;In &lt;code&gt;ConfigureInitializationServices.cs&lt;/code&gt;, create a method that creates the necessary keys and adds the values from &lt;code&gt;appsettings.json&lt;/code&gt;.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

public void InitializeTimeSeriesTotalOrderNValue(IDatabase db)
        {
                db.TimeSeriesCreate("ts_o:t", retentionTime: 600000);
                ulong totalorders=0;
                // TODO: Get Data from Rest Api for total orders.
                totalorders=Convert.ToUInt64(_config.GetSection("InitializationOptions:TotalOrders").Value);
                db.TimeSeriesAdd("ts_o:t", "*", Convert.ToDouble(totalorders));  
                db.TimeSeriesCreate("ts_o:v", retentionTime: 600000);
                double totalordervalue=0;
                // TODO: Get Data from Rest Api for total order value
                totalordervalue=Convert.ToDouble(_config.GetSection("InitializationOptions:TotalOrderValue").Value);
                db.TimeSeriesAdd("ts_o:v", "*", totalordervalue);


        }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h6&gt;
  
  
  Do remember to update &lt;code&gt;DeleteKeys&lt;/code&gt; method as well.
&lt;/h6&gt;

&lt;p&gt;Similarly, in &lt;code&gt;DataServices.cs&lt;/code&gt;, add the following method, and call it from the &lt;code&gt;HandleFile&lt;/code&gt; method with the &lt;code&gt;orderamount&lt;/code&gt; value when the page type is &lt;code&gt;Success Confirmation&lt;/code&gt;.&lt;/p&gt;
&lt;h6&gt;
  
  
  The &lt;code&gt;HandleFile&lt;/code&gt; method is not shown here; check the code repository if needed.
&lt;/h6&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

private static void TSOrders( double orderamount, IDatabase db)
        {
           db.TimeSeriesIncrBy("ts_o:t", 1, timestamp: "*");
           db.TimeSeriesIncrBy("ts_o:v", orderamount, timestamp: "*");
        }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h6&gt;
  
  
  The corresponding CLI command would be &lt;code&gt;TS.INCRBY ts_o:t 1 TIMESTAMP *&lt;/code&gt;
&lt;/h6&gt;
&lt;h4&gt;
  
  
  Webapp
&lt;/h4&gt;

&lt;p&gt;Update &lt;code&gt;RealtimeChartService&lt;/code&gt; with methods that use &lt;code&gt;TS.GET&lt;/code&gt; to fetch the most recent sample in each time series.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

public double GetOrderValue()
        {
            double orderamount = 0;
            try{
                var db = _redisProvider.Database();
                TimeSeriesTuple value = db.TimeSeriesGet("ts_o:v");
                orderamount = value.Val;
            }catch(Exception ex){
                _logger.LogError(ex.Message);
            }
            return orderamount;
        }

        public long GetOrders()
        {
            long orders = 0;
            try{
                var db = _redisProvider.Database();
                TimeSeriesTuple value = db.TimeSeriesGet("ts_o:t");
                orders=Convert.ToInt64(value.Val);
            }catch(Exception ex){
                _logger.LogError(ex.Message);
            }
            return orders;
        }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
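On the client side, the raw number returned by these methods still needs to be displayed as Rupees in the order-value widget. How the article's repo formats it is not shown in this excerpt, so the snippet below is only a hedged sketch of one way to do it with the standard `Intl.NumberFormat` API; the `formatOrderValue` helper name is an invention for illustration.

```javascript
// Format a numeric order value as Indian Rupees.
// Hypothetical helper -- not part of the article's code.
function formatOrderValue(amount) {
  // "en-IN" applies Indian digit grouping (lakhs/crores) when full ICU data is available
  return new Intl.NumberFormat("en-IN", {
    style: "currency",
    currency: "INR",
  }).format(amount);
}
```

For example, `formatOrderValue(1234567.5)` renders the value with Indian-style digit grouping in environments that ship full ICU locale data.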
&lt;h3&gt;
  
  
  Page performance for Home page and Product page (in last min)
&lt;/h3&gt;
&lt;h4&gt;
  
  
  Processor
&lt;/h4&gt;

&lt;p&gt;Initialize&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

db.TimeSeriesCreate("ts_pv:pp:h", retentionTime: 600000);
db.TimeSeriesCreate("ts_pv:pp:p", retentionTime: 600000);


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;and then..&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

private static ulong GetStartTime( JsonElement context)
        {
            string s_starttime=context.GetProperty("performance_timing").GetProperty("requestStart").GetString();
            ulong starttime=Convert.ToUInt64(s_starttime);
            return starttime;                           
        }
        private static ulong GetEndTime( JsonElement context)
        {
            string s_endtime=context.GetProperty("performance_timing").GetProperty("domContentLoadedEventEnd").GetString();
            ulong endtime=Convert.ToUInt64(s_endtime);
            return endtime;    
        }

        private static void TSPagePerformance( string pagetype,long pageperf, IDatabase db)
        {
            if (pagetype=="Home"){
                db.TimeSeriesAdd("ts_pv:pp:h", "*", pageperf);
            }else if(pagetype=="Product"){
                db.TimeSeriesAdd("ts_pv:pp:p", "*", pageperf);
            }else{}
        }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h6&gt;
  
  
  In the source repository there are two additional, commented-out methods that compute start and end times from seven performance attributes instead of two.
&lt;/h6&gt;
&lt;h4&gt;
  
  
  Webapp
&lt;/h4&gt;

&lt;p&gt;Here, page performance is shown with floating bars denoting max and min, while the average is shown as a line chart.&lt;br&gt;
In the &lt;code&gt;RealtimeChart&lt;/code&gt; model&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

public class PagePerf
    {
        public List&amp;lt;string&amp;gt; time {get;set;}
        public List&amp;lt;List&amp;lt;int&amp;gt;&amp;gt; maxmin {get;set;}
        public List&amp;lt;int&amp;gt; avg{get;set;}
    }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;and in RealtimeChartService&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

public PagePerf GetPagePerformance(string pagetype){
            string key="";
            if (pagetype=="Home")
            {
                key="ts_pv:pp:h";
            }else if(pagetype=="Product")
            {
                key="ts_pv:pp:p";
            }else{}
            List&amp;lt;int&amp;gt; avgdata=new List&amp;lt;int&amp;gt;();
            List&amp;lt;string&amp;gt; timedata= new List&amp;lt;string&amp;gt;();
            List&amp;lt;List&amp;lt;int&amp;gt;&amp;gt; maxmindata= new List&amp;lt;List&amp;lt;int&amp;gt;&amp;gt;();
            try{
                var db = _redisProvider.Database();

                IReadOnlyList&amp;lt;TimeSeriesTuple&amp;gt; resultsmax =  db.TimeSeriesRange(key, "-", "+", aggregation: TsAggregation.Max, timeBucket: 60000, count:10);
                IReadOnlyList&amp;lt;TimeSeriesTuple&amp;gt; resultsmin =  db.TimeSeriesRange(key, "-", "+", aggregation: TsAggregation.Min, timeBucket: 60000, count:10);
                IReadOnlyList&amp;lt;TimeSeriesTuple&amp;gt; resultsavg =  db.TimeSeriesRange(key, "-", "+", aggregation: TsAggregation.Avg, timeBucket: 60000, count:10);
                maxmindata=GetMaxMinList(resultsmax,resultsmin);
                foreach (var result in resultsavg)
                {
                  avgdata.Add(Convert.ToInt32(result.Val));
                }
                timedata=GetStringTimeList(resultsavg);

            }catch(Exception ex){
                _logger.LogError(ex.Message);
            }
            return new PagePerf
            {
                time=timedata,
                maxmin=maxmindata,
                avg=avgdata
            };
        }

        private List&amp;lt;List&amp;lt;int&amp;gt;&amp;gt; GetMaxMinList(IReadOnlyList&amp;lt;TimeSeriesTuple&amp;gt; resultsmax,IReadOnlyList&amp;lt;TimeSeriesTuple&amp;gt; resultsmin)
        {
            return resultsmax.Concat(resultsmin)
                                .GroupBy(o =&amp;gt; o.Time)
                                .Select(g =&amp;gt; g.Select(s =&amp;gt; (int)s.Val).ToList())
                                .ToList();
        }

        private List&amp;lt;string&amp;gt; GetStringTimeList(IReadOnlyList&amp;lt;TimeSeriesTuple&amp;gt; resultsavg)
        {
             List&amp;lt;string&amp;gt; timedata= new List&amp;lt;string&amp;gt;();
             foreach (var result in resultsavg)
                {
                   TimeStamp ts = result.Time;
                   System.DateTime dtDateTime = new DateTime(1970, 1, 1, 0, 0, 0, 0, System.DateTimeKind.Utc);
                   dtDateTime = dtDateTime.AddMilliseconds((long)ts);
                   String hourMinute = dtDateTime.ToString("HH:mm");
                   timedata.Add(hourMinute);
                }
                return timedata;
        }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
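When serialized to the page, the `PagePerf` model arrives as an object of the shape `{ time, maxmin, avg }`. The wiring from that payload to the `wpageperfh`/`wpageperfp` canvases is not shown in this excerpt, so the following is only a hedged sketch: it assumes a Chart.js-style charting library in which a bar data point given as a `[max, min]` pair renders as a floating bar, and `toPagePerfSeries` is an illustrative helper name, not code from the repo.

```javascript
// Shape the serialized PagePerf payload into chart-ready label/series arrays.
// Hypothetical helper -- not part of the article's code.
function toPagePerfSeries(pagePerf) {
  return {
    labels: pagePerf.time, // "HH:mm" bucket labels from GetStringTimeList
    datasets: [
      // each [max, min] pair can drive a floating bar in Chart.js-style APIs
      { type: "bar", label: "max/min (ms)", data: pagePerf.maxmin },
      // the per-bucket averages drive the line overlay
      { type: "line", label: "avg (ms)", data: pagePerf.avg },
    ],
  };
}
```

The returned object can then be passed to whichever chart constructor the page uses for the two page-performance canvases.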
&lt;h3&gt;
  
  
  Conversion Funnel (in last min)
&lt;/h3&gt;
&lt;h4&gt;
  
  
  Processor
&lt;/h4&gt;

&lt;p&gt;Initialize&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

var label = new TimeSeriesLabel("chart", "Funnel");
                var labels = new List&amp;lt;TimeSeriesLabel&amp;gt; { label };
                db.TimeSeriesCreate("ts_fnl:pl", retentionTime: 600000, labels: labels);
                db.TimeSeriesAdd("ts_fnl:pl", "*", 0);
                db.TimeSeriesCreate("ts_fnl:pd", retentionTime: 600000, labels: labels);
                db.TimeSeriesAdd("ts_fnl:pd", "*", 0);
                db.TimeSeriesCreate("ts_fnl:ac", retentionTime: 600000, labels: labels);
                db.TimeSeriesAdd("ts_fnl:ac", "*", 0);
                db.TimeSeriesCreate("ts_fnl:vc", retentionTime: 600000, labels: labels);
                db.TimeSeriesAdd("ts_fnl:vc", "*", 0);
                db.TimeSeriesCreate("ts_fnl:co", retentionTime: 600000, labels: labels);
                db.TimeSeriesAdd("ts_fnl:co", "*", 0);
                db.TimeSeriesCreate("ts_fnl:sc", retentionTime: 600000, labels: labels);
                db.TimeSeriesAdd("ts_fnl:sc", "*", 0);


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h6&gt;
  
  
  The corresponding command would be &lt;code&gt;TS.CREATE ts_fnl:pl RETENTION 600000 LABELS chart Funnel&lt;/code&gt;
&lt;/h6&gt;

&lt;p&gt;And then&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

private static void TSFunnel( string funneltype,IDatabase db)
        {
            switch (funneltype)
                {
                    case "Success":

                       db.TimeSeriesAdd("ts_fnl:sc", "*", 1);
                        break;
                    case "Checkout":

                        db.TimeSeriesAdd("ts_fnl:co", "*", 1);
                        break;
                    case "ViewCart":

                        db.TimeSeriesAdd("ts_fnl:vc", "*", 1);
                        break;
                    case "AddToCart":

                        db.TimeSeriesAdd("ts_fnl:ac", "*", 1);
                        break;
                    case "ProductDetail":

                        db.TimeSeriesAdd("ts_fnl:pd", "*", 1);
                        break;
                    case "ProductList":

                        db.TimeSeriesAdd("ts_fnl:pl", "*", 1);
                        break;
                    default:
                         AnsiConsole.MarkupLine($"[red]{funneltype}[/]");
                        break;
                }
        }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h4&gt;
  
  
  Webapp
&lt;/h4&gt;

&lt;p&gt;This will be shown as a horizontal bar chart.&lt;br&gt;
In the &lt;code&gt;RealtimeChart&lt;/code&gt; model&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

public class Conversion
    {
        public List&amp;lt;FunnelItem&amp;gt; FunnelItems{get;set;}
        public long TotalFunnelValue{get;set;}
       // public double ConversionRate{get;set;}
    }
    public class FunnelItem
    {
        public FunnelItem(int Order, string Item, long Value)
    {
        this.Order=Order;
        this.Item = Item;
        this.Value = Value;        
    }
       public int Order { get; set; }
       public string Item { get; set; }
       public long Value { get; set; }

    }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;And, in RealtimeChartService&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

public Conversion GetConversions(){
            List&amp;lt;FunnelItem&amp;gt; funnelItems = new List&amp;lt;FunnelItem&amp;gt;();
            long totalFunnelValue =0;
            try{
                var db = _redisProvider.Database();
                var filter = new List&amp;lt;string&amp;gt; { "chart=Funnel" };

               var results= db.TimeSeriesMRevRange("-", "+", filter, aggregation:TsAggregation.Sum, timeBucket:600000, count: 1);

                foreach (var result in results)
                {
                    string key = result.key;
                    IReadOnlyList&amp;lt;TimeSeriesTuple&amp;gt; values = result.values;
                    funnelItems.Add(new FunnelItem(GetFunnelOrder(key),PrettyFunnelItem(key),Convert.ToInt64(values[0].Val)));
                    totalFunnelValue=totalFunnelValue+Convert.ToInt64(values[0].Val);
                }
            }catch(Exception ex){
                _logger.LogError(ex.Message);
            }
            return new Conversion
            {
                FunnelItems=funnelItems,
                TotalFunnelValue=totalFunnelValue
            };
        }

        private int GetFunnelOrder(string key){
           switch (key)
                {
                    case "ts_fnl:sc":
                        return 6;
                    case "ts_fnl:co":
                        return 5;
                    case "ts_fnl:vc":
                        return 4;
                    case "ts_fnl:ac":
                        return 3;
                    case "ts_fnl:pd":
                        return 2;
                    case "ts_fnl:pl":
                        return 1;
                    default:
                        _logger.LogInformation(key);
                    break;
                } 
            return 0;
        }
        private string PrettyFunnelItem(string key){
           switch (key)
                {
                    case "ts_fnl:sc":
                        return "Transaction Success";
                    case "ts_fnl:co":
                        return "Checkout";
                    case "ts_fnl:vc":
                        return "View Cart";
                    case "ts_fnl:ac":
                        return "Add To Cart";
                    case "ts_fnl:pd":
                        return "Product Detail";
                    case "ts_fnl:pl":
                        return "Product Listings";
                    default:
                        _logger.LogInformation(key);
                    break;
                } 
            return "";
        }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h6&gt;
  
  
  The command used is &lt;code&gt;TS.MREVRANGE - + AGGREGATION sum 600000 FILTER chart=Funnel&lt;/code&gt;
&lt;/h6&gt;
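&lt;p&gt;The conversion rate shown on the dashboard is simply the last funnel stage expressed as a percentage of the first. A minimal JavaScript sketch of that calculation, using hypothetical mock funnel items in place of the values returned by &lt;code&gt;GetConversions&lt;/code&gt;:&lt;/p&gt;

```javascript
// Mock funnel items in the shape produced by GetConversions();
// real values come from TS.MREVRANGE aggregated over the last 10 minutes.
const funnelItems = [
  { order: 1, item: 'Product Listings', value: 1000 },
  { order: 2, item: 'Product Detail', value: 800 },
  { order: 3, item: 'Add To Cart', value: 400 },
  { order: 4, item: 'View Cart', value: 300 },
  { order: 5, item: 'Checkout', value: 150 },
  { order: 6, item: 'Transaction Success', value: 50 }
];

// Sort by funnel order, then take the last stage as a percentage of the first.
function conversionRate(items) {
  const sorted = [...items].sort((a, b) => a.order - b.order);
  const first = sorted[0].value;
  const last = sorted[sorted.length - 1].value;
  return Math.round((last * 100) / first);
}

console.log(conversionRate(funnelItems) + '%'); // prints "5%"
```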
&lt;h3&gt;
  
  
  Orders by Payment Method (in last min)
&lt;/h3&gt;
&lt;h4&gt;
  
  
  Processor
&lt;/h4&gt;

&lt;p&gt;Initialize&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

var label = new TimeSeriesLabel("chart", "Ordersbypaymenttype");
                var labels = new List&amp;lt;TimeSeriesLabel&amp;gt; { label };
                db.TimeSeriesCreate("ts_o:t:cod", retentionTime: 600000, labels: labels);
                db.TimeSeriesAdd("ts_o:t:cod", "*", 0);
                db.TimeSeriesCreate("ts_o:t:dc", retentionTime: 600000, labels: labels);
                db.TimeSeriesAdd("ts_o:t:dc", "*", 0);
                db.TimeSeriesCreate("ts_o:t:cc", retentionTime: 600000, labels: labels);
                db.TimeSeriesAdd("ts_o:t:cc", "*", 0);
                db.TimeSeriesCreate("ts_o:t:nb", retentionTime: 600000, labels: labels);
                db.TimeSeriesAdd("ts_o:t:nb", "*", 0);
                db.TimeSeriesCreate("ts_o:t:ap", retentionTime: 600000, labels: labels);
                db.TimeSeriesAdd("ts_o:t:ap", "*", 0);
                db.TimeSeriesCreate("ts_o:t:gp", retentionTime: 600000, labels: labels);
                db.TimeSeriesAdd("ts_o:t:gp", "*", 0);


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;And then&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

private static void TSOrdersbypaymenttype( string paymentmethod, IDatabase db)
        {
           switch (paymentmethod)
                {
                    case "Cash On Delivery":

                       db.TimeSeriesAdd("ts_o:t:cod", "*", 1);
                        break;
                    case "Debit Card":

                        db.TimeSeriesAdd("ts_o:t:dc", "*", 1);
                        break;
                    case "Credit Card":

                        db.TimeSeriesAdd("ts_o:t:cc", "*", 1);
                        break;
                    case "Netbanking":

                        db.TimeSeriesAdd("ts_o:t:nb", "*", 1);
                        break;
                    case "Amazon Pay":

                        db.TimeSeriesAdd("ts_o:t:ap", "*", 1);
                        break;
                    case "Google Pay":

                        db.TimeSeriesAdd("ts_o:t:gp", "*", 1);
                        break;
                    default:
                         AnsiConsole.MarkupLine($"[red]{paymentmethod}[/]");
                        break;
                }
        }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h4&gt;
  
  
  Webapp
&lt;/h4&gt;

&lt;p&gt;In RealtimeChart model&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

public class PaymentMethodOrders
    {
        public PaymentMethodOrders(string PaymentMethod, long Orders)
        {
        this.PaymentMethod = PaymentMethod;
        this.Orders = Orders;
        }
       public string PaymentMethod { get; set; }
       public long Orders { get; set; }
    } 


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;And then in RealtimeChartService&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

public List&amp;lt;PaymentMethodOrders&amp;gt; GetOrdersByPaymentMethod(){
            List&amp;lt;PaymentMethodOrders&amp;gt; OrdersByPaymentMethod = new List&amp;lt;PaymentMethodOrders&amp;gt;();
            try{
                var db = _redisProvider.Database();
                var filter = new List&amp;lt;string&amp;gt; { "chart=Ordersbypaymenttype" };

               var results= db.TimeSeriesMRevRange("-", "+", filter, aggregation:TsAggregation.Sum, timeBucket:600000, count: 1);
            //      string jsonString = JsonSerializer.Serialize(results);
            //   _logger.LogInformation(jsonString);
                foreach (var result in results)
                {
                    string key = result.key;
                   IReadOnlyList&amp;lt;TimeSeriesTuple&amp;gt; values = result.values;
                    OrdersByPaymentMethod.Add(new PaymentMethodOrders(PrettyPaymentMethod(key),Convert.ToInt64(values[0].Val)));
                }
            }catch(Exception ex){
                _logger.LogError(ex.Message);
            }

            return OrdersByPaymentMethod;
        }

        private string PrettyPaymentMethod(string key){
           switch (key)
                {
                    case "ts_o:t:cod":
                        return "Cash On Delivery";
                    case "ts_o:t:dc":
                        return "Debit Card";
                    case "ts_o:t:cc":
                        return "Credit Card";
                    case "ts_o:t:nb":
                        return "Net Banking";
                    case "ts_o:t:ap":
                        return "Amazon Pay";
                    case "ts_o:t:gp":
                        return "Google Pay";
                    default:
                        _logger.LogInformation(key);
                    break;
                } 
            return "";
        }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
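&lt;p&gt;On the client side, the bar chart only needs parallel arrays of labels and values. A small sketch, with hypothetical mock data standing in for the serialized result of &lt;code&gt;GetOrdersByPaymentMethod&lt;/code&gt;, of how those arrays are derived before being handed to Chart.js:&lt;/p&gt;

```javascript
// Mock of chartData.ordersByPaymentMethod as serialized by the Razor page;
// the real data comes from TS.MREVRANGE with FILTER chart=Ordersbypaymenttype.
const ordersByPaymentMethod = [
  { paymentMethod: 'Cash On Delivery', orders: 12 },
  { paymentMethod: 'Credit Card', orders: 30 },
  { paymentMethod: 'Google Pay', orders: 8 }
];

// Split into the parallel label/data arrays Chart.js expects.
const labels = ordersByPaymentMethod.map((item) => item.paymentMethod);
const data = ordersByPaymentMethod.map((item) => item.orders);

console.log(labels); // prints ['Cash On Delivery', 'Credit Card', 'Google Pay']
console.log(data);   // prints [12, 30, 8]
```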
&lt;h3&gt;
  
  
  The final view
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://github.com/chartjs/Chart.js" rel="noopener noreferrer"&gt;Chart.js&lt;/a&gt; is used for charting.&lt;br&gt;
The dashboard is refreshed every 60 seconds using &lt;code&gt;meta http-equiv="refresh"&lt;/code&gt;.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

@page
@model IndexModel
@{Layout = null;
}
&amp;lt;!doctype html&amp;gt;
&amp;lt;html lang="en"&amp;gt;
&amp;lt;head&amp;gt;
    &amp;lt;meta charset="utf-8"&amp;gt;
    &amp;lt;meta name="viewport" content="width=device-width, initial-scale=1.0"&amp;gt;
    &amp;lt;meta http-equiv="refresh" content="60" /&amp;gt;
    &amp;lt;meta name="description" content="RedMetrix - RedisTimeSeries ECOM Web analytics Dashboard"&amp;gt;
    &amp;lt;title&amp;gt;RedMetrix - A RedisTimeSeries Project&amp;lt;/title&amp;gt;
    &amp;lt;link rel="stylesheet" href="~/lib/bootstrap/dist/css/bootstrap.min.css" /&amp;gt;
    &amp;lt;link rel="stylesheet" href="~/css/gridstack.css" /&amp;gt;

    &amp;lt;style&amp;gt;
    body {
    background: #222222;
    color: #465665;
      }
    .grid-stack-item {
        border: 0;
      }
    .grid-stack-item-content {
        background-color: white;
        text-align: center;
      }
    #blue{
        background: #0000ff;
      }
    &amp;lt;/style&amp;gt;
&amp;lt;/head&amp;gt;
&amp;lt;body&amp;gt;

  &amp;lt;div class="grid-stack" id="simple-grid"&amp;gt;&amp;lt;/div&amp;gt;
  &amp;lt;script src="~/lib/jquery/dist/jquery.min.js"&amp;gt;&amp;lt;/script&amp;gt;
  &amp;lt;script src="~/lib/bootstrap/dist/js/bootstrap.bundle.min.js"&amp;gt;&amp;lt;/script&amp;gt;
  &amp;lt;script src="//cdn.jsdelivr.net/npm/handlebars@latest/dist/handlebars.js"&amp;gt;&amp;lt;/script&amp;gt;
  &amp;lt;script src="//cdn.jsdelivr.net/npm/chart.js"&amp;gt;&amp;lt;/script&amp;gt;
  &amp;lt;script src="~/js/gridstack-jq.js" &amp;gt;&amp;lt;/script&amp;gt;
  &amp;lt;script type="text/javascript"&amp;gt;
    var simple = [
      {x: 0, y: 0, w: 4, h: 2, noMove: true, noResize: true, locked: true, id:'logo', bgcolor:'#9c4274',content: '&amp;lt;img src="widgets/logo/logo.png" alt="logo RedMetrix" style="width:100%;margin-top: auto;"&amp;gt; &amp;lt;h2 style="color:white;"&amp;gt;&amp;lt;strong&amp;gt; -- A RedisTimeSeries Project &amp;lt;/strong&amp;gt; &amp;lt;/h2&amp;gt;'},
      {x: 4, y: 0, w: 4, h: 6, noMove: true, noResize: true, locked: true, id:'pageperf', bgcolor:'#e83e8c',content: '&amp;lt;h5 style="color:whitesmoke"&amp;gt;Page Performance &amp;lt;br/&amp;gt; (in ms) &amp;lt;/h5&amp;gt; &amp;lt;div&amp;gt;&amp;lt;h6 style="color:white"&amp;gt;Home &amp;lt;/h6&amp;gt;&amp;lt;p/&amp;gt;&amp;lt;canvas id ="wpageperfh"&amp;gt;&amp;lt;/canvas&amp;gt; &amp;lt;/div&amp;gt;&amp;lt;p/&amp;gt;&amp;lt;div&amp;gt;&amp;lt;h6 style="color:white"&amp;gt;Product &amp;lt;/h6&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;canvas id ="wpageperfp"&amp;gt;&amp;lt;/canvas&amp;gt; &amp;lt;/div&amp;gt;'},
      {x: 8, y: 0, w: 2, h: 2,noMove: true, noResize: true, locked: true, id:'pageviews', bgcolor:'#12b0c5', content: '&amp;lt;div id ="wpageviews" style="width: 100%; height: 100%"&amp;gt;&amp;lt;/div&amp;gt;'},
      {x: 10, y: 0, w: 2, h: 2,noMove: true, noResize: true, locked: true, id:'totalorders', bgcolor:'#ff9618', content: '&amp;lt;div id ="wtotalorders" style="width: 100%; height: 100%"&amp;gt;&amp;lt;/div&amp;gt;'},
      {x: 0, y: 2, w: 4, h: 4, noMove: true, noResize: true, locked: true, id:'conversions', bgcolor:'#ec663c', content: '&amp;lt;h5 style="color:whitesmoke"&amp;gt;Conversions &amp;lt;/h5&amp;gt; &amp;lt;div&amp;gt;&amp;lt;canvas id ="wconversions"&amp;gt;&amp;lt;/canvas&amp;gt; &amp;lt;/div&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/p&amp;gt; &amp;lt;h5 style="color:whitesmoke"&amp;gt;Conversion Rate &amp;lt;/h5&amp;gt;&amp;lt;div id="wconversionrate" style="font-size: 400%; color:lightgreen;"&amp;gt;&amp;lt;/div&amp;gt;'},
      {x: 8, y: 2, w: 4, h: 1,noMove: true, noResize: true, locked: true, id:'ordervalue', bgcolor:'#dc5945', content: '&amp;lt;div id ="wordervalue" style="width: 100%; height: 100%"&amp;gt;&amp;lt;/div&amp;gt;'},
      {x: 8, y: 3, w: 4, h: 3,noMove: true, noResize: true, locked: true, id:'ordbypaymeth', bgcolor:'#47bbb3', content: '&amp;lt;h5 style="color:whitesmoke"&amp;gt;Orders By Payment Method &amp;lt;/h5&amp;gt; &amp;lt;div&amp;gt;&amp;lt;canvas id ="wordbypaymeth"&amp;gt;&amp;lt;/canvas&amp;gt; &amp;lt;/div&amp;gt;'}
    ];

    var simpleGrid = GridStack.init({

    }, '#simple-grid');
    simpleGrid.on('added', function(e, items) {
      items.forEach(function(item) { 
        item.el.firstElementChild.style.backgroundColor=item.bgcolor;
      });
    });
    simpleGrid.load(simple);
  &amp;lt;/script&amp;gt;

  &amp;lt;script&amp;gt;
  Handlebars.registerHelper('isdefined', function (value) {
    return value !== undefined;
    });
  Handlebars.registerHelper('isnumber', function (value) {
    return !isNaN(value);
    });
  Handlebars.registerHelper('showchange', function (value) {
    return value !== undefined &amp;amp;&amp;amp; !isNaN(value) &amp;amp;&amp;amp; value !== 0;
    });

  &amp;lt;/script&amp;gt;
  &amp;lt;script id="pageview_template" type="text/x-handlebars-template"&amp;gt;
    &amp;lt;div class="entry"&amp;gt;
      &amp;lt;h5 style="color:whitesmoke"&amp;gt;{{pageview.title}}&amp;lt;/h5&amp;gt;
      &amp;lt;div class="pageview" style="width: 100%; height: 83%"&amp;gt;
        &amp;lt;div class= "views" style="font-size: 400%; color:#ff9618;"&amp;gt;{{pageview.views}}&amp;lt;/div&amp;gt;
        &amp;lt;div class= "views"&amp;gt;&amp;lt;h5 style="color:#ff9618"&amp;gt;Last value : {{pageview.last}}&amp;lt;/h5&amp;gt;&amp;lt;/div&amp;gt;
        &amp;lt;div class="change" style="color:white"&amp;gt;&amp;lt;h3&amp;gt;{{#if (showchange pageview.diff)}} &amp;lt;span&amp;gt;({{pageview.change}}) {{pageview.diff}}%&amp;lt;/span&amp;gt;{{/if}} &amp;lt;/h3&amp;gt;&amp;lt;/div&amp;gt;
      &amp;lt;/div&amp;gt;
    &amp;lt;/div&amp;gt;
  &amp;lt;/script&amp;gt;
  &amp;lt;script id="totalorders_template" type="text/x-handlebars-template"&amp;gt;
    &amp;lt;div class="entry"&amp;gt;
      &amp;lt;h5 style="color:whitesmoke"&amp;gt;{{totalorders.title}}&amp;lt;/h5&amp;gt;
      &amp;lt;div class="pageview" style="width: 100%; height: 83%"&amp;gt;
        &amp;lt;div class= "views" style="font-size: 500%; color:#12b0c5"&amp;gt;{{totalorders.orders}}&amp;lt;/div&amp;gt;
      &amp;lt;/div&amp;gt;
    &amp;lt;/div&amp;gt;
  &amp;lt;/script&amp;gt;
  &amp;lt;script id="totalsales_template" type="text/x-handlebars-template"&amp;gt;
    &amp;lt;div class="entry"&amp;gt;
      &amp;lt;h5 style="color:whitesmoke"&amp;gt;{{totalordervalue.title}}&amp;lt;/h5&amp;gt;
      &amp;lt;div class="pageview" style="width: 100%; height: 83%"&amp;gt;
        &amp;lt;div class= "views"&amp;gt;&amp;lt;h1 style="color:#ff9618";  font-size: 250%;&amp;gt;&amp;amp;#8377; {{totalordervalue.orderValue}}&amp;lt;/h1&amp;gt;&amp;lt;/div&amp;gt;
      &amp;lt;/div&amp;gt;
    &amp;lt;/div&amp;gt;
  &amp;lt;/script&amp;gt;
  &amp;lt;script&amp;gt;
  var pageview_source = document.getElementById("pageview_template").innerHTML;
  var pageviewTemplate = Handlebars.compile(pageview_source);
  var totalorders_source = document.getElementById("totalorders_template").innerHTML;
  var totalordersTemplate = Handlebars.compile(totalorders_source);
  var totalsales_source = document.getElementById("totalsales_template").innerHTML;
  var totalsalesTemplate = Handlebars.compile(totalsales_source);
  &amp;lt;/script&amp;gt; 
  &amp;lt;script&amp;gt;
    function percentage(n, total) {
    return Math.round(n * 100 / total);
  }
  function shortnumber(num) {
    if (isNaN(num)) {
      return num;
    }
    if (num &amp;gt;= 1000000000) {
      return (num / 1000000000).toFixed(1) + 'B';
    } else if (num &amp;gt;= 1000000) {
      return (num / 1000000).toFixed(1) + 'M';
    } else if (num &amp;gt;= 1000) {
      return (num / 1000).toFixed(1) + 'K';
    } else {
      return num;
    }
  };
  var orderbypaychartdata= {
    labels: [],
    datasets: [{
      label: 'Orders',
      data: [],
      borderColor: 'rgb(255, 99, 132)',
      backgroundColor: 'rgb(255, 99, 132)',
      borderWidth: 2,
      borderRadius: 5,
      borderSkipped: false,
      order: 1
    }]
   };
var orderbypaychartconfig = {
    type: 'bar',
    data: orderbypaychartdata,
    options: {
      responsive: true,
      plugins: {
        legend: {
            display: false,
        }
      }
    }
  };

var orderbypayChart = new Chart(
  document.getElementById('wordbypaymeth'),
  orderbypaychartconfig
);

const pageperfhomechartdata = {
  labels: [],
  datasets: [
    {
      label: 'Max-Min',
      data: [],
      borderColor: 'rgb(54, 162, 235)',
      backgroundColor: 'rgb(54, 162, 235)',
      borderWidth: 2,
      borderRadius: 5,
      borderSkipped: false,
      order: 1
    },
    {
      label: 'Avg',
      data: [],
      borderColor: 'rgb(255, 205, 86)',
      backgroundColor: 'rgb(255, 205, 86)',
      type: 'line',
      order: 0
    }]
};

const pageperfhomechartconfig = {
  type: 'bar',
  data: pageperfhomechartdata,
  options: {
    responsive: true,
    plugins: {
      legend: {
        display: false,
      },
      title: {
        display: false,
        text: 'Home Page Performance'
      }
    }
  }
};

var pageperfhomeChart = new Chart(
  document.getElementById('wpageperfh'),
  pageperfhomechartconfig
);

const pageperfproductchartdata = {
  labels: [],
  datasets: [
    {
      label: 'Max-Min',
      data: [],
      borderColor: 'rgb(255, 205, 86)',
      backgroundColor: 'lightgreen',
      borderWidth: 2,
      borderRadius: 5,
      borderSkipped: false,
      order: 1
    },
    {
      label: 'Avg',
      data: [],
      borderColor: 'rgb(54, 162, 235)',
      backgroundColor: 'rgb(54, 162, 235)',
      type: 'line',
      order: 0
    }]
};

const pageperfproductchartconfig = {
  type: 'bar',
  data: pageperfproductchartdata,
  options: {
    responsive: true,
    plugins: {
      legend: {
        display: false,
      },
      title: {
        display: false,
        text: 'Product Page Performance'
      }
    }
  }
};

var pageperfproductChart = new Chart(
  document.getElementById('wpageperfp'),
  pageperfproductchartconfig
);

var conversionchartdata= {
  axis: 'y',
  labels: [],
  datasets: [{
    label: 'Value',
    data: [],
    borderColor: 'rgb(54, 162, 235)',
    backgroundColor: 'rgb(54, 162, 235)',
    borderWidth: 2,
    borderRadius: 5,
    borderSkipped: false,
    order: 1
  }]
 };
var conversionchartconfig = {
  type: 'bar',
  data: conversionchartdata,
  options: {
    indexAxis: 'y',
    responsive: true,
    plugins: {
      legend: {
          display: false,
      }
    }
  }
};

var conversionChart = new Chart(
document.getElementById('wconversions'),
conversionchartconfig
);
  $(document).ready(function () {
  var chartData = @Html.Raw(Json.Serialize(Model.chartdata));
  console.log(chartData);
  var pageviews=0;
  var lastpv=0;
  var pageviewdiff=0;
  var pageviewchange="";
  var totalorders=0;
  var totalsales=0;
      if (!isNaN(chartData.pageViews.pv)){
            pageviews=chartData.pageViews.pv;
            if (!isNaN(chartData.pageViews.prev_pv)){
              lastpv=chartData.pageViews.prev_pv;
                var diff=0;
                if(chartData.pageViews.prev_pv&amp;gt;chartData.pageViews.pv){
                    diff=chartData.pageViews.prev_pv-chartData.pageViews.pv;
                    pageviewchange="-"
                }else{
                    diff=chartData.pageViews.pv-chartData.pageViews.prev_pv;
                    pageviewchange="+";
                }
                pageviewdiff=percentage(diff,pageviews);

              }
          }
        if (!isNaN(chartData.orders)){
          totalorders =chartData.orders
        }
        if (!isNaN(chartData.orderValue)){
          totalsales =chartData.orderValue
        }        
        var context={
            pageview: {
            title:"Page Views",  
            views: shortnumber(pageviews),
            last:shortnumber(lastpv),
            change: pageviewchange,
            diff:pageviewdiff
            },
            totalorders: {
            title:"Transactions",  
            orders: shortnumber(totalorders)
            },
            totalordervalue: {
            title:"Total Sales",  
            orderValue: shortnumber(totalsales)
            },
        };
        document.getElementById("wpageviews").innerHTML = pageviewTemplate(context);
        document.getElementById("wtotalorders").innerHTML =totalordersTemplate(context);
        document.getElementById("wordervalue").innerHTML=totalsalesTemplate(context);
        const updatedorderbypaychart = Chart.getChart("wordbypaymeth");
        updatedorderbypaychart.data.labels=chartData.ordersByPaymentMethod.map((item) =&amp;gt;item.paymentMethod);
        updatedorderbypaychart.data.datasets[0].data=chartData.ordersByPaymentMethod.map((item)=&amp;gt;item.orders);
        updatedorderbypaychart.update();

        const updatedpageperfhomechart = Chart.getChart("wpageperfh");
        updatedpageperfhomechart.data.labels=chartData.pagePerformanceHome.time;
        updatedpageperfhomechart.data.datasets[0].data=chartData.pagePerformanceHome.maxmin;
        updatedpageperfhomechart.data.datasets[1].data=chartData.pagePerformanceHome.avg;
        updatedpageperfhomechart.update();

        const updatedpageperfproductchart = Chart.getChart("wpageperfp");
        updatedpageperfproductchart.data.labels=chartData.pagePerformanceProduct.time;
        updatedpageperfproductchart.data.datasets[0].data=chartData.pagePerformanceProduct.maxmin;
        updatedpageperfproductchart.data.datasets[1].data=chartData.pagePerformanceProduct.avg;
        updatedpageperfproductchart.update();

        var funneldata =chartData.conversions.funnelItems.sort(function (a, b) {
          return a.order - b.order;
        });
        var funneldataVal=funneldata.map((item)=&amp;gt;item.value);
        const updatedconversionschart = Chart.getChart("wconversions");
        updatedconversionschart.data.labels=funneldata.map((item) =&amp;gt;item.item);
        updatedconversionschart.data.datasets[0].data=funneldata.map((item)=&amp;gt;item.value);
        updatedconversionschart.update();
        document.getElementById("wconversionrate").innerHTML = percentage(funneldataVal[funneldataVal.length-1],funneldataVal[0]) + "%";
  });      

  &amp;lt;/script&amp;gt;  
&amp;lt;/body&amp;gt;
&amp;lt;/html&amp;gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
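&lt;p&gt;The page-view widget's change indicator in the page above boils down to a small calculation: an absolute difference between the current and previous buckets, signed by direction and expressed as a percentage of the current value. Extracted here as a standalone sketch (the function name &lt;code&gt;pageViewChange&lt;/code&gt; is hypothetical; the logic sits inline in the &lt;code&gt;document.ready&lt;/code&gt; handler):&lt;/p&gt;

```javascript
// Percentage helper, as defined in the dashboard page.
function percentage(n, total) {
  return Math.round((n * 100) / total);
}

// Signed percentage change between current (pv) and previous (prevPv) page views.
function pageViewChange(pv, prevPv) {
  const diff = prevPv > pv ? prevPv - pv : pv - prevPv;
  const sign = prevPv > pv ? '-' : '+';
  return { change: sign, diff: percentage(diff, pv) };
}

console.log(pageViewChange(1200, 900)); // prints { change: '+', diff: 25 }
```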

&lt;p&gt;And the result is&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjrbj9255sxgbdtf7k0d9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjrbj9255sxgbdtf7k0d9.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h6&gt;
  
  
  RedMetrixWebApp also contains a Dockerfile, which makes it easy to run the application.
&lt;/h6&gt;

&lt;h2&gt;
  
  
  Final notes
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://redis.io/docs/stack/timeseries/" rel="noopener noreferrer"&gt;RedisTimeSeries : https://redis.io/docs/stack/timeseries/&lt;/a&gt;&lt;br&gt;
&lt;a href="https://redis.io/commands/?group=timeseries" rel="noopener noreferrer"&gt;RedisTimeSeries Commands : https://redis.io/commands/?group=timeseries&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/RedisTimeSeries/NRedisTimeSeries/" rel="noopener noreferrer"&gt;NRedisTimeSeries - .Net Client for RedisTimeSeries : https://github.com/RedisTimeSeries/NRedisTimeSeries/&lt;/a&gt;&lt;br&gt;
&lt;a href="https://redis.com/try-free/" rel="noopener noreferrer"&gt;Redis Cloud : https://redis.com/try-free/&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.mockaroo.com/7d448f50" rel="noopener noreferrer"&gt;Mockaroo - For mock data : https://www.mockaroo.com/7d448f50&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/redis-developer/nredi2read-preview/blob/main/Providers/RedisProvider.cs" rel="noopener noreferrer"&gt;RedisProvider : https://github.com/redis-developer/nredi2read-preview/blob/main/Providers/RedisProvider.cs&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/gridstack/gridstack.js" rel="noopener noreferrer"&gt;gridstack.js - For Dashboard Structure : https://github.com/gridstack/gridstack.js&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/handlebars-lang/handlebars.js" rel="noopener noreferrer"&gt;handlebars.js - For widget design : https://github.com/handlebars-lang/handlebars.js&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/chartjs/Chart.js" rel="noopener noreferrer"&gt;Chart.js - Charting library : https://github.com/chartjs/Chart.js&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/c-arnab/Redmetrix" rel="noopener noreferrer"&gt;Github Code Repository - Redmetrix  : https://github.com/c-arnab/Redmetrix&lt;/a&gt;&lt;/p&gt;

&lt;h6&gt;
  
  
  This post is in collaboration with Redis
&lt;/h6&gt;

</description>
      <category>dotnet</category>
      <category>redis</category>
      <category>timeseries</category>
      <category>analytics</category>
    </item>
    <item>
      <title>.Net microservice with gRPC containerised using Azure Container Registry and deployed on Azure Kubernetes Service</title>
      <dc:creator>c-arnab</dc:creator>
      <pubDate>Mon, 22 Mar 2021 01:39:15 +0000</pubDate>
      <link>https://dev.to/c_arnab/net-microservice-with-grpc-containerised-using-azure-container-registry-and-deployed-on-azure-kubernetes-service-36go</link>
      <guid>https://dev.to/c_arnab/net-microservice-with-grpc-containerised-using-azure-container-registry-and-deployed-on-azure-kubernetes-service-36go</guid>
      <description>&lt;h4&gt;
  
  
  &lt;em&gt;In this post, you will learn about gRPC, protocol buffers and building high performance services in .NET 5 using C#. Then you learn to test the service locally using a tool grpcui which is similar to postman but for gRPC. Then you containerise the image and publish the same to Azure Container Registry using ACR task. Then you test the containerised application by running the image on Azure Container Instance and building a gRPC client in .NET 5 using C#. Finally, you deploy the service to Azure Kubernetes Service.&lt;/em&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/09IqWIxUqFU"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;Microservices are booming.&lt;br&gt;
From lower costs to better performance to less downtime to the ability to scale, microservices provide countless benefits relative to monolithic designs. Rather than a text-based messaging protocol like Representational State Transfer (REST), it is ideal to leverage a binary protocol that makes serialization and deserialization easier and is optimized for inter-service communication. gRPC, developed at Google and open sourced, is a high-performance RPC framework and the most viable solution for building communication between internal microservices.&lt;/p&gt;
&lt;h2&gt;
  
  
  Develop gRPC Service
&lt;/h2&gt;
&lt;h3&gt;
  
  
  Foundational concepts of gRPC
&lt;/h3&gt;

&lt;p&gt;For a long time, developers have been building applications that speak with other applications - Java applications speaking with Java applications, .Net applications speaking with .Net applications, and so on.&lt;br&gt;
Then SOAP (Simple Object Access Protocol) was born, so that applications written in one language could speak with applications written in another.&lt;br&gt;
Along came WCF by Microsoft. Though WCF also had a NetTcp binding, a fast binary encoding for communication between .Net clients and servers, the primary mechanism of communication in SOAP was XML. XML (lots and lots of it) made communication slow, and there was a need for something easier for machines to understand, as well as a better mode of communication within and across data centers.&lt;br&gt;
So Google created its own RPC framework &amp;amp; binary format - protobuf. Later, after HTTP/2 was created, Google created a new RPC framework, called it gRPC and made it public. The result is an insanely powerful, fast, highly optimised protocol without the overhead of a SOAP envelope.&lt;br&gt;
Unlike SOAP, where different languages generated XML that would pass between data centers, gRPC relies on the Protocol Buffers language to define service contracts independently of the programming language used to develop the applications, and uses Protocol Buffers (protobuf) as the binary data interchange format for inter-service communication.&lt;/p&gt;
&lt;h6&gt;
  
  
  Further Reading on basics
&lt;/h6&gt;

&lt;p&gt;&lt;a href="https://www.capitalone.com/tech/software-engineering/grpc-framework-for-microservices-communication/" rel="noopener noreferrer"&gt;gRPC - A Modern Framework for Microservices Communication&lt;/a&gt;&lt;br&gt;
&lt;a href="https://grpc.io/" rel="noopener noreferrer"&gt;grpc.io&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Create a gRPC Service using VS Code
&lt;/h3&gt;

&lt;p&gt;We will be building a simple service which checks for the availability of products in inventory.&lt;br&gt;
Open VS Code, open a new terminal and run the command&lt;br&gt;
&lt;code&gt;dotnet new grpc -o ProductAvailabilityService&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;A folder named &lt;em&gt;ProductAvailabilityService&lt;/em&gt; will be created and a templated application will be generated.&lt;br&gt;
Move inside the folder &lt;em&gt;ProductAvailabilityService&lt;/em&gt;&lt;br&gt;
&lt;code&gt;cd ProductAvailabilityService&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Run the command&lt;br&gt;
&lt;code&gt;dotnet add ProductAvailabilityService.csproj package Grpc.AspNetCore.Server.Reflection&lt;/code&gt;&lt;br&gt;
This package enables gRPC server reflection, which will help us test our service.&lt;/p&gt;

&lt;p&gt;gRPC natively supports the ability to define a service contract where you can specify the methods that can be invoked remotely and the data structure of the parameters and return types. Using tools provided for the major programming languages, a server-side skeleton and client-side code (a stub) can be generated from the same Protocol Buffers file (.proto) that defines the service contract.&lt;/p&gt;

&lt;p&gt;Delete the template-generated &lt;code&gt;greet.proto&lt;/code&gt; file in the &lt;code&gt;ProductAvailabilityService/Protos&lt;/code&gt; folder.&lt;br&gt;
Then, add a new file named &lt;code&gt;product-availability-service.proto&lt;/code&gt; in the same folder.&lt;br&gt;
Here we define the APIs and messages in a language-agnostic way.&lt;br&gt;
&lt;em&gt;ProductAvailabilityCheck&lt;/em&gt; is the service definition, which contains a method &lt;em&gt;CheckProductAvailabilityRequest&lt;/em&gt;.&lt;br&gt;
The method takes in a request &lt;em&gt;ProductAvailabilityRequest&lt;/em&gt; and sends back a response &lt;em&gt;ProductAvailabilityReply&lt;/em&gt;. The request and response objects, along with the implementation base class, will be generated automatically by the &lt;code&gt;Grpc.Tools&lt;/code&gt; NuGet package, so there is no need to implement them by hand.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Protos/product-availability-service.proto

syntax = "proto3";

option csharp_namespace = "ProductAvailabilityService";

package ProductAvailability;

service ProductAvailabilityCheck {
  rpc CheckProductAvailabilityRequest (ProductAvailabilityRequest) returns (ProductAvailabilityReply);
}

message ProductAvailabilityRequest {
  string productId = 1;  
}

message ProductAvailabilityReply {
  bool isAvailable = 1;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The NuGet package learns which &lt;em&gt;.proto&lt;/em&gt; file to generate types from via the &lt;em&gt;.csproj&lt;/em&gt; file.&lt;br&gt;
So we go to the &lt;em&gt;.csproj&lt;/em&gt; file and update the &lt;em&gt;.proto&lt;/em&gt; reference as shown below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;ItemGroup&amp;gt;
    &amp;lt;Protobuf Include="Protos\product-availability-service.proto" GrpcServices="Server" /&amp;gt;
&amp;lt;/ItemGroup&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Delete the template-generated &lt;code&gt;GreeterService.cs&lt;/code&gt; file in the Services folder.&lt;br&gt;
Create a new file &lt;code&gt;ProductAvailabilityCheckService.cs&lt;/code&gt; in the Services folder.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Grpc.Core;
using Microsoft.Extensions.Logging;

namespace ProductAvailabilityService
{
    public class ProductAvailabilityCheckService: ProductAvailabilityCheck.ProductAvailabilityCheckBase
    {
        private readonly ILogger&amp;lt;ProductAvailabilityCheckService&amp;gt; _logger;
        private static readonly Dictionary&amp;lt;string, Int32&amp;gt; productAvailabilityInfo = new Dictionary&amp;lt;string, Int32&amp;gt;() 
        {
            {"pid001", 1},
            {"pid002", 0},
            {"pid003", 5},
            {"pid004", 1},
            {"pid005", 0},
            {"pid006", 2}   
        };
        public ProductAvailabilityCheckService(ILogger&amp;lt;ProductAvailabilityCheckService&amp;gt; logger)
        {
            _logger = logger;
        }

        public override Task&amp;lt;ProductAvailabilityReply&amp;gt; CheckProductAvailabilityRequest(ProductAvailabilityRequest request, ServerCallContext context)
        {
            return Task.FromResult(new ProductAvailabilityReply
            {
                IsAvailable = IsProductAvailable(request.ProductId)
            });
        }

        private bool IsProductAvailable(string productId) {
            bool isAvailable = false;

            if (productAvailabilityInfo.TryGetValue(productId, out Int32 quantity))
            {
                isAvailable = (quantity &amp;gt; 0) ? true : false;
            }

            return isAvailable;
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here a dictionary &lt;em&gt;productAvailabilityInfo&lt;/em&gt; acts as a dummy inventory data store, mapping product IDs &lt;em&gt;pid001&lt;/em&gt; through &lt;em&gt;pid006&lt;/em&gt; to their available quantities.&lt;br&gt;
The method &lt;em&gt;CheckProductAvailabilityRequest&lt;/em&gt; defined in the &lt;em&gt;.proto&lt;/em&gt; file is implemented here. The implemented method takes in the request object along with a &lt;em&gt;ServerCallContext&lt;/em&gt; (useful for authentication/authorization) and returns the response object.&lt;/p&gt;

&lt;p&gt;Open the &lt;code&gt;Startup.cs&lt;/code&gt; file and update &lt;em&gt;GreeterService&lt;/em&gt; to &lt;em&gt;ProductAvailabilityCheckService&lt;/em&gt;.&lt;br&gt;
Also, add &lt;code&gt;services.AddGrpcReflection()&lt;/code&gt; and &lt;code&gt;endpoints.MapGrpcReflectionService()&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; public void ConfigureServices(IServiceCollection services)
        {
            services.AddGrpc();
            services.AddGrpcReflection();
        }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; app.UseEndpoints(endpoints =&amp;gt;
            {
                endpoints.MapGrpcService&amp;lt;ProductAvailabilityCheckService&amp;gt;();
                endpoints.MapGrpcReflectionService();

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Finally, to set up an HTTP/2 endpoint without TLS, open &lt;em&gt;Program.cs&lt;/em&gt; and add the code below inside the call to &lt;em&gt;ConfigureWebHostDefaults&lt;/em&gt; so the service runs over HTTP on port 5000.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; .ConfigureWebHostDefaults(webBuilder =&amp;gt;
                {
                    webBuilder.ConfigureKestrel(options =&amp;gt;
                    {
                        // Setup a HTTP/2 endpoint without TLS.
                        options.ListenLocalhost(5000, o =&amp;gt; o.Protocols = 
                            HttpProtocols.Http2);

                    });
                    webBuilder.UseStartup&amp;lt;Startup&amp;gt;();
                });
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Also add the using statement &lt;br&gt;
&lt;code&gt;using Microsoft.AspNetCore.Server.Kestrel.Core;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Run the service by typing in the terminal&lt;br&gt;
&lt;code&gt;dotnet run&lt;/code&gt;&lt;br&gt;
The service will be up and running on the Kestrel web server.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3zugtwivk2so77tlsru3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3zugtwivk2so77tlsru3.png" alt="Image1"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Test Service Locally
&lt;/h3&gt;

&lt;p&gt;To test our service we use &lt;em&gt;grpcui&lt;/em&gt; which is like &lt;em&gt;Postman&lt;/em&gt;, but for &lt;em&gt;gRPC&lt;/em&gt; APIs instead of &lt;em&gt;REST&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;To install the tool first install &lt;em&gt;GO&lt;/em&gt; &lt;a href="https://golang.org/" rel="noopener noreferrer"&gt;Golang Site&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then run the two commands below in PowerShell (run as Administrator).&lt;br&gt;
&lt;code&gt;go get github.com/fullstorydev/grpcui/...&lt;/code&gt;&lt;br&gt;
&lt;code&gt;go install github.com/fullstorydev/grpcui/cmd/grpcui&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Next, run the tool with the server host address and port number&lt;/p&gt;

&lt;p&gt;&lt;code&gt;grpcui -plaintext localhost:5000&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;grpcui will expose a local URL to test the gRPC service.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdmq2eopb9iijzy4os3v5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdmq2eopb9iijzy4os3v5.png" alt="Image2"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Open the URL shown and add a product ID as shown below.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdk6auwen6s826o80jjh7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdk6auwen6s826o80jjh7.png" alt="Image3"&gt;&lt;/a&gt;&lt;br&gt;
The product availability will be shown as the response&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffwdmox75r170a7l42y63.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffwdmox75r170a7l42y63.png" alt="Image4"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Test service on Azure
&lt;/h2&gt;

&lt;p&gt;Next, we run the gRPC service in a container on Azure. Though we could build the container image on our workstation by installing the Azure CLI and Docker Engine, we will instead build it on Azure using Cloud Shell and an Azure Container Registry task, and push the image to Azure Container Registry.&lt;/p&gt;

&lt;p&gt;Update &lt;em&gt;Program.cs&lt;/em&gt; to the code below, as we no longer want the service to listen on localhost but on any IP on port 80.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.ConfigureWebHostDefaults(webBuilder =&amp;gt;
                {
                    webBuilder.ConfigureKestrel(options =&amp;gt;
                    {
                        // Setup a HTTP/2 endpoint without TLS.
                       // options.ListenLocalhost(5000, o =&amp;gt; o.Protocols = 
                       //     HttpProtocols.Http2);
                        options.ListenAnyIP(80, o =&amp;gt; o.Protocols = 
                            HttpProtocols.Http2);
                    });
                    webBuilder.UseStartup&amp;lt;Startup&amp;gt;();
                });
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Dockerfile Basics
&lt;/h3&gt;

&lt;p&gt;A Dockerfile is a text document that contains all the commands a user could call on the command line to assemble an image. Using docker build users can create an automated build that executes several command-line instructions in succession.&lt;/p&gt;

&lt;h3&gt;
  
  
  Create Dockerfile
&lt;/h3&gt;

&lt;p&gt;Create a new file in the service project, name it &lt;em&gt;Dockerfile&lt;/em&gt; and add the contents below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM mcr.microsoft.com/dotnet/aspnet:5.0 AS base
WORKDIR /app
EXPOSE 80

FROM mcr.microsoft.com/dotnet/sdk:5.0 AS build
WORKDIR /src
COPY "ProductAvailabilityService.csproj" .
RUN dotnet restore "ProductAvailabilityService.csproj"
COPY . .
RUN dotnet build "ProductAvailabilityService.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "ProductAvailabilityService.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "ProductAvailabilityService.dll"]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's understand what is happening in the Dockerfile.&lt;br&gt;
First, we restore the packages and dependencies the application requires to build. Next, we build the application and then publish it to the &lt;em&gt;/app/publish&lt;/em&gt; directory. For these stages the &lt;em&gt;.NET SDK&lt;/em&gt; image available on the &lt;em&gt;Microsoft Container Registry (MCR)&lt;/em&gt; is used as the base image. &lt;br&gt;
Next, we use the &lt;em&gt;ASP.NET Core runtime&lt;/em&gt; as the base image and copy the binaries generated in the previous stage into the &lt;em&gt;/app&lt;/em&gt; directory. Please note the assembly name in ENTRYPOINT is case-sensitive.&lt;/p&gt;
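<p>&lt;p&gt;If you do have Docker installed locally, the image can also be built and smoke-tested on your workstation before moving to the cloud. A quick sketch, with a hypothetical local tag:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Build the image from the service project directory (where the Dockerfile lives)
docker build -t product-availability-service:local .

# Run it, mapping container port 80 to local port 5000
docker run --rm -p 5000:80 product-availability-service:local
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;</p>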
&lt;h3&gt;
  
  
  Add source code to GitHub
&lt;/h3&gt;

&lt;p&gt;Create a &lt;em&gt;.gitignore&lt;/em&gt; file either manually or using &lt;em&gt;.gitignore Generator&lt;/em&gt; &lt;a href="https://marketplace.visualstudio.com/items?itemName=piotrpalarz.vscode-gitignore-generator" rel="noopener noreferrer"&gt;Visual Studio Marketplace&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Create an empty github repository at &lt;a href="https://github.com/new" rel="noopener noreferrer"&gt;https://github.com/new&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Run the following commands in terminal.&lt;/p&gt;

&lt;p&gt;Initialise the local directory as a Git repository.&lt;br&gt;
&lt;code&gt;git init&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Add all the files in the local directory to staging, ready for commit.&lt;br&gt;
&lt;code&gt;git add .&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Set your account’s default identity&lt;br&gt;
&lt;code&gt;git config --global user.email "your-emailid@your-email-domain.com"&lt;/code&gt;&lt;br&gt;
&lt;code&gt;git config --global user.name "your-username"&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Commit the staged files&lt;br&gt;
&lt;code&gt;git commit -m "first commit"&lt;/code&gt;&lt;br&gt;
&lt;code&gt;git branch -M main&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;To push your local repository changes, add your GitHub repository as a remote repository&lt;br&gt;
&lt;code&gt;git remote add origin My-GitHub-repository-URL&lt;/code&gt;&lt;br&gt;
(I replace My-GitHub-repository-URL to &lt;a href="https://github.com/c-arnab/grpc-article.git" rel="noopener noreferrer"&gt;https://github.com/c-arnab/grpc-article.git&lt;/a&gt; where you can find my entire source code)&lt;/p&gt;

&lt;p&gt;Push the local repository to the remote repository we added earlier called “origin”, in its branch named main.&lt;br&gt;
&lt;code&gt;git push -u origin main&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;You can now see all your files in the GitHub repository.&lt;/p&gt;
&lt;h3&gt;
  
  
  Build Container Image
&lt;/h3&gt;

&lt;p&gt;Login to &lt;em&gt;Azure Portal&lt;/em&gt; at &lt;a href="https://portal.azure.com/" rel="noopener noreferrer"&gt;https://portal.azure.com/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select &lt;em&gt;Cloud Shell&lt;/em&gt; and then the &lt;em&gt;Bash&lt;/em&gt; option&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdgxpwgxvsle7ao4x6kcv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdgxpwgxvsle7ao4x6kcv.png" alt="Image5"&gt;&lt;/a&gt;&lt;br&gt;
It's always advisable to create a new &lt;em&gt;resource group&lt;/em&gt; when working on a new project, or in this case, the code in an article.&lt;/p&gt;

&lt;p&gt;Run the following command to create a &lt;em&gt;resource group&lt;/em&gt;.&lt;br&gt;
&lt;code&gt;az group create --name grpc-container-demo --location southindia --subscription 59axxx4d-xxxx-4352-xxxx-21dd55xxxca0&lt;/code&gt;&lt;br&gt;
Please ensure you use your own &lt;em&gt;subscription ID&lt;/em&gt; after &lt;code&gt;--subscription&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Azure Container Registry (ACR) is an Azure-based private registry for container images.&lt;br&gt;
Run the following command to create an &lt;em&gt;Azure Container Registry (ACR)&lt;/em&gt;. You will need to provide a unique name. The &lt;code&gt;--admin-enabled&lt;/code&gt; flag will be useful when running the container image in a container instance.&lt;br&gt;
&lt;code&gt;az acr create --name MyGrpcContainerRegistry --resource-group grpc-container-demo --sku Standard --admin-enabled true&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Change directory to &lt;em&gt;clouddrive&lt;/em&gt;&lt;br&gt;
&lt;code&gt;cd clouddrive&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Clone the code with &lt;em&gt;Git&lt;/em&gt;&lt;br&gt;
&lt;code&gt;git clone https://github.com/c-arnab/grpc-article.git&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Change directory to &lt;em&gt;grpc-article&lt;/em&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fag6cox2y1szzjd405xfz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fag6cox2y1szzjd405xfz.png" alt="Image6"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;ACR Tasks&lt;/em&gt;, a feature of ACR, is a container image build service in the cloud.&lt;br&gt;
Build container image using the &lt;code&gt;az acr build&lt;/code&gt; command and store it in Registry.&lt;br&gt;
&lt;code&gt;az acr build --image product-availability-service:latest --registry MyGrpcContainerRegistry .&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Run the following command to list the images in the Azure Container Registry &lt;br&gt;
&lt;code&gt;az acr repository list --name MyGrpcContainerRegistry --output table&lt;/code&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvokysdh2cfiibnflc1rf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvokysdh2cfiibnflc1rf.png" alt="Image7"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Run image in Azure Container Instance
&lt;/h3&gt;

&lt;p&gt;Azure Container Instances is a service that enables a developer to deploy containers on the Microsoft Azure public cloud without having to provision or manage any underlying infrastructure.&lt;/p&gt;

&lt;p&gt;Once the image is in ACR, it's easy to test the application in the container image.&lt;/p&gt;

&lt;p&gt;Search for &lt;em&gt;container registries&lt;/em&gt; in &lt;em&gt;Azure portal&lt;/em&gt; and select the same.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi010f6pd2fbocxx12s1m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi010f6pd2fbocxx12s1m.png" alt="Image8"&gt;&lt;/a&gt;&lt;br&gt;
Select the registry created and then select &lt;em&gt;Access Keys&lt;/em&gt; under &lt;em&gt;Settings&lt;/em&gt;&lt;br&gt;
The &lt;em&gt;--admin-enabled&lt;/em&gt; flag while creating the registry has ensured that Admin user is enabled.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft03acsfz00keyn3c4evj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft03acsfz00keyn3c4evj.png" alt="Image9"&gt;&lt;/a&gt;&lt;br&gt;
Select &lt;em&gt;Repositories&lt;/em&gt; under &lt;em&gt;Services&lt;/em&gt; and then select the &lt;em&gt;product-availability-service&lt;/em&gt; under Repositories.&lt;br&gt;
Click on the &lt;em&gt;three dots&lt;/em&gt; on the right side of the &lt;em&gt;latest&lt;/em&gt; tag and select &lt;em&gt;Run instance&lt;/em&gt;.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxft3k732wae9qo1dtvay.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxft3k732wae9qo1dtvay.png" alt="Image10"&gt;&lt;/a&gt;&lt;br&gt;
In the ensuing screen, create the &lt;em&gt;Container Instance&lt;/em&gt; by providing a name and selecting a location near you, then click OK.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fged810b4vl6okqvg4hcq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fged810b4vl6okqvg4hcq.png" alt="Image11"&gt;&lt;/a&gt;&lt;br&gt;
Once the deployment completes, you should receive a notification; click it to reach the container group, view the container instance and confirm that it is running. Finally, note the IP address, which will be required when we create a client for the service next.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fum326pobat26ln19syni.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fum326pobat26ln19syni.png" alt="Image12"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Create a gRPC client
&lt;/h3&gt;

&lt;p&gt;For this article we create a console app as a client which in turn calls &lt;em&gt;CheckProductAvailabilityRequest&lt;/em&gt; method in the service.&lt;/p&gt;

&lt;p&gt;Close all applications/folder in VS Code and open a new terminal.&lt;/p&gt;

&lt;p&gt;Run the command below to create a console app&lt;br&gt;
&lt;code&gt;dotnet new console -o ProductAvailabilityClient&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Open the folder/application &lt;em&gt;ProductAvailabilityClient&lt;/em&gt; in VS Code&lt;/p&gt;

&lt;p&gt;In the terminal, run the following commands to add the necessary gRPC libraries&lt;/p&gt;

&lt;p&gt;&lt;code&gt;dotnet add ProductAvailabilityClient.csproj package Grpc.Net.Client&lt;/code&gt;&lt;br&gt;
&lt;code&gt;dotnet add ProductAvailabilityClient.csproj package Google.Protobuf&lt;/code&gt;&lt;br&gt;
&lt;code&gt;dotnet add ProductAvailabilityClient.csproj package Grpc.Tools&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Create a folder &lt;em&gt;Protos&lt;/em&gt; and copy the &lt;em&gt;.proto&lt;/em&gt; file used in the service application into this folder.&lt;/p&gt;
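
<p>&lt;p&gt;As on the server, the &lt;code&gt;Grpc.Tools&lt;/code&gt; package finds the proto file through the &lt;em&gt;.csproj&lt;/em&gt; file, so the client project also needs a &lt;em&gt;Protobuf&lt;/em&gt; reference, this time with &lt;em&gt;GrpcServices&lt;/em&gt; set to "Client" so that client stubs are generated. Add the following to &lt;em&gt;ProductAvailabilityClient.csproj&lt;/em&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;ItemGroup&amp;gt;
    &amp;lt;Protobuf Include="Protos\product-availability-service.proto" GrpcServices="Client" /&amp;gt;
&amp;lt;/ItemGroup&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;</p>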

&lt;p&gt;Replace the &lt;em&gt;Program.cs&lt;/em&gt; code with the following&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using System;
using System.Threading.Tasks;
using ProductAvailabilityService;
using Grpc.Net.Client;

namespace ProductAvailabilityClient
{
    class Program
    {
        static async Task Main(string[] args)
        {
            var channel = GrpcChannel.ForAddress("http://52.226.2.157:80");
            var client =  new ProductAvailabilityCheck.ProductAvailabilityCheckClient(channel);
            var productRequest = new ProductAvailabilityRequest { ProductId = "pid001"};
            var reply = await client.CheckProductAvailabilityRequestAsync(productRequest);

            Console.WriteLine($"{productRequest.ProductId} is {(reply.IsAvailable ? "available" : "not available")}!");
            Console.WriteLine("Press any key to exit...");
            Console.ReadKey();
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace the IP address with the IP address of your container instance. Also note the ProductId; you can change this value to any other product ID.&lt;/p&gt;

&lt;p&gt;Run the console client by typing in the terminal&lt;br&gt;
&lt;code&gt;dotnet run&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;The console should show whether the product is available. Change the product ID and rerun the application to confirm that it works.&lt;/p&gt;
&lt;h2&gt;
  
  
  Deploy on Azure Kubernetes Service
&lt;/h2&gt;
&lt;h3&gt;
  
  
  Foundational concepts of Kubernetes
&lt;/h3&gt;

&lt;p&gt;Containers are not new. The fundamental idea goes back to 1979 with Unix's chroot, which allowed processes to be isolated. Then came Linux containers in 2008. Docker made its appearance in 2013 and put the power of containers in the hands of developers.&lt;/p&gt;

&lt;p&gt;And then Google came up with Kubernetes to run these containers at scale, with intelligent deployment, auto repair, horizontal scaling, service discovery &amp;amp; load balancing, automated deployments &amp;amp; rollbacks, and key/secret management.&lt;/p&gt;

&lt;p&gt;An application running on Kubernetes is known as a workload and runs inside a set of pods. A workload can have a single component deployed in a single pod, or several components that work together deployed across one or more pods. &lt;br&gt;
You can deploy container images in Kubernetes either directly in a pod (a naked Pod) or, rather than managing each Pod directly, via workload resources that manage a set of pods, such as a Deployment and its ReplicaSet.&lt;br&gt;
It is advised not to use naked Pods (that is, Pods not bound to a ReplicaSet or Deployment), as naked Pods will not be rescheduled in the event of a node failure.&lt;br&gt;
A Service is an abstract way to expose an application running on a set of Pods as a network service.&lt;/p&gt;

&lt;p&gt;Azure Kubernetes Service (AKS) is a managed Kubernetes service in which the master node is managed by Azure and end users manage the worker nodes. Users can use AKS to deploy, scale, and manage Docker containers and container-based applications across a cluster of container hosts.&lt;/p&gt;
&lt;h3&gt;
  
  
  Create Service Principal
&lt;/h3&gt;

&lt;p&gt;While creating the container registry we enabled admin access with the &lt;em&gt;--admin-enabled&lt;/em&gt; flag to easily allow the image to be run from a container instance, but granting admin access is not a best practice from a security perspective. For deploying to AKS we use a &lt;em&gt;service principal&lt;/em&gt;, which enables applications and services (here, AKS) to authenticate to the container registry.&lt;/p&gt;

&lt;p&gt;Get the full Container Registry ID&lt;br&gt;
&lt;code&gt;ACR_ID=$(az acr show --name MyGrpcContainerRegistry --query id --output tsv)&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Create the service principal with the appropriate rights&lt;br&gt;
&lt;code&gt;SP_PASSWD=$(az ad sp create-for-rbac --name http://acr-service-principal --scopes $ACR_ID --role owner --query password --output tsv)&lt;/code&gt;&lt;br&gt;
&lt;code&gt;SP_APP_ID=$(az ad sp show --id http://acr-service-principal --query appId --output tsv)&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Please copy the &lt;em&gt;service principal ID and password&lt;/em&gt; to a safe place as we will need them later.&lt;br&gt;
&lt;code&gt;echo "ID: $SP_APP_ID"&lt;/code&gt;&lt;br&gt;
&lt;code&gt;echo "Password: $SP_PASSWD"&lt;/code&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Create Cluster
&lt;/h3&gt;

&lt;p&gt;Create an AKS cluster using the &lt;em&gt;az aks create&lt;/em&gt; command. &lt;br&gt;
&lt;code&gt;az aks create --resource-group grpc-container-demo --name mygrpcdemocluster --node-count 2 --generate-ssh-keys&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;If you receive an error, check which VM sizes are available without restrictions in your default subscription and region.&lt;br&gt;
&lt;code&gt;az vm list-skus -l southindia --query '[?resourceType==`virtualMachines` &amp;amp;&amp;amp; restrictions==`[]`]' -o table&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;If the above command does not return any data, try other locations; for example, try &lt;em&gt;centralindia&lt;/em&gt; in place of &lt;em&gt;southindia&lt;/em&gt;.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpb8g75xfvehgozylwjpx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpb8g75xfvehgozylwjpx.png" alt="Image13"&gt;&lt;/a&gt;&lt;br&gt;
Once you get data, run the &lt;em&gt;az aks create&lt;/em&gt; command again with the &lt;em&gt;--location&lt;/em&gt; flag&lt;br&gt;
&lt;code&gt;az aks create --resource-group grpc-container-demo --name mygrpcdemocluster --node-count 2 --generate-ssh-keys --location centralindia&lt;/code&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feeo0eqfmslamimjz3oe7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feeo0eqfmslamimjz3oe7.png" alt="Image14"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Connect to Cluster
&lt;/h3&gt;

&lt;p&gt;The Kubernetes command-line client, &lt;em&gt;kubectl&lt;/em&gt;, is used to manage a Kubernetes cluster and comes preinstalled in Azure Cloud Shell.&lt;/p&gt;

&lt;p&gt;Use the &lt;em&gt;az aks get-credentials&lt;/em&gt; command to download credentials and configure &lt;em&gt;kubectl&lt;/em&gt; to use them to connect to the Kubernetes cluster&lt;br&gt;
&lt;code&gt;az aks get-credentials --resource-group grpc-container-demo --name mygrpcdemocluster&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;View namespaces&lt;br&gt;
&lt;code&gt;kubectl get namespaces&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Create a namespace to work in&lt;br&gt;
&lt;code&gt;kubectl create namespace mygrpcdemospace&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Set created namespace to current namespace&lt;br&gt;
&lt;code&gt;kubectl config set-context --current --namespace=mygrpcdemospace&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Confirm you are in the correct namespace&lt;br&gt;
&lt;code&gt;kubectl config get-contexts&lt;/code&gt;      &lt;/p&gt;
&lt;h3&gt;
  
  
  Deploy application
&lt;/h3&gt;

&lt;p&gt;Kubernetes uses an &lt;em&gt;image pull secret&lt;/em&gt; to store the information needed to authenticate to a registry. To create the pull secret for an Azure container registry, run the following command with a name (my-acr-secret), the service principal ID and password you created earlier, and your registry URL.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;kubectl create secret docker-registry my-acr-secret \
    --namespace mygrpcdemospace \
    --docker-server=mygrpccontainerregistry.azurecr.io \
    --docker-username=eddeabb7-6f26-4e72-b741-549f67e83736 \
    --docker-password=j_L.57Om6S1Pqh9-V8_iSFPSk9m7jFD054
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnpfb3k2wq4qx1cu2ru6i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnpfb3k2wq4qx1cu2ru6i.png" alt="Image15"&gt;&lt;/a&gt;&lt;br&gt;
A &lt;em&gt;ReplicaSet&lt;/em&gt; ensures that a specified number of pod replicas are running at any given time. However, a &lt;em&gt;Deployment&lt;/em&gt; is a higher-level concept that manages ReplicaSets and provides declarative updates to Pods along with a lot of other useful features. Therefore, we will use Deployments instead of directly using ReplicaSets to deploy our container.&lt;/p&gt;

&lt;p&gt;Run the following command to create a &lt;em&gt;manifest yaml&lt;/em&gt; and submit it to our Kubernetes cluster, creating a new &lt;em&gt;Deployment&lt;/em&gt; object named "grpc-demo-deployment". Do note the &lt;em&gt;image&lt;/em&gt; and &lt;em&gt;imagePullSecrets&lt;/em&gt; values and change them to your own.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cat &amp;lt;&amp;lt;EOF | kubectl apply -f -
apiVersion: apps/v1
kind: Deployment
metadata:
  name: grpc-demo-deployment
spec:
  replicas: 2
  selector:
    matchLabels:
      app: grpc-demo
  template:
    metadata:
      labels:
        app: grpc-demo
        env: dev
    spec:
      containers:
      - name: grpc-demo
        image: mygrpccontainerregistry.azurecr.io/product-availability-service:latest
        imagePullPolicy: Always
        ports:
        - containerPort: 80
      imagePullSecrets:
      - name: my-acr-secret
EOF
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
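
&lt;p&gt;You can also wait for the rollout to finish before moving on (assuming the Deployment name used above):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;kubectl rollout status deployment/grpc-demo-deployment
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This blocks until all replicas report ready, which is handy in scripts and CI pipelines.&lt;/p&gt;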



&lt;p&gt;Check if the Deployment was created&lt;br&gt;
&lt;code&gt;kubectl get deployments&lt;/code&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyvn0h7vuubxvu179jsyq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyvn0h7vuubxvu179jsyq.png" alt="Image16"&gt;&lt;/a&gt;&lt;br&gt;
View the current ReplicaSets deployed&lt;br&gt;
&lt;code&gt;kubectl get rs&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Check for the Pods brought up&lt;br&gt;
&lt;code&gt;kubectl get pods --show-labels&lt;/code&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Futott6qetgptjjn98d9o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Futott6qetgptjjn98d9o.png" alt="Image17"&gt;&lt;/a&gt;&lt;br&gt;
Run the following command to create a new &lt;em&gt;Service&lt;/em&gt; object named "grpc-service", which targets TCP port 80 on any Pod with the &lt;em&gt;selector&lt;/em&gt; app=grpc-demo label.&lt;br&gt;
The selector determines the set of pods targeted by the service.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cat &amp;lt;&amp;lt;EOF | kubectl apply -f -
apiVersion: v1
kind: Service
metadata:
  name: grpc-service
spec:
  selector:
    app: grpc-demo
  ports:
    - protocol: TCP
      port: 80
      targetPort: 80
  type: LoadBalancer
EOF
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
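
&lt;p&gt;To confirm that the selector actually matched the Pods created by the Deployment, you can list the endpoints behind the service; you should see one IP:port entry per running Pod:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;kubectl get endpoints grpc-service
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;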



&lt;p&gt;Run the following command and wait until you see an external IP assigned. Copy this IP address, as we will need it to test the service in the next step.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;watch kubectl get services&lt;/code&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbd8njdylnuujqm06xela.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbd8njdylnuujqm06xela.png" alt="Image18"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj1ea95vvuftamjk8ywxx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj1ea95vvuftamjk8ywxx.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Test the service deployed on AKS
&lt;/h3&gt;

&lt;p&gt;Open the client application developed earlier in VS Code and replace the IP address with the external IP address noted in the previous step.&lt;/p&gt;
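
&lt;p&gt;Assuming the client was built with &lt;em&gt;Grpc.Net.Client&lt;/em&gt;, the only line that needs to change is the channel address. A minimal sketch (the client class name here is illustrative; keep whatever names your generated stubs use):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Replace &amp;lt;EXTERNAL-IP&amp;gt; with the external IP copied in the previous step.
// The service is exposed on plain HTTP port 80, so use http:// here.
using var channel = GrpcChannel.ForAddress("http://&amp;lt;EXTERNAL-IP&amp;gt;");
var client = new ProductAvailability.ProductAvailabilityClient(channel);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;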

&lt;p&gt;Run the console client by typing in terminal&lt;br&gt;
&lt;code&gt;dotnet run&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;The console should show whether the product is available. Change the product ID and rerun the application to confirm that it works.&lt;/p&gt;

&lt;h2&gt;
  
  
  Finally..
&lt;/h2&gt;

&lt;p&gt;Delete the resource group and all the resources in it.&lt;br&gt;
&lt;code&gt;az group delete --name grpc-container-demo&lt;/code&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What's next
&lt;/h2&gt;

&lt;p&gt;Microsoft Azure also offers another service named Azure Red Hat OpenShift (ARO).&lt;br&gt;
Depending on who implemented Kubernetes and which version, you end up with different tools for the container runtime, log management, metrics &amp;amp; monitoring, and so on. This led enterprise customers to ask for a consistent Kubernetes distribution that runs the same way everywhere, from internal datacentres to any cloud. OpenShift is that enterprise layer for Kubernetes in hybrid-cloud deployments.&lt;br&gt;
Azure Red Hat OpenShift provides a flexible, self-service deployment of fully managed OpenShift clusters, jointly operated and supported by Microsoft &amp;amp; Red Hat, with an SLA of 99.95% availability.&lt;br&gt;
Unlike AKS, where Microsoft manages the master nodes and the customer pays for and manages only the worker nodes, an ARO cluster is dedicated to a given customer, and all resources, including the cluster master nodes, run in the customer’s subscription.&lt;br&gt;
Our demo code is part of an inventory microservice within a supply chain application. Depending on customer preference, such applications are deployed either in the customer’s internal datacentre or on a public cloud.&lt;br&gt;
For such applications, it is better for a developer to choose a deployment platform that can run in any environment (bare metal, virtual machines, public clouds). OpenShift is ideally suited for this, and Azure Red Hat OpenShift is probably the best option for it among the public clouds.&lt;br&gt;
So, in the next article, we will look at deploying a series of microservices that are part of a supply chain application on ARO, with continuous integration, delivery and deployment.&lt;/p&gt;

</description>
      <category>dotnet</category>
      <category>grpc</category>
      <category>azure</category>
      <category>kubernetes</category>
    </item>
  </channel>
</rss>
