<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Showmen Dasgupta</title>
    <description>The latest articles on DEV Community by Showmen Dasgupta (@showmen_dasgupta_08171571).</description>
    <link>https://dev.to/showmen_dasgupta_08171571</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1860411%2F36986021-e340-403a-ad5b-2e5b78ab8c38.jpg</url>
      <title>DEV Community: Showmen Dasgupta</title>
      <link>https://dev.to/showmen_dasgupta_08171571</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/showmen_dasgupta_08171571"/>
    <language>en</language>
    <item>
      <title>From Setup to Implementation: Integrating Kafka with a .NET Web API (Part 1)</title>
      <dc:creator>Showmen Dasgupta</dc:creator>
      <pubDate>Wed, 04 Sep 2024 15:25:57 +0000</pubDate>
      <link>https://dev.to/showmen_dasgupta_08171571/from-setup-to-implementation-integrating-kafka-with-a-net-web-api-part-1-1em6</link>
      <guid>https://dev.to/showmen_dasgupta_08171571/from-setup-to-implementation-integrating-kafka-with-a-net-web-api-part-1-1em6</guid>
      <description>&lt;h3&gt;
  
  
  Introduction:
&lt;/h3&gt;

&lt;p&gt;In today’s fast-paced digital landscape, real-time data processing has become essential for building responsive and scalable applications. Whether it’s tracking user interactions, processing transactions, or managing logs, the ability to handle large volumes of data in real-time is crucial for modern software systems.&lt;/p&gt;

&lt;h3&gt;
  
  
  What is Kafka?
&lt;/h3&gt;

&lt;p&gt;Apache Kafka is an open-source distributed event streaming platform designed for high-throughput, low-latency data streaming. Originally developed by LinkedIn and later open-sourced under the Apache License, Kafka has quickly become the de facto standard for building real-time data pipelines and streaming applications.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Features of Kafka:
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Scalability:&lt;/strong&gt; &lt;br&gt;
Kafka can handle high volumes of data with ease, allowing you to scale horizontally across multiple servers.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Durability:&lt;/strong&gt; &lt;br&gt;
Kafka ensures data is stored reliably by replicating it across different nodes in the cluster.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Fault-Tolerance:&lt;/strong&gt; &lt;br&gt;
Kafka’s distributed architecture is designed to continue operating seamlessly, even in the event of server failures.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Real-Time Processing:&lt;/strong&gt; &lt;br&gt;
Kafka can process streams of data in real-time, making it ideal for applications that require immediate insights.&lt;/p&gt;
&lt;h3&gt;
  
  
  Where is Kafka Used?
&lt;/h3&gt;

&lt;p&gt;Kafka is used across various industries and by many leading tech companies to build robust, real-time data pipelines. Here are some common use cases:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Log Aggregation:&lt;/strong&gt; &lt;br&gt;
Collecting and storing logs from different sources in a central location for monitoring and analysis.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Real-Time Analytics:&lt;/strong&gt; &lt;br&gt;
Processing streams of data in real-time to generate analytics and insights for immediate decision-making.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Event Sourcing:&lt;/strong&gt; &lt;br&gt;
Capturing every change to an application state as an event stream, which can be replayed to reconstruct past states.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Stream Processing:&lt;/strong&gt; &lt;br&gt;
Continuous processing and transformation of data streams, often using Kafka Streams or other stream processing frameworks like Apache Flink.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faxpyaxucr9jkpzac9ycu.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faxpyaxucr9jkpzac9ycu.jpg" alt="Kafka broker, producer and consumer overview" width="700" height="445"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Figure 1: Kafka Architecture Overview. Image source: &lt;a href="https://www.learningjournal.guru/article/kafka/what-is-kafka/" rel="noopener noreferrer"&gt;Learning Journal&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;In the diagram above:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Producers are client applications that publish messages to Kafka topics.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Topics are categories to which messages are sent and stored. Kafka divides these topics into Partitions for scalability and parallel processing.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Brokers are Kafka servers that store the data and serve it to consumers. Kafka typically runs as a cluster of multiple brokers.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Consumers are client applications that read messages from Kafka topics, often processing the data or forwarding it to other systems.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;
  
  
  Sources:
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Official Documentation:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Kafka: &lt;a href="https://kafka.apache.org/documentation/" rel="noopener noreferrer"&gt;Apache Kafka Documentation&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;ZooKeeper Documentation: Apache ZooKeeper’s official documentation explains its purpose and its application in distributed systems like Kafka. &lt;a href="https://zookeeper.apache.org/doc/current/zookeeperOver.html" rel="noopener noreferrer"&gt;ZooKeeper Overview
&lt;/a&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Books:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;"Kafka: The Definitive Guide" by Neha Narkhede, Gwen Shapira, and Todd Palino (O'Reilly Media) – provides in-depth knowledge about Kafka's architecture, including a detailed explanation of ZooKeeper's function.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Articles and Tutorials:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Confluent Blog: Offers numerous articles on Kafka and its integration with various technologies.&lt;/li&gt;
&lt;li&gt;Microsoft Learn: Microsoft Learn .NET is a great resource for tutorials and practical guides.&lt;/li&gt;
&lt;li&gt;Medium Articles and Technical Blogs: Various engineers and developers who work with Kafka often write about their experiences and the technical details of ZooKeeper and Kafka.
Example: Kafka’s Evolution: ZooKeeper to KRaft&lt;/li&gt;
&lt;/ol&gt;
&lt;h3&gt;
  
  
  Series Overview:
&lt;/h3&gt;

&lt;p&gt;This three-part series will guide you through the process of integrating Kafka with a .NET Web API application. By the end of this series, you’ll have a solid understanding of how to set up Kafka, produce and consume messages, and implement a real-world use case involving a product cart system.&lt;/p&gt;
&lt;h3&gt;
  
  
  Part 1: Setting up Kafka and your .NET Application
&lt;/h3&gt;

&lt;p&gt;In this first part, we will lay the groundwork by setting up Apache Kafka on your local machine and configuring a .NET Web API project. Whether you're new to Kafka or just need a refresher, this section will guide you through the installation process and help you get started with the necessary tools and dependencies.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwvydrofkhngohah952xb.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwvydrofkhngohah952xb.jpg" alt="Kafka setup diagram" width="600" height="336"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Figure 2: Kafka Setup Overview. Image source: &lt;a href="https://www.tutorialspoint.com/apache_kafka/apache_kafka_cluster_architecture.htm" rel="noopener noreferrer"&gt;Kafka Cluster Architecture Overview&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Understanding ZooKeeper:&lt;/strong&gt;&lt;br&gt;
ZooKeeper plays a critical role in the architecture of Apache Kafka, acting as a centralised service for maintaining configuration information, naming, providing distributed synchronisation, and offering group services. ZooKeeper’s primary function within Kafka is to manage and coordinate the Kafka brokers, handle leader election for Kafka partitions, and track the status of distributed resources, making it essential for ensuring high availability and fault tolerance within Kafka clusters.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Role in Kafka Ecosystem&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Broker Management:&lt;br&gt;
ZooKeeper keeps track of all Kafka brokers within a cluster, ensuring that each broker is aware of the others. It helps manage the dynamic membership of the cluster by tracking which brokers are active or have failed.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Leader Election: &lt;br&gt;
In Kafka, each partition of a topic has a leader broker that handles all reads and writes. ZooKeeper is responsible for the leader election process, ensuring that if a broker fails, a new leader is promptly elected from the available replicas.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Topic Configuration and Metadata: &lt;br&gt;
ZooKeeper stores the metadata about Kafka topics, partitions, and the associated replicas. This centralized management helps Kafka brokers retrieve and update topic configurations efficiently.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Consumer Group Coordination: &lt;br&gt;
In earlier versions of Kafka, ZooKeeper was used to manage consumer group coordination, including keeping track of offsets. This responsibility has since been migrated to Kafka brokers themselves in more recent versions, but ZooKeeper’s role was foundational in earlier implementations.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Importance and Evolution&lt;/strong&gt;&lt;br&gt;
ZooKeeper's role is indispensable in Kafka’s operation, particularly for ensuring consistency and fault tolerance across the distributed system. However, as Kafka has evolved, there has been a movement towards reducing the dependency on ZooKeeper, leading to the introduction of Kafka's own built-in consensus protocol called KRaft (Kafka Raft). KRaft aims to simplify the architecture by handling broker metadata management directly within Kafka itself, eliminating the need for an external ZooKeeper cluster.&lt;/p&gt;
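&lt;p&gt;For reference, here is roughly what a broker service looks like without ZooKeeper. This is an illustrative sketch based on the bitnami image's KRaft settings; the rest of this series sticks with the ZooKeeper-based setup shown below.&lt;/p&gt;

```yaml
# Illustrative KRaft (ZooKeeper-less) broker config for bitnami/kafka.
# Not used in this series; shown only to contrast with the ZooKeeper setup.
kafka-server:
  image: bitnami/kafka:latest
  environment:
    - KAFKA_CFG_NODE_ID=0
    - KAFKA_CFG_PROCESS_ROLES=controller,broker
    - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=0@kafka-server:9093
    - KAFKA_CFG_LISTENERS=PLAINTEXT://:9092,CONTROLLER://:9093
    - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
    - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT
```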

&lt;p&gt;&lt;strong&gt;Setting up Kafka:&lt;/strong&gt; &lt;br&gt;
Let's begin by walking through the steps to install Kafka on your local development environment. You’ll learn about the core components of Kafka, including topics, partitions, brokers, and how to configure them.&lt;/p&gt;

&lt;p&gt;To do that, let's set up a Docker Compose file in our project folder. It will spin up all the services we need to work with Kafka locally.&lt;/p&gt;

&lt;p&gt;The project structure looks like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2b1u0zo9raamn7766nei.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2b1u0zo9raamn7766nei.png" alt="Project Structure" width="800" height="410"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's take a look at the Docker Compose file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;networks:
  kafka-net:
    driver: bridge

services:
  zookeeper-server:
    image: bitnami/zookeeper:latest
    networks:
      - kafka-net
    ports:
      - 2181:2181
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
  kafdrop:
    image: obsidiandynamics/kafdrop:3.28.0
    networks:
      - kafka-net
    restart: "no"
    ports:
      - 9000:9000
    environment:
      KAFKA_BROKERCONNECT: PLAINTEXT://kafka-server:29092
      JVM_OPTS: -Xms16M -Xmx48M -Xss180K -XX:-TieredCompilation -XX:+UseStringDeduplication -noverify
      SCHEMAREGISTRY_CONNECT: http://schema-registry:8081
    depends_on:
      - kafka-server
  kafka-server:
    image: bitnami/kafka:latest
    networks:
      - kafka-net
    ports:
      - 9092:9092
    environment:
      - KAFKA_CFG_ZOOKEEPER_CONNECT=zookeeper-server:2181
      - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://kafka-server:29092,PLAINTEXT_HOST://127.0.0.1:9092
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      - KAFKA_CFG_LISTENERS=PLAINTEXT://:29092,PLAINTEXT_HOST://:9092
      - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=PLAINTEXT
      - ALLOW_PLAINTEXT_LISTENER=yes
    depends_on:
      - zookeeper-server
  schema-registry:
    image: confluentinc/cp-schema-registry:latest
    networks:
      - kafka-net
    ports:
      - 8081:8081
    environment:
      - SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS=PLAINTEXT://kafka-server:29092
      - SCHEMA_REGISTRY_HOST_NAME=localhost
      - SCHEMA_REGISTRY_LISTENERS=http://0.0.0.0:8081
    depends_on:
      - kafka-server
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;While writing this post, I drew inspiration from another blog that provides a clear guide to setting up the local environment:&lt;br&gt;
&lt;a href="https://thecloudblog.net/post/event-driven-architecture-with-apache-kafka-for-net-developers-part-1-event-producer/" rel="noopener noreferrer"&gt;https://thecloudblog.net/post/event-driven-architecture-with-apache-kafka-for-net-developers-part-1-event-producer/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The docker compose file includes:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Apache ZooKeeper settings and image&lt;/li&gt;
&lt;li&gt;Kafka settings and image&lt;/li&gt;
&lt;li&gt;Kafdrop settings and image (an open-source Kafka web UI)&lt;/li&gt;
&lt;li&gt;Schema Registry settings and image&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Now we will run Docker Compose to pull the images and start the containers locally:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker-compose up -d
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once all the images have been downloaded and the containers are running, you can view them in Docker Desktop.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2lhco7qjqvj1mijsic3v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2lhco7qjqvj1mijsic3v.png" alt="Docker images Kafka" width="800" height="288"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Alternatively, you can check their status from the console using:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker ps
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F432u5h5mmqkakca7b87k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F432u5h5mmqkakca7b87k.png" alt="Docker ps" width="800" height="55"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can also take a look at the Kafdrop UI running on localhost at port 9000, as shown in the screenshots below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxinl4yq62jj4oh0oay39.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxinl4yq62jj4oh0oay39.png" alt="Kafdrop" width="800" height="469"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that our Kafka environment is ready, let's install the necessary libraries for our project:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;dotnet add package Confluent.Kafka
dotnet add package Confluent.SchemaRegistry.Serdes.Avro
dotnet restore
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Our application's architecture is represented by the following diagram:&lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F58zzc2jtwg3722gti63a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F58zzc2jtwg3722gti63a.png" alt="kafka cart" width="800" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the image, you'll notice a topic named &lt;strong&gt;cart-item&lt;/strong&gt;, which receives items, and another topic, &lt;strong&gt;cart-item-processed&lt;/strong&gt;, that displays the status of processed cart items. The &lt;strong&gt;cart-item-group-id&lt;/strong&gt; represents our consumer group ID.&lt;/p&gt;
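&lt;p&gt;Kafka can auto-create topics on first use by default, but creating them explicitly gives you control over partition counts. Here is a sketch using the Kafka CLI inside the broker container (the container name comes from our Docker Compose file; the partition count is just an example):&lt;/p&gt;

```shell
# Create the two topics from the diagram (single broker, so replication factor 1)
docker exec kafka-server kafka-topics.sh --create --topic cart-item \
  --partitions 3 --replication-factor 1 --bootstrap-server localhost:9092

docker exec kafka-server kafka-topics.sh --create --topic cart-item-processed \
  --partitions 3 --replication-factor 1 --bootstrap-server localhost:9092

# Verify both topics exist
docker exec kafka-server kafka-topics.sh --list --bootstrap-server localhost:9092
```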

&lt;p&gt;Now let's set up a KafkaConfiguration class:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public static class KafkaConfiguration
{
    public static void ConfigureServices(IServiceCollection services, IConfiguration configuration)
    {
        // Add controllers and other services to the container
        services.AddControllers(static options =&amp;gt;
        {
            var formatter = options.InputFormatters.OfType&amp;lt;SystemTextJsonInputFormatter&amp;gt;()
                .First(static formatter =&amp;gt; formatter.SupportedMediaTypes.Contains("application/json"));

            formatter.SupportedMediaTypes.Add("application/csp-report");
            formatter.SupportedMediaTypes.Add("application/reports+json");
        });

        services.AddEndpointsApiExplorer();
        services.AddSwaggerGen();

        // Register Kafka services with configurable settings
        services.AddSingleton&amp;lt;KafkaProducerService&amp;gt;(sp =&amp;gt;
        {
            var kafkaConfig = sp.GetRequiredService&amp;lt;IOptions&amp;lt;KafkaConfigSecrets&amp;gt;&amp;gt;().Value;
            return new KafkaProducerService(kafkaConfig.KafkaBroker, kafkaConfig.KafkaTopic, kafkaConfig.SchemaRegistry);
        });

        services.AddSingleton&amp;lt;KafkaConsumerService&amp;gt;(sp =&amp;gt;
        {
            var kafkaConfig = sp.GetRequiredService&amp;lt;IOptions&amp;lt;KafkaConfigSecrets&amp;gt;&amp;gt;().Value;
            return new KafkaConsumerService(kafkaConfig.KafkaBroker, kafkaConfig.KafkaGroupId, kafkaConfig.KafkaTopic, kafkaConfig.SchemaRegistry, kafkaConfig.KafkaProcessedTopic);
        });

        services.AddSingleton&amp;lt;IShoppingCartRepository, ShoppingCartRepository&amp;gt;();
    }

    public static void ConfigureMiddleware(WebApplication application)
    {
        // Only use Swagger in development
        if (application.Environment.IsDevelopment())
        {
            application.UseSwagger();
            application.UseSwaggerUI();
        }

        application.UseHttpsRedirection();
        application.UseAuthorization();
        application.MapControllers();
    }

    public static Task InitiateKafkaConsumer(KafkaConsumerService kafkaConsumerService,
        CancellationTokenSource? cancellationTokenSource)
    {
        // Will be implemented in the next part of the series.
        // Returning Task (instead of async void) lets callers await it and observe failures.
        return Task.CompletedTask;
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now we will implement a KafkaConfigSecrets model, which will be used to read the Kafka settings:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public class KafkaConfigSecrets
{
    public string SchemaRegistry { get; set; } = default!;
    public string KafkaBroker { get; set; } = default!;
    public string KafkaGroupId { get; set; } = default!;
    public string KafkaTopic { get; set; } = default!;
    public string KafkaProcessedTopic { get; set; } = default!;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
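&lt;p&gt;For reference, a matching secret.json could look like this (the values are illustrative; the ports and topic names follow the setup above, and the section name has to match the one we bind in Program.cs):&lt;/p&gt;

```json
{
  "KafkaConfig": {
    "KafkaBroker": "localhost:9092",
    "KafkaGroupId": "cart-item-group-id",
    "KafkaTopic": "cart-item",
    "KafkaProcessedTopic": "cart-item-processed",
    "SchemaRegistry": "http://localhost:8081"
  }
}
```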



&lt;p&gt;Now, let's update Program.cs to wire in the Kafka configuration when the application starts.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using ApacheKafkaBasics.Configuration;

var builder = WebApplication.CreateBuilder(args);

builder.Configuration.AddJsonFile("secret.json", true, true);
// Registering configuration section as a strongly typed class
builder.Services.Configure&amp;lt;KafkaConfigSecrets&amp;gt;(builder.Configuration.GetSection("KafkaConfig"));
// Configure services
KafkaConfiguration.ConfigureServices(builder.Services, builder.Configuration);

var app = builder.Build();
// Configure the HTTP request pipeline
KafkaConfiguration.ConfigureMiddleware(app);

app.Run();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now let's set up a service for the Kafka producer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public class KafkaProducerService : IKafkaProducerService
{
    private readonly IAdminClient _adminClient;
    private readonly string _topicName;
    private readonly object _queueLock = new();

    public KafkaProducerService(string brokerList, string kafkaTopic, string schemaRegistryUrl)
    {
        var adminConfig = new AdminClientConfig { BootstrapServers = brokerList };
        var schemaRegistryConfig = new SchemaRegistryConfig { Url = schemaRegistryUrl };
        var producerConfig = new ProducerConfig
        {
            BootstrapServers = brokerList,
            // Guarantees delivery of message to topic.
            EnableDeliveryReports = true,
            ClientId = Dns.GetHostName()
        };

        var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig);

        _adminClient = new AdminClientBuilder(adminConfig).Build();
        _topicName = kafkaTopic;
    }
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
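&lt;p&gt;The admin client is not used yet. One natural job for it, before the produce logic arrives in Part 2, is making sure the topic exists. A hypothetical helper method for the service above might look like this (the partition and replication values are illustrative):&lt;/p&gt;

```csharp
// Hypothetical helper for KafkaProducerService; requires using Confluent.Kafka.Admin;
public async Task CreateTopicIfNotExistsAsync()
{
    // GetMetadata returns cluster metadata, including the topics the broker knows about.
    var metadata = _adminClient.GetMetadata(TimeSpan.FromSeconds(10));
    if (metadata.Topics.Any(t =&amp;gt; t.Topic == _topicName))
        return;

    await _adminClient.CreateTopicsAsync(new[]
    {
        new TopicSpecification { Name = _topicName, NumPartitions = 3, ReplicationFactor = 1 }
    });
}
```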



&lt;p&gt;We also need to implement a service for the Kafka consumer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public class KafkaConsumerService : IKafkaConsumerService
{

    private readonly string _topicName;
    private readonly string _processedTopicName;
    private readonly ConsumerConfig _consumerConfig;
    private readonly object _queueLock = new();


    private record KafkaMessage(string? Key, int? Partition, CartItem Message);

    public KafkaConsumerService(string brokerList, string groupId, string topic, string schemaRegistryUrl,
        string processedTopic)
    {
        var schemaRegistryConfig = new SchemaRegistryConfig { Url = schemaRegistryUrl };

        _consumerConfig = new ConsumerConfig
        {
            BootstrapServers = brokerList,
            GroupId = groupId,
            EnableAutoCommit = false,
            EnableAutoOffsetStore = false,
            SessionTimeoutMs = 10000, // 10 seconds
            // Read messages from start if no commit exists.
            AutoOffsetReset = AutoOffsetReset.Earliest,
            MaxPollIntervalMs = 500000
        };
        var schemaRegistryClient = new CachedSchemaRegistryClient(schemaRegistryConfig);
        var cachedSchemaRegistryClient = new CachedSchemaRegistryClient(schemaRegistryConfig);

        _topicName = topic;
        _processedTopicName = processedTopic;

    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the next installment of this blog series, we will further enhance these services.&lt;/p&gt;

&lt;p&gt;In this first part, we covered the basics of Kafka, setting up Kafka locally using Docker, and configuring a .NET application to work with Kafka.&lt;/p&gt;

&lt;p&gt;In the upcoming parts, we will implement a Product Cart API that integrates with Kafka to process messages using Kafka Producers and Consumers.&lt;/p&gt;

&lt;p&gt;Happy coding! 😀&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Using Microsoft Graph API with .NET: A Comprehensive Guide</title>
      <dc:creator>Showmen Dasgupta</dc:creator>
      <pubDate>Tue, 06 Aug 2024 09:59:19 +0000</pubDate>
      <link>https://dev.to/showmen_dasgupta_08171571/using-microsoft-graph-api-with-net-a-comprehensive-guide-4jk1</link>
      <guid>https://dev.to/showmen_dasgupta_08171571/using-microsoft-graph-api-with-net-a-comprehensive-guide-4jk1</guid>
      <description>&lt;p&gt;In this blog post, we will explore how to use the Microsoft Graph API in a .NET application. We'll cover essential functionalities, including obtaining access tokens, interacting with user data, and handling group information. By the end of this guide, you'll have a solid understanding of how to leverage the Microsoft Graph API to enhance your applications.&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction to Microsoft Graph API
&lt;/h2&gt;

&lt;p&gt;The Microsoft Graph API provides a unified programmability model that you can use to access the tremendous amount of data in Microsoft 365, Windows 10, and Enterprise Mobility + Security. With the Graph API, you can integrate your app with various Microsoft services to enhance its capabilities.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting Up Your Project
&lt;/h2&gt;

&lt;p&gt;First, ensure you have the Microsoft.Graph and Azure.Identity NuGet packages installed in your project. You can do this via the NuGet Package Manager or by running the following commands in your terminal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;dotnet add package Microsoft.Graph
dotnet add package Azure.Identity
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Implementing the IGraphService Interface
&lt;/h2&gt;

&lt;p&gt;To keep our code clean and maintainable, we define an interface IGraphService that outlines the various methods we'll implement.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using Microsoft.Graph;
using Microsoft.Graph.Models;

namespace GraphApiBasics.Interfaces
{
    public interface IGraphService
    {
        Task&amp;lt;string&amp;gt; GetAccessTokenConfidentialClientAsync(string clientId, string tenantId, string clientSecret, string authority);
        Task&amp;lt;string&amp;gt; GetAccessTokenWithClientCredentialAsync(string clientId, string tenantId, string clientSecret, CancellationToken cancellationToken = default);
        Task&amp;lt;string&amp;gt; GetAccessTokenByUserNamePassword(string clientId, ICollection&amp;lt;string&amp;gt; scopes, string authority, string userName, string password);
        Task&amp;lt;GraphServiceClient&amp;gt; GetGraphServiceClient(string clientId, string tenantId, string clientSecret);
        Task&amp;lt;User?&amp;gt; GetUserIfExists(GraphServiceClient graphClient, string userEmail);
        Task&amp;lt;User?&amp;gt; CreateUserAsync(GraphServiceClient graphClient, string? displayName, string userPrincipalName, string password);
        Task&amp;lt;List&amp;lt;User&amp;gt;&amp;gt;? GetUserListAsync(GraphServiceClient graphClient);
        Task&amp;lt;PageIterator&amp;lt;User, UserCollectionResponse&amp;gt;&amp;gt;? GetPageIterator(GraphServiceClient graphClient);
        Task&amp;lt;List&amp;lt;User&amp;gt;&amp;gt;? GetUsersWithBatchRequest(GraphServiceClient graphClient);
        Task&amp;lt;User&amp;gt; GetCurrentlyLoggedInUserInfo(GraphServiceClient graphClient);
        Task&amp;lt;int?&amp;gt; GetUsersCount(GraphServiceClient graphClient);
        Task&amp;lt;UserCollectionResponse&amp;gt; GetUsersInGroup(GraphServiceClient graphClient, string groupId);
        Task&amp;lt;ApplicationCollectionResponse&amp;gt; GetApplicationsInGroup(GraphServiceClient graphClient, string groupId);
    }
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Implementing the GraphService Class
&lt;/h2&gt;

&lt;p&gt;The &lt;strong&gt;GraphService&lt;/strong&gt; class implements the &lt;strong&gt;IGraphService&lt;/strong&gt; interface. This class contains methods to interact with the &lt;strong&gt;Microsoft Graph API&lt;/strong&gt;, such as obtaining access tokens, fetching user data, and handling group information.&lt;/p&gt;

&lt;h2&gt;
  
  
  Getting Access Tokens
&lt;/h2&gt;

&lt;p&gt;Here we explore three ways of obtaining access tokens: &lt;strong&gt;confidential client, client credentials, and username/password&lt;/strong&gt;. There are other authentication providers you can use depending on your application's needs. More info can be found here: &lt;a href="https://learn.microsoft.com/en-us/graph/sdks/choose-authentication-providers?tabs=csharp" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/graph/sdks/choose-authentication-providers?tabs=csharp&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Confidential Client
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  public async Task&amp;lt;string&amp;gt; GetAccessTokenConfidentialClientAsync(string clientId, string tenantId,
        string clientSecret, string authority)
    {
        // Define the scopes you need
        var scopes = new[]
        {
            "https://graph.microsoft.com/.default"
        };

        try
        {
            var confidentialClient = ConfidentialClientApplicationBuilder.Create(clientId)
                .WithClientSecret(clientSecret)
                .WithAuthority(authority)
                .WithTenantId(tenantId)
                .WithRedirectUri("http://localhost:7181/auth/login-callback-ms")
                .Build();

            var token = await confidentialClient.AcquireTokenForClient(scopes)
                .WithTenantIdFromAuthority(new Uri(authority))
                .ExecuteAsync();

            var accessToken = token.AccessToken;

            return accessToken;
        }
        catch (MsalUiRequiredException ex)
        {
            _logger.LogCritical($"Error acquiring token: {ex.Message}");
            throw;
        }
    }

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Client Credentials
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public async Task&amp;lt;string&amp;gt; GetAccessTokenWithClientCredentialAsync(string clientId, string tenantId,
        string clientSecret,
        CancellationToken cancellationToken = default)
    {
        // Define the scopes you need
        var scopes = new[]
        {
            "https://graph.microsoft.com/.default"
        };

        try
        {
            var options = new ClientSecretCredentialOptions
            {
                AuthorityHost = AzureAuthorityHosts.AzurePublicCloud
            };

            var credential = new ClientSecretCredential(tenantId, clientId, clientSecret, options);

            var tokenRequestContext = new TokenRequestContext(scopes);
            var token = await credential.GetTokenAsync(tokenRequestContext, cancellationToken);
            var accessToken = token.Token;

            return accessToken;
        }
        catch (MsalUiRequiredException ex)
        {
            _logger.LogCritical($"Error acquiring token: {ex.Message}");
            throw;
        }
    }

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Username and Password
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public async Task&amp;lt;string&amp;gt; GetAccessTokenByUserNamePassword(string clientId, ICollection&amp;lt;string&amp;gt; scopes,
        string authority, string userName,
        string password)
    {
        try
        {
            var app = PublicClientApplicationBuilder.Create(clientId)
                .WithAuthority(authority)
                .WithRedirectUri("http://localhost:7181/auth/login-callback-ms")
                .Build();

            var result = await app.AcquireTokenByUsernamePassword(scopes, userName, password)
                .ExecuteAsync();

            return result.AccessToken;
        }
        catch (Exception ex)
        {
            throw new BadHttpRequestException(ex.Message);
        }
    }

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  GraphServiceClient Instance
&lt;/h3&gt;

&lt;p&gt;We need a &lt;strong&gt;'GraphServiceClient'&lt;/strong&gt; instance to interact with the Microsoft Graph API.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public Task&amp;lt;GraphServiceClient&amp;gt; GetGraphServiceClient(string clientId, string tenantId, string clientSecret)
{
    var credential = new ClientSecretCredential(tenantId, clientId, clientSecret);
    var graphClient = new GraphServiceClient(credential);
    return Task.FromResult(graphClient);
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  User Operations
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Check if a User Exists:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public async Task&amp;lt;User?&amp;gt; GetUserIfExists(GraphServiceClient graphClient, string userEmail)
{
    var userCollection = await graphClient.Users
        .GetAsync(requestConfiguration =&amp;gt; requestConfiguration.QueryParameters.Filter = $"userPrincipalName eq '{userEmail}'");
    return userCollection?.Value?.FirstOrDefault();
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can look up an existing user by &lt;strong&gt;'userPrincipalName'&lt;/strong&gt;, which in most tenants matches the user's email address.&lt;/p&gt;
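One caveat when building that filter string: OData string literals are delimited by single quotes, so an address containing an apostrophe would break the expression. A minimal sketch of an escaping helper (the class and method names here are my own, not part of the Graph SDK):

```csharp
// Hypothetical helper: OData escapes a single quote inside a string
// literal by doubling it, so an apostrophe in an email stays valid.
public static class ODataFilter
{
    public static string EscapeLiteral(string value) =>
        value.Replace("'", "''");
}

// Usage when building the $filter expression:
// var filter = $"userPrincipalName eq '{ODataFilter.EscapeLiteral(userEmail)}'";
```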

&lt;h3&gt;
  
  
  Create a New User
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public async Task&amp;lt;User?&amp;gt; CreateUserAsync(GraphServiceClient graphClient, string? displayName, string userPrincipalName, string password)
{
    var newUser = new User
    {
        AccountEnabled = true,
        DisplayName = displayName,
        MailNickname = userPrincipalName.Split('@')[0],
        Mail = userPrincipalName,
        UserPrincipalName = userPrincipalName,
        PasswordProfile = new PasswordProfile
        {
            ForceChangePasswordNextSignIn = true,
            Password = password
        }
    };
    return await graphClient.Users.PostAsync(newUser);
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Since the user is created with a system-generated password, Microsoft will require them to set a new password the next time they sign in. The &lt;code&gt;ForceChangePasswordNextSignIn = true&lt;/code&gt; flag ensures this.&lt;/p&gt;
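If you generate that initial password in your own code, it still has to satisfy the tenant's password policy (length and character classes). A self-contained sketch; the character sets and default length are arbitrary choices, not anything the Graph SDK requires:

```csharp
using System;
using System.Linq;
using System.Security.Cryptography;

public static class PasswordGenerator
{
    // Ambiguous characters (I, l, O, 0, 1) are left out on purpose.
    private const string Upper = "ABCDEFGHJKMNPQRSTUVWXYZ";
    private const string Lower = "abcdefghjkmnpqrstuvwxyz";
    private const string Digits = "23456789";
    private const string Symbols = "!@#$%";

    public static string Generate(int length = 16)
    {
        var all = Upper + Lower + Digits + Symbols;

        // Guarantee one character from each class, fill the rest randomly,
        // then shuffle so the guaranteed characters are not always first.
        var chars = new[]
            {
                Pick(Upper), Pick(Lower), Pick(Digits), Pick(Symbols)
            }
            .Concat(Enumerable.Range(0, length - 4).Select(_ => Pick(all)))
            .OrderBy(_ => RandomNumberGenerator.GetInt32(int.MaxValue))
            .ToArray();

        return new string(chars);
    }

    private static char Pick(string set) => set[RandomNumberGenerator.GetInt32(set.Length)];
}
```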

&lt;h3&gt;
  
  
  Get a List of All Users
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public async Task&amp;lt;List&amp;lt;User&amp;gt;&amp;gt;? GetUserListAsync(GraphServiceClient graphClient)
{
    var usersResponse = await graphClient.Users
        .GetAsync(requestConfiguration =&amp;gt; requestConfiguration.QueryParameters.Select = ["id", "createdDateTime", "userPrincipalName"]);
    return usersResponse?.Value;
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can add more fields to the &lt;code&gt;Select&lt;/code&gt; array to fetch additional user properties. More detail on query parameters is available in the documentation: &lt;a href="https://learn.microsoft.com/en-us/graph/query-parameters?tabs=http" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/graph/query-parameters?tabs=http&lt;/a&gt;&lt;/p&gt;
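As a sketch of how several query parameters combine in a single call (the filter expression and page size here are arbitrary examples; advanced directory queries like this one need the eventual consistency header together with `Count = true`):

```csharp
// Sketch: combine $filter, $select, $top and $orderby in one request.
// Requires a live GraphServiceClient; not runnable standalone.
var users = await graphClient.Users.GetAsync(requestConfiguration =>
{
    requestConfiguration.QueryParameters.Filter = "startswith(userPrincipalName, 'a')";
    requestConfiguration.QueryParameters.Select = ["id", "displayName", "userPrincipalName"];
    requestConfiguration.QueryParameters.Top = 10;
    requestConfiguration.QueryParameters.Orderby = ["userPrincipalName"];
    requestConfiguration.QueryParameters.Count = true;
    requestConfiguration.Headers.Add("ConsistencyLevel", "eventual");
});
```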

&lt;h3&gt;
  
  
  Group Operations
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Get Users in a Group:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public async Task&amp;lt;UserCollectionResponse&amp;gt; GetUsersInGroup(GraphServiceClient graphClient, string groupId)
{
    var usersInGroup = await graphClient.Groups[groupId].Members.GraphUser.GetAsync();
    return usersInGroup;
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Get Applications in a Group:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public async Task&amp;lt;ApplicationCollectionResponse&amp;gt; GetApplicationsInGroup(GraphServiceClient graphClient,
        string groupId)
    {
        try
        {
            var applicationsInGroup = await graphClient.Groups[groupId].Members.GraphApplication.GetAsync();
            return applicationsInGroup ?? throw new InvalidOperationException();
        }
        catch (Exception ex)
        {
            throw new BadHttpRequestException(ex.Message);
        }
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Batch Requests and User Count
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Get Users Count:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public async Task&amp;lt;int?&amp;gt; GetUsersCount(GraphServiceClient graphClient)
{
    var count = await graphClient.Users.Count.GetAsync(requestConfiguration =&amp;gt;
        requestConfiguration.Headers.Add("ConsistencyLevel", "eventual"));
    return count;
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Get Users with Batch Request:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Step 1: Create the request information for getting users
    var requestInformation = graphClient
        .Users
        .ToGetRequestInformation();

// Step 2: Create a batch request content collection
    var batchRequestContent = new BatchRequestContentCollection(graphClient);

// Step 3: Add the user request to the batch and get the step ID
    var requestStepId = await batchRequestContent.AddBatchRequestStepAsync(requestInformation);

// Step 4: Send the batch request and get the response
    var batchResponseContent = await graphClient.Batch.PostAsync(batchRequestContent);

// Step 5: Extract the user response from the batch response using the step ID
    var usersResponse = await batchResponseContent.GetResponseByIdAsync&amp;lt;UserCollectionResponse&amp;gt;(requestStepId);
    var userList = usersResponse.Value;

 // Step 6: Return the list of users or throw an exception if null
    return userList ?? throw new InvalidOperationException();

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
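The controller below also calls a `GetPageIterator` method whose implementation is not shown above. A minimal sketch using the SDK's `PageIterator` helper, which follows `@odata.nextLink` pages automatically; the page size of 25 and the callback that simply collects users are my own choices:

```csharp
// Possible shape for the GetPageIterator method used by the controller.
// Requires a live GraphServiceClient; not runnable standalone.
public async Task<PageIterator<User, UserCollectionResponse>> GetPageIterator(GraphServiceClient graphClient)
{
    var allUsers = new List<User>();

    // Fetch the first page, then build an iterator over the remaining pages.
    var firstPage = await graphClient.Users.GetAsync(requestConfiguration =>
        requestConfiguration.QueryParameters.Top = 25);

    var pageIterator = PageIterator<User, UserCollectionResponse>.CreatePageIterator(
        graphClient,
        firstPage,
        user =>
        {
            allUsers.Add(user);
            return true; // returning true keeps the iteration going
        });

    return pageIterator;
}
```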



&lt;h2&gt;
  
  
  Creating the API Controller
&lt;/h2&gt;

&lt;p&gt;The &lt;strong&gt;'GraphApiController'&lt;/strong&gt; uses the &lt;strong&gt;'IGraphService'&lt;/strong&gt; to expose API endpoints for interacting with Microsoft Graph.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using GraphApiBasics.Interfaces;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Options;
using Microsoft.Graph;

namespace GraphApiBasics.Controllers
{
    [Route("api/v1/graph")]
    public class GraphApiController : Controller
    {
        private readonly GraphSecretOptions _graphSecretOptions;
        private readonly IGraphService _graphService;

        public GraphApiController(IOptions&amp;lt;GraphSecretOptions&amp;gt; graphSecretOptions, IGraphService graphService)
        {
            _graphSecretOptions = graphSecretOptions.Value;
            _graphService = graphService;
        }

        private async Task&amp;lt;GraphServiceClient&amp;gt; GetGraphClientAsync()
        {
            return await _graphService.GetGraphServiceClient(_graphSecretOptions.ClientId, _graphSecretOptions.TenantId, _graphSecretOptions.ClientSecret);
        }

        [HttpGet("get-access-token-confidential-client-credentials")]
        public async Task&amp;lt;IActionResult&amp;gt; GetAccessTokenWithConfidentialClientCredential()
        {
            var accessToken = await _graphService.GetAccessTokenConfidentialClientAsync(
                _graphSecretOptions.ClientId,
                _graphSecretOptions.TenantId,
                _graphSecretOptions.ClientSecret,
                _graphSecretOptions.Authority
            );
            return Ok(new { accessToken });
        }

        [HttpGet("get-access-token-client-credentials")]
        public async Task&amp;lt;IActionResult&amp;gt; GetAccessTokenWithClientCredential()
        {
            var accessToken = await _graphService.GetAccessTokenWithClientCredentialAsync(
                _graphSecretOptions.ClientId,
                _graphSecretOptions.TenantId,
                _graphSecretOptions.ClientSecret
            );
            return Ok(new { accessToken });
        }

        [HttpPost("create-user-if-not-exists")]
        public async Task&amp;lt;IActionResult&amp;gt; CreateUserIfNotExists(string userEmail, string password, string displayName)
        {
            var graphClient = await GetGraphClientAsync();
            var validUser = await _graphService.GetUserIfExists(graphClient, userEmail);
            if (validUser != null)
            {
                return Conflict("User already exists");
            }

            var user = await _graphService.CreateUserAsync(graphClient, displayName, userEmail, password);
            return Ok(new { user });
        }

        [HttpGet("get-list-of-users")]
        public async Task&amp;lt;IActionResult&amp;gt; GetUsersList()
        {
            var graphClient = await GetGraphClientAsync();
            var users = await _graphService.GetUserListAsync(graphClient);
            return Ok(new { users });
        }

        [HttpGet("get-page-iterator")]
        public async Task&amp;lt;IActionResult&amp;gt; GetPageIterator()
        {
            var graphClient = await GetGraphClientAsync();
            var pageIterator = await _graphService.GetPageIterator(graphClient);
            await pageIterator.IterateAsync();

            return Ok(new { pageIterator });
        }

        [HttpGet("get-users-with-batch-request")]
        public async Task&amp;lt;IActionResult&amp;gt; GetUsersWithBatchRequest()
        {
            var graphClient = await GetGraphClientAsync();
            var users = await _graphService.GetUsersWithBatchRequest(graphClient);
            return Ok(new { users });
        }

        [HttpGet("get-currently-logged-in-user-info")]
        public async Task&amp;lt;IActionResult&amp;gt; GetCurrentlyLoggedInUserInfo()
        {
            var graphClient = await GetGraphClientAsync();
            var loggedInUserInfo = await _graphService.GetCurrentlyLoggedInUserInfo(graphClient);
            return Ok(new { loggedInUserInfo });
        }

        [HttpGet("get-users-count")]
        public async Task&amp;lt;IActionResult&amp;gt; GetUsersCount()
        {
            var graphClient = await GetGraphClientAsync();
            var usersCount = await _graphService.GetUsersCount(graphClient);
            return Ok(new { usersCount });
        }

        [HttpGet("get-users-in-group")]
        public async Task&amp;lt;IActionResult&amp;gt; GetUsersInGroup(string groupId)
        {
            var graphClient = await GetGraphClientAsync();
            var usersInGroup = await _graphService.GetUsersInGroup(graphClient, groupId);
            return Ok(new { usersInGroup });
        }

        [HttpGet("get-applications-in-group")]
        public async Task&amp;lt;IActionResult&amp;gt; GetApplicationsInGroup(string groupId)
        {
            var graphClient = await GetGraphClientAsync();
            var applicationsInGroup = await _graphService.GetApplicationsInGroup(graphClient, groupId);
            return Ok(new { applicationsInGroup });
        }

        [HttpPost("get-access-token-username-password")]
        public async Task&amp;lt;IActionResult&amp;gt; GetAccessTokenWithUserNamePassword(string userName, string password)
        {
            var accessToken = await _graphService.GetAccessTokenByUserNamePassword(
                _graphSecretOptions.ClientId,
                new[] { "User.Read", "User.Read.All" },
                _graphSecretOptions.Authority,
                userName,
                password
            );
            return Ok(new { accessToken });
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
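For the controller to resolve these dependencies, the options and service need to be registered at startup. A hypothetical Program.cs wiring; the "GraphSecret" configuration section name and the `GraphService` implementation class are assumptions, since neither appears in this post:

```csharp
// Hypothetical Program.cs wiring; section and class names are assumptions.
builder.Services.Configure<GraphSecretOptions>(
    builder.Configuration.GetSection("GraphSecret"));
builder.Services.AddScoped<IGraphService, GraphService>();
builder.Services.AddControllers();
```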



&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In this comprehensive guide, we have explored how to use the Microsoft Graph API in a .NET application. We covered obtaining access tokens, interacting with user data, and handling group information. By implementing the IGraphService interface and GraphService class, you can efficiently manage user and group data in your Azure AD tenant. The GraphApiController provides convenient API endpoints for interacting with the Graph API, ensuring a clean, maintainable, and scalable approach to using the Microsoft Graph API in your projects.&lt;/p&gt;

&lt;p&gt;For the latest Microsoft Graph .NET SDK v5 changelog and upgrade guide, see: &lt;a href="https://github.com/microsoftgraph/msgraph-sdk-dotnet/blob/feature/5.0/docs/upgrade-to-v5.md" rel="noopener noreferrer"&gt;https://github.com/microsoftgraph/msgraph-sdk-dotnet/blob/feature/5.0/docs/upgrade-to-v5.md&lt;/a&gt;&lt;/p&gt;

</description>
      <category>microsoft</category>
      <category>microsoftgraph</category>
      <category>dotnet</category>
    </item>
  </channel>
</rss>
