<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Abel</title>
    <description>The latest articles on DEV Community by Abel (@tarantarantino).</description>
    <link>https://dev.to/tarantarantino</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1483523%2F307a11a7-e69c-43fd-99fd-19a5d793401f.jpg</url>
      <title>DEV Community: Abel</title>
      <link>https://dev.to/tarantarantino</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/tarantarantino"/>
    <language>en</language>
    <item>
      <title>Building FeedbackForge a Multi-Agent AI System on Azure with MAF, Foundry &amp; AI Gateway (Part 2)</title>
      <dc:creator>Abel</dc:creator>
      <pubDate>Mon, 30 Mar 2026 15:20:07 +0000</pubDate>
      <link>https://dev.to/tarantarantino/building-feedbackforge-a-multi-agent-ai-system-on-azure-with-maf-foundry-ai-gateway-part-2-3n5k</link>
      <guid>https://dev.to/tarantarantino/building-feedbackforge-a-multi-agent-ai-system-on-azure-with-maf-foundry-ai-gateway-part-2-3n5k</guid>
      <description>&lt;p&gt;&lt;em&gt;A practical guide to architecture, implementation, and lessons learned from a multi-agent AI system&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fehcmgfvz7y2l4q4cmx4p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fehcmgfvz7y2l4q4cmx4p.png" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Posts in this Series
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://medium.com/@tarantarantino/building-feedbackforge-a-multi-agent-ai-system-on-azure-with-maf-foundry-ai-gateway-part-1-f05e5beb3a2f" rel="noopener noreferrer"&gt;Building FeedbackForge a Multi-Agent AI System on Azure with MAF, Foundry &amp;amp; AI Gateway (Part 1)&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://medium.com/@tarantarantino/building-feedbackforge-a-multi-agent-ai-system-on-azure-with-maf-foundry-ai-gateway-part-2-8050bbad4d60" rel="noopener noreferrer"&gt;Building FeedbackForge a Multi-Agent AI System on Azure with MAF, Foundry &amp;amp; AI Gateway (Part 2) &lt;em&gt;(This Post)&lt;/em&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Microsoft Foundry&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The whole project uses &lt;strong&gt;Microsoft Foundry&lt;/strong&gt;, a unified PaaS platform for designing, customizing, and deploying AI applications and agents. It offers access to over 11,000 models, including OpenAI, Meta Llama, and Mistral, alongside tools such as Foundry Agent Service and Foundry IQ to build, manage, and secure AI apps from prototype to production.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Foundry Control Plane&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Microsoft Foundry Control Plane&lt;/strong&gt; is a unified management interface that provides visibility, governance, and control for AI agents, models, and tools across your Foundry enterprise, centralizing management of your AI agent fleet from build to production. It is most useful when your organization:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Manages multiple AI agents across different projects or teams.&lt;/li&gt;
&lt;li&gt;Requires centralized compliance visibility and policy enforcement across an AI fleet.&lt;/li&gt;
&lt;li&gt;Integrates Microsoft Defender and Microsoft Purview for AI governance and threat protection.&lt;/li&gt;
&lt;li&gt;Operates agents from multiple platforms, including Foundry, other Microsoft platforms, and non-Microsoft sources.&lt;/li&gt;
&lt;li&gt;Needs to track cost, token usage, and resource consumption across an entire AI environment.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Without Foundry Control Plane, you manage agents, models, and compliance through individual Azure portal blades and separate per-project views. Foundry Control Plane adds cross-project visibility, unified compliance enforcement, and integrated security signals in a single interface.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F06w23ej2marprfs8413m.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F06w23ej2marprfs8413m.jpeg" width="800" height="367"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Security&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;For the agents, we will use a &lt;strong&gt;Managed Identity&lt;/strong&gt;, which acts as each agent's identity when authenticating to Azure resources.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Observability&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;AI observability refers to the ability to monitor, understand, and troubleshoot AI systems throughout their lifecycle. Teams can trace executions, run evaluations, integrate automated quality gates into CI/CD pipelines, and collect signals such as evaluation metrics, logs, traces, and model outputs to gain visibility into performance, quality, safety, and operational health.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Monitoring&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Production monitoring ensures your deployed AI applications maintain quality and performance in real-world conditions. Integrated with Azure Monitor Application Insights, Microsoft Foundry delivers real-time dashboards tracking operational metrics, token consumption, latency, error rates, and quality scores. Teams can set up alerts when outputs fail quality thresholds or produce harmful content, enabling rapid issue resolution.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;Agent Monitoring Dashboard&lt;/strong&gt; in Microsoft Foundry tracks operational metrics and evaluation results for your agents. This dashboard helps you understand token usage, latency, success rates, and evaluation outcomes for production traffic.&lt;/p&gt;
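&lt;p&gt;As a mental model (not the dashboard's real schema), the kind of aggregation involved can be sketched in Python over hypothetical per-call records:&lt;/p&gt;

```python
# Minimal sketch of the kind of aggregation an agent monitoring dashboard
# performs over per-call records. The record shape is hypothetical.
def summarize(calls):
    total = len(calls)
    ok = sum(1 for c in calls if c["success"])
    return {
        "calls": total,
        "success_rate": ok / total,
        "avg_latency_ms": sum(c["latency_ms"] for c in calls) / total,
        "total_tokens": sum(c["tokens"] for c in calls),
    }

calls = [
    {"success": True, "latency_ms": 420, "tokens": 950},
    {"success": True, "latency_ms": 380, "tokens": 720},
    {"success": False, "latency_ms": 1500, "tokens": 0},
]
print(summarize(calls))
```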

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9bqzc6jjnu6f32zqohis.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9bqzc6jjnu6f32zqohis.jpeg" width="800" height="360"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Tracing&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Distributed tracing captures the execution flow of AI applications, providing visibility into LLM calls, tool invocations, agent decisions, and inter-service dependencies. Built on OpenTelemetry standards and integrated with Application Insights, tracing enables debugging complex agent behaviors, identifying performance bottlenecks, and understanding multi-step reasoning chains. Microsoft Foundry supports tracing for popular frameworks including LangChain, Semantic Kernel, and the OpenAI Agents SDK.&lt;/p&gt;
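&lt;p&gt;To illustrate the span model behind this kind of tracing, here is a toy Python sketch (stdlib only, not the OpenTelemetry SDK) that records each step of an agent run as a named span with a duration:&lt;/p&gt;

```python
import time
from contextlib import contextmanager

# Toy illustration of the span model behind OpenTelemetry-style tracing:
# each step of an agent run becomes a named span with a measured duration.
SPANS = []

@contextmanager
def span(name):
    start = time.perf_counter()
    try:
        yield
    finally:
        SPANS.append((name, time.perf_counter() - start))

with span("agent_run"):
    with span("llm_call"):
        time.sleep(0.01)
    with span("tool_call"):
        time.sleep(0.01)

print([name for name, _ in SPANS])  # inner spans close before the outer one
```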

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AidKTEedMa_1s6yOvB-azmA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AidKTEedMa_1s6yOvB-azmA.jpeg" width="800" height="360"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Akq80pS9Qwtzfwzd33zRxzA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Akq80pS9Qwtzfwzd33zRxzA.jpeg" width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Guardrails and Controls&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Microsoft Foundry provides safety and security guardrails that you can apply to core models and agents. Each guardrail consists of a set of controls: a control defines a risk to be detected, the intervention points to scan for that risk, and the response action to take in the model or agent when the risk is detected.&lt;/p&gt;

&lt;p&gt;The created guardrail contains controls for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Jailbreak&lt;/li&gt;
&lt;li&gt;Indirect Prompt Injections&lt;/li&gt;
&lt;li&gt;Sensitive Data Leakage (PII)&lt;/li&gt;
&lt;li&gt;Content Safety&lt;/li&gt;
&lt;li&gt;Protected Materials&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These controls are assigned to all the agents used in the system.&lt;/p&gt;
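&lt;p&gt;Conceptually, each control pairs a risk with intervention points and a response action. A minimal, purely illustrative Python sketch follows; the regex-based PII check is a stand-in, not how Foundry guardrails actually detect risks:&lt;/p&gt;

```python
import re
from dataclasses import dataclass

# Illustrative model of a guardrail control: a risk to detect, where to
# scan for it, and the action to take when it is found.
@dataclass
class Control:
    risk: str
    intervention_points: tuple  # e.g. ("input", "output")
    action: str                 # e.g. "block" or "annotate"
    pattern: str

    def check(self, text):
        return bool(re.search(self.pattern, text))

# US-SSN-like pattern, purely for illustration
pii = Control("Sensitive Data Leakage (PII)", ("input", "output"), "block",
              r"\b\d{3}-\d{2}-\d{4}\b")
print(pii.check("my ssn is 123-45-6789"))  # True
```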

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2ArzjYk9CsP9i2tEXZlFGu5w.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2ArzjYk9CsP9i2tEXZlFGu5w.jpeg" width="800" height="357"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Evaluators&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Evaluators measure the quality, safety, and reliability of AI responses throughout development. Microsoft Foundry provides built-in evaluators for general-purpose quality metrics (coherence, fluency), RAG-specific metrics (groundedness, relevance), safety and security (hate/unfairness, violence, protected materials), and agent-specific metrics (tool call accuracy, task completion). Teams can also build custom evaluators tailored to their domain-specific requirements.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AUtqttK7z6FHQDiYtbzE66Q.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AUtqttK7z6FHQDiYtbzE66Q.jpeg" width="800" height="358"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F597%2F1%2A4uQMMFxAti1svy1amVDh4g.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F597%2F1%2A4uQMMFxAti1svy1amVDh4g.jpeg" width="597" height="768"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;An evaluator was created with the following JSON structure:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"query"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;"What's the weekly summary?"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"response"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"context"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;"User is requesting a weekly feedback summary. Expected to use get_weekly_summary tool and return sentiment, top issues, feedback count."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"ground_truth"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;"The response should include sentiment breakdown, top issues, feedback count, and distinguish between positive and negative feedback. Required fields: total_feedback, sentiment_breakdown."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"conversation_id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;"conv_001"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"response_id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s2"&gt;"test_001"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"previous_response_id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"latency"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"response_length"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The evaluator criteria were left at their defaults for simplicity, although they could be adjusted, since many of them do not apply to this scenario.&lt;/p&gt;
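&lt;p&gt;As an illustration of a custom evaluator over records shaped like the JSON above, here is a hypothetical Python scorer that checks whether a response mentions the required fields named in the ground truth:&lt;/p&gt;

```python
# Hypothetical custom evaluator: scores a response by how many of the
# required fields from the ground truth it actually mentions.
REQUIRED = ("total_feedback", "sentiment_breakdown")

def field_coverage(record):
    response = record.get("response", "").lower()
    hits = sum(1 for f in REQUIRED if f in response)
    return hits / len(REQUIRED)

record = {
    "query": "What's the weekly summary?",
    "response": "total_feedback: 42, sentiment_breakdown: 60% positive",
}
print(field_coverage(record))  # 1.0
```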

&lt;p&gt;&lt;strong&gt;Purview, Defender, and Entra&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Scaling AI agents securely requires combining development capabilities with strong governance, security, and compliance controls.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AI systems introduce new risks such as data leakage, prompt injection, and agent misuse.&lt;/li&gt;
&lt;li&gt;Security must be applied across all layers: agents, data, models, and interactions.&lt;/li&gt;
&lt;li&gt;Microsoft Foundry acts as the control plane to manage, observe, and secure AI workloads.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;✔️ &lt;strong&gt;Purview&lt;/strong&gt; focuses on data governance, compliance, and protection (e.g., sensitivity labels, DLP, auditing).&lt;/p&gt;

&lt;p&gt;✔️ &lt;strong&gt;Defender&lt;/strong&gt; provides security posture management and runtime threat protection for AI workloads.&lt;/p&gt;

&lt;p&gt;✔️ &lt;strong&gt;Entra&lt;/strong&gt; introduces the concept of managing agents as identities with controlled access.&lt;/p&gt;

&lt;p&gt;AI security is not a single tool but a combination of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Governance (Purview)&lt;/li&gt;
&lt;li&gt;Threat protection (Defender)&lt;/li&gt;
&lt;li&gt;Identity and access control (Entra)&lt;/li&gt;
&lt;li&gt;Control plane and observability (Foundry)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This enables organizations to move from experimental AI solutions to &lt;strong&gt;secure, governed, and production-ready agent systems&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Enabling Purview or Defender for this project was considered overkill, but in a real production scenario they should be mandatory.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AUOROJTqhDdjcwOMIrf9ajA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AUOROJTqhDdjcwOMIrf9ajA.jpeg" width="800" height="464"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AIWwhdwERjUAeDTJYxWBqgw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AIWwhdwERjUAeDTJYxWBqgw.png" width="800" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AcsRUhLbIxvgEY1f_8Iz73Q.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AcsRUhLbIxvgEY1f_8Iz73Q.jpeg" width="800" height="339"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;AI Gateway&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The AI gateway in Azure API Management provides a set of capabilities to manage AI backends effectively. It enables control over security, reliability, observability, and cost.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AAhd3dkatVeU2dVPHkvPzaA.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AAhd3dkatVeU2dVPHkvPzaA.gif" width="1233" height="549"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The following sections describe the main capabilities, combining traditional API gateway features with AI-specific functionality.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2A_dNNOluhUERxu5bGjqrnZA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2A_dNNOluhUERxu5bGjqrnZA.jpeg" width="800" height="486"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Governance&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Token Rate Limiting and Quotas&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can configure token-based limits on LLM APIs to control usage per consumer based on token consumption.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F609%2F1%2ANr3RXoVfjYeQt_kBNhyfnw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F609%2F1%2ANr3RXoVfjYeQt_kBNhyfnw.png" width="609" height="297"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This allows defining:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Tokens per minute (TPM)&lt;/li&gt;
&lt;li&gt;Token quotas over time (hourly, daily, monthly, etc.)
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;llm-token-limit&lt;/span&gt; &lt;span class="na"&gt;counter-key=&lt;/span&gt;&lt;span class="s"&gt;"@(context.Subscription.Id)"&lt;/span&gt;
                 &lt;span class="na"&gt;tokens-per-minute=&lt;/span&gt;&lt;span class="s"&gt;"500"&lt;/span&gt;
                 &lt;span class="na"&gt;estimate-prompt-tokens=&lt;/span&gt;&lt;span class="s"&gt;"false"&lt;/span&gt;
                 &lt;span class="na"&gt;remaining-tokens-variable-name=&lt;/span&gt;&lt;span class="s"&gt;"remainingTokens"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/llm-token-limit&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
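&lt;p&gt;For intuition, the tokens-per-minute behavior of the policy above can be approximated in application code with a per-consumer sliding window (values are illustrative):&lt;/p&gt;

```python
import time
from collections import deque

# Sliding-window approximation of a tokens-per-minute limit, as a mental
# model for the gateway-enforced policy. Values are illustrative.
class TokenLimiter:
    def __init__(self, tokens_per_minute=500, window=60.0):
        self.limit = tokens_per_minute
        self.window = window
        self.events = deque()  # (timestamp, tokens)

    def allow(self, tokens, now=None):
        now = time.monotonic() if now is None else now
        # Drop events that have aged out of the window
        while self.events and now - self.events[0][0] >= self.window:
            self.events.popleft()
        used = sum(t for _, t in self.events)
        if used + tokens > self.limit:
            return False
        self.events.append((now, tokens))
        return True

limiter = TokenLimiter(tokens_per_minute=500)
print(limiter.allow(400, now=0.0))   # True
print(limiter.allow(200, now=1.0))   # False, would exceed 500 in the window
print(limiter.allow(200, now=61.0))  # True, the first event has expired
```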



&lt;h4&gt;
  
  
  &lt;strong&gt;Security and Safety&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Security&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;All authentication is performed through the API Management subscription key, without relying on JWT or any Entra ID OAuth mechanism.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Content Safety&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this case, Content Safety is not configured at the AI Gateway level. Instead, it is managed through Microsoft Foundry.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Observability&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Token Metrics&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Token usage can be emitted using the &lt;code&gt;llm-emit-token-metric&lt;/code&gt; policy, with custom dimensions for filtering in Azure Monitor. The following example emits token metrics with dimensions for client IP address, API ID, and user ID (from a custom header):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F709%2F1%2APkqmkgZetFiCYLHzzSUZBw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F709%2F1%2APkqmkgZetFiCYLHzzSUZBw.png" width="709" height="318"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;llm-emit-token-metric&lt;/span&gt; &lt;span class="na"&gt;namespace=&lt;/span&gt;&lt;span class="s"&gt;"llm-metrics"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
     &lt;span class="nt"&gt;&amp;lt;dimension&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"Client IP"&lt;/span&gt; &lt;span class="na"&gt;value=&lt;/span&gt;&lt;span class="s"&gt;"@(context.Request.IpAddress)"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
     &lt;span class="nt"&gt;&amp;lt;dimension&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"API ID"&lt;/span&gt; &lt;span class="na"&gt;value=&lt;/span&gt;&lt;span class="s"&gt;"@(context.Api.Id)"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
     &lt;span class="nt"&gt;&amp;lt;dimension&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"User ID"&lt;/span&gt; &lt;span class="na"&gt;value=&lt;/span&gt;&lt;span class="s"&gt;"@(context.Request.Headers.GetValueOrDefault("&lt;/span&gt;&lt;span class="err"&gt;x-user-id",&lt;/span&gt; &lt;span class="err"&gt;"N/A"))"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/llm-emit-token-metric&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
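&lt;p&gt;Conceptually, each metric value is recorded against a tuple of dimension values, which is what makes filtering by dimension possible later in Azure Monitor. A toy Python aggregation (names and values are illustrative):&lt;/p&gt;

```python
from collections import defaultdict

# Toy aggregation of token counts keyed by the same dimensions the policy
# emits: client IP, API ID, and user ID. Values are illustrative.
totals = defaultdict(int)

def emit_token_metric(tokens, client_ip, api_id, user_id):
    totals[(client_ip, api_id, user_id)] += tokens

emit_token_metric(120, "10.0.0.5", "chat-api", "alice")
emit_token_metric(80, "10.0.0.5", "chat-api", "alice")
emit_token_metric(200, "10.0.0.9", "chat-api", "bob")
print(totals[("10.0.0.5", "chat-api", "alice")])  # 200
```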



&lt;p&gt;&lt;strong&gt;Prompt and Completion Logging&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Logging can be enabled to track:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Token usage&lt;/li&gt;
&lt;li&gt;Prompts and completions&lt;/li&gt;
&lt;li&gt;API consumption patterns&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This data can be analyzed in Application Insights and visualized through built-in dashboards in API Management.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Scalability and Performance&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Semantic Caching&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Semantic caching is a technique that improves the performance of LLM APIs by caching the results (completions) of previous prompts and reusing them when a new prompt is semantically close, based on the vector proximity of the prompt to prior requests.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F669%2F1%2AvePLkN3dEEra_x-dxoENdw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F669%2F1%2AvePLkN3dEEra_x-dxoENdw.png" width="669" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Benefits:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Reduced calls to backend AI services&lt;/li&gt;
&lt;li&gt;Lower latency&lt;/li&gt;
&lt;li&gt;Cost optimization&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It can be implemented using Azure Managed Redis or any compatible cache.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;policies&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;inbound&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;base&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;llm-semantic-cache-lookup&lt;/span&gt; 
     &lt;span class="na"&gt;score-threshold=&lt;/span&gt;&lt;span class="s"&gt;"0.05"&lt;/span&gt;
     &lt;span class="na"&gt;embeddings-backend-id=&lt;/span&gt;&lt;span class="s"&gt;"azure-openai-backend"&lt;/span&gt;
     &lt;span class="na"&gt;embeddings-backend-auth=&lt;/span&gt;&lt;span class="s"&gt;"system-assigned"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
     &lt;span class="nt"&gt;&amp;lt;vary-by&amp;gt;&lt;/span&gt;@(context.Subscription.Id)&lt;span class="nt"&gt;&amp;lt;/vary-by&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/llm-semantic-cache-lookup&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;rate-limit&lt;/span&gt; &lt;span class="na"&gt;calls=&lt;/span&gt;&lt;span class="s"&gt;"10"&lt;/span&gt; &lt;span class="na"&gt;renewal-period=&lt;/span&gt;&lt;span class="s"&gt;"60"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/inbound&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;outbound&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;llm-semantic-cache-store&lt;/span&gt; &lt;span class="na"&gt;duration=&lt;/span&gt;&lt;span class="s"&gt;"60"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;base&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/outbound&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/policies&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; A rate-limit policy should be applied after the cache lookup to prevent backend overload if the cache is unavailable.&lt;/p&gt;
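&lt;p&gt;The lookup/store flow can be sketched in Python; the toy letter-count embedding and cosine-similarity threshold below stand in for the real embeddings backend and the policy's score threshold:&lt;/p&gt;

```python
import math

# Sketch of semantic-cache lookup: reuse a cached completion when the new
# prompt's embedding is close enough to a previous one. The letter-count
# embedding is a toy stand-in for a real embeddings model.
def embed(text):
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

CACHE = []  # (embedding, completion)

def lookup(prompt, threshold=0.95):
    e = embed(prompt)
    for cached_e, completion in CACHE:
        if similarity(e, cached_e) >= threshold:
            return completion
    return None

def store(prompt, completion):
    CACHE.append((embed(prompt), completion))

store("What is the weekly summary?", "Here is the weekly summary...")
print(lookup("what is the weekly summary"))  # cache hit
print(lookup("delete my account"))           # None: semantically different
```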

&lt;p&gt;&lt;strong&gt;Weighted or Session-Aware Load Balancing&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The backend load balancer supports round-robin, weighted, priority-based, and session-aware load balancing. You can define a load distribution strategy that meets your specific requirements.&lt;/p&gt;
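&lt;p&gt;As a sketch of one of these strategies, here is a minimal weighted round-robin over backend pools in Python (pool names and weights are illustrative):&lt;/p&gt;

```python
import itertools

# Minimal weighted round-robin: each backend appears in the rotation in
# proportion to its weight. Names and weights are illustrative.
def weighted_cycle(backends):
    expanded = [name for name, weight in backends for _ in range(weight)]
    return itertools.cycle(expanded)

lb = weighted_cycle([("eastus", 3), ("westus", 1)])
picks = [next(lb) for _ in range(8)]
print(picks)  # eastus is picked three times as often as westus
```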

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F708%2F1%2AqjKoeLqSTGy3JAh1iXyioQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F708%2F1%2AqjKoeLqSTGy3JAh1iXyioQ.png" width="708" height="422"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Priority Routing to Provisioned Capacity Models&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Enabling priority processing at the request level is optional. Both the Chat Completions API and the Responses API have an optional &lt;code&gt;service_tier&lt;/code&gt; attribute that specifies the processing tier to use when serving a request. In this case, we will not use it.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Velocity&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Import Azure OpenAI as an API&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Azure API Management allows importing Azure OpenAI endpoints as APIs with a single action. This includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Automatic OpenAPI schema generation&lt;/li&gt;
&lt;li&gt;Managed identity authentication&lt;/li&gt;
&lt;li&gt;Simplified onboarding&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F883%2F1%2A5hoZnQk6Fxt_45T35irQKA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F883%2F1%2A5hoZnQk6Fxt_45T35irQKA.jpeg" width="800" height="630"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;External Models and Integrations&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;No models outside Azure are used in this project. MCP servers and A2A agents are fully implemented within the Python backend rather than managed through the gateway.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Full Policy Example&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Below is the full AI Gateway policy used in FeedbackForge. It uses policy fragments to simplify configuration and improve maintainability.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;policies&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;inbound&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;base&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!--
FeedbackForge Policy v2
Purpose: Routes requests to appropriate backend pools based on model selection and RBAC permissions
Flow:
1. Validate Entra ID authentication (optional, controlled by entra-validate named value)
2. Extract and validate model parameter from request payload
3. Configure backend pools and routing rules
4. Determine target backend pool based on model and permissions
5. Set up authentication and route to selected backend
6. Configure collecting usage metrics
Configuration:
- Modify allowedBackendPools for RBAC (comma-separated pool IDs, empty = all allowed)
- Set defaultBackendPool for unmapped models (empty = return error)
- Backend pool definitions are in frag-set-backend-pools fragment --&amp;gt;&lt;/span&gt;

&lt;span class="nt"&gt;&amp;lt;set-backend-service&lt;/span&gt; &lt;span class="na"&gt;id=&lt;/span&gt;&lt;span class="s"&gt;"apim-generated-policy"&lt;/span&gt; &lt;span class="na"&gt;backend-id=&lt;/span&gt;&lt;span class="s"&gt;"feedbackforgev2-ai-endpoint"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt; 
&lt;span class="cp"&gt;&amp;lt;!.. - Step 1: Validate Entra ID authentication (if enabled) --&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!-- - &amp;lt;include-fragment fragment-id="aad-auth" /&amp;gt; --&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!-- - Step 2: Extract and validate model parameter from request --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;include-fragment&lt;/span&gt; &lt;span class="na"&gt;fragment-id=&lt;/span&gt;&lt;span class="s"&gt;"set-llm-requested-model"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!-- - Remove api-key header to prevent it from being passed to backend endpoints --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;set-header&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"api-key"&lt;/span&gt; &lt;span class="na"&gt;exists-action=&lt;/span&gt;&lt;span class="s"&gt;"delete"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!-- - Step 4: Configure RBAC and default routing behavior --&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!-- - RBAC: Set allowed backend pools (comma-separated pool IDs, empty = all pools allowed) --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;set-variable&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"allowedBackendPools"&lt;/span&gt; &lt;span class="na"&gt;value=&lt;/span&gt;&lt;span class="s"&gt;""&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!-- - Set default backend pool (empty = return error for unmapped models) --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;set-variable&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"defaultBackendPool"&lt;/span&gt; &lt;span class="na"&gt;value=&lt;/span&gt;&lt;span class="s"&gt;""&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!-- - Step 4: Load backend pool configurations --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;include-fragment&lt;/span&gt; &lt;span class="na"&gt;fragment-id=&lt;/span&gt;&lt;span class="s"&gt;"set-backend-pools"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!-- - Step 5: Determine target backend pool based on model and permissions --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;include-fragment&lt;/span&gt; &lt;span class="na"&gt;fragment-id=&lt;/span&gt;&lt;span class="s"&gt;"set-target-backend-pool"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!-- - Step 6: Configure authentication and route to selected backend --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;include-fragment&lt;/span&gt; &lt;span class="na"&gt;fragment-id=&lt;/span&gt;&lt;span class="s"&gt;"set-backend-authorization"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!-- - Step 7: Configure collecting usage metrics --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;include-fragment&lt;/span&gt; &lt;span class="na"&gt;fragment-id=&lt;/span&gt;&lt;span class="s"&gt;"set-llm-usage"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!-- - CORS Configuration for AI Foundry Compatibility --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;include-fragment&lt;/span&gt; &lt;span class="na"&gt;fragment-id=&lt;/span&gt;&lt;span class="s"&gt;"ai-foundry-compatibility"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;llm-semantic-cache-lookup&lt;/span&gt; &lt;span class="na"&gt;score-threshold=&lt;/span&gt;&lt;span class="s"&gt;"0.0"&lt;/span&gt; &lt;span class="na"&gt;embeddings-backend-id=&lt;/span&gt;&lt;span class="s"&gt;"AI-SC-text-embedding-aokaqomw6tctxm5"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;rate-limit&lt;/span&gt; &lt;span class="na"&gt;calls=&lt;/span&gt;&lt;span class="s"&gt;"10"&lt;/span&gt; &lt;span class="na"&gt;renewal-period=&lt;/span&gt;&lt;span class="s"&gt;"60"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;llm-token-limit&lt;/span&gt; &lt;span class="na"&gt;remaining-quota-tokens-header-name=&lt;/span&gt;&lt;span class="s"&gt;"remaining-tokens"&lt;/span&gt; &lt;span class="na"&gt;remaining-tokens-header-name=&lt;/span&gt;&lt;span class="s"&gt;"remaining-tokens"&lt;/span&gt; &lt;span class="na"&gt;tokens-per-minute=&lt;/span&gt;&lt;span class="s"&gt;"1000"&lt;/span&gt; &lt;span class="na"&gt;token-quota=&lt;/span&gt;&lt;span class="s"&gt;"100"&lt;/span&gt; &lt;span class="na"&gt;token-quota-period=&lt;/span&gt;&lt;span class="s"&gt;"Hourly"&lt;/span&gt; &lt;span class="na"&gt;counter-key=&lt;/span&gt;&lt;span class="s"&gt;"@(context.Subscription.Id)"&lt;/span&gt; &lt;span class="na"&gt;estimate-prompt-tokens=&lt;/span&gt;&lt;span class="s"&gt;"true"&lt;/span&gt; &lt;span class="na"&gt;tokens-consumed-header-name=&lt;/span&gt;&lt;span class="s"&gt;"consumed-tokens"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;llm-emit-token-metric&lt;/span&gt; &lt;span class="na"&gt;namespace=&lt;/span&gt;&lt;span class="s"&gt;"llm-metrics"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;dimension&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"Client IP"&lt;/span&gt; &lt;span class="na"&gt;value=&lt;/span&gt;&lt;span class="s"&gt;"@(context.Request.IpAddress)"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;dimension&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"API ID"&lt;/span&gt; &lt;span class="na"&gt;value=&lt;/span&gt;&lt;span class="s"&gt;"@(context.Api.Id)"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;dimension&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"User ID"&lt;/span&gt; &lt;span class="na"&gt;value=&lt;/span&gt;&lt;span class="s"&gt;"@(context.Request.Headers.GetValueOrDefault("&lt;/span&gt;&lt;span class="err"&gt;x-user-id",&lt;/span&gt; &lt;span class="err"&gt;"N/A"))"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/llm-emit-token-metric&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/inbound&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;backend&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!-- 
- Backend Retry Logic
Purpose: Implements retry mechanism for transient failures (429 throttling, 503 service unavailable)
Configuration:
- Retry count: Set to one less than number of backends in pool to try all backends
- Condition: Retries on 429 (throttling) or 503 (except when backend pool is unavailable)
- Strategy: First fast retry with zero interval, buffer request body for replay
--&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;retry&lt;/span&gt; &lt;span class="na"&gt;count=&lt;/span&gt;&lt;span class="s"&gt;"2"&lt;/span&gt; &lt;span class="na"&gt;interval=&lt;/span&gt;&lt;span class="s"&gt;"0"&lt;/span&gt; &lt;span class="na"&gt;first-fast-retry=&lt;/span&gt;&lt;span class="s"&gt;"true"&lt;/span&gt; &lt;span class="na"&gt;condition=&lt;/span&gt;&lt;span class="s"&gt;"@(context.Response.StatusCode == 429 || (context.Response.StatusCode == 503 &amp;amp;&amp;amp; !context.Response.StatusReason.Contains("&lt;/span&gt;&lt;span class="err"&gt;Backend&lt;/span&gt; &lt;span class="err"&gt;pool")&lt;/span&gt; &lt;span class="err"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="err"&gt;!context.Response.StatusReason.Contains("is&lt;/span&gt; &lt;span class="err"&gt;temporarily&lt;/span&gt; &lt;span class="err"&gt;unavailable")))"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;forward-request&lt;/span&gt; &lt;span class="na"&gt;buffer-request-body=&lt;/span&gt;&lt;span class="s"&gt;"true"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/retry&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/backend&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;outbound&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;base&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/outbound&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;on-error&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;base&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;&amp;lt;!-- 
- Error Handling and Custom Metrics
Purpose: Pushes custom metrics for 429 throttling errors to enable Azure Monitor alerts
Variables Set:
- service-name: Identifies the service generating the error
- target-deployment: The model that was requested when error occurred
Integration: Uses throttling-events fragment to send metrics to monitoring system
 --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;set-variable&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"service-name"&lt;/span&gt; &lt;span class="na"&gt;value=&lt;/span&gt;&lt;span class="s"&gt;"FeedbackForgev2"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;set-variable&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"target-deployment"&lt;/span&gt; &lt;span class="na"&gt;value=&lt;/span&gt;&lt;span class="s"&gt;"@((string)context.Variables["&lt;/span&gt;&lt;span class="err"&gt;requestedModel"])"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;include-fragment&lt;/span&gt; &lt;span class="na"&gt;fragment-id=&lt;/span&gt;&lt;span class="s"&gt;"throttling-events"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/on-error&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/policies&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
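&lt;p&gt;To make the retry and token-limit behavior concrete, here is a small client-side sketch (not part of the project) that mirrors the policy's retry condition and reads the usage headers it emits. The header names come from the llm-token-limit policy above; everything else is illustrative:&lt;/p&gt;

```python
# Client-side sketch of the gateway's behavior. The retry policy above retries
# on 429, or on 503 unless the status reason indicates the backend pool itself
# is temporarily unavailable. Header names come from the llm-token-limit policy.

def should_retry(status: int, reason: str = "") -> bool:
    """Mirror the APIM retry condition from the policy above."""
    if status == 429:
        return True
    if status == 503:
        # The policy skips retries when the reason mentions an unavailable pool.
        return "Backend pool" not in reason and "is temporarily unavailable" not in reason
    return False

def parse_token_headers(headers: dict) -> dict:
    """Read the usage headers emitted by the llm-token-limit policy (-1 if absent)."""
    return {
        "remaining": int(headers.get("remaining-tokens", -1)),
        "consumed": int(headers.get("consumed-tokens", -1)),
    }
```

&lt;p&gt;A client could check should_retry after each gateway response and back off before resending, and log parse_token_headers for per-subscription cost tracking.&lt;/p&gt;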



&lt;h3&gt;
  
  
  &lt;strong&gt;Azure Well-Architected Framework&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The Azure Well-Architected Framework AI workload assessment is a review tool that can be used to self-assess the readiness of an AI workload for production.&lt;/p&gt;

&lt;p&gt;Running AI workloads on Azure can be complex, and this assessment helps evaluate how well the system aligns with the best practices defined in the Well-Architected Framework pillars.&lt;/p&gt;

&lt;p&gt;Although this is not a full professional project following all stages of a typical SDLC, this assessment highlights several missing areas and helps identify gaps that would need to be addressed in a real production scenario.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Assessment Results&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The result of applying the assessment is not particularly strong, with most areas marked as &lt;strong&gt;Critical&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;However, this is expected given the scope of the project, which focuses on architecture and experimentation rather than production readiness.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Ati9igOlUCKF2R-QXFN7ycg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Ati9igOlUCKF2R-QXFN7ycg.jpeg" width="800" height="309"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AGDYn19wBXgbAg-EA46rgpQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AGDYn19wBXgbAg-EA46rgpQ.jpeg" width="800" height="363"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Vibe Coding&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;All the Python and React development for the agents, backend, and frontend systems was done using Claude Code from Anthropic.&lt;/p&gt;

&lt;p&gt;This significantly accelerated the project, reducing the implementation time from months to weeks.&lt;/p&gt;

&lt;p&gt;This was the first time building a non-trivial project of this scale using an LLM in a declarative way, and the quality of the results was a pleasant surprise.&lt;/p&gt;

&lt;p&gt;The project initially started with GitHub Copilot but later transitioned to Claude Code due to its stronger ability to understand context and generate more accurate and complete code.&lt;/p&gt;

&lt;p&gt;It also proved to be a very effective companion for problem-solving, helping identify complex errors, suggest fixes, and resolve issues related to libraries and version compatibility.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Lessons Learned&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Feature overlap across MAF, Foundry, and AI Gateway:&lt;/strong&gt; Some capabilities such as memory management and content safety appear across multiple layers, which can lead to duplication and requires clear architectural decisions on where to implement each concern.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Vibe Coding:&lt;/strong&gt; Refer to the previous section. While highly effective, it may not be suitable for everyone due to cost considerations. In this case, the total cost was approximately 260 euros over 3 months of development. Using a different model might reduce costs.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MAF complexity:&lt;/strong&gt; MAF provides multiple ways to instantiate and interact with agents, which can feel somewhat convoluted when deciding which approach to use.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AI Gateway setup is non-trivial:&lt;/strong&gt; Designing and implementing an AI Gateway landing zone requires significant effort and understanding of policies, security, and routing.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Multi-agent orchestration:&lt;/strong&gt; Complex orchestrations involving multiple agents are challenging by nature, but MAF significantly simplifies their implementation.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Production readiness requires more effort:&lt;/strong&gt; As highlighted in the Azure Well-Architected Framework assessment, a real production-grade system would require additional work across governance, security, and operational maturity.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Observability is critical:&lt;/strong&gt; Observability using OpenTelemetry (logs, traces, and metrics) is essential in AI systems to understand agent behavior and system execution at any point in time.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Source code:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/zodraz/feedback-forge" rel="noopener noreferrer"&gt;https://github.com/zodraz/feedback-forge&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/zodraz/ai-hub-gateway-feedbackforge" rel="noopener noreferrer"&gt;https://github.com/zodraz/ai-hub-gateway-feedbackforge&lt;/a&gt;&lt;/p&gt;

</description>
      <category>multiagentsystems</category>
      <category>microsoftagentframew</category>
      <category>microsoftfoundry</category>
      <category>agents</category>
    </item>
    <item>
      <title>Building FeedbackForge a Multi-Agent AI System on Azure with MAF, Foundry &amp; AI Gateway (Part 1)</title>
      <dc:creator>Abel</dc:creator>
      <pubDate>Mon, 30 Mar 2026 06:04:19 +0000</pubDate>
      <link>https://dev.to/tarantarantino/building-feedbackforge-a-multi-agent-ai-system-on-azure-with-maf-foundry-ai-gateway-part-1-3oba</link>
      <guid>https://dev.to/tarantarantino/building-feedbackforge-a-multi-agent-ai-system-on-azure-with-maf-foundry-ai-gateway-part-1-3oba</guid>
      <description>&lt;p&gt;&lt;em&gt;A practical guide to architecture, implementation, and lessons learned from a multi-agent AI system&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fehcmgfvz7y2l4q4cmx4p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fehcmgfvz7y2l4q4cmx4p.png" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Posts in this Series
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://medium.com/@tarantarantino/building-feedbackforge-a-multi-agent-ai-system-on-azure-with-maf-foundry-ai-gateway-part-1-f05e5beb3a2f" rel="noopener noreferrer"&gt;Building FeedbackForge a Multi-Agent AI System on Azure with MAF, Foundry &amp;amp; AI Gateway (Part 1) &lt;em&gt;(This Post)&lt;/em&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://medium.com/@tarantarantino/building-feedbackforge-a-multi-agent-ai-system-on-azure-with-maf-foundry-ai-gateway-part-2-8050bbad4d60" rel="noopener noreferrer"&gt;Building FeedbackForge a Multi-Agent AI System on Azure with MAF, Foundry &amp;amp; AI Gateway (Part 2)&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Introduction&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;FeedbackForge is an AI-powered feedback analysis system designed to transform raw customer feedback into actionable insights.&lt;/p&gt;

&lt;p&gt;Modern enterprises collect feedback from multiple sources such as web forms and platforms like Zendesk, but most of it remains underutilized due to the effort required to analyze it effectively.&lt;/p&gt;

&lt;p&gt;FeedbackForge addresses this by using a multi-agent architecture to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Analyze customer feedback at scale&lt;/li&gt;
&lt;li&gt;Detect anomalies and emerging issues&lt;/li&gt;
&lt;li&gt;Generate actionable recommendations&lt;/li&gt;
&lt;li&gt;Create tickets in systems like Jira&lt;/li&gt;
&lt;li&gt;Provide an executive dashboard for interactive insights&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This project is a side initiative aimed at exploring agentic AI systems using Microsoft technologies such as Microsoft Foundry, Microsoft Agent Framework (MAF), and Azure AI services.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Use Case&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;To put FeedbackForge into context, let’s imagine a scenario based on a fictional enterprise: Contoso Inc.&lt;/p&gt;

&lt;p&gt;Contoso is a global company that collects customer feedback through:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Traditional web forms&lt;/li&gt;
&lt;li&gt;Zendesk support tickets&lt;/li&gt;
&lt;li&gt;Mail-based feedback&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Despite having access to large amounts of feedback, they face several challenges:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Feedback is fragmented across systems&lt;/li&gt;
&lt;li&gt;Analysis is manual and time-consuming&lt;/li&gt;
&lt;li&gt;Issues are difficult to prioritize&lt;/li&gt;
&lt;li&gt;No clear connection between feedback and actions&lt;/li&gt;
&lt;li&gt;Insights arrive too late to be useful&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As a result, critical issues often go unnoticed, impacting customer satisfaction and retention.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Goals&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;FeedbackForge aims to solve these problems by enabling:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Automatic classification and analysis of feedback&lt;/li&gt;
&lt;li&gt;Real-time anomaly detection and issue tracking&lt;/li&gt;
&lt;li&gt;Generation of actionable recommendations&lt;/li&gt;
&lt;li&gt;Automatic creation of Jira tickets&lt;/li&gt;
&lt;li&gt;Dynamic FAQ generation from customer feedback&lt;/li&gt;
&lt;li&gt;An executive dashboard with a conversational interface&lt;/li&gt;
&lt;li&gt;Competitive intelligence and churn risk detection&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Architecture&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;FeedbackForge is designed as a multi-agent system, where multiple specialized agents collaborate to analyze feedback and generate insights.&lt;/p&gt;

&lt;p&gt;Each agent focuses on a specific task (e.g., sentiment analysis, anomaly detection, action generation…), allowing the system to be modular, scalable, and easier to maintain.&lt;/p&gt;

&lt;p&gt;Agents communicate externally using the Agent-to-Agent (A2A) protocol and interact with systems through Model Context Protocol (MCP) servers, enabling integration with tools such as Zendesk and Jira.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;AI Hub Gateway&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;To ensure governance, security, and observability, FeedbackForge uses a centralized AI Gateway based on Azure API Management.&lt;/p&gt;

&lt;p&gt;This gateway acts as a control plane for all AI interactions, enforcing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Security and access control&lt;/li&gt;
&lt;li&gt;Usage monitoring and cost management&lt;/li&gt;
&lt;li&gt;Policy enforcement and compliance&lt;/li&gt;
&lt;li&gt;Resiliency and load balancing&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The architecture follows a hub-and-spoke model, where a central AI gateway manages traffic while individual application environments operate independently within defined guardrails.&lt;/p&gt;

&lt;p&gt;This layer is implemented through the &lt;strong&gt;Citadel Governance Hub&lt;/strong&gt;, an enterprise-grade AI landing zone that provides a unified control plane for all AI workloads.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Why this matters&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Instead of allowing each application to directly call AI models, all traffic flows through a centralized gateway. This enables:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Governance &amp;amp; Security:&lt;/strong&gt; Consistent access control, identity management, and policy enforcement&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Observability &amp;amp; Compliance:&lt;/strong&gt; Centralized logging, metrics, and near real-time usage analytics&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Developer Velocity:&lt;/strong&gt; Standardized onboarding and reusable configuration patterns&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Hub-and-Spoke Architecture
&lt;/h4&gt;

&lt;p&gt;The system follows a &lt;strong&gt;hub-and-spoke&lt;/strong&gt; model:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Hub (Control Plane)&lt;/strong&gt; → Centralized governance and AI Gateway&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Spokes (Applications)&lt;/strong&gt; → Independent workloads and agents&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This allows teams to innovate independently while remaining within enterprise guardrails.&lt;/p&gt;

&lt;h4&gt;
  
  
  🎯 Citadel Governance Hub — Central Control Plane
&lt;/h4&gt;

&lt;p&gt;This is the central governance layer, providing the unified AI Gateway that all AI workloads route through.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fycsyzn3y0g2r4x83m4cr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fycsyzn3y0g2r4x83m4cr.png" width="800" height="634"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr0xxph5cg1m6rzbnqyaa.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr0xxph5cg1m6rzbnqyaa.jpeg" width="800" height="523"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F12v6m1s6ti9ev0yvdzd2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F12v6m1s6ti9ev0yvdzd2.png" width="800" height="518"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmyyuhstrvip93tun2xxp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmyyuhstrvip93tun2xxp.png" width="800" height="367"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Spoke Layer (Applications)&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The spoke layer contains the actual applications and agents that implement business logic.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpfs8uo9fh0mk8flwfwbr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpfs8uo9fh0mk8flwfwbr.png" width="800" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key Components&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Azure Container Apps:&lt;/strong&gt; Hosts the frontend, backend services, and agents with serverless scaling&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Cosmos DB:&lt;/strong&gt; Stores feedback data, generated FAQs, alerts, and workflow reports&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Redis:&lt;/strong&gt; Stores session data and conversational memory&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure AI Search:&lt;/strong&gt; Enables RAG-based FAQ generation and retrieval&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Zendesk Integration:&lt;/strong&gt; Source of customer support tickets and feedback&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Jira Integration:&lt;/strong&gt; Destination for automatically generated action items&lt;/li&gt;
&lt;/ul&gt;
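&lt;p&gt;For illustration, a feedback record as it flows into Cosmos DB might look like the following sketch. The field names here are assumptions made for this example, not the project's actual schema:&lt;/p&gt;

```python
# Illustrative shape of a feedback record as it might be stored in Cosmos DB.
# All field names are assumptions for illustration, not the project's schema.
from dataclasses import dataclass, asdict, field
from datetime import datetime, timezone

@dataclass
class FeedbackRecord:
    id: str
    source: str              # e.g. "zendesk", "web-form", "email"
    text: str
    sentiment: str = "unknown"   # filled in later by the analysis agents
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def to_document(self) -> dict:
        """Serialize to a plain dict suitable for upsert into a Cosmos DB container."""
        return asdict(self)
```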

&lt;h3&gt;
  
  
  &lt;strong&gt;Project Structure&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;FeedbackForge is composed of multiple components implemented in Python (backend) and TypeScript with React (frontend). The system is designed to be modular, allowing different execution modes depending on the use case.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Backend&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The backend is built around a single Python project (feedbackforge) that can run in multiple modes. Each mode represents a different way of interacting with the system.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Operating Modes&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;FeedbackForge supports five main operating modes:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6bmi7c50fxwa8r7fk88g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6bmi7c50fxwa8r7fk88g.png" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Chat Mode (DevUI) — Development Interface&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;An interactive chat interface for testing agents and workflows.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Best for: development, testing, and exploration&lt;/li&gt;
&lt;li&gt;Default port: 8090&lt;/li&gt;
&lt;li&gt;Command:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python &lt;span class="nt"&gt;-m&lt;/span&gt; feedbackforge chat
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;DevUI is a lightweight application built on the Microsoft Agent Framework that provides:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A web-based interface for interacting with agents&lt;/li&gt;
&lt;li&gt;An OpenAI-compatible API backend&lt;/li&gt;
&lt;li&gt;Tools for debugging and iterating on workflows&lt;/li&gt;
&lt;/ul&gt;
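&lt;p&gt;Because DevUI exposes an OpenAI-compatible API, it can be called like any chat-completions endpoint. The route and model name in this sketch are assumptions (only the port 8090 comes from the text above); check your DevUI version for the exact endpoint:&lt;/p&gt;

```python
# Sketch: calling DevUI's OpenAI-compatible backend on the default port 8090.
# The /v1/chat/completions route and the model name are assumptions.
import json
import urllib.request

def build_chat_request(prompt: str, model: str = "feedbackforge") -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request against a local DevUI."""
    body = json.dumps({
        "model": model,  # hypothetical agent/model name
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        "http://localhost:8090/v1/chat/completions",  # assumed route
        data=body,
        headers={"Content-Type": "application/json"},
    )

# To send: urllib.request.urlopen(build_chat_request("Summarize last week's feedback"))
```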

&lt;p&gt;&lt;strong&gt;2. Serve Mode (AG-UI) — Production Server&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A production-ready FastAPI server implementing the AG-UI protocol.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Best for: frontend integration and production deployments&lt;/li&gt;
&lt;li&gt;Default port: 8081&lt;/li&gt;
&lt;li&gt;Command:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python &lt;span class="nt"&gt;-m&lt;/span&gt; feedbackforge serve &lt;span class="nt"&gt;--port&lt;/span&gt; 8081
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;AG-UI (Agent User Interaction Protocol) is an event-driven protocol that standardizes communication between user-facing applications and AI agents. It enables:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Bi-directional communication&lt;/li&gt;
&lt;li&gt;Streaming responses&lt;/li&gt;
&lt;li&gt;Decoupling between frontend and backend&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw36tqw0gqiielmq9fwf6.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw36tqw0gqiielmq9fwf6.jpeg" width="538" height="185"&gt;&lt;/a&gt;&lt;/p&gt;
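&lt;p&gt;AG-UI streams its events to the frontend, typically as Server-Sent Events carrying JSON payloads. As a minimal sketch (not the project's code, and with illustrative event type names), a consumer of such a stream could parse it like this:&lt;/p&gt;

```python
# Minimal parser for a Server-Sent Events stream such as an AG-UI endpoint
# might emit. The event type names in the test are illustrative.
import json

def parse_sse(stream_text: str) -> list[dict]:
    """Collect the JSON payloads from the 'data:' lines of an SSE stream."""
    events = []
    for line in stream_text.splitlines():
        if line.startswith("data:"):
            payload = line[len("data:"):].strip()
            if payload:
                events.append(json.loads(payload))
    return events
```

&lt;p&gt;In practice the frontend consumes these events incrementally, appending streamed text deltas to the chat view as they arrive.&lt;/p&gt;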

&lt;p&gt;&lt;strong&gt;3. Workflow Mode — Batch Analysis&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Executes the full multi-agent pipeline for large-scale feedback analysis.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Best for: scheduled processing of survey or feedback data&lt;/li&gt;
&lt;li&gt;Command:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python &lt;span class="nt"&gt;-m&lt;/span&gt; feedbackforge workflow &lt;span class="nt"&gt;--max-surveys&lt;/span&gt; 50
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;4. FAQ Mode — RAG-Based FAQ Generation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Generates FAQs automatically from customer feedback using Azure AI Search.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Uses hybrid search (keyword + vector + semantic)&lt;/li&gt;
&lt;li&gt;Applies clustering to identify common themes&lt;/li&gt;
&lt;li&gt;Best for: customer support knowledge bases and documentation generation&lt;/li&gt;
&lt;li&gt;Command:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python &lt;span class="nt"&gt;-m&lt;/span&gt; feedbackforge faq &lt;span class="nt"&gt;--days&lt;/span&gt; 7 &lt;span class="nt"&gt;--max-faqs&lt;/span&gt; 20
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
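&lt;p&gt;The clustering step groups similar feedback before FAQs are generated from each theme. As a toy illustration of that idea (the real system relies on Azure AI Search hybrid retrieval; the threshold and greedy strategy below are assumptions), similar embeddings can be grouped by cosine similarity:&lt;/p&gt;

```python
# Toy version of the clustering step: group feedback items whose embeddings
# are similar by cosine similarity. Threshold and strategy are illustrative.
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity of two vectors (0.0 if either is zero-length)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def cluster(embeddings: list[list[float]], threshold: float = 0.85) -> list[list[int]]:
    """Greedy clustering: each item joins the first cluster whose seed is similar enough."""
    clusters: list[list[int]] = []
    for i, emb in enumerate(embeddings):
        for members in clusters:
            if cosine(embeddings[members[0]], emb) >= threshold:
                members.append(i)
                break
        else:
            clusters.append([i])
    return clusters
```

&lt;p&gt;Each resulting cluster would then be summarized into a single FAQ entry.&lt;/p&gt;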



&lt;p&gt;&lt;strong&gt;5. MCP Server Mode — External Integration&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Runs a Model Context Protocol (MCP) server to integrate external feedback sources.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Best for: connecting to real systems like Zendesk (replacing mock data with production data), and for CI/CD and automation scenarios&lt;/li&gt;
&lt;li&gt;Command:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python &lt;span class="nt"&gt;-m&lt;/span&gt; feedbackforge mcp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  &lt;strong&gt;Action Planner&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The Action Planner is an autonomous agent responsible for converting insights into actionable tasks. It integrates directly with systems like Jira to create and track issues.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key Features&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A2A Protocol → Seamless communication with other agents&lt;/li&gt;
&lt;li&gt;Multi-platform support → Create tickets in Jira&lt;/li&gt;
&lt;li&gt;Intelligent prioritization → Based on severity and impact&lt;/li&gt;
&lt;li&gt;Auto-assignment → Suggests responsible teams&lt;/li&gt;
&lt;li&gt;Effort estimation → T-shirt sizing (S/M/L/XL)&lt;/li&gt;
&lt;li&gt;Traceability → Links tickets back to original feedback&lt;/li&gt;
&lt;/ul&gt;
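&lt;p&gt;To make the prioritization and T-shirt sizing features concrete, here is a toy sketch of logic an agent like this might apply before creating a Jira ticket. The scoring rules and thresholds are assumptions for illustration, not the Action Planner's actual implementation:&lt;/p&gt;

```python
# Illustrative prioritization and T-shirt sizing, as the Action Planner might
# apply before creating a ticket. Scoring rules are assumptions.

def priority(severity: int, impact: int) -> str:
    """Map severity and impact (1-5 each) to a ticket priority via their product."""
    score = severity * impact
    if score >= 16:
        return "Critical"
    if score >= 9:
        return "High"
    if score >= 4:
        return "Medium"
    return "Low"

def tshirt_size(estimated_hours: float) -> str:
    """Simple T-shirt sizing (S/M/L/XL) from an effort estimate in hours."""
    if estimated_hours <= 4:
        return "S"
    if estimated_hours <= 16:
        return "M"
    if estimated_hours <= 40:
        return "L"
    return "XL"
```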

&lt;h4&gt;
  
  
  &lt;strong&gt;Frontend&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The frontend consists of two React applications.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;FeedbackForge FAQ Viewer:&lt;/strong&gt; A web interface for browsing auto-generated FAQs stored in Azure Cosmos DB. Designed for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Customer support teams&lt;/li&gt;
&lt;li&gt;End-user self-service&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;FeedbackForge Dashboard:&lt;/strong&gt; A React application integrated with AG-UI for real-time interaction with agents. Features:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Streaming responses from agents&lt;/li&gt;
&lt;li&gt;Interactive exploration of insights&lt;/li&gt;
&lt;li&gt;Executive-level dashboard experience&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Microsoft Agent Framework (MAF)&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;FeedbackForge is built using the &lt;strong&gt;Microsoft Agent Framework (MAF)&lt;/strong&gt;, a multi-language framework for building, orchestrating, and deploying AI agents.&lt;/p&gt;

&lt;p&gt;MAF provides the foundational building blocks required to create both simple and complex multi-agent systems.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Core Capabilities&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Agent runtime&lt;/strong&gt; → Execution environment for agents&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Model clients&lt;/strong&gt; → Chat completions and response handling&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Session management&lt;/strong&gt; → Stateful conversations&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Context providers&lt;/strong&gt; → Memory and context injection&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Middleware&lt;/strong&gt; → Interception and customization of agent behavior&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MCP clients&lt;/strong&gt; → Integration with external tools and services&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Workflow runtime&lt;/strong&gt; → Multi-agent workflows with graph-based orchestration&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Agents in FeedbackForge&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;FeedbackForge defines multiple specialized agents.&lt;/p&gt;

&lt;p&gt;🤖 &lt;strong&gt;Core Agents&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Action Planning Agent:&lt;/strong&gt; Converts feedback insights into trackable Jira action items.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dashboard Agent:&lt;/strong&gt; Executive Dashboard Assistant for analyzing customer feedback.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data Sync Agent:&lt;/strong&gt; Syncs external feedback sources (Zendesk) into the data store.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;🤖 &lt;strong&gt;Workflow Agents&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Initial Orchestrator Agent:&lt;/strong&gt; Validates survey data, assesses quality, plans analysis.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data Preprocessor Agent:&lt;/strong&gt; Cleans and prepares survey data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Sentiment Analyzer Agent:&lt;/strong&gt; Analyzes sentiment.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Topic Extractor Agent:&lt;/strong&gt; Extracts topics.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Anomaly Detector Agent:&lt;/strong&gt; Detects anomalies.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Competitive Intelligence Agent:&lt;/strong&gt; Extracts competitor info.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Insight Miner Agent:&lt;/strong&gt; Synthesizes analyses.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Priority Ranker Agent:&lt;/strong&gt; Ranks priorities.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Action Generator Agent:&lt;/strong&gt; Generates actions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Report Generator Agent:&lt;/strong&gt; Analyzes all the provided data and generates a comprehensive executive report.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Final Orchestrator Agent:&lt;/strong&gt; Reviews results.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each agent focuses on a specific responsibility, enabling modular and scalable processing.&lt;/p&gt;
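&lt;p&gt;The single-responsibility pipeline can be sketched in plain Python. The step names and heuristics below are hypothetical stand-ins; MAF's real workflow runtime uses graph-based orchestration rather than a simple loop:&lt;/p&gt;

```python
# Toy pipeline: each "agent" is a step that receives shared state,
# adds its own analysis, and passes the state on.

def preprocessor(state):
    # Stand-in for the Data Preprocessor Agent: clean raw responses.
    state["clean_responses"] = [r.strip() for r in state["responses"] if r.strip()]
    return state

def sentiment_analyzer(state):
    # Toy heuristic standing in for the LLM-backed Sentiment Analyzer Agent.
    negatives = sum("crash" in r or "fail" in r for r in state["clean_responses"])
    state["sentiment"] = "negative" if negatives else "positive"
    return state

def run_workflow(state, steps):
    for step in steps:
        state = step(state)
    return state

result = run_workflow(
    {"responses": ["  App crash on login ", "Love the new UI"]},
    [preprocessor, sentiment_analyzer],
)
```

&lt;p&gt;Because each step only reads and writes the shared state, steps can be added, removed, or reordered without touching the others.&lt;/p&gt;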

&lt;h4&gt;
  
  
  &lt;strong&gt;Tools&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The Microsoft Agent Framework supports many different types of tools that extend agent capabilities. Tools allow agents to interact with external systems, execute code, search data, and more. Tools are invoked dynamically by agents during execution, enabling them to move from reasoning to action.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Agent Tools&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Action Planning Agent&lt;/strong&gt;: Analyze Issue, Create Jira Ticket, Get Available Systems&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data Sync Agent&lt;/strong&gt;: Zendesk via MCP, Check Sync Status&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dashboard Agent&lt;/strong&gt;: Get Weekly Summary, Get Issue Details, Get Competitor Insights, Get Customer Context, Check for Anomalies, Set Alert, Generate Action Items, Escalate to Team, Create Action Plan, Run Workflow Analysis, Get Latest Workflow Report, Get Workflow Reports History&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Example Tool:&lt;/strong&gt; Get Weekly Summary&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="nd"&gt;@ai_function&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_weekly_summary&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Get weekly feedback summary with sentiment, top issues, and urgent items.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;debug&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;📊 Calling get_weekly_summary tool&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;summary&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;feedback_store&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get_weekly_summary&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;total_responses&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;total_responses&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;sentiment&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;sentiment&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;top_issues&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
                &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;issue&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;mentions&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;priority&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;P0&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;40&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;P1&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;25&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;P2&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
                &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;top_issues&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
            &lt;span class="p"&gt;],&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;urgent_items&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;urgent_count&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="n"&gt;indent&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;debug&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;✅ get_weekly_summary completed&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;
    &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;❌ Error in get_weekly_summary: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;exc_info&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;error&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;message&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Failed to retrieve weekly summary&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="n"&gt;indent&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  &lt;strong&gt;MCP (Model Context Protocol)&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;MCP enables agents to interact with external systems as tools. In FeedbackForge, an MCP server is used as a tool for the Data Sync Agent to communicate with Zendesk. This MCP server uses SSE transport over HTTP and includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Tools&lt;/strong&gt;: Fetch Zendesk Tickets, Ingest Feedback to Store, Create Zendesk Ticket&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Resources&lt;/strong&gt;: Structured data endpoints (e.g., feedbackforge://feedback/recent)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Prompts&lt;/strong&gt;: Predefined reasoning templates (e.g., Analyze feedback trends, Generate executive summary)&lt;/li&gt;
&lt;/ul&gt;
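&lt;p&gt;The three MCP primitives can be illustrated with a toy dispatcher. This is not the MCP SDK (a real server speaks JSON-RPC over SSE), and the tool body is a stub; it only shows how tools, resources, and prompts are registered and invoked:&lt;/p&gt;

```python
# Toy registries standing in for an MCP server's three primitives.
tools = {}
resources = {}
prompts = {}

def tool(name):
    """Decorator that registers a function as a callable tool."""
    def register(fn):
        tools[name] = fn
        return fn
    return register

@tool("fetch_zendesk_tickets")
def fetch_zendesk_tickets(days: int = 7):
    # Hypothetical stub; the real tool would call the Zendesk API.
    return [{"id": "ZD-1", "subject": "Login fails", "age_days": 2}]

# Resources are addressable data endpoints; prompts are reasoning templates.
resources["feedbackforge://feedback/recent"] = lambda: [{"id": "fb-1"}]
prompts["analyze_feedback_trends"] = "Summarize the main trends in {feedback}."

def call_tool(name, **kwargs):
    return tools[name](**kwargs)
```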

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2cjnxe7sq6kl3hpmr1iz.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2cjnxe7sq6kl3hpmr1iz.jpeg" width="800" height="364"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnv6t3ve4x25wfsenf4aa.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnv6t3ve4x25wfsenf4aa.jpeg" width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;A2A (Agent-to-Agent Protocol)&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;FeedbackForge uses the A2A protocol, an open standard designed to enable seamless communication and collaboration between AI agents. It enables the Dashboard Agent to communicate with the Action Planning Agent, which analyzes the issue, determines priority, category, and effort, and generates a Jira issue.&lt;/p&gt;

&lt;p&gt;Below we can see how the Dashboard Agent transparently calls the ActionPlanner agent, asking it to create a ticket from an action item.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9uq7qt380pjmixz8vmf0.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9uq7qt380pjmixz8vmf0.jpeg" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Agent Cards&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The A2A Protocol standardizes the data format shared during the discovery process via &lt;strong&gt;Agent Cards&lt;/strong&gt;. These cards contain standardized descriptions that agents use to advertise their capabilities to other agents.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example of an Agent Card (JSON format):&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"capabilities"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"pushNotifications"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"streaming"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"defaultInputModes"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"text"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"defaultOutputModes"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"text"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Converts customer feedback insights into trackable tickets in Jira"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"ActionPlanningAgent"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"preferredTransport"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"JSONRPC"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"protocolVersion"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"0.3.0"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"skills"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Analyzes customer feedback issues and creates actionable tickets with proper prioritization, categorization, and effort estimation. Supports Jira."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"examples"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Create tickets for iOS crash issue affecting 45 users"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Analyze payment failure feedback and create action plan"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Generate tickets for login performance complaints"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"action_planner_ticket_creation"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"ActionPlanner"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"tags"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"action-planning"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"ticket-creation"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"jira"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"feedback"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"agent-framework"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"http://0.0.0.0:8084/"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"version"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1.0.0"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Agent Discovery&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For agents to collaborate, they first need to discover each other. A2A supports multiple discovery methods:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;DNS-based Discovery&lt;/li&gt;
&lt;li&gt;Registry-based Discovery&lt;/li&gt;
&lt;li&gt;Private Discovery&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In our case, we use &lt;strong&gt;private discovery&lt;/strong&gt;, which means directly configuring known agent endpoints using their already-known Azure Container Apps (ACA) URLs.&lt;/p&gt;
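&lt;p&gt;A private-discovery setup is little more than a configured map of endpoints. The URL below is a placeholder, and the well-known agent-card path shown is the one commonly used by A2A implementations; treat it as an assumption:&lt;/p&gt;

```python
# Private discovery: endpoints are configured directly, not found via DNS
# or a registry. The base URL here is a placeholder.
KNOWN_AGENTS = {
    "action_planner": "https://action-planner.example.azurecontainerapps.io",
}

def agent_card_url(agent_name: str) -> str:
    """Build the URL where the agent's Agent Card can be fetched."""
    base = KNOWN_AGENTS[agent_name].rstrip("/")
    return base + "/.well-known/agent.json"
```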

&lt;p&gt;&lt;strong&gt;Task Processing Flow&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The A2A protocol defines a standard flow for task processing between agents:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9b4tz4knw6veczip6ay6.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9b4tz4knw6veczip6ay6.jpeg" width="800" height="659"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Communication Protocols&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A2A is built on established communication protocols:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;HTTP/HTTPS: For basic request/response communication&lt;/li&gt;
&lt;li&gt;JSON-RPC: For structured method calls&lt;/li&gt;
&lt;li&gt;Server-Sent Events (SSE): For streaming responses&lt;/li&gt;
&lt;li&gt;JSON: As the standard data exchange format&lt;/li&gt;
&lt;/ul&gt;
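&lt;p&gt;Putting these together, an A2A client call is a JSON-RPC request sent over HTTP. The &lt;code&gt;message/send&lt;/code&gt; method name follows the A2A specification; treat the exact payload shape below as an approximation:&lt;/p&gt;

```python
import json
import uuid

def build_message_send(text: str) -> str:
    """Build a JSON-RPC 2.0 envelope for an A2A message/send call."""
    request = {
        "jsonrpc": "2.0",
        "id": str(uuid.uuid4()),  # unique request id
        "method": "message/send",
        "params": {
            "message": {
                "role": "user",
                "parts": [{"kind": "text", "text": text}],
            }
        },
    }
    return json.dumps(request)
```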

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk9cm77g71fk072iys7pe.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk9cm77g71fk072iys7pe.jpeg" width="800" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Memory and Sessions&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;FeedbackForge, through the Microsoft Agent Framework (MAF), stores and manages conversations through sessions in two modes: in-memory and persistent. Persistent mode uses short-term storage with a configurable TTL.&lt;/p&gt;

&lt;p&gt;This ensures that when we refresh a page on the Dashboard UI, we do not lose the current conversations.&lt;/p&gt;
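&lt;p&gt;The TTL-based persistence idea can be sketched as follows. The class and method names are hypothetical; MAF provides its own session abstractions:&lt;/p&gt;

```python
import time

class SessionStore:
    """Toy persistent session store that expires entries after a TTL."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._data = {}

    def save(self, session_id: str, messages: list) -> None:
        # Record the save time alongside the conversation.
        self._data[session_id] = (time.monotonic(), messages)

    def load(self, session_id: str):
        entry = self._data.get(session_id)
        if entry is None:
            return None
        saved_at, messages = entry
        if time.monotonic() - saved_at > self.ttl:
            del self._data[session_id]  # expired: drop and report a miss
            return None
        return messages
```

&lt;p&gt;A page refresh simply calls &lt;code&gt;load&lt;/code&gt; again: within the TTL the conversation comes back, after it the store returns nothing and a fresh session begins.&lt;/p&gt;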

&lt;h4&gt;
  
  
  &lt;strong&gt;RAG and Azure AI Search&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Retrieval Augmented Generation (RAG) is used to create FAQs based on common topics found in customer issues. It leverages Azure AI Search by creating a search index over the fields stored in Cosmos DB, supporting vector and semantic search capabilities.&lt;/p&gt;

&lt;p&gt;For semantic search, the fields &lt;em&gt;platform&lt;/em&gt;, &lt;em&gt;sentiment&lt;/em&gt;, and &lt;em&gt;topics&lt;/em&gt; are used. Embeddings are generated for the &lt;em&gt;text&lt;/em&gt; field using the &lt;em&gt;text-embedding-3-small&lt;/em&gt; model. A hybrid search approach (keyword + vector + semantic) is then applied to achieve the best query accuracy.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Use hybrid search for better accuracy
# Search for question/problem patterns
&lt;/span&gt;&lt;span class="n"&gt;search_queries&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;how to use feature&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;usage questions&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;why not working error&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;error reports&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;can I do this&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;capability questions&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;problem with issue&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;problems&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;confused about unclear&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;clarity issues&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;how do I configure&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;configuration&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
      &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;where is located&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;navigation&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
      &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;when will available&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;availability&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
  &lt;span class="p"&gt;]&lt;/span&gt;

  &lt;span class="n"&gt;all_feedback&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt; &lt;span class="c1"&gt;# id -&amp;gt; feedback dict
&lt;/span&gt;  &lt;span class="n"&gt;feedback_to_query&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt; &lt;span class="c1"&gt;# id -&amp;gt; query that found it
&lt;/span&gt;
  &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;category&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;search_queries&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
      &lt;span class="n"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt; Searching: &lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt; (&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;category&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;)&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

      &lt;span class="c1"&gt;# Use HYBRID search with Azure AI Search (keyword + vector + semantic)
&lt;/span&gt;      &lt;span class="n"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;rag_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;hybrid_search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
          &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="n"&gt;get_embeddings_func&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;get_embeddings&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="n"&gt;top&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="n"&gt;filters&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;time_filter&lt;/span&gt;
      &lt;span class="p"&gt;)&lt;/span&gt;

      &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
          &lt;span class="n"&gt;feedback_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;id&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
          &lt;span class="n"&gt;text&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;''&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

          &lt;span class="c1"&gt;# Only include question-like or problem feedback
&lt;/span&gt;          &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;_is_question_like&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
              &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;feedback_id&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;all_feedback&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                  &lt;span class="n"&gt;all_feedback&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;feedback_id&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                      &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;id&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;feedback_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                      &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                      &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;customer&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;customer_name&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
                      &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;segment&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;customer_segment&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
                      &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;platform&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;platform&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
                      &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;rating&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;rating&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
                      &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;timestamp&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;timestamp&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
                      &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;search_score&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;reranker_score&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;search_score&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)),&lt;/span&gt;
                      &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text_vector&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text_vector&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="c1"&gt;# For clustering
&lt;/span&gt;                  &lt;span class="p"&gt;}&lt;/span&gt;
              &lt;span class="n"&gt;feedback_to_query&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;feedback_id&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;category&lt;/span&gt;

  &lt;span class="n"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt; Found &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;all_feedback&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; question/problem feedback items&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Clustering is performed with cosine similarity over the embeddings of the feedback items. The &lt;em&gt;themes&lt;/em&gt; variable in the code below is a list derived from the &lt;em&gt;all_feedback&lt;/em&gt; dictionary, containing only the question-like or problem-related feedback items retrieved through hybrid search.&lt;br&gt;
&lt;/p&gt;
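The derivation of `themes` is not shown in the article; conceptually it is just the collected values of the `all_feedback` dictionary. A toy sketch (the field values here are invented for illustration):

```python
# Hypothetical sketch (field values invented): the article derives `themes`
# from the `all_feedback` dictionary built up during hybrid search.
all_feedback = {
    "fb-1": {"id": "fb-1", "text": "App crashes on launch", "search_score": 3.2},
    "fb-2": {"id": "fb-2", "text": "Settings screen freezes", "search_score": 2.9},
}

# Each value already carries the text, metadata, and (optionally) text_vector,
# so the list of values is exactly what the clustering method consumes.
themes = list(all_feedback.values())
```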

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;_cluster_themes_with_vectors&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;themes&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;List&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;Dict&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;List&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;Dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Any&lt;/span&gt;&lt;span class="p"&gt;]]:&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    Cluster similar themes using VECTOR SIMILARITY from Azure AI Search.

    This uses the actual embeddings stored in Azure AI Search for semantic clustering.

    Args:
        themes: List of theme dictionaries (with text_vector if available)

    Returns:
        List of clustered themes with counts
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;themes&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;

    &lt;span class="n"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt; Clustering &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;themes&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; items using vector similarity...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# Sort by search score (highest relevance first)
&lt;/span&gt;    &lt;span class="n"&gt;themes&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;sorted&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;themes&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="k"&gt;lambda&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;search_score&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;reverse&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="n"&gt;clusters&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
    &lt;span class="n"&gt;used_indices&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;set&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;similarity_threshold&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;0.75&lt;/span&gt; &lt;span class="c1"&gt;# Cosine similarity threshold
&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;theme&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;enumerate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;themes&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;used_indices&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;continue&lt;/span&gt;

        &lt;span class="c1"&gt;# Create new cluster
&lt;/span&gt;        &lt;span class="n"&gt;cluster&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;representative_text&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;theme&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;count&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;samples&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;theme&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;platforms&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;theme&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;platform&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)],&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;segments&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;theme&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;segment&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)],&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;avg_rating&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;theme&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;rating&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;avg_search_score&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;theme&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;search_score&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;

        &lt;span class="c1"&gt;# Get embedding for this theme (if available)
&lt;/span&gt;        &lt;span class="n"&gt;theme_vector&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;theme&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text_vector&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Find similar themes using vector similarity
&lt;/span&gt;        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;other&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;enumerate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;themes&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;=&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;used_indices&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="k"&gt;continue&lt;/span&gt;

            &lt;span class="c1"&gt;# Use vector similarity if available
&lt;/span&gt;            &lt;span class="n"&gt;is_similar&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;False&lt;/span&gt;
            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;theme_vector&lt;/span&gt; &lt;span class="ow"&gt;and&lt;/span&gt; &lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text_vector&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
                &lt;span class="n"&gt;similarity&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;_cosine_similarity&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;theme_vector&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text_vector&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
                &lt;span class="n"&gt;is_similar&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;similarity&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="n"&gt;similarity_threshold&lt;/span&gt;
            &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="c1"&gt;# Fallback to text similarity
&lt;/span&gt;                &lt;span class="n"&gt;is_similar&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;_are_similar&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;theme&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;threshold&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.6&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;is_similar&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="n"&gt;cluster&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;count&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;
                &lt;span class="n"&gt;cluster&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;samples&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="n"&gt;cluster&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;platforms&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;platform&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
                &lt;span class="n"&gt;cluster&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;segments&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;segment&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
                &lt;span class="n"&gt;used_indices&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="n"&gt;used_indices&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Calculate averages
&lt;/span&gt;        &lt;span class="n"&gt;ratings&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;rating&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;s&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;cluster&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;samples&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;rating&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;ratings&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;cluster&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;avg_rating&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ratings&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ratings&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="n"&gt;scores&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;search_score&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;s&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;cluster&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;samples&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]]&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;scores&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;cluster&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;avg_search_score&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;scores&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;scores&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="n"&gt;clusters&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cluster&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="n"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt; Created &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;clusters&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; clusters&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;clusters&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
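The `_cosine_similarity` helper referenced above is not shown in the article; a minimal pure-Python sketch of such a helper (written as a standalone function rather than a method, for illustration):

```python
import math

# Illustrative stand-in for the `_cosine_similarity` helper used above
# (the article does not show it; this is a plain function, not the project's code).
def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity of two equal-length vectors; 0.0 for zero vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    if norm_a == 0.0 or norm_b == 0.0:
        return 0.0
    return dot / (norm_a * norm_b)
```

In practice the embeddings coming back from Azure AI Search are already normalized-length-agnostic for this purpose, so a plain dot-product-over-norms implementation like this is sufficient for the 0.75 threshold comparison.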



&lt;h4&gt;
  
  
  &lt;strong&gt;Workflows&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The project uses MAF workflows, which are designed to handle complex business processes that may involve multiple agents, human interactions, and integrations with external systems.&lt;/p&gt;

&lt;p&gt;Our survey analysis workflow can be executed in two ways: as a scheduled job on Azure Container Apps, or synchronously on demand from the dashboard.&lt;/p&gt;

&lt;p&gt;All agents involved in the workflow and their responsibilities are described in the Agents section. The workflow uses sequential, fan-out, and fan-in patterns, as shown in the following image.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frloyd0424e7dkl6ao7al.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frloyd0424e7dkl6ao7al.png" width="668" height="1003"&gt;&lt;/a&gt;&lt;/p&gt;
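In MAF the sequential, fan-out, and fan-in edges are declared on the workflow builder; as a framework-agnostic illustration of what those patterns mean here, a plain asyncio sketch (the agent names and return values are hypothetical, not the project's MAF executors):

```python
import asyncio

# Framework-agnostic sketch of the fan-out / fan-in pattern; agent names
# and return values are hypothetical, not the project's MAF executors.
async def run_agent(name: str, items: list[str]) -> str:
    await asyncio.sleep(0)  # stand-in for a model call
    return f"{name}: analyzed {len(items)} issues"

async def analysis_workflow(issues: list[str]) -> list[str]:
    # Fan-out: independent analysis agents run concurrently on the same input.
    results = await asyncio.gather(
        run_agent("sentiment", issues),
        run_agent("themes", issues),
        run_agent("anomalies", issues),
    )
    # Fan-in: a summarizer consumes the combined results in a final step.
    summary = await run_agent("summarizer", list(results))
    return [*results, summary]
```

The sketch only shows the data flow; in MAF this wiring is expressed declaratively on the workflow graph rather than hand-rolled with `asyncio.gather`.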

&lt;p&gt;In this workflow, all issues stored in Azure Cosmos DB are processed to generate an executive summary. In async mode, the result is returned as JSON.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"key_metrics"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="nl"&gt;"total_responses"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="nl"&gt;"sentiment_breakdown"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"executive_summary"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"All 20 survey responses are unanimously negative, indicating a severe, platform&lt;/span&gt;&lt;span class="se"&gt;\u&lt;/span&gt;&lt;span class="s2"&gt;2011wide iOS app failure triggered immediately after the iOS 17 update. Feedback highlights two dominant crash modes&lt;/span&gt;&lt;span class="se"&gt;\u&lt;/span&gt;&lt;span class="s2"&gt;2014app launch and settings&lt;/span&gt;&lt;span class="se"&gt;\u&lt;/span&gt;&lt;span class="s2"&gt;2011screen access&lt;/span&gt;&lt;span class="se"&gt;\u&lt;/span&gt;&lt;span class="s2"&gt;2014supported by high-volume, repetitive reports across all customer segments, including high&lt;/span&gt;&lt;span class="se"&gt;\u&lt;/span&gt;&lt;span class="s2"&gt;2011value enterprise accounts. Key themes include iOS 17 compatibility breakage, widespread functional instability, repeat crashes, and issues consistently tied to the settings screen. Anomalies such as identical crash patterns, uniformly low ratings, and repeated complaints signal an active crisis with P0-level urgency. No competitive threats or mentions were identified, confirming that the problem is strictly product&lt;/span&gt;&lt;span class="se"&gt;\u&lt;/span&gt;&lt;span class="s2"&gt;2011related. Overall sentiment reflects total user frustration and high churn risk if remediation is delayed."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"positive"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"neutral"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"negative"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="nl"&gt;"top_topics"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="s2"&gt;"ios17_update_breakage: 8"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="s2"&gt;"general_crashing_on_iphone: 4"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="s2"&gt;"settings_screen_crashes: 6"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="s2"&gt;"repeat_crashes_unacceptable: 2"&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="nl"&gt;"critical_count"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="nl"&gt;"high_priority_count"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"critical_issues"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"issue"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"App crashes immediately on launch for all iOS users after iOS 17 update"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"priority"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"P0"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"impact"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Affects all iOS customer segments including enterprise accounts"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"category"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"ios17_update_breakage"&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"issue"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Settings screen consistently triggers crashes due to likely unhandled iOS 17 changes"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"priority"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"P0"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"impact"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"High-volume identical reports across users"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"category"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"settings_screen_crashes"&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"issue"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Repeated identical crash submissions indicate systemic failure and heightened customer frustration"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"priority"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"P1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"impact"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Multiple reports per user and wave of 1&lt;/span&gt;&lt;span class="se"&gt;\u&lt;/span&gt;&lt;span class="s2"&gt;2011star ratings"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"category"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"repeat_crashes_unacceptable"&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"issue"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Lack of pre&lt;/span&gt;&lt;span class="se"&gt;\u&lt;/span&gt;&lt;span class="s2"&gt;2011release compatibility testing for major iOS updates"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"priority"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"P2"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"impact"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Increased likelihood of future outages and regressions"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"category"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"process_gap"&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"recommendations"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Deploy an emergency hotfix for iOS 17 launch and settings&lt;/span&gt;&lt;span class="se"&gt;\u&lt;/span&gt;&lt;span class="s2"&gt;2011screen crashes"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"priority"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Immediate"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"owner"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Mobile Engineering"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"rationale"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"P0 failures blocking basic app access for all iOS users"&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Run targeted debugging on iOS 17 devices to identify memory, permissions, and deprecated API triggers"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"priority"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Immediate"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"owner"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Mobile Engineering"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"rationale"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Likely compatibility issues introduced with iOS 17"&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Conduct full compatibility testing across iOS 17 devices and minor versions"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"priority"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"High"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"owner"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"QA"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"rationale"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Prevent additional hidden crash paths"&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Publish user&lt;/span&gt;&lt;span class="se"&gt;\u&lt;/span&gt;&lt;span class="s2"&gt;2011facing communication acknowledging the issue and outlining hotfix progress"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"priority"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"High"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"owner"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Customer Support / Comms"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"rationale"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Mitigate churn and reassure customers during platform&lt;/span&gt;&lt;span class="se"&gt;\u&lt;/span&gt;&lt;span class="s2"&gt;2011wide outage"&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Implement pre&lt;/span&gt;&lt;span class="se"&gt;\u&lt;/span&gt;&lt;span class="s2"&gt;2011release OS testing workflows for future major iOS versions"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"priority"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Medium"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"owner"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Mobile Engineering / QA"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"rationale"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Addresses known process gap and prevents recurrence"&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Conduct proactive outreach to enterprise customers affected by the outage"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"priority"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"High"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"owner"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Account Management"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
       &lt;/span&gt;&lt;span class="nl"&gt;"rationale"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Prevent churn among high&lt;/span&gt;&lt;span class="se"&gt;\u&lt;/span&gt;&lt;span class="s2"&gt;2011value accounts experiencing total app failure"&lt;/span&gt;&lt;span class="w"&gt;
     &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
 &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When called from the Dashboard UI, the system displays a human-readable response:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Ar8N4sxYeOVuyXzXFj7TANw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Ar8N4sxYeOVuyXzXFj7TANw.png" width="800" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Observability&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The Microsoft Agent Framework provides built-in support for observability, allowing you to monitor the behavior of your agents.&lt;/p&gt;

&lt;p&gt;It integrates with OpenTelemetry, emitting traces, logs, and metrics, and uses the Azure Monitor exporter to send them to Application Insights.&lt;/p&gt;
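&lt;p&gt;For reference, the manual wiring looks roughly like this. It is a minimal sketch using the standard OpenTelemetry SDK and the azure-monitor-opentelemetry-exporter package; the environment variable name is the conventional one, not taken from the project:&lt;/p&gt;

```python
# Minimal sketch: route traces to Application Insights via the Azure Monitor
# exporter. APPLICATIONINSIGHTS_CONNECTION_STRING is the conventional env var.
import os

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter

provider = TracerProvider()
exporter = AzureMonitorTraceExporter(
    connection_string=os.environ["APPLICATIONINSIGHTS_CONNECTION_STRING"]
)
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Any tracer obtained afterwards emits spans to Application Insights.
tracer = trace.get_tracer("feedbackforge")
```

&lt;p&gt;The BatchSpanProcessor buffers spans and exports them asynchronously, keeping tracing overhead off the request path.&lt;/p&gt;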

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Abt-jPAhz1aO9ngMX4jsttA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Abt-jPAhz1aO9ngMX4jsttA.jpeg" width="800" height="344"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Microsoft Foundry also includes the new Agent Monitoring Dashboard (in preview):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AxNaOA-6e3UoorYIRKMAbIg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AxNaOA-6e3UoorYIRKMAbIg.jpeg" width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2ANXFj_RdFADBUmMzM_nt8CQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2ANXFj_RdFADBUmMzM_nt8CQ.jpeg" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There is also a brand-new Grafana view:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Ab5jnCektsa2t0PCj99eqcQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Ab5jnCektsa2t0PCj99eqcQ.jpeg" width="800" height="346"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The following spans and metrics are created automatically:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Spans:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;invoke_agent &lt;/li&gt;
&lt;li&gt;chat &lt;/li&gt;
&lt;li&gt;execute_tool &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Metrics:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;gen_ai.client.operation.duration (histogram)&lt;/li&gt;
&lt;li&gt;gen_ai.client.token.usage (histogram)&lt;/li&gt;
&lt;li&gt;feedback_processed (counter)&lt;/li&gt;
&lt;li&gt;sessions_created (counter)&lt;/li&gt;
&lt;li&gt;agent_executions (counter)&lt;/li&gt;
&lt;li&gt;agent_duration (counter)&lt;/li&gt;
&lt;li&gt;agent_framework.function.invocation.duration (histogram)&lt;/li&gt;
&lt;/ul&gt;
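&lt;p&gt;The custom counters above can be registered through the OpenTelemetry metrics API. A sketch follows; the meter name and attribute keys are illustrative, not taken from the project:&lt;/p&gt;

```python
# Sketch: registering custom counters via the OpenTelemetry metrics API.
# Without a configured SDK these are no-op instruments, so this is safe to
# import anywhere. Meter name and attribute keys are illustrative.
from opentelemetry import metrics

meter = metrics.get_meter("feedbackforge")

feedback_processed = meter.create_counter(
    "feedback_processed", description="Feedback items processed"
)
agent_executions = meter.create_counter(
    "agent_executions", description="Agent runs"
)

# Increment with dimensions, e.g. per agent:
agent_executions.add(1, attributes={"agent": "dashboard"})
```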

&lt;p&gt;We also trace several operations by decorating methods with the @trace_operation decorator:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="nd"&gt;@trace_operation&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;memory.get_latest_workflow_report&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_latest_workflow_report&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;Dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Any&lt;/span&gt;&lt;span class="p"&gt;]]:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Get the most recent workflow analysis report.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;reports&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get_workflow_reports&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;limit&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;reports&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;reports&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Failed to retrieve latest workflow report: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
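&lt;p&gt;The decorator itself is easy to sketch. The real implementation presumably opens an OpenTelemetry span named after the operation; this stdlib-only stand-in records timing to a logger instead, but shows the shape:&lt;/p&gt;

```python
# Hypothetical stand-in for @trace_operation: the production version would
# start an OpenTelemetry span; here we just time the call and log the result.
import functools
import logging
import time

logger = logging.getLogger(__name__)


def trace_operation(name: str):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                elapsed_ms = (time.perf_counter() - start) * 1000
                logger.info("operation=%s duration_ms=%.1f", name, elapsed_ms)
        return wrapper
    return decorator
```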



&lt;h4&gt;
  
  
  &lt;strong&gt;ACA (Azure Container Apps)&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Azure Container Apps hosts the entire system, including both backend and frontend components. Key benefits include simplified deployment and application management, built-in load-based scaling (including serverless scale to zero), and revision-based releases.&lt;/p&gt;

&lt;p&gt;The following image shows a list of the deployed container apps:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AvGk5Ub2IcAvhN-bcqOXK3Q.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AvGk5Ub2IcAvhN-bcqOXK3Q.jpeg" width="800" height="132"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Jobs&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Two components run on a cron schedule. These use Azure Container Apps jobs:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AFtjo7hpuWuM1SmC4wS4ccQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AFtjo7hpuWuM1SmC4wS4ccQ.jpeg" width="800" height="181"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Azure Cosmos DB&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Cosmos DB is the primary database. It was chosen because it enables fast, productive application development and offers high availability, high throughput, low latency, and tunable consistency.&lt;/p&gt;

&lt;p&gt;The system uses a database called &lt;strong&gt;feedbackforge&lt;/strong&gt; with the following containers:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;feedback:&lt;/strong&gt; stores all the raw feedback items.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;faqs:&lt;/strong&gt; stores generated FAQs derived from feedback.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;alerts:&lt;/strong&gt; stores alerts created by the Dashboard Agent.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;workflow_reports:&lt;/strong&gt; stores the results of the survey workflows.&lt;/li&gt;
&lt;/ul&gt;
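&lt;p&gt;Provisioning this layout with the azure-cosmos SDK looks roughly like the following. The endpoint/key handling and the partition key path are assumptions for illustration, not taken from the project:&lt;/p&gt;

```python
# Sketch: creating the feedbackforge database and its containers with the
# azure-cosmos SDK. Partition key path "/id" is an illustrative assumption.
import os

from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(os.environ["COSMOS_ENDPOINT"], os.environ["COSMOS_KEY"])
db = client.create_database_if_not_exists("feedbackforge")

for name in ("feedback", "faqs", "alerts", "workflow_reports"):
    db.create_container_if_not_exists(
        id=name, partition_key=PartitionKey(path="/id")
    )
```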

&lt;h3&gt;
  
  
  Coming Soon
&lt;/h3&gt;

&lt;p&gt;📌In the next post, I’ll dig into Microsoft Foundry and AI Gateway.&lt;/p&gt;

&lt;p&gt;👀 &lt;strong&gt;Follow me&lt;/strong&gt; to catch the next part.&lt;/p&gt;

&lt;p&gt;💻 &lt;strong&gt;Source code:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/zodraz/feedback-forge" rel="noopener noreferrer"&gt;https://github.com/zodraz/feedback-forge&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/zodraz/ai-hub-gateway-feedbackforge" rel="noopener noreferrer"&gt;https://github.com/zodraz/ai-hub-gateway-feedbackforge&lt;/a&gt;&lt;/p&gt;

</description>
      <category>microsoftfoundry</category>
      <category>multiagentsystems</category>
      <category>agents</category>
      <category>aigateway</category>
    </item>
    <item>
      <title>Tenark: Architecting a Scalable SaaS Multi-Tenant Platform with GitOps</title>
      <dc:creator>Abel</dc:creator>
      <pubDate>Sat, 27 Sep 2025 19:27:12 +0000</pubDate>
      <link>https://dev.to/tarantarantino/tenark-architecting-a-scalable-saas-multi-tenant-platform-with-gitops-52f2</link>
      <guid>https://dev.to/tarantarantino/tenark-architecting-a-scalable-saas-multi-tenant-platform-with-gitops-52f2</guid>
      <description>&lt;p&gt;A cloud native platform showcasing how AKS, Crossplane, Terraform, vClusters and ArgoCD enable scalable, secure, and efficient multi-tenant management with GitOps principles.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4nus59tbc7ia7iqh1lan.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4nus59tbc7ia7iqh1lan.png" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Introduction&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;This article presents a proof of concept (POC) for Tenark, a multi-tenant Software as a Service (SaaS) system designed as an online bookstore. The POC implements a SaaS control plane using the GitOps Bridge pattern, leveraging technologies like Azure Kubernetes Service (AKS), Crossplane, Terraform, Virtual Clusters (vCluster), and ArgoCD. This architecture provides a scalable, secure, and cost-effective solution for managing multi-tenant environments, balancing isolation and resource efficiency.&lt;/p&gt;

&lt;p&gt;The goal is to demonstrate how modern cloud-native tools can simplify tenant provisioning, management, and application deployment while maintaining flexibility for future scalability, such as supporting premium offerings.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Use Case&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;In many enterprises, multi-tenant SaaS systems are critical for delivering scalable, service-centric solutions. Tenark addresses the need for seamless tenant creation, isolation, and management in a SaaS environment. As an online bookstore, Tenark allows customers to browse, purchase, and manage books, with features like inventory tracking, order processing, and personalized recommendations. The system supports enterprise needs for multi-tenant management with features like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Security:&lt;/strong&gt; Ensuring tenant data and operations are isolated and protected.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tenant Provisioning:&lt;/strong&gt; Streamlining the creation and configuration of new tenants.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tenant Data Storage:&lt;/strong&gt; Managing tenant-specific databases and caches.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Metrics, Logging, and Monitoring:&lt;/strong&gt; Providing insights into system performance and usage.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A planned &lt;strong&gt;premium tier&lt;/strong&gt; for high-value customers will offer exclusive books, fully isolated Kubernetes clusters for better performance and security, and advanced analytics. Though not implemented in this POC, this feature highlights the architecture’s flexibility.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Architecture Overview&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The following diagram illustrates Tenark’s high-level architecture:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Freazmuxg841h3859ifuh.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Freazmuxg841h3859ifuh.jpeg" width="800" height="371"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Key Components&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;GitHub and GitHub Actions:&lt;/strong&gt; Hosts source code and automates deployment workflows for AKS, applications, and infrastructure.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Terraform Cloud:&lt;/strong&gt; Provisions the main AKS cluster and bootstraps system components.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AKS:&lt;/strong&gt; Hosts two node pools:
&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;System Pool:&lt;/strong&gt; Runs system components like ArgoCD and Crossplane.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;User Pool:&lt;/strong&gt; Hosts the SaaS Control Plane and Application Plane via vClusters.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tenant Data:&lt;/strong&gt; Managed in Azure Resource Groups, containing tenant-specific infrastructure like Redis caches and SQL Server databases.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Platform Services:&lt;/strong&gt; Includes Entra ID for identity management, Azure Monitor for observability, KeyVault for secrets, and DNS for routing.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These components are detailed in the following sections.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;What is SaaS?&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Software as a Service (SaaS) is a delivery model where software is hosted centrally and provided to customers over the internet on a subscription basis. Unlike traditional software, SaaS emphasizes a shared experience, reducing the need for customer-specific customizations. This approach lowers maintenance costs and enables rapid scaling.&lt;/p&gt;

&lt;p&gt;For Tenark, SaaS means delivering a seamless bookstore experience to multiple tenants (e.g., individual users, libraries, or bookstores) while maintaining a single codebase and infrastructure core.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Multi-Tenancy in Tenark&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Multi-tenancy allows a single application instance to serve multiple customers (tenants) while ensuring data and configuration isolation. Tenark employs a hybrid multi-tenancy model, balancing shared and isolated components based on business and performance needs.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Multi-Tenancy Approaches&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Common multi-tenancy strategies include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Single Application, Single Database:&lt;/strong&gt; All tenants share one application and database.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Single Application, Multiple Databases:&lt;/strong&gt; One application, but each tenant has a dedicated database.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Multiple Applications, Single Database:&lt;/strong&gt; Each tenant has a dedicated application instance sharing a database.&lt;/li&gt;
&lt;/ul&gt;
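&lt;p&gt;The “single application, multiple databases” variant boils down to resolving a tenant identifier to that tenant’s connection string at request time. A minimal, framework-free sketch (tenant names and connection strings are purely illustrative):&lt;/p&gt;

```python
# Toy illustration of "single application, multiple databases": one shared
# codebase, with each tenant routed to its own database at request time.
# Tenant names and connection strings are purely illustrative.
TENANT_DATABASES = {
    "libraries-r-us": "Server=sql-lru;Database=store_lru",
    "city-bookshop": "Server=sql-cbs;Database=store_cbs",
}


def connection_for(tenant_id: str) -> str:
    """Resolve a tenant's dedicated connection string; fail fast if unknown."""
    try:
        return TENANT_DATABASES[tenant_id]
    except KeyError:
        raise LookupError(f"unknown tenant: {tenant_id}") from None
```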

&lt;p&gt;Tenark combines these approaches for different components:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;SaaS Control Plane:&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;✅ &lt;strong&gt;Deployment:&lt;/strong&gt; Shared across all tenants in a single vCluster.&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Database:&lt;/strong&gt; Shared database in a common Azure Resource Group.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;SaaS Web Portal:&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;✅ &lt;strong&gt;Deployment:&lt;/strong&gt; Shared across all tenants in a single vCluster.&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Database:&lt;/strong&gt; Shared database in a common Azure Resource Group.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;SaaS Identity Server:&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;✅ &lt;strong&gt;Deployment:&lt;/strong&gt; Shared across all tenants in a single vCluster.&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Database:&lt;/strong&gt; Shared database in a common Azure Resource Group.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Tenant Microservices (Ordering, Inventory, Recommendation):&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;✅ &lt;strong&gt;Deployment:&lt;/strong&gt; Each microservice per tenant runs in a dedicated vCluster.&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Database:&lt;/strong&gt; Separate databases per tenant and service in distinct Azure Resource Groups, with a shared Redis cache per tenant.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Benefits of Tenark’s Approach&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Enhanced Security:&lt;/strong&gt; Dedicated vClusters and isolated databases for tenant-specific microservices ensure robust data separation and protection against unauthorized access.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data and Performance Isolation:&lt;/strong&gt; Tenant-specific microservices and databases prevent “noisy neighbor” issues, especially for resource-intensive services like recommendations.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost Efficiency:&lt;/strong&gt; Shared components reduce maintenance overhead.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scalability:&lt;/strong&gt; The architecture supports premium tiers with fully isolated clusters for high-value tenants.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Flexibility:&lt;/strong&gt; A common codebase simplifies updates and feature rollouts.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For example, the recommendation microservice, which may experience usage spikes, benefits from isolated deployments to ensure performance stability across tenants. By running in dedicated vClusters, each tenant’s recommendation engine operates independently, preventing one tenant’s high demand (e.g., during a promotional campaign) from impacting others. This isolation also allows for tenant-specific customization, such as tailored machine learning models, without affecting the shared infrastructure, and ensures dedicated resource allocation for consistent response times.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;GitOps Bridge Pattern&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The GitOps Bridge pattern, a community-driven best practice, streamlines Kubernetes cluster provisioning and management using GitOps principles. It integrates tools like Terraform, ArgoCD, and Crossplane to automate infrastructure and application deployments.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F156u9dnud9lbziiawxnh.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F156u9dnud9lbziiawxnh.jpeg" width="800" height="315"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The pattern was originally created for AWS, but, being cloud native, it can be extended to any hyperscaler.&lt;/p&gt;

&lt;p&gt;In Tenark, the GitOps Bridge pattern provisions the AKS control plane and orchestrates tenant-specific infrastructure and applications.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Control Plane&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Tenark leverages the GitOps Bridge pattern to establish a primary AKS control plane, which orchestrates the deployment of all infrastructure and applications required for the system. This main control plane, hosted on an Azure Kubernetes Service (AKS) cluster, serves as the central management layer, coordinating resources and ensuring seamless operation of the SaaS platform. It is based on the &lt;a href="https://github.com/Azure-Samples/aks-platform-engineering" rel="noopener noreferrer"&gt;AKS Platform Engineering&lt;/a&gt; project.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8748css4rvocbr8fekmn.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8748css4rvocbr8fekmn.jpeg" width="800" height="421"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Microsoft and AWS provide specific guidelines for building SaaS control planes. In Tenark, it’s critical to distinguish that the AKS control plane acts as the global orchestrator, managing all deployments. Within these deployments, we define two key SaaS components: the SaaS Control Plane, which handles tenant management, and the SaaS Application Plane, which delivers the user-facing experience. These components, detailed in the following sections, operate within the broader AKS control plane, leveraging tools like ArgoCD and Crossplane to automate and streamline their deployment.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;SaaS Control Plane&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The SaaS Control Plane is foundational to any multi-tenant SaaS model. It provides a centralized set of services that enable operators to manage and operate all tenants through a single, unified interface, regardless of the underlying deployment or isolation model. This control plane handles critical tasks such as tenant provisioning and user authentication, ensuring a consistent and scalable experience for managing the SaaS environment.&lt;/p&gt;

&lt;p&gt;This control plane can be visualized as shown in the following diagram:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxlfcmsvzm5rcga71o4x8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxlfcmsvzm5rcga71o4x8.png" width="387" height="415"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Implementation&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Tenark’s SaaS Control Plane is implemented via the &lt;strong&gt;Tenark.Web.ControlPlane&lt;/strong&gt; administration application, built using the ABP.io framework.&lt;/p&gt;

&lt;p&gt;ABP.io is an open-source framework for building modular, multi-tenant applications with features like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Built-in multi-tenancy support with UI for tenant management.&lt;/li&gt;
&lt;li&gt;Role-based access control (RBAC) and user management.&lt;/li&gt;
&lt;li&gt;Extensible product and service management.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9pclvb4iuzg19d94nguf.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9pclvb4iuzg19d94nguf.jpeg" width="800" height="233"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;Tenark.AuthServer&lt;/strong&gt;, based on &lt;strong&gt;IdentityServer&lt;/strong&gt;, handles identity, admin user management, authentication, and authorization, ensuring secure tenant and user access.&lt;/p&gt;

&lt;p&gt;All of these components are deployed in a shared vCluster named tenant-common. Below, we can see all the pods isolated within just that vCluster.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwaskec1cy4x05by6awd2.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwaskec1cy4x05by6awd2.jpeg" width="800" height="233"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Future Enhancements&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;While the POC focuses on core functionality, future iterations could include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Onboarding&lt;/strong&gt;: Automating tenant creation with predefined YAML templates and connection strings.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Billing Integration&lt;/strong&gt;: Supporting subscription models for premium tiers.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Metrics and Monitoring&lt;/strong&gt;: Integrating Azure Monitor for tenant-specific analytics.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Automated onboarding, while ideal, requires complex dynamic automation (e.g., generating tenant-specific YAML manifests, configurations, and infrastructure). For this POC, operators manually configure tenant properties, which is sufficient for demonstration purposes.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;SaaS Application Plane&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The SaaS Application Plane is central to delivering the user-facing functionality in a multi-tenant SaaS model. It provides the services and interfaces that enable tenants to interact with the application, ensuring a seamless and customized experience while leveraging the underlying multi-tenant architecture.&lt;/p&gt;

&lt;p&gt;This application plane can be visualized as shown in the following diagram:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsba2wyg89hmgm2b4j07y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsba2wyg89hmgm2b4j07y.png" width="382" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Implementation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;Tenark.Web.Portal&lt;/strong&gt; implements the SaaS application, which consumes a set of tenant-specific application services, implemented as microservices running as Kubernetes pods. Each microservice operates in a dedicated vCluster, ensuring complete isolation between tenants, as detailed in later sections. These microservices include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Inventory&lt;/strong&gt;: Manages book stock and availability, ensuring accurate tracking of inventory for each tenant.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Ordering&lt;/strong&gt;: Processes purchases and order tracking, handling tenant-specific transactions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Recommendation&lt;/strong&gt;: Generates personalized book suggestions using tenant-specific data, leveraging isolated compute resources for performance.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each microservice is supported by its own SQL Server database and a shared, tenant-specific Redis cache for optimized performance.&lt;/p&gt;

&lt;p&gt;And as before, we can see all the pods deployed in an isolated vCluster, with only the services needed for this application plane.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ak0yac5x21nyvjbqmrl.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ak0yac5x21nyvjbqmrl.jpeg" width="800" height="188"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Virtual Clusters&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Virtual Clusters (vClusters) are lightweight, isolated Kubernetes clusters running within a single physical AKS cluster. Each vCluster has its own API server, providing stronger isolation than namespaces without the overhead of separate clusters.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Benefits&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Robust Isolation&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;✅ Granular permissions per tenant.&lt;/li&gt;
&lt;li&gt;✅ Isolated control planes and networking.&lt;/li&gt;
&lt;li&gt;✅ Customizable security policies.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost Efficiency&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;✅ Lightweight infrastructure reduces resource consumption.&lt;/li&gt;
&lt;li&gt;✅ Simplified management compared to dedicated clusters.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scalability&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;✅ Reduced API server load.&lt;/li&gt;
&lt;li&gt;✅ Conflict-free Custom Resource Definition (CRD) management.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Flexibility&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;✅ Supports diverse Kubernetes environments.&lt;/li&gt;
&lt;li&gt;✅ Runs on any Kubernetes-compatible platform.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Tenark uses vClusters for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Tenant-Common&lt;/strong&gt;: Hosts shared components like the SaaS Control Plane and Web Portal.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tenant-Specific&lt;/strong&gt;: Isolates microservices (Ordering, Inventory, Recommendation) per tenant.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkleseiq5dnt88hj0eq7j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkleseiq5dnt88hj0eq7j.png" width="800" height="572"&gt;&lt;/a&gt;&lt;/p&gt;
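&lt;p&gt;As a rough sketch of how such a tenant vCluster can be declared, the following values file targets the vcluster Helm chart. This is an illustrative assumption only: the sync rules and resource limits shown are not Tenark’s actual configuration.&lt;/p&gt;

```yaml
# Illustrative values.yaml for a tenant vCluster (vcluster Helm chart).
# The sync rules and resource limits below are assumptions for demonstration.
sync:
  toHost:
    ingresses:
      enabled: true        # expose tenant ingresses through the host cluster
controlPlane:
  statefulSet:
    resources:
      limits:
        cpu: "1"           # cap the virtual API server per tenant
        memory: 2Gi
```

&lt;p&gt;A cluster like this can then be created with, for example, &lt;code&gt;vcluster create tenant2 --namespace tenant2 -f values.yaml&lt;/code&gt;.&lt;/p&gt;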

&lt;h3&gt;
  
  
  &lt;strong&gt;ArgoCD&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;ArgoCD is the GitOps engine for deploying applications and infrastructure. It uses ApplicationSets and Kustomize manifests to manage tenant deployments across environments (Dev, QA, Prod). For the POC, only the Dev environment is implemented, but the system is designed for multi-environment support.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fatriyjzh3ckny20d4j32.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fatriyjzh3ckny20d4j32.jpeg" width="800" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Example: ApplicationSet for Tenant Deployment&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The following ApplicationSet deploys the tenant-common vCluster and tenant-specific vClusters (tenant1, tenant2, tenant3) using a matrix generator:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdjs7iqrwib0amjtwj72t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdjs7iqrwib0amjtwj72t.png" width="681" height="863"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This setup allows operators to add or remove tenants simply by updating the generator’s elements list, ensuring flexibility in tenant management.&lt;/p&gt;
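&lt;p&gt;As a hedged sketch of such a manifest (the repository paths, project, and sync policy below are illustrative assumptions, not the exact ApplicationSet from the screenshot), a matrix generator combines the tenant list with the environment list:&lt;/p&gt;

```yaml
# Illustrative ApplicationSet pairing tenants with environments via a matrix generator.
apiVersion: argoproj.io/v1alpha1
kind: ApplicationSet
metadata:
  name: tenant-vclusters
  namespace: argocd
spec:
  generators:
    - matrix:
        generators:
          - list:
              elements:          # add or remove tenants here
                - tenant: tenant-common
                - tenant: tenant1
                - tenant: tenant2
                - tenant: tenant3
          - list:
              elements:
                - env: dev       # QA and Prod would be added later
  template:
    metadata:
      name: '{{tenant}}-{{env}}'
    spec:
      project: default
      source:
        repoURL: https://github.com/zodraz/tenark-gitops-bridge
        targetRevision: HEAD
        path: 'tenants/{{tenant}}/overlays/{{env}}'   # hypothetical repo layout
      destination:
        server: https://kubernetes.default.svc
        namespace: '{{tenant}}'
      syncPolicy:
        automated:
          prune: true
          selfHeal: true
```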

&lt;h4&gt;
  
  
  &lt;strong&gt;Kustomize Overlay&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The Kustomize overlay defines environment-specific configurations, such as the Dev environment:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg3tncgwdzogxlhsxqlde.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg3tncgwdzogxlhsxqlde.png" width="542" height="87"&gt;&lt;/a&gt;&lt;/p&gt;
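&lt;p&gt;Such an overlay is typically a small kustomization.yaml layered on a shared base; the paths and labels below are assumptions for illustration, not the exact file from the screenshot:&lt;/p&gt;

```yaml
# Illustrative tenants/tenant2/overlays/dev/kustomization.yaml (paths are assumptions).
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - ../../base        # shared tenant manifests
commonLabels:
  environment: dev    # tag every resource with its environment
```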

&lt;h4&gt;
  
  
  &lt;strong&gt;Tenant Folder Structure&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Each tenant (e.g., tenant2), as well as tenant-common, follows a Kustomize-based folder structure for environment-specific deployments:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fju2fikf65p75xw6mxlln.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fju2fikf65p75xw6mxlln.jpg" alt="Tenant Folder Structure" width="617" height="1089"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Terraform&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;While most infrastructure in Tenark is provisioned through Crossplane, the initial setup of the AKS Control Plane, which hosts Crossplane and other system components, is created using Terraform via the GitOps Bridge pattern. This pattern enables the provisioning of the AKS cluster with all necessary add-ons, such as ArgoCD and Crossplane, to support the SaaS platform.&lt;/p&gt;

&lt;p&gt;Tenark leverages &lt;strong&gt;Terraform Cloud&lt;/strong&gt;, a managed service platform designed to streamline Terraform usage in collaborative, team-oriented environments. Terraform Cloud enhances team workflows with features like state management and automated runs, though the system could also operate with the standard version of Terraform if needed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flg8yuxt1qe399b8k3ijw.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flg8yuxt1qe399b8k3ijw.jpg" alt="Terraform" width="800" height="357"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Policies&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Terraform Cloud includes &lt;strong&gt;Sentinel&lt;/strong&gt;, a policy-as-code tool that enforces rules to control what users of HashiCorp products can do. Sentinel proactively prevents unauthorized changes by validating infrastructure configurations before deployment. In Tenark, Sentinel policies enforce requirements such as mandatory tags on all Azure resources, ensuring compliance and consistency across the infrastructure.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy07chn3ud4r52rvrg6x1.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy07chn3ud4r52rvrg6x1.jpg" alt="Policies" width="800" height="232"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Crossplane&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;All the infrastructure resources needed for Tenark (except the AKS Control Plane) are created with Crossplane, a Kubernetes-native control plane framework for platform engineering. Crossplane enables the creation of custom control planes to manage cloud-native software, allowing the design of APIs and abstractions that users interact with to provision and manage infrastructure.&lt;/p&gt;

&lt;p&gt;This notion of a control plane differs from the main AKS control plane and the SaaS Control Plane but similarly serves as a component that manages and orchestrates the delivery of the desired infrastructure in Tenark’s scenario. This aligns with the Kubernetes concept of a control plane, which oversees resource management.&lt;/p&gt;

&lt;p&gt;A key advantage of Crossplane is its use of a reconciliation loop, leveraging Kubernetes to continuously ensure infrastructure matches the desired state. Unlike Terraform’s push-based model, which requires explicit terraform plan and apply commands and state management, Crossplane automates this process, eliminating the need for such manual operations.&lt;/p&gt;

&lt;p&gt;Crossplane uses YAML as its declaration language and is integrated with ArgoCD to deploy these configurations seamlessly.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Managed Resources&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;A managed resource in Crossplane represents an external service in a provider, in this case, the &lt;strong&gt;Upbound Azure Provider&lt;/strong&gt;. Upbound, the primary maintainer of Crossplane, offers providers for various platforms, including AWS and Google Cloud. For Tenark, managed resources define Azure services like Redis caches and SQL databases.&lt;/p&gt;

&lt;p&gt;Here is an example of a Redis cache configuration:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyw2o8j1f03b3ul131dua.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyw2o8j1f03b3ul131dua.png" alt=" " width="367" height="342"&gt;&lt;/a&gt;&lt;/p&gt;
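&lt;p&gt;For reference, a managed resource of this kind might look as follows; this is a hedged sketch based on the Upbound Azure provider’s RedisCache kind, with names, SKU values, and the provider config reference chosen purely for illustration:&lt;/p&gt;

```yaml
# Illustrative Crossplane managed resource for an Azure Redis cache
# (Upbound Azure provider). Names and SKU values are assumptions.
apiVersion: cache.azure.upbound.io/v1beta1
kind: RedisCache
metadata:
  name: tenant2-redis
spec:
  forProvider:
    location: West Europe
    resourceGroupName: rg-tenark-tenant2
    capacity: 0          # smallest size within the chosen family
    family: C
    skuName: Basic
  providerConfigRef:
    name: azure-provider-config
```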

&lt;h4&gt;
  
  
  &lt;strong&gt;Compositions&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Crossplane’s Compositions are a powerful feature, allowing multiple Kubernetes resources to be represented as a single Kubernetes object. Composite resources are created when users interact with a custom API defined in a &lt;strong&gt;CompositeResourceDefinition&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Tenark uses compositions to create the common tenant infrastructure stack, which includes a SQL Server database and a Redis cache. An example of such a composition is shown below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feiz6m5mavxlnf0qgj7z4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feiz6m5mavxlnf0qgj7z4.png" alt=" " width="800" height="360"&gt;&lt;/a&gt;&lt;/p&gt;
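&lt;p&gt;To illustrate the idea (the API group, kinds, and patches below are assumptions, not Tenark’s exact composition), such a composition could combine a SQL server and a Redis cache behind one composite resource:&lt;/p&gt;

```yaml
# Illustrative Composition: one composite tenant stack backed by two Azure resources.
# The tenark.io API group and the field paths are hypothetical.
apiVersion: apiextensions.crossplane.io/v1
kind: Composition
metadata:
  name: tenant-stack.tenark.io
spec:
  compositeTypeRef:
    apiVersion: tenark.io/v1alpha1
    kind: XTenantStack
  resources:
    - name: sqlserver
      base:
        apiVersion: sql.azure.upbound.io/v1beta1
        kind: MSSQLServer
        spec:
          forProvider:
            version: "12.0"
      patches:
        - type: FromCompositeFieldPath
          fromFieldPath: spec.parameters.location
          toFieldPath: spec.forProvider.location
    - name: redis
      base:
        apiVersion: cache.azure.upbound.io/v1beta1
        kind: RedisCache
        spec:
          forProvider:
            capacity: 0
            family: C
            skuName: Basic
      patches:
        - type: FromCompositeFieldPath
          fromFieldPath: spec.parameters.location
          toFieldPath: spec.forProvider.location
```

&lt;p&gt;A claim against the XTenantStack API would then fan out into both managed resources, which Crossplane keeps reconciled against Azure.&lt;/p&gt;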

&lt;h3&gt;
  
  
  &lt;strong&gt;Lessons Learned&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;GitOps Bridge Pattern&lt;/strong&gt;: Simplifies AKS provisioning and integrates seamlessly with ArgoCD and Crossplane.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;vClusters&lt;/strong&gt;: Provide robust isolation for multi-tenancy, balancing cost and performance compared to namespaces or dedicated clusters.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;ArgoCD and vClusters&lt;/strong&gt;: Enable flexible tenant management with minimal configuration changes.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Crossplane Reconciliation&lt;/strong&gt;: Offers powerful automation but requires time to stabilize; occasional sync delays were observed.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Crossplane Compositions&lt;/strong&gt;: Complex to design but very helpful for abstracting infrastructure stacks. Tools like Claude Code assisted in their creation.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Crossplane Learning Curve&lt;/strong&gt;: Steeper than Terraform’s due to its Kubernetes-native approach, but its reconciliation model reduces operational overhead.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Git Repositories
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://github.com/zodraz/tenark-identity-server" rel="noopener noreferrer"&gt;https://github.com/zodraz/tenark-identity-server&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/zodraz/tenark-services" rel="noopener noreferrer"&gt;https://github.com/zodraz/tenark-services&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/zodraz/tenark-webs" rel="noopener noreferrer"&gt;https://github.com/zodraz/tenark-webs&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/zodraz/tenark-gitops-bridge" rel="noopener noreferrer"&gt;https://github.com/zodraz/tenark-gitops-bridge&lt;/a&gt;&lt;/p&gt;

</description>
      <category>terraform</category>
      <category>crossplane</category>
      <category>argocd</category>
      <category>multitenancy</category>
    </item>
    <item>
      <title>OmniSync: Dynamics 365 Integration with Salesforce and Fabric (Part 4)</title>
      <dc:creator>Abel</dc:creator>
      <pubDate>Sun, 11 May 2025 13:23:47 +0000</pubDate>
      <link>https://dev.to/tarantarantino/omnisync-dynamics-365-integration-with-salesforce-and-fabric-part-4-4nom</link>
      <guid>https://dev.to/tarantarantino/omnisync-dynamics-365-integration-with-salesforce-and-fabric-part-4-4nom</guid>
      <description>&lt;p&gt;A hands-on walkthrough of syncing Dynamics 365 using Dataverse, PowerApps, Plugins, and Azure Logic Apps as part of a multi-platform integration with Salesforce and Microsoft Fabric.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa1h393b4absnhzaudf8r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa1h393b4absnhzaudf8r.png" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Posts in this Series
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://medium.com/@tarantarantino/omnisync-a-real-world-architecture-for-syncing-salesforce-d365-and-fabric-in-near-real-time-17bdfb29469e" rel="noopener noreferrer"&gt;OmniSync: A Real-World Architecture for Syncing Salesforce, D365, and Fabric in Near Real-Time (Part 1)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://medium.com/@tarantarantino/omnisync-near-real-time-lakehouse-spark-streaming-and-power-bi-in-microsoft-fabric-part-2-6f3177f0931a" rel="noopener noreferrer"&gt;OmniSync: Real-Time Lakehouse, Spark Streaming, and Power BI in Microsoft Fabric (Part 2)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://medium.com/@tarantarantino/omnisync-integrating-salesforce-with-microsoft-fabric-and-dynamics-365-part-3-2a96f86e94a0" rel="noopener noreferrer"&gt;OmniSync: Integrating Salesforce with Microsoft Fabric and Dynamics 365 (Part 3)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;em&gt;OmniSync:&lt;/em&gt; Dynamics 365 Integration with Salesforce and Fabric (Part 4)&lt;em&gt;(This Post)&lt;/em&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Introduction&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Dynamics 365 Sales&lt;/strong&gt; was used as the second CRM in the OmniSync integration architecture, synchronized alongside Salesforce. While both platforms offer rich and overlapping CRM functionality, Dynamics 365 was chosen for this role due to its tight integration with the Microsoft ecosystem (Power Platform and Dataverse) and its extensive market usage.&lt;/p&gt;

&lt;p&gt;As with Salesforce, the goal wasn’t to cover the full feature set, but rather to align a &lt;strong&gt;core retail-centric subset of entities&lt;/strong&gt; across systems.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Retail Entities &amp;amp; Tradeoffs&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Instead of using &lt;strong&gt;Dynamics 365 Commerce&lt;/strong&gt; (the specialized retail vertical product, with features such as POS, checkout, and payments, among others), we used the &lt;strong&gt;standard Sales module&lt;/strong&gt;, which still offered the necessary capabilities, with the exception of Stores, which we added manually.&lt;/p&gt;

&lt;p&gt;To simplify the implementation and reduce system coupling, several tradeoffs were made, much like in the Salesforce setup:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Orders&lt;/strong&gt; were created directly, bypassing Opportunities and Leads for simplicity.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Order Numbers&lt;/strong&gt; are aligned manually with Salesforce to avoid complex sequencing logic.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Product Custom Fields&lt;/strong&gt; were used (e.g., Size, Weight, Brand), avoiding the built-in Product Properties model.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Price Books&lt;/strong&gt; include standard pricing and cost fields; &lt;strong&gt;Price List Items&lt;/strong&gt; were not used.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Product Families&lt;/strong&gt; and &lt;strong&gt;Product Groups&lt;/strong&gt; were skipped; instead, categories were stored as simple product-level fields.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Discounts&lt;/strong&gt; and many optional fields were excluded from synchronization.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Multi-currency&lt;/strong&gt; support was simplified by using EUR only.&lt;/li&gt;
&lt;li&gt;Unused fields were ignored to keep the system lightweight and avoid unnecessary sync complexity.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Customization&lt;/strong&gt;
&lt;/h3&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;PowerApps &amp;amp; Dataverse&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Dynamics 365 Sales is built on top of Microsoft’s &lt;strong&gt;Power Platform&lt;/strong&gt;, which includes &lt;strong&gt;Dataverse&lt;/strong&gt; (as the backend database) and &lt;strong&gt;PowerApps&lt;/strong&gt; for customizing tables and UI. This architecture allows for easy and scalable customization without needing to write full applications.&lt;/p&gt;

&lt;p&gt;In our case, Dynamics 365 didn’t include a &lt;strong&gt;Retail Store&lt;/strong&gt; entity by default, since it’s part of more specialized modules like D365 Commerce. So we created it from scratch using Dataverse and customized its forms, menus, and behavior through PowerApps.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Dataverse Table&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Using the PowerApps interface, we defined a new Retail Store table in Dataverse with all necessary fields. This is done visually through a form-based interface where you can drag, drop, and define field types quickly.&lt;/p&gt;

&lt;p&gt;You can also use &lt;strong&gt;Quick Create&lt;/strong&gt; for a faster setup, or even &lt;strong&gt;Copilot&lt;/strong&gt;, which suggests fields based on prompts, accelerating the initial setup.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk46fp14356oi7gmxu8en.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk46fp14356oi7gmxu8en.jpeg" width="800" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Forms&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;After creating the table, PowerApps automatically generates default forms and views, which can then be tailored to fit your needs.&lt;/p&gt;

&lt;p&gt;This is part of building a &lt;strong&gt;Model-Driven App&lt;/strong&gt;, where the UI is dynamically generated based on the underlying data structure in Dataverse. You can adjust layouts, control visibility, define validation rules, and group fields as needed.&lt;/p&gt;

&lt;p&gt;A customized Retail Store form was created and later updated, with a clearly organized General Info section and a brand-new customized Address Details tab.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Falzz8nvbvxfc59b0xr9a.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Falzz8nvbvxfc59b0xr9a.jpeg" width="800" height="363"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Views&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Views were also created to define how lists of records are presented. These are displayed in a table or grid format, with options for sorting, filtering, column selection, and search.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj50y702bxarx8yp6hhpw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj50y702bxarx8yp6hhpw.jpeg" width="800" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Menus&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;To make this new entity usable within the Dynamics 365 Sales Hub, we added it to the app’s main navigation on the left pane. This lets users browse and manage retail stores directly, just like any standard CRM entity.&lt;/p&gt;

&lt;p&gt;In addition, a Quick Create menu was set up to allow rapid record creation from anywhere in the app, streamlining the user experience.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F51sw0qxu8vv34ng0ghav.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F51sw0qxu8vv34ng0ghav.jpeg" width="800" height="357"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;UI Result&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;All these configurations resulted in a fully functional and visually integrated Retail Store form inside Dynamics 365. Users can now view, create, and edit these records natively, just like standard entities such as Accounts or Products.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F85h3zmaaytg8n0zkc5pz.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F85h3zmaaytg8n0zkc5pz.jpeg" width="800" height="360"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbl47r64kisg600712h67.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbl47r64kisg600712h67.jpeg" width="800" height="357"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;In-App Notifications&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Logic was added to display a &lt;strong&gt;notification&lt;/strong&gt; when a synchronization conflict is detected between Salesforce and Dynamics 365.&lt;/p&gt;

&lt;p&gt;This is handled by a JavaScript function tied to the &lt;strong&gt;OnLoad&lt;/strong&gt; event of the form. It checks the value of a hidden field &lt;em&gt;Status_Sync&lt;/em&gt;. If the value is “Conflict”, a warning banner is displayed at the top of the form, and the field becomes visible for transparency.&lt;/p&gt;

&lt;p&gt;This informs users that a manual review or correction is needed.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;SyncStatus&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;executionContext&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;formContext&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;executionContext&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getFormContext&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
        &lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;status&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;formContext&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getAttribute&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;omnisync_syncstatus&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;getValue&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
        &lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;fieldControl&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;formContext&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getControl&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;omnisync_syncstatus&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

        &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Conflict&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="nx"&gt;formContext&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ui&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;setFormNotification&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
              &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;There has been a Conflict Syncing. Review the fields or contact the Administrator!&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
              &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;WARNING&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
              &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;conflictNotification&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
            &lt;span class="p"&gt;);&lt;/span&gt;
            &lt;span class="nx"&gt;fieldControl&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;setVisible&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="nx"&gt;formContext&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ui&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;clearFormNotification&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;conflictNotification&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
            &lt;span class="nx"&gt;fieldControl&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;setVisible&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Error in displayFormNotification: &lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flq0g96mr7kfnf9bbhasb.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flq0g96mr7kfnf9bbhasb.jpeg" width="800" height="359"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Plugins&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;To maintain feature parity with Salesforce, we implemented a &lt;strong&gt;Dataverse Plugin&lt;/strong&gt; to calculate the &lt;strong&gt;Sales Order Line Number&lt;/strong&gt; on creation.&lt;/p&gt;

&lt;p&gt;Plugins in Dataverse are written in C# and deployed to execute during specific events, like Create or Update. In our case:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The plugin runs in the &lt;strong&gt;Post Operation&lt;/strong&gt; stage (after the record is created).&lt;/li&gt;
&lt;li&gt;It is configured as &lt;strong&gt;Synchronous&lt;/strong&gt;, so the line number is set immediately and is available in subsequent operations.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This mirrors the logic used in Salesforce and ensures consistency across systems.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm8eg4woqlsk29ognkxgn.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm8eg4woqlsk29ognkxgn.jpeg" width="800" height="421"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3f0ajjdlkblw3dylhqjd.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3f0ajjdlkblw3dylhqjd.jpeg" width="771" height="816"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Plugins are powerful tools, and for developers with a C# background, they provide a familiar and efficient way to enforce server-side business logic that goes beyond what low-code flows can achieve.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Logic Apps&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;As with the Salesforce integration, Logic Apps serves as the central iPaaS (Integration Platform as a Service) for Dynamics 365.&lt;/p&gt;

&lt;p&gt;The integration uses a Dataverse trigger that relies on the &lt;strong&gt;Dataverse connector&lt;/strong&gt;’s &lt;em&gt;When a row is added, modified or deleted&lt;/em&gt; event. The action type (Insert, Update, or Delete) is then filtered within Logic Apps and handled by &lt;strong&gt;three separate Logic Apps&lt;/strong&gt;, each dedicated to a specific type of change.&lt;/p&gt;
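&lt;p&gt;Conceptually, each of the three Logic Apps guards its run with a condition on the trigger payload’s &lt;em&gt;SdkMessage&lt;/em&gt; field (the same field the CDC Liquid template reads later in this article). A simplified sketch of such a condition in workflow definition language follows; the action name and exact structure are illustrative, not the project’s actual definition:&lt;/p&gt;

```json
{
  "Check_Action_Type": {
    "type": "If",
    "expression": {
      "and": [
        { "equals": ["@triggerBody()?['SdkMessage']", "Create"] }
      ]
    },
    "actions": {},
    "else": { "actions": {} }
  }
}
```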

&lt;p&gt;&lt;strong&gt;Why This Approach?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This design decision (splitting Logic Apps by action type) was driven by challenges encountered with Dataverse’s webhook-based triggering system:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Some change events were &lt;strong&gt;duplicated&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Others were &lt;strong&gt;missed entirely&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;By assigning &lt;strong&gt;one Logic App per change type (Insert, Update, Delete)&lt;/strong&gt;, the integration became far more stable and predictable. Additionally, this setup makes it easier to integrate Change Data Capture (CDC) logic for forwarding events to Microsoft Fabric via Event Hub.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa7mtklmlohkgiu1qka4d.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa7mtklmlohkgiu1qka4d.jpeg" width="800" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Example: Insert — Order Product (Sales Order Line Item)&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Let’s walk through a typical Logic App that listens for &lt;strong&gt;Order Product&lt;/strong&gt; (Sales Order Line item) inserts:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2r83qns48exx3kfy5puf.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2r83qns48exx3kfy5puf.jpeg" width="800" height="2186"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Trigger&lt;/strong&gt;
The Logic App is triggered when a new &lt;em&gt;Order Products&lt;/em&gt; record is inserted in Dataverse.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration User Check&lt;/strong&gt;
A condition checks if the change originated from an integration user to avoid triggering synchronizations from system to system updates.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Retrieve Related Entities&lt;/strong&gt;
If valid, the Logic App retrieves related records: the product, the parent order, and the synced Salesforce Order from the mapping table.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Parallel Branching&lt;/strong&gt;
Two parallel operations begin: &lt;strong&gt;Send CDC to Fabric&lt;/strong&gt; through &lt;strong&gt;Event Hub&lt;/strong&gt;, and &lt;strong&gt;Create Salesforce Order Product&lt;/strong&gt; (if not already synced).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Conflict Check&lt;/strong&gt;
Before inserting in Salesforce, it checks if a line item already exists (via &lt;em&gt;MasterDataMapping&lt;/em&gt;). If so, it sets the &lt;em&gt;Sync_Status&lt;/em&gt; to conflict and exits.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Salesforce Item Creation&lt;/strong&gt;
If not in conflict, the Salesforce &lt;strong&gt;PriceBookEntry&lt;/strong&gt; is fetched, and the &lt;strong&gt;Order Product&lt;/strong&gt; is created using the Salesforce connector.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Sending CDC&lt;/strong&gt;
Finally, a new CDC record is generated and sent to Fabric to reflect this change in the Lakehouse system.&lt;/li&gt;
&lt;/ol&gt;
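&lt;p&gt;The branching above boils down to one routing decision: skip system-originated changes, flag conflicts, otherwise create. A minimal JavaScript sketch of that logic (the function name and input shape are illustrative, not part of the actual Logic App):&lt;/p&gt;

```javascript
// Illustrative sketch of the Logic App's routing decision for an
// Order Product insert. Names and shapes are assumptions, not the
// real workflow definition.
function decideSyncAction(change) {
  // 1. Skip changes made by the integration user to avoid sync loops.
  if (change.modifiedById === change.integrationUserId) {
    return "skip";
  }
  // 2. If a mapped Salesforce line item already exists (via
  //    MasterDataMapping), mark the record as in conflict and stop.
  if (change.mappingExists) {
    return "conflict";
  }
  // 3. Otherwise, create the Salesforce Order Product and emit CDC.
  return "create";
}
```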

&lt;h4&gt;
  
  
  &lt;strong&gt;Data Transformations&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Just like in the Salesforce integration, data transformations are a key part of the Logic Apps workflows. These ensure each system receives the correct structure and format of data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Liquid Templates for Fabric CDC&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To construct CDC messages for Fabric, Liquid templates were used, just as in the Salesforce integration. These templates convert Dataverse outputs into the JSON schema expected by Fabric’s Event Stream.&lt;/p&gt;

&lt;p&gt;Here’s an example template for syncing a &lt;strong&gt;Retail Store&lt;/strong&gt; entity:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Operation"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.SdkMessage | capitalize }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Entity"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Store"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Values"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"StoreCode"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.cr989_storecode }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"CustomerKey"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content._cr989_account_value }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"StoreTypeID"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.cr989_storetype }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"StoreType"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content._cr989_storetype_label }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"StoreName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.cr989_storename }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"StoreDescription"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.cr989_storedescription }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"StorePhone"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.cr989_storephone }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"StoreFax"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.cr989_storefax }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"AddressLine1"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.cr989_addressline1 }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"EmployeeCount"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.cr989_employeecount }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Latitude"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.address1_latitude }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Longitude"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.address1_longitude }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"CreatedDate"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.createdon }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"UpdatedDate"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.modifiedon }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"D365Id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ content.ItemInternalId }}"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"CreatedDate"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ "&lt;/span&gt;&lt;span class="err"&gt;now&lt;/span&gt;&lt;span class="s2"&gt;" | date: "&lt;/span&gt;&lt;span class="err"&gt;%Y-%m-%d&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;%H:%M&lt;/span&gt;&lt;span class="s2"&gt;" }}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"UpdatedDate"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"{{ "&lt;/span&gt;&lt;span class="err"&gt;now&lt;/span&gt;&lt;span class="s2"&gt;" | date: "&lt;/span&gt;&lt;span class="err"&gt;%Y-%m-%d&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;%H:%M&lt;/span&gt;&lt;span class="s2"&gt;" }}"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This lets us dynamically construct messages that Fabric can ingest through Spark Streaming, keeping the transformation logic &lt;strong&gt;declarative&lt;/strong&gt; and &lt;strong&gt;lightweight&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fny22388dpeqtl7ipa9ib.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fny22388dpeqtl7ipa9ib.jpeg" width="800" height="328"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Direct Mapping via Connectors&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;While Liquid templates are used for constructing CDC payloads to Fabric, most of the remaining data transformations are done directly via &lt;strong&gt;Logic Apps connectors&lt;/strong&gt;, particularly when syncing between Dataverse and Salesforce.&lt;/p&gt;

&lt;p&gt;Because these connectors expose the full set of fields and metadata, mapping becomes a matter of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Mapping fields from source to destination&lt;/li&gt;
&lt;li&gt;Applying &lt;strong&gt;expression functions&lt;/strong&gt; when necessary (e.g., formatting dates, converting units)&lt;/li&gt;
&lt;li&gt;Structuring nested data (e.g., lookups or relationships)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This approach keeps the integration visual, low-code, and easy to maintain, especially when no complex transformation logic is required.&lt;/p&gt;
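&lt;p&gt;For instance, expression functions of this kind are typical in such mappings (the &lt;em&gt;cr989_&lt;/em&gt; field names come from this project’s Retail Store table; treat the exact expressions as illustrative):&lt;/p&gt;

```
// Date formatting
formatDateTime(triggerBody()?['modifiedon'], 'yyyy-MM-dd HH:mm')

// Default value for optional fields
coalesce(triggerBody()?['cr989_storefax'], '')

// Building a composite key
concat(triggerBody()?['cr989_storecode'], '-', triggerBody()?['cr989_storename'])
```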

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhl230z9msmbpvl1e800a.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhl230z9msmbpvl1e800a.jpeg" width="800" height="346"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Deletions&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Handling deletions in Dynamics 365 introduces a particular challenge: you don’t automatically get information about &lt;strong&gt;who deleted the record&lt;/strong&gt; in the Logic App trigger (i.e., there’s no &lt;em&gt;DeletedBy&lt;/em&gt; field in the payload).&lt;/p&gt;

&lt;p&gt;This creates a problem, especially in a system like OmniSync, where we need to distinguish between:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Deletions made by  &lt;strong&gt;users&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Deletions triggered by &lt;strong&gt;integration logic&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Without that information, the system could potentially enter a &lt;strong&gt;sync loop&lt;/strong&gt; or process deletions unnecessarily. To prevent this, all Logic Apps include a consistent check to determine whether the change came from an &lt;strong&gt;Integration User&lt;/strong&gt;.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Solution: Auditing via Audit Table&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;To work around this limitation, the &lt;strong&gt;Dataverse Audit table&lt;/strong&gt; was used. By enabling auditing on the entities you want to monitor, you can retrieve historical changes, including who performed a &lt;strong&gt;Delete&lt;/strong&gt;, &lt;strong&gt;Insert&lt;/strong&gt;, or &lt;strong&gt;Update&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;The process works like this:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Enable auditing&lt;/strong&gt; for each table where you want to track deletes or updates.&lt;/li&gt;
&lt;li&gt;After a deletion trigger fires, query the &lt;strong&gt;Audit table&lt;/strong&gt; to identify the actor behind the change.&lt;/li&gt;
&lt;li&gt;If the user is an integration identity, the Logic App will skip further action to avoid recursion.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This same pattern is also applied to Insert and Update operations, helping reinforce idempotency and cycle prevention across the system.&lt;/p&gt;
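&lt;p&gt;As a sketch, a FetchXML query along these lines can retrieve the latest delete event for a record from the audit table (attribute names follow the standard Dataverse &lt;em&gt;audit&lt;/em&gt; entity; the GUID placeholder and the operation code should be adapted to your environment):&lt;/p&gt;

```xml
<!-- Latest audit row for a deleted record; operation 3 = Delete -->
<fetch top="1">
  <entity name="audit">
    <attribute name="userid" />
    <attribute name="operation" />
    <attribute name="createdon" />
    <filter>
      <condition attribute="objectid" operator="eq" value="{record-guid}" />
      <condition attribute="operation" operator="eq" value="3" />
    </filter>
    <order attribute="createdon" descending="true" />
  </entity>
</fetch>
```

&lt;p&gt;If the returned &lt;em&gt;userid&lt;/em&gt; matches the integration user, the Logic App exits early.&lt;/p&gt;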

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feg6sazevgzai33xcozrc.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feg6sazevgzai33xcozrc.jpeg" width="800" height="540"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Viewing Audit Logs&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;You can quickly inspect audit logs using the &lt;strong&gt;Audit Summary View&lt;/strong&gt; in Power Platform’s admin interface.&lt;/p&gt;

&lt;p&gt;For more detailed filtering and querying, &lt;strong&gt;FetchXML&lt;/strong&gt; is a better option; we’ll explore it in an upcoming section.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzsq7wibdk1o2dtnxkmy1.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzsq7wibdk1o2dtnxkmy1.jpeg" width="800" height="346"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;OmniSync Configuration Table&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;To avoid repetitive lookups and keep Logic Apps as streamlined as possible, a &lt;strong&gt;central configuration table&lt;/strong&gt; was introduced in Dataverse: &lt;em&gt;OmniSyncConfiguration&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;This table holds key constants and settings used across the sync process, for example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Dynamics 365 Integration User ID&lt;/li&gt;
&lt;li&gt;Salesforce Standard Price Book ID&lt;/li&gt;
&lt;li&gt;Other sync-related identifiers or constants that would otherwise require querying each time&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Instead of hardcoding these values into Logic Apps (which would be harder to maintain), they are stored once and retrieved wherever needed via simple lookups.&lt;/p&gt;

&lt;p&gt;This approach:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Keeps the logic clean and centralized&lt;/li&gt;
&lt;li&gt;Improves performance by reducing connector calls&lt;/li&gt;
&lt;li&gt;Makes the solution easier to maintain or extend in the future&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkdhq8smuxokvxf97ywuk.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkdhq8smuxokvxf97ywuk.jpeg" width="800" height="332"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Tools Used in Dynamics 365 Integration
&lt;/h3&gt;

&lt;p&gt;To support the setup, customization, and data management of Dynamics 365 during the OmniSync project, several key tools were used, most notably from the &lt;strong&gt;XrmToolBox&lt;/strong&gt;.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;XrmToolBox&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;XrmToolBox is a Windows-based tool that connects to Dataverse and offers over 30 built-in plugins, along with more than 300 additional plugins available for installation, supporting tasks ranging from customization and configuration to data manipulation.&lt;/p&gt;

&lt;p&gt;It’s an essential tool for speeding up admin and integration tasks, especially in environments like Dynamics 365 where multiple entities and relationships are involved.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;FetchXML Builder&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This XrmToolBox plugin was incredibly helpful for exploring Dataverse tables quickly. With &lt;strong&gt;FetchXML&lt;/strong&gt;, you can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;View rows and filter by field values&lt;/li&gt;
&lt;li&gt;Select only relevant columns&lt;/li&gt;
&lt;li&gt;Inspect relationships between entities visually&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;FetchXML&lt;/strong&gt; is an XML-based query language for Dataverse that allows users to retrieve, filter, and sort data. It’s commonly used in Power Platform tools for advanced data querying and working with entity relationships.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;While Power Apps Studio offers similar features, FetchXML Builder is far faster and more powerful when you’re exploring the data model or debugging issues.&lt;/p&gt;
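&lt;p&gt;For example, a query of this shape lists order lines together with their parent order (entity and attribute names are the standard Dataverse ones; adjust to your schema):&lt;/p&gt;

```xml
<fetch top="5">
  <entity name="salesorderdetail">
    <attribute name="quantity" />
    <attribute name="priceperunit" />
    <link-entity name="salesorder" from="salesorderid" to="salesorderid" alias="order">
      <attribute name="name" />
    </link-entity>
  </entity>
</fetch>
```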

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AcS5ZFSWI3iOIYYJURE13oQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AcS5ZFSWI3iOIYYJURE13oQ.jpeg" width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;SQL 4 CDS&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For more advanced queries, &lt;strong&gt;SQL 4 CDS&lt;/strong&gt; lets you write and run familiar &lt;strong&gt;SQL-style queries&lt;/strong&gt; (SELECT, UPDATE, INSERT, DELETE) directly against your Dataverse instance.&lt;/p&gt;

&lt;p&gt;This is perfect when working with complex joins or bulk-edit scenarios, and ideal if you’re already comfortable with SQL.&lt;/p&gt;
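&lt;p&gt;For example, a join-plus-aggregate query of this shape is easy to express in SQL 4 CDS but tedious through the standard UI (standard Dataverse entity names; adapt to your schema):&lt;/p&gt;

```sql
-- Total value per order, summed across its order lines
SELECT o.name,
       SUM(d.priceperunit * d.quantity) AS total
FROM   salesorder AS o
       JOIN salesorderdetail AS d
         ON d.salesorderid = o.salesorderid
GROUP  BY o.name
ORDER  BY total DESC;
```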

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AzIwI-yUdR1g0AlHTSyvLKw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AzIwI-yUdR1g0AlHTSyvLKw.jpeg" width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configuration Migration Tool (CMT)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To prepare the system for deployment and migration across environments, we used the Configuration Migration Tool (CMT) via the &lt;strong&gt;Power Platform CLI&lt;/strong&gt; using the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight console"&gt;&lt;code&gt;&lt;span class="go"&gt;pac tool cmt
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This tool lets you:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Export schema and data from your current Dataverse environment&lt;/li&gt;
&lt;li&gt;Re-import that structure into a new one (e.g., a QA or Production instance)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This was essential for &lt;strong&gt;seeding initial values&lt;/strong&gt; such as Stores and Accounts, especially for keeping the system aligned with Salesforce and Fabric.&lt;/p&gt;

&lt;p&gt;⚠️ While this project used a single environment, in a multi-environment setup (Dev/QA/Prod) each system would need coordinated seed data to prevent mismatches.&lt;/p&gt;
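&lt;p&gt;To make that coordination concrete, the following sketch (hypothetical Python, not part of the project tooling; the record shapes are invented) compares seed records exported from two environments and reports the mismatches a tool like CMT would need to resolve:&lt;/p&gt;

```python
# Hypothetical sketch: detect seed-data drift between two environments
# (e.g., Dev and QA). Record shapes are illustrative only.

def diff_seed_data(source, target, key):
    """Return (keys missing from target, keys whose records differ)."""
    src = {row[key]: row for row in source}
    tgt = {row[key]: row for row in target}
    missing = sorted(set(src) - set(tgt))
    changed = sorted(k for k in set(src) & set(tgt) if src[k] != tgt[k])
    return missing, changed

dev_stores = [
    {"StoreCode": "S001", "Name": "Central"},
    {"StoreCode": "S002", "Name": "Harbour"},
]
qa_stores = [
    {"StoreCode": "S001", "Name": "Central"},
]

missing, changed = diff_seed_data(dev_stores, qa_stores, "StoreCode")
print(missing)  # ['S002'] -> seed S002 into QA before enabling sync
print(changed)  # []
```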

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F940%2F1%2AjqF_v8EcnurWn7x1MV_gDw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F940%2F1%2AjqF_v8EcnurWn7x1MV_gDw.png" width="800" height="657"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There are plenty of other plugins within XrmToolBox that can handle this task too, so you have lots of choices here.&lt;/p&gt;

&lt;h3&gt;
  
  
  Security
&lt;/h3&gt;

&lt;p&gt;As mentioned in earlier parts of this series, an Integration User pattern is used across platforms to avoid synchronization loops and isolate system activity from human interaction.&lt;/p&gt;

&lt;p&gt;In Dynamics 365, this same approach was applied. A dedicated &lt;strong&gt;integration user account&lt;/strong&gt; was created and assigned a &lt;strong&gt;custom security role&lt;/strong&gt; called Integration Role. This role was created using the &lt;strong&gt;least privilege principle&lt;/strong&gt;, granting only the necessary permissions to access and modify the required entities (e.g., Accounts, Orders, Products, Retail Stores).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Al-r0UNDr-C82A6rRHlBhRw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Al-r0UNDr-C82A6rRHlBhRw.jpeg" width="800" height="360"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Client Applications (Microsoft Entra)&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Two client applications were registered in &lt;strong&gt;Microsoft Entra ID&lt;/strong&gt; to handle authentication for external services:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Azure Functions — Salesforce Updates&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This app is used by the Azure Functions that push updates to Salesforce (CDC responses), and was covered in detail in the previous article on Salesforce.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;GitHub CI/CD Integration&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This app is used to authenticate GitHub workflows during the CI/CD pipeline for deploying changes to Dynamics 365.&lt;/p&gt;

&lt;p&gt;It will be described in more detail in the next section.&lt;/p&gt;

&lt;h3&gt;
  
  
  Git CI/CD Solutions
&lt;/h3&gt;

&lt;p&gt;All Dynamics 365 customizations and configurations are stored in a &lt;strong&gt;Power Platform Solution&lt;/strong&gt;, which is the standard packaging format for Dataverse-based apps. This solution format allows you to version, export, and deploy your app components consistently across environments.&lt;/p&gt;

&lt;p&gt;The source code for the solution is tracked in &lt;strong&gt;GitHub&lt;/strong&gt;, enabling &lt;strong&gt;CI/CD automation&lt;/strong&gt;.&lt;/p&gt;

&lt;h4&gt;
  
  
  Deployment Pipeline
&lt;/h4&gt;

&lt;p&gt;A basic CI/CD setup was created using &lt;strong&gt;GitHub Actions&lt;/strong&gt;, following a simple flow:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Export to Git&lt;/strong&gt;:
A manually triggered action exports the solution from Dynamics 365 to the GitHub repository&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pull Request Workflow&lt;/strong&gt;:
Changes are reviewed and merged into the main (or master) branch via a pull request&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Deployment to Production&lt;/strong&gt;:
Once merged, GitHub Actions deploys the solution to a target environment (e.g., Production) using managed solution packaging&lt;/li&gt;
&lt;/ol&gt;
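&lt;p&gt;As a sketch, the deployment step of such a pipeline can be expressed with the public &lt;strong&gt;microsoft/powerplatform-actions&lt;/strong&gt; GitHub Actions. Secret names and paths below are placeholders, and this is an illustration of the flow rather than the project's actual workflow file:&lt;/p&gt;

```yaml
# Sketch: pack the solution from source and import it as Managed.
name: deploy-solution
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v4
      - name: Pack managed solution
        uses: microsoft/powerplatform-actions/pack-solution@v1
        with:
          solution-folder: solution-src
          solution-file: out/Solution_managed.zip
          solution-type: Managed
      - name: Import into target environment
        uses: microsoft/powerplatform-actions/import-solution@v1
        with:
          environment-url: ${{ secrets.ENVIRONMENT_URL }}
          app-id: ${{ secrets.CLIENT_ID }}
          client-secret: ${{ secrets.CLIENT_SECRET }}
          tenant-id: ${{ secrets.TENANT_ID }}
          solution-file: out/Solution_managed.zip
```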

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AIQLOM2-IzNAKb0axeEBIKQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AIQLOM2-IzNAKb0axeEBIKQ.jpeg" width="800" height="305"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AeFgmx-SRNIZTRlBjBJUJrA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AeFgmx-SRNIZTRlBjBJUJrA.jpeg" width="800" height="364"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;⚠️ Important Notes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;This pipeline &lt;strong&gt;requires a non-trial environment&lt;/strong&gt;, as GitHub Actions create &lt;strong&gt;managed solutions&lt;/strong&gt;, which aren’t supported in trial instances.&lt;/li&gt;
&lt;li&gt;These actions &lt;strong&gt;do not include data&lt;/strong&gt; by default. If you need to migrate seed data (e.g., stores, categories, users), use a tool like the &lt;strong&gt;Configuration Migration Tool (CMT)&lt;/strong&gt; described earlier.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Monitoring&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;As detailed in the Salesforce article, all OmniSync components are monitored through &lt;strong&gt;Azure Monitor&lt;/strong&gt;, with centralized logs and metrics available via a &lt;strong&gt;Log Analytics Workspace&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;This approach offers both platform-native observability and extended diagnostics when needed.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Logic Apps Monitoring&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Logic Apps include built-in &lt;strong&gt;run history&lt;/strong&gt; tracking. From the Azure Portal, you can easily review:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Trigger invocations&lt;/li&gt;
&lt;li&gt;Action steps&lt;/li&gt;
&lt;li&gt;Success/failure details&lt;/li&gt;
&lt;li&gt;Execution timings&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is particularly useful for spotting transient failures, checking retry logic, or auditing historical executions.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AWfi-JM1G83KMUPH01TaY4Q.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AWfi-JM1G83KMUPH01TaY4Q.jpeg" width="800" height="348"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Plugin Trace Viewer (Power Platform)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To monitor plugin executions (like those used for setting Sales Order Line Numbers), we used the Plugin Trace Viewer in Power Platform.&lt;/p&gt;

&lt;p&gt;This built-in viewer lets you:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Inspect plugin events&lt;/li&gt;
&lt;li&gt;See exceptions and execution order&lt;/li&gt;
&lt;li&gt;Trace variable values and processing time&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2A_szskujjljRGPeQIawBQfg.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2A_szskujjljRGPeQIawBQfg.jpeg" width="800" height="360"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AVwz8LvJcNwW7bvHDEvavDQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AVwz8LvJcNwW7bvHDEvavDQ.jpeg" width="800" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Plugin Trace Viewer (XrmToolBox)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For more advanced filtering and visibility, we also used the XrmToolBox Plugin Trace Viewer. Compared to the default Power Platform view, this plugin allows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Full-text search across logs&lt;/li&gt;
&lt;li&gt;Time range filters&lt;/li&gt;
&lt;li&gt;Easy export and comparison&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Aya_GrwGU0M9cml3O-wIoNw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Aya_GrwGU0M9cml3O-wIoNw.jpeg" width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Application Insights Limitation (Trial Environment)&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Since this OmniSync instance used a &lt;strong&gt;Trial Dataverse environment&lt;/strong&gt;, integration with &lt;strong&gt;Application Insights&lt;/strong&gt; wasn’t supported.&lt;/p&gt;

&lt;p&gt;In a &lt;strong&gt;production&lt;/strong&gt; or &lt;strong&gt;managed solution&lt;/strong&gt; scenario, Application Insights could be used for deeper monitoring, telemetry, and diagnostics, including tracking performance metrics, usage patterns, and exceptions.&lt;/p&gt;

&lt;h3&gt;
  
  
  Lessons Learned
&lt;/h3&gt;

&lt;p&gt;Working with Dynamics 365 as part of the OmniSync architecture highlighted several important technical and practical takeaways, especially in the context of integrating with external systems like Salesforce and Fabric.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;UI Customization:&lt;/strong&gt; one of the best surprises in this project was how easy it was to &lt;strong&gt;customize screens and forms&lt;/strong&gt; in Dynamics 365 using &lt;strong&gt;PowerApps&lt;/strong&gt; and &lt;strong&gt;Dataverse&lt;/strong&gt;. Creating entirely new entities like &lt;em&gt;Retail Stores&lt;/em&gt; from scratch, complete with custom fields, forms, and navigation menus, was entirely visual, fast to iterate, and required no code. The Power Platform’s integration with Dataverse makes it seamless to link UI components directly to backend tables and to define user flows with precision, all from the same interface.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Permissions:&lt;/strong&gt; even with a minimal integration user role, &lt;strong&gt;fine-tuning&lt;/strong&gt; permissions over time was necessary, especially for less obvious operations such as reading dynamic product properties or creating order-related entities. For example, the need to enable privileges like &lt;em&gt;prvReadDynamicProperty&lt;/em&gt; wasn’t immediately apparent. Despite this, a &lt;strong&gt;least-privilege model&lt;/strong&gt; for the integration user proved to be a solid approach, although it did require iteration and updates whenever permission-related errors occurred.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Complex Lookups:&lt;/strong&gt; for simple entities, Logic Apps with basic field mapping worked well. But more complex scenarios like Sales Order syncing required &lt;strong&gt;multi-step lookups&lt;/strong&gt;: fetching previously synced entities from the &lt;em&gt;MasterDataMapping&lt;/em&gt; table, resolving related values like Price Book entries, and following foreign key references such as Account or Product IDs. Trying to avoid these lookups often led to a loss of accuracy or sync integrity. In the end, it was better to embrace complexity where necessary and structure the workflows accordingly.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dataverse plugins:&lt;/strong&gt; provided a robust and easy way, through C#, to implement logic (like calculating line numbers) where out-of-the-box workflows fell short.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tools: XrmToolBox&lt;/strong&gt; and its plugins &lt;strong&gt;FetchXML Builder&lt;/strong&gt; and &lt;strong&gt;SQL 4 CDS&lt;/strong&gt; dramatically accelerated development and debugging. Without them, exploring relationships, building queries, or inspecting tables would have taken much longer through standard web interfaces.&lt;/li&gt;
&lt;/ul&gt;
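&lt;p&gt;The multi-step lookup pattern from the list above can be sketched as follows. This is hypothetical Python for illustration only; the real implementation lives in Logic Apps, and the table and field names here are invented:&lt;/p&gt;

```python
# Hypothetical sketch of the MasterDataMapping lookup pattern:
# every Salesforce foreign key is resolved to its Dynamics 365
# counterpart before a Sales Order is created.

master_data_mapping = {
    ("Account", "SF-ACC-001"): "d365-account-guid-1",
    ("Product", "SF-PRD-042"): "d365-product-guid-9",
}

def resolve(entity, external_id):
    """Resolve an external (Salesforce) ID to its D365 counterpart."""
    try:
        return master_data_mapping[(entity, external_id)]
    except KeyError:
        # Fail loudly: creating the order with an unresolved reference
        # would silently break sync integrity.
        raise LookupError(f"{entity} {external_id} not yet synced")

sales_order = {
    "customerid": resolve("Account", "SF-ACC-001"),
    "productid": resolve("Product", "SF-PRD-042"),
}
print(sales_order)
```

&lt;p&gt;Failing fast on an unmapped reference, rather than skipping the lookup, is what preserves the accuracy described above.&lt;/p&gt;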

&lt;h3&gt;
  
  
  Final Thoughts and What’s Next
&lt;/h3&gt;

&lt;p&gt;This concludes the current series of articles. While this marks the end for now, there may be a near-future installment expanding on the topics covered, potentially including a &lt;strong&gt;video&lt;/strong&gt; and an &lt;strong&gt;integration with SAP.&lt;/strong&gt; Stay tuned for updates!&lt;/p&gt;

&lt;p&gt;👀 &lt;strong&gt;Follow me here on Medium&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;💻 &lt;strong&gt;Source code:&lt;/strong&gt; &lt;a href="https://github.com/zodraz/omnisync-d365" rel="noopener noreferrer"&gt;https://github.com/zodraz/omnisync-d365&lt;/a&gt;&lt;/p&gt;

</description>
      <category>logicapps</category>
      <category>integration</category>
      <category>dataverse</category>
      <category>ipaas</category>
    </item>
    <item>
      <title>OmniSync: Integrating Salesforce with Microsoft Fabric and Dynamics 365 (Part 3)</title>
      <dc:creator>Abel</dc:creator>
      <pubDate>Tue, 06 May 2025 12:40:12 +0000</pubDate>
      <link>https://dev.to/tarantarantino/omnisync-integrating-salesforce-with-microsoft-fabric-and-dynamics-365-part-3-4mk2</link>
      <guid>https://dev.to/tarantarantino/omnisync-integrating-salesforce-with-microsoft-fabric-and-dynamics-365-part-3-4mk2</guid>
      <description>&lt;p&gt;A real-world walkthrough of custom objects, outbound messages, CDC, and Logic Apps used to sync Salesforce with Fabric and Dynamics 365.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fga9938s5uwmzc18oqj5t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fga9938s5uwmzc18oqj5t.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Posts in this Series
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://medium.com/@tarantarantino/omnisync-a-real-world-architecture-for-syncing-salesforce-d365-and-fabric-in-near-real-time-17bdfb29469e" rel="noopener noreferrer"&gt;OmniSync: A Real-World Architecture for Syncing Salesforce, D365, and Fabric in Near Real-Time (Part 1)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://medium.com/@tarantarantino/omnisync-near-real-time-lakehouse-spark-streaming-and-power-bi-in-microsoft-fabric-part-2-6f3177f0931a" rel="noopener noreferrer"&gt;OmniSync: Near Real-Time Lakehouse, Spark Streaming, and Power BI in Microsoft Fabric (Part 2)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;OmniSync: Integrating Salesforce with Microsoft Fabric and Dynamics 365 (Part 3) &lt;em&gt;(This Post)&lt;/em&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://medium.com/@tarantarantino/omnisync-dynamics-365-integration-with-salesforce-and-fabric-part-4-0ce408fe435a" rel="noopener noreferrer"&gt;&lt;em&gt;OmniSync:&lt;/em&gt; Dynamics 365 Integration with Salesforce and Fabric (Part 4)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Introduction&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;This is Part 3 of the OmniSync series, where we deep dive into how &lt;strong&gt;Salesforce&lt;/strong&gt; was integrated with Microsoft Fabric and Dynamics 365 for near real-time data synchronization.&lt;/p&gt;

&lt;p&gt;While previous parts focused on the overall architecture and the Fabric implementation, this post explains Salesforce-specific challenges and solutions, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;How custom objects and fields were used to fit a retail oriented model&lt;/li&gt;
&lt;li&gt;What tricks were needed to work around Salesforce limitations&lt;/li&gt;
&lt;li&gt;How Outbound Messages, Change Data Capture, and Logic Apps were combined to trigger and route changes&lt;/li&gt;
&lt;li&gt;And finally, how this all connects to the rest of the OmniSync ecosystem&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Salesforce and Entity Mapping&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;OmniSync uses Salesforce as the retail-facing front end, but instead of relying on a highly specialized vertical like &lt;em&gt;Consumer Goods Cloud&lt;/em&gt;, we opted for a more flexible approach: starting from the &lt;strong&gt;Standard Salesforce model&lt;/strong&gt; and tailoring it to fit retail and integration needs.&lt;/p&gt;

&lt;p&gt;Since this is a Proof of Concept (PoC), the focus was on simplicity and interoperability, particularly with &lt;strong&gt;Dynamics 365&lt;/strong&gt; and &lt;strong&gt;Microsoft Fabric&lt;/strong&gt;. That meant reshaping and mapping a core set of entities without introducing unnecessary complexity.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Core Entities and Mapping&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;A lightweight model centered around real-world retail flows was defined:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Accounts&lt;/strong&gt;: Representing customer companies&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Products&lt;/strong&gt;: Standard Salesforce products, enhanced with custom attributes&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Orders and Order Lines&lt;/strong&gt;: Standard Salesforce objects that reflect actual retail sales orders&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Retail Stores&lt;/strong&gt;: Custom objects representing physical store locations&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Geography&lt;/strong&gt;: Used in addresses of standard Salesforce objects and expanded in Fabric&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Currencies and Categories:&lt;/strong&gt; Mapped internally for simplicity&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Changes and Tradeoffs&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;A brand new custom &lt;strong&gt;Retail Store&lt;/strong&gt; object (Retail_Store__c) was created to represent physical store locations&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Store Assortments&lt;/strong&gt; (which products are available in which stores) were &lt;em&gt;excluded&lt;/em&gt; from this version and treated as out of scope&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Orders&lt;/strong&gt; were created directly, without going through Opportunities or Leads, to simplify the sales flow, although that business workflow can still be used if desired&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Products&lt;/strong&gt; were enhanced with attributes like &lt;strong&gt;Units of Measure&lt;/strong&gt; (Size, Weight), &lt;strong&gt;Brand&lt;/strong&gt;, &lt;strong&gt;Color&lt;/strong&gt;, and lifecycle fields such as &lt;strong&gt;Available For Sale Date&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Only a &lt;strong&gt;Standard Price Book&lt;/strong&gt; was used, which includes both the &lt;strong&gt;selling price&lt;/strong&gt; and a &lt;strong&gt;cost price&lt;/strong&gt; field&lt;/li&gt;
&lt;li&gt;Product &lt;strong&gt;Categories&lt;/strong&gt; relied on the existing &lt;strong&gt;Product Family&lt;/strong&gt; structure, without using subcategories like those defined in Fabric&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Discounts&lt;/strong&gt;, promotions, and complex pricing logic were excluded for simplicity&lt;/li&gt;
&lt;li&gt;Although the ecosystem supports multiple currencies, we standardized everything on &lt;strong&gt;EUR&lt;/strong&gt; to avoid conversion logic&lt;/li&gt;
&lt;li&gt;Many standard Salesforce fields were left &lt;strong&gt;optional or unused&lt;/strong&gt;, and excluded from synchronization logic&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This approach helped streamline the implementation and keep the data model focused. It also ensured that only relevant fields were tracked and synchronized between systems, reducing noise and increasing maintainability.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Custom Fields&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;To support the integration logic and ensure proper mapping across systems, we added several custom fields to key Salesforce entities. Most of these fields help enforce unique codes, synchronization statuses, or carry metadata not available in the standard schema.&lt;/p&gt;

&lt;p&gt;These fields are used mainly for mapping and syncing with external systems, especially in cases where a specific code or identifier must be sent.&lt;/p&gt;

&lt;h4&gt;
  
  
  Field Overview per Entity
&lt;/h4&gt;

&lt;h4&gt;
  
  
  ✅ &lt;strong&gt;Account&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Email:&lt;/strong&gt; Reference for communication&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SyncStatus__c:&lt;/strong&gt; Sync conflict state&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  ✅ &lt;strong&gt;Product&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Size__c, SizeUnitOfMeasure__c, SizeUnitOfMeasureId__c:&lt;/strong&gt; Physical dimension details and their unit of measure (name + identifier)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Weight__c, WeightUnitOfMeasure__c, WeightUnitOfMeasureId__c:&lt;/strong&gt; Weight and its unit of measure (name + identifier)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Brand__c, Class__c, ClassId__c, Color__c, ColorId__c:&lt;/strong&gt; Classification data used for categorization (name + identifier)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AvailableForSaleDate__c, StopForSaleDate__c:&lt;/strong&gt; Lifecycle sale date indicators&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SyncStatus__c:&lt;/strong&gt; Sync conflict state&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  ✅ &lt;strong&gt;Price Book&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;CostPrice__c: Price at which the product was bought&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  ✅ &lt;strong&gt;Order&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;SyncStatus__c:&lt;/strong&gt; Sync conflict state&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;RetailStore__c:&lt;/strong&gt; Connects the order with a physical store&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  ✅&lt;strong&gt;Order Line (Order Product)&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;OrderItemLineNumber__c:&lt;/strong&gt; Automatically numbered line item&lt;/li&gt;
&lt;/ul&gt;
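&lt;p&gt;The auto-numbering behind this field can be sketched as follows. This is illustrative Python only; in the project the logic runs in platform code, and the record shapes are hypothetical:&lt;/p&gt;

```python
# Illustrative sketch: assign sequential line numbers per order,
# the behavior backing a field like OrderItemLineNumber__c.

def assign_line_numbers(order_lines):
    """Number order lines 1..n independently within each order."""
    counters = {}
    for line in order_lines:
        order_id = line["order_id"]
        counters[order_id] = counters.get(order_id, 0) + 1
        line["line_number"] = counters[order_id]
    return order_lines

lines = [{"order_id": "O1"}, {"order_id": "O1"}, {"order_id": "O2"}]
print([l["line_number"] for l in assign_line_numbers(lines)])  # [1, 2, 1]
```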

&lt;h4&gt;
  
  
  ✅&lt;strong&gt;Retail Store (Retail_Store__c custom object)&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;StoreCode__c:&lt;/strong&gt; A unique identifier for the retail store used for system integration&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Name:&lt;/strong&gt; The name of the retail store as displayed to users&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Description:&lt;/strong&gt; A free-text field providing additional information about the store&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;EmployeeCount:&lt;/strong&gt; Total number of employees currently working at the store location&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fax:&lt;/strong&gt; The store’s fax number&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Phone:&lt;/strong&gt; The store’s main contact phone number&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;StoreType__c:&lt;/strong&gt; (Picklist) Categorization of the store type (e.g., Store, Catalog, Online, Reseller)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;StoreTypeId__c:&lt;/strong&gt; Identifier for the StoreType__c field&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Account__c:&lt;/strong&gt; (Lookup) A reference to the Account associated with this store.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Address:&lt;/strong&gt; The store’s physical address, including street, city, state, postal code, and country&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SyncStatus__c:&lt;/strong&gt; Sync conflict state&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Triggers&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;We use Salesforce triggers in two main ways, depending on the use case:&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Custom Code Triggers (Apex)&lt;/strong&gt;
&lt;/h4&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;Apex triggers&lt;/em&gt;&lt;/strong&gt; &lt;em&gt;enable you to perform custom actions before or after events to records in Salesforce, such as insertions, updates, or deletions. Just like database systems support triggers, Apex provides trigger support for managing records.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;These are used for critical integration logic where visual tools aren’t enough. For example:&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Populating External System References:&lt;/strong&gt; Triggers automatically fill in fields like external IDs (for colors, units of measure, store types) needed by downstream systems like Dynamics 365 and Fabric.&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Deletion Workaround:&lt;/strong&gt; Since Flows cannot react to record deletions in Salesforce (more on this later), we use Apex triggers as a workaround.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Declarative (No-Code) Triggers via Flows&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;For simpler automation (like reacting when a record is inserted into an entity), we use Salesforce Flows. These let us configure behavior visually without writing Apex code, making maintenance easier when business rules change.&lt;/p&gt;

&lt;p&gt;Together, these approaches allow for a flexible and maintainable integration logic mixing low-code with code when necessary.&lt;/p&gt;

&lt;p&gt;The following is an example of a trigger that populates the ID value when a color is selected from the Color picklist, using a map of option values:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight apex"&gt;&lt;code&gt;&lt;span class="n"&gt;trigger&lt;/span&gt; &lt;span class="n"&gt;ColorTrigger&lt;/span&gt; &lt;span class="n"&gt;on&lt;/span&gt; &lt;span class="n"&gt;Product2&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;before&lt;/span&gt; &lt;span class="k"&gt;insert&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;before&lt;/span&gt; &lt;span class="k"&gt;Update&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
&lt;span class="n"&gt;map&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;integer&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;oppMap&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="n"&gt;map&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;integer&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Azure'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Black'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Blue'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Brown'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Purple'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Red'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Silver'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;7&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;White'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;8&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Orange'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;9&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Pink'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Grey'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;11&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Silver Grey'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;12&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Yellow'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;13&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Green'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;14&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Gold'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;15&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s2"&gt;Transparent'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;16&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

 &lt;span class="nf"&gt;for&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Product2&lt;/span&gt; &lt;span class="nl"&gt;prod&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;trigger&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="py"&gt;new&lt;/span&gt; &lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nf"&gt;if&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;prod&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="py"&gt;Color__c&lt;/span&gt; &lt;span class="o"&gt;!=&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
       &lt;span class="n"&gt;prod&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="py"&gt;ColorId__c&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;oppMap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;prod&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="py"&gt;Color__c&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
     &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;UI Customization: Lightning Pages &amp;amp; Field Visibility&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Since OmniSync also serves as a learning exercise in customizing Salesforce for real-world use, we made a few UI enhancements to improve usability and align with integration needs.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Lightning Record Pages&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The default page for the &lt;strong&gt;Retail Store&lt;/strong&gt; object (Retail_Store__c) was customized as a Lightning Record Page using the &lt;strong&gt;Lightning App Builder&lt;/strong&gt;:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Lightning Record Pages&lt;/strong&gt; are configurable layouts in Salesforce that define how records are displayed to users. Built using the &lt;strong&gt;Lightning App Builder&lt;/strong&gt;, they let admins arrange components, tabs, fields, and related lists without writing code, making it easy to tailor the UI for different roles, devices, and business scenarios.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;ul&gt;
&lt;li&gt;A header highlighting the Retail Store code and its name (shown read-only).&lt;/li&gt;
&lt;li&gt;A detail section with the Retail Store Information fields.&lt;/li&gt;
&lt;li&gt;A detail section with the Address Information fields.&lt;/li&gt;
&lt;li&gt;Integration-related fields such as SyncStatus__c, placed at the top of the page so that any synchronization conflict is immediately visible.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Below is an example of the &lt;strong&gt;Retail Store&lt;/strong&gt; screen, which was built entirely from scratch using Lightning App Builder.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F096fae290x6ms71danln.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F096fae290x6ms71danln.jpeg"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And its results when published:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjbm55ehhsq9t6ca7txsh.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjbm55ehhsq9t6ca7txsh.webp" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Conditional Field Visibility&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Salesforce allows low-code visibility rules to control which fields appear under specific conditions. No Apex or Flow needed.&lt;/p&gt;

&lt;p&gt;For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The SyncStatus__c field was made visible only when its value indicates a Conflict, and a warning icon was added to draw attention to it.&lt;/li&gt;
&lt;/ul&gt;
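&lt;p&gt;Conceptually, such a visibility rule boils down to a single predicate on the record. The sketch below is plain JavaScript for illustration only; the field name comes from the article, while the "Conflict" literal is an assumption about the picklist value:&lt;/p&gt;

```javascript
// Illustrative predicate behind the component visibility rule:
// show SyncStatus__c (with its warning icon) only when the record
// is in a conflicted state. The "Conflict" literal is an assumption.
function shouldShowSyncStatus(record) {
  return record.SyncStatus__c === "Conflict";
}
```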

&lt;p&gt;The image below shows how the field becomes visible when a conflict is detected during synchronization:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frdzx0fiv8tdc35fpniok.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frdzx0fiv8tdc35fpniok.jpeg"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Even with just low-code tools, Salesforce provided enough flexibility to create a user-friendly and sync-aware UI.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Flows: Automation Without Code&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Salesforce &lt;strong&gt;Flows&lt;/strong&gt; are powerful, declarative tools that let you automate business logic and integrations with little to no code. Designed in &lt;strong&gt;Flow Builder&lt;/strong&gt;, they offer a drag-and-drop interface for building everything from simple field updates to complex multi-step logic.&lt;/p&gt;

&lt;p&gt;In OmniSync, all automation was implemented using &lt;strong&gt;Record-Triggered Flows&lt;/strong&gt;: flows that run automatically when a record is created, updated, or deleted.&lt;/p&gt;

&lt;p&gt;Here are the key Flow-based use cases implemented:&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;In-App Notifications&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;When a sync fails or a conflict is detected on any entity, a notification is automatically sent to the relevant user. This was handled entirely via a Record-Triggered Flow and Salesforce’s notification framework; no Apex or email alerts were needed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5nt8fept38zfd0t4029m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5nt8fept38zfd0t4029m.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And the result of this custom notification can be seen below. Alongside the alert, the same Flow also updates the SyncStatus__c field to reflect the latest sync status, in case notifications are ignored or missed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3atlud9ny767wx3e1mat.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3atlud9ny767wx3e1mat.png"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h4&gt;
  
  
  &lt;strong&gt;Line Number Calculation&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;For child objects like product order items, we used a Record-Triggered Flow to auto-assign incremental &lt;strong&gt;Line Numbers&lt;/strong&gt;. This ensures consistency and makes integration with external systems cleaner and more deterministic.&lt;/p&gt;
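&lt;p&gt;The logic this Flow implements can be sketched as follows. This is a conceptual JavaScript version (the Flow itself is declarative), and the lineNumber property name is illustrative rather than the actual field API name:&lt;/p&gt;

```javascript
// Conceptual sketch of the line-number assignment the Record-Triggered
// Flow performs: new order items continue numbering from the highest
// existing line number, keeping the sequence deterministic for
// external systems that consume the records.
function assignLineNumbers(existingItems, newItems) {
  const max = existingItems.reduce((m, it) => Math.max(m, it.lineNumber), 0);
  return newItems.map((it, i) => ({ ...it, lineNumber: max + i + 1 }));
}
```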

&lt;p&gt;The Flow that performs this line number assignment is shown below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faciovmsi7dme0v3zhu7v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faciovmsi7dme0v3zhu7v.png"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h4&gt;
  
  
  &lt;strong&gt;Outbound Messages&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Outbound Messages are used to synchronize entities with external systems in real time, using &lt;strong&gt;XML-based SOAP messages&lt;/strong&gt;. They were implemented entirely through Record-Triggered Flows, eliminating the need for custom Apex.&lt;/p&gt;

&lt;p&gt;Each time a record is &lt;strong&gt;inserted, updated, or deleted&lt;/strong&gt;, the Flow automatically sends an outbound message containing key business data to an external system acting as a webhook.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fukw471pijxopdpn69r94.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fukw471pijxopdpn69r94.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;These flows not only automated key parts of the data lifecycle but also ensured the UI remained responsive, synced, and reliable, all without a single line of Apex.&lt;/p&gt;
&lt;h3&gt;
  
  
  &lt;strong&gt;Outbound Messages (Webhooks via SOAP)&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Salesforce provides several integration mechanisms, and one of the most useful for real-time scenarios is Outbound Messaging, which uses SOAP-based webhooks.&lt;/p&gt;

&lt;p&gt;In OmniSync, we use &lt;strong&gt;Outbound Messages&lt;/strong&gt; to detect changes (insert/update) on key entities like RetailStore__c. When such changes occur:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;An Outbound Message&lt;/strong&gt; is configured to include the necessary fields.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;A Salesforce Flow&lt;/strong&gt; picks up the change and sends the message via a &lt;strong&gt;SOAP notification&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;This notification is received by a &lt;strong&gt;Logic App endpoint&lt;/strong&gt;, which handles the next steps in the integration pipeline.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This method is particularly useful when &lt;strong&gt;Change Data Capture (CDC)&lt;/strong&gt; isn’t available for certain standard or custom objects. While not technically streaming, this approach is near real-time, reliable, and easy to configure, with no code required.&lt;/p&gt;

&lt;p&gt;It provides a practical balance between simplicity and speed for triggering external workflows on data changes.&lt;/p&gt;

&lt;p&gt;The figure below shows the RetailStoreChanges outbound message, which fires when a record in RetailStore is inserted or updated. The Flow triggered by this outbound message can be seen in the previous section.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg6j52zrtnceir423cl3p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg6j52zrtnceir423cl3p.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And a partial schema of the SOAP notification for this outbound message:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;schema&lt;/span&gt; &lt;span class="na"&gt;elementFormDefault=&lt;/span&gt;&lt;span class="s"&gt;"qualified"&lt;/span&gt; &lt;span class="na"&gt;targetNamespace=&lt;/span&gt;&lt;span class="s"&gt;"http://soap.sforce.com/2005/09/outbound"&lt;/span&gt;
          &lt;span class="na"&gt;xmlns:xsd=&lt;/span&gt;&lt;span class="s"&gt;"http://www.w3.org/2001/XMLSchema"&lt;/span&gt;
          &lt;span class="na"&gt;xmlns:ent=&lt;/span&gt;&lt;span class="s"&gt;"urn:enterprise.soap.sforce.com"&lt;/span&gt;
          &lt;span class="na"&gt;xmlns:ens=&lt;/span&gt;&lt;span class="s"&gt;"urn:sobject.enterprise.soap.sforce.com"&lt;/span&gt;
          &lt;span class="na"&gt;xmlns:tns=&lt;/span&gt;&lt;span class="s"&gt;"http://soap.sforce.com/2005/09/outbound"&lt;/span&gt;
          &lt;span class="na"&gt;xmlns=&lt;/span&gt;&lt;span class="s"&gt;"http://www.w3.org/2001/XMLSchema"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;import&lt;/span&gt; &lt;span class="na"&gt;namespace=&lt;/span&gt;&lt;span class="s"&gt;"urn:enterprise.soap.sforce.com"&lt;/span&gt; &lt;span class="na"&gt;schemaLocation=&lt;/span&gt;&lt;span class="s"&gt;"ID.xsd"&lt;/span&gt;&lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;import&lt;/span&gt; &lt;span class="na"&gt;namespace=&lt;/span&gt;&lt;span class="s"&gt;"urn:sobject.enterprise.soap.sforce.com"&lt;/span&gt; &lt;span class="na"&gt;schemaLocation=&lt;/span&gt;&lt;span class="s"&gt;"RetailStore.xsd"&lt;/span&gt;&lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
    &lt;span class="c"&gt;&amp;lt;!-- &amp;lt;element name="notifications"&amp;gt; --&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;element&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"notifications"&lt;/span&gt; &lt;span class="na"&gt;type=&lt;/span&gt;&lt;span class="s"&gt;"tns:NotificationsType"&lt;/span&gt;&lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
      &lt;span class="nt"&gt;&amp;lt;complexType&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"NotificationsType"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="nt"&gt;&amp;lt;sequence&amp;gt;&lt;/span&gt;
          &lt;span class="nt"&gt;&amp;lt;element&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"OrganizationId"&lt;/span&gt; &lt;span class="na"&gt;type=&lt;/span&gt;&lt;span class="s"&gt;"ent:ID"&lt;/span&gt;&lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
          &lt;span class="nt"&gt;&amp;lt;element&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"ActionId"&lt;/span&gt; &lt;span class="na"&gt;type=&lt;/span&gt;&lt;span class="s"&gt;"ent:ID"&lt;/span&gt;&lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
          &lt;span class="nt"&gt;&amp;lt;element&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"SessionId"&lt;/span&gt; &lt;span class="na"&gt;type=&lt;/span&gt;&lt;span class="s"&gt;"xsd:string"&lt;/span&gt; &lt;span class="na"&gt;nillable=&lt;/span&gt;&lt;span class="s"&gt;"true"&lt;/span&gt;&lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
          &lt;span class="nt"&gt;&amp;lt;element&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"EnterpriseUrl"&lt;/span&gt; &lt;span class="na"&gt;type=&lt;/span&gt;&lt;span class="s"&gt;"xsd:string"&lt;/span&gt;&lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
          &lt;span class="nt"&gt;&amp;lt;element&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"PartnerUrl"&lt;/span&gt; &lt;span class="na"&gt;type=&lt;/span&gt;&lt;span class="s"&gt;"xsd:string"&lt;/span&gt;&lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
          &lt;span class="nt"&gt;&amp;lt;element&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"Notification"&lt;/span&gt; &lt;span class="na"&gt;maxOccurs=&lt;/span&gt;&lt;span class="s"&gt;"100"&lt;/span&gt; &lt;span class="na"&gt;type=&lt;/span&gt;&lt;span class="s"&gt;"tns:RetailStoreNotification"&lt;/span&gt;&lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
        &lt;span class="nt"&gt;&amp;lt;/sequence&amp;gt;&lt;/span&gt;
      &lt;span class="nt"&gt;&amp;lt;/complexType&amp;gt;&lt;/span&gt;
    &lt;span class="c"&gt;&amp;lt;!-- &amp;lt;/element&amp;gt; --&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;complexType&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"RetailStoreNotification"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="nt"&gt;&amp;lt;sequence&amp;gt;&lt;/span&gt;
        &lt;span class="nt"&gt;&amp;lt;element&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"Id"&lt;/span&gt; &lt;span class="na"&gt;type=&lt;/span&gt;&lt;span class="s"&gt;"ent:ID"&lt;/span&gt;&lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
        &lt;span class="nt"&gt;&amp;lt;element&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"sObject"&lt;/span&gt; &lt;span class="na"&gt;type=&lt;/span&gt;&lt;span class="s"&gt;"ens:RetailStore"&lt;/span&gt;&lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
      &lt;span class="nt"&gt;&amp;lt;/sequence&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;/complexType&amp;gt;&lt;/span&gt;
    &lt;span class="c"&gt;&amp;lt;!-- &amp;lt;element name="notificationsResponse"&amp;gt; --&amp;gt;&lt;/span&gt;
      &lt;span class="nt"&gt;&amp;lt;complexType&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"notificationsResponse"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="nt"&gt;&amp;lt;sequence&amp;gt;&lt;/span&gt;
          &lt;span class="nt"&gt;&amp;lt;element&lt;/span&gt; &lt;span class="na"&gt;name=&lt;/span&gt;&lt;span class="s"&gt;"Ack"&lt;/span&gt; &lt;span class="na"&gt;type=&lt;/span&gt;&lt;span class="s"&gt;"xsd:boolean"&lt;/span&gt;&lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
        &lt;span class="nt"&gt;&amp;lt;/sequence&amp;gt;&lt;/span&gt;
      &lt;span class="nt"&gt;&amp;lt;/complexType&amp;gt;&lt;/span&gt;
    &lt;span class="c"&gt;&amp;lt;!-- &amp;lt;/element&amp;gt; --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/schema&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
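&lt;p&gt;Per the notificationsResponse type in the schema above, the receiving endpoint must reply with an Ack element set to true; otherwise Salesforce retries delivery. The helper below sketches that acknowledgment envelope. In OmniSync the Logic App produces this response; the JavaScript version here is purely illustrative:&lt;/p&gt;

```javascript
// Build the SOAP acknowledgment a Salesforce outbound message
// receiver must return. The response namespace matches the schema
// shown above; the envelope namespace is the standard SOAP 1.1 one.
function buildAckEnvelope(success) {
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">',
    "  <soapenv:Body>",
    '    <notificationsResponse xmlns="http://soap.sforce.com/2005/09/outbound">',
    `      <Ack>${success}</Ack>`,
    "    </notificationsResponse>",
    "  </soapenv:Body>",
    "</soapenv:Envelope>",
  ].join("\n");
}
```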



&lt;h4&gt;
  
  
  &lt;strong&gt;The Delete Workaround&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The snippet below shows the Apex trigger used for this workaround: when a Retail Store record is deleted, it creates a RetailStoreDeletedEvent__c record capturing the deletion:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight apex"&gt;&lt;code&gt;&lt;span class="n"&gt;trigger&lt;/span&gt; &lt;span class="n"&gt;AfterDeleteRetailStore&lt;/span&gt; &lt;span class="n"&gt;on&lt;/span&gt; &lt;span class="n"&gt;RetailStore__c&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;after&lt;/span&gt; &lt;span class="k"&gt;delete&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Capture each deleted record in a custom event object so the&lt;/span&gt;
    &lt;span class="c1"&gt;// outbound sync can propagate the deletion.&lt;/span&gt;
    &lt;span class="n"&gt;List&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;RetailStoreDeletedEvent__c&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;co&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="n"&gt;List&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;RetailStoreDeletedEvent__c&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="nf"&gt;for&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;RetailStore__c&lt;/span&gt; &lt;span class="n"&gt;o&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Trigger&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="py"&gt;old&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="n"&gt;RetailStoreDeletedEvent__c&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="n"&gt;RetailStoreDeletedEvent__c&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
        &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="py"&gt;Name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;o&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="py"&gt;Name&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="py"&gt;DeletedId__c&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;o&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="py"&gt;Id&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

        &lt;span class="n"&gt;co&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;insert&lt;/span&gt; &lt;span class="n"&gt;co&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And its associated flow:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkq9sqvumw2c6kvi1ew9p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkq9sqvumw2c6kvi1ew9p.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Change Data Capture (CDC)&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;While outbound messages worked well for basic events, OmniSync also uses &lt;strong&gt;Salesforce Change Data Capture (CDC)&lt;/strong&gt; for &lt;strong&gt;streaming change events&lt;/strong&gt;, including inserts, updates, and deletes in near real time.&lt;/p&gt;

&lt;p&gt;CDC events are published to event channels, consumed through &lt;strong&gt;Salesforce’s Pub/Sub API&lt;/strong&gt;, and processed downstream by the &lt;strong&gt;CDC Ingestor App&lt;/strong&gt;.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Change Data Capture (CDC) via Pub/Sub API&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;To capture real-time changes in Salesforce data, OmniSync leverages &lt;strong&gt;Change Data Capture (CDC)&lt;/strong&gt;, a native Salesforce feature that publishes events when records are created, updated, deleted, or undeleted.&lt;/p&gt;

&lt;p&gt;In earlier implementations, external clients had to rely on the &lt;strong&gt;Streaming API&lt;/strong&gt; to subscribe to CDC events. However, OmniSync uses the more modern &lt;strong&gt;Pub/Sub API&lt;/strong&gt;, which provides several advantages:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Built on gRPC and HTTP/2&lt;/strong&gt;, making it more efficient and scalable than the traditional Streaming API.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Delivers binary event messages&lt;/strong&gt;, reducing payload size and improving speed.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Supports multiple languages&lt;/strong&gt; (Java, Go, Node.js, etc.), making it easier to integrate into modern backend systems.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Reliable message delivery&lt;/strong&gt; with built-in acknowledgment and replay support.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;🧩 How It Works in OmniSync&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Insert, Update, and Delete&lt;/strong&gt; events are published for key objects (like Orders, Accounts, and Products).&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;CDC Ingestor App&lt;/strong&gt;, an Azure Container App implemented in JavaScript, receives these events and pushes them into &lt;strong&gt;Azure Event Grid&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;From there, Logic Apps, subscribed as webhooks and filtered by subject, execute the transformation logic that syncs data to the downstream systems.&lt;/li&gt;
&lt;/ul&gt;
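&lt;p&gt;To make the hand-off concrete, here is a sketch of how the CDC Ingestor App might map a decoded CDC event onto the Event Grid event schema before publishing. The subject format and event type name are assumptions for illustration, not OmniSync’s exact contract:&lt;/p&gt;

```javascript
// Map a decoded Salesforce CDC event to an Azure Event Grid event.
// Downstream Logic Apps filter on the subject, so it encodes the
// entity name and change type from the ChangeEventHeader.
function toEventGridEvent(cdcEvent) {
  const header = cdcEvent.payload.ChangeEventHeader;
  return {
    id: String(cdcEvent.replayId),
    subject: `salesforce/${header.entityName}/${header.changeType}`, // assumed format
    eventType: "Salesforce.CDC.RecordChanged", // assumed name
    eventTime: new Date(header.commitTimestamp).toISOString(),
    dataVersion: "1.0",
    data: cdcEvent.payload,
  };
}
```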

&lt;p&gt;Here’s a partial example of a decoded CDC change event in JSON:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"replayId"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;211583&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"payload"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"ChangeEventHeader"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"entityName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Account"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"recordIds"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"001WU00000vBSOLYA4"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"changeType"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"UPDATE"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"changeOrigin"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"transactionKey"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"0000dece-b24f-fd96-1649-167b8ad0cbfa"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"sequenceNumber"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"commitTimestamp"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1746061521000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"commitNumber"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"commitUser"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"005WU000009oVebYAE"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"nulledFields"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"diffFields"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"changedFields"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[]&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"ParentId"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"BillingAddress"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"ShippingAddress"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Street"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Belleveu Main Street"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"City"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Belleveu"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"State"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"PostalCode"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"23232"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Country"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"USA"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Latitude"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;37.5624&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Longitude"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;-77.4467&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"GeocodeAccuracy"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Zip"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Phone"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Fax"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"AccountNumber"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"CS424"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Website"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Sic"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="err"&gt;......&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  &lt;strong&gt;Event Enrichment&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;To make the CDC data more useful, Salesforce was configured to &lt;strong&gt;include extra fields&lt;/strong&gt; in CDC payloads. By default, update events carry only the changed fields, so enriching the channel with additional fields (such as lookup values) spares the Logic Apps integration layer from making extra calls back to Salesforce to resolve them.&lt;/p&gt;

&lt;p&gt;The following REST API call shows how to create a subscription named OmniSync_Channel_chn_AccountChangeEvent on the OmniSync_Channel__chn channel for the &lt;strong&gt;Account&lt;/strong&gt; entity, with additional fields included to enrich the event message.&lt;/p&gt;
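&lt;p&gt;As a rough sketch, the request body for that call might look like the following, issued against the Tooling API's PlatformEventChannelMember object. The enriched field names shown here are illustrative, not the exact set used in OmniSync:&lt;/p&gt;

```json
{
  "FullName": "OmniSync_Channel_chn_AccountChangeEvent",
  "Metadata": {
    "eventChannel": "OmniSync_Channel__chn",
    "selectedEntity": "AccountChangeEvent",
    "enrichedFields": [
      { "name": "AccountNumber" },
      { "name": "Phone" }
    ]
  }
}
```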

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foqk155kb20jqzc913mwv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foqk155kb20jqzc913mwv.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Delete Handling with CDC&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Unlike Outbound Messages, &lt;strong&gt;CDC supports deletes natively&lt;/strong&gt;, which made it the preferred approach for full bi-directional sync.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Logic Apps&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Logic Apps serve as the core integration engine in OmniSync, bridging Salesforce, Dynamics 365, and Microsoft Fabric. They handle &lt;strong&gt;data transformation&lt;/strong&gt;, &lt;strong&gt;validation&lt;/strong&gt;, and &lt;strong&gt;routing&lt;/strong&gt;, ensuring that only clean and properly formatted data reaches the destination systems.&lt;/p&gt;

&lt;p&gt;We use two main categories of Logic Apps based on their direction and purpose:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Salesforce to Dynamics 365:&lt;/strong&gt; Handles the creation, update, or deletion of retail-related entities like Orders, Products, and Accounts in D365.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Salesforce to Fabric (Lakehouse):&lt;/strong&gt; Pushes changes from Salesforce into Microsoft Fabric via Event Hub, powering near real-time analytics.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each Logic App is triggered by a different integration pattern depending on the nature of the Salesforce event:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;SOAP Webhook (Outbound Message)&lt;/strong&gt; — Triggered by Salesforce outbound notifications for objects like RetailStore, which send data to a Logic App endpoint via SOAP.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Event Grid Webhook&lt;/strong&gt;  — Used for handling CDC events or platform events forwarded through Azure Event Grid.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Polling via Salesforce Connector&lt;/strong&gt;  — A fallback or scheduled mechanism to periodically check for new or updated data when webhooks are not available or reliable.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let’s walk through a few concrete scenarios to illustrate how different types of Logic Apps work across various triggers and systems.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Currency Inserted (Salesforce → Fabric)&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;In this scenario, a &lt;strong&gt;Salesforce connector polls&lt;/strong&gt; the CurrencyType object every minute for changes. When a new currency is detected:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A &lt;strong&gt;Liquid template&lt;/strong&gt; is used to transform the CurrencyType format into a simplified Currency object.&lt;/li&gt;
&lt;li&gt;Some fields (like datetime formats) can’t be transformed using Liquid due to its limitations, so they’re adjusted directly within the Logic App using expressions.&lt;/li&gt;
&lt;li&gt;The transformed object is then sent to Event Hub, which pushes it into the Event Stream in Microsoft Fabric for analytics.&lt;/li&gt;
&lt;/ol&gt;
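&lt;p&gt;As an example of step 2, an epoch-milliseconds field that Liquid cannot reformat can be converted inline with a Workflow Definition Language expression along these lines (the property name here is illustrative):&lt;/p&gt;

```
addToTime('1970-01-01T00:00:00Z', div(int(triggerBody()?['CreatedDate']), 1000), 'Second', 'yyyy-MM-dd HH:mm')
```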

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F681%2F1%2AXU1Al0XJzy-pQ8-Qjf4GHg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F681%2F1%2AXU1Al0XJzy-pQ8-Qjf4GHg.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Retail Store Deleted (Salesforce → Fabric via SOAP)&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;When a Retail Store is deleted, Salesforce sends an &lt;strong&gt;Outbound Message&lt;/strong&gt; (a &lt;strong&gt;SOAP&lt;/strong&gt;-based webhook) to the Logic App:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The incoming message is validated against the OutboundRetailStoreDeletedEvent.xsd schema.&lt;/li&gt;
&lt;li&gt;A check is performed to confirm this change is part of an integration sync (using user context, as described in Part 1).&lt;/li&gt;
&lt;li&gt;After parsing and a manual field transformation (via a Compose action), the data is sent to Event Hub to deliver the new CDC deletion event into Fabric.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F940%2F1%2ASw1p4nOkAjRkQdlEk5d0eg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F940%2F1%2ASw1p4nOkAjRkQdlEk5d0eg.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Dynamics 365 Account Sync (Salesforce → Dynamics 365 via Event Grid)&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;This is a more advanced use case that handles create, update, and delete events in a single Logic App, all triggered by a subscription to Event Grid.&lt;/p&gt;

&lt;p&gt;Since subscriptions are scoped per entity (not per action), we handle branching within the Logic App itself:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;➕ Create:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The Logic App checks if the incoming account already exists in Dynamics 365 by querying a MasterDataMapping table via a Lakehouse SQL endpoint.&lt;/li&gt;
&lt;li&gt;If it exists (e.g., an account code like A001 was already synced from Salesforce), a &lt;strong&gt;conflict is detected&lt;/strong&gt;, and the Salesforce record is updated with StatusSync = conflict.&lt;/li&gt;
&lt;li&gt;Otherwise, a new Dynamics 365 Account is created via the &lt;strong&gt;Dataverse connector&lt;/strong&gt;, with fields mapped accordingly, and its ID is written to the Fabric mapping table.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;🔁 Update:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A check is made to confirm the record exists in the mapping table.&lt;/li&gt;
&lt;li&gt;If not found, a 404 is returned.&lt;/li&gt;
&lt;li&gt;If it does exist, we call an &lt;strong&gt;Azure Function&lt;/strong&gt; to enrich the update, since Salesforce CDC events may not contain all fields needed for a proper sync.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;❌ Delete:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Again, existence is verified via the mapping table.&lt;/li&gt;
&lt;li&gt;If the account is found, we perform a cleanup of dependencies (e.g., &lt;strong&gt;Sales Orders&lt;/strong&gt; ) that must be removed before deleting the Account from Dynamics 365 using the &lt;strong&gt;Dataverse connector&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;If not found, a 404 is returned, indicating a potential sync issue.&lt;/li&gt;
&lt;/ul&gt;
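&lt;p&gt;The branching above can be sketched roughly as follows. This is an illustrative outline only: the mapping-table lookup and the Dataverse calls are stand-ins for the Lakehouse SQL endpoint and the connector actions, and all names are hypothetical.&lt;/p&gt;

```python
def handle_account_event(operation, payload, mapping, dataverse):
    """Route a CDC Account event to the right Dynamics 365 action."""
    sf_id = payload["SalesForceId"]
    existing = mapping.get(sf_id)  # lookup in the MasterDataMapping table

    if operation == "Create":
        if existing is not None:
            # Account code already synced from Salesforce: flag StatusSync = conflict
            return {"status": 409, "reason": "conflict"}
        d365_id = dataverse.create_account(payload)
        mapping[sf_id] = d365_id  # record the new pairing in the Fabric mapping table
        return {"status": 201, "d365_id": d365_id}

    if existing is None:
        return {"status": 404, "reason": "not mapped"}  # potential sync issue

    if operation == "Update":
        # Enrichment via the Azure Function is omitted here for brevity
        dataverse.update_account(existing, payload)
        return {"status": 200, "d365_id": existing}

    if operation == "Delete":
        # In practice, dependencies (e.g. Sales Orders) are cleaned up first
        dataverse.delete_account(existing)
        del mapping[sf_id]
        return {"status": 200}

    return {"status": 400, "reason": "unknown operation"}
```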

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AMcx1215jt9WO2bN-fAE5wQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AMcx1215jt9WO2bN-fAE5wQ.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Transformations&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Inside the Logic Apps, data transformations are a key step in mapping and reshaping fields coming from Salesforce into the format expected by the destination systems (such as Dynamics 365 or Fabric). Several transformation mechanisms are used, depending on the complexity and context of the data. All approaches aim to produce the same result: a well-structured, consumable object on the other side.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;DataMapper (Visual XSLT Mapping)&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;We used DataMapper to transform objects like RetailStore from Salesforce to Store objects consumed by Fabric.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;DataMapper&lt;/strong&gt; is Microsoft’s visual tool for mapping and transforming data across schemas. It allows users to build mappings between JSON or XML structures via a drag-and-drop interface, generating an XSLT file that can later be applied within Logic Apps.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;However, the tool currently has significant limitations:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;It behaves unpredictably when handling complex or mixed-schema scenarios (e.g., identical schema names in XML vs. JSON)&lt;/li&gt;
&lt;li&gt;XML-to-XML transformations often fail to display properly or get lost&lt;/li&gt;
&lt;li&gt;Some edits are lost if not saved properly, due to its use of temporary files&lt;/li&gt;
&lt;li&gt;It lacks advanced customization, requiring manual XSLT editing in many cases&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Because of these constraints, DataMapper was used only when absolutely necessary. While promising, it’s not stable enough yet for this project.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AIpGjHiqizTbufp6MizHl-g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AIpGjHiqizTbufp6MizHl-g.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Liquid Templates&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Liquid is a lightweight template engine originally developed by Shopify. In Logic Apps, we use Liquid templates primarily to &lt;strong&gt;transform incoming Salesforce events into CDC-style JSON objects&lt;/strong&gt; to be sent to Fabric Lakehouse via Event Hub.&lt;/p&gt;

&lt;p&gt;These templates allow field filtering and conditional logic, which makes them perfect for reshaping Salesforce payloads into clean, minimal representations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example: Salesforce Account to Customer&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight liquid"&gt;&lt;code&gt;&lt;span class="cp"&gt;{%-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;assign&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;changedFields&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;ChangeEventHeader&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;changedFields&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;-%}&lt;/span&gt;

{
   "Operation": "&lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;ChangeEventHeader&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;changeType&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nf"&gt;capitalize&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt;",
   "Entity": "Customer",
   "Values": "{
    \"CustomerCode\": \"&lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;AccountNumber&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt;\",    
    &lt;span class="cp"&gt;{%-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;for&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;changedField&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;in&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;changedFields&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;-%}&lt;/span&gt;
     &lt;span class="cp"&gt;{%&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;if&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;changedField&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;==&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Name"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;%}&lt;/span&gt;
      \"CompanyName\": \"&lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;Name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt;\",
     &lt;span class="cp"&gt;{%&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;elsif&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;changedField&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;==&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Email__c"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;%}&lt;/span&gt;
      \"EmailAddress\": \"&lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;Email__c&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt;\",
     &lt;span class="cp"&gt;{%&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;elsif&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;changedField&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;==&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Phone"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;%}&lt;/span&gt;
      \"Phone\": \"&lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;Phone&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt;\",
     &lt;span class="cp"&gt;{%&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;elsif&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;changedField&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;==&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"LastModifiedDate"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;%}&lt;/span&gt;
      \"UpdatedDate\": \"&lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;LastModifiedDate&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt;\",
     &lt;span class="cp"&gt;{%&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;endif&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;%}&lt;/span&gt;
    &lt;span class="cp"&gt;{%-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;endfor&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;-%}&lt;/span&gt;
    \"AddressLine1\": \"&lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;ShippingAddress&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;Street&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt; &lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;ShippingAddress&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;City&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt; &lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;ShippingAddress&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;PostalCode&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt; &lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;ShippingAddress&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;State&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt; &lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span 
class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;ShippingAddress&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;Country&lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt;\",
    \"Latitude\": \"&lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;ShippingAddress&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;Latitude&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt;\",
    \"Longitude\": \"&lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;ShippingAddress&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;Longitude&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt;\",
    \"CustomerType\": \"Company\",
    \"SalesForceId\": \"&lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;ChangeEventHeader&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nv"&gt;recordIds&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nf"&gt;first&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt;\"
     }",
 "CreatedDate": "&lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"now"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nf"&gt;date&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"%Y-%m-%d %H:%M"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt;",
    "UpdatedDate": "&lt;span class="cp"&gt;{{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"now"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nf"&gt;date&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"%Y-%m-%d %H:%M"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="cp"&gt;}}&lt;/span&gt;"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This approach works well for real-time event streaming, although it has limitations, especially with datetime transformations, which must be handled elsewhere in the Logic App.&lt;/p&gt;

&lt;h4&gt;
  
  
&lt;strong&gt;Manual Mapping Using Compose Actions&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;In many cases, particularly when data is already in a usable format, we perform transformations manually using Compose actions inside Logic Apps. This is a straightforward method to restructure or rename fields without involving external tools or templates.&lt;/p&gt;

&lt;p&gt;It’s especially useful for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Light field remapping.&lt;/li&gt;
&lt;li&gt;SOAP/XML payload extractions.&lt;/li&gt;
&lt;li&gt;Intermediate enrichments.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F940%2F1%2AokDZVf1zM3wIVzmLuREo7Q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F940%2F1%2AokDZVf1zM3wIVzmLuREo7Q.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Manual Mapping Using Connectors&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;When synchronizing with Dynamics 365, we rely on the Dataverse connector and its built-in CRUD operations. This makes it straightforward to map previously decoded values from Salesforce to their counterparts in Dynamics 365. Thanks to the rich set of expression functions available in Logic Apps’ Workflow Definition Language, we can apply transformations directly, such as string manipulation, conditional logic, and date formatting.&lt;/p&gt;

&lt;p&gt;For instance, when mapping Salesforce Products to Dynamics 365, we convert date fields from Salesforce’s CDC (which are in epoch format) into standard UTC timestamps using expression functions before sending them downstream.&lt;/p&gt;
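&lt;p&gt;As a minimal illustration of that epoch-to-UTC conversion, outside Logic Apps and in plain Python:&lt;/p&gt;

```python
from datetime import datetime, timezone

def epoch_ms_to_utc(epoch_ms):
    """Convert a Salesforce CDC epoch-milliseconds value to a UTC timestamp string."""
    dt = datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%SZ")

# Example: 0 ms after the epoch
print(epoch_ms_to_utc(0))  # 1970-01-01T00:00:00Z
```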

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F940%2F1%2AIF4UFX55stPwrwyN479Fpg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F940%2F1%2AIF4UFX55stPwrwyN479Fpg.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Azure Functions for Advanced Logic&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;When field transformations are too complex, or when we only receive &lt;em&gt;partial&lt;/em&gt; data (as is the case with Salesforce CDC updates), the logic is offloaded to an &lt;strong&gt;Azure Function&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;These functions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Read the incoming JSON payload.&lt;/li&gt;
&lt;li&gt;Identify only the changed fields.&lt;/li&gt;
&lt;li&gt;Query our master data mapping layer if needed.&lt;/li&gt;
&lt;li&gt;Apply transformations and send updates to Dynamics 365 through the Dataverse SDK.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Example Snippet:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ServiceClient&lt;/span&gt; &lt;span class="n"&gt;svc&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;ServiceClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ConnectionStr&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;svc&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;IsReady&lt;/span&gt; &lt;span class="p"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="n"&gt;inputAccountValue&lt;/span&gt; &lt;span class="p"&gt;!=&lt;/span&gt; &lt;span class="k"&gt;null&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;account&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;Entity&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"account"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="n"&gt;account&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Attributes&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"accountnumber"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;inputAccountValue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;cgcloud&lt;/span&gt; &lt;span class="n"&gt;__Account_Number__&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;foreach&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;changedField&lt;/span&gt; &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="n"&gt;inputAccountValue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;changedFields&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;changedField&lt;/span&gt; &lt;span class="p"&gt;==&lt;/span&gt; &lt;span class="s"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="n"&gt;account&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Attributes&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;inputAccountValue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Name&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;changedField&lt;/span&gt; &lt;span class="p"&gt;==&lt;/span&gt; &lt;span class="s"&gt;"cgcloud __Account_Email__ c"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="n"&gt;account&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Attributes&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"emailaddress1"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;inputAccountValue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;cgcloud&lt;/span&gt; &lt;span class="n"&gt;__Account_Email__&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;changedField&lt;/span&gt; &lt;span class="p"&gt;==&lt;/span&gt; &lt;span class="s"&gt;"Phone"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="n"&gt;account&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Attributes&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"telephone1"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;inputAccountValue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Phone&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(!&lt;/span&gt;&lt;span class="kt"&gt;string&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;IsNullOrEmpty&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;inputAccountValue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ShippingAddress&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Street&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This gives us full flexibility and control, especially when dealing with partial updates and nested logic.&lt;/p&gt;
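&lt;p&gt;The if/else chain above can also be expressed as a lookup table, which scales better as more fields are mapped. A minimal sketch of the same idea in Python (the field and attribute names simply mirror the C# example above and are illustrative):&lt;/p&gt;

```python
# Map incoming Salesforce CDC field names to Dynamics 365 attribute names.
# The names below mirror the C# example above and are illustrative.
FIELD_MAP = {
    "name": "name",
    "cgcloud__Account_Email__c": "emailaddress1",
    "Phone": "telephone1",
}

def apply_changes(attributes, payload, changed_fields):
    """Copy only the changed fields from the CDC payload onto the target record."""
    for field in changed_fields:
        target = FIELD_MAP.get(field)
        if target is not None and field in payload:
            attributes[target] = payload[field]
    return attributes

account = apply_changes({}, {"Phone": "555-0100", "name": "Acme"}, ["Phone"])
# Only the changed "Phone" field is written, as "telephone1".
```

&lt;p&gt;The table-driven form keeps partial updates explicit: fields not listed as changed never overwrite existing Dynamics 365 values.&lt;/p&gt;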

&lt;h3&gt;
  
  
  &lt;strong&gt;Event Grid Integration&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;As part of our CDC integration model, &lt;strong&gt;Event Grid&lt;/strong&gt; is used as the main pub/sub mechanism for CDC events emitted by our custom CDC Ingestor running as a Container App.&lt;/p&gt;

&lt;p&gt;The CDC Ingestor is responsible for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Listening to entities’ changes from Salesforce CDC Pub/Sub API&lt;/li&gt;
&lt;li&gt;Publishing them into &lt;strong&gt;Event Grid Topics&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Logic Apps are then subscribed to these topics using &lt;strong&gt;Webhook-based filters&lt;/strong&gt; (based on the subject field). For example, a Logic App subscribed with a filter on subject = Account will only trigger on Account events, regardless of the operation type.&lt;/p&gt;

&lt;p&gt;This model avoids over-subscribing and helps isolate business logic per entity.&lt;/p&gt;
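&lt;p&gt;To make the filtering concrete, here is a small Python sketch of the subject-based routing semantics (the event shape follows the Event Grid event schema, but the subject values are illustrative; in practice the filter is configured on the Event Grid subscription, e.g. via subjectBeginsWith):&lt;/p&gt;

```python
# Sketch of subject-based routing as Event Grid subscriptions apply it.
# Each Logic App subscription only receives events whose subject matches.
def matches(event, subject_begins_with):
    return event["subject"].startswith(subject_begins_with)

events = [
    {"subject": "Account", "eventType": "Updated", "data": {"Id": "001A"}},
    {"subject": "Contact", "eventType": "Created", "data": {"Id": "003B"}},
]

# A subscription filtered on subject = "Account" sees only Account events,
# regardless of the operation type (Created, Updated, Deleted...).
account_events = [e for e in events if matches(e, "Account")]
```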

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F940%2F1%2Agfd4zJDtVCoZe1j9CnHYXQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F940%2F1%2Agfd4zJDtVCoZe1j9CnHYXQ.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2APBpEXtmEgvx4CECvaSfoSw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2APBpEXtmEgvx4CECvaSfoSw.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Monitoring&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;All components in the OmniSync architecture are monitored using &lt;strong&gt;Azure Monitor&lt;/strong&gt;, with additional diagnostics enabled through a &lt;strong&gt;Log Analytics Workspace&lt;/strong&gt;.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Logic Apps Monitoring&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Logic Apps include native history tracking, where you can review runs, triggers, and failures directly in the Azure Portal.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Awihci0c4Xzif4j5UXPBAkw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2Awihci0c4Xzif4j5UXPBAkw.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Container Apps Monitoring&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;For monitoring the CDC Ingestor or other custom container-based services, &lt;strong&gt;Container Apps Logs&lt;/strong&gt; can be queried through Azure Monitor using Kusto queries. These logs include both system diagnostics and custom error messages emitted from the application code.&lt;/p&gt;
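&lt;p&gt;As an illustration, a Kusto query along these lines surfaces error entries from the CDC Ingestor (the table and column names follow the default Container Apps schema in Log Analytics; the app name here is hypothetical):&lt;/p&gt;

```
ContainerAppConsoleLogs_CL
| where ContainerAppName_s == "cdc-ingestor"
| where Log_s contains "ERROR"
| order by TimeGenerated desc
| take 50
```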

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2A-UXpSy-uIG5RID2Z4dl53w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2A-UXpSy-uIG5RID2Z4dl53w.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Salesforce Monitoring&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Salesforce provides its own &lt;strong&gt;Monitoring tool&lt;/strong&gt; to track outbound messages, such as SOAP or Event-driven webhooks. This includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Viewing the &lt;strong&gt;delivery status&lt;/strong&gt; of outbound messages.&lt;/li&gt;
&lt;li&gt;Monitoring &lt;strong&gt;failed retries&lt;/strong&gt; or &lt;strong&gt;invalid endpoint responses&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Debugging SOAP/XML payloads directly from the monitoring queue.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This ensures that any integration failures between Salesforce and Logic Apps (e.g., due to schema mismatches or endpoint issues) can be proactively addressed from both sides.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AetRKLhKH6vQapkrkAtdCGw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2AetRKLhKH6vQapkrkAtdCGw.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Salesforce CLI&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;All Salesforce metadata used for this integration is version-controlled in a Git repository. To extract and manage this metadata, we use the &lt;strong&gt;Salesforce CLI (sf)&lt;/strong&gt;, which supports robust project management and deployment automation.&lt;/p&gt;

&lt;p&gt;The CLI was used to retrieve key components from the source Salesforce org, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Custom Objects&lt;/li&gt;
&lt;li&gt;Apex Triggers&lt;/li&gt;
&lt;li&gt;Flows and Flow Definitions&lt;/li&gt;
&lt;li&gt;Layouts and Applications&lt;/li&gt;
&lt;li&gt;Workflow Rules&lt;/li&gt;
&lt;li&gt;Connected Apps
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;sf project retrieve start &lt;span class="nt"&gt;--metadata&lt;/span&gt; CustomObject &lt;span class="se"&gt;\&lt;/span&gt;
                          &lt;span class="nt"&gt;--metadata&lt;/span&gt; ApexTrigger &lt;span class="se"&gt;\&lt;/span&gt;
                          &lt;span class="nt"&gt;--metadata&lt;/span&gt; Flow &lt;span class="se"&gt;\&lt;/span&gt;
                          &lt;span class="nt"&gt;--metadata&lt;/span&gt; WorkFlow &lt;span class="se"&gt;\&lt;/span&gt;
                          &lt;span class="nt"&gt;--metadata&lt;/span&gt; Layout &lt;span class="se"&gt;\&lt;/span&gt;
                          &lt;span class="nt"&gt;--metadata&lt;/span&gt; FlowDefinition &lt;span class="se"&gt;\&lt;/span&gt;
                          &lt;span class="nt"&gt;--metadata&lt;/span&gt; CustomApplication &lt;span class="se"&gt;\&lt;/span&gt;
                          &lt;span class="nt"&gt;-m&lt;/span&gt; ConnectedApp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This allows easy export of Salesforce configuration and logic to be deployed across environments (e.g., staging, QA, production) via CI/CD pipelines or manual promotion.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Lessons Learned&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Logic Apps&lt;/strong&gt; proved to be a powerful &lt;strong&gt;iPaaS&lt;/strong&gt; tool, allowing us to connect Salesforce, Dynamics 365, and Microsoft Fabric with minimal custom code. By handling transformations, validations, and routing logic visually, we avoided writing full applications while still delivering complex integration flows.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Organizing Logic Apps by direction&lt;/strong&gt; (Salesforce to Dynamics 365, Salesforce to Fabric) made the architecture easier to manage and extend. Using a mix of triggers (polling, Event Grid, SOAP webhooks) gave us the flexibility to respond to different Salesforce behaviors and system needs.&lt;/li&gt;
&lt;li&gt;For data transformation, &lt;strong&gt;Liquid templates&lt;/strong&gt; provided a powerful templating and transformation engine.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Functions&lt;/strong&gt; were essential in complex cases where deeper logic was needed; building that logic visually in Logic Apps would have been far too complicated.&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;Pub/Sub API for CDC&lt;/strong&gt; was an efficient and easy way to provide scalable event delivery over HTTP/2 and gRPC.&lt;/li&gt;
&lt;li&gt;Although &lt;strong&gt;DataMapper&lt;/strong&gt; looked promising at the start of the project, it had limitations and bugs in more complex use cases, so manual Compose actions, Liquid templates, or XSLT were more reliable there.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Coming Next
&lt;/h3&gt;

&lt;p&gt;📌 In the next post, I’ll dig into Dynamics 365 integration.&lt;/p&gt;

&lt;p&gt;👀 &lt;strong&gt;Follow me here on Medium&lt;/strong&gt; to catch Part 4.&lt;/p&gt;

&lt;p&gt;💻 &lt;strong&gt;Source code:&lt;/strong&gt; &lt;a href="https://github.com/zodraz/salesforce" rel="noopener noreferrer"&gt;https://github.com/zodraz/salesforce&lt;/a&gt;&lt;/p&gt;

</description>
      <category>integration</category>
      <category>ipaas</category>
      <category>salesforce</category>
      <category>logicapps</category>
    </item>
    <item>
      <title>OmniSync: Near Real-Time Lakehouse, Spark Streaming and Power BI in Microsoft Fabric (Part 2)</title>
      <dc:creator>Abel</dc:creator>
      <pubDate>Tue, 29 Apr 2025 15:38:59 +0000</pubDate>
      <link>https://dev.to/tarantarantino/omnisync-near-real-time-lakehouse-spark-streaming-and-power-bi-in-microsoft-fabric-part-2-noh</link>
      <guid>https://dev.to/tarantarantino/omnisync-near-real-time-lakehouse-spark-streaming-and-power-bi-in-microsoft-fabric-part-2-noh</guid>
      <description>&lt;p&gt;Building near real-time reports and analytics with Fabric’s Lakehouse, Spark, and Power BI.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frzvrymj308pt71ju3yg3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frzvrymj308pt71ju3yg3.png" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Posts in this Series
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://medium.com/@tarantarantino/omnisync-a-real-world-architecture-for-syncing-salesforce-d365-and-fabric-in-near-real-time-17bdfb29469e" rel="noopener noreferrer"&gt;OmniSync: A Real-World Architecture for Syncing Salesforce, D365, and Fabric in Near Real-Time (Part 1)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OmniSync: Real-Time Lakehouse, Spark Streaming, and Power BI in Microsoft Fabric (Part 2)&lt;/strong&gt; &lt;em&gt;(This Post)&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a href="https://medium.com/@tarantarantino/omnisync-integrating-salesforce-with-microsoft-fabric-and-dynamics-365-part-3-2a96f86e94a0" rel="noopener noreferrer"&gt;OmniSync: Integrating Salesforce with Microsoft Fabric and Dynamics 365 (Part 3)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://medium.com/@tarantarantino/omnisync-dynamics-365-integration-with-salesforce-and-fabric-part-4-0ce408fe435a" rel="noopener noreferrer"&gt;&lt;em&gt;OmniSync:&lt;/em&gt; Dynamics 365 Integration with Salesforce and Fabric (Part 4)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Introduction&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;This second part of the OmniSync series focuses on how &lt;strong&gt;Microsoft Fabric&lt;/strong&gt; was used to build the analytics and reporting layer, fed by both legacy data and near real-time sync flows from Salesforce and Dynamics 365.&lt;/p&gt;

&lt;p&gt;The goal wasn’t just to centralize data, but to apply real architectural principles:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Medallion layer structuring (Bronze, Silver, Gold)&lt;/li&gt;
&lt;li&gt;Real-time streaming with Spark&lt;/li&gt;
&lt;li&gt;Unified models powering live Power BI reports&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Fabric provided a strong foundation for this — combining &lt;strong&gt;Dataflows Gen2&lt;/strong&gt;, &lt;strong&gt;Lakehouse&lt;/strong&gt;, &lt;strong&gt;Spark Notebooks&lt;/strong&gt;, and &lt;strong&gt;Power BI&lt;/strong&gt; into a single analytics platform.&lt;/p&gt;

&lt;p&gt;While this phase required learning new tools, it gave complete control over ingestion, transformation, and reporting, all within a single platform.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Microsoft Fabric Workspaces&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;To keep things organized, OmniSync uses a dedicated Fabric workspace to manage all components involved in ingestion, transformation, and reporting.&lt;/p&gt;

&lt;p&gt;This workspace contains:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Lakehouses&lt;/strong&gt; for storing raw and curated data&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Environments&lt;/strong&gt; for custom libraries like log4Java&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;API for GraphQL&lt;/strong&gt; for external access from SPAs&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Event Stream&lt;/strong&gt; to ingest external data and store it in the Lakehouse&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Semantic Models&lt;/strong&gt; as a way to model entities, relationships and measures&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dataflows Gen2&lt;/strong&gt; for batch ingestion (like the initial load)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Spark Notebooks&lt;/strong&gt; for transformations, CDC logic, and business rules&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Spark Jobs&lt;/strong&gt; for streaming notebooks&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Power BI reports&lt;/strong&gt; for final dashboards and metrics&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Monitoring Eventhouse&lt;/strong&gt; for monitoring workspace&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Deployment Pipelines&lt;/strong&gt; for CI/CD amongst different environments&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each workspace is linked to a Microsoft OneLake storage container behind the scenes, where all the data (Delta tables, logs, artifacts) is physically stored.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc90hyxefcl726tqvn1ul.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc90hyxefcl726tqvn1ul.png" width="800" height="360"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Why Lakehouse Instead of Data Warehouse?&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;OmniSync uses a Lakehouse as its main data store instead of a traditional Data Warehouse or even the newer SQL Database option in Microsoft Fabric.&lt;/p&gt;

&lt;p&gt;This decision was primarily driven by the goal of exploring and learning about the &lt;strong&gt;Lakehouse architecture&lt;/strong&gt; and the use of &lt;strong&gt;Spark&lt;/strong&gt; within Fabric. It gave us a chance to evaluate its benefits and limitations compared to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A more traditional operational database like SQL&lt;/li&gt;
&lt;li&gt;A classic OLAP-style Data Warehouse&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;While this may not have been the most production-optimized option for a real-time scenario (an &lt;strong&gt;Eventhouse&lt;/strong&gt; architecture might have been more suitable), it fit the PoC’s purpose perfectly, allowing us to experiment with modern patterns, event-driven ingestion, and Spark-based transformations in a controlled and flexible environment.&lt;/p&gt;

&lt;p&gt;Microsoft provides a detailed guide to help decide between these storage types in Fabric:&lt;/p&gt;

&lt;p&gt;👉&lt;a href="https://learn.microsoft.com/en-us/fabric/fundamentals/decision-guide-data-store" rel="noopener noreferrer"&gt;Choosing a data store in Microsoft Fabric&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this case, the Lakehouse provided the right mix of flexibility and hands-on Spark learning, while also supporting the real-time ingestion patterns needed from CDC.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;SQL Analytics Endpoint&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The &lt;strong&gt;SQL Analytics Endpoint&lt;/strong&gt; in Microsoft Fabric allows you to query Lakehouse Delta tables using standard T-SQL, so there is no need to write Spark code for everything.&lt;/p&gt;

&lt;p&gt;For OmniSync, it was primarily used to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run ad hoc SQL queries to inspect raw and cleaned data&lt;/li&gt;
&lt;li&gt;Query externally from Logic Apps and validate integration data&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This endpoint gives you a familiar SQL interface to the Lakehouse, which is especially useful when building reports or debugging synchronization issues without needing to open a notebook.&lt;/p&gt;
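&lt;p&gt;For instance, a quick ad hoc check against a Gold table might look like this (table and column names are illustrative):&lt;/p&gt;

```sql
SELECT TOP (10)
    SalesOrderNumber,
    SalesAmount,
    OrderDate
FROM Sales
ORDER BY OrderDate DESC;
```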

&lt;p&gt;It’s worth noting that while SQL Endpoint is great for reporting and light analytics, complex transformation logic is still handled more efficiently in &lt;strong&gt;Spark Notebooks&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4i7kpencn3fk7nfyqn8d.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4i7kpencn3fk7nfyqn8d.jpeg" width="800" height="354"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Medallion Architecture&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Before diving into the Medallion setup, keep in mind that OmniSync’s design draws on ideas from &lt;strong&gt;Kappa and Lambda architectures,&lt;/strong&gt; adapted into a simplified event-driven model.&lt;/p&gt;

&lt;p&gt;As explained in Part 1, OmniSync first runs an initial &lt;strong&gt;ETL&lt;/strong&gt; process using Dataflows Gen2 to load legacy SQL Server data into Fabric. After that, all operations are handled on a streaming basis.&lt;/p&gt;

&lt;p&gt;This Medallion structure builds on that foundation: starting with batch ingestion, but fully shifting into real-time sync after the initial load.&lt;/p&gt;

&lt;p&gt;A Medallion Architecture is a common pattern in modern data engineering that organizes data into structured layers — typically &lt;strong&gt;Bronze&lt;/strong&gt;, &lt;strong&gt;Silver&lt;/strong&gt;, and &lt;strong&gt;Gold&lt;/strong&gt; — to improve clarity, scalability, and data quality as it flows from raw ingestion to final reporting.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Bronze&lt;/strong&gt;: Raw, unprocessed data landed exactly as received&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Silver&lt;/strong&gt;: Cleaned, filtered, and enriched data, ready for modeling&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Gold&lt;/strong&gt;: Business-ready data, often aggregated or transformed for analytics and BI tools&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This layered approach allows teams to decouple ingestion from transformation and reporting, while also enabling better debugging, governance, and performance optimization at each stage.&lt;/p&gt;

&lt;p&gt;OmniSync uses a Medallion architecture with one difference: the Bronze and Silver layers are used primarily for the initial load, while the Gold layer is continuously updated with real-time streaming data from Salesforce and Dynamics 365.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2l327ykygfurztveiaqq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2l327ykygfurztveiaqq.png" width="411" height="394"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Bronze Layer&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The Bronze layer ingests legacy data from a SQL Server database which in this case is Microsoft’s sample &lt;em&gt;ContosoDataWarehouse&lt;/em&gt;, adapted for our own entities and structure.&lt;/p&gt;

&lt;p&gt;Ingestion is handled via &lt;strong&gt;Dataflows Gen2&lt;/strong&gt;, which provides a fast, no-code way to move data into the Bronze Lakehouse without applying any transformations. It preserves the structure and contents exactly as received.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq69qodps3ue851x62tu5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq69qodps3ue851x62tu5.png" width="800" height="360"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Silver Layer&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The Silver layer is where we start to prepare the data for actual use.&lt;/p&gt;

&lt;p&gt;A Spark Notebook reads from the Bronze layer and performs several operations:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Cleansing and normalizing data&lt;/li&gt;
&lt;li&gt;Seeding initial structures&lt;/li&gt;
&lt;li&gt;Applying business transformations&lt;/li&gt;
&lt;li&gt;Creating new tables and adding derived columns&lt;/li&gt;
&lt;li&gt;Mapping values using reference tables (like MasterDataMapping)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For example, when processing &lt;strong&gt;Stores&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A new table is created with cleaned schema&lt;/li&gt;
&lt;li&gt;Geolocation values are extracted into latitude and longitude&lt;/li&gt;
&lt;li&gt;Store-to-customer relationships are inferred and added&lt;/li&gt;
&lt;/ul&gt;
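&lt;p&gt;The geolocation step can be sketched as follows; the combined "lat, long" source format assumed here is illustrative:&lt;/p&gt;

```python
# Split a combined geolocation string into separate latitude/longitude columns.
# The "lat, long" input format is an assumption for illustration.
def split_geolocation(value):
    lat, lon = value.split(",")
    return float(lat.strip()), float(lon.strip())

stores = [{"name": "Store 1", "geolocation": "47.6062, -122.3321"}]
for store in stores:
    store["latitude"], store["longitude"] = split_geolocation(store["geolocation"])
```

&lt;p&gt;In the actual notebook this runs as a Spark column expression rather than row-by-row Python.&lt;/p&gt;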

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe33pd5pl8k97hwkcc79t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe33pd5pl8k97hwkcc79t.png" width="800" height="556"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Another example is Account seeding, where incoming Salesforce and Dynamics 365 IDs are mapped using the MasterDataMapping table, aligning cross-platform references for future synchronization logic.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2och73ermeoh8zqjmvgs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2och73ermeoh8zqjmvgs.png" width="800" height="364"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After this layer, data is structured and aligned with OmniSync’s data model, but not yet real-time.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Gold Layer&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The Gold layer is where real-time streaming meets analytics.&lt;/p&gt;

&lt;p&gt;Initially, this layer is populated via a notebook that copies cleaned tables from the Silver layer. But more importantly, Gold becomes the landing zone for live CDC events, handled by streaming Spark notebooks that process changes from Salesforce and Dynamics 365 in near real-time.&lt;/p&gt;
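&lt;p&gt;Conceptually, each CDC event is upserted into a Gold table by key. A plain-Python sketch of that merge semantics (the actual pipeline does this with Spark Structured Streaming and Delta merges; record shapes and IDs are illustrative):&lt;/p&gt;

```python
# In-memory sketch of upsert-by-key, the semantics applied to Gold tables.
gold_accounts = {"001A": {"id": "001A", "name": "Acme", "phone": None}}

def apply_cdc_event(table, event):
    """Insert the row if new, otherwise update only the changed fields."""
    row = table.setdefault(event["id"], {"id": event["id"]})
    row.update(event["changes"])

apply_cdc_event(gold_accounts, {"id": "001A", "changes": {"phone": "555-0100"}})
apply_cdc_event(gold_accounts, {"id": "001B", "changes": {"name": "Globex"}})
```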

&lt;p&gt;This layer includes the final &lt;strong&gt;Sales fact table&lt;/strong&gt;, which joins and flattens multiple sources and becomes the main source for Power BI reporting.&lt;/p&gt;

&lt;p&gt;In OmniSync, the Gold layer is the main data layer. It combines past data with new updates, making the system simple, scalable, and ready for analysis.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvmmz0xlrda2tlk1o65ms.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvmmz0xlrda2tlk1o65ms.png" width="800" height="325"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Star Schema Model&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The final Gold layer in OmniSync follows a Star Schema, a widely used and performant model for analytics tools like &lt;strong&gt;Power BI.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This wasn’t designed from scratch. The starting point was a legacy SQL Server Data Warehouse, which already had well-defined fact and dimension tables. For the PoC, we simplified that model and focused on what mattered most: clean reporting, easier maintenance, and compatibility with real-time updates.&lt;/p&gt;

&lt;p&gt;At the center of the model is the &lt;strong&gt;SalesOrder fact table&lt;/strong&gt;, adapted from the original FactOnlineSales fact table in Contoso’s warehouse. It is materialized into another fact table, &lt;strong&gt;Sales&lt;/strong&gt;, which becomes the main table consumed by Power BI and the foundation for all reporting.&lt;/p&gt;

&lt;p&gt;Surrounding it are simplified &lt;strong&gt;dimension tables&lt;/strong&gt;, created and enriched through Spark Notebooks:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Products&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Product (Sub)Categories&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Stores&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Customers (Accounts)&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Geography&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Date&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Currency&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each of these dimensions is either loaded during the initial data load or kept fresh through real-time CDC synchronization from Salesforce and Dynamics 365.&lt;/p&gt;

&lt;p&gt;In this model:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Fact tables hold the &lt;strong&gt;transactional data&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Dimensions describe the &lt;strong&gt;context&lt;/strong&gt; (e.g., which product, store, customer, or region)&lt;/li&gt;
&lt;li&gt;Relationships are built using &lt;strong&gt;foreign keys&lt;/strong&gt; , enabling &lt;strong&gt;fast filtering and slicing&lt;/strong&gt; in Power BI&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The semantic layer was built using:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Spark Notebooks&lt;/strong&gt; to clean and prepare dimensions&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Manual modeling in Power BI&lt;/strong&gt; to define relationships, hierarchies, and joins&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;DAX measures&lt;/strong&gt; to define key metrics and calculated columns&lt;/li&gt;
&lt;/ul&gt;
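&lt;p&gt;As an example, DAX measures along these lines (column names are illustrative) back KPIs such as Net Profit used in the reports:&lt;/p&gt;

```
Total Sales = SUM ( Sales[SalesAmount] )
Net Profit = SUM ( Sales[SalesAmount] ) - SUM ( Sales[TotalCost] )
Profit Margin % = DIVIDE ( [Net Profit], [Total Sales] )
```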

&lt;p&gt;This setup offers:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Fast cross-filtering&lt;/strong&gt; across dimensions&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reusable metrics&lt;/strong&gt; that stay consistent across reports&lt;/li&gt;
&lt;li&gt;A clean separation between &lt;strong&gt;transformation logic&lt;/strong&gt; and &lt;strong&gt;report logic&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;By applying a classic Star Schema, we keep the Power BI model both performant and intuitive, even with real-time data flowing in from multiple platforms.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7sqjtmf4gpa0oqa7o5ea.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7sqjtmf4gpa0oqa7o5ea.jpeg" width="800" height="387"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Power BI&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Power BI serves as the primary reporting tool for OmniSync, running directly on top of the Gold Lakehouse tables.&lt;br&gt;&lt;br&gt;
 Thanks to Microsoft Fabric’s native integration with Power BI, reporting is seamless, with no complex external connections needed.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Reports connect to the Gold Lakehouse tables using &lt;strong&gt;Direct Lake&lt;/strong&gt; mode for high performance with cached data&lt;/li&gt;
&lt;li&gt;Data model is shaped using the &lt;strong&gt;Star Schema&lt;/strong&gt; built in Fabric&lt;/li&gt;
&lt;li&gt;DAX is used to define &lt;strong&gt;measures&lt;/strong&gt; and  &lt;strong&gt;KPIs&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Reports were modeled and created in &lt;strong&gt;Power BI Desktop&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h4&gt;
  
  
  &lt;strong&gt;Implemented Reports&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Several Power BI reports were created to visualize the data synced from Salesforce and Dynamics 365 into the new Lakehouse, built directly on top of the Gold layer in Fabric.&lt;/p&gt;
&lt;h4&gt;
  
  
  &lt;strong&gt;Key Reports Included&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;📈Sales Report&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
Displays total sales volume by region, stores, product categories, dates, COGS, Net Profit, % Change in Sales amongst others.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzoqpv7d2qmqq8rimvh55.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzoqpv7d2qmqq8rimvh55.png" width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;📈 Sales By Country&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
Displays total sales volume and last-year (LY) comparison by selectable continent, country, brand, class, and product category, among others.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl1ye4eozey9lfkl41kqw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl1ye4eozey9lfkl41kqw.png" width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;📈 Sales By Store&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
Displays total sales by store, last year's sales by month, product categories, and the geography of the stores.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsmtg09swdx9jbu3difvr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsmtg09swdx9jbu3difvr.png" width="800" height="442"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;📈 Margin Analysis&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
Shows profit analysis, net profit comparisons, and margins by month, city, and product category.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb7hyc64j9qov6hqwptyv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb7hyc64j9qov6hqwptyv.png" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;📈 Sales By Product&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
Displays total sales by product with last-year comparison by country, brand, class, manufacturer, color, and category, among others.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftlhldsbd0jcl3yfcm91h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftlhldsbd0jcl3yfcm91h.png" width="800" height="447"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Each report is built with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Slicers&lt;/strong&gt; for key dimensions (time, region)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;DAX measures&lt;/strong&gt; for dynamic KPIs (e.g., % Net Profit, COGS)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Visual filtering&lt;/strong&gt; via field parameters&lt;/li&gt;
&lt;/ul&gt;
&lt;h4&gt;
  
  
  &lt;strong&gt;Semantic Model &amp;amp; DAX Measures&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;On top of the Gold Lakehouse tables, OmniSync builds a Fabric semantic model that defines relationships and key performance metrics using &lt;strong&gt;DAX&lt;/strong&gt;, giving Power BI reports a clean, efficient layer that business stakeholders can consume.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;Data Analysis Expressions (DAX)&lt;/em&gt;&lt;/strong&gt; &lt;em&gt;is the formula language used in Power BI to define calculations, aggregations, and business logic across tables and models.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The semantic model includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Relationships&lt;/strong&gt; between fact and dimension tables are defined, following the &lt;strong&gt;Star Schema&lt;/strong&gt; built inside Fabric.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Calculated columns&lt;/strong&gt; are used for derived attributes, such as Store Type or Category Group.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Hierarchies&lt;/strong&gt; are added for drilldowns, like navigating from Country → Region → City in Geography analysis.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Business calculations&lt;/strong&gt; and KPIs are created using &lt;strong&gt;DAX&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Several measures were created for the different reports. Some are relatively complex, like the one below, which calculates Total Sales by store up to the selected date.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;Store&lt;/span&gt; &lt;span class="n"&gt;Sales&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Selected&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; 
&lt;span class="n"&gt;CALCULATE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt; 
 &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;Total&lt;/span&gt; &lt;span class="n"&gt;Sales&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;,&lt;/span&gt; 
 &lt;span class="n"&gt;FILTER&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;ALLSELECTED&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'Date'&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;DateKey&lt;/span&gt;&lt;span class="p"&gt;]),&lt;/span&gt;
        &lt;span class="s1"&gt;'Date'&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;DateKey&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;=&lt;/span&gt; &lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'Date'&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;DateKey&lt;/span&gt;&lt;span class="p"&gt;])),&lt;/span&gt;
    &lt;span class="n"&gt;GROUPBY&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt; &lt;span class="n"&gt;Store&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Store&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;StoreName&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Others are simple, like Total Sales, which is just the sum of the Sales column.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;Total&lt;/span&gt; &lt;span class="n"&gt;Sales&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Sales&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;Sales&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;Real-Time Hub: Streaming Data into Fabric&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;After the initial batch load, OmniSync shifts entirely into near real-time data ingestion, powered by Azure services and Spark Streaming inside Microsoft Fabric.&lt;/p&gt;

&lt;p&gt;At the core of this live data flow is Fabric’s Real-Time Hub, a specialized engine that enables continuous event ingestion directly into Lakehouse tables without requiring heavy batch processing.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;How the Streaming Happens&lt;/strong&gt;
&lt;/h4&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Salesforce&lt;/strong&gt; and &lt;strong&gt;Dynamics 365&lt;/strong&gt; emit &lt;strong&gt;change events&lt;/strong&gt; (via CDC streams or webhooks).&lt;/li&gt;
&lt;li&gt;Events are routed through a &lt;strong&gt;middleware layer&lt;/strong&gt; involving &lt;strong&gt;EventGrid&lt;/strong&gt;, &lt;strong&gt;Logic Apps&lt;/strong&gt;, and &lt;strong&gt;Azure Event Hub&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;A &lt;strong&gt;Fabric Spark Streaming Notebook&lt;/strong&gt; listens continuously to &lt;strong&gt;Event Hub&lt;/strong&gt; using the &lt;strong&gt;Real-Time Hub&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Incoming events are processed, enriched, and immediately &lt;strong&gt;written into the Gold layer&lt;/strong&gt; of the Lakehouse.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In OmniSync, the Real-Time Hub makes it possible to &lt;strong&gt;continuously push changes&lt;/strong&gt; from Salesforce and Dynamics 365 directly into Fabric with minimal infrastructure setup.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Benefits of this Setup&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Near real-time updates&lt;/strong&gt; for key business entities (Accounts, Orders, Products)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Automatic enrichment&lt;/strong&gt; (resolving relationships, IDs, lookups) during ingestion&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fresh Power BI dashboards&lt;/strong&gt; without relying on heavy batch refreshes&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Low operational overhead:&lt;/strong&gt; no manual Spark cluster management needed&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;While actual latency can vary slightly (due to retries, cold starts, or checkpoint delays), most updates flow through and appear in the Gold Lakehouse layer &lt;strong&gt;within 5–20 seconds&lt;/strong&gt; after a change is made in the source system.&lt;/p&gt;

&lt;p&gt;Thanks to Fabric’s Real-Time Hub, OmniSync achieves true near-real-time syncing, not just for analytics, but for live operational data across systems.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F34avsij0vijueb036hzj.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F34avsij0vijueb036hzj.jpeg" width="800" height="360"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;CDC Table Format&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;To make real-time ingestion manageable, each event captured from Salesforce or Dynamics 365 is normalized into a consistent &lt;strong&gt;CDC (Change Data Capture) format&lt;/strong&gt; before being processed in Fabric.&lt;/p&gt;

&lt;p&gt;Each incoming record includes fields like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Entity:&lt;/strong&gt; The type of entity (e.g., Account, Order)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Operation:&lt;/strong&gt; Type of change (insert, update, delete)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Values:&lt;/strong&gt; JSON object representing the record state&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;CreatedDate:&lt;/strong&gt; Timestamp of when the record was created&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;UpdateDate:&lt;/strong&gt; Timestamp of when the record was last updated&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This consistent schema makes it easier for Spark notebooks to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Apply insert/update/delete logic&lt;/li&gt;
&lt;li&gt;Route each entity type to the right table&lt;/li&gt;
&lt;/ul&gt;
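
&lt;p&gt;As a sketch of what this looks like in practice, the snippet below parses one such normalized event in plain Python. The field names come from the list above, but the payload values and the helper itself are hypothetical, not the actual OmniSync code.&lt;/p&gt;

```python
import json
from datetime import datetime

# Shape of one normalized CDC event (field names from the article;
# the payload values below are purely illustrative).
SAMPLE_EVENT = json.dumps({
    "Entity": "Account",
    "Operation": "update",
    "Values": json.dumps({"Name": "Contoso Ltd", "Country": "ES"}),
    "CreatedDate": "2026-03-01T10:00:00+00:00",
    "UpdateDate": "2026-03-02T12:30:00+00:00",
})

VALID_OPERATIONS = {"insert", "update", "delete"}

def parse_cdc_event(raw: str) -> dict:
    """Decode one raw CDC event and normalize it for downstream routing."""
    event = json.loads(raw)
    if event["Operation"] not in VALID_OPERATIONS:
        raise ValueError(f"Unknown operation: {event['Operation']}")
    return {
        "entity": event["Entity"],
        "operation": event["Operation"],
        # 'Values' arrives as a JSON string representing the record state
        "values": json.loads(event["Values"]),
        "created": datetime.fromisoformat(event["CreatedDate"]),
        "updated": datetime.fromisoformat(event["UpdateDate"]),
    }

parsed = parse_cdc_event(SAMPLE_EVENT)
print(parsed["entity"], parsed["operation"], parsed["values"]["Name"])
```

&lt;p&gt;Because every source system is coerced into this one shape, the routing logic downstream only needs to branch on &lt;em&gt;Entity&lt;/em&gt; and &lt;em&gt;Operation&lt;/em&gt;.&lt;/p&gt;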

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuyq4vgztnzaeq8ocbbxv.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuyq4vgztnzaeq8ocbbxv.jpeg" width="800" height="355"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Streaming Notebook for CDC Processing&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;As explained earlier, we use a &lt;strong&gt;streaming technique&lt;/strong&gt; to capture and process CDC data from external systems into Fabric. All raw CDC events are first stored in a centralized &lt;strong&gt;ExternalCDC Delta table&lt;/strong&gt; , which acts as the landing zone for real-time changes coming from &lt;strong&gt;Azure Event Hub&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;To process these changes continuously, we use &lt;strong&gt;Spark Structured Streaming&lt;/strong&gt; , and run it as a &lt;strong&gt;long-running Spark Job Definition&lt;/strong&gt; in Fabric. This keeps the notebook running in the background, monitoring for new events and processing them as they arrive in near real time.&lt;/p&gt;

&lt;p&gt;Here’s the basic Spark streaming code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;df&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;writeStream&lt;/span&gt; \
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;outputMode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;append&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; \
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;trigger&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;processingTime&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;10 seconds&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; \
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;option&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;checkpointLocation&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Files/cdc_checkpoint&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; \
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;delta&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; \
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;foreachBatch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sendToSinkTable&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; \
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;start&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; \
    &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;awaitTermination&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This stream is scheduled and executed as a &lt;strong&gt;Fabric Job Definition&lt;/strong&gt;, allowing it to run continuously and restart on failure without manual triggering. The job is rescheduled every night at 1:00 am, when nobody is working, and the retry policy retries indefinitely on failure, every 10 seconds.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzlx96j0xrxlrnpktzxo1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzlx96j0xrxlrnpktzxo1.png" width="800" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The function &lt;em&gt;sendToSinkTable&lt;/em&gt; is a custom PySpark routine that decodes each batch of CDC events and applies them to the correct Lakehouse tables.&lt;/p&gt;

&lt;p&gt;It handles all &lt;strong&gt;Create&lt;/strong&gt; , &lt;strong&gt;Update&lt;/strong&gt; , and &lt;strong&gt;Delete&lt;/strong&gt; operations by:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Parsing the raw JSON payloads into DataFrames&lt;/li&gt;
&lt;li&gt;Determining the type of operation&lt;/li&gt;
&lt;li&gt;Routing the data to the appropriate destination tables&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It also performs key logic like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Creating &lt;strong&gt;auxiliary data&lt;/strong&gt; (Dates, Geography)&lt;/li&gt;
&lt;li&gt;Generating &lt;strong&gt;identity primary keys&lt;/strong&gt; where needed&lt;/li&gt;
&lt;li&gt;Mapping relationships (e.g., Store → Account)&lt;/li&gt;
&lt;li&gt;Decoding &lt;strong&gt;epoch timestamps&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Aligning cross-system IDs (Salesforce, Dynamics, Fabric)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Materializing sales data&lt;/strong&gt; for Power BI reporting&lt;/li&gt;
&lt;/ul&gt;
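
&lt;p&gt;A heavily simplified, pure-Python sketch of the routing logic described above (the real &lt;em&gt;sendToSinkTable&lt;/em&gt; operates on Spark DataFrames and Delta tables; the table names and batch contents here are hypothetical):&lt;/p&gt;

```python
# In-memory stand-ins for the Lakehouse destination tables (hypothetical).
tables = {"Account": {}, "Order": {}}

def send_to_sink_table(batch):
    """Route a micro-batch of decoded CDC events to the right table,
    applying Create/Update/Delete logic (Delete is a soft delete)."""
    for event in batch:
        target = tables[event["Entity"]]
        record_id = event["Values"]["Id"]
        if event["Operation"] == "insert":
            target[record_id] = {**event["Values"], "IsDeleted": False}
        elif event["Operation"] == "update":
            target.setdefault(record_id, {}).update(event["Values"])
        elif event["Operation"] == "delete":
            if record_id in target:
                target[record_id]["IsDeleted"] = True  # soft delete only

batch = [
    {"Entity": "Account", "Operation": "insert",
     "Values": {"Id": "A1", "Name": "Contoso"}},
    {"Entity": "Account", "Operation": "update",
     "Values": {"Id": "A1", "Name": "Contoso Ltd"}},
    {"Entity": "Order", "Operation": "insert",
     "Values": {"Id": "O1", "Amount": 120.0}},
    {"Entity": "Order", "Operation": "delete",
     "Values": {"Id": "O1"}},
]
send_to_sink_table(batch)
```

&lt;p&gt;The enrichment steps listed above (auxiliary data, identity keys, ID alignment) would slot in between decoding the event and writing to the target table.&lt;/p&gt;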

&lt;p&gt;Below we can see an image of the Sales materialization PySpark notebook code:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fighe0sike6526wb8uear.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fighe0sike6526wb8uear.jpeg" width="800" height="355"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Geolocation Mapping via GeoApify&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;To enrich location-based entities like Stores and Accounts, OmniSync integrates with the &lt;strong&gt;GeoApify API&lt;/strong&gt; to resolve geographic coordinates and standardize location data within the &lt;strong&gt;Geography dimension&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;This enrichment is handled by a dedicated Geography CDC Spark notebook, which is triggered as part of the real-time synchronization flow. The notebook analyzes each CDC event and determines the best way to enrich location fields based on the input data.&lt;/p&gt;

&lt;p&gt;Depending on the payload:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If &lt;strong&gt;latitude and longitude&lt;/strong&gt; are present, it performs &lt;strong&gt;reverse geocoding&lt;/strong&gt; using GeoApify to extract: street, city, postal code, region and country&lt;/li&gt;
&lt;li&gt;If coordinates are &lt;strong&gt;not available&lt;/strong&gt; , the notebook uses available address components (street, city, country, etc.) to perform &lt;strong&gt;forward geocoding&lt;/strong&gt; , resolving coordinates instead.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The notebook then merges the enriched fields (like resolved_country, lat, lon) back into the corresponding entity record before writing to the Delta table.&lt;/p&gt;

&lt;p&gt;GeoApify Provides:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Address normalization and cleaning&lt;/li&gt;
&lt;li&gt;Reverse geocoding (latitude/longitude → full address info)&lt;/li&gt;
&lt;li&gt;Forward geocoding (partial address → coordinates)&lt;/li&gt;
&lt;li&gt;A consistent &lt;strong&gt;geographic hierarchy&lt;/strong&gt; : Country &amp;gt; Region &amp;gt; City &amp;gt; Postal Code &amp;gt; Street&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The API is called directly from within Spark using an HTTP client, and the GeoApify API key is stored securely in Azure Key Vault, injected into the notebook environment at runtime.&lt;/p&gt;
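
&lt;p&gt;The forward/reverse decision can be sketched as follows. The endpoint paths match GeoApify's public geocoding API, but the record fields and helper function are illustrative assumptions, not the actual notebook code:&lt;/p&gt;

```python
from urllib.parse import urlencode

GEOAPIFY_BASE = "https://api.geoapify.com/v1/geocode"

def build_geocode_url(record: dict, api_key: str) -> str:
    """Choose reverse geocoding when coordinates exist,
    forward geocoding otherwise (per the logic described above)."""
    if record.get("lat") is not None and record.get("lon") is not None:
        # Reverse: coordinates -> full address info
        query = urlencode({"lat": record["lat"], "lon": record["lon"],
                           "apiKey": api_key})
        return f"{GEOAPIFY_BASE}/reverse?{query}"
    # Forward: partial address components -> coordinates
    parts = [record.get(k) for k in ("street", "city", "country")]
    text = ", ".join(p for p in parts if p)
    query = urlencode({"text": text, "apiKey": api_key})
    return f"{GEOAPIFY_BASE}/search?{query}"

print(build_geocode_url({"lat": 40.4168, "lon": -3.7038}, "KEY"))
print(build_geocode_url({"city": "Madrid", "country": "Spain"}, "KEY"))
```

&lt;p&gt;In the real notebook the key would come from Azure Key Vault rather than being passed in as a literal.&lt;/p&gt;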

&lt;p&gt;This enrichment is especially useful for region-based reporting in Power BI and for filtering and aggregating KPIs by geography, ensuring accurate and consistent location data across the entire reporting layer.&lt;/p&gt;

&lt;p&gt;Below we can see an image of the PySpark notebook code of this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0r2hyc85jn3b6i3buaw6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0r2hyc85jn3b6i3buaw6.png" width="800" height="647"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Master Data Mapping&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;As mentioned earlier, OmniSync uses a dedicated table called &lt;strong&gt;MasterDataMapping&lt;/strong&gt; to maintain consistent references between entities across different systems.&lt;/p&gt;

&lt;p&gt;This table plays a critical role in aligning data between &lt;strong&gt;Fabric&lt;/strong&gt; , &lt;strong&gt;Salesforce&lt;/strong&gt; , &lt;strong&gt;Dynamics 365&lt;/strong&gt; , and potentially &lt;strong&gt;SAP&lt;/strong&gt; (in future phases). It supports mapping for key business entities like &lt;strong&gt;Customers, Products, Stores, SalesOrders, and Currencies&lt;/strong&gt; , ensuring that records from different platforms can be reliably matched.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🔧 Table Structure:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;MasterDataMapping:&lt;/strong&gt; Identity-based primary key (auto-incremented)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;FabricId:&lt;/strong&gt; Internal identifier used within Fabric&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SalesForceId:&lt;/strong&gt; External key referencing the Salesforce ID&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;D365Id:&lt;/strong&gt; External key referencing the Dynamics 365 ID&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SAPId:&lt;/strong&gt; Reserved for future SAP integration&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Entity:&lt;/strong&gt; The type of entity (e.g., Currency, Customer, Product, SalesOrder, Store)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Name:&lt;/strong&gt; Natural key or label to help identify the external entity&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;IsDeleted:&lt;/strong&gt; Soft-delete flag&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;CreatedDate:&lt;/strong&gt; When the record was created&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;UpdateDate:&lt;/strong&gt; When the record was last modified&lt;/li&gt;
&lt;/ul&gt;
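
&lt;p&gt;To illustrate how this mapping table is used during CDC processing, here is a hypothetical lookup helper in plain Python; the real implementation queries the Delta table from PySpark, and the sample rows below are made up:&lt;/p&gt;

```python
# Simplified in-memory stand-in for the MasterDataMapping table
# (columns from the list above; the rows are hypothetical samples).
master_data_mapping = [
    {"MasterDataMapping": 1, "FabricId": "F-100", "SalesForceId": "001A",
     "D365Id": "acc-9", "SAPId": None, "Entity": "Customer",
     "Name": "Contoso", "IsDeleted": False},
    {"MasterDataMapping": 2, "FabricId": "F-200", "SalesForceId": "P-77",
     "D365Id": "prod-3", "SAPId": None, "Entity": "Product",
     "Name": "Widget", "IsDeleted": False},
]

def resolve_fabric_id(entity: str, source: str, external_id: str):
    """Translate an external system ID into the internal FabricId."""
    key = {"salesforce": "SalesForceId", "d365": "D365Id", "sap": "SAPId"}[source]
    for row in master_data_mapping:
        if row["Entity"] == entity and row[key] == external_id and not row["IsDeleted"]:
            return row["FabricId"]
    return None  # unmapped: the CDC handler would create a new mapping row
```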

&lt;h4&gt;
  
  
  &lt;strong&gt;Previous Synced Systems&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;This table is pre-seeded with mapping values during the Silver layer processing, as explained earlier. These seed values allow the system to function properly from the start by:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Establishing relationships across systems for historical data&lt;/li&gt;
&lt;li&gt;Preventing sync conflicts&lt;/li&gt;
&lt;li&gt;Enabling lookups during real-time CDC operations&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This table is essential for ensuring that all references remain aligned, especially when syncing multiple systems that don’t share a common key structure.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3xiqg4vr3jllxm882z1i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3xiqg4vr3jllxm882z1i.png" width="800" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;CRUD&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Supported operations in OmniSync are &lt;strong&gt;Create&lt;/strong&gt;, &lt;strong&gt;Update&lt;/strong&gt;, and &lt;strong&gt;Delete&lt;/strong&gt; (Read is not needed for syncing). While &lt;strong&gt;UnDelete&lt;/strong&gt; was also explored, mainly because Salesforce supports it, it was ultimately left out due to its complexity and the following reasons:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Dynamics 365 does have an UnDelete feature, but it’s still in Preview.&lt;/li&gt;
&lt;li&gt;SAP, which we may support later, doesn’t offer this capability at all.&lt;/li&gt;
&lt;li&gt;Dynamics 365 also requires a non-trial environment to fully use it.&lt;/li&gt;
&lt;li&gt;Recovering synced deleted entities in SAP and Dynamics 365 was not feasible in an easy way.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So, to avoid overcomplicating the logic, ensure broader compatibility, and keep the system easy to test, the decision was not to implement UnDelete in OmniSync.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Salesforce Deletes&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Even without UnDelete, Salesforce handles deletions as &lt;strong&gt;soft deletes&lt;/strong&gt;. To align with that behavior, OmniSync implements a soft delete strategy across all entities.&lt;/p&gt;

&lt;p&gt;Every table includes an &lt;strong&gt;IsDeleted&lt;/strong&gt; column, and delete operations only mark the record as deleted, rather than removing it.&lt;/p&gt;

&lt;p&gt;This decision leaves the door open to reintroduce the UnDelete logic in the future, if needed.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Idempotency&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;One critical part of the design is idempotency. In a distributed, event-driven system, “&lt;strong&gt;exactly once&lt;/strong&gt;” delivery can’t be guaranteed, due to retries, delays, or duplicated events, among other causes.&lt;/p&gt;

&lt;p&gt;To handle this, OmniSync enforces idempotent logic during CDC processing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If a record already exists, it won’t be created again&lt;/li&gt;
&lt;li&gt;Deletes won’t be applied multiple times&lt;/li&gt;
&lt;li&gt;Updates are allowed if the incoming change is newer&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is all handled in PySpark notebooks, since working with Lakehouse Delta tables means we don’t have traditional database guarantees like primary keys, unique constraints, or foreign keys. Those must be managed at the application logic level.&lt;/p&gt;

&lt;p&gt;While not bulletproof, this approach is solid enough for near real-time sync scenarios and avoids common pitfalls in distributed data processing.&lt;/p&gt;
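
&lt;p&gt;The three idempotency rules above can be sketched as a small apply function, shown here in plain Python under the assumption of a simple keyed table (in OmniSync the equivalent checks run in PySpark against the Delta tables):&lt;/p&gt;

```python
from datetime import datetime

def apply_change(table: dict, event: dict) -> bool:
    """Idempotently apply one CDC event; returns True if state changed."""
    rid = event["id"]
    ts = datetime.fromisoformat(event["ts"])
    row = table.get(rid)
    if event["op"] == "insert":
        if row is not None:
            return False  # already exists: don't create again
        table[rid] = {"values": dict(event["values"]), "ts": ts, "deleted": False}
        return True
    if event["op"] == "delete":
        if row is None or row["deleted"]:
            return False  # delete is not applied multiple times
        row["deleted"] = True
        return True
    if event["op"] == "update":
        if row is None or ts <= row["ts"]:
            return False  # only apply if the incoming change is newer
        row["values"].update(event["values"])
        row["ts"] = ts
        return True
    raise ValueError(event["op"])
```

&lt;p&gt;Replaying the same event a second time is then always a no-op, which is exactly the property needed when Event Hub redelivers messages.&lt;/p&gt;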

&lt;h3&gt;
  
  
  &lt;strong&gt;Monitoring in Microsoft Fabric&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Microsoft Fabric offers a comprehensive suite of monitoring tools to help track performance, resource usage, and activity across your entire data ecosystem, from ingestion flows to Power BI and Spark jobs.&lt;/p&gt;

&lt;p&gt;These tools are especially important in scenarios like OmniSync, where streaming jobs, resource capacity, and real-time sync health must be monitored closely.&lt;/p&gt;

&lt;p&gt;We will now detail those used in Fabric for OmniSync.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Monitor Hub&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Purpose&lt;/strong&gt;: Centralized activity tracking across all workspaces&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Provides a real-time view of active tasks, such as &lt;strong&gt;pipelines&lt;/strong&gt;, &lt;strong&gt;dataflows&lt;/strong&gt;, and &lt;strong&gt;notebooks&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Fabric users can see activities for items they have access to, which is useful for quick status checks&lt;/li&gt;
&lt;li&gt;Great for monitoring job completion, failure states, or troubleshooting long-running tasks&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsqj776dwknmwi475qp9u.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsqj776dwknmwi475qp9u.jpeg" width="800" height="358"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Workspace Monitoring&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Purpose&lt;/strong&gt;: In-depth logging and metric collection within a specific workspace&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Collects logs and performance data for all workspace-level items&lt;/li&gt;
&lt;li&gt;Helps with &lt;strong&gt;troubleshooting&lt;/strong&gt; , &lt;strong&gt;performance tuning&lt;/strong&gt; , and &lt;strong&gt;resource optimization&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Useful for capacity planning and identifying usage patterns over time&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh6mpwtit1z7ap7mp6twy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh6mpwtit1z7ap7mp6twy.png" width="800" height="407"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Microsoft Fabric Capacity Metrics App&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Purpose&lt;/strong&gt;: Enables monitoring of compute and storage usage for Microsoft Fabric and Power BI Premium capacities.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Tracks Capacity Unit (CU) consumption, offering insights into compute resource usage.&lt;/li&gt;
&lt;li&gt;Provides storage usage metrics, including OneLake and tenant-level storage.&lt;/li&gt;
&lt;li&gt;Identifies peak demand, usage patterns, and potential throttling issues to optimize performance.&lt;/li&gt;
&lt;li&gt;Helps admins decide on resource scaling or enabling autoscale for efficient capacity management.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fryc3wtrhdci6ukqupli6.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fryc3wtrhdci6ukqupli6.jpeg" width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Lakehouse &amp;amp; Spark Monitoring (with Log Analytics)&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Purpose&lt;/strong&gt;: Track activity, performance, and health of Spark notebooks and streaming jobs running inside Fabric Lakehouses.&lt;/p&gt;

&lt;p&gt;Because OmniSync relies heavily on &lt;strong&gt;Spark Structured Streaming&lt;/strong&gt; to ingest CDC events, it’s essential to monitor Spark jobs at a granular level, beyond just workspace or capacity views.&lt;/p&gt;

&lt;p&gt;Fabric provides two main ways to monitor Spark activity:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🔎 Native Spark Monitoring&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Track Spark session start/end times&lt;/li&gt;
&lt;li&gt;Monitor executor memory and job durations&lt;/li&gt;
&lt;li&gt;Inspect checkpointing behavior for streaming pipelines&lt;/li&gt;
&lt;li&gt;Detect job failures, retries, and streaming lags&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This basic monitoring is available through the Fabric UI, but for deeper observability, integration with &lt;strong&gt;Azure Log Analytics&lt;/strong&gt; is recommended.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzvcwk0zyirc5c0b48292.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzvcwk0zyirc5c0b48292.png" width="800" height="365"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🛠️ Log Analytics Integration for Spark Logs&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For proper, centralized log monitoring, &lt;strong&gt;Azure Log Analytics&lt;/strong&gt; is connected to Fabric through the &lt;strong&gt;Environment configuration&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;This setup allows you to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Capture &lt;strong&gt;custom logs&lt;/strong&gt; and &lt;strong&gt;metrics&lt;/strong&gt; directly from Spark notebooks&lt;/li&gt;
&lt;li&gt;Retain detailed execution history&lt;/li&gt;
&lt;li&gt;Run advanced queries to analyze job behavior, memory usage, and exceptions&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To enable this, create a &lt;strong&gt;Fabric Environment&lt;/strong&gt; linked to the OmniSync workspace and configure it with a Log Analytics workspace ID and key. Spark jobs and notebooks can then be attached to run inside that Environment.&lt;/p&gt;
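
&lt;p&gt;As a rough sketch (the logger and field names below are illustrative, not taken from the OmniSync code), a notebook can emit structured log lines through standard Python logging, which the configured diagnostic emitter then forwards to Log Analytics:&lt;/p&gt;

```python
import logging

# Illustrative sketch: once the Fabric Environment diagnostic emitter is
# configured, driver-side log records like these are forwarded to Log
# Analytics (e.g. into a SparkLoggingEvent_CL table). Names are placeholders.
logger = logging.getLogger("omnisync.cdc_ingest")
logger.setLevel(logging.INFO)

def log_batch_metrics(batch_id: int, row_count: int) -> str:
    """Emit one structured log line per streaming micro-batch."""
    message = f"batch_id={batch_id} rows={row_count} status=ok"
    logger.info(message)
    return message
```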

&lt;p&gt;More detailed instructions are provided here:&lt;br&gt;&lt;br&gt;
 &lt;a href="https://learn.microsoft.com/en-us/fabric/data-engineering/azure-fabric-diagnostic-emitters-log-analytics" rel="noopener noreferrer"&gt;🔗 Configure Fabric Diagnostic Emitters for Log Analytics&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9371ygqifvoa8kokiu8w.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9371ygqifvoa8kokiu8w.jpeg" width="800" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once configured, you can query Spark execution logs in Log Analytics using Kusto Query Language (KQL).&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;Kusto Query Language (KQL)&lt;/em&gt;&lt;/strong&gt; &lt;em&gt;is a powerful query language used to analyze large volumes of structured, semi-structured, and unstructured data in Microsoft Azure services such as Azure Data Explorer, Azure Monitor, and Microsoft Fabric.&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;


&lt;/blockquote&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;SparkListenerEvent_CL&lt;/span&gt;
&lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="k"&gt;where&lt;/span&gt; &lt;span class="n"&gt;fabricWorkspaceId_g&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="nv"&gt;"{FabricWorkspaceId}"&lt;/span&gt; 
  &lt;span class="k"&gt;and&lt;/span&gt; &lt;span class="n"&gt;artifactId_g&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="nv"&gt;"{ArtifactId}"&lt;/span&gt; 
  &lt;span class="k"&gt;and&lt;/span&gt; &lt;span class="n"&gt;fabricLivyId_g&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="nv"&gt;"{LivyId}"&lt;/span&gt;
&lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="k"&gt;order&lt;/span&gt; &lt;span class="k"&gt;by&lt;/span&gt; &lt;span class="n"&gt;TimeGenerated&lt;/span&gt; &lt;span class="k"&gt;desc&lt;/span&gt;
&lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="k"&gt;limit&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This query lets you monitor:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;When jobs start and finish&lt;/li&gt;
&lt;li&gt;Memory and execution stats per notebook run&lt;/li&gt;
&lt;li&gt;Errors, retries, and warnings inside Spark pipelines&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This level of deep monitoring is essential for projects like OmniSync, where continuous ingestion reliability and performance are critical.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh10a1mnelcy1hozvat9b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh10a1mnelcy1hozvat9b.png" width="800" height="396"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;GraphQL Monitoring API&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Purpose&lt;/strong&gt; : Monitors GraphQL API activity, providing metrics on request rates, latency, and errors for performance optimization.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Dashboard &amp;amp; Logs: Features a visual dashboard and stores detailed logs in Kusto tables, retained for 30 days.&lt;/li&gt;
&lt;li&gt;Troubleshooting: Helps identify slow queries and errors; it requires workspace monitoring to be enabled and incurs storage costs.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffdh0otclutogw2veloou.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffdh0otclutogw2veloou.png" width="800" height="510"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Security&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Microsoft Fabric uses a &lt;strong&gt;three-level security model&lt;/strong&gt; , evaluated sequentially when users access data:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Microsoft Entra ID authentication&lt;/strong&gt; : Verifies that the user can authenticate against Microsoft Entra ID (formerly Azure Active Directory).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fabric access&lt;/strong&gt; : Checks whether the user has access rights to Fabric itself.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data security&lt;/strong&gt; : Controls what actions the user can perform on specific tables, files, or reports.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;At the &lt;strong&gt;data security level&lt;/strong&gt; , Fabric offers several access controls:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Workspace roles&lt;/strong&gt; (Admin, Contributor, Viewer)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Item-level permissions&lt;/strong&gt; (datasets, notebooks, reports)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Compute and granular permissions&lt;/strong&gt; (control over Spark, SQL resources)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OneLake data access controls&lt;/strong&gt; (still in preview)&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;How Security Was Handled in OmniSync&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Since OmniSync was a PoC developed by a &lt;strong&gt;single user&lt;/strong&gt;, security was intentionally kept very &lt;strong&gt;simple&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Only the Admin role was used across the Fabric workspace.&lt;/li&gt;
&lt;li&gt;Item-level permissions and granular compute permissions were not configured.&lt;/li&gt;
&lt;li&gt;OneLake access controls were not enabled.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This allowed faster development without worrying about permission conflicts.&lt;/p&gt;

&lt;p&gt;In a production environment, OmniSync would require full role separation, proper item-level security, compute-level permissions, and centralized identity management via Entra ID.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Client App Permissions&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;For external integrations (like the SPA dashboard or Power BI REST calls), a &lt;strong&gt;Microsoft Entra App Registration&lt;/strong&gt; was created with only:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;“Power BI Service” permission&lt;/li&gt;
&lt;li&gt;“Run Queries and Mutations” (for GraphQL-based access)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This limits the app to read-only query access and prevents modification of workspace artifacts.&lt;/p&gt;
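
&lt;p&gt;To illustrate the sign-in side of this (a minimal stdlib-only sketch with placeholder IDs; in the real SPA the MSAL client library builds and handles this flow), the OAuth 2.0 authorization-code flow boils down to redirecting the user to a URL like the one constructed here:&lt;/p&gt;

```python
from urllib.parse import urlencode

# Illustrative sketch with placeholder tenant/client IDs; MSAL normally
# assembles this authorization URL for you.
POWER_BI_SCOPE = "https://analysis.windows.net/powerbi/api/.default"

def build_authorize_url(tenant_id: str, client_id: str, redirect_uri: str) -> str:
    """Construct the OAuth 2.0 authorization-code URL the SPA redirects to."""
    params = urlencode({
        "client_id": client_id,
        "response_type": "code",
        "redirect_uri": redirect_uri,
        "scope": POWER_BI_SCOPE,
    })
    return (
        "https://login.microsoftonline.com/"
        + tenant_id
        + "/oauth2/v2.0/authorize?"
        + params
    )
```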

&lt;h4&gt;
  
  
  Key Vault Integration
&lt;/h4&gt;

&lt;p&gt;Key Vault was used to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Securely store &lt;strong&gt;sensitive tokens&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Inject secrets dynamically into Spark notebooks or environment variables at runtime&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In OmniSync, GeoApify external API keys were stored securely in Key Vault and pulled when needed by streaming notebooks.&lt;/p&gt;
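
&lt;p&gt;Inside a Fabric notebook the secret itself is fetched with the built-in Key Vault utilities; the caching pattern wrapped around that fetch can be sketched in plain Python (the fetch function is injected here so the sketch runs anywhere, and the secret name is illustrative):&lt;/p&gt;

```python
from typing import Callable, Dict

# Illustrative sketch: in the notebook, `fetch` would be the Key Vault
# lookup (e.g. retrieving the GeoApify API key); it is injected here so
# the pattern is runnable outside Fabric.
class SecretCache:
    """Fetch each secret once and reuse it across streaming micro-batches."""

    def __init__(self, fetch: Callable[[str], str]) -> None:
        self._fetch = fetch
        self._cache: Dict[str, str] = {}

    def get(self, name: str) -> str:
        if name not in self._cache:
            self._cache[name] = self._fetch(name)
        return self._cache[name]
```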

&lt;h4&gt;
  
  
  &lt;strong&gt;SPA Client App: GraphQL and Embedded Power BI&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;OmniSync includes a lightweight &lt;strong&gt;Single Page Application (SPA)&lt;/strong&gt; designed to provide a flexible and user-friendly way to interact with the system’s data.&lt;/p&gt;

&lt;p&gt;The SPA serves two main purposes:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🔎 Querying Fabric Lakehouse via GraphQL&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The app connects to Microsoft Fabric’s &lt;strong&gt;GraphQL endpoint&lt;/strong&gt; to directly query real-time data from the Lakehouse.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The SPA can run GraphQL queries against views and tables.&lt;/li&gt;
&lt;li&gt;This allows lightweight, low-latency access to live data without always relying on Power BI dashboards for more complex needs.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Example GraphQL query:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight graphql"&gt;&lt;code&gt;&lt;span class="k"&gt;query&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="n"&gt;lakehouseTable&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;tableName&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"salesorders"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;limit&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="n"&gt;OrderId&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="n"&gt;TotalAmount&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="n"&gt;StoreName&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
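
&lt;p&gt;On the wire, the SPA posts such a query as a small JSON body; a helper like the following (an illustrative sketch, not the actual SPA code) builds that payload:&lt;/p&gt;

```python
import json

# Illustrative helper: GraphQL endpoints accept a standard JSON body
# with "query" and "variables" keys.
def build_graphql_payload(query: str, variables: dict = None) -> str:
    """Serialize a GraphQL request body for an HTTP POST."""
    return json.dumps({"query": query, "variables": variables or {}})
```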



&lt;p&gt;&lt;strong&gt;Publishing Embedded Power BI Reports&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Besides querying raw data, the SPA also embeds full Power BI reports.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;This lets users interact with pre-built dashboards without needing to navigate the Fabric portal directly.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Authentication is done through:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Microsoft Entra App Registration&lt;/strong&gt; using OAuth 2.0.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MSAL&lt;/strong&gt; client library&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This approach combines flexibility (GraphQL) and rich visualization (Power BI) in one unified front-end.&lt;/p&gt;

&lt;p&gt;By combining direct data access and embedded visual analytics, the SPA provides a complete real-time monitoring and reporting solution without needing to depend entirely on the Fabric UI.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;CI/CD and Deployment&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Even though OmniSync was built as a simple PoC with a single main environment, it was still an opportunity to experiment with Fabric’s Git integration and deployment pipelines, simulating how Dev-to-Prod promotion and version control would work in a real-world scenario.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Git Integration in Fabric&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Fabric’s Git integration allows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Versioning&lt;/strong&gt; of artifacts (reports, datasets, notebooks, etc.)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Branch separation&lt;/strong&gt; between development and production workspaces&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbay536ipo414mbks24ik.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbay536ipo414mbks24ik.jpeg" width="800" height="360"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Fabric Workspace Pipelines&lt;/strong&gt;
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;A &lt;strong&gt;deployment pipeline&lt;/strong&gt; was created inside Fabric, based on the &lt;strong&gt;OmniSync workspace&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Two stages were defined:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;✔️ &lt;strong&gt;Development&lt;/strong&gt; (default workspace)&lt;/p&gt;

&lt;p&gt;✔️ &lt;strong&gt;Production&lt;/strong&gt; (new workspace connected to a different Git branch)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The pipeline was &lt;strong&gt;linked to Git integration&lt;/strong&gt; , assigning each workspace to its own branch:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;✔️ The Development workspace mapped to a &lt;strong&gt;dev Git branch&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;✔️ The Production workspace mapped to the &lt;strong&gt;main Git branch&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;However, Fabric’s promotion between stages is &lt;strong&gt;manual&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Developers push changes into the Development branch.&lt;/li&gt;
&lt;li&gt;Inside Fabric, &lt;strong&gt;you must manually press the “Deploy” button&lt;/strong&gt; to move artifacts from Development to Production stages.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A Pull Request strategy could also be implemented, but as noted, this simpler approach was chosen to keep things manageable.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbjnlobyplxgmtppj2kfd.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbjnlobyplxgmtppj2kfd.jpeg" width="800" height="220"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Lessons Learned&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Working with Microsoft Fabric for real-time ingestion and its pipelines helped build a deeper understanding of an analytics platform like Fabric. Here are the main takeaways from this phase of the project:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Lakehouse + Spark Work :&lt;/strong&gt; Getting started with &lt;strong&gt;Spark notebooks&lt;/strong&gt; and &lt;strong&gt;Delta tables&lt;/strong&gt; inside Fabric was not easy at first, especially coming from more traditional SQL environments. But once the basics were in place, the flexibility of Spark for transformation and control logic was a huge win.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Medallion Architecture:&lt;/strong&gt; Organizing data into Bronze, Silver, and Gold layers helped keep everything clearly separated, even if it felt a bit heavy for a PoC where only the initial load used the full structure and only the Gold layer stayed active. Still, the separation brought real benefits in keeping the architecture clean and organized.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Streaming in Fabric:&lt;/strong&gt; Streaming data into a Fabric Lakehouse using Spark notebooks works well, but it’s not as plug-and-play as typical streaming services. Features like CDC merge logic and idempotency all need to be implemented manually.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Power BI:&lt;/strong&gt; An excellent tool for building and working with dashboards; displaying data through DirectLake becomes very easy when the model is properly structured with a star schema, geography, and date tables. Still, DAX measures are not as comfortable or easy to learn if you are more used to SQL syntax.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Coming Next&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;📌 In the next post, I’ll dig into SalesForce Integration&lt;/p&gt;

&lt;p&gt;👀 &lt;strong&gt;Follow me&lt;/strong&gt; to catch Part 3.&lt;/p&gt;

&lt;p&gt;💻 &lt;strong&gt;Source code:&lt;/strong&gt; &lt;a href="https://github.com/zodraz/omnisync-fabric" rel="noopener noreferrer"&gt;https://github.com/zodraz/omnisync-fabric&lt;/a&gt;&lt;/p&gt;

</description>
      <category>microsoftfabric</category>
      <category>fabric</category>
      <category>ipaas</category>
      <category>integration</category>
    </item>
    <item>
      <title>OmniSync: A Real-World Architecture for Syncing Salesforce, D365 and Fabric in Near Real-Time (Part…</title>
      <dc:creator>Abel</dc:creator>
      <pubDate>Thu, 24 Apr 2025 17:22:35 +0000</pubDate>
      <link>https://dev.to/tarantarantino/omnisync-a-real-world-architecture-for-syncing-salesforce-d365-and-fabric-in-near-real-time-part-5da2</link>
      <guid>https://dev.to/tarantarantino/omnisync-a-real-world-architecture-for-syncing-salesforce-d365-and-fabric-in-near-real-time-part-5da2</guid>
      <description>&lt;h3&gt;
  
  
  OmniSync: A Real-World Architecture for Syncing Salesforce, D365 and Fabric in Near Real-Time (Part 1)
&lt;/h3&gt;

&lt;p&gt;A practical breakdown of building near real-time data sync across cloud CRMs and analytics using serverless Azure services.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frjeb50xc4djgd7p4wjj5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frjeb50xc4djgd7p4wjj5.png" width="800" height="457"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Posts in this Series
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://medium.com/@tarantarantino/omnisync-a-real-world-architecture-for-syncing-salesforce-d365-and-fabric-in-near-real-time-17bdfb29469e" rel="noopener noreferrer"&gt;&lt;strong&gt;OmniSync: A Real-World Architecture for Syncing Salesforce, D365 and Fabric in Near Real-Time (Part 1)&lt;/strong&gt;&lt;/a&gt; &lt;em&gt;(This Post)&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://medium.com/@tarantarantino/omnisync-near-real-time-lakehouse-spark-streaming-and-power-bi-in-microsoft-fabric-part-2-6f3177f0931a" rel="noopener noreferrer"&gt;OmniSync: Near Real-Time Lakehouse, Spark Streaming and Power BI in Microsoft Fabric (Part 2&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;&lt;a href="https://medium.com/@tarantarantino/omnisync-integrating-salesforce-with-microsoft-fabric-and-dynamics-365-part-3-2a96f86e94a0" rel="noopener noreferrer"&gt;OmniSync: Integrating Salesforce with Microsoft Fabric and Dynamics 365 (Part 3)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://medium.com/@tarantarantino/omnisync-dynamics-365-integration-with-salesforce-and-fabric-part-4-0ce408fe435a" rel="noopener noreferrer"&gt;&lt;em&gt;OmniSync:&lt;/em&gt; Dynamics 365 Integration with Salesforce and Fabric (Part 4)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Introduction
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;OmniSync&lt;/strong&gt; started as a side project for building near real-time, multi-system data synchronization using mostly Azure-native tools. It connects systems like &lt;strong&gt;Salesforce&lt;/strong&gt;, &lt;strong&gt;Dynamics 365 (Sales)&lt;/strong&gt;, and &lt;strong&gt;Microsoft Fabric&lt;/strong&gt;, with future plans to bring in &lt;strong&gt;SAP S/4HANA&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;OmniSync is meant to sync with clean, scalable patterns that avoid tight coupling and support near real-time updates.&lt;/p&gt;

&lt;p&gt;OmniSync uses:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Logic Apps&lt;/strong&gt; to orchestrate&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Event Hub + Event Grid&lt;/strong&gt; for messaging&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Functions&lt;/strong&gt; for lightweight transformation&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Power BI and Fabric&lt;/strong&gt; for analytics&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Use Case
&lt;/h3&gt;

&lt;p&gt;To put OmniSync into context, let’s imagine a scenario based on a fictional enterprise: Contoso Inc.&lt;/p&gt;

&lt;p&gt;They’re a global company with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Salesforce&lt;/strong&gt; as the CRM for North America&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dynamics 365 Sales&lt;/strong&gt; running in EMEA&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SAP&lt;/strong&gt; running backend finance in APAC&lt;/li&gt;
&lt;li&gt;A legacy &lt;strong&gt;SQL Server warehouse&lt;/strong&gt; stitched together with SSIS&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And have these issues:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Different systems have different versions of the same customer&lt;/li&gt;
&lt;li&gt;Sales data is duplicated, delayed, or just plain wrong&lt;/li&gt;
&lt;li&gt;Nobody trusts the reports, especially executives&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Contoso wants to fix this. So instead of rebuilding from scratch, the idea is to &lt;strong&gt;sync just enough data across platforms&lt;/strong&gt; in near real time and centralize it for analytics.&lt;/p&gt;

&lt;p&gt;Here’s what they’re aiming for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Near real-time sync&lt;/strong&gt; between Salesforce and Dynamics&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Unification of customer and sales data&lt;/strong&gt; across platforms&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Power BI dashboards&lt;/strong&gt; driven by Microsoft Fabric&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Minimal coupling&lt;/strong&gt; : each system should still work on its own&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Architecture
&lt;/h3&gt;

&lt;p&gt;OmniSync is built on a loosely coupled, event-driven architecture in which systems communicate through events or APIs.&lt;/p&gt;

&lt;p&gt;Here’s how it works:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Integration Layer&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Logic Apps&lt;/strong&gt; handle orchestration. Each system has its own Logic App that listens to changes (via webhooks, APIs or CDC) and pushes updates to other systems&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Event Hub&lt;/strong&gt; sits in the middle to send events to Fabric&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Event Grid&lt;/strong&gt; routes system-generated events like function triggers or retries&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Functions&lt;/strong&gt; take care of small, stateless transformation or filtering jobs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Data Platform&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Microsoft Fabric&lt;/strong&gt; handles the analytics side. It ingests data , stores it in a &lt;strong&gt;Lakehouse&lt;/strong&gt; , and exposes it through &lt;strong&gt;Power BI&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Fabric follows a &lt;strong&gt;medallion architecture&lt;/strong&gt; :&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;✅ Bronze: Legacy &lt;strong&gt;SQL Server warehouse&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;✅ Silver: Cleaned business objects (Accounts, Orders, etc.)&lt;/p&gt;

&lt;p&gt;✅ Gold: Dashboard-ready data with measures and KPIs&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Identity &amp;amp; Security&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;SSO on Azure and proper OAuth implementations&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration users&lt;/strong&gt; in Salesforce and D365 prevent update loops&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Key Vault&lt;/strong&gt; stores secrets and credentials securely&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Monitoring and observability&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Azure Monitor&lt;/strong&gt; as the overall platform for &lt;strong&gt;Monitoring&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;App Insights&lt;/strong&gt; as detailed view for different applications&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Log Analytics&lt;/strong&gt; to see pertinent Logs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3lndkomvgky232xo4rry.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3lndkomvgky232xo4rry.png" width="800" height="414"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Event-Driven&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;When syncing systems like Salesforce and Dynamics 365, the challenge isn’t just &lt;em&gt;moving data&lt;/em&gt;, it’s doing it &lt;strong&gt;without collisions, loops, or delays&lt;/strong&gt;. OmniSync handles this using a &lt;strong&gt;clean event-driven architecture (EDA)&lt;/strong&gt; that combines:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Change Data Capture (CDC)&lt;/strong&gt; or webhook-based triggers&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Sync origin tracking&lt;/strong&gt; to avoid loops&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration users&lt;/strong&gt; to track synchronizations&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Light transformation logic&lt;/strong&gt; in Azure Functions or inline Logic App steps&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Event Hub&lt;/strong&gt; to route and decouple publishers from subscribers&lt;/li&gt;
&lt;/ul&gt;
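
&lt;p&gt;The events flowing through Event Hub can be pictured as small envelopes carrying the entity payload plus the origin used for loop prevention. A minimal sketch (field names are assumptions for illustration, not the exact schema):&lt;/p&gt;

```python
import json
from dataclasses import dataclass, field, asdict

# Sketch of the event envelope idea; field names are invented, not the
# exact schema published to Event Hub.
@dataclass
class SyncEvent:
    entity: str        # e.g. "Account", "SalesOrder"
    entity_id: str     # record id in the source system
    origin: str        # originating system, used for loop prevention
    payload: dict = field(default_factory=dict)

    def to_json(self) -> str:
        """Serialize the envelope for publishing to Event Hub."""
        return json.dumps(asdict(self))
```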

&lt;h4&gt;
  
  
  &lt;strong&gt;Loop Prevention &amp;amp; Conflict Handling&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;When you’re syncing systems in both directions, the biggest challenge is avoiding &lt;strong&gt;update loops&lt;/strong&gt; and &lt;strong&gt;collisions between records&lt;/strong&gt; that were never meant to sync.&lt;/p&gt;

&lt;p&gt;OmniSync solves this in two key ways:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Loop prevention using tracking&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Conflict detection using unique identifiers (like customer numbers)&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;How Loop Prevention Works&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Every record that gets synced carries a trace of its origin:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If Salesforce sends an update to D365, the Logic App tags it with the integration user.&lt;/li&gt;
&lt;li&gt;Then, when a Logic App monitoring D365 sees that record later, it checks whether the last change was made by the integration user. If so, the synchronization is skipped. This completely stops update loops without relying on timestamps or hashes.&lt;/li&gt;
&lt;/ul&gt;
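
&lt;p&gt;The check above can be reduced to a one-line predicate. A sketch (the field name and integration user are illustrative, not the actual OmniSync identifiers):&lt;/p&gt;

```python
# Sketch of the loop-prevention check; field and user names are illustrative.
INTEGRATION_USER = "omnisync.integration@contoso.com"

def should_sync(record: dict) -> bool:
    """Propagate a change only if it was NOT made by the integration user."""
    return record.get("last_modified_by") != INTEGRATION_USER
```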

&lt;p&gt;&lt;strong&gt;Conflict Handling&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;But what if the same account gets created in both systems independently?&lt;/p&gt;

&lt;p&gt;For instance, when a new Account is created in D365, the corresponding Logic App detects the change. Before inserting the record into Salesforce, a check is performed against the Fabric DataLake to determine if the record already exists. If a match is found, the &lt;em&gt;sync_status&lt;/em&gt; field in D365 is updated to “Conflict”.&lt;/p&gt;
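
&lt;p&gt;That conflict decision is equally simple to express. A sketch (names are illustrative; in practice the lookup runs against the Fabric DataLake rather than an in-memory set):&lt;/p&gt;

```python
# Sketch of the conflict check against identifiers already present in the
# Fabric DataLake; names are illustrative.
def resolve_sync_status(account_number: str, existing_numbers: set) -> str:
    """Return the sync_status to write back on the source record."""
    return "Conflict" if account_number in existing_numbers else "Synced"
```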

&lt;h4&gt;
  
  
  ETL
&lt;/h4&gt;

&lt;p&gt;OmniSync uses a &lt;strong&gt;single ETL process&lt;/strong&gt; to migrate legacy data from a &lt;strong&gt;SQL Server Data Warehouse&lt;/strong&gt; into the new environment. This is done using &lt;strong&gt;Microsoft Fabric’s Dataflow Gen2&lt;/strong&gt; , a visual, user-friendly tool that makes it easy to extract and stage structured data.&lt;/p&gt;

&lt;p&gt;From there, the data enters the &lt;strong&gt;Bronze layer&lt;/strong&gt; of the Medallion architecture and flows through to the &lt;strong&gt;Gold Lakehouse layer&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;This ETL step is just the &lt;strong&gt;starting point&lt;/strong&gt; , once the historical data is loaded, all updates and sync operations are fully &lt;strong&gt;event-driven&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;We’ll go deeper into this data flow in the follow-up article focused on &lt;strong&gt;Fabric + Medallion implementation&lt;/strong&gt;.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Kappa &amp;amp; Lambda Architectures&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;OmniSync takes ideas from both architectures, adapting them to fit its needs.&lt;/p&gt;

&lt;p&gt;Let’s break down what that means in practice.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lambda Architecture&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;Lambda architecture&lt;/strong&gt; was designed to combine:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A &lt;strong&gt;batch layer&lt;/strong&gt; (slow, reliable processing)&lt;/li&gt;
&lt;li&gt;A &lt;strong&gt;speed layer&lt;/strong&gt; (real-time but eventually consistent)&lt;/li&gt;
&lt;li&gt;A &lt;strong&gt;serving layer&lt;/strong&gt; (where queries run)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It works well for &lt;strong&gt;big data pipelines&lt;/strong&gt;  — but it’s overkill for sync. Managing two parallel paths (real-time and batch) adds complexity and technical debt that OmniSync doesn’t need. Plus, it assumes you’re rebuilding and querying from large volumes of data — not syncing discrete, structured entities like Accounts and Orders.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Kappa Architecture&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;Kappa model&lt;/strong&gt; removes the batch layer and handles everything via streaming. You store events once, replay them as needed, and build everything on top of that.&lt;/p&gt;

&lt;p&gt;In OmniSync, that’s &lt;strong&gt;closer to the truth&lt;/strong&gt; , especially with Event Hub acting as the event source of record. But OmniSync still isn’t doing full stream processing, continuously transforming big scale data. We’re handling &lt;strong&gt;discrete change events&lt;/strong&gt; with lightweight logic and routing.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;OmniSync’s Actual Model: Event + Delta + Lakehouse&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;What OmniSync really does is combine:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Event-based triggers&lt;/strong&gt; (CDC, webhooks)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Per-record transformation&lt;/strong&gt; (Functions, Logic Apps)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Push-based syncing&lt;/strong&gt; across systems&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;“Append-only” storage&lt;/strong&gt; in Fabric Lakehouse&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This gives us the best parts of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Kappa’s streaming mindset&lt;/li&gt;
&lt;li&gt;Lambda’s layered thinking (but simpler)&lt;/li&gt;
&lt;li&gt;Fabric’s native support for &lt;strong&gt;medallion architecture&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And most importantly: it’s a model that works for &lt;strong&gt;business systems&lt;/strong&gt; , not just analytics engines.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Medallion Architecture&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Once data is flowing between systems like Salesforce and Dynamics 365, OmniSync uses Microsoft Fabric with a &lt;strong&gt;Lakehouse&lt;/strong&gt; model and &lt;strong&gt;Power BI for its Data Analytics&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;We’ll use &lt;strong&gt;Dataflows Gen2&lt;/strong&gt; for the initial load, &lt;strong&gt;Notebooks&lt;/strong&gt; to move data across the Bronze, Silver, and Gold layers, and &lt;strong&gt;Streaming Spark&lt;/strong&gt; for CDC to materialize entities in real time. &lt;strong&gt;Power BI&lt;/strong&gt; then connects directly to the Gold layer to visualize data coming from Salesforce and Dynamics 365.&lt;/p&gt;
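
&lt;p&gt;Conceptually, the Bronze-to-Silver step normalizes raw records into cleaned business objects. A toy pure-Python sketch of that idea (the real implementation lives in Spark notebooks, and these field names are invented for illustration):&lt;/p&gt;

```python
# Toy sketch of a Silver-layer cleanup; the real work happens in Spark
# notebooks, and these field names are invented for illustration.
def to_silver(raw: dict) -> dict:
    """Normalize one raw Bronze record into a cleaned Silver business object."""
    return {
        "account_id": raw["Id"].strip(),
        "name": raw.get("Name", "").strip().title(),
        "country": raw.get("Country", "").strip().upper() or None,
    }
```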

&lt;p&gt;More on the Lakehouse structure and Fabric in the next article.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Scope Simplification&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The project intentionally narrowed its scope to keep things manageable.&lt;/p&gt;

&lt;p&gt;In a full-scale retail implementation, you’d have to deal with dozens of interconnected business processes, entities, dependencies, and flows.&lt;/p&gt;

&lt;p&gt;But OmniSync is a PoC designed to explore new technologies and prove that near real-time, cross-system sync can work.&lt;/p&gt;

&lt;p&gt;So instead of modeling everything, it focuses on a smaller, representative set of entities that still reflects real-world complexity:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Currencies&lt;/li&gt;
&lt;li&gt;Product Categories&lt;/li&gt;
&lt;li&gt;Products&lt;/li&gt;
&lt;li&gt;Stores&lt;/li&gt;
&lt;li&gt;Sales Orders&lt;/li&gt;
&lt;li&gt;Accounts&lt;/li&gt;
&lt;li&gt;Geography&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each of these entities was used to build and test sync flows between Salesforce, Dynamics 365, and Microsoft Fabric, with the goal of validating patterns, not rebuilding an entire enterprise data model.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;API Usage&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Most of the API interactions in OmniSync are handled indirectly. Thanks to Azure Logic Apps connectors and the built-in abstractions in Salesforce and Dynamics 365, there’s usually no need to manually call APIs or manage HTTP flows yourself.&lt;/p&gt;

&lt;p&gt;Still, there were a few cases where direct API calls were unavoidable:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Salesforce CDC Setup&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Salesforce’s Change Data Capture requires manual setup steps via its API, specifically to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create Streaming Channels&lt;/li&gt;
&lt;li&gt;Register Channel Members&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A Postman REST collection was used to set these up:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.postman.com/salesforce-developers/salesforce-developers/folder/kjdbbxk/rest" rel="noopener noreferrer"&gt;&lt;strong&gt;Salesforce CDC API Reference (Postman)&lt;/strong&gt;&lt;/a&gt;&lt;/p&gt;
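&lt;p&gt;For illustration, the channel setup boils down to two POSTs against the Salesforce Tooling API. The sketch below only builds the request payloads; the channel and entity names are hypothetical, and the exact shapes should be verified against the Postman collection above.&lt;/p&gt;

```python
# Hedged sketch of the two Salesforce Tooling API payloads used to set up CDC:
# a custom channel, then a member subscribing an entity's change events.
# Names ("OmniSync", "AccountChangeEvent") are illustrative only.
API_VERSION = "v60.0"

def channel_payload(name: str, label: str) -> dict:
    return {
        "FullName": f"{name}__chn",
        "Metadata": {"channelType": "data", "label": label},
    }

def channel_member_payload(channel: str, entity: str) -> dict:
    return {
        "FullName": f"{channel}_{entity}",
        "Metadata": {"eventChannel": f"{channel}__chn", "selectedEntity": entity},
    }

# These dicts would be POSTed (with a Bearer token) to:
#   {instance_url}/services/data/{API_VERSION}/tooling/sobjects/PlatformEventChannel
#   {instance_url}/services/data/{API_VERSION}/tooling/sobjects/PlatformEventChannelMember
channel = channel_payload("OmniSync", "OmniSync CDC Channel")
member = channel_member_payload("OmniSync", "AccountChangeEvent")
```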

&lt;p&gt;&lt;strong&gt;GeoApify: Geolocation for the Lakehouse&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To enrich location-based data like Stores and Geography in the Lakehouse, OmniSync used GeoApify, a lightweight, reliable geolocation API that converts address or coordinate inputs into normalized location data.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.geoapify.com/" rel="noopener noreferrer"&gt;&lt;strong&gt;GeoApify Docs&lt;/strong&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Below is a sample of the JSON returned by the Reverse Geocoding API when passing latitude and longitude parameters.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"FeatureCollection"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"features"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Feature"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"properties"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"datasource"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"sourcename"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"openstreetmap"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"attribution"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"© OpenStreetMap contributors"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"license"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Open Database License"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.openstreetmap.org/copyright"&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"country"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"United Kingdom"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"country_code"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"gb"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"state"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"England"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"county"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Greater London"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"city"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"London"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"postcode"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"W1H 1LJ"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"suburb"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Marylebone"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"street"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Upper Montagu Street"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"housenumber"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"38"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"iso3166_2"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"GB-ENG"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"lon"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;-0.160306360235508&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"lat"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;51.52016005&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"state_code"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"ENG"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"result_type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"building"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"formatted"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"38 Upper Montagu Street, London, W1H 1LJ, United Kingdom"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"address_line1"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"38 Upper Montagu Street"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"address_line2"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"London, W1H 1LJ, United Kingdom"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"category"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"building.residential"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"timezone"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Europe/London"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"offset_STD"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"+00:00"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"offset_STD_seconds"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"offset_DST"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"+01:00"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"offset_DST_seconds"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;3600&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"abbreviation_STD"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"GMT"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nl"&gt;"abbreviation_DST"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"BST"&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;Logic Apps as the Workflow Engine&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;OmniSync uses Azure Logic Apps as the low-code orchestrator that ties everything together.&lt;/p&gt;

&lt;p&gt;Every synchronization operation from listening to Salesforce events to pushing data into Dataverse runs through a Logic App.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why Logic Apps?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Low-code, but powerful: Easily hook into HTTP endpoints, Dataverse, Event Hub, SQL, and more&lt;/li&gt;
&lt;li&gt;Built-in error handling: Retry policies, scopes, and parallel branches make it easy to isolate failures&lt;/li&gt;
&lt;li&gt;Easy to understand: Non-developers like business analysts can follow workflows without needing to read code&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Use Cases in OmniSync&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Listening to Salesforce platform events or Dataverse change notifications&lt;/li&gt;
&lt;li&gt;Transformations (XML to JSON…)&lt;/li&gt;
&lt;li&gt;Formatting data (datetimes…)&lt;/li&gt;
&lt;li&gt;Data checks (e.g., detecting already-inserted records)&lt;/li&gt;
&lt;li&gt;Validation logic on synchronization cycles&lt;/li&gt;
&lt;li&gt;Enriching payloads with metadata (&lt;em&gt;sync_status…&lt;/em&gt;)&lt;/li&gt;
&lt;li&gt;Routing data into Event Hub, Event Grid or Azure Functions&lt;/li&gt;
&lt;/ul&gt;
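&lt;p&gt;As an example of the built-in error handling, a retry policy on an HTTP action in a workflow definition looks roughly like this. The action name and URI are placeholders; the retryPolicy block follows the Workflow Definition Language schema.&lt;/p&gt;

```json
{
  "Push_to_Dataverse": {
    "type": "Http",
    "inputs": {
      "method": "POST",
      "uri": "https://example.crm.dynamics.com/api/data/v9.2/accounts",
      "body": "@triggerBody()",
      "retryPolicy": {
        "type": "exponential",
        "count": 4,
        "interval": "PT7S"
      }
    },
    "runAfter": {}
  }
}
```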

&lt;p&gt;As a low-code tool built for orchestration, Logic Apps offers a great balance of flexibility and simplicity.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmyoqfwvrcdugwr7fy27u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmyoqfwvrcdugwr7fy27u.png" width="800" height="367"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;CI/CD&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;All source code used in OmniSync is stored in several public GitHub repositories and deployed using GitHub Actions. Each repository is responsible for deploying a specific part of the architecture:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Main infrastructure is provisioned via Bicep templates&lt;/li&gt;
&lt;li&gt;Azure Container Apps are deployed using Bicep as well&lt;/li&gt;
&lt;li&gt;Azure Functions are built and published from source using CI pipelines&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This setup automates infrastructure and app deployment from code to cloud.&lt;/p&gt;
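&lt;p&gt;A minimal pipeline for the Bicep deployment might look like the following GitHub Actions workflow sketch; the resource group, template path, and secret name are placeholders.&lt;/p&gt;

```yaml
# Illustrative workflow only: log in to Azure and deploy the main Bicep
# template on every push to main. Names and paths are hypothetical.
name: deploy-infra
on:
  push:
    branches: [main]
jobs:
  bicep:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: azure/login@v2
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      - name: Deploy Bicep template
        run: |
          az deployment group create \
            --resource-group omnisync-rg \
            --template-file infra/main.bicep
```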

&lt;p&gt;Fabric, Salesforce, and Dynamics 365 source code can be deployed manually through their CLIs or, better, integrated and deployed from their respective platforms, since each provides a GitHub connector on its side.&lt;/p&gt;

&lt;p&gt;For the purposes of this PoC, we’re not managing separate Dev, Staging, and Production environments. However, in a real-world scenario, a proper promotion model with multiple environments would be essential — and this architecture is designed to support that when needed.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Costs&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;OmniSync was designed to be cost-effective, especially on the Azure side.&lt;/p&gt;

&lt;p&gt;Of course, you’ll still need valid licenses for Salesforce, Dynamics 365, Microsoft Fabric, and GeoApify, but most of these platforms offer free basic trials, which are more than enough to test and validate the system.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Azure Cost Optimizations&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To keep Azure costs under control, OmniSync uses:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Azure Consumption Logic Apps — pay only when a workflow runs&lt;/li&gt;
&lt;li&gt;Azure Consumption Functions — event-driven, billed per execution&lt;/li&gt;
&lt;li&gt;Azure Container Apps — deployed with minimum CPU and memory, and limited to one replica&lt;/li&gt;
&lt;li&gt;All supporting Azure services (Event Hub, App Insights, Key Vault…) are configured using Free or Basic SKUs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Even Microsoft Fabric’s free trial includes access to an F64 SKU, which far exceeds the compute needs of this PoC and allows you to explore the entire Lakehouse and Power BI setup at full scale for free.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Scalability&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Because OmniSync is built entirely on serverless and event-driven components, scalability is not an issue by design. The architecture can grow automatically with demand, without needing any manual scaling logic, complex provisioning, or infrastructure refactoring.&lt;/p&gt;

&lt;p&gt;On the Azure side, all core services scale seamlessly:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Azure Logic Apps can run multiple parallel instances of the same workflow with built-in throttling and retry logic.&lt;/li&gt;
&lt;li&gt;Azure Functions automatically spin up additional compute to handle bursts of sync events or large payloads.&lt;/li&gt;
&lt;li&gt;Azure Container Apps are configured with a minimal baseline (1 replica, low CPU/memory) but can scale out automatically under load.&lt;/li&gt;
&lt;li&gt;Azure Event Hub and Event Grid are designed for high-throughput, low-latency messaging and can absorb tens of thousands of events per second if needed.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Even though the PoC traffic is minimal, this same stack could support enterprise scale volumes without changes to the architecture.&lt;/p&gt;

&lt;p&gt;On the Microsoft Fabric side, scaling is easy. Since it’s a fully managed SaaS platform, you don’t need to worry about infrastructure. If things get heavy, like high sync volume or refresh rate, you can just increase the capacity (e.g., F8 to F16) with a few clicks.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Reliability&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Since the project prioritized cost over high availability, the reliability approach was intentionally kept lean, focusing on zone redundancy only (not regional redundancy).&lt;/p&gt;

&lt;p&gt;Here’s how redundancy was handled across key Azure services:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Azure Logic Apps: ❌&lt;/strong&gt; Not zone-redundant&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Consumption Functions: ✅&lt;/strong&gt; Zone-redundant&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Event Grid: ✅&lt;/strong&gt; Zone-redundant&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Container Apps: ❌&lt;/strong&gt; Not zone-redundant&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Event Hub: ✅&lt;/strong&gt; Zone-redundant&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Storage Accounts: ✅&lt;/strong&gt; Zone-redundant&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Container Registry: ✅&lt;/strong&gt; Zone-redundant&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration Account: ❌&lt;/strong&gt; Not zone-redundant&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Key Vault: ✅&lt;/strong&gt; Zone-redundant&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Monitor: ✅&lt;/strong&gt; Zone-redundant&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Log Analytics: ✅&lt;/strong&gt; Zone-redundant&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Application Insights: ✅&lt;/strong&gt; Zone-redundant&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Microsoft Entra (Azure AD): ✅&lt;/strong&gt; Globally available&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Since zone redundancy was considered sufficient for the scope and scale of the PoC, no separate DR strategy was implemented.&lt;/p&gt;

&lt;p&gt;For a production deployment, this decision should be revisited to evaluate things like multi-region failover, backup strategies, and high-availability tiers.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Performance&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;While this project prioritized cost optimization, performance was never ignored. Even with the most basic SKUs across the Azure stack, OmniSync delivers solid real-world responsiveness thanks to its event-driven and auto-scaling architecture.&lt;/p&gt;

&lt;p&gt;No formal performance or stress testing was conducted, but manual tests syncing entities across the system typically complete in just a few seconds, or up to ~20 seconds in edge cases, depending on factors like cold starts (Functions, Container Apps), retry logic, and network latency.&lt;/p&gt;

&lt;p&gt;Even without tuning for performance, the architecture has proven to be fast enough for operational needs while keeping costs low.&lt;/p&gt;

&lt;p&gt;Note: Fabric’s free F64 SKU was not considered as part of the performance baseline, since it vastly exceeds this project’s actual requirements.&lt;/p&gt;

&lt;p&gt;If this were to move into production, full load testing and performance profiling would be the logical next step — but that’s outside the scope of this PoC.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Monitoring &amp;amp; Error Handling&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;OmniSync uses Azure-native monitoring tools to track activity, detect issues, and collect performance metrics across all key components — from Logic Apps to Container Apps to Microsoft Fabric.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Monitoring Setup&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;All apps and services are wired into Azure Monitor, with the following setup:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Application Insights is enabled across Azure Functions and Container Apps for telemetry, dependency tracking, and live logs.&lt;/li&gt;
&lt;li&gt;A Log Analytics Workspace is used to centralize monitoring across services.&lt;/li&gt;
&lt;li&gt;Logic Apps Monitoring Solution is deployed and connected to the workspace for advanced Logic App logging beyond the default run history.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This provides visibility across every layer of the sync process: API calls, retries, internal events, and more.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fepxyk0zvfagf0i9qld3k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fepxyk0zvfagf0i9qld3k.png" width="800" height="363"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Logs &amp;amp; Metrics&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Different services push logs and metrics in different ways:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Logic Apps: Errors and run history can be viewed directly in the Logic App blade or queried through Log Analytics via the monitoring solution.&lt;/li&gt;
&lt;li&gt;Container Apps: Logs are streamed into Azure Monitor and viewable via the Log stream or Log Analytics.&lt;/li&gt;
&lt;li&gt;Fabric: Metrics (especially for Spark executions) can be queried using KQL inside the Fabric workspace.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here’s a sample query used to inspect memory usage of Spark workloads:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;SparkMetrics_CL&lt;/span&gt;
&lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="k"&gt;where&lt;/span&gt; &lt;span class="n"&gt;fabricWorkspaceId_g&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="nv"&gt;"{FabricWorkspaceId}"&lt;/span&gt;
&lt;span class="k"&gt;and&lt;/span&gt; &lt;span class="n"&gt;artifactId_g&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="nv"&gt;"{ArtifactId}"&lt;/span&gt;
&lt;span class="k"&gt;and&lt;/span&gt; &lt;span class="n"&gt;fabricLivyId_g&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="nv"&gt;"{LivyId}"&lt;/span&gt;
&lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="k"&gt;where&lt;/span&gt; &lt;span class="n"&gt;name_s&lt;/span&gt; &lt;span class="n"&gt;endswith&lt;/span&gt; &lt;span class="nv"&gt;"jvm.total.used"&lt;/span&gt;
&lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="n"&gt;summarize&lt;/span&gt; &lt;span class="k"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;value_d&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;by&lt;/span&gt; &lt;span class="n"&gt;bin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;TimeGenerated&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;30&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;executorId_s&lt;/span&gt;
&lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="k"&gt;order&lt;/span&gt; &lt;span class="k"&gt;by&lt;/span&gt; &lt;span class="n"&gt;TimeGenerated&lt;/span&gt; &lt;span class="k"&gt;asc&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Error Detection &amp;amp; Diagnostics&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;OmniSync includes multiple built-in mechanisms to track and detect errors:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Dead Letter Queues (DLQ) on Event Grid handle failures in event delivery when downstream handlers fail.&lt;/li&gt;
&lt;li&gt;Application Insights provides rich exception tracking and telemetry for Azure Functions and Container Apps.&lt;/li&gt;
&lt;li&gt;Azure Monitor Logs (via KQL) are used for deeper insights across services, especially for correlating multi-step sync issues.&lt;/li&gt;
&lt;li&gt;Logic App Run History allows quick inspection of individual failed runs, including step-by-step trace.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These tools offer both real-time observability and post-mortem diagnostics without adding external tooling.&lt;/p&gt;

&lt;h3&gt;
  
  
  Security
&lt;/h3&gt;

&lt;p&gt;OmniSync implements security with simplicity and control in mind, sticking to platform standards, but also making practical decisions to keep the setup lightweight for a PoC.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Authentication&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;There’s &lt;strong&gt;no unified SSO&lt;/strong&gt; across all platforms. Ideally, a single &lt;strong&gt;Microsoft Entra ID (Azure AD) user&lt;/strong&gt; would be used to sign in across Salesforce, Dynamics 365, and Fabric using SSO (Single Sign-On).&lt;/p&gt;

&lt;p&gt;But to keep things simple:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Each platform uses its &lt;strong&gt;own authentication system&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Even &lt;strong&gt;Azure&lt;/strong&gt;, &lt;strong&gt;Dynamics 365&lt;/strong&gt;, and &lt;strong&gt;Fabric&lt;/strong&gt; don’t share the same Entra user (ideally they would, but the project and trials evolved with three different Entra users)&lt;/li&gt;
&lt;li&gt;Platform specific users were configured instead of federating identity across systems&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Authorization&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Authorization was kept minimal for this PoC. In a production setup, roles would be more granular, but here most systems used &lt;strong&gt;default admin users&lt;/strong&gt; or minimally scoped integration users:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Dynamics 365 Integration User&lt;/strong&gt; : Assigned the &lt;em&gt;Basic User&lt;/em&gt; and &lt;em&gt;Audit&lt;/em&gt; roles.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Salesforce Integration User&lt;/strong&gt; : Assigned a custom &lt;em&gt;Integration Role.&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Salesforce Client App&lt;/strong&gt; : Connected via OAuth client credentials with minimum required permissions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fabric Client App&lt;/strong&gt; : Entra App Registration with only: “Run Queries and Mutations” and “Power BI Service” access&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Other Secrets &amp;amp; Access Keys&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The architecture also relied on a few platform-specific keys and credentials:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Event Grid&lt;/strong&gt; : Accessed using access tokens&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Event Hub&lt;/strong&gt; : Secured with SAS tokens&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Salesforce&lt;/strong&gt; : OAuth 2.0 client credentials flow for integrations&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SPA&lt;/strong&gt; : OAuth 2.0 Delegated Entra permissions&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;GeoApify&lt;/strong&gt; : External API key for resolving geolocation data&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure services&lt;/strong&gt; : Use &lt;strong&gt;Managed Identity&lt;/strong&gt; when possible for secure inter-service authentication&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Most secrets were &lt;strong&gt;not stored in Azure Key Vault&lt;/strong&gt;, intentionally, to simplify the PoC. However:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The &lt;strong&gt;GeoApify API key&lt;/strong&gt; and &lt;strong&gt;Container App certificates&lt;/strong&gt; were securely stored in &lt;strong&gt;Azure Key Vault&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Network Security &amp;amp; Private Access&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Private endpoints were not used&lt;/strong&gt; in OmniSync. Since the architecture didn’t require private networking or integration with on-prem/IaaS resources, all traffic was routed publicly over the internet.&lt;/p&gt;

&lt;p&gt;This choice simplified the infrastructure and avoided unnecessary VNET configuration, which is acceptable for a proof of concept like this.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Lessons Learned&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Working on OmniSync provided a lot of insight into what it takes to integrate multiple enterprise systems together using native Azure tools. Here are some of the key lessons from building the PoC:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Event-Driven Patterns and Loop Prevention:&lt;/strong&gt; while Logic Apps and Event Grid make near real-time sync simple, loop prevention is essential. Without it, you can quickly end up in an infinite update cycle between systems.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Loosely Coupled Architecture:&lt;/strong&gt; keeping services decoupled (Logic Apps, Event Hub, Functions) makes it easy to add or change systems (like SAP later) without rewriting the whole flow.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Monitoring:&lt;/strong&gt; even at this project’s small scale, monitoring matters. Azure Monitor, App Insights, and simple KQL queries made it easy to trace errors and understand behavior without needing external tools.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;(Near) Real-Time Sync ≠ Instant Synchronization&lt;/strong&gt;: even in an event-driven system, things like cold starts, retries, and API latency mean synchronization won’t always be sub-second. But at most ~20 seconds for this end-to-end scenario is still very usable for operational needs.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Platform ecosystem:&lt;/strong&gt; integrating with Salesforce, Dynamics 365, and Fabric forces you to understand and implement at least a few previously unfamiliar features, like Flows in Salesforce, Power Apps, or Lakehouses and DAX queries in Power BI.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Low Code tools:&lt;/strong&gt; most of the project was built without writing much code thanks to the low-code and visual tools provided by each platform. Logic Apps acting as the iPaaS layer handled most of the orchestration. Overall, these low-code tools significantly reduced development effort and complexity.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fabric:&lt;/strong&gt; turned out to be a strong choice for the analytics layer. While getting started with Lakehouse concepts and Spark notebooks took time, it paid off quickly: in the end, Fabric offered a modern, scalable, and efficient platform for centralized reporting and analytics.&lt;/li&gt;
&lt;/ul&gt;
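&lt;p&gt;The loop prevention idea can be reduced to a small echo check: events that carry OmniSync’s own sync metadata are dropped instead of re-propagated. A minimal Python sketch, where the sync_status field mirrors the metadata enrichment described earlier and the marker value is hypothetical:&lt;/p&gt;

```python
# Hedged sketch of echo suppression between two synced systems: any change
# event that OmniSync itself produced carries a marker and is not re-routed,
# which breaks the A -> B -> A infinite update cycle.
SYNC_MARKER = "omnisync"  # hypothetical value written into sync_status

def should_propagate(event: dict) -> bool:
    """Propagate only changes that originated from real users/processes."""
    return event.get("sync_status") != SYNC_MARKER

def tag_outgoing(event: dict) -> dict:
    """Stamp events we emit so the receiving side can recognize them."""
    return {**event, "sync_status": SYNC_MARKER}

user_edit = {"Id": "a01", "Name": "Contoso"}
echoed = tag_outgoing(user_edit)  # what comes back from the target system
```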

&lt;h3&gt;
  
  
  &lt;strong&gt;Future Plans&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;OmniSync already connects Salesforce, Dynamics 365 (Sales), and Microsoft Fabric, but one item remains on the roadmap: &lt;strong&gt;SAP S/4HANA Integration&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;The plan is to use:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;SAP Event Mesh&lt;/strong&gt; to publish change events (e.g. Product, Order, Customer)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Logic Apps for SAP&lt;/strong&gt; to consume and route those events&lt;/li&gt;
&lt;li&gt;Reuse existing OmniSync sync patterns, such as conflict detection and integration users&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This will complete the loop across CRM, ERP, and analytics.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Coming Soon&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;📌In the next post, I’ll dig into Fabric Integration.&lt;/p&gt;

&lt;p&gt;👀 &lt;strong&gt;Follow me here&lt;/strong&gt; to catch the next post.&lt;/p&gt;

&lt;p&gt;💻 &lt;strong&gt;Source code:&lt;/strong&gt; &lt;a href="https://github.com/zodraz/omnisync-docs" rel="noopener noreferrer"&gt;https://github.com/zodraz/omnisync-docs&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ipaasintegration</category>
      <category>dynamics365</category>
      <category>logicapps</category>
      <category>salesforce</category>
    </item>
    <item>
      <title>TimeWarden: An autonomous multi-agent with Copilot Studio</title>
      <dc:creator>Abel</dc:creator>
      <pubDate>Sat, 08 Feb 2025 00:21:46 +0000</pubDate>
      <link>https://dev.to/tarantarantino/timewarden-an-autonomous-multi-agent-with-copilot-studio-2gp8</link>
      <guid>https://dev.to/tarantarantino/timewarden-an-autonomous-multi-agent-with-copilot-studio-2gp8</guid>
      <description>&lt;h4&gt;
  
  
  Time-Off Management with Autonomous AI in Microsoft Copilot Studio
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwmwbnl8c63fh3idu989z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwmwbnl8c63fh3idu989z.png" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After watching Ignite 2024, I was amazed by the capabilities showcased in the Copilot autonomous agent presentations. Intrigued by their potential, I set out to explore and test these features, which led me to create a project designed to deepen my understanding of this innovative technology. This is how TimeWarden came to life: a multi-agent solution developed using Copilot Studio.&lt;/p&gt;

&lt;h3&gt;
  
  
  Copilot Studio
&lt;/h3&gt;

&lt;p&gt;Copilot Studio provides a powerful platform for building autonomous AI agents that understand work processes and act on behalf of users. By combining managed SaaS infrastructure, advanced AI models, a low-code design interface, and a vast library of prebuilt connectors, these agents can automate tasks, support business roles, and enhance efficiency across various applications and data sources.&lt;/p&gt;

&lt;h4&gt;
  
  
  Key Benefits
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Efficiency:&lt;/strong&gt; Autonomous agents handle repetitive tasks, freeing up employees to focus on more complex and creative work.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Accuracy:&lt;/strong&gt; By automating processes, agents reduce the risk of human error, ensuring more accurate and consistent outcomes.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scalability:&lt;/strong&gt; Agents can be scaled to support large organizations, handling high volumes of tasks without additional human resources.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost Savings:&lt;/strong&gt; Automating routine tasks can lead to significant cost savings by reducing the need for manual labor.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Improved Customer Experience:&lt;/strong&gt; Agents can provide real-time support and information, enhancing customer satisfaction.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Characteristics
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Autonomous Functionality:&lt;/strong&gt; Operating continuously, they proactively monitor business events and execute actions without manual intervention.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dynamic Workflow Planning:&lt;/strong&gt; Their ability to create and adjust execution strategies in real time enables flexibility in complex environments.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;User-Friendly Integration:&lt;/strong&gt; With low-code customization, organizations of all sizes can tailor these agents to address specific operational challenges.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These innovations mark a significant step toward creating AI-first business processes, promising to streamline operations and drive measurable business value.&lt;/p&gt;

&lt;h3&gt;
  
  
  TimeWarden
&lt;/h3&gt;

&lt;p&gt;TimeWarden is an autonomous multi-agent system designed to manage employee time-off requests by cross-referencing them with a Jira project. It intelligently determines whether to approve or reject requests automatically or escalate them for project manager approval based on specific conditions.&lt;/p&gt;

&lt;p&gt;It is not intended for production use but serves as a proof of concept (POC) for exploring the potential of this technology. Due to this, it has multiple limitations, and the scenario is constrained.&lt;/p&gt;

&lt;p&gt;Additionally, TimeWarden can be extended or updated as needed. For example, while the current trigger is an email-based time-off request, it could be replaced with a Power Platform Form App or any other trigger supported by Copilot Studio. In this case, I chose email as a simplified scenario, allowing TimeWarden and its supporting agents to interpret and categorize the message.&lt;/p&gt;

&lt;h4&gt;
  
  
  Description
&lt;/h4&gt;

&lt;p&gt;One of the first steps in creating a Copilot is providing a description of the agent, outlining its purpose, target audience, and desired outcomes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkw3ufdqob1rnmkxjpgxw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkw3ufdqob1rnmkxjpgxw.png" width="800" height="284"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Instructions
&lt;/h3&gt;

&lt;p&gt;The Instructions section defines how the agent behaves, outlining its tasks and the approach it takes to complete them. Below are the instructions for TimeWarden:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Checking the Email&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Verifies if the email is a time-off request. If not, it stops processing.&lt;/li&gt;
&lt;li&gt;Checks if the sender belongs to the Contoso project team. If not, it sends a rejection email explaining why.&lt;/li&gt;
&lt;li&gt;Ensures all required details (dates, hours, reason, type of leave) are provided. If anything is missing, the requester receives an email listing the missing fields.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Reviewing the Sprint Schedule&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Retrieves the latest sprint schedule from Jira to assess how the request fits into the project timeline.&lt;/li&gt;
&lt;li&gt;Categorizes the request into:
– Current Sprint (happening now)
– Future Sprint (upcoming)
– Outside Sprint (not affecting any sprint)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Step 3: Handling Current Sprint Requests&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Checks if the employee has sufficient time-off balance.&lt;/li&gt;
&lt;li&gt;If everything is fine, the request is sent to the manager for approval.&lt;/li&gt;
&lt;li&gt;If approved, the system updates the records and notifies the requester.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Step 4: Managing Requests Outside a Sprint&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If no sprint is affected and the leave balance is sufficient, the request is automatically approved.&lt;/li&gt;
&lt;li&gt;If the balance is insufficient, the request is rejected, and the employee is notified.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Step 5: Handling Future Sprint Requests&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Assesses whether the absence impacts sprint capacity.&lt;/li&gt;
&lt;li&gt;If the request does not reduce sprint availability below 80%, it is approved automatically.&lt;/li&gt;
&lt;li&gt;Otherwise, it is escalated to a manager for review.&lt;/li&gt;
&lt;/ul&gt;
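&lt;p&gt;As a minimal sketch, the future-sprint capacity rule above can be expressed in a few lines of Python. The function name and inputs are assumptions for illustration only; the real check runs inside a Power Automate flow, not in code:&lt;/p&gt;

```python
# Sketch of the future-sprint capacity rule (illustrative only; the
# real logic lives in a Power Automate flow, and these parameter
# names are assumptions).

def decide_future_sprint_request(team_capacity_days: float, requested_days: float) -> str:
    """Auto-approve when remaining sprint availability stays at or above 80%."""
    remaining = team_capacity_days - requested_days
    availability = remaining / team_capacity_days
    if availability >= 0.8:
        return "auto-approved"
    return "escalated to manager"
```

&lt;p&gt;For example, a 10-day absence against 100 available team-days leaves 90% availability and is auto-approved, while a 30-day absence drops availability to 70% and is escalated.&lt;/p&gt;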

&lt;p&gt;At every step, TimeWarden ensures employees and managers stay informed via automated emails, streamlining time-off management.&lt;/p&gt;

&lt;p&gt;Note that a simplified version is intentionally provided here rather than the exact instructions from Copilot Studio. One key lesson learnt through experimentation was &lt;strong&gt;prompt engineering&lt;/strong&gt;, which led me to rework and refine those instructions multiple times, making them precise and detailed enough for Copilot Studio to accurately understand and execute them.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Prompt engineering&lt;/strong&gt; is the practice of designing and refining inputs (prompts) to guide AI models, like ChatGPT or Copilot, to generate accurate, relevant, and useful outputs. It involves structuring questions, instructions, or contextual information in a way that optimizes the AI’s response.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fykm4kx7cgzkr5rezms9e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fykm4kx7cgzkr5rezms9e.png" width="800" height="362"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Autonomous Triggers
&lt;/h3&gt;

&lt;p&gt;Autonomous Triggers in Copilot Studio are event-driven mechanisms that allow AI agents to take action automatically, without requiring direct user input. These triggers detect specific conditions, such as an incoming email, a database update, or a scheduled event, and initiate workflows or responses accordingly; Copilot Studio offers hundreds of them.&lt;/p&gt;

&lt;p&gt;TimeWarden detects incoming emails that are likely time-off requests and parses their content to extract all the necessary information to create a time-off request.&lt;/p&gt;

&lt;h3&gt;
  
  
  Dynamic Agent Plans
&lt;/h3&gt;

&lt;p&gt;Copilot Studio allows AI agents to create and adjust execution strategies on the fly based on real-time data and evolving conditions. Instead of following a fixed workflow, these agents analyze the situation, determine the best course of action, and adapt as needed.&lt;/p&gt;

&lt;p&gt;So, depending on the use case described in the instructions, the agent will act one way or another. Let’s walk through a TimeWarden example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An employee requests time off.&lt;/li&gt;
&lt;li&gt;The agent checks sprint schedules and workload balance dynamically.&lt;/li&gt;
&lt;li&gt;If the request fits within capacity, it auto-approves it.&lt;/li&gt;
&lt;li&gt;If it risks project delays, the agent escalates it to a manager for review.&lt;/li&gt;
&lt;li&gt;If a manager rejects the request, the agent suggests alternative dates based on team availability.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Activity
&lt;/h3&gt;

&lt;p&gt;When testing your agent, Copilot Studio provides an Activity Map for every session, offering a visual representation of the agent’s sequence of inputs, decisions, and outputs. This helps in identifying issues and improving workflows.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7kamdqmidchwkrd04zj4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7kamdqmidchwkrd04zj4.png" width="800" height="362"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Users can also access the Activity page to review:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Session details&lt;/li&gt;
&lt;li&gt;Step-by-step activities within a session&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4o6msdsv7xshelrpdoyi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4o6msdsv7xshelrpdoyi.png" width="800" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Actions
&lt;/h3&gt;

&lt;p&gt;Actions in Copilot Studio are pre-built functionalities that can be used to extend and automate workflows. They allow you to perform specific tasks, such as generating code, making API calls, manipulating data, or integrating with other services, all through a simple interface.&lt;/p&gt;

&lt;p&gt;There are different types of actions, and hundreds that could be used, but the ones used by TimeWarden are custom Power Automate flows (except for the Send Mail action).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffbtwdsdj9v8l9xwm8npc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffbtwdsdj9v8l9xwm8npc.png" width="800" height="360"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgl0qcbcembfuiv56immj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgl0qcbcembfuiv56immj.png" width="800" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Power Automate
&lt;/h3&gt;

&lt;p&gt;As mentioned before, TimeWarden’s actions use Power Automate flows, as well as other features from Power Automate like custom connectors, AI Hub, and custom prompts, which we will explain later. This integration allows Copilot Studio to automate tasks, connect with external services, and enhance AI-driven workflows, enabling Copilots to fetch data, update records, and execute complex actions seamlessly.&lt;/p&gt;

&lt;h3&gt;
  
  
  Custom Connectors
&lt;/h3&gt;

&lt;p&gt;A Custom Connector in Power Automate allows you to create your own connectors to integrate services or APIs that aren’t already available in Power Automate’s built-in connectors.&lt;/p&gt;

&lt;p&gt;Since the Jira connector didn’t provide the required actions and a BambooHR connector didn’t exist, two custom connectors were created with the actions needed to integrate those APIs with the flows.&lt;/p&gt;
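&lt;p&gt;Under the hood, a custom connector action is just a described REST call. As a rough sketch, Jira’s sprint-listing endpoint is the real Agile REST route, while the host, credentials, and helper function below are placeholders invented for illustration:&lt;/p&gt;

```python
# Sketch of the REST call a Jira custom connector action wraps.
# GET /rest/agile/1.0/board/{boardId}/sprint is Jira's sprint-listing
# endpoint; base_url, email, and api_token here are placeholders.
import base64

def jira_sprints_request(base_url: str, board_id: int, email: str, api_token: str):
    """Build the URL and headers a connector action would send to list sprints."""
    url = f"{base_url}/rest/agile/1.0/board/{board_id}/sprint"
    token = base64.b64encode(f"{email}:{api_token}".encode()).decode()
    headers = {"Authorization": f"Basic {token}", "Accept": "application/json"}
    return url, headers
```

&lt;p&gt;The custom connector’s OpenAPI definition describes exactly this shape (path, auth scheme, response schema) so Power Automate can surface it as a reusable action.&lt;/p&gt;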

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkctodughz82xbcry5oru.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkctodughz82xbcry5oru.png" width="800" height="357"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F545c6ps7iuz21gvvuh8w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F545c6ps7iuz21gvvuh8w.png" width="390" height="433"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Flows
&lt;/h3&gt;

&lt;p&gt;A flow in Power Automate is an automated process that connects different apps and services to perform tasks without manual intervention. There are different kinds of flows, but the ones used here are cloud flows, which connect to the TimeWarden Copilot through events that map to its actions, with input and output parameters properly specified where needed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc6endz0ooe02wrgoyuhm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc6endz0ooe02wrgoyuhm.png" width="800" height="357"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Approvals
&lt;/h3&gt;

&lt;p&gt;TimeWarden uses Microsoft Teams Approvals (powered by Power Platform), allowing users to create, manage, and track approval requests directly within Teams.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjr1qwf92h1k01b3iipbx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjr1qwf92h1k01b3iipbx.png" width="465" height="495"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Depending on the scenario, a manager approval may be required to create a time-off request. In this case, the manager will receive a Teams notification containing all the relevant details provided by the user, allowing them to approve or reject the request.&lt;/p&gt;

&lt;h3&gt;
  
  
  Multi Agent
&lt;/h3&gt;

&lt;p&gt;One of the challenges with the initial version of TimeWarden was that it attempted to handle all tasks within the Instructions prompt. This approach proved ineffective in certain cases, such as categorizing emails or identifying missing fields. The solution was to introduce specialized agents dedicated to specific tasks, working together under the coordination of TimeWarden, which serves as the main orchestrator. This led to the creation of three additional agents:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Time Off Request Categorizer&lt;/li&gt;
&lt;li&gt;Email Time Off field extractor&lt;/li&gt;
&lt;li&gt;Sprint classifier&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All these agents are available in the AI Hub of Power Automate, a centralized platform that allows users to integrate AI-driven capabilities into their automation workflows. It offers access to pre-built AI models, custom AI models, and Azure OpenAI services.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fylsst7js79hlfk6svhc2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fylsst7js79hlfk6svhc2.png" width="800" height="287"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Time Off Request Categorizer
&lt;/h4&gt;

&lt;p&gt;Below is a picture displaying the actual prompt. This prompt essentially analyzes a text (such as an email) and classifies it into various categories, including Vacation, Sick, etc. The prompt’s response is provided in natural language.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwx930i3pfjnucrsx9s7t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwx930i3pfjnucrsx9s7t.png" width="800" height="389"&gt;&lt;/a&gt;&lt;/p&gt;
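&lt;p&gt;The categorizer itself is an LLM prompt, but its contract can be mimicked with a toy rule-based stand-in in Python. The category names match the ones above; the keyword lists are invented for illustration:&lt;/p&gt;

```python
# Toy rule-based stand-in for the Time Off Request Categorizer prompt.
# The real agent is an LLM prompt in Power Automate's AI Hub; the
# keyword lists here are invented for illustration.

CATEGORIES = {
    "Vacation": ("vacation", "holiday", "annual leave"),
    "Sick": ("sick", "unwell", "doctor"),
}

def categorize(email_body: str) -> str:
    """Classify an email body, or flag it as not a time-off request."""
    text = email_body.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "Not a time-off request"
```

&lt;p&gt;The LLM version handles phrasings a keyword approach would miss, which is exactly why a dedicated prompt beats hard-coded rules here.&lt;/p&gt;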

&lt;p&gt;In the next picture, you can see that it is activated within the email trigger flow, and its output is directed to the TimeWarden Copilot.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx4msc9jvpwcdrczsktxt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx4msc9jvpwcdrczsktxt.png" width="800" height="567"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Email Time Off field extractor
&lt;/h4&gt;

&lt;p&gt;As we can see in the following prompt image, this agent analyzes an email to determine whether any fields are missing, like Start Date or End Date. Additionally, it receives a Time off Type parameter: when that is specified, the agent doesn’t need to check whether the type field is missing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fogb1gqt99d0ueiht27qv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fogb1gqt99d0ueiht27qv.png" width="800" height="378"&gt;&lt;/a&gt;&lt;/p&gt;
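&lt;p&gt;Here is a deterministic sketch of the same contract. The required field names follow the article, while the keyword heuristics are invented, since the real extractor is an LLM prompt:&lt;/p&gt;

```python
# Toy stand-in for the Email Time Off field extractor prompt: report
# which required fields an email fails to mention. The real agent is
# an LLM prompt; these keyword heuristics are invented for illustration.

REQUIRED = {
    "Start Date": ("from", "starting", "start date"),
    "End Date": ("until", "through", "end date"),
    "Reason": ("because", "reason", "due to"),
}

def missing_fields(email_body: str, time_off_type_given: bool) -> list:
    """List the required fields the email does not appear to contain."""
    text = email_body.lower()
    missing = [field for field, keywords in REQUIRED.items()
               if not any(keyword in text for keyword in keywords)]
    # Time off Type arrives as a separate parameter, so it is only
    # reported missing when the caller says it was not supplied.
    if not time_off_type_given:
        missing.append("Time off Type")
    return missing
```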

&lt;p&gt;As before, it is triggered within the email flow, and its output is sent to TimeWarden Copilot.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fowhs5qh8awxg346o6leg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fowhs5qh8awxg346o6leg.png" width="800" height="448"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Sprint classifier
&lt;/h4&gt;

&lt;p&gt;Another agent with an interesting functionality classifies a sprint as current, future, or outside based on a JSON containing all sprint data, along with the start and end dates of a time-off request.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl3rev8k6jo22237ll065.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl3rev8k6jo22237ll065.png" width="800" height="382"&gt;&lt;/a&gt;&lt;/p&gt;
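&lt;p&gt;The classification rule itself is simple date-interval logic, as this hedged Python sketch shows. The field names and the overlap rule are assumptions about what the actual prompt infers from the Jira JSON:&lt;/p&gt;

```python
# Rule-based sketch of the Sprint Classifier agent: classify a time-off
# window against a list of sprints. Field names and the overlap rule
# are assumptions about what the prompt infers from the Jira JSON.
from datetime import date

def classify_sprint(sprints: list, req_start: str, req_end: str, today: date) -> str:
    """Return 'current', 'future', or 'outside' for a time-off window."""
    start = date.fromisoformat(req_start)
    end = date.fromisoformat(req_end)
    for sprint in sprints:
        s_start = date.fromisoformat(sprint["start"])
        s_end = date.fromisoformat(sprint["end"])
        # Standard interval-overlap test between request and sprint.
        overlaps = s_end >= start and end >= s_start
        if not overlaps:
            continue
        if today >= s_start and s_end >= today:
            return "current"  # request touches the sprint running now
        if s_start > today:
            return "future"   # request touches an upcoming sprint
    return "outside"          # no sprint affected
```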

&lt;p&gt;The following image depicts a simple flow triggered by a Copilot. The Sprint Classifier agent is invoked with parameters received from the Copilot, and its output is then sent back to TimeWarden.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbltq6pzxwqahpmofv290.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbltq6pzxwqahpmofv290.png" width="800" height="632"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Lessons Learnt
&lt;/h3&gt;

&lt;p&gt;During development, apart from learning about the technologies themselves, some lessons that apply to LLMs and AI in general were learnt.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Attempting to perform specialized actions, like recognizing whether an email was a time-off request with established categories, didn’t work as expected. This led to the creation of additional agents (prompts) that specialized in these tasks, enabling them to be successfully completed.&lt;/li&gt;
&lt;li&gt;Prompt engineering provided the opportunity to explore and evaluate other LLMs, such as DeepSeek, Qwen, Copilot, and OpenAI. I leveraged them to help describe instructions for Copilot when the initial instructions were unclear or too vague for Copilot Studio to understand.&lt;/li&gt;
&lt;li&gt;Even though Copilot Studio allows up to 8,000 characters for instructions, TimeWarden was limited to 4,400 characters, not even accounting for the prompts of the other agents. This made me realize that for complex projects, instructions must be broken down across many smaller agents. Despite that, it still feels like there isn’t enough input space for a complex system. I hope Microsoft will increase this limit in the future.&lt;/li&gt;
&lt;li&gt;Copilot Studio requires a high level of clarity for certain tasks, such as forcing it to stop the entire process or not execute specific parts of the instructions (i.e., conditional branches).&lt;/li&gt;
&lt;li&gt;As Copilot Studio is still in preview, some tasks required workarounds, like disabling topics, as they should never be invoked by an autonomous agent.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Conclusions
&lt;/h3&gt;

&lt;p&gt;TimeWarden demonstrates the transformative potential of integrating autonomous agents into everyday business processes. By leveraging Copilot Studio and some specialized agents, TimeWarden automates time-off request management. Throughout this project, several important insights have come to light:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Multi-Agent Collaboration:&lt;/strong&gt; Breaking down complex tasks into specialized agents not only improved performance but also underscored the importance of a modular approach in designing scalable AI systems.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Prompt Engineering Mastery:&lt;/strong&gt; The iterative refinement of prompts was crucial for aligning AI outputs with specific business logic. This process highlighted both the power and the current limitations of AI-driven instructions, prompting continuous enhancements.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scalability and Flexibility Challenges:&lt;/strong&gt; Although Copilot Studio provides powerful automation features, its current constraints and the requirement for explicit control in workflows pose challenges that need to be overcome to scale to more complex solutions.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Overall, TimeWarden stands as a compelling proof of concept that not only streamlines time-off management but also lays the groundwork for next-generation AI-powered business automation. Additionally, the continuous evolution of Copilot Studio and autonomous agents is likely to inspire further innovation in how organizations harness AI to drive measurable business value.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Source code&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;You can find all the source code, with detailed installation instructions, at:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/zodraz/TimeWarden" rel="noopener noreferrer"&gt;GitHub - zodraz/TimeWarden: A multi-agent Copilot&lt;/a&gt;&lt;/p&gt;

</description>
      <category>genai</category>
      <category>artificialintelligen</category>
      <category>copilotstudio</category>
      <category>conversationalai</category>
    </item>
  </channel>
</rss>
