Der Sascha

Posted on • Originally published at blog.bajonczak.com
Event-Driven Integration in 2026: Connecting Azure and SAP Without Heavy Middleware

In many enterprises, integration still means nightly batch jobs, CSV exports and brittle point-to-point connections. Meanwhile, the rest of the world expects near real-time updates, automated workflows and systems that talk to each other without human glue.

In 2026, we finally have the tools to make this easier — especially in the Microsoft/Azure ecosystem. Event-driven integration is no longer a buzzword from conference talks; it’s a practical way to connect systems like SAP and Microsoft 365 without building another giant ESB.

In this article, I’ll walk through a concrete, reproducible example:

  • We assume an event happens in SAP (e.g. a sales order is created or changed).
  • This event is exposed via a simple webhook or API proxy.
  • Azure receives the event, routes it via Event Grid and triggers an Azure Function.
  • The Function enriches the data and posts a message into a Microsoft Teams channel.

This is obviously only one direction (SAP → Azure → M365), but the pattern generalises: once you have a clean event-driven backbone, you can plug in other consumers: data warehouses, monitoring, additional line-of-business apps.

I’ll show you:

  1. A high-level architecture
  2. A minimal Terraform setup for Event Grid + Function + Storage
  3. A simple Azure Function implementation
  4. A Copilot prompt you can use to generate a full GitHub repo from this idea

1. High-level architecture

Let’s keep the scenario simple and realistic.

  • SAP is the system of record for orders.
  • We want near real-time notifications in a Teams channel whenever a new high-value order is created.
  • We also want a lightweight way to plug additional consumers onto the same event stream later.

The flow looks like this:

  1. SAP emits an event when an order is created or changed. In practice this could be:
    • an outbound integration via SAP BTP / Integration Suite
    • an OData/REST API call to a small adapter
    • an HTTP webhook from SAP into Azure API Management
  2. Azure API Management receives the OrderCreated event and forwards it to an Azure Function or directly into Event Grid.
  3. Azure Event Grid acts as the event backbone, fanning out the event to one or more subscribers:
    • an Azure Function that posts to a Teams channel
    • optionally a second Function that writes to a data store
    • maybe a Logic App for non-code workflows
  4. The Azure Function formats a message and posts it to a Teams channel via an incoming webhook or Graph API.

Key idea: SAP only cares about emitting a clean event once. Azure takes care of routing and fan-out.
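
Concretely, an OrderCreated event published to a custom Event Grid topic could look like the following. The envelope fields (id, eventType, subject, eventTime, dataVersion) are required by the Event Grid event schema; the eventType name and the data payload are illustrative and match the function shown later:

```json
[
  {
    "id": "b2c1a6e0-1111-4a2b-9c3d-000000000001",
    "eventType": "SAP.OrderCreated",
    "subject": "sap/orders/12345",
    "eventTime": "2026-02-20T18:41:00Z",
    "dataVersion": "1.0",
    "data": {
      "orderId": "12345",
      "customerName": "Contoso AG",
      "amount": 50000,
      "currency": "EUR",
      "createdAt": "2026-02-20T18:41:00Z",
      "sourceSystem": "SAP"
    }
  }
]
```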

2. Terraform: setting up the Azure side

Below is a simplified Terraform configuration that sets up:

  • a resource group
  • a storage account (for the function app)
  • a function app
  • an Event Grid topic
  • an Event Grid subscription that sends events to the function
Note: This is intentionally simplified. In a real setup you’d add proper naming conventions, tags, secrets management (Key Vault), etc.
terraform {
  required_version = ">= 1.5.0"

  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.100"
    }
  }
}

provider "azurerm" {
  features {}
}

variable "location" {
  default = "westeurope"
}

variable "resource_group_name" {
  default = "rg-sap-events-demo"
}

variable "project_name" {
  default = "sap-events-demo"
}

resource "azurerm_resource_group" "rg" {
  name     = var.resource_group_name
  location = var.location
}

resource "random_string" "sa_suffix" {
  length  = 6
  upper   = false
  lower   = true
  numeric = true
  special = false
}

resource "azurerm_storage_account" "sa" {
  name                     = "sapevents${random_string.sa_suffix.result}"
  resource_group_name      = azurerm_resource_group.rg.name
  location                 = azurerm_resource_group.rg.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

resource "azurerm_service_plan" "plan" {
  name                = "${var.project_name}-plan"
  location            = azurerm_resource_group.rg.location
  resource_group_name = azurerm_resource_group.rg.name
  os_type             = "Linux"
  sku_name            = "Y1" # Consumption plan
}

resource "azurerm_linux_function_app" "func" {
  name                = "${var.project_name}-func"
  location            = azurerm_resource_group.rg.location
  resource_group_name = azurerm_resource_group.rg.name
  service_plan_id     = azurerm_service_plan.plan.id

  storage_account_name       = azurerm_storage_account.sa.name
  storage_account_access_key = azurerm_storage_account.sa.primary_access_key

  identity {
    type = "SystemAssigned"
  }

  site_config {
    application_stack {
      node_version = "~18"
    }
  }

  app_settings = {
    FUNCTIONS_WORKER_RUNTIME = "node"
    AzureWebJobsStorage      = azurerm_storage_account.sa.primary_connection_string
    TEAMS_WEBHOOK_URL        = "https://outlook.office.com/webhook/..." # replace
  }
}

resource "azurerm_eventgrid_topic" "topic" {
  name                = "${var.project_name}-topic"
  location            = azurerm_resource_group.rg.location
  resource_group_name = azurerm_resource_group.rg.name
}

resource "azurerm_eventgrid_event_subscription" "func_sub" {
  name  = "${var.project_name}-func-sub"
  scope = azurerm_eventgrid_topic.topic.id

  # Target the Event Grid-triggered function directly instead of a raw
  # webhook URL; "OrderCreated" must match the function name in the app.
  azure_function_endpoint {
    function_id = "${azurerm_linux_function_app.func.id}/functions/OrderCreated"
  }

  retry_policy {
    event_time_to_live    = 1440 # minutes
    max_delivery_attempts = 5
  }
}

3. Azure Function: from event to Teams message

Next, we implement a simple Azure Function that:

  • receives an Event Grid event
  • extracts key fields
  • posts a formatted message into Teams
// File: OrderCreated/index.js
const axios = require("axios");

/*
Expected event:
{
  "orderId": "12345",
  "customerName": "Contoso AG",
  "amount": 50000,
  "currency": "EUR",
  "createdAt": "2026-02-20T18:41:00Z",
  "sourceSystem": "SAP"
}
*/

module.exports = async function (context, eventGridEvent) {
  context.log("Received Event Grid event:", eventGridEvent);

  const data = eventGridEvent.data || {};

  const orderId = data.orderId;
  const customerName = data.customerName;
  const amount = data.amount;
  const currency = data.currency;
  const createdAt = data.createdAt;
  const sourceSystem = data.sourceSystem || "SAP";

  // Use `== null` for amount so a legitimate zero amount is not rejected.
  if (!orderId || !customerName || amount == null || !currency) {
    context.log.warn("Missing required fields in event data, skipping");
    return;
  }

  const webhookUrl = process.env.TEAMS_WEBHOOK_URL;
  if (!webhookUrl) {
    context.log.error("TEAMS_WEBHOOK_URL not configured");
    return;
  }

  const text =
`🚀 New high-value order from ${sourceSystem}

**Order ID:** ${orderId}
**Customer:** ${customerName}
**Amount:** ${amount} ${currency}
${createdAt ? `**Created at:** ${createdAt}` : ""}`;

  try {
    await axios.post(webhookUrl, { text });
    context.log("Posted order notification to Teams");
  } catch (err) {
    context.log.error("Failed to post to Teams", err);
  }
};
// File: OrderCreated/function.json
{
  "bindings": [
    {
      "type": "eventGridTrigger",
      "name": "eventGridEvent",
      "direction": "in"
    }
  ]
}
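
Since the message formatting is pure string work, it can be pulled into a small helper and unit-tested without any Azure dependency. A sketch of that refactoring (the field names match the event shape used above; `formatOrderMessage` is a name I'm introducing here):

```javascript
// File: OrderCreated/formatMessage.js (hypothetical helper extracted from the handler)
function formatOrderMessage(data) {
  const { orderId, customerName, amount, currency, createdAt, sourceSystem = "SAP" } = data;

  // Treat the event as invalid if any required field is absent
  // (`amount == null` keeps a legitimate zero amount valid).
  if (!orderId || !customerName || amount == null || !currency) {
    return null;
  }

  return (
    `🚀 New high-value order from ${sourceSystem}\n\n` +
    `**Order ID:** ${orderId}\n` +
    `**Customer:** ${customerName}\n` +
    `**Amount:** ${amount} ${currency}\n` +
    (createdAt ? `**Created at:** ${createdAt}` : "")
  );
}

module.exports = { formatOrderMessage };
```

The handler body then shrinks to roughly `const text = formatOrderMessage(eventGridEvent.data); if (text) await axios.post(webhookUrl, { text });`, and the interesting logic is testable in plain Node.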

4. How SAP fits into this pattern

On the SAP side, there are multiple ways to emit events:

  • SAP S/4HANA eventing
  • SAP BTP Integration Suite
  • Custom adapter watching table changes

Whichever mechanism you use, the key points are the same:

  • Define a clear event schema
  • Publish events to Event Grid
  • Azure handles routing

Adding new consumers later becomes trivial.
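
"Define a clear event schema" is easy to say and easy to skip. A cheap way to enforce it is a contract check on the publishing side, before anything reaches Event Grid. Here is a minimal sketch, with the field list matching this article's OrderCreated event; in a real setup you would likely use JSON Schema and explicit contract versioning instead:

```javascript
// validateEvent.js — publish-side contract check for OrderCreated events (sketch).
const REQUIRED_FIELDS = ["orderId", "customerName", "amount", "currency"];

function validateOrderEvent(data) {
  // Collect every violation instead of failing on the first one,
  // so rejected events produce a useful log entry.
  const errors = REQUIRED_FIELDS
    .filter((field) => data[field] === undefined || data[field] === null)
    .map((field) => `missing field: ${field}`);

  if (errors.length === 0 && (typeof data.amount !== "number" || data.amount < 0)) {
    errors.push("amount must be a non-negative number");
  }

  return { valid: errors.length === 0, errors };
}

module.exports = { validateOrderEvent };
```

Rejecting malformed events at the source keeps every downstream consumer simpler: they can trust the contract instead of re-validating it.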

5. The Code

The code for this solution is available in my GitHub repository: https://github.com/SBajonczak/BlogSapTeams

6. Closing

The nice thing about this pattern is that it’s not limited to SAP or orders. Once you have an event-driven backbone in Azure, you can:

  • plug in other systems as event sources
  • add new consumers without touching the source system
  • move from nightly batch to near real-time

It’s not the flashiest trend headline, but in 2026 this kind of pragmatic event-driven integration is often where real value hides.
