DEV Community

Sam Vanhoutte
Consuming an Azure ML service from .NET

For a project in the cycling community, I had to predict how long pro riders would take to complete a certain race on the calendar. Different factors, such as the parcours type, the distance, and the elevation, obviously play a role in this.

So I used Azure Machine Learning to create an inference service that is deployed to an Azure compute instance and exposes an API.

In this post, I will explain how you can easily call the inference services of AzureML from .NET, by passing strongly typed objects to the inference method.

All of the code is available in the open-source library Aerozure.

Calling the scoring service from Postman

You can call the service that is hosted on the following URL: https://..inference.ml.azure.com/score.
To authenticate, you have to pass a Bearer token that you can retrieve from the Azure ML portal.

Calling the scoring endpoint is done by posting a JSON payload that looks like the following. As you can see, it contains a list of the column names and their given values.

{
  "input_data": {
    "columns": [
      "month",
      "year",
      "distance",
      "elevation",
      "PointsScale",
      "UciScale",
      "ParcoursType",
      "ProfileScore",
      "RaceRanking",
      "StartlistQuality",
      "Classification",
      "Category"
    ],
    "index": [1],
    "data": [
        [2, 2025, 192, 1523, "2.PRO.Stage", "UCI.WR.Pro.Stage", 1, 22, null, 641, "2.Pro", "ME - Men Elite"]
    ]
  }
}

Streamlining this call from C#

The following code converts a strongly typed .NET class to the required input for the API.

Defining the request in C#

The following class represents the actual data that is being sent to the API endpoint.

InputData.cs

using System.Text.Json.Serialization;
namespace Aerozure.Azureml;

public class InputData
{
    [JsonPropertyName("columns")]
    public string[] Columns { get; set; } = [];

    [JsonPropertyName("index")]
    public int[] Index { get; set; } = [];

    [JsonPropertyName("data")]
    public object[][] Data { get; set; } = [];
}
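As a quick sanity check, serializing an instance of this class with System.Text.Json yields the same shape as the Postman payload. The snippet below is a standalone sketch that duplicates the class, so it compiles on its own:

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

var input = new InputData
{
    Columns = ["month", "year", "distance"],
    Index = [1],
    Data = [[2, 2025, 192]]
};

// The JsonPropertyName attributes control the JSON keys.
Console.WriteLine(JsonSerializer.Serialize(input));
// → {"columns":["month","year","distance"],"index":[1],"data":[[2,2025,192]]}

// Minimal copy of the InputData class above, so this snippet is self-contained.
public class InputData
{
    [JsonPropertyName("columns")] public string[] Columns { get; set; } = [];
    [JsonPropertyName("index")] public int[] Index { get; set; } = [];
    [JsonPropertyName("data")] public object[][] Data { get; set; } = [];
}
```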

The AzureML request

The AzuremlRequest class will be used to serialize the payload that is posted to the API and, most importantly, will convert the strongly typed (POCO) class to this request. For this, some basic reflection is used to loop through all the properties of the object and map those properties and their values into the actual payload.

AzuremlRequest.cs

using System.Reflection;
using System.Text.Json.Serialization;

namespace Aerozure.Azureml;

public class AzuremlRequest
{
    [JsonPropertyName("input_data")] public InputData InputData { get; set; } = new();

    public static AzuremlRequest Create(object payload)
    {
        var columns = new List<string>();
        var data = new List<object>();
        // Reflect over every public property of the payload: the property name
        // becomes a column, the property value becomes the corresponding data cell.
        foreach (PropertyInfo property in payload.GetType().GetProperties())
        {
            columns.Add(property.Name);
            // Null values are allowed here: they serialize to JSON null.
            data.Add(property.GetValue(payload, null)!);
        }

        return new AzuremlRequest
        {
            InputData = new InputData
            {
                Columns = columns.ToArray(),
                Index = [1],
                Data = [data.ToArray()]
            }
        };
    }
}
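To make the reflection concrete, the sketch below (a standalone version with minimal copies of the two classes) feeds Create an anonymous object and shows the property names ending up as columns:

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

// Anonymous types work too: their property names become the column names.
var request = AzuremlRequest.Create(new { month = 2, year = 2025, distance = 192 });
Console.WriteLine(string.Join(", ", request.InputData.Columns));
// → month, year, distance
Console.WriteLine(string.Join(", ", request.InputData.Data[0]));
// → 2, 2025, 192

// Minimal copies of the classes above, so this snippet is self-contained.
public class InputData
{
    public string[] Columns { get; set; } = [];
    public int[] Index { get; set; } = [];
    public object[][] Data { get; set; } = [];
}

public class AzuremlRequest
{
    public InputData InputData { get; set; } = new();

    public static AzuremlRequest Create(object payload)
    {
        var columns = new List<string>();
        var data = new List<object>();
        foreach (PropertyInfo property in payload.GetType().GetProperties())
        {
            columns.Add(property.Name);
            data.Add(property.GetValue(payload, null)!);
        }
        return new AzuremlRequest
        {
            InputData = new InputData
            {
                Columns = columns.ToArray(),
                Index = [1],
                Data = [data.ToArray()]
            }
        };
    }
}
```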

The AzureML Inference client

The AzuremlClient class can be injected and expects the AzuremlOptions to be available on startup. Two properties are required to be set there: BearerToken and InferenceEndpoint.

AzuremlClient.cs

using System.Net.Http.Json;
using Aerozure.Configuration;
using Flurl.Http;
using Microsoft.Extensions.Options;

namespace Aerozure.Azureml;

public class AzuremlClient
{
    private readonly AzuremlOptions mlOptions;

    public AzuremlClient(IOptions<AzuremlOptions> mlOptions)
    {
        this.mlOptions = mlOptions.Value;
    }

    public async Task<T?> CallInferenceAsync<T>(object request)
    {
        var mlRequest = AzuremlRequest.Create(request);
        var response = await mlOptions.InferenceEndpoint
            .WithHeader("Authorization", $"Bearer {mlOptions.BearerToken}")
            .PostJsonAsync(mlRequest);
        return await response.ResponseMessage.Content.ReadFromJsonAsync<T>();
    }
}
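If you prefer not to take the Flurl.Http dependency, the same call can be sketched with plain HttpClient. This is a sketch, not part of Aerozure; building the request is split out into its own method so it is easy to inspect:

```csharp
using System.Net.Http;
using System.Net.Http.Headers;
using System.Net.Http.Json;
using System.Threading.Tasks;

public static class PlainHttpInference
{
    // Builds the POST request with the JSON body and the Bearer header.
    public static HttpRequestMessage BuildRequest(string endpoint, string bearerToken, object payload)
    {
        var request = new HttpRequestMessage(HttpMethod.Post, endpoint)
        {
            Content = JsonContent.Create(payload)
        };
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", bearerToken);
        return request;
    }

    // Posts the payload and deserializes the JSON response to T.
    public static async Task<T?> CallInferenceAsync<T>(
        HttpClient http, string endpoint, string bearerToken, object payload)
    {
        using var request = BuildRequest(endpoint, bearerToken, payload);
        var response = await http.SendAsync(request);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadFromJsonAsync<T>();
    }
}
```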

Using the above code in your project

If you are using Aerozure, you can enable the integration on startup in your Program.cs by executing the following lines of code.

builder.Services
    .AddAerozure()
    .Configure<AzuremlOptions>(options => configuration.GetSection("azureml").Bind(options));
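The "azureml" configuration section referenced above could then look like this in appsettings.json. The values are placeholders; the key names follow the two required AzuremlOptions properties:

```json
{
  "azureml": {
    "InferenceEndpoint": "https://<your-endpoint>.inference.ml.azure.com/score",
    "BearerToken": "<your-bearer-token>"
  }
}
```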

This will make the AzuremlClient available, configured for your endpoint with the right BearerToken.

Calling the inference endpoint can be done by executing the following lines (here using the information of a given Race object).

var raceData = new RaceData
{
    month = race.Date!.Value.Month,
    year = race.Date!.Value.Year,
    distance = race.Distance,
    elevation = race.Elevation,
    PointsScale = race.PointsScale,
    UciScale = race.UciScale,
    ParcoursType = race.ParcoursType,
    ProfileScore = race.ProfileScore,
    RaceRanking = race.RaceRanking,
    StartlistQuality = race.StartlistQuality,
    Classification = race.Classification,
    Category = race.Category
};
var result = await azuremlClient.CallInferenceAsync<double[]>(raceData);
if (result != null)
{
    results.Add(race.Id, new RaceDurationPrediction
    {
        Distance = (int)race.Distance,
        Duration = result.First()
    });
}
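The RaceData class itself is not shown above; a version consistent with the columns of the Postman payload could look like the sketch below. The property names (and their casing) are deliberate: AzuremlRequest.Create uses them verbatim as column names, so they must match what the model expects. The property types are guesses based on the sample payload:

```csharp
using System;
using System.Linq;

// The reflection in AzuremlRequest.Create would emit exactly these column names:
Console.WriteLine(string.Join(",", typeof(RaceData).GetProperties().Select(p => p.Name)));

// Sketch of a POCO whose property names match the model's columns.
public class RaceData
{
    public int month { get; set; }
    public int year { get; set; }
    public double distance { get; set; }
    public double elevation { get; set; }
    public string? PointsScale { get; set; }
    public string? UciScale { get; set; }
    public int? ParcoursType { get; set; }
    public int? ProfileScore { get; set; }
    public int? RaceRanking { get; set; }
    public int? StartlistQuality { get; set; }
    public string? Classification { get; set; }
    public string? Category { get; set; }
}
```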
