This article is part of #ServerlessSeptember. You'll find other helpful articles, detailed tutorials, and videos in this all-things-Serverless content collection. New articles are published every day (that's right, every day) from community members and cloud advocates in the month of September.
Find out more about how Microsoft Azure enables your Serverless functions at https://docs.microsoft.com/azure/azure-functions/.
The managed environment, consumption-based billing, and event-based programming model of Azure Functions make them an ideal compute platform for many cloud applications, and their simplicity makes it super easy to get started and try out ideas. However, without care it's easy for a codebase to grow to contain a lot of repetitive boilerplate and inconsistencies. Additionally, the fairly raw nature of HTTP functions can make it difficult to build a REST API, as you need to provide a lot of the plumbing you'd get for free with something like ASP.NET Core (a much-requested feature for Azure Functions is the ability to run ASP.NET Core - I'm sure it will arrive eventually, but it's not here yet!).
Many examples of Azure Functions focus on simple use cases and don't really address these concerns. As a result you'll often see the trigger, logging, serialization, and other cross-cutting concerns wrapped up with the business problem, and it all gets a bit messy. While this can be fine for small projects, it can become a maintenance nightmare when you have more than a handful of functions; it limits reuse and can make it difficult to host your business logic in different environments.
To address this I created Function Monkey - a framework that lets you write Azure Functions using the command and mediator pattern in a way that naturally leads to a clean separation of concerns. It also provides the infrastructure for building REST APIs and uses a declarative, builder-style approach to creating a serverless application based on Azure Functions.
To demonstrate how this works, I'm going to show how you can use Function Monkey to build out the backend for a simple ToDo application in C# that uses Cosmos DB as a data store, and then show how the command dispatcher / mediator can be used to address additional cross-cutting concerns.
There's a lot of code in this post and the code for the completed project can be found on GitHub.
Creating a REST API
Firstly, using the IDE or editor of your choice (I use Rider), create an empty Azure Functions application and then install two NuGet packages:
dotnet add package FunctionMonkey -v 3.0.11-beta5
dotnet add package FunctionMonkey.Compiler -v 3.0.12-beta5
Note: we're adding beta versions of Function Monkey here, so if you're using an IDE you may need to enable pre-release packages. One of the additions to the 3.x series is support for Service Bus sessions, which requires referencing a beta version of the Microsoft.Azure.WebJobs.Extensions.ServiceBus package; the Function Monkey packages will be moved out of preview as soon as that one is, but in all other respects they are production ready.
We'll begin by creating a model to represent a ToDo item. Create a folder in the project called Models and in that folder create a class called ToDoItem:
using System;

namespace ToDo.Models
{
    public class ToDoItem
    {
        public string Id { get; set; }
        public string CreatedByUserId { get; set; }
        public string Title { get; set; }
        public DateTime CreatedAtUtc { get; set; }
        public bool IsComplete { get; set; }
    }
}
Next create a folder in the project called Commands and in that folder create a class called AddToDoItemCommand that contains the state needed to create an item for a given user:
using System;
using AzureFromTheTrenches.Commanding.Abstractions;
using ToDo.Models;

namespace ToDo.Commands
{
    public class AddToDoItemCommand : ICommand<ToDoItem>
    {
        public string UserId { get; set; }
        public string Title { get; set; }
    }
}
ICommand is a Function Monkey defined interface that identifies a class as a command. In this case we also declare a response type for the command and set it to our ToDoItem model - we'll return the newly created item. If your command does not have a response, there is a non-generic variant that can be used to declare a command with no response.
Now that we have a mechanism for representing a ToDo item and a means of expressing our intent to create one, we need to implement the functionality that will carry this out. First create a folder called Services and in there create an interface called IToDoItemRepository:
using System;
using System.Threading.Tasks;
using ToDo.Models;

namespace ToDo.Services
{
    internal interface IToDoItemRepository
    {
        Task Upsert(ToDoItem toDoItem);
    }
}
We'll use Azure Cosmos DB as our database for this simple application and so before we can fill out the implementation of our repository we'll need to add the Cosmos DB package:
dotnet add package Microsoft.Azure.Cosmos
This will add the latest 3.x version of the Cosmos client, which is a big improvement on previous releases with many usability and performance enhancements.
You'll need to create a Cosmos account and, inside that, a database and a document container; if you're new to Cosmos you can follow these instructions. As you create the resources, take note of the Cosmos connection string, the database name, and the container name, as we'll need them in the next steps.
Now we'll need a way of getting the connection information for Cosmos to our repository. To do this we'll create a class called ApplicationSettings in the root of our project:
namespace ToDo
{
    internal class ApplicationSettings
    {
        public string CosmosConnectionString { get; set; }
        public string CosmosDatabaseName { get; set; }
        public string CosmosContainerName { get; set; }
    }
}
And now we can create an implementation of our repository interface in the Services folder called ToDoItemRepository:
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;
using ToDo.Models;

namespace ToDo.Services
{
    internal class ToDoItemRepository : IToDoItemRepository
    {
        private readonly CosmosClient _cosmosClient;
        private readonly string _databaseName;
        private readonly string _containerName;

        public ToDoItemRepository(ApplicationSettings settings)
        {
            CosmosClientOptions options = new CosmosClientOptions()
            {
                SerializerOptions = new CosmosSerializationOptions()
                {
                    PropertyNamingPolicy = CosmosPropertyNamingPolicy.CamelCase
                }
            };
            _cosmosClient = new CosmosClient(settings.CosmosConnectionString, options);
            _databaseName = settings.CosmosDatabaseName;
            _containerName = settings.CosmosContainerName;
        }

        public async Task Upsert(ToDoItem toDoItem)
        {
            Container container = _cosmosClient.GetContainer(_databaseName, _containerName);
            await container.UpsertItemAsync(toDoItem, new PartitionKey(toDoItem.Id));
        }
    }
}
In the sample above we're using a feature of the new 3.x Cosmos client to serialize our models with camel-case property names - previously we would have needed to mark the Id property on the ToDoItem class with JsonProperty("id") or use lower case for its name. In our Upsert method we simply use the client to upsert a to do item.
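To see what the camel-case option does to a stored document, here's a stdlib-only sketch using System.Text.Json's camelCase naming policy, which behaves the same way for this model (this is an illustration of the effect, not the Cosmos serializer itself, and the ToDoItem class is a local copy of the model from earlier):

```csharp
using System;
using System.Text.Json;

public class ToDoItem
{
    public string Id { get; set; }
    public string CreatedByUserId { get; set; }
    public string Title { get; set; }
    public DateTime CreatedAtUtc { get; set; }
    public bool IsComplete { get; set; }
}

public static class CamelCaseDemo
{
    // Serialize with camelCase property names, mimicking the effect of
    // CosmosPropertyNamingPolicy.CamelCase - "Id" becomes "id", which is
    // the field name Cosmos requires for its document identifier.
    public static string Serialize(ToDoItem item) =>
        JsonSerializer.Serialize(item, new JsonSerializerOptions
        {
            PropertyNamingPolicy = JsonNamingPolicy.CamelCase
        });
}
```

Serializing an item produces JSON like `{"id":"...","createdByUserId":...}`, so no JsonProperty attributes are needed on the model.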
Next we need to wire all this up with a command handler. Create a folder called Handlers and create a class called AddToDoItemCommandHandler:
using System;
using System.Threading.Tasks;
using AzureFromTheTrenches.Commanding.Abstractions;
using ToDo.Commands;
using ToDo.Models;
using ToDo.Services;

namespace ToDo.Handlers
{
    internal class AddToDoItemCommandHandler : ICommandHandler<AddToDoItemCommand, ToDoItem>
    {
        private readonly IToDoItemRepository _repository;

        public AddToDoItemCommandHandler(IToDoItemRepository repository)
        {
            _repository = repository;
        }

        public async Task<ToDoItem> ExecuteAsync(AddToDoItemCommand command, ToDoItem previousResult)
        {
            ToDoItem newItem = new ToDoItem
            {
                CreatedAtUtc = DateTime.UtcNow,
                CreatedByUserId = command.UserId,
                Id = Guid.NewGuid().ToString(),
                IsComplete = false,
                Title = command.Title
            };
            await _repository.Upsert(newItem);
            return newItem;
        }
    }
}
Command handlers are marked by their implementation of the ICommandHandler interface, which accepts as its first generic parameter the type of the command it responds to and, as its second, optional, generic parameter, the response type of the command. Handlers must then implement the ExecuteAsync method to perform their logic. Although we won't use the functionality in this walk-through, multiple command handlers can run in response to a command, and the previousResult parameter represents the output from the previous handler. In our case it will always be null.
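Although we won't chain handlers here, the previousResult flow is easy to picture with a small, synchronous sketch (illustrative only - these types are not the framework's own code):

```csharp
using System;
using System.Collections.Generic;

public static class HandlerChain
{
    // Run handler functions in order, feeding each one the previous
    // handler's result; the first handler receives the default value
    // (null for reference types) and the last result is returned.
    public static TResult Execute<TCommand, TResult>(
        TCommand command,
        IEnumerable<Func<TCommand, TResult, TResult>> handlers)
    {
        TResult previousResult = default;
        foreach (Func<TCommand, TResult, TResult> handler in handlers)
        {
            previousResult = handler(command, previousResult);
        }
        return previousResult;
    }
}
```

With a single registered handler, as in this walk-through, the loop runs once and previousResult is always the default value when the handler sees it.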
Finally, with our command, command handler, and repository in place, we can wire it all together and expose our add command as a function by providing a function app builder. To do this, create a class called FunctionAppConfiguration in the project root that implements the IFunctionAppConfiguration interface as shown below:
using System;
using System.Net.Http;
using FunctionMonkey.Abstractions;
using FunctionMonkey.Abstractions.Builders;
using Microsoft.Extensions.DependencyInjection;
using ToDo.Commands;
using ToDo.Services;

namespace ToDo
{
    public class FunctionAppConfiguration : IFunctionAppConfiguration
    {
        public void Build(IFunctionHostBuilder builder)
        {
            builder
                .Setup((serviceCollection, commandRegistry) =>
                {
                    ApplicationSettings applicationSettings = new ApplicationSettings
                    {
                        CosmosContainerName = "todoItems",
                        CosmosDatabaseName = "testdatabase",
                        CosmosConnectionString =
                            Environment.GetEnvironmentVariable("cosmosConnectionString", EnvironmentVariableTarget.Process)
                    };
                    serviceCollection
                        .AddSingleton(applicationSettings)
                        .AddTransient<IToDoItemRepository, ToDoItemRepository>();
                    commandRegistry.Discover<FunctionAppConfiguration>();
                })
                .Functions(functions => functions
                    .HttpRoute("api/v1/todoItem", route => route
                        .HttpFunction<AddToDoItemCommand>(HttpMethod.Post)
                    )
                );
        }
    }
}
The Setup method exposes an IServiceCollection interface for us to register our dependencies - in this case we register the repository (required by the handler) and the application settings (required by the repository). It also exposes an ICommandRegistry interface for us to register our commands and handlers. We can do this manually or, more straightforwardly, by using the Discover method and providing a generic argument of a type in the assembly that contains the handlers.
Next, the Functions method allows us to register our functions. In this case we declare an HTTP route and a function on that route that binds the POST verb to the AddToDoItemCommand.
Before we can run this we have just two final changes to make. First (for reasons that will become more obvious a little later) we need to prevent Azure Functions from prefixing "api" onto each of our functions. We do this by creating a host.json file with the following contents:
{
  "version": "2.0",
  "extensions": {
    "http": {
      "routePrefix": ""
    }
  }
}
And finally we need to provide the connection information for Cosmos, as we're going to be running these functions locally we'll set this up in our local.settings.json file:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "cosmosConnectionString": "<enter your connection string here>"
  }
}
Azure Functions needs a storage account to run, and in this example I've used the storage emulator. If you're running on a Mac you can use a real storage account or Azurite (an open source alternative to the Azure Storage Emulator).
With all that done we can, at last, run our Azure Functions app. If we run the app we can make a request in Postman to our endpoint and see the created ToDo item returned:
Great - we can call our endpoint using the POST verb and have a to do item created, saved in Cosmos, and returned. However, if you look at the payload we sent from Postman you might notice we're passing in a user ID. This is intended to be the ID of the signed-in user, but being able to pass it as part of the payload would mean any authentication and authorisation system could be circumvented. To address this we'll set Function Monkey up so that it reads the user ID from a claim. In a real example we'd use a signed token in the Authorization header, but to demonstrate how this works we'll simply validate that a token is supplied and set things up with a hard-coded user ID. If you're interested in exploring this more fully, see this video, which demonstrates how to configure authorization against Auth0.
First we need to implement a class that can validate a supplied token - we do this by implementing the ITokenValidator interface. Add the class TokenValidator to the root of the project:
using System.Security.Claims;
using System.Threading.Tasks;
using FunctionMonkey.Abstractions;

namespace ToDo
{
    public class TokenValidator : ITokenValidator
    {
        public Task<ClaimsPrincipal> ValidateAsync(string authorizationHeader)
        {
            if (string.IsNullOrEmpty(authorizationHeader))
            {
                return Task.FromResult<ClaimsPrincipal>(null);
            }
            ClaimsPrincipal claimsPrincipal = new ClaimsPrincipal(
                new[]
                {
                    new ClaimsIdentity(new[]
                    {
                        new Claim("userId", "user1"),
                    }),
                }
            );
            return Task.FromResult(claimsPrincipal);
        }
    }
}
For authorized users, token validators should return a ClaimsPrincipal (normally constructed from the claims in a bearer token - see the documentation here); for unauthorized users they should return null (or throw an exception). In this example we create a claim named userId and give it a hard-coded value of user1 whenever any non-zero-length value is passed in the Authorization header.
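The claims mapping we configure next effectively performs a lookup like the one below against this principal; this is a stdlib-only sketch of the idea, not Function Monkey's actual implementation:

```csharp
using System.Security.Claims;

public static class ClaimReader
{
    // Mirrors what a claims mapping does with the principal returned by
    // our validator: find a claim by type and take its value, or null
    // when the claim is absent.
    public static string GetClaimValue(ClaimsPrincipal principal, string claimType) =>
        principal?.FindFirst(claimType)?.Value;
}
```

Given the principal built by our TokenValidator, looking up "userId" yields "user1".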
Now we need to tell Function Monkey to use this validator and to map the claim on to the UserId property of our command. We do this using the Authorization builder options as shown below:
public void Build(IFunctionHostBuilder builder)
{
    builder
        .Setup( /* as before */ )
        .Authorization(authorization => authorization
            .TokenValidator<TokenValidator>()
            .AuthorizationDefault(AuthorizationTypeEnum.TokenValidation)
            .Claims(claims => claims
                .MapClaimToCommandProperty<AddToDoItemCommand>("userId", cmd => cmd.UserId)
            )
        )
        .Functions( /* as before */ );
}
We're doing three things here:
- We use the TokenValidator<> method to register our token validator implementation as the one Function Monkey should use.
- We use the AuthorizationDefault method to tell Function Monkey that the default authorization mode for all our functions is token validation.
- We use the Claims method and the MapClaimToCommandProperty method to tell Function Monkey to set the UserId property on the AddToDoItemCommand from the userId claim in our claims principal.
With that done we have one more thing to do - we need to tell Function Monkey never to allow our UserId property to be set by anything other than a claims mapping; that way no one can use the request body or a query, header, or route parameter to set the user ID. We do that by marking it with the SecurityProperty attribute on the command model:
public class AddToDoItemCommand : ICommand<ToDoItem>
{
    [SecurityProperty]
    public string UserId { get; set; }
    public string Title { get; set; }
}
With that done, let's try making a POST request in Postman. When we do so without setting an Authorization header we get a 401 Unauthorized error as shown below:
But if we set an Authorization header the request is successful and the user ID of the created item is set to user1 - the value we set in our claims principal:
Next let's add some validation to our inputs. Function Monkey supports validation of commands with any validation framework of your choice, but includes a package for easy use of FluentValidation. First we'll need to add the package:
dotnet add package FunctionMonkey.FluentValidation -v 3.0.11-beta5
Next create a folder called Validators and in there create a class called AddToDoItemCommandValidator:
using FluentValidation;
using ToDo.Commands;

namespace ToDo.Validators
{
    public class AddToDoItemCommandValidator : AbstractValidator<AddToDoItemCommand>
    {
        public AddToDoItemCommandValidator()
        {
            RuleFor(x => x.UserId).NotEmpty();
            RuleFor(x => x.Title).NotEmpty().MaximumLength(128);
        }
    }
}
The above is just standard FluentValidation code. Note that we can validate the UserId because validation takes place after claims mapping has occurred. We now need to register our validators in our builder, and we can do this in a similar way to how we added our commands and handlers:
public void Build(IFunctionHostBuilder builder)
{
    builder
        .Setup((serviceCollection, commandRegistry) =>
        {
            ApplicationSettings applicationSettings = new ApplicationSettings
            {
                CosmosContainerName = "todoItems",
                CosmosDatabaseName = "testdatabase",
                CosmosConnectionString =
                    Environment.GetEnvironmentVariable("cosmosConnectionString", EnvironmentVariableTarget.Process)
            };
            serviceCollection
                .AddSingleton(applicationSettings)
                .AddTransient<IToDoItemRepository, ToDoItemRepository>()
                .AddValidatorsFromAssemblyContaining<FunctionAppConfiguration>() // add this line
                ;
            commandRegistry.Discover<FunctionAppConfiguration>();
        })
        .AddFluentValidation() // add this line
        .Authorization( /* as before */ )
        .Functions( /* as before */ );
}
If we run this and use Postman to send a POST request with an invalid, zero-length title, we can see a bad request response returned:
With all that up and running, let's add a new command - or rather a query - that allows us to return all the to do items for a given user. Again we begin by creating a class to represent our intent - in the Commands folder create the class GetAllToDoItemsQuery:
using System.Collections.Generic;
using AzureFromTheTrenches.Commanding.Abstractions;
using FunctionMonkey.Abstractions;
using ToDo.Models;

namespace ToDo.Commands
{
    public class GetAllToDoItemsQuery : ICommand<IReadOnlyCollection<ToDoItem>>
    {
        [SecurityProperty]
        public string UserId { get; set; }
    }
}
You'll notice that we've marked the UserId property as a security property as before; however, we don't have a claims mapping for this command. We could add a claims mapping for each command type, but this could be difficult to maintain with a large number of commands. Instead we'll update our claims mapping to use a convention-based approach so that any property called UserId, on any command, is populated from the userId claim. This requires a small change to our builder:
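A convention-based mapping like this can be pictured as a reflection loop over command properties; the sketch below (including the SampleCommand type) is purely illustrative and is not Function Monkey's implementation:

```csharp
using System.Reflection;
using System.Security.Claims;

public static class ConventionClaimMapper
{
    // For a registered claim type / property name pair, overwrite the
    // matching writable property on any command object with the value
    // of the claim; do nothing if the claim or property is absent.
    public static void Apply(object command, ClaimsPrincipal principal,
        string claimType, string propertyName)
    {
        PropertyInfo property = command.GetType().GetProperty(propertyName);
        Claim claim = principal.FindFirst(claimType);
        if (property != null && claim != null && property.CanWrite)
        {
            property.SetValue(command, claim.Value);
        }
    }
}

public class SampleCommand
{
    // Matches the convention's property name, like our commands' UserId
    public string UserId { get; set; }
}
```

Because the lookup is by property name, every current and future command with a UserId property picks up the mapping with no extra registration.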
public void Build(IFunctionHostBuilder builder)
{
    builder
        .Setup( /* as before */ )
        .AddFluentValidation() // as before
        .Authorization(authorization => authorization
            .TokenValidator<TokenValidator>()
            .AuthorizationDefault(AuthorizationTypeEnum.TokenValidation)
            .Claims(claims => claims
                .MapClaimToPropertyName("userId", "UserId") // our change!
            )
        )
        .Functions( /* as before */ );
}
Now we need to add a method to our repository interface:
internal interface IToDoItemRepository
{
    Task Upsert(ToDoItem toDoItem);
    Task<IReadOnlyCollection<ToDoItem>> Get(string userId);
}
And to the implementation:
internal class ToDoItemRepository : IToDoItemRepository
{
    private readonly CosmosClient _cosmosClient;
    private readonly string _databaseName;
    private readonly string _containerName;

    public ToDoItemRepository(ApplicationSettings settings) { /* as before */ }

    public async Task Upsert(ToDoItem toDoItem) { /* as before */ }

    public async Task<IReadOnlyCollection<ToDoItem>> Get(string userId)
    {
        string query = "SELECT * FROM c WHERE c.createdByUserId = @userId";
        Container container = _cosmosClient.GetContainer(_databaseName, _containerName);
        QueryDefinition queryDefinition = new QueryDefinition(query)
            .WithParameter("@userId", userId);
        FeedIterator<ToDoItem> queryResultSetIterator = container.GetItemQueryIterator<ToDoItem>(queryDefinition);
        List<ToDoItem> items = new List<ToDoItem>();
        while (queryResultSetIterator.HasMoreResults)
        {
            FeedResponse<ToDoItem> currentResultSet = await queryResultSetIterator.ReadNextAsync();
            items.AddRange(currentResultSet);
        }
        return items;
    }
}
Next we need a command handler:
using System.Collections.Generic;
using System.Threading.Tasks;
using AzureFromTheTrenches.Commanding.Abstractions;
using ToDo.Commands;
using ToDo.Models;
using ToDo.Services;

namespace ToDo.Handlers
{
    internal class GetAllToDoItemsQueryHandler : ICommandHandler<GetAllToDoItemsQuery, IReadOnlyCollection<ToDoItem>>
    {
        private readonly IToDoItemRepository _repository;

        public GetAllToDoItemsQueryHandler(IToDoItemRepository repository)
        {
            _repository = repository;
        }

        public async Task<IReadOnlyCollection<ToDoItem>> ExecuteAsync(
            GetAllToDoItemsQuery command,
            IReadOnlyCollection<ToDoItem> previousResult)
        {
            return await _repository.Get(command.UserId);
        }
    }
}
And then we need to register our command with a function using our builder:
public void Build(IFunctionHostBuilder builder)
{
    builder
        .Setup( /* unchanged */ )
        .Authorization( /* unchanged */ )
        .Functions(functions => functions
            .HttpRoute("api/v1/todoItem", route => route
                .HttpFunction<AddToDoItemCommand>(HttpMethod.Post)
                .HttpFunction<GetAllToDoItemsQuery>(HttpMethod.Get) // our addition
            )
        );
}
If we run our application and use Postman to call our endpoint we'll see an array of the items we've added so far returned:
Let's wrap up the API by adding a function that allows items to be marked as complete. Again we'll need to add a command:
using AzureFromTheTrenches.Commanding.Abstractions;

namespace ToDo.Commands
{
    public class MarkItemCompleteCommand : ICommand
    {
        public string UserId { get; set; }
        public string ItemId { get; set; }
    }
}
We'll need to add a method to our repository to get a single item by ID (I'll leave out the interface addition as it should be self-explanatory):
public async Task<ToDoItem> GetSingleItem(string itemId)
{
    Container container = _cosmosClient.GetContainer(_databaseName, _containerName);
    return await container.ReadItemAsync<ToDoItem>(itemId, new PartitionKey(itemId));
}
And we'll need a command handler:
using System.Threading.Tasks;
using AzureFromTheTrenches.Commanding.Abstractions;
using ToDo.Commands;
using ToDo.Models;
using ToDo.Services;

namespace ToDo.Handlers
{
    internal class MarkItemCompleteCommandHandler : ICommandHandler<MarkItemCompleteCommand>
    {
        private readonly IToDoItemRepository _repository;

        public MarkItemCompleteCommandHandler(IToDoItemRepository repository)
        {
            _repository = repository;
        }

        public async Task ExecuteAsync(MarkItemCompleteCommand command)
        {
            ToDoItem item = await _repository.GetSingleItem(command.ItemId);
            item.IsComplete = true;
            await _repository.Upsert(item);
        }
    }
}
And finally we'll need to map the command to a function. We'll use the PUT verb and supply the item ID using a route parameter:
public void Build(IFunctionHostBuilder builder)
{
    builder
        .Setup((serviceCollection, commandRegistry) => { /* as before */ })
        .AddFluentValidation() // as before
        .Authorization( /* as before */ )
        .Functions(functions => functions
            .HttpRoute("api/v1/todoItem", route => route
                .HttpFunction<AddToDoItemCommand>(HttpMethod.Post)
                .HttpFunction<GetAllToDoItemsQuery>(HttpMethod.Get)
                // our addition
                .HttpFunction<MarkItemCompleteCommand>("/{itemId}/complete", HttpMethod.Put)
            )
        );
}
If we run this in Postman we'll get a simple 200 response:
If we look at our MarkItemCompleteCommandHandler we can see that it's possible for a user to change another user's item completion status if they know the ID of a to do item. As a final change to the API, we'll look at how we can prevent this and return a 401 Unauthorized response using an HTTP response handler. First let's create an exception that we'll throw in this situation:
using System;

namespace ToDo
{
    public class UnauthorizedItemAccessException : Exception
    {
    }
}
Next we'll update our command handler to throw this exception if the user ID on the command doesn't match the user ID of the to do item:
public async Task ExecuteAsync(MarkItemCompleteCommand command)
{
    ToDoItem item = await _repository.GetSingleItem(command.ItemId);
    if (item.CreatedByUserId != command.UserId)
    {
        throw new UnauthorizedItemAccessException();
    }
    item.IsComplete = true;
    await _repository.Upsert(item);
}
Finally we need a way of translating our .NET exception into a 401 HTTP response. Function Monkey allows you to register an HTTP response handler that can be used to shape responses. Create a class in the root called HttpResponseHandler that implements the IHttpResponseHandler interface:
using System;
using System.Threading.Tasks;
using AzureFromTheTrenches.Commanding;
using AzureFromTheTrenches.Commanding.Abstractions;
using FunctionMonkey.Abstractions.Http;
using FunctionMonkey.Commanding.Abstractions.Validation;
using Microsoft.AspNetCore.Mvc;

namespace ToDo
{
    public class HttpResponseHandler : IHttpResponseHandler
    {
        public Task<IActionResult> CreateResponseFromException<TCommand>(TCommand command, Exception ex) where TCommand : ICommand
        {
            // Exceptions thrown from inside command handlers are rethrown as CommandExecutionExceptions with the
            // inner exception set to the handler-raised exception - so we need to unwrap this to shape our response
            Exception unwrappedException = ex is CommandExecutionException ? ex.InnerException : ex;
            if (unwrappedException is UnauthorizedItemAccessException)
            {
                return Task.FromResult((IActionResult) new UnauthorizedResult());
            }
            // returning null tells Function Monkey to create its standard response
            return null;
        }

        public Task<IActionResult> CreateResponse<TCommand, TResult>(TCommand command, TResult result) where TCommand : ICommand<TResult>
        {
            return null; // returning null tells Function Monkey to create its standard response
        }

        public Task<IActionResult> CreateResponse<TCommand>(TCommand command)
        {
            return null; // returning null tells Function Monkey to create its standard response
        }

        public Task<IActionResult> CreateValidationFailureResponse<TCommand>(TCommand command, ValidationResult validationResult) where TCommand : ICommand
        {
            return null; // returning null tells Function Monkey to create its standard response
        }
    }
}
In the above code we simply look for our exception and translate it to the appropriate response. Now we need to update our builder to register it:
public void Build(IFunctionHostBuilder builder)
{
    builder
        .Setup((serviceCollection, commandRegistry) => { /* as before */ })
        .AddFluentValidation() // as before
        .Authorization( /* as before */ )
        .DefaultHttpResponseHandler<HttpResponseHandler>() // added
        .Functions( /* as before */ );
}
This is a little difficult to test with our mocked-up authorization handler, but you can do so by changing the user ID we set in our TokenValidator and attempting to mark an item we created earlier as complete. If you do, you will receive a 401 Unauthorized in Postman.
Finally, let's add OpenAPI support. To do this, all we need to do is add a new section to the builder:
public void Build(IFunctionHostBuilder builder)
{
    builder
        .Setup((serviceCollection, commandRegistry) => { /* as before */ })
        .AddFluentValidation() // as before
        .Authorization( /* as before */ )
        .DefaultHttpResponseHandler<HttpResponseHandler>() // as before
        .OpenApiEndpoint(openApi => openApi
            .Title("ToDo API")
            .Version("1.0.0")
            .UserInterface()
        )
        .Functions( /* as before */ );
}
With this section added if we run our project and visit http://localhost:7071/openapi we will find the OpenAPI explorer for our project:
It's worth noting that if we hadn't customised our route prefix in the host.json file earlier, we wouldn't have been able to place this at the root - instead it would have sat under http://localhost:7071/api/openapi (if you want that behaviour you can simply leave out the host.json routePrefix customisation).
Addressing cross-cutting concerns with a custom dispatcher
We've not really seen this in our work so far, but Function Monkey makes use of a commanding / mediator framework to associate commands with handlers. When an event occurs, the following happens:
- The Azure Functions runtime invokes the appropriate trigger
- The trigger is handled by a Function Monkey generated function
- The function deserializes the input to the function
- The function passes the deserialized input to the dispatcher
- The dispatcher locates the handler(s) to invoke
- The dispatcher instantiates and executes the handler
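The last two steps - locating and executing a handler - are the essence of the mediator pattern, and are easy to picture with a stripped-down, self-contained sketch (these types are illustrative only, not Function Monkey's or the commanding framework's real implementation):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// A command optionally carries a response type, as in Function Monkey
public interface ISketchCommand<TResult> { }

// Minimal dispatcher: handlers are registered per command type and the
// dispatcher routes each command instance to the matching handler.
public class SketchDispatcher
{
    private readonly Dictionary<Type, Func<object, Task<object>>> _handlers =
        new Dictionary<Type, Func<object, Task<object>>>();

    public void Register<TCommand, TResult>(Func<TCommand, Task<TResult>> handler)
        where TCommand : ISketchCommand<TResult>
    {
        _handlers[typeof(TCommand)] = async c => (object)await handler((TCommand)c);
    }

    public async Task<TResult> DispatchAsync<TResult>(ISketchCommand<TResult> command)
    {
        // Locate the handler by the command's concrete type, then execute it
        Func<object, Task<object>> handler = _handlers[command.GetType()];
        return (TResult)await handler(command);
    }
}

public class EchoCommand : ISketchCommand<string>
{
    public string Text { get; set; }
}
```

The real framework adds handler discovery, dependency injection, and chaining on top, but the routing idea is the same.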
The common element to every execution is the dispatcher, and we can add behaviour to it to address cross-cutting concerns. Examples might include fine-grained permissions, logging, and telemetry. As an example, we'll implement a dispatcher that uses a stopwatch to measure the duration of handler executions and saves the timings to Azure Application Insights.
To begin with, create an Application Insights instance in Azure by following these instructions, taking note of the instrumentation key. Before you deploy an Azure Functions application into Azure itself it's also worth familiarising yourself with the integration between Functions and App Insights.
Next update the local.settings.json file with the instrumentation key:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "cosmosConnectionString": "<enter your connection string here>",
    "APPINSIGHTS_INSTRUMENTATIONKEY": "<enter your instrumentation key here>"
  }
}
Next add the Application Insights NuGet package:
dotnet add package Microsoft.ApplicationInsights.AspNetCore
Now create an interface called ITelemetry in the Services folder:
using System;

namespace ToDo.Services
{
    internal interface ITelemetry
    {
        void RecordExecutionTime(Type commandType, double executionTime);
    }
}
And an implementation called Telemetry:
using System;
using System.Collections.Generic;
using Microsoft.ApplicationInsights;

namespace ToDo.Services
{
    internal class Telemetry : ITelemetry
    {
        private readonly TelemetryClient _telemetryClient = new TelemetryClient();

        public void RecordExecutionTime(Type commandType, double executionTime)
        {
            _telemetryClient.TrackEvent(commandType.Name, metrics: new Dictionary<string, double>
            {
                { "executionTimeMs", executionTime }
            });
        }
    }
}
Now we need to decorate the default dispatcher with one that records our telemetry. To do this we need to implement ICommandDispatcher as shown below. Create a class called CustomDispatcher in the Services folder:
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;
using AzureFromTheTrenches.Commanding.Abstractions;
using AzureFromTheTrenches.Commanding.Abstractions.Model;

namespace ToDo.Services
{
    internal class CustomDispatcher : ICommandDispatcher
    {
        private readonly IFrameworkCommandDispatcher _underlyingDispatcher;
        private readonly ITelemetry _telemetry;

        public CustomDispatcher(
            IFrameworkCommandDispatcher underlyingDispatcher,
            ITelemetry telemetry)
        {
            _underlyingDispatcher = underlyingDispatcher;
            _telemetry = telemetry;
        }

        public async Task<CommandResult<TResult>> DispatchAsync<TResult>(ICommand<TResult> command, CancellationToken cancellationToken = new CancellationToken())
        {
            Stopwatch sw = Stopwatch.StartNew();
            CommandResult<TResult> result = await _underlyingDispatcher.DispatchAsync(command, cancellationToken);
            sw.Stop();
            _telemetry.RecordExecutionTime(command.GetType(), sw.ElapsedMilliseconds);
            return result;
        }

        public async Task<CommandResult> DispatchAsync(ICommand command, CancellationToken cancellationToken = new CancellationToken())
        {
            Stopwatch sw = Stopwatch.StartNew();
            CommandResult result = await _underlyingDispatcher.DispatchAsync(command, cancellationToken);
            sw.Stop();
            _telemetry.RecordExecutionTime(command.GetType(), sw.ElapsedMilliseconds);
            return result;
        }

        public ICommandExecuter AssociatedExecuter => null;
    }
}
The two DispatchAsync methods are responsible for dispatching commands to the appropriate handlers (which can be local or remote - see the documentation here). IFrameworkCommandDispatcher is the default implementation provided by the commanding framework: ICommandDispatcher is the interface used by consumers of the framework, but we don't decorate it directly - IFrameworkCommandDispatcher is provided to avoid awkward-to-resolve circular dependencies.
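The decoration trick itself is plain interface wrapping; here's a minimal, self-contained illustration of the pattern with nothing Function Monkey-specific in it (the greeter types are invented for the example):

```csharp
public interface IGreeter
{
    string Greet(string name);
}

public class Greeter : IGreeter
{
    public string Greet(string name) => $"Hello, {name}";
}

// Decorator: implements the same interface, forwards to the wrapped
// instance, and adds behaviour around the call - here just counting
// invocations, where CustomDispatcher times them instead.
public class CountingGreeter : IGreeter
{
    private readonly IGreeter _inner;
    public int CallCount { get; private set; }

    public CountingGreeter(IGreeter inner) => _inner = inner;

    public string Greet(string name)
    {
        CallCount++;               // the cross-cutting concern
        return _inner.Greet(name); // delegate to the real implementation
    }
}
```

Callers keep depending on IGreeter and never know the decorator is there, which is exactly why replacing ICommandDispatcher in the container transparently times every command.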
We now need to register our dispatcher and telemetry service in our builder:
public void Build(IFunctionHostBuilder builder)
{
    builder
        .Setup((serviceCollection, commandRegistry) =>
        {
            // ...
            serviceCollection
                .AddSingleton(applicationSettings)
                .AddTransient<IToDoItemRepository, ToDoItemRepository>()
                .AddValidatorsFromAssemblyContaining<FunctionAppConfiguration>()
                // our additions
                .AddSingleton<ITelemetry, Telemetry>()
                .Replace(
                    new ServiceDescriptor(
                        typeof(ICommandDispatcher),
                        typeof(CustomDispatcher),
                        ServiceLifetime.Transient)
                )
                ;
            // ...
        })
        .AddFluentValidation() // as before
        .Authorization( /* as before */ )
        .DefaultHttpResponseHandler<HttpResponseHandler>() // as before
        .OpenApiEndpoint( /* as before */ )
        .Functions( /* as before */ );
}
If we run our project and execute a few commands in Postman, then after a short delay we'll see our events logged in the Application Insights portal.
Adding a Service Bus queue function
In this final section let's assume that we also want to allow items to be added through a Service Bus queue. First we'll need to create a Service Bus namespace; you can follow the instructions here. In the portal also create a queue called newtodoitem (there's a button on the toolbar) and take note of the connection string for the RootManageSharedAccessKey policy in the Shared access policies section.
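If you prefer the command line to the portal, the Azure CLI can perform the same steps. The resource group name, namespace name, and location below are placeholders; substitute your own:

```shell
# Create a Service Bus namespace (names and location are placeholders)
az servicebus namespace create --resource-group my-rg --name my-todo-ns --location westeurope

# Create the queue the function will listen on
az servicebus queue create --resource-group my-rg --namespace-name my-todo-ns --name newtodoitem

# Retrieve the connection string for the RootManageSharedAccessKey policy
az servicebus namespace authorization-rule keys list \
    --resource-group my-rg --namespace-name my-todo-ns \
    --name RootManageSharedAccessKey \
    --query primaryConnectionString --output tsv
```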
Next add the connection string to the local.settings.json file:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "cosmosConnectionString": "<enter your connection string here>",
    "APPINSIGHTS_INSTRUMENTATIONKEY": "<enter your instrumentation key here>",
    "serviceBusConnectionString": "<enter your service bus connection string here>"
  }
}
Now add the Microsoft.Azure.WebJobs.Extensions.ServiceBus package to the project:
dotnet add package Microsoft.Azure.WebJobs.Extensions.ServiceBus
And now all we need to do is associate the AddToDoItemCommand we created earlier with an Azure Function triggered by our Service Bus queue:
public void Build(IFunctionHostBuilder builder)
{
    builder
        .Setup((serviceCollection, commandRegistry) => /* as before */)
        .AddFluentValidation() // as before
        .Authorization( /* as before */ )
        .DefaultHttpResponseHandler<HttpResponseHandler>() // as before
        .OpenApiEndpoint( /* as before */ )
        .Functions(functions => functions
            .HttpRoute("api/v1/todoItem", route => route
                .HttpFunction<AddToDoItemCommand>(HttpMethod.Post)
                .HttpFunction<GetAllToDoItemsQuery>(HttpMethod.Get)
                .HttpFunction<MarkItemCompleteCommand>("/{itemId}/complete", HttpMethod.Put)
            )
            // Our addition
            .ServiceBus(serviceBus => serviceBus
                .QueueFunction<AddToDoItemCommand>("newtodoitem")
            )
        );
}
Now run the application and then, using the tool of your choice (for example Azure Service Bus Explorer), post the following item to the queue:
{
  "userId": "sbuser",
  "title": "Item from Service Bus"
}
This will result in the AddToDoItemCommandHandler being invoked, along with our validations, and the item being saved to Cosmos DB. Note that although we marked the UserId property with the SecurityProperty attribute, those attributes only apply to HTTP triggers, so in this case we can set the UserId property ourselves.
This simple addition also demonstrates the value of separating out our concerns, specifically the trigger from the business logic: with just a few lines of code we've been able to invoke all our "add to do item" business logic via a different protocol.
Concluding
In addition to the features we've explored here there is support for other trigger types, output bindings and a variety of cross cutting concerns, details of which can be found in the project's documentation. And if you need to do something not supported by Function Monkey, standard Azure Functions can happily co-exist alongside Function Monkey functions.
It's also worth exploring the underlying commanding framework in its own right: the patterns explored here can be useful tools for organising large applications and evolving them from modular monoliths to microservice architectures, and the framework includes capabilities that allow execution to be both abstracted and remoted over various channels (message queues and HTTP, for example). If you're interested in exploring this further then this series may be useful, along with this slide deck.
Finally, this post has focused on C#, and while Function Monkey can be used from F#, the builder pattern expressed in a C# style doesn't feel natural in that language. To address this, a more F#-friendly interface is under development and you can find details of that here.
If you have any questions please reach out to me in the comments or on Twitter.