Azure functions custom handlers in Go: Logging

Mattias Fjellström ・ 4 min read

Introduction

Azure recently announced general availability of custom handlers for Azure Functions. A custom handler is a lightweight web server that receives HTTP requests from the Functions host. Any programming language or runtime capable of running a web server to receive these requests can be used as a custom handler. In this post I will look at custom handlers in Go, and specifically at logging from a custom handler.

Limitations when logging from a custom handler

There are some limitations to using custom handlers, and the one I will focus on in this post is logging. Logging is not listed as an official limitation, but once you start exploring custom handlers for a production implementation you quickly discover it. The current solution lets you store all your logs in a string array that you return in the response to the Functions host, which then inserts them into your log destination for you.

Why is this a limitation? There are two major concerns.

  1. You can only control the message field of each log. You can't influence any other field, or include custom fields. This includes the severity field, which means all your logs will have the same severity.
  2. All the logs will have close to the same timestamp. This is because they are returned to the host in an array that is then inserted into the log destination in a batch. As far as I can tell the order is preserved, but I would not count on it.

How should I log?

Maybe Azure will extend the logging functionality for custom handlers in the future, but until then: what are your options?

You can still log to stdout with a simple fmt.Println(...), but doing that means your logs lose some context you would otherwise get. What context? For instance, the invocation ID and the name of the function that produced the log.

If your log messages follow a known format you can still use the log array solution and search the message field to find the logs you want. If your logs are structured JSON you can use functions in the Kusto query language to handle the parsing, which is what I will demonstrate below.

An example solution using custom JSON logging

The complete code for this sample is available at https://github.com/mattias-fjellstrom/go-custom-handler-logging. Note that in this sample I have set enableForwardingHttpRequest to true, which means the Functions host forwards the HTTP request to my handler without modifying it.
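For reference, this setting lives in host.json. A minimal configuration could look like the following sketch; the executable name handler is a placeholder for whatever binary you build:

```json
{
  "version": "2.0",
  "customHandler": {
    "description": {
      "defaultExecutablePath": "handler"
    },
    "enableForwardingHttpRequest": true
  }
}
```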

My entrypoint in main.go looks like this:

```go
package main

import (
	"log"
	"net/http"
	"os"

	"github.com/mattias-fjellstrom/go-custom-handler-logging/src/handlers"
)

func main() {
	// The Functions host tells the custom handler which port to listen on.
	port, exists := os.LookupEnv("FUNCTIONS_CUSTOMHANDLER_PORT")
	if !exists {
		port = "8080"
	}

	mux := http.NewServeMux()
	mux.HandleFunc("/api/greeting", handlers.GreetingHandler)

	log.Fatal(http.ListenAndServe(":"+port, mux))
}
```

And here is my handler function, which initializes my custom logger:

```go
package handlers

import (
	"encoding/json"
	"net/http"
	"time"

	"github.com/mattias-fjellstrom/go-custom-handler-logging/src/logging"
)

// GreetingHandler handles GET requests to /api/greeting
func GreetingHandler(w http.ResponseWriter, r *http.Request) {
	logger := logging.NewLogger()

	logger.Info("Log before a five second sleep")
	time.Sleep(5 * time.Second)
	logger.Info("Log after a five second sleep")
	logger.Warning("This is a warning log")
	logger.Error("This is an error log")

	response := "Hello!"
	w.Header().Set("Content-Type", "application/json")
	w.WriteHeader(http.StatusOK)
	json.NewEncoder(w).Encode(response)
}
```

Finally, my custom logger is:

```go
package logging

import (
	"time"

	"github.com/sirupsen/logrus"
)

// NewLogger returns a logger that writes structured JSON with custom
// field names for the timestamp, message, and level.
func NewLogger() logrus.FieldLogger {
	logger := logrus.New()
	logger.SetLevel(logrus.InfoLevel)
	jsonFormatter := logrus.JSONFormatter{
		TimestampFormat: time.RFC3339Nano,
		FieldMap: logrus.FieldMap{
			logrus.FieldKeyTime:  "timestamp",
			logrus.FieldKeyMsg:   "message",
			logrus.FieldKeyLevel: "level",
		},
	}
	logger.SetFormatter(&jsonFormatter)
	return logger
}
```

I deploy this function using the excellent VS Code extension for Azure Functions. Once it is deployed I invoke it a few times, then head to my Application Insights resource and run the following query:

```kusto
traces
| where message startswith "{"
| extend d=parse_json(message)
| project d.timestamp, d.level, d.message
```

The where message startswith "{" statement is a simple way of including only messages that are logged as JSON. It is not ideal, because the host logs a few JSON messages as well. The extend d=parse_json(message) statement reads the message field as JSON and lets us project (i.e. show only the specified) fields from the JSON log object. A sample response is shown in the image below.

[Image: Azure Application Insights query results]

It is clear from the result that we have a solution to both of the limitations I listed above: we can create whatever fields we want, and the relative time between logs is preserved in our own timestamp field. You can extend the log format with more fields to get the context you want.
