The previous blog demonstrated an example of how to use Azure Event Hubs integration with Dapr. Azure Event Hubs was used as a "binding" within the Dapr runtime, allowing the application to communicate with Azure Event Hubs without actually knowing about it or being coupled to it directly (via an SDK, library etc.), using a simple model defined by the Dapr runtime.
This blog post will showcase how to stitch together multiple Dapr bindings. Although this applies to any supported binding, the example will combine Azure Event Hubs and Azure Blob Storage, where they will be used as the Input and Output bindings respectively. All this is done in a simplified manner with no direct reference to Event Hubs or Blob Storage in the application code, thanks to the binding-based integration.
We will cover:
- Setup for Dapr and related Azure services
- Running our application and seeing it in action, sending data from Azure Event Hubs to Azure Blob Storage
- A walkthrough of how it works behind the scenes
Azure Event Hubs is a fully managed Platform-as-a-Service (PaaS) for streaming and event ingestion, and Azure Blob Storage is an object storage solution for the cloud, optimized for storing massive amounts of unstructured data.
Hello Dapr!
Dapr stands for Distributed Application Runtime. It is an open source, portable runtime that helps developers build resilient, stateless and stateful microservice applications by codifying the best practices for building such applications into independent components.
If you're new to Dapr, I would recommend starting off with the overview and concepts. Try the getting-started guide and then move on to the samples and how-to guides. As you advance further, you can dig into the Dapr runtime API reference and individual components.
Dapr Bindings
At the time of writing, Dapr is in alpha state and supports the following distributed systems building blocks which you can plug into your applications: Service Invocation, State Management, Pub/Sub Messaging, Resource Bindings, Distributed Tracing and the Actor pattern. Bindings provide a common way to trigger an application with events from external systems, or to invoke an external system with optional data payloads. These "external systems" could be anything: a queue, messaging pipeline, cloud service, filesystem, etc.
Currently supported bindings include Kafka, RabbitMQ, Azure Event Hubs and more.
In a nutshell, Dapr bindings allow you to focus on business logic rather than on integrating with individual services such as databases, pub/sub systems, blob storage etc.
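To make that concrete, here is a minimal, hedged sketch of how an application could invoke an output binding through the Dapr sidecar's HTTP API. It assumes the sidecar's default HTTP port (3500) and a binding component named storage - an illustration of the model, not code from the sample repo:
package main

import (
    "bytes"
    "fmt"
    "net/http"
)

func main() {
    // the Dapr sidecar exposes configured output bindings at /v1.0/bindings/<component-name>;
    // the app only knows the binding's name, never the underlying service
    payload := []byte(`{"data": {"hello": "world"}}`)
    resp, err := http.Post("http://localhost:3500/v1.0/bindings/storage", "application/json", bytes.NewReader(payload))
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()
    fmt.Println("Dapr responded with", resp.Status)
}
Swapping Blob Storage for a different system would mean editing the component YAML, not this code.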
Let's get started...
Setup: Dapr, Azure Event Hubs and Blob Storage
This section will guide you through the setup process for Dapr, Azure Event Hubs, and Blob Storage.
To begin with, you will need a Microsoft Azure account. If you don't have one already, please go ahead and sign up for a free one!
Setup Dapr
To run Dapr locally, you will need the Dapr CLI (and Docker, which Dapr uses for its supporting containers in standalone mode).
To keep things simple, we'll run Dapr locally as a standalone component.
If you're itching to run Dapr on Kubernetes, check out this getting started guide!
Start by installing the Dapr CLI, which allows you to set up Dapr on your local dev machine or on a Kubernetes cluster, provides debugging support, and launches and manages Dapr instances.
For example, on a Mac you can simply use this to install Dapr to /usr/local/bin:
curl -fsSL https://raw.githubusercontent.com/dapr/cli/master/install/install.sh | /bin/bash
Refer to the documentation for details.
You can use the CLI to install Dapr in standalone mode. All you need is a single command:
dapr init
.. and that's it!
Setup Azure Event Hubs
You can quickly set up Azure Event Hubs using either of the following quickstarts:
- Using Azure portal - here is a step-by-step guide
- Using Azure CLI or Azure Cloud shell (in your browser!) - here is a step-by-step guide
You should now have an Event Hubs namespace with an associated Event Hub (topic). As a final step, you need to get the connection string in order to authenticate to Event Hubs - use this guide to finish this step.
Setup Azure Blob storage
You can set up the required Azure Blob storage resources using either of the following:
- Using the Azure portal - follow this step-by-step guide
- Using the Azure CLI or Azure Cloud shell - use this step-by-step guide
Now that you have everything set up, let's move ahead and try out the application.
Run the app with Dapr
Start by cloning the repo and changing into the correct directory:
git clone https://github.com/abhirockzz/dapr-eventhubs-blobstore
cd dapr-eventhubs-blobstore
Update components/eventhubs_binding.yaml to include the Azure Event Hubs connection string in the spec.metadata.value section.
Please note that you will have to append the name of the Event Hub to the end of the connection string, i.e. ;EntityPath=<EVENT_HUBS_NAME>.
This is what the value for the connectionString attribute should look like:
Endpoint=sb://<EVENT_HUBS_NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<EVENT_HUBS_KEY>;EntityPath=<EVENT_HUBS_NAME>
Update components/blobstorage.yaml to include the Azure Blob storage details - storageAccount, storageAccessKey and container.
Start the Go app
export APP_PORT=8080
dapr run --app-port $APP_PORT go run app.go
You should see the logs:
== DAPR == time="2019-11-14T15:33:14+05:30" level=info msg="STARTING Dapr Runtime -- version edge -- commit ff7815d-dirty"
== DAPR == time="2019-11-14T15:33:14+05:30" level=info msg="log level set to: info"
== DAPR == time="2019-11-14T15:33:14+05:30" level=info msg="standalone mode configured"
== DAPR == time="2019-11-14T15:33:14+05:30" level=info msg="dapr id: Seekergrass-Shaker"
== DAPR == time="2019-11-14T15:33:14+05:30" level=info msg="loaded component messagebus (pubsub.redis)"
== DAPR == time="2019-11-14T15:33:14+05:30" level=info msg="loaded component statestore (state.redis)"
== DAPR == time="2019-11-14T15:33:14+05:30" level=info msg="loaded component eventhubs-input (bindings.azure.eventhubs)"
== DAPR == time="2019-11-14T15:33:14+05:30" level=info msg="loaded component storage (bindings.azure.blobstorage)"
......
Run the Azure Event Hubs producer application
This app uses the Azure Event Hubs native Go client to send messages.
Set the required environment variables:
export EVENT_HUBS_NAMESPACE="<EVENT_HUBS_NAMESPACE>"
export EVENT_HUBS_KEY="<EVENT_HUBS_KEY>"
export EVENT_HUB_NAME="<EVENT_HUB_NAME>"
Please ensure that the name of the Event Hub is the same as what you configured for the connection string in the input binding configuration.
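For reference, here is a hedged sketch of what such a producer could look like with the github.com/Azure/azure-event-hubs-go client - an illustration of the approach, not necessarily the exact code in the repo:
package main

import (
    "context"
    "fmt"
    "os"
    "time"

    eventhub "github.com/Azure/azure-event-hubs-go"
)

func main() {
    // assemble the connection string from the environment variables set above
    connStr := fmt.Sprintf("Endpoint=sb://%s.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=%s;EntityPath=%s",
        os.Getenv("EVENT_HUBS_NAMESPACE"), os.Getenv("EVENT_HUBS_KEY"), os.Getenv("EVENT_HUB_NAME"))

    hub, err := eventhub.NewHubFromConnectionString(connStr)
    if err != nil {
        panic(err)
    }
    defer hub.Close(context.Background())

    // send five JSON messages, mirroring the sample producer's output
    for i := 0; i < 5; i++ {
        msg := fmt.Sprintf(`{"time":"%s"}`, time.Now().Format(time.ANSIC))
        if err := hub.Send(context.Background(), eventhub.NewEventFromString(msg)); err != nil {
            panic(err)
        }
        fmt.Println("Sent message", msg)
    }
}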
Run the producer app
export GO111MODULE=on
go run eventhubs-producer/producer.go
It will send five messages to Event Hubs and exit. You should see logs like:
Sent message {"time":"Thu Nov 14 15:35:13 2019"}
Sent message {"time":"Thu Nov 14 15:35:18 2019"}
Sent message {"time":"Thu Nov 14 15:35:20 2019"}
Sent message {"time":"Thu Nov 14 15:35:23 2019"}
Sent message {"time":"Thu Nov 14 15:35:25 2019"}
Confirm
Check the Dapr application logs; you should see the messages received from Event Hubs.
== APP == time from Event Hubs 'Thu Nov 14 15:35:13 2019'
== APP == time from Event Hubs 'Thu Nov 14 15:35:18 2019'
== APP == time from Event Hubs 'Thu Nov 14 15:35:20 2019'
== APP == time from Event Hubs 'Thu Nov 14 15:35:23 2019'
== APP == time from Event Hubs 'Thu Nov 14 15:35:25 2019'
Check Azure Blob storage. First, the Azure CLI needs your storage account credentials. Get your storage account keys by using the az storage account keys list command:
export AZURE_SUBSCRIPTION_ID=<to be filled>
export AZURE_STORAGE_ACCOUNT=<to be filled>
export AZURE_STORAGE_ACCOUNT_RESOURCE_GROUP=<to be filled>
az storage account keys list --account-name $AZURE_STORAGE_ACCOUNT --resource-group $AZURE_STORAGE_ACCOUNT_RESOURCE_GROUP --subscription $AZURE_SUBSCRIPTION_ID --output table
Use either of the two keys and export it as an environment variable:
export AZURE_STORAGE_KEY=<to be filled>
List the blobs in the container (use the name of the container which you created while setting up Azure Blob Storage in the previous section):
export CONTAINER_NAME=<to be filled>
az storage blob list --container-name $CONTAINER_NAME --subscription $AZURE_SUBSCRIPTION_ID --output table
You should see five blobs in the container. This is because five messages were pushed to Event Hubs and then saved to Azure Blob storage by Dapr. You can confirm their contents as well by downloading a blob:
export BLOB_NAME=<to be filled>
az storage blob download --container-name $CONTAINER_NAME --subscription $AZURE_SUBSCRIPTION_ID --name $BLOB_NAME --file $BLOB_NAME
This will download the contents to a file (with the same name as the blob) in your current directory. To peek inside, simply:
cat $BLOB_NAME
You should see one of the messages sent to Azure Event Hubs:
{"time":"Thu Nov 14 15:35:20 2019"}
Repeat the same with the other blobs.
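If you'd rather verify from Go than the az CLI, here is a hedged sketch that lists the blobs using the github.com/Azure/azure-storage-blob-go SDK. It reuses the environment variables exported above and is not part of the sample repo:
package main

import (
    "context"
    "fmt"
    "net/url"
    "os"

    "github.com/Azure/azure-storage-blob-go/azblob"
)

func main() {
    account := os.Getenv("AZURE_STORAGE_ACCOUNT")
    key := os.Getenv("AZURE_STORAGE_KEY")
    container := os.Getenv("CONTAINER_NAME")

    cred, err := azblob.NewSharedKeyCredential(account, key)
    if err != nil {
        panic(err)
    }
    p := azblob.NewPipeline(cred, azblob.PipelineOptions{})
    u, _ := url.Parse(fmt.Sprintf("https://%s.blob.core.windows.net/%s", account, container))
    containerURL := azblob.NewContainerURL(*u, p)

    // page through the blobs in the container and print their names
    for marker := (azblob.Marker{}); marker.NotDone(); {
        listing, err := containerURL.ListBlobsFlatSegment(context.Background(), marker, azblob.ListBlobsSegmentOptions{})
        if err != nil {
            panic(err)
        }
        marker = listing.NextMarker
        for _, blob := range listing.Segment.BlobItems {
            fmt.Println(blob.Name)
        }
    }
}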
Behind the scenes
Here is a summary of how it works:
Input Binding
The eventhubs_binding.yaml config file captures the connection string for Azure Event Hubs:
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
name: eventhubs-input
spec:
type: bindings.azure.eventhubs
metadata:
- name: connectionString
value: Endpoint=sb://<EVENT_HUBS_NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<KEY>;EntityPath=<EVENT_HUBS_NAME>
The key attributes are:
- metadata.name - the name of the Input Binding component
- spec.metadata - the connectionString entry, which holds the Event Hubs connection string
Notice that the connection string contains the broker URL (<EVENT_HUBS_NAMESPACE>.servicebus.windows.net), the primary key (for authentication) and also the name of the topic or Event Hub to which your app will be bound and receive events from.
Output Binding
The blobstorage.yaml config file captures the Azure Blob storage account details:
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
name: storage
spec:
type: bindings.azure.blobstorage
metadata:
- name: storageAccount
value: [BLOB_STORAGE_ACCOUNT]
- name: storageAccessKey
value: [BLOB_STORAGE_ACCESS_KEY]
- name: container
value: [BLOB_STORAGE_CONTAINER_NAME]
Using the bindings in the app
The Go app exposes a REST endpoint at /eventhubs-input - this is the same as the name of the Input Binding component (not a coincidence!):
package main

import (
    "encoding/json"
    "fmt"
    "net/http"
    "os"
)

// TheTime models the JSON payload sent by the producer, e.g. {"time":"Thu Nov 14 15:35:13 2019"}
type TheTime struct {
    Time string `json:"time"`
}

func main() {
    port := os.Getenv("APP_PORT")
    // the endpoint name matches the name of the Input Binding component
    http.HandleFunc("/eventhubs-input", func(rw http.ResponseWriter, req *http.Request) {
        var _time TheTime
        err := json.NewDecoder(req.Body).Decode(&_time)
        if err != nil {
            fmt.Println("error reading message from event hub binding", err)
            rw.WriteHeader(500)
            return
        }
        fmt.Printf("time from Event Hubs '%s'\n", _time.Time)
        rw.WriteHeader(200)
        // Response (defined below) tells Dapr which output binding(s) to route the data to
        err = json.NewEncoder(rw).Encode(Response{To: []string{"storage"}, Data: _time})
        if err != nil {
            fmt.Printf("unable to respond '%s'\n", err)
        }
    })
    http.ListenAndServe(":"+port, nil)
}
The Dapr runtime does the heavy lifting of consuming from Event Hubs and making sure that it invokes the Go application with a POST request at the /eventhubs-input endpoint, carrying the event payload.
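If you want to exercise this endpoint without the full Event Hubs round trip, you could simulate the runtime's call yourself. A quick hedged sketch (assuming the app is listening on port 8080, as configured earlier):
package main

import (
    "bytes"
    "fmt"
    "net/http"
)

func main() {
    // mimic the POST the Dapr runtime makes when an event arrives from Event Hubs
    payload := []byte(`{"time":"Thu Nov 14 15:35:20 2019"}`)
    resp, err := http.Post("http://localhost:8080/eventhubs-input", "application/json", bytes.NewReader(payload))
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()
    fmt.Println("app responded with", resp.Status)
}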
The application responds by returning a JSON output which contains the payload received from Event Hubs and the name of the output binding (storage in this case). It is represented by the following struct:
// Response instructs the Dapr runtime where to send the data next:
// "to" holds the name(s) of the output binding component(s), and
// "data" carries the payload to pass along
type Response struct {
    To   []string    `json:"to"`
    Data interface{} `json:"data"`
}
And that's where the magic happens! Dapr gathers this response and sends the payload to the output binding, which is Azure Blob Storage in this case.
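For concreteness, here is a tiny self-contained sketch of the exact JSON the handler hands back to the runtime for one event (the timestamp is just the sample value from the logs above):
package main

import (
    "encoding/json"
    "fmt"
)

type Response struct {
    To   []string    `json:"to"`
    Data interface{} `json:"data"`
}

func main() {
    // the "to" field names the output binding component; "data" carries the payload
    r := Response{To: []string{"storage"}, Data: map[string]string{"time": "Thu Nov 14 15:35:20 2019"}}
    b, _ := json.Marshal(r)
    fmt.Println(string(b)) // {"to":["storage"],"data":{"time":"Thu Nov 14 15:35:20 2019"}}
}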
Summary
In this blog post, you saw how to stitch (or compose) together different services by using them as Dapr bindings. All your application did was interact with the Dapr runtime (just a sidecar) using the Dapr HTTP API! It is also possible to do this using gRPC or language-specific SDKs.
At the time of writing, Dapr is in alpha state (v0.1.0) and gladly accepting community contributions 😃 Visit https://github.com/dapr/dapr to dive in!
If you found this article helpful, please like and follow 🙌 Happy to get feedback via Twitter or just drop a comment.