Objectives:
• Launch a Kafka instance and use it to communicate with Pub/Sub
• Configure a Kafka connector to integrate with Pub/Sub
• Set up topics and subscriptions for message communication (sketched in the commands after this list)
• Perform basic testing of both Kafka and Pub/Sub services
• Connect IoT Core to Pub/Sub
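As a rough sketch of the Pub/Sub side of this setup, the topics, subscription, and IoT Core registry can be created with gcloud. All names here (to-kafka, from-kafka, iotlab-registry) are placeholders, not necessarily the ones used in the video:

```bash
# Assumes gcloud is authenticated and a default project is set.

# Topic that feeds messages into Kafka (IoT Core publishes here):
gcloud pubsub topics create to-kafka

# Topic that receives messages coming out of Kafka, plus a
# subscription for verifying delivery:
gcloud pubsub topics create from-kafka
gcloud pubsub subscriptions create from-kafka-sub --topic=from-kafka

# Hypothetical IoT Core registry that publishes device telemetry
# to the to-kafka topic:
gcloud iot registries create iotlab-registry \
    --region=us-central1 \
    --event-notification-config=topic=to-kafka
```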
Architecture:
Introduction:
With the announcement of the Confluent managed Kafka offering on Google Cloud, it has never been easier to use Google Cloud's data tools with Kafka. You can use the Apache Beam KafkaIO connector to read straight into Dataflow, but this may not always be the right solution.
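For reference, that direct route looks roughly like the following sketch, assuming a Python Beam pipeline and a broker at localhost:9092 (broker address and topic name are placeholders):

```python
import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions

# Sketch: read Kafka records straight into a Dataflow pipeline.
# KafkaIO is a cross-language transform, so a Java runtime must be
# available; the runner also needs project/region/temp_location set.
options = PipelineOptions(runner="DataflowRunner")

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromKafka" >> ReadFromKafka(
            consumer_config={"bootstrap.servers": "localhost:9092"},
            topics=["events"],
        )
        # Each element arrives as a (key, value) pair of bytes.
        | "PrintRecords" >> beam.Map(print)
    )
```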
Whether Kafka is provisioned in the cloud or on-premises, you might want to push to a subset of Pub/Sub topics. Why? For the flexibility of having Pub/Sub as your Google Cloud event notifier: you could then not only choreograph Dataflow jobs but also use topics to trigger Cloud Functions.
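For example, here is a minimal sketch of a Pub/Sub-triggered Cloud Function (a 1st-gen Python background function; the function and topic names are placeholders):

```python
import base64

# Sketch of a background Cloud Function triggered by a Pub/Sub topic.
# Deployed with something like:
#   gcloud functions deploy handle_kafka_event \
#       --runtime=python39 --trigger-topic=from-kafka
def handle_kafka_event(event, context):
    # Pub/Sub delivers the payload base64-encoded in event['data'].
    payload = base64.b64decode(event["data"]).decode("utf-8")
    print(f"Received message {context.event_id}: {payload}")
```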
So how do you exchange messages between Kafka and Pub/Sub? This is where the Pub/Sub Kafka Connector comes in handy.
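The connector is a Kafka Connect plugin, so wiring it up is mostly configuration. As a sketch, a sink config that copies a Kafka topic into a Pub/Sub topic might look like this (project, topic, and file names are placeholders; a matching CloudPubSubSourceConnector config handles the other direction):

```properties
# cps-sink-connector.properties
name=cps-sink
connector.class=com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
tasks.max=1
# Kafka topic to read from:
topics=from-kafka
# Pub/Sub destination:
cps.project=my-gcp-project
cps.topic=from-kafka
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
```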
Tip: Here we use a virtual machine running a single Kafka instance. This instance connects to Pub/Sub and exchanges event messages between the two services. In the real world, Kafka would likely run as a cluster.
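On a single-VM setup like this, the connector can run in Kafka Connect standalone mode. A sketch, assuming a stock Kafka install with the connector jar on its plugin.path and the sink config above:

```bash
# Start the single-node stack (ZooKeeper, then the Kafka broker):
bin/zookeeper-server-start.sh config/zookeeper.properties &
bin/kafka-server-start.sh config/server.properties &

# Run the Pub/Sub connector in standalone mode:
bin/connect-standalone.sh config/connect-standalone.properties \
    cps-sink-connector.properties &

# Basic end-to-end test: produce a message on the Kafka side...
echo "hello from kafka" | bin/kafka-console-producer.sh \
    --broker-list localhost:9092 --topic from-kafka

# ...then pull it from the Pub/Sub subscription created earlier:
gcloud pubsub subscriptions pull from-kafka-sub --auto-ack --limit=1
```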
Details:
YouTube link: https://tinyurl.com/y6dd28vr
GitHub: https://github.com/IamVigneshC/GCP-Streaming-IoT-Kafka-to-PubSub