Hello, everyone!
When developing distributed systems like My Broker B3, you often find yourself "working blindly" if you don't have the right tools to validate what's happening inside your containers.
Today, I want to share two essential tools I'm using to ensure that the market data fetched with Python is correctly reaching the database and the message broker.
1. MongoDB Compass: The Official MongoDB GUI
Even though MongoDB is running in an isolated environment via Docker, MongoDB Compass is the tool I use to "see" the documents saved by the Market Data microservice.
- What it solves: It allows me to validate whether the data mapping in Python worked correctly and whether fields like `created_at` and `price` are in the expected format.
- How to connect to Docker:
  - Connection String: `mongodb://localhost:27017` (ensure that port `27017` matches the one mapped in your `docker-compose.yml`).
- Pro Tip: In Compass, you can visualize the `price_history` collection in either tabular or JSON format, which makes quick inspections much easier.
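The same check Compass gives you visually can also be scripted. Here is a minimal sketch of a validator for the document shape; the `ticker` field and the `market_data` database name are assumptions for illustration (only `price_history`, `created_at`, and `price` come from the project), so adjust them to your actual schema:

```python
from datetime import datetime, timezone

def validate_market_doc(doc: dict) -> list[str]:
    """Return a list of problems found in a price_history document."""
    problems = []
    # price should be numeric, not a string like "37.85"
    if not isinstance(doc.get("price"), (int, float)):
        problems.append("price is not numeric")
    # created_at should be a real datetime, not a formatted string
    if not isinstance(doc.get("created_at"), datetime):
        problems.append("created_at is not a datetime")
    return problems

# A document shaped like the ones the microservice is expected to save
# (field names beyond price/created_at are illustrative assumptions):
doc = {"ticker": "PETR4", "price": 37.85,
       "created_at": datetime.now(timezone.utc)}
print(validate_market_doc(doc))  # → []

# In practice you would pull real documents from the container, e.g.:
# from pymongo import MongoClient
# client = MongoClient("mongodb://localhost:27017")
# for d in client["market_data"]["price_history"].find().limit(5):
#     print(validate_market_doc(d))
```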
📸 Viewing in document mode:
📸 Table view:
2. Offset Explorer (Formerly Kafka Tool)
Kafka can be intimidating to manage via the command line (CLI). Offset Explorer is a desktop client that makes visualizing messages within your topics incredibly easy.
- What it solves: It allows me to monitor, in real time, the messages arriving in the `trading-assets-market-data-v1` topic.
- Configuration for the Docker environment:
  - Cluster Name: MyBrokerKafka (or any name you prefer).
  - Zookeeper/Broker: `localhost:9092`.
- Crucial Tip: To read the message content sent as JSON from Python, go to the tool's settings and change the Content Type (for both Key and Value) from Byte Array to String. This ensures the data is displayed in a human-readable format.
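The Byte Array → String switch exists because Kafka itself only stores raw bytes; readability depends entirely on how you decode them. This sketch shows the same decoding step a consumer would perform (the payload fields are illustrative, not the project's exact schema):

```python
import json

def decode_record(key: bytes, value: bytes) -> tuple[str, dict]:
    """Mimic Offset Explorer's Byte Array -> String setting:
    decode the key as UTF-8 text and parse the value as JSON."""
    return key.decode("utf-8"), json.loads(value.decode("utf-8"))

# Raw bytes, exactly as a Python producer would have serialized them:
raw_key = b"PETR4"
raw_value = b'{"ticker": "PETR4", "price": 37.85}'

key, payload = decode_record(raw_key, raw_value)
print(key, payload["price"])  # PETR4 37.85

# With kafka-python, the same decoding can be wired into the consumer
# (untested sketch, assuming the broker and topic from this post):
# from kafka import KafkaConsumer
# consumer = KafkaConsumer(
#     "trading-assets-market-data-v1",
#     bootstrap_servers="localhost:9092",
#     key_deserializer=lambda k: k.decode("utf-8"),
#     value_deserializer=lambda v: json.loads(v),
# )
```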
📸 Viewing the messages:
💡 Why is this important?
Mastering these support tools significantly speeds up the development cycle. In our case, Offset Explorer was essential to validate one of the project's most important decisions: using the Ticker as the message key in Kafka to ensure per-asset ordering.
🏁 Conclusion
Having full visibility over your data is the first step toward building resilient systems. In the next post of the main series, we will head back to the Java ecosystem to start consuming these events!
What about you? What tools do you use to debug your distributed systems? Let me know in the comments!
📚 About the series
⬅️ Previous Post: Market Data Integrator.
📑 Series Index: Series Guide.