When we talk about data visualization and dashboards, enterprise tools like Tableau or Power BI often dominate the conversation. However, for data scientists and developers, these GUI-based tools can feel restrictive. What if you need complex machine learning integration, custom UI logic, or automated CI/CD deployments?
Enter the holy trinity of Python visualization tools: Streamlit, Dash, and Bokeh.
In this article, we will explore the differences between these powerful frameworks, build a real-world financial dashboard, and completely automate its deployment to the Cloud using Docker and GitHub Actions.
🛠️ The Contenders: Streamlit vs. Dash vs. Bokeh
Before writing code, let's understand which tool fits your use case:
1. Streamlit (The Sprinter) 🏃‍♂️
Streamlit turns data scripts into shareable web apps in minutes. All in pure Python. No frontend experience required.
- Best for: Rapid prototyping, internal tools, and quick data exploration.
- Pros: Incredibly low learning curve, fully reactive script execution.
2. Plotly Dash (The Architect) 🏢
Dash is built on top of Flask, Plotly.js, and React.js. It requires more boilerplate than Streamlit but offers enterprise-grade customization.
- Best for: Production-grade analytics dashboards with complex layouts and interactivity.
- Pros: Highly customizable UI, stateless callback architecture perfect for scaling.
3. Bokeh (The Artist) 🎨
Bokeh targets modern web browsers for presentation, providing elegant, concise construction of versatile graphics.
- Best for: Massive datasets, streaming data, and highly custom interactive plots.
- Pros: Incredible granularity over how visualizations render in the DOM.
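A small Bokeh sketch shows the level of control you get over individual glyphs and browser tools. Again, a hypothetical example (the file name and tool list are arbitrary choices):

```python
# Render an interactive line chart to a standalone HTML file
import numpy as np
from bokeh.plotting import figure, output_file, save

x = np.linspace(0, 10, 200)

# Each tool in the toolbar is opted into explicitly
p = figure(title="Sine wave", x_axis_label="x", y_axis_label="sin(x)",
           tools="pan,wheel_zoom,box_zoom,reset")
p.line(x, np.sin(x), line_width=2)

output_file("sine.html")  # Bokeh emits self-contained HTML + JavaScript
save(p)
```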
🚀 Building a Real-World Example: A Stock Market Dashboard with Streamlit
Because we want to move from zero to deployed as fast as possible, we will use Streamlit to build a dynamic Stock Price Explorer.
The Application Code (app.py)
This simple script downloads historical financial data, visualizes it, and calculates moving averages interactively.
```python
# app.py
import streamlit as st
import pandas as pd
import numpy as np

# Page configuration
st.set_page_config(page_title="Financial Dashboard", page_icon="📈", layout="wide")

st.title("📈 Interactive Financial Explorer")
st.markdown("Built with **Streamlit** to demonstrate rapid dashboard development.")

# Sidebar for user inputs
st.sidebar.header("User Parameters")
ticker = st.sidebar.selectbox("Select Asset", ("AAPL", "GOOGL", "MSFT", "BTC-USD"))
days = st.sidebar.slider("Number of days", 10, 365, 100)
ma_window = st.sidebar.number_input("Moving Average Window", 5, 50, 20)

# Simulate fetching data (replacing heavy API calls for this demo)
@st.cache_data
def get_data(ticker, days):
    dates = pd.date_range(end=pd.Timestamp.today(), periods=days)
    prices = np.random.normal(loc=150, scale=10, size=days)
    df = pd.DataFrame({'Date': dates, 'Price': prices})
    df.set_index('Date', inplace=True)
    return df

data = get_data(ticker, days)
data['Moving Average'] = data['Price'].rolling(window=ma_window).mean()

# Visualization
st.subheader(f"Price History for {ticker}")
st.line_chart(data[['Price', 'Moving Average']])

# Raw data table
with st.expander("Show Raw Data"):
    st.dataframe(data.tail(10))
```
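One detail worth noting: `rolling(window=ma_window).mean()` leaves the first `ma_window - 1` rows as `NaN`, because a full window isn't available yet, and Streamlit's line chart simply skips those points. A quick standalone check (pandas only, no Streamlit needed):

```python
import pandas as pd

prices = pd.Series([10.0, 11.0, 12.0, 13.0, 14.0])
ma = prices.rolling(window=3).mean()

# The first window-1 entries have no complete window, so they are NaN
assert ma.isna().sum() == 2
# The third value is the mean of the first three prices: (10+11+12)/3
assert ma.iloc[2] == 11.0
```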
The Dependencies (requirements.txt)
```text
streamlit==1.31.0
pandas==2.2.0
numpy==1.26.3
```
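With these two files in place, you can run the dashboard locally (standard commands, assuming a Python 3.10+ environment):

```shell
pip install -r requirements.txt
streamlit run app.py   # serves the dashboard at http://localhost:8501
```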
☁️ Deployment & Automation (CI/CD)
To make this a professional-grade project, we won't just run it on localhost. We will containerize it with Docker and automate its deployment using GitHub Actions.
Step 1: Containerizing the App (Dockerfile)
Create a file named Dockerfile in the root directory:
```dockerfile
FROM python:3.10-slim

# curl is needed for the HEALTHCHECK below (not included in the slim image)
RUN apt-get update && apt-get install -y --no-install-recommends curl \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY app.py .

EXPOSE 8501

HEALTHCHECK CMD curl --fail http://localhost:8501/_stcore/health

ENTRYPOINT ["streamlit", "run", "app.py", "--server.port=8501", "--server.address=0.0.0.0"]
```
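You can verify the container locally before wiring up CI (the image name here is an arbitrary local tag):

```shell
docker build -t financial-dashboard .
docker run -p 8501:8501 financial-dashboard
```

The app should then be reachable at http://localhost:8501, and `docker ps` will report the container as healthy once the health check passes.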
Step 2: GitHub Actions Automation (.github/workflows/deploy.yml)
We will set up a pipeline that lints our Python code, builds the Docker image, and pushes it to the GitHub Container Registry (from which any cloud provider like AWS, Render, or DigitalOcean can pull it).
```yaml
name: Deploy Streamlit Dashboard

on:
  push:
    branches: [ "main" ]

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read

    steps:
      - name: Checkout Code
        uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'

      - name: Install dependencies and Lint
        run: |
          pip install flake8
          flake8 app.py --count --select=E9,F63,F7,F82 --show-source --statistics

      - name: Log in to GitHub Container Registry
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Build and push Docker image
        uses: docker/build-push-action@v4
        with:
          context: .
          push: true
          tags: ghcr.io/${{ github.repository_owner }}/financial-dashboard:latest
```
Every time you push to the main branch, GitHub Actions will automatically lint your code, build the Docker image, and publish it to the GitHub Container Registry, ready for your cloud provider to pull and run!
🎯 Conclusion
While Tableau and Power BI are fantastic, code-based visualization tools like Streamlit, Dash, and Bokeh unlock enormous potential for data scientists. By leveraging Python, we can integrate machine learning models, build custom UIs, and seamlessly deploy our applications into modern DevOps pipelines.