Introduction
It's finally here! Container Runtime for Streamlit in Snowflake (SiS) is now available in Public Preview!
Previously, SiS used warehouses as the runtime environment. While warehouses are optimized for query execution and work great for data-heavy analytical apps, they were often overkill for many applications. Running a simple visualization app shouldn't require spinning up a full warehouse - and I know many of you felt the same way about the cost implications.
Container Runtime changes everything. It allows you to allocate smaller, more appropriate resources and separates app execution from query execution. This is a game-changer for cost optimization.
In this article, I'll walk you through SiS Container Runtime - from its core concepts and differences from Warehouse Runtime, to costs, considerations, package management, and a hands-on tutorial building a simple data visualization app.
Note (2025/12/23): SiS Container Runtime is currently in Public Preview. Features may be significantly updated in the future.
Note: This article represents my personal views and not those of Snowflake.
What is SiS Container Runtime?
SiS Container Runtime is a new runtime environment for running Streamlit apps on Snowflake. Unlike the traditional Warehouse Runtime, it takes a container-based approach, executing app code on a compute pool.
Key Features
| Feature | Description |
|---|---|
| Long-running Service | Apps run as persistent services; all users accessing the same app share a single instance |
| Fast Startup | Users connect to an already-running app, resulting in faster access |
| PyPI Package Support | Install packages from PyPI using `pyproject.toml` or `requirements.txt` |
| Latest Streamlit | Use Streamlit 1.49+ (including `streamlit-nightly`) |
| Full Cache Support | Leverage Streamlit's caching features across user sessions |
Runtime environments for Streamlit apps (Snowflake Documentation)
Warehouse Runtime vs Container Runtime
Here's a comprehensive comparison of the two runtime environments:
| Feature | Warehouse Runtime | Container Runtime |
|---|---|---|
| Compute | App code and internal queries run on warehouse | App code runs on compute pool node; internal queries run on warehouse |
| Base Image | Linux in Python stored procedure | Linux in Snowpark container |
| Python Version | 3.9, 3.10, 3.11 | 3.11 |
| Streamlit Version | 1.22+ (select from Snowflake-provided versions) | 1.49+ (any PyPI version, including `streamlit-nightly`) |
| Dependency Management | `environment.yml` from Snowflake Conda channel | `pyproject.toml` or `requirements.txt` from PyPI |
| Version Pinning | `=` operator, `*` wildcard | `==` operator; `<`, `<=`, `>=`, `>` with comma-separated lists |
| Entry Point Location | Root of source directory | Root or subdirectory |
| Streamlit Server | Temporary individual instance per user session | Persistent shared instance for all user sessions |
| Caching | No cross-session caching | Full Streamlit caching support |
| Startup Time | Slower (on-demand app creation per user) | Faster per user (but slower initial deployment) |
The Value of Container Runtime
The true value of Container Runtime lies in resource optimization.
1. Separation of App and Query Execution
With Warehouse Runtime, both app code and queries ran on the same warehouse. This meant warehouse resources were consumed even for simple UI rendering.
Container Runtime separates these concerns: app code runs on a compute pool while queries run on a warehouse. You can now allocate minimal compute pool resources for app execution and appropriately-sized warehouses for query execution.
2. Lightweight Resource Options
Compute pools offer more affordable options than warehouses. For example:
- Gen1 XS Warehouse: 1 credit/hour
- Compute Pool `CPU_X64_XS`: 0.06 credits/hour

That's approximately 1/16th the cost! For lightweight visualization apps, this difference is significant.
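To make that concrete, here's a rough back-of-the-envelope comparison for a month of continuous uptime, using only the credit rates quoted above (actual rates vary by region and edition, so treat the numbers as illustrative):

```python
# Illustrative monthly credit comparison (rates from the list above;
# check Snowflake's pricing table for your region/edition)
HOURS_PER_MONTH = 24 * 30  # 720 hours of continuous uptime

xs_warehouse_credits = 1.00 * HOURS_PER_MONTH   # ~720 credits/month
xs_pool_credits = 0.06 * HOURS_PER_MONTH        # ~43.2 credits/month

print(f"Gen1 XS warehouse: {xs_warehouse_credits:.0f} credits/month")
print(f"CPU_X64_XS pool:   {xs_pool_credits:.1f} credits/month")
```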
3. Shared Instance Efficiency
With Warehouse Runtime, each user got their own app instance. Container Runtime shares a single instance across all users. The second user onward connects instantly to an already-running app, improving perceived performance.
4. Cross-Session Caching
With Warehouse Runtime, Streamlit's @st.cache_data and @st.cache_resource decorators only cached within individual sessions. If User A cached a query result, User B would still execute the query again.
With Container Runtime, caches are shared across users. When User A caches a query result, User B can reuse that cache, reducing query execution and improving both performance and cost efficiency.
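As a minimal sketch of what this enables (the aggregation query and TTL are illustrative, reusing the SALES_DATA table from the tutorial later in this article):

```python
import streamlit as st
from snowflake.snowpark.context import get_active_session

session = get_active_session()

# On Container Runtime the Streamlit server (and therefore this cache) is
# shared: the first user pays for the query; everyone else within the TTL
# gets the cached DataFrame without touching the warehouse.
@st.cache_data(ttl=300)
def load_summary():
    return session.sql(
        "SELECT REGION, SUM(SALES_AMOUNT) AS TOTAL FROM SALES_DATA GROUP BY REGION"
    ).to_pandas()

st.dataframe(load_summary())
```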
Cost Considerations
Container Runtime involves two types of resources:
1. Compute Pool
Compute pools are billed per node. Costs vary by node size (Instance Family), but options cheaper than the smallest warehouse (XS) are available.
Tip: Streamlit runs as a single process, so multiple CPUs don't provide much benefit. Choose the smallest node size that meets your memory requirements to optimize costs.
Recommended compute pool settings:
| Parameter | Recommendation | Reason |
|---|---|---|
| `INSTANCE_FAMILY` | Small sizes like `CPU_X64_XS` | Streamlit is single-process, so start small |
| `MIN_NODES` | Number of different apps to run simultaneously | Each app uses one node |
| `MAX_NODES` | Higher than `MIN_NODES` | Prevents node shortages when adding new apps |
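Putting those recommendations together, a pool for, say, two always-on apps with headroom for a third might look like this (the pool name and node counts are illustrative choices of mine):

```sql
-- Two always-on apps (MIN_NODES = 2), headroom for a third (MAX_NODES = 3)
CREATE COMPUTE POOL IF NOT EXISTS MY_APPS_POOL
  MIN_NODES = 2
  MAX_NODES = 3
  INSTANCE_FAMILY = CPU_X64_XS;
```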
2. Query Warehouse
The warehouse for internal queries uses traditional credit-based billing. Choose an appropriate size based on query complexity and data volume.
Working with compute pools (Snowflake Documentation)
Important Considerations
Here are some things to keep in mind when using Container Runtime.
1. Python Version Compatibility
Container Runtime currently supports Python 3.11 only. Ensure your app and all dependencies are compatible with Python 3.11.
2. _snowflake Module Not Supported
The _snowflake module available in Warehouse Runtime is not available in Container Runtime. Use native Python libraries like Snowflake Python Connector instead.
For example, to get the active session (commonly used in SiS), change your import:
```python
# NG: Not available in Container Runtime
from _snowflake import get_active_session

# OK: Use this instead
from snowflake.snowpark.context import get_active_session
```
3. Shared Resources
All users share disk, compute, and memory resources in Container Runtime. Design your app with efficient resource usage in mind.
4. Deployment Time
Container Runtime requires container startup during initial deployment, which may take longer than Warehouse Runtime.
5. External Access Integration
Installing packages from external package indexes like PyPI requires External Access Integration configuration.
External access integrations (Snowflake Documentation)
6. Sleep Timer and Compute Pool Auto-Stop
Warehouse Runtime includes a sleep timer feature that automatically stops apps after a period of inactivity. Container Runtime does not currently support this feature. Additionally, compute pools don't auto-stop when app sessions end.
To minimize costs, consider setting up a Task to stop compute pools during off-hours:
```sql
-- Example: Stop all services on the compute pool daily at 10 PM
CREATE OR REPLACE TASK STOP_STREAMLIT_POOL_TASK
  WAREHOUSE = SIS_QUERY_WH
  SCHEDULE = 'USING CRON 0 22 * * * America/New_York'
AS
  ALTER COMPUTE POOL STREAMLIT_POOL STOP ALL;
```
That said, compute pools are very affordable, so running them continuously has limited cost impact. Choose the approach that fits your operational style.
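If you go the scheduled route, you might pair the stop task with one that also suspends the pool's nodes once the services are gone. This is a sketch; the task name and the five-minute offset are my own choices:

```sql
-- Sketch: after the services are stopped, suspend the pool's nodes too
CREATE OR REPLACE TASK SUSPEND_STREAMLIT_POOL_TASK
  WAREHOUSE = SIS_QUERY_WH
  SCHEDULE = 'USING CRON 5 22 * * * America/New_York'
AS
  ALTER COMPUTE POOL STREAMLIT_POOL SUSPEND;
```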
Managing Dependencies
Container Runtime uses pyproject.toml or requirements.txt instead of environment.yml.
requirements.txt Example
```text
streamlit==1.49.0
pandas>=2.0.0
plotly>=5.18.0
```
pyproject.toml Example
```toml
[project]
name = "my-streamlit-app"
version = "1.0.0"
dependencies = [
    "streamlit==1.49.0",
    "pandas>=2.0.0",
    "plotly>=5.18.0",
]
```
Note: Warehouse Runtime uses `=` for version pinning, while Container Runtime uses `==`.
| Item | Warehouse Runtime | Container Runtime |
|---|---|---|
| Config File | `environment.yml` | `pyproject.toml` or `requirements.txt` |
| Package Source | Snowflake Conda channel | PyPI or other external indexes |
| Version Pinning | `=` | `==` |
| Range Specification | `*` wildcard | `<`, `<=`, `>=`, `>`, `!=` with comma-separated lists |
Dependency management for Streamlit apps (Snowflake Documentation)
Hands-on: Building a Sales Dashboard
Let's build a simple data visualization app using SiS Container Runtime!
Prerequisites
- Snowflake account (AWS, Azure, or GCP commercial region)
- Role with the following privileges:
| Privilege | Object | Notes |
|---|---|---|
| USAGE | Database | Database where you create the app |
| CREATE STREAMLIT, USAGE | Schema | Schema where you create the app |
| USAGE | Warehouse | Warehouse for query execution |
| USAGE | Compute pool | Compute pool for app execution |
| CREATE COMPUTE POOL | Account | If creating a new compute pool |
| CREATE INTEGRATION | Account | If creating a new External Access Integration |
Privileges required to create and use a Streamlit app (Snowflake Documentation)
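For reference, granting those privileges to an app-developer role might look like the following sketch (the role name is hypothetical; the object names match the tutorial below):

```sql
-- Sketch: grant the privileges from the table above to a developer role
GRANT USAGE ON DATABASE SIS_CONTAINER_DB TO ROLE APP_DEV_ROLE;
GRANT USAGE, CREATE STREAMLIT ON SCHEMA SIS_CONTAINER_DB.APP_SCHEMA TO ROLE APP_DEV_ROLE;
GRANT USAGE ON WAREHOUSE SIS_QUERY_WH TO ROLE APP_DEV_ROLE;
GRANT USAGE ON COMPUTE POOL STREAMLIT_POOL TO ROLE APP_DEV_ROLE;
```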
Step 1: Set Up the Environment
Let's run the following SQL in Snowsight SQL Worksheets to set up our demo environment!
First, create the database, schema, and stage for source code:
```sql
-- Create database and schema
CREATE DATABASE IF NOT EXISTS SIS_CONTAINER_DB;
CREATE SCHEMA IF NOT EXISTS SIS_CONTAINER_DB.APP_SCHEMA;

-- Create stage for source code
CREATE OR REPLACE STAGE SIS_CONTAINER_DB.APP_SCHEMA.APP_STAGE
  ENCRYPTION = (TYPE = 'SNOWFLAKE_SSE')
  DIRECTORY = (ENABLE = TRUE);

-- Create sample data table
CREATE OR REPLACE TABLE SIS_CONTAINER_DB.APP_SCHEMA.SALES_DATA (
    SALE_DATE DATE,
    PRODUCT VARCHAR(100),
    REGION VARCHAR(50),
    SALES_AMOUNT NUMBER(10, 2)
);

-- Insert sample data
INSERT INTO SIS_CONTAINER_DB.APP_SCHEMA.SALES_DATA VALUES
    ('2024-01-01', 'Product A', 'East', 1000.00),
    ('2024-01-01', 'Product B', 'East', 1500.00),
    ('2024-01-01', 'Product A', 'West', 800.00),
    ('2024-01-02', 'Product B', 'West', 1200.00),
    ('2024-01-02', 'Product A', 'East', 1100.00),
    ('2024-01-03', 'Product B', 'East', 1800.00),
    ('2024-01-03', 'Product A', 'West', 950.00),
    ('2024-01-04', 'Product B', 'West', 1400.00),
    ('2024-01-04', 'Product A', 'East', 1050.00),
    ('2024-01-05', 'Product B', 'East', 1600.00);
```
Step 2: Create Compute Pool
Create a compute pool to run app code:
```sql
-- Create compute pool
CREATE COMPUTE POOL IF NOT EXISTS STREAMLIT_POOL
  MIN_NODES = 1
  MAX_NODES = 3
  INSTANCE_FAMILY = CPU_X64_XS;

-- Check compute pool status
SHOW COMPUTE POOLS;
DESCRIBE COMPUTE POOL STREAMLIT_POOL;
```
Note: Compute pool startup may take a few minutes. Wait until `SHOW COMPUTE POOLS` shows `state` as `ACTIVE` or `IDLE`.
Step 3: Create External Access Integration
Create an External Access Integration to allow package installation from PyPI.
Snowflake provides a default network rule `snowflake.external_access.pypi_rule` for PyPI access, so you don't need to create your own:
```sql
-- Create External Access Integration (using Snowflake-managed PyPI network rule)
CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION PYPI_ACCESS_INTEGRATION
  ALLOWED_NETWORK_RULES = (snowflake.external_access.pypi_rule)
  ENABLED = TRUE;
```
External network access examples (Snowflake Documentation)
Step 4: Prepare Application Files
Create these two files and upload them to the stage.
streamlit_app.py
```python
import streamlit as st
import pandas as pd
from snowflake.snowpark.context import get_active_session

# Page configuration
st.set_page_config(layout="wide")

# Get Snowflake session
session = get_active_session()

# Title
st.title("📊 Sales Dashboard")
st.markdown("A simple sales visualization app running on Container Runtime.")

# Load data (cached and shared across user sessions on Container Runtime)
@st.cache_data(ttl=60)
def load_data():
    df = session.sql("SELECT * FROM SALES_DATA").to_pandas()
    return df

df = load_data()

# Sidebar filters
st.sidebar.header("Filter Settings")
selected_products = st.sidebar.multiselect(
    "Select Products",
    options=df['PRODUCT'].unique(),
    default=df['PRODUCT'].unique()
)
selected_regions = st.sidebar.multiselect(
    "Select Regions",
    options=df['REGION'].unique(),
    default=df['REGION'].unique()
)

# Filter data
filtered_df = df[
    (df['PRODUCT'].isin(selected_products)) &
    (df['REGION'].isin(selected_regions))
]

# Display metrics
col1, col2, col3 = st.columns(3)
with col1:
    st.metric("Total Sales", f"${filtered_df['SALES_AMOUNT'].sum():,.2f}")
with col2:
    st.metric("Transaction Count", f"{len(filtered_df):,}")
with col3:
    st.metric("Average Sale", f"${filtered_df['SALES_AMOUNT'].mean():,.2f}")

# Charts
st.subheader("Daily Sales Trend")
daily_sales = filtered_df.groupby('SALE_DATE')['SALES_AMOUNT'].sum().reset_index()
st.line_chart(daily_sales, x='SALE_DATE', y='SALES_AMOUNT')

st.subheader("Sales by Product")
product_sales = filtered_df.groupby('PRODUCT')['SALES_AMOUNT'].sum().reset_index()
st.bar_chart(product_sales, x='PRODUCT', y='SALES_AMOUNT')

# Data table
st.subheader("Detailed Data")
st.dataframe(filtered_df, use_container_width=True)

# Footer
st.markdown("---")
st.markdown("🚀 Powered by Streamlit in Snowflake Container Runtime")
```
requirements.txt
```text
streamlit>=1.49.0
pandas>=2.0.0
```
Step 5: Upload Files to Stage
Upload the files via Snowsight UI or use PUT commands:
```sql
-- Upload via SnowSQL or local environment
PUT file:///path/to/streamlit_app.py @SIS_CONTAINER_DB.APP_SCHEMA.APP_STAGE AUTO_COMPRESS=FALSE OVERWRITE=TRUE;
PUT file:///path/to/requirements.txt @SIS_CONTAINER_DB.APP_SCHEMA.APP_STAGE AUTO_COMPRESS=FALSE OVERWRITE=TRUE;

-- Verify stage contents
LIST @SIS_CONTAINER_DB.APP_SCHEMA.APP_STAGE;
```
Step 6: Create Streamlit App
Create the Streamlit app with Container Runtime:
```sql
-- Create query warehouse (skip if using existing warehouse)
CREATE WAREHOUSE IF NOT EXISTS SIS_QUERY_WH
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;

-- Create Streamlit app
CREATE OR REPLACE STREAMLIT SIS_CONTAINER_DB.APP_SCHEMA.SALES_DASHBOARD_APP
  FROM '@SIS_CONTAINER_DB.APP_SCHEMA.APP_STAGE'
  MAIN_FILE = 'streamlit_app.py'
  QUERY_WAREHOUSE = SIS_QUERY_WH
  RUNTIME_NAME = 'SYSTEM$ST_CONTAINER_RUNTIME_PY3_11'
  COMPUTE_POOL = STREAMLIT_POOL
  EXTERNAL_ACCESS_INTEGRATIONS = (PYPI_ACCESS_INTEGRATION);
```
Key CREATE STREAMLIT Parameters
| Parameter | Description |
|---|---|
| `FROM` | Stage path containing source code |
| `MAIN_FILE` | Entry point Python file |
| `QUERY_WAREHOUSE` | Warehouse for query execution |
| `RUNTIME_NAME` | Specify `SYSTEM$ST_CONTAINER_RUNTIME_PY3_11` |
| `COMPUTE_POOL` | Compute pool for app code execution |
| `EXTERNAL_ACCESS_INTEGRATIONS` | External access integration (for PyPI access) |
Step 7: Run the App
In Snowsight, click 'Streamlit' in the left pane and select your SALES_DASHBOARD_APP.
Your app is running! 🎉 You've successfully deployed a Streamlit app on Container Runtime!
Note: Initial startup may take a few minutes for container initialization and package installation. Subsequent startups will be faster due to caching.
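If you prefer SQL, you can also confirm the app and inspect its runtime settings from a worksheet:

```sql
-- List Streamlit apps in the schema and inspect the new one
SHOW STREAMLITS IN SCHEMA SIS_CONTAINER_DB.APP_SCHEMA;
DESCRIBE STREAMLIT SIS_CONTAINER_DB.APP_SCHEMA.SALES_DASHBOARD_APP;
```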
Migration Checklist
When migrating existing Warehouse Runtime apps to Container Runtime, use this checklist:
| Item | Action |
|---|---|
| Python 3.11 Compatibility | Verify app and all dependencies work with Python 3.11 |
| `_snowflake` Module Usage | Replace with `snowflake.snowpark.context` etc. |
| `environment.yml` Conversion | Convert to `requirements.txt` or `pyproject.toml` |
| Version Specification Syntax | Change `=` to `==`, `*` to range specifiers like `>=`, etc. |
| Compute Pool Creation | Create appropriately-sized compute pool |
| External Access Integration | Create if PyPI access is needed |
Migrate to a container runtime (Snowflake Documentation)
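As a concrete example of the conversion rows above (the package pins are hypothetical):

```yaml
# Before: environment.yml (Warehouse Runtime, Snowflake Conda channel)
dependencies:
  - pandas=2.0.0
  - plotly=5.*
```

```text
# After: requirements.txt (Container Runtime, PyPI)
pandas==2.0.0
plotly>=5,<6
```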
Conclusion
SiS Container Runtime is a long-awaited feature that separates app execution from query execution and enables lightweight resource allocation for apps.
The days of spinning up a warehouse just to run a simple visualization app are over! Container Runtime shines in use cases like:
- Always-on Dashboards: Dashboards with frequent multi-user access
- Lightweight Visualization Apps: Simple apps without heavy analytical processing
- Cache-Leveraging Apps: Apps that cache and reuse query results
While still in Public Preview, I encourage you to try this new feature and make your Snowflake app development more efficient!
I'm excited to explore more ideas using SiS Container Runtime and will share useful apps I create in future blog posts. Stay tuned!
Promotion
Snowflake What's New Updates on X
I share Snowflake What's New updates on X. Follow for the latest insights:
English Version
Snowflake What's New Bot (English Version)
Japanese Version
Snowflake's What's New Bot (Japanese Version)
Change Log
(20251223) Initial post