Google just open-sourced a time-series AI that runs locally. And it's incredible.
I just ran it on 365 days of Ethereum prices. Here's what happened:
→ Mean forecast: $2,154
→ Bear case (10%): $1,803
→ Bull case (90%): $2,671
But the real story? The tech behind it.
TimesFM by Google Research
This is a 200M parameter foundation model for time-series forecasting. Not a niche crypto model—trained on massive, diverse time-series data across domains.
Why I'm Genuinely Impressed
✅ Zero-shot forecasting
No training required. Just load the model, feed your data, get predictions. It's the GPT moment for time-series.
✅ Runs fully local
Download once (~800MB), run forever. Your financial data never leaves your machine. No API keys. No rate limits. No monthly fees.
✅ Ridiculously fast
30-day forecast on 365 data points? 30 seconds on my Mac Studio.
✅ Quantile predictions out of the box
Not just point estimates—you get the full uncertainty distribution (10th, 50th, 90th percentiles). This is critical for financial applications.
✅ Dead simple setup
```shell
pip install git+https://github.com/google-research/timesfm.git
```
Then a short Python script. That's it.
Quick Example: Ethereum Price Forecasting
Here's how I tested it:
```python
import pandas as pd
import timesfm

# Load the model (the ~800MB checkpoint downloads once)
tfm = timesfm.TimesFm(
    context_len=512,       # how many past points the model sees
    horizon_len=30,        # 30-day forecast
    input_patch_len=32,
    output_patch_len=128,
    num_layers=20,
    model_dims=1280,
    backend="cpu",
)
tfm.load_from_checkpoint(repo_id="google/timesfm-1.0-200m")

# Load your data (365 days of Ethereum closing prices)
df = pd.read_csv("ethereum_prices.csv")
prices = df["close"].values

# Run prediction; freq=0 marks high-frequency (daily or finer) data
point_forecast, quantile_forecast = tfm.forecast(
    inputs=[prices],
    freq=[0],
)

# point_forecast has shape (num_series, horizon); quantile_forecast has
# shape (num_series, horizon, 10): the mean plus quantiles 0.1 ... 0.9
mean_forecast = point_forecast[0]
quantile_10 = quantile_forecast[0][:, 1]   # 10th percentile
quantile_90 = quantile_forecast[0][:, 9]   # 90th percentile

print(f"Mean forecast: ${mean_forecast[-1]:,.2f}")
print(f"Bear case (10%): ${quantile_10[-1]:,.2f}")
print(f"Bull case (90%): ${quantile_90[-1]:,.2f}")
```
Output:

```
Mean forecast: $2,154.23
Bear case (10%): $1,803.45
Bull case (90%): $2,671.89
```
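The raw forecast arrays are easier to work with as a date-indexed DataFrame. Here's a minimal sketch of that step, using synthetic stand-ins for the model output (the real `mean_forecast` and quantile arrays would drop in directly; the start date is hypothetical):

```python
import numpy as np
import pandas as pd

# Stand-ins for the model output: 30-day mean and quantile paths.
# In practice these come straight from tfm.forecast(...).
rng = np.random.default_rng(0)
mean_forecast = 2000 + np.cumsum(rng.normal(0, 20, 30))
quantile_10 = mean_forecast - 200
quantile_90 = mean_forecast + 250

# Index the horizon by calendar date, starting the day after
# the last observed price (hypothetical cutoff here).
last_obs = pd.Timestamp("2024-12-31")
dates = pd.date_range(last_obs + pd.Timedelta(days=1), periods=30, freq="D")

forecast_df = pd.DataFrame(
    {"mean": mean_forecast, "p10": quantile_10, "p90": quantile_90},
    index=dates,
)
print(forecast_df.tail(3))
```

With a date index, plotting the band or joining against realized prices later becomes a one-liner.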
Why This Matters for Privacy-First AI
The model isn't crypto-specific (an important caveat), but that's also the point: foundation models generalize. The same model that forecasts retail demand can forecast crypto prices. Same code. Same local inference.
We're watching the LLM pattern repeat for time-series:
- Massive pretraining on diverse data
- Open weights released
- Local inference possible
- Zero-shot capabilities
This is the future of privacy-first AI for finance.
No more sending sensitive financial data to cloud APIs. No more rate limits. No more monthly subscription fees for basic forecasting.
Just download the model and run it locally. Your data never leaves your machine.
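That generalization shows up directly in the API: `forecast` takes a list of 1-D arrays plus per-series frequency codes, so unrelated series can share one zero-shot batch. A sketch of preparing such a batch with synthetic data (the freq convention of 0 = high, 1 = medium, 2 = low frequency follows the TimesFM README; the actual forecast call is commented out because it needs the loaded model):

```python
import numpy as np

# Three unrelated series in one zero-shot batch: daily ETH closes,
# weekly retail demand, monthly energy usage (all synthetic stand-ins).
rng = np.random.default_rng(1)
eth_daily = 2000 + np.cumsum(rng.normal(0, 20, 365))
retail_weekly = (
    500
    + 50 * np.sin(np.arange(104) * 2 * np.pi / 52)  # yearly seasonality
    + rng.normal(0, 10, 104)
)
energy_monthly = 1200 + rng.normal(0, 30, 48)

inputs = [eth_daily, retail_weekly, energy_monthly]
# Per-series frequency codes: 0 = high (daily or finer),
# 1 = medium (weekly/monthly), 2 = low (quarterly and coarser).
freq = [0, 1, 1]

# With a loaded model, the whole batch is forecast in a single call:
# point_forecast, quantile_forecast = tfm.forecast(inputs=inputs, freq=freq)
for series, f in zip(inputs, freq):
    print(f"{len(series)} points, freq code {f}")
```

No per-domain retraining, no per-series configuration: the batch is just arrays and frequency hints.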
Installation & Setup
Prerequisites
- Python 3.8+
- ~2GB RAM minimum
- ~1GB disk space (for model weights)
Quick Start
```shell
# Install TimesFM
pip install git+https://github.com/google-research/timesfm.git

# Pre-trained weights download automatically on first load, or pre-fetch manually:
python -c "import timesfm; timesfm.TimesFm(context_len=512, horizon_len=30, backend='cpu').load_from_checkpoint(repo_id='google/timesfm-1.0-200m')"
```
Hardware Requirements
- CPU: Works on any modern processor
- GPU: Optional (speeds up inference 2-3x)
- RAM: 2GB+ recommended
- Disk: ~1GB for model weights
Use Cases
- Financial forecasting
  - Stock prices
  - Cryptocurrency trends
  - Sales projections
- Demand forecasting
  - Retail inventory
  - Resource planning
  - Supply chain optimization
- IoT & sensor data
  - Predictive maintenance
  - Energy consumption
  - Network traffic
- Scientific research
  - Climate modeling
  - Population dynamics
  - Experimental data analysis
Caveats & Limitations
Not financial advice
The model is a general-purpose forecaster, not a crypto trading bot. Use it for research, not investment decisions.

Training data matters
Foundation models generalize, but domain-specific models still have their place.

Quantile predictions ≠ guarantees
The 90th percentile isn't a ceiling; it's a statistical estimate based on historical patterns.

Local inference = your responsibility
You own the model, the data, and the results. No cloud provider to blame.
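One way to take that responsibility seriously is a coverage backtest: hold out a recent window, forecast it, and count how often reality lands inside the 10–90 band (roughly 80% of the time if the quantiles are calibrated). A minimal sketch on synthetic prices, with a naive random-walk band standing in for the model's quantile output:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic daily price history; last 30 days held out for backtesting.
prices = 2000 + np.cumsum(rng.normal(0, 25, 365))
train, actual = prices[:-30], prices[-30:]

# Naive stand-in for a model's quantile band: a random-walk forecast
# anchored at the last training price, widening with sqrt(horizon).
sigma = np.std(np.diff(train))
h = np.arange(1, 31)
p10 = train[-1] - 1.2816 * sigma * np.sqrt(h)  # 10th pct, normal approx
p90 = train[-1] + 1.2816 * sigma * np.sqrt(h)  # 90th pct, normal approx

# Fraction of held-out days that fall inside the band.
coverage = ((actual >= p10) & (actual <= p90)).mean()
print(f"empirical 10-90 coverage: {coverage:.0%}")
```

Swap the naive band for TimesFM's quantile output and you have a quick sanity check that the uncertainty estimates aren't wildly over- or under-confident on your own data.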
The Bigger Picture
This is part of a larger trend: privacy-first AI that runs locally.
We're seeing:
- LLMs that run on laptops (Llama, Mistral, Phi)
- Image generators that run offline (Stable Diffusion)
- Time-series models that need no cloud (TimesFM)
The pattern is clear:
- Train massive models on diverse data
- Release open weights
- Enable local inference
- Let developers build without data leakage
This is the future of AI: powerful, private, and accessible.
What would you forecast with local time-series AI? Drop a comment below! 👇