This is a submission for the Agentic Postgres Challenge with Tiger Data
## What I Built
I built Agentic Bitcoin24, a Bitcoin price tracker that never goes down, even when its primary data source fails. Behind it sits a growing database that gains value over time.
Live Application: Agentic Bitcoin24
### Zero-Downtime Resilience
When the CoinGecko API fails (rate limits, outages, network issues), the site automatically falls back to Tiger Data's TimescaleDB cache. Users never see an error (they don't even know the switch happened).
Key Benefits:
- Zero Downtime - Site stays live during external API outages
- 0.31% API Usage - Only 31 calls per month vs 10,000 limit
- Instant Response - Tiger Data cache = no external API latency
- Transparent Fallback - Users are unaware of the data source switch
- 10-Year Sustainability - Will run for the next decade on free tier
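
Here is a minimal sketch of what such a fallback read path could look like, assuming a Next.js-style route handler, a `pg` pool pointed at the Tiger Data instance, and a hypothetical `btc_prices` hypertable; the names are illustrative, not the project's actual code:

```typescript
// Hypothetical read path: try CoinGecko first, fall back to the TimescaleDB cache
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.TIGER_DATABASE_URL });

const COINGECKO_URL =
  "https://api.coingecko.com/api/v3/coins/bitcoin/market_chart?vs_currency=usd&days=1";

export async function GET() {
  try {
    // Primary source: live CoinGecko data (can fail on rate limits, outages, timeouts)
    const res = await fetch(COINGECKO_URL, { signal: AbortSignal.timeout(5000) });
    if (!res.ok) throw new Error(`CoinGecko responded with ${res.status}`);
    const { prices } = await res.json(); // [[timestampMs, priceUsd], ...]
    return Response.json({ source: "coingecko", prices });
  } catch {
    // Fallback: serve the last 24 hours from the Tiger Data cache, no error shown to users
    const { rows } = await pool.query(
      `SELECT time, price_usd
         FROM btc_prices
        WHERE time > NOW() - INTERVAL '24 hours'
        ORDER BY time ASC`
    );
    return Response.json({ source: "tiger-cache", prices: rows });
  }
}
```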
## How I Used Agentic Postgres
Behind the scenes, three autonomous agents manage the entire database lifecycle - no manual SQL required.
### The Agent Collaboration Model
| Agent | Responsibility | Actions |
|---|---|---|
| 1. Design Agent | Source-agnostic database design and ingestion | • Reads the external API response and automatically designs a matching SQL schema. • Creates general-purpose tables (e.g., standard SQL columns or JSONB) based on user input. |
| 2. Optimize Agent | Transforms and tunes the existing database | • Analyzes the Design Agent's generic schema for time-series patterns. • Enables TimescaleDB compression and implements automated compression policies. • Safety protocol: applies changes such as indexing or compression policies only after visual confirmation and user approval. |
| 3. Monitoring Agent | Gathers database metrics | • Real-time API health checks. • Performance monitoring and visualization. |
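
To make the Design Agent's role concrete, here is a hypothetical sketch of schema inference from one sample API record: scalar fields map to typed columns and anything nested falls back to JSONB. The function and table names are illustrative assumptions, not the project's actual code.

```typescript
// Hypothetical sketch: derive a CREATE TABLE statement from one sample API record
function inferColumnType(value: unknown): string {
  if (typeof value === "number") return "DOUBLE PRECISION";
  if (typeof value === "boolean") return "BOOLEAN";
  if (typeof value === "string" && !Number.isNaN(Date.parse(value))) return "TIMESTAMPTZ";
  if (typeof value === "string") return "TEXT";
  return "JSONB"; // nested objects and arrays fall back to JSONB
}

function designSchema(tableName: string, sample: Record<string, unknown>): string {
  const columns = Object.entries(sample)
    .map(([key, value]) => `  ${key} ${inferColumnType(value)}`)
    .join(",\n");
  return `CREATE TABLE IF NOT EXISTS ${tableName} (\n${columns}\n);`;
}

// Example: one hourly Bitcoin price point
console.log(designSchema("btc_prices", { time: "2024-05-01T00:00:00Z", price_usd: 62345.1 }));
```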
The agents autonomously:
- Monitor API health in real-time
- Switch tabs (SQL Editor → Charts → API Monitor)
- Execute optimizations (indexing, compression)
- Visualize results (Chart.js dashboards)
- Provide safety guidance before applying changes
### The Workflow
Daily Ingestion (Vercel Cron)
1. Fetch 24 hours of Bitcoin price data (1 API call)
2. Design Agent creates/updates schema automatically
3. Optimize Agent analyzes and tunes performance
4. TimescaleDB compression stores historical record
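
A hedged sketch of what the daily cron step could look like, assuming a Vercel cron job that calls a route handler, CoinGecko's public `market_chart` endpoint, and the same hypothetical `btc_prices` table with a unique constraint on `time`:

```typescript
// Hypothetical daily ingestion route triggered by Vercel Cron (1 CoinGecko call/day)
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.TIGER_DATABASE_URL });

export async function GET() {
  // One call covers the last 24 hours of [timestampMs, priceUsd] points
  const res = await fetch(
    "https://api.coingecko.com/api/v3/coins/bitcoin/market_chart?vs_currency=usd&days=1"
  );
  const { prices } = (await res.json()) as { prices: [number, number][] };

  // Insert each point, skipping duplicates (assumes a unique constraint on time)
  for (const [ts, price] of prices) {
    await pool.query(
      `INSERT INTO btc_prices (time, price_usd)
       VALUES (to_timestamp($1::bigint / 1000.0), $2)
       ON CONFLICT (time) DO NOTHING`,
      [ts, price]
    );
  }
  return Response.json({ ingested: prices.length });
}
```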
Real-Time Monitoring
```
CoinGecko API Health Check (every 30s)
          ↓
ONLINE  → Fetch fresh data
OFFLINE → Automatic fallback to Tiger Data cache
          ↓
Zero downtime for users
```
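
On the monitoring side, a 30-second poll can drive the ONLINE/OFFLINE decision. A minimal sketch, assuming CoinGecko's public `/ping` endpoint and an illustrative status callback:

```typescript
// Hypothetical 30-second health check driving the ONLINE/OFFLINE switch
type Source = "coingecko" | "tiger-cache";

async function checkCoinGeckoHealth(): Promise<boolean> {
  try {
    const res = await fetch("https://api.coingecko.com/api/v3/ping", {
      signal: AbortSignal.timeout(3000),
    });
    return res.ok;
  } catch {
    return false; // timeout, network error, rate limit, etc.
  }
}

export function startHealthLoop(onStatus: (source: Source) => void) {
  const tick = async () => {
    // The UI keeps rendering either way; only the data source label changes
    onStatus((await checkCoinGeckoHealth()) ? "coingecko" : "tiger-cache");
  };
  tick();
  return setInterval(tick, 30_000); // every 30 seconds
}
```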
## How I Used Tiger Data + Claude
I used Tiger CLI (MCP) + Claude Code to build the entire system without writing manual SQL:
- Tiger CLI helped the agents learn TimescaleDB-specific operations (`convert_to_hypertable`, `add_compression`)
- Claude Code refined the `create_zero_copy_fork` logic and the intelligent fallback strategies
- The agents operate in a chat interface where I can say, "Create a database for Bitcoin prices," and watch them work
### Constraint-Aware Optimization
The Optimize Agent maximizes TimescaleDB's compression capabilities through deep reasoning about storage efficiency:
- Automatically enables compression with proper time-column ordering
- Implements compression policies (auto-compress data older than 30 days)
- Projects long-term capacity and recommends optimizations
When resource constraints prevent certain operations, the agent adapts by requesting user validation, so every storage optimization is reviewed before execution, as sketched below.
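
The TimescaleDB side of this is standard compression DDL. Here is a sketch of what the Optimize Agent might issue once the user approves, with table and column names assumed for illustration:

```typescript
// Hypothetical: what the Optimize Agent could run once the user approves the change
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.TIGER_DATABASE_URL });

export async function applyCompression(approved: boolean) {
  if (!approved) return; // safety protocol: no storage changes without explicit approval

  // Enable native compression, ordered by the time column for better ratios
  await pool.query(`
    ALTER TABLE btc_prices SET (
      timescaledb.compress,
      timescaledb.compress_orderby = 'time DESC'
    )
  `);

  // Auto-compress chunks once their data is older than 30 days
  await pool.query(
    `SELECT add_compression_policy('btc_prices', INTERVAL '30 days', if_not_exists => true)`
  );
}
```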
## The 10-Year Sustainability Model
The Math:
- Free tier: 10,000 API calls/month
- My usage: 31 calls/month (0.31%)
- Sustainability: 322 months = 26+ years
Why 10+ Years:
With TimescaleDB compression enabled on the time-series data:
- Daily Bitcoin prices (24 hourly data points) = ~2KB per day
- Compressed storage: ~730KB per year
- 750MB ÷ 730KB/year ≈ 1,027 years of compressed data
But realistically, accounting for:
- Schema overhead
- Indexes and metadata
- Query logs
- Potential data expansion
Conservative estimate: 10+ years of continuous operation without hitting storage limits.
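
The same arithmetic as a quick back-of-the-envelope check:

```typescript
// Back-of-the-envelope check of the numbers above
const apiLimit = 10_000;                           // free-tier calls per month
const callsPerMonth = 31;                          // one ingestion call per day
console.log((callsPerMonth / apiLimit) * 100);     // 0.31 (% of quota used)
console.log(Math.floor(apiLimit / callsPerMonth)); // 322 months of API headroom

const kbPerYear = 2 * 365;                         // ~2 KB/day of compressed prices
const freeTierKb = 750 * 1000;                     // 750 MB of storage
console.log(Math.floor(freeTierKb / kbPerYear));   // ≈ 1,027 years at this rate
```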
## Overall Experience
Most apps fail gracefully; this one doesn't fail at all.
We solved the data volatility problem by providing clean, 24-hour historical Bitcoin data: instead of collecting data around the clock, the system ingests 24 hourly data points once every 24 hours.
The system is safe to run indefinitely and will store relevant data for 10+ years while costing nothing to maintain.
I basically hired agents who work for free and never sleep!

