Alex Spinov

TimescaleDB Has a Free API: Time-Series Superpowers on Top of PostgreSQL

TimescaleDB is a PostgreSQL extension that makes time-series data fast and easy. It's not a separate database: it IS PostgreSQL, with automatic partitioning, compression, and continuous aggregates built in. Timescale Cloud's free tier includes 25 GB of compressed storage.

Why TimescaleDB?

  • It's PostgreSQL — all your Postgres tools, ORMs, and drivers work
  • Automatic partitioning — hypertables chunk data by time automatically
  • 10-40x compression — native columnar compression
  • Continuous aggregates — real-time materialized views that auto-refresh
  • Free cloud tier — 25 GB compressed (effectively 250-1000 GB raw)

Quick Start

# Docker (self-hosted)
docker run -d --name timescaledb \
  -p 5432:5432 \
  -e POSTGRES_PASSWORD=password \
  timescale/timescaledb:latest-pg16

# Connect with psql
psql -h localhost -U postgres

Create a Hypertable

-- Enable extension
CREATE EXTENSION IF NOT EXISTS timescaledb;

-- Create regular table
CREATE TABLE sensor_data (
    time        TIMESTAMPTZ NOT NULL,
    sensor_id   INTEGER,
    temperature DOUBLE PRECISION,
    humidity    DOUBLE PRECISION
);

-- Convert to hypertable (magic happens here)
SELECT create_hypertable('sensor_data', by_range('time'));

-- Insert data normally
INSERT INTO sensor_data VALUES
    (now(), 1, 22.5, 65.0),
    (now() - interval '1 hour', 1, 21.8, 63.2),
    (now() - interval '2 hours', 2, 23.1, 68.5);
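
To build intuition for what `create_hypertable` does, here's a rough client-side sketch of the idea in Python: rows are routed into chunks by flooring their timestamp to a fixed interval. The 7-day interval matches TimescaleDB's default `chunk_time_interval`; the origin and the exact boundary logic here are illustrative assumptions, since real chunk boundaries are managed internally by the extension.

```python
from collections import defaultdict
from datetime import datetime, timedelta

CHUNK_INTERVAL = timedelta(days=7)   # TimescaleDB's default chunk_time_interval
ORIGIN = datetime(2000, 1, 1)        # illustrative origin, not Timescale's internal one

def chunk_start(ts: datetime) -> datetime:
    """Map a timestamp to the start of its time chunk, as a hypertable would."""
    return ORIGIN + ((ts - ORIGIN) // CHUNK_INTERVAL) * CHUNK_INTERVAL

# Rows landing in different weeks go into different chunks
chunks = defaultdict(list)
for ts in (datetime(2024, 1, 1), datetime(2024, 1, 5), datetime(2024, 1, 10)):
    chunks[chunk_start(ts)].append(ts)
```

Because queries with a `WHERE time > ...` predicate only touch the chunks that overlap the range, old chunks can be skipped, compressed, or dropped wholesale.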

Time-Series Queries

-- Average temperature per hour (last 24h)
SELECT time_bucket('1 hour', time) AS hour,
       sensor_id,
       AVG(temperature) as avg_temp,
       MAX(temperature) as max_temp
FROM sensor_data
WHERE time > now() - interval '24 hours'
GROUP BY hour, sensor_id
ORDER BY hour DESC;

-- Last known value per sensor
SELECT DISTINCT ON (sensor_id)
       sensor_id, time, temperature
FROM sensor_data
ORDER BY sensor_id, time DESC;
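
If `time_bucket` is new to you, it simply floors each timestamp to the start of its bucket so `GROUP BY` can aggregate over fixed windows. A client-side approximation in Python (the Monday origin mirrors how Timescale aligns week buckets; month buckets and custom origins behave differently in the real function):

```python
from datetime import datetime, timedelta

def time_bucket(width: timedelta, ts: datetime) -> datetime:
    """Client-side approximation of SQL time_bucket(): floor ts to its bucket start."""
    origin = datetime(2000, 1, 3)  # a Monday, so week-sized buckets align to weeks
    return origin + ((ts - origin) // width) * width

time_bucket(timedelta(hours=1), datetime(2024, 5, 1, 14, 37, 12))
# → datetime(2024, 5, 1, 14, 0)
```

Any interval works the same way: a 15-minute width maps `14:37` to `14:30`, and so on.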

Continuous Aggregates (Auto-Refreshing Views)

CREATE MATERIALIZED VIEW hourly_temps
WITH (timescaledb.continuous) AS
SELECT time_bucket('1 hour', time) AS hour,
       sensor_id,
       AVG(temperature) as avg_temp,
       COUNT(*) as readings
FROM sensor_data
GROUP BY hour, sensor_id;

-- Auto-refresh policy
SELECT add_continuous_aggregate_policy('hourly_temps',
    start_offset => INTERVAL '3 hours',
    end_offset => INTERVAL '1 hour',
    schedule_interval => INTERVAL '1 hour'
);

-- Query the aggregate (instant, pre-computed)
SELECT * FROM hourly_temps ORDER BY hour DESC LIMIT 10;
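
The reason continuous aggregates are "instant" is that they maintain partial aggregation state incrementally instead of rescanning the raw table on every query. A toy Python model of that idea (class and method names are my own, not a Timescale API):

```python
from collections import defaultdict
from datetime import datetime

class HourlyAvg:
    """Toy continuous aggregate: keeps running (sum, count) per (hour, sensor),
    updated per row instead of rescanning the table."""
    def __init__(self):
        self._state = defaultdict(lambda: [0.0, 0])

    def ingest(self, ts: datetime, sensor_id: int, temp: float) -> None:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        bucket = self._state[(hour, sensor_id)]
        bucket[0] += temp
        bucket[1] += 1

    def avg_temp(self, hour: datetime, sensor_id: int) -> float:
        total, count = self._state[(hour, sensor_id)]
        return total / count

agg = HourlyAvg()
agg.ingest(datetime(2024, 1, 1, 9, 5), 1, 22.0)
agg.ingest(datetime(2024, 1, 1, 9, 45), 1, 24.0)
agg.avg_temp(datetime(2024, 1, 1, 9, 0), 1)  # → 23.0
```

The real implementation also handles late-arriving data within the refresh window, which is what the `start_offset`/`end_offset` policy above controls.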

Compression (10-40x)

-- Enable compression
ALTER TABLE sensor_data SET (
    timescaledb.compress,
    timescaledb.compress_segmentby = 'sensor_id',
    timescaledb.compress_orderby = 'time DESC'
);

-- Auto-compress data older than 7 days
SELECT add_compression_policy('sensor_data', INTERVAL '7 days');

-- Check compression stats
SELECT * FROM hypertable_compression_stats('sensor_data');
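
As a back-of-envelope check on what the compression ratio means for the free tier, here's the arithmetic behind the "effectively 250-1000 GB raw" claim (actual ratios depend heavily on your data; `segmentby`/`orderby` choices matter a lot):

```python
def effective_raw_gb(compressed_gb: float, ratio: float) -> float:
    """Raw data volume that fits in a given compressed footprint at a given ratio."""
    return compressed_gb * ratio

# The free tier's 25 GB compressed at the claimed 10-40x range:
effective_raw_gb(25, 10)  # → 250
effective_raw_gb(25, 40)  # → 1000
```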

Python Example

import psycopg2
from datetime import datetime, timedelta

conn = psycopg2.connect('postgresql://postgres:password@localhost:5432/postgres')
cur = conn.cursor()

# Insert IoT data
cur.execute("""
    INSERT INTO sensor_data (time, sensor_id, temperature, humidity)
    VALUES (%s, %s, %s, %s)
""", (datetime.now(), 1, 22.5, 65.0))

# Query hourly averages
cur.execute("""
    SELECT time_bucket('1 hour', time) AS hour, AVG(temperature)
    FROM sensor_data
    WHERE time > now() - interval '24 hours'
    GROUP BY hour ORDER BY hour DESC
""")

for row in cur.fetchall():
    print(f"{row[0]}: {row[1]:.1f}°C")

conn.commit()
conn.close()
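
For real ingest volumes, row-at-a-time `INSERT`s become the bottleneck; batching with psycopg2's `execute_values` is the usual fix. A sketch, assuming the `sensor_data` table from earlier (the `sensor_batch` helper is mine, not part of any library):

```python
from datetime import datetime, timedelta

def sensor_batch(n: int, start: datetime, sensor_id: int = 1) -> list:
    """Build n (time, sensor_id, temperature, humidity) rows, one per minute."""
    return [
        (start + timedelta(minutes=i), sensor_id, 20.0 + 0.1 * i, 60.0)
        for i in range(n)
    ]

# With a live connection (cur from the example above):
# from psycopg2.extras import execute_values
# execute_values(cur,
#     "INSERT INTO sensor_data (time, sensor_id, temperature, humidity) VALUES %s",
#     sensor_batch(1000, datetime(2024, 1, 1)))
```

`execute_values` folds the whole batch into a single multi-row `INSERT`, which is typically an order of magnitude faster than looping `cur.execute`.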

Use Cases

| Use Case | Why TimescaleDB |
| --- | --- |
| IoT sensor data | Hypertables + compression = massive scale |
| Application metrics | Continuous aggregates for dashboards |
| Financial tick data | Microsecond precision + fast range queries |
| Log analytics | Compression turns TBs into GBs |

Working with time-series or IoT data? I build custom data collection pipelines and scrapers. Explore my tools on Apify or email spinov001@gmail.com for custom solutions.
