Alex Spinov
CockroachDB Has a Free API — Here's How to Build Globally Distributed Apps

A fintech startup I know ran PostgreSQL on a single server. One day AWS us-east-1 went down. Their app was dead for 6 hours. They lost $40,000 in transactions.

They switched to CockroachDB. It survived the next AWS outage without a single dropped query.

What CockroachDB Offers for Free

CockroachDB Serverless free tier:

  • 10 GiB storage and 50M Request Units/month
  • Replicated three ways across availability zones — data survives zone failures (multi-region clusters require a paid plan)
  • PostgreSQL-compatible — use any Postgres driver
  • Automatic scaling — no capacity planning
  • Built-in CDC (Change Data Capture)

Quick Start

# Sign up at cockroachlabs.cloud
# Create a Serverless cluster (free)
# Download the connection string

export DATABASE_URL='postgresql://user:pass@cluster.cockroachlabs.cloud:26257/defaultdb?sslmode=verify-full'

# Connect with psql
psql $DATABASE_URL

Create Tables (Standard SQL)

CREATE TABLE users (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  email STRING UNIQUE NOT NULL,
  name STRING NOT NULL,
  region STRING DEFAULT 'us-east',
  created_at TIMESTAMPTZ DEFAULT now()
);

CREATE TABLE orders (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  user_id UUID REFERENCES users(id),
  amount DECIMAL(10,2) NOT NULL,
  status STRING DEFAULT 'pending',
  created_at TIMESTAMPTZ DEFAULT now()
);

CREATE INDEX idx_orders_user ON orders(user_id);
CREATE INDEX idx_orders_status ON orders(status);

Node.js Integration

const { Pool } = require('pg');

const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  ssl: { rejectUnauthorized: true }
});

async function createOrder(userId, amount) {
  const client = await pool.connect();
  try {
    await client.query('BEGIN');

    const { rows: [order] } = await client.query(
      'INSERT INTO orders (user_id, amount) VALUES ($1, $2) RETURNING *',
      [userId, amount]
    );

    await client.query('COMMIT');
    return order;
  } catch (e) {
    await client.query('ROLLBACK');
    throw e;
  } finally {
    client.release();
  }
}

async function getOrdersByStatus(status) {
  const { rows } = await pool.query(
    'SELECT o.*, u.email FROM orders o JOIN users u ON o.user_id = u.id WHERE o.status = $1 ORDER BY o.created_at DESC LIMIT 100',
    [status]
  );
  return rows;
}
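One thing the snippet above glosses over: CockroachDB runs every transaction at SERIALIZABLE isolation, so under contention a transaction can be aborted with SQLSTATE 40001 and the client is expected to retry it. A minimal retry wrapper you could put around `createOrder` (the helper name and backoff values are my own, not from any official client library):

```javascript
// Retry an async operation when it fails with a CockroachDB
// serialization error (SQLSTATE 40001), using exponential backoff.
async function withTxnRetry(fn, maxRetries = 5) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      // Only retry serialization failures, and only up to maxRetries times
      if (err.code !== '40001' || attempt >= maxRetries) throw err;
      // Back off 10ms, 20ms, 40ms, ... before retrying
      await new Promise((resolve) => setTimeout(resolve, 10 * 2 ** attempt));
    }
  }
}
```

Usage: `const order = await withTxnRetry(() => createOrder(userId, 99.99));`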

Multi-Region Survivability

-- CockroachDB can pin data to specific regions.
-- Prerequisite: a multi-region database (ALTER DATABASE ... ADD REGION)
-- and a region column of type crdb_internal_region.
ALTER TABLE users SET LOCALITY REGIONAL BY ROW AS region;

-- Now each row lives closest to its user
INSERT INTO users (email, name, region) VALUES
  ('alice@example.com', 'Alice', 'us-east'),
  ('bob@example.de', 'Bob', 'eu-west'),
  ('yuki@example.jp', 'Yuki', 'ap-northeast');
-- Alice's data stays in US, Bob's in EU (GDPR!), Yuki's in Asia
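The application still has to decide which region value to write for each user. A tiny helper that maps a country code onto the three regions used above (the country-to-region table is illustrative, not something CockroachDB provides):

```javascript
// Map an ISO country code to one of the cluster's region values.
// This mapping is a made-up example — adjust to your own regions.
const REGION_BY_COUNTRY = {
  US: 'us-east',
  DE: 'eu-west',
  FR: 'eu-west',
  JP: 'ap-northeast',
};

function regionFor(countryCode, fallback = 'us-east') {
  return REGION_BY_COUNTRY[countryCode.toUpperCase()] || fallback;
}
```

Pass the result as the `region` column when inserting the user, and the row is homed accordingly.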

Change Data Capture (CDC)

-- Stream changes to a webhook
CREATE CHANGEFEED FOR TABLE orders
  INTO 'webhook-https://your-api.com/webhook'
  WITH format = json, updated, resolved = '10s';

-- Or to Kafka
CREATE CHANGEFEED FOR TABLE orders
  INTO 'kafka://broker:9092?topic_name=order_events'
  WITH format = json;
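On the receiving end, the webhook sink POSTs batches of JSON messages. Here is a small parser for that envelope, assuming the `{ payload: [...], length: N }` shape the webhook sink emits (field names per my reading of the changefeed docs — verify against your cluster's actual output):

```javascript
// Extract the changed rows from a changefeed webhook batch.
// Each payload entry carries the row's new state in `after`
// (null for deletes) and the primary key in `key`.
function parseChangefeedBatch(body) {
  const batch = typeof body === 'string' ? JSON.parse(body) : body;
  return (batch.payload || []).map((msg) => ({
    key: msg.key,          // primary key of the changed row
    row: msg.after,        // null when the row was deleted
    updated: msg.updated,  // commit timestamp (present with WITH updated)
  }));
}
```

From there you can fan each change out to caches, search indexes, or notification queues.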

When to Use CockroachDB

| Choose CockroachDB | Choose regular PostgreSQL |
| --- | --- |
| Need multi-region resilience | Single-region is fine |
| GDPR data residency requirements | No compliance needs |
| Zero-downtime deployments | Maintenance windows OK |
| Horizontal scaling needed | Vertical scaling sufficient |

Need to collect data from distributed sources? Check out my web scraping actors on Apify — pre-built scrapers for any website.

Need a custom data pipeline? Email me at spinov001@gmail.com — I build scalable data collection systems.
