A developer I know ran a schema migration on a production MySQL database. The `ALTER TABLE` locked the table for 4 minutes, 50,000 users got 500 errors, and his Slack exploded.

PlanetScale solves this with non-blocking schema changes. It's MySQL (built on Vitess), but with Git-like branching for your database schema.
## What PlanetScale Offers for Free

The Hobby plan includes:
- 5 GB storage and 1 billion row reads/month
- MySQL 8.0 compatible — use any MySQL client
- Branching — test schema changes on a branch before merging to production
- Non-blocking schema changes — no table locks, ever
- Connection strings with built-in connection pooling
- Automatic backups
## Quick Start

```bash
# Install the CLI (Homebrew; see the PlanetScale docs for other platforms)
brew install planetscale/tap/pscale

# Log in and create a database
pscale auth login
pscale database create my-app --region us-east

# Open a MySQL shell on the main branch
pscale shell my-app main
```
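The shell is fine for poking around; an application instead connects with credentials generated by `pscale password create my-app main my-password-name`, assembled into a standard MySQL connection string. A sketch with placeholder values (the real host and credentials come from the CLI output):

```javascript
// Build a PlanetScale-style connection string from generated credentials.
// All values here are placeholders from `pscale password create` output.
function buildDatabaseUrl({ user, password, host, database }) {
  // PlanetScale requires TLS, so pin certificate verification on.
  const ssl = encodeURIComponent('{"rejectUnauthorized":true}');
  return `mysql://${user}:${encodeURIComponent(password)}@${host}/${database}?ssl=${ssl}`;
}

const url = buildDatabaseUrl({
  user: 'abc123',
  password: 'pscale_pw_example',
  host: 'aws.connect.psdb.cloud',
  database: 'my-app',
});
console.log(url);
```

Put the result in `DATABASE_URL` and any MySQL client that understands connection URIs (such as mysql2 below) can use it.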
## Create Schema

```sql
CREATE TABLE users (
  id BIGINT AUTO_INCREMENT PRIMARY KEY,
  email VARCHAR(255) UNIQUE NOT NULL,
  name VARCHAR(255) NOT NULL,
  plan ENUM('free', 'pro', 'enterprise') DEFAULT 'free',
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- No FOREIGN KEY on user_id: PlanetScale does not enforce foreign key
-- constraints by default, so reference by plain column plus index.
CREATE TABLE posts (
  id BIGINT AUTO_INCREMENT PRIMARY KEY,
  user_id BIGINT NOT NULL,
  title VARCHAR(500) NOT NULL,
  content TEXT,
  published BOOLEAN DEFAULT FALSE,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  KEY idx_user_id (user_id),
  KEY idx_published (published, created_at),
  FULLTEXT KEY idx_search (title, content)  -- needed for MATCH ... AGAINST below
);
```
## Git-Like Schema Branching

```bash
# Create a branch for schema changes
pscale branch create my-app add-comments

# Connect to the branch and make changes safely
pscale shell my-app add-comments
# > CREATE TABLE comments (...);
# > ALTER TABLE posts ADD COLUMN comment_count INT DEFAULT 0;

# Create a deploy request (like a pull request)
pscale deploy-request create my-app add-comments

# Review and deploy: zero downtime, no locks
pscale deploy-request deploy my-app 1
```
## Node.js Integration

```javascript
const mysql = require('mysql2/promise');

// DATABASE_URL should include ?ssl={"rejectUnauthorized":true}
// (PlanetScale requires TLS connections)
const pool = mysql.createPool(process.env.DATABASE_URL);

async function createPost(userId, title, content) {
  const [result] = await pool.execute(
    'INSERT INTO posts (user_id, title, content, published) VALUES (?, ?, ?, ?)',
    [userId, title, content, false]
  );
  return result.insertId;
}

async function getPublishedPosts(page = 1, perPage = 20) {
  // Inline LIMIT/OFFSET as validated integers; binding them as ?
  // placeholders through execute() trips up MySQL's prepared
  // statements (a known mysql2 gotcha).
  const limit = Math.max(1, Number(perPage) | 0);
  const offset = Math.max(0, (Number(page) - 1) * limit | 0);
  const [rows] = await pool.execute(
    `SELECT p.*, u.name AS author
     FROM posts p JOIN users u ON p.user_id = u.id
     WHERE p.published = TRUE
     ORDER BY p.created_at DESC
     LIMIT ${limit} OFFSET ${offset}`
  );
  return rows;
}

async function searchPosts(query) {
  // Requires a FULLTEXT index on (title, content)
  const [rows] = await pool.execute(
    `SELECT * FROM posts
     WHERE MATCH(title, content) AGAINST(? IN BOOLEAN MODE)
       AND published = TRUE
     LIMIT 20`,
    [query]
  );
  return rows;
}
```
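`LIMIT ... OFFSET` pagination forces MySQL to scan and discard every skipped row, so deep pages get slow on large tables. One alternative is keyset (cursor) pagination, sketched below with a hypothetical `publishedPostsQuery` helper that seeks on the indexed `(created_at, id)` pair instead of an offset:

```javascript
// Hypothetical keyset-pagination helper. `cursor` is the { createdAt, id }
// of the last row on the previous page, or null for the first page.
function publishedPostsQuery(cursor, perPage = 20) {
  const limit = Math.max(1, Number(perPage) | 0); // validated integer for LIMIT
  const where = cursor
    ? 'p.published = TRUE AND (p.created_at, p.id) < (?, ?)' // seek past the cursor
    : 'p.published = TRUE';
  const sql = `SELECT p.*, u.name AS author
     FROM posts p JOIN users u ON p.user_id = u.id
     WHERE ${where}
     ORDER BY p.created_at DESC, p.id DESC
     LIMIT ${limit}`;
  return { sql, params: cursor ? [cursor.createdAt, cursor.id] : [] };
}
```

Run it with `pool.execute(sql, params)` and use the last row's `created_at`/`id` as the next cursor. The row-value comparison `(a, b) < (x, y)` is valid MySQL 8 syntax; whether it uses the index efficiently is worth checking with `EXPLAIN`.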
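`IN BOOLEAN MODE` gives characters like `+`, `-`, `*`, and `"` operator meaning, so raw user input can produce syntax errors or surprising matches. A small illustrative sanitizer (the exact operator set to strip is a judgment call) that requires every remaining word:

```javascript
// Strip boolean-mode operator characters from user input and require
// every remaining word, e.g. 'c++ tips' -> '+c +tips'.
function toBooleanQuery(input) {
  return input
    .replace(/[+\-><()~*"@]/g, ' ') // drop operator characters
    .split(/\s+/)
    .filter(Boolean)
    .map((word) => `+${word}`)      // AND-match each term
    .join(' ');
}
```

Then call `searchPosts(toBooleanQuery(userInput))` instead of passing the raw string.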
## Prisma Integration

```prisma
// schema.prisma
datasource db {
  provider     = "mysql"
  url          = env("DATABASE_URL")
  relationMode = "prisma" // emulate relations in the client: PlanetScale has no FK constraints by default
}

generator client {
  provider = "prisma-client-js"
}

model User {
  id    Int    @id @default(autoincrement())
  email String @unique
  posts Post[]
}

model Post {
  id        Int      @id @default(autoincrement())
  title     String
  userId    Int
  user      User     @relation(fields: [userId], references: [id])
  createdAt DateTime @default(now())

  @@index([userId]) // declare manually: emulated relations create no FK index
}
```
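If the `add-comments` branch from the branching example gets deployed, a matching Prisma model could look like the sketch below (illustrative, not generated). With `relationMode = "prisma"` there is no database-level foreign key, so the index on the relation column must be declared by hand:

```prisma
model Comment {
  id        Int      @id @default(autoincrement())
  body      String   @db.Text
  postId    Int
  post      Post     @relation(fields: [postId], references: [id])
  createdAt DateTime @default(now())

  @@index([postId]) // no FK under relationMode "prisma", so index manually
}
```

The `Post` model would also gain a `comments Comment[]` back-relation field.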
## Why PlanetScale Over Regular MySQL

| PlanetScale | Regular MySQL |
|---|---|
| Non-blocking schema migrations | `ALTER TABLE` locks the table |
| Schema branching with deploy requests | Hope and pray |
| Built-in connection pooling | Manual proxy setup (e.g., ProxySQL) |
| Built-in read replicas | Hand-rolled replication config |
| Usage-based pricing | Pay for idle capacity |
Need to scrape data into your database? Check out my web scraping actors on Apify — automated data collection from any website.
Building a custom data pipeline? Email me at spinov001@gmail.com — I connect web scraping to databases.