Why I Stopped Using PostgreSQL for Side Projects (And You Should Too)
PostgreSQL is amazing. But for side projects, it's overkill. Here's what I use instead.
The Problem with Postgres for Small Projects
Setting up PostgreSQL:
1. Install: apt install postgresql (or Docker)
2. Create database: createdb myapp
3. Create user: createuser appuser
4. Grant permissions: GRANT ALL PRIVILEGES
5. Configure connection: host/port/user/password in .env
6. Install ORM: npm install prisma/typeorm/knex
7. Write schema: define tables, relations, indexes
8. Run migrations: npx prisma migrate dev
9. Seed data: npx prisma db seed
10. Backup strategy: pg_dump for backups
... and you just wanted to store a few users and blog posts.
What I Use Instead: SQLite
// No setup! Just require it:
const Database = require('better-sqlite3');
const db = new Database('data.db');
// It's a real SQL database, just file-based!
const users = db.prepare('SELECT * FROM users WHERE id = ?').get(userId);
db.prepare('INSERT INTO users (name, email) VALUES (?, ?)').run(name, email);
// Transactions work!
const insertUser = db.transaction((name, email) => {
const result = db.prepare('INSERT INTO users (name, email) VALUES (?, ?)').run(name, email);
db.prepare('INSERT INTO audit_log (action, user_id) VALUES (?, ?)').run('create', result.lastInsertRowid);
return result;
});
Why SQLite Wins for Side Projects
| Feature | PostgreSQL | SQLite |
|---|---|---|
| Setup | 10+ minutes | 0 seconds |
| Server process | Background daemon | None (in-process) |
| Connection string | Required | Just a filename |
| Backups | pg_dump / custom scripts | Copy the .db file |
| Portability | Need server | File works anywhere |
| Memory | 50-100MB base | <1MB |
| Concurrency | Excellent | Good enough (<100 req/s) |
| Max DB size | Unlimited | 281TB (yes, really!) |
| Full-text search | Built-in | FTS5 (usually compiled in) |
| JSON support | ✅ | ✅ (built in since 3.38) |
When to Use What
✅ Use SQLite when:
- Side projects / MVPs / prototypes
- < 100 concurrent writes/second
- Single server deployment
- Want zero-ops database
- Need simple backup (copy file)
- Building tools/utilities
- Embedded/IoT applications
✅ Use PostgreSQL when:
- Multiple servers need access
- High write concurrency (1000+ req/s)
- Complex relational queries at scale
- Need specific PostGIS/PG features
- Team of developers sharing one DB
- Enterprise compliance requirements
- Need row-level security
My SQLite Setup (Production-Ready)
// db.js
import Database from 'better-sqlite3';
import path from 'path';
const db = new Database(path.join(process.cwd(), 'data.db'));
// Enable WAL mode for better concurrent reads
db.pragma('journal_mode = WAL');
db.pragma('synchronous = NORMAL');
db.pragma('foreign_keys = ON');
db.pragma('cache_size = -64000'); // 64MB cache
// Auto-create tables on startup
db.exec(`
CREATE TABLE IF NOT EXISTS users (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL,
email TEXT UNIQUE NOT NULL,
role TEXT DEFAULT 'user',
active INTEGER DEFAULT 1,
created_at TEXT DEFAULT (datetime('now')),
updated_at TEXT DEFAULT (datetime('now'))
);
CREATE TABLE IF NOT EXISTS posts (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id INTEGER NOT NULL,
title TEXT NOT NULL,
body TEXT DEFAULT '',
published INTEGER DEFAULT 0,
created_at TEXT DEFAULT (datetime('now')),
FOREIGN KEY (user_id) REFERENCES users(id)
);
CREATE INDEX IF NOT EXISTS idx_users_email ON users(email);
CREATE INDEX IF NOT EXISTS idx_posts_user ON posts(user_id);
CREATE INDEX IF NOT EXISTS idx_posts_published ON posts(published, created_at DESC);
`);
export default db;
Migrations Made Simple
// migrations/001_initial.js
export function up(db) {
db.exec(`
CREATE TABLE IF NOT EXISTS users (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL,
email TEXT UNIQUE NOT NULL,
created_at TEXT DEFAULT (datetime('now'))
)
`);
}
export function down(db) {
db.exec(`DROP TABLE IF EXISTS users`);
}
// migrate.js
import Database from 'better-sqlite3';
import { readdir } from 'fs/promises';
import path from 'path';
import { pathToFileURL } from 'url';
async function migrate() {
const db = new Database('data.db');
// Track applied migrations
db.exec(`CREATE TABLE IF NOT EXISTS _migrations (name TEXT PRIMARY KEY, applied_at TEXT)`);
const files = (await readdir('./migrations'))
.filter(f => f.endsWith('.js'))
.sort();
for (const file of files) {
const applied = db.prepare('SELECT 1 FROM _migrations WHERE name = ?').get(file);
if (!applied) {
console.log(`Running migration: ${file}`);
// A plain relative path would be treated as a bare module name,
// so build a file:// URL for the dynamic import
const mod = await import(pathToFileURL(path.resolve('./migrations', file)).href);
await mod.up(db);
db.prepare("INSERT INTO _migrations (name, applied_at) VALUES (?, datetime('now'))").run(file);
}
}
console.log('Migrations complete!');
}
migrate();
Performance Tips
// 1. Prepared statements (reusable, fast)
const getUserById = db.prepare('SELECT * FROM users WHERE id = ?');
const findUserByEmail = db.prepare('SELECT * FROM users WHERE email = ?');
const createUser = db.prepare('INSERT INTO users (name, email) VALUES (?, ?)');
const updateUser = db.prepare("UPDATE users SET name = ?, updated_at = datetime('now') WHERE id = ?");
// 2. Transactions for multiple writes
const transferFunds = db.transaction((fromId, toId, amount) => {
db.prepare('UPDATE accounts SET balance = balance - ? WHERE id = ?').run(amount, fromId);
db.prepare('UPDATE accounts SET balance = balance + ? WHERE id = ?').run(amount, toId);
});
// 3. WAL mode (already enabled above)
// Allows readers while someone is writing!
// 4. Indexes on frequently queried columns
db.exec('CREATE INDEX IF NOT EXISTS idx_posts_published ON posts(published)');
// 5. Batch inserts for bulk data
const insertMany = db.transaction((items) => {
const stmt = db.prepare('INSERT INTO items (name, value) VALUES (?, ?)');
for (const item of items) stmt.run(item.name, item.value);
});
insertMany(largeArray); // Much faster than individual inserts
Real-World Examples Using SQLite
Projects I've built with SQLite:
- Blog engine (posts, comments, tags) β 10K+ posts, no issues
- Task manager (users, tasks, projects) β 500 active users
- Analytics dashboard (events, sessions) β 50K events/day
- Configuration service (key-value settings) β Instant lookups
- API rate limiter (request logs) β Writes every request, still fine
All running on $5 VPS with SQLite. Zero database maintenance.
What database do you use for side projects? Still loyal to Postgres?
Follow @armorbreak for more developer content.