Okay so real talk. I've done it twice.
The first time was during college. I was trying to drop a test table, typed the wrong database name, and wiped out actual user data from a project I was working on with friends. The second time was worse. I was 11pm deep into shipping a feature, ran a migration script, and it quietly nuked 3 months of production records.
Both times I just sat there staring at my screen like 👁️👄👁️
And both times, I had no backup.
So I built ChronoBase: a free, open source, local-first database backup manager for PostgreSQL and MongoDB. It runs on your laptop, has a proper web UI, schedules automatic backups, and lets you restore to any database with one click.
No monthly fees. No cloud account. No random company holding your data hostage.
Why does every backup tool either cost money or suck?
Genuinely asked myself this for way too long.
pgAdmin has manual backups buried in menus. MongoDB Compass doesn't really do scheduled backups. Navicat is like $250. Every cloud backup tool wants your credit card just to get started.
For solo devs and indie hackers running multiple side projects, all you really need is:
- A place to store all your database connection strings
- Something that dumps them automatically on a schedule
- The ability to restore any snapshot to any target
That's it. That's the whole thing. Why does this not exist as a simple, free, open source tool?
So I built it.
What ChronoBase actually does
Here's the quick rundown:
Connections: you paste in your PostgreSQL or MongoDB connection URLs, give them a name, and group them by project (like Production, Staging, MySaaS). The passwords are masked in the UI so you're not staring at credentials all day.
One-click backup: hit the button, it runs pg_dump or mongodump under the hood and saves the .sql file or MongoDB BSON folder to your local disk. It takes like 2 seconds for small DBs.
Scheduled backups: set a cron expression (0 2 * * * = daily at 2am) and forget about it. The server runs in the background and fires backups while you sleep. You wake up and they're just... there.
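If cron syntax is new to you, the five fields read minute, hour, day-of-month, month, day-of-week. A toy parser to make that concrete (ChronoBase delegates the actual scheduling to a cron library; this function is purely illustrative):

```javascript
// Sketch: name the five cron fields so "0 2 * * *" is readable.
// Illustrative only; not ChronoBase's scheduler code.
function describeCron(expr) {
  const names = ['minute', 'hour', 'dayOfMonth', 'month', 'dayOfWeek'];
  const parts = expr.trim().split(/\s+/);
  if (parts.length !== 5) throw new Error('expected 5 cron fields');
  return Object.fromEntries(parts.map((p, i) => [names[i], p]));
}

// describeCron('0 2 * * *')
// -> minute 0, hour 2, every day = daily at 2:00 AM
```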
Restore: pick any snapshot, choose "same database" (overwrites with the old snapshot) or "push to different DB" (clone to staging, a new server, whatever). One click.
Wipe DB: this one's spicy. Sometimes you want to completely nuke the contents of a database before restoring a backup. The Wipe DB button drops every table and every schema (DROP SCHEMA public CASCADE for Postgres), then you restore clean. No leftover junk data.
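For Postgres, the wipe boils down to two statements: drop the public schema with CASCADE (which takes every table, view, and function inside it), then recreate it empty. As a sketch, with a helper name that's mine, not ChronoBase's:

```javascript
// Sketch: the SQL behind a Postgres wipe. DROP ... CASCADE removes
// everything in the public schema; CREATE brings back an empty,
// usable schema to restore into.
function pgWipeStatements() {
  return [
    'DROP SCHEMA public CASCADE;',
    'CREATE SCHEMA public;',
  ];
}
```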
Danger Zone: full reset with typed confirmation. You literally have to type DELETE EVERYTHING to confirm. Because I've seen what happens when buttons don't have confirmation dialogs.
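The guard itself is tiny, and deliberately strict. A sketch (the function name is illustrative):

```javascript
// Sketch: a typed-confirmation guard. Exact match only: no
// trimming, no case-insensitivity. You have to mean it.
const CONFIRM_PHRASE = 'DELETE EVERYTHING';

function confirmReset(typed) {
  return typed === CONFIRM_PHRASE;
}
```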
The tech stack (and why I kept it stupid simple)
Okay so I went with Node.js + Express on the backend and vanilla ES modules on the frontend. No React, no bundler, no build step.
Why? Because I wanted anyone to be able to clone this, read the code, and immediately understand what's happening. No npm run build, no compiled bundle, no mystery.
Data is stored in a plain JSON file (chronobase-data.json). No SQLite, no native modules that break on Node v24 (learned that one the hard way mid-build). If you want to move to a new machine, you literally just copy the folder.
The project structure is modular, one file per concern:
src/
  database.js   – JSON read/write
  tools.js      – finds pg_dump/mongodump on your system
  backup.js     – runs the actual dumps and restores
  scheduler.js  – manages cron jobs
  routes/       – one route file per resource
The frontend is the same: public/js/pages/ has one file per page. Clean, readable, zero magic.
The actual hard part: detecting CLI tools across Windows, macOS, and Linux
You know what's annoying? pg_dump lives in completely different places depending on OS, PostgreSQL version, and whether you installed it through an installer, Homebrew, or apt.
The tools.js module checks a list of known paths:
const PG_DUMP_CANDIDATES = [
'pg_dump',
'/opt/homebrew/bin/pg_dump',
'C:\\Program Files\\PostgreSQL\\18\\bin\\pg_dump.exe',
'C:\\Program Files\\PostgreSQL\\17\\bin\\pg_dump.exe',
// ... and more
];
It probes each one by running "<path>" --version with a 5 second timeout. The first one that responds gets used. Works on every platform without configuration.
The UI shows green/red dots in the sidebar telling you instantly which tools are detected. If something's missing, the Setup page gives you exact install commands for Windows, macOS, and Linux.
How to run it right now (for real, it takes 3 minutes)
git clone https://github.com/Subham-Maity/chronobase.git
cd chronobase
npm install
node server.js
Open http://localhost:3420.
That's it. It's running. Add your first connection string and click Backup.
On Windows, you can also double-click START.bat, which auto-installs dependencies, starts the server, and opens the browser for you.
Why local-first matters more than you think
Every time you send your database connection string to a cloud backup service, that's another place your credentials live that isn't yours.
ChronoBase keeps everything local. Your .sql files are on your disk. Your connection URLs are in a JSON file on your machine. Nothing is uploaded, transmitted, or stored anywhere external.
For solo devs this is usually fine as-is. For anything with real user data, you obviously want encryption at rest; that's on the roadmap. But the local-first baseline is the right default.
What I want to add next
Current roadmap (also in the README if you want to contribute):
- S3 / R2 / Backblaze upload: offsite backup support
- Backup compression: gzip the .sql files, they can get large
- Slack/Discord notifications: ping when a scheduled backup fails
- MySQL / MariaDB: same engine, just different CLI tools
- Docker image: docker run -p 3420:3420 chronobase
- Encryption: AES-256 for stored connection URLs
If any of these sound useful to you, PRs are open. The codebase is intentionally small and readable: a decent Node dev can add a new route in like 20 minutes.
The real reason I'm sharing this
I know there are thousands of devs running side projects and SaaS apps on $5 droplets who don't have a proper backup strategy. It's not because they don't care. It's because setting one up is annoying and all the existing tools are either overkill or paywalled.
This is meant to be the thing that just works. Run it on your laptop or your home server, add your connection strings, set a schedule, and stop thinking about it.
Because the backup you don't have is the one you'll wish you did at 1am when things go wrong.
GitHub: github.com/Subham-Maity/chronobase
If it saves you from a disaster, drop a ⭐. It genuinely helps the project get discovered.
And if you've had your own database horror story, tell me in the comments. Misery loves company.
