For my hobby projects I like to download the live production database and load it into my localhost environment, so that I can quickly explore data or test potentially dangerous migrations.
## Assumptions
This post assumes your app is hosted on Railway and that you use PostgreSQL as your database. Optionally, I'll also assume that you use dotenv for your Ruby on Rails backend and that your app runs locally on Docker.
## Credentials
In my app I have two env files. The first one is `./backend/.env.development`:

```shell
# localhost
PGDATABASE=some-local-database
PGUSER=postgres
PGPASSWORD=somepassword
PGHOST=db
PGPORT=5432

# docker:db container defaults
POSTGRES_PASSWORD=${PGPASSWORD}
```
Thanks to this config, my application can connect to the local database using the same set of environment variables as on Railway.
My Rails database config, `backend/config/database.yml`, then looks like this:

```yaml
default: &default
  adapter: postgresql
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>
  timeout: 5000
  encoding: unicode
  database: <%= ENV["PGDATABASE"] %>
  username: <%= ENV["PGUSER"] %>
  password: <%= ENV["PGPASSWORD"] %>
  host: <%= ENV["PGHOST"] %>

production:
  <<: *default

development:
  <<: *default

test:
  <<: *default
  database: some_app_test
```
The aliasing of `POSTGRES_PASSWORD` takes advantage of the default PostgreSQL Docker image behavior, which creates the respective user on container creation.

One thing to note is that Railway uses TimescaleDB, so your `docker-compose.yml` will need a TimescaleDB image:
```yaml
services:
  db:
    env_file:
      - ./backend/.env.development
    image: timescale/timescaledb-postgis:latest-pg13
```
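For completeness, here is a fuller sketch of what that service might look like with a persistent volume and a host port mapping, so host-side `psql`/`pg_restore` can reach it. The volume name and port mapping are assumptions for illustration, not from my actual setup:

```yaml
services:
  db:
    image: timescale/timescaledb-postgis:latest-pg13
    env_file:
      - ./backend/.env.development
    ports:
      - "5432:5432"            # expose so tools on the host can connect
    volumes:
      - pgdata:/var/lib/postgresql/data   # persist data across restarts

volumes:
  pgdata:
```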
The second file, `./backend/.env.production`, looks very similar:

```shell
# railway.app
PGDATABASE=some-railway-database
PGUSER=postgres
PGHOST=some-railway-url.railway.app
PGPORT=1234
```
You need to fill this one in with the credentials found in your Railway project under PostgreSQL -> Variables.
## The script
Now for the actual dump/download/load I'm using the following script, `./bin/database_pull.sh`:

```shell
#!/bin/bash

DUMP_FILENAME='./some-app.com_dump.sql'

source ./backend/.env.production || exit $?

echo '💾 Creating production dump'
pg_dump -U "$PGUSER" -h "$PGHOST" -p "$PGPORT" -W -F t "$PGDATABASE" > "$DUMP_FILENAME"

source ./backend/.env.development || exit $?

echo '🎡 Loading dump on localhost'
dropdb "$PGDATABASE" -p "$PGPORT" -U "$PGUSER" -h "$PGHOST"
createdb "$PGDATABASE" -p "$PGPORT" -U "$PGUSER" -h "$PGHOST"
pg_restore -U "$PGUSER" -h "$PGHOST" -p "$PGPORT" -W -F t -d "$PGDATABASE" "$DUMP_FILENAME"

echo '✨ Success!'
```
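Since the script drops the local database right after sourcing an env file, it's easy to imagine destroying production by sourcing the wrong file. A small guard (a hypothetical helper, not part of the script above) could abort when the sourced host still looks like a Railway host:

```shell
#!/bin/bash
# Hypothetical safety guard: refuse to continue when the host looks like a
# Railway production host rather than the local Docker container.
assert_local_host() {
  case "$1" in
    *railway.app*)
      echo "refusing: '$1' looks like a production host" >&2
      return 1
      ;;
    *)
      return 0
      ;;
  esac
}

# Call it between sourcing .env.development and running dropdb:
assert_local_host "$PGHOST" || exit 1
```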
Because of the `-W` flags, this script needs the values of `PGPASSWORD` to be entered manually. You have to provide the production password first and the local one second:

```shell
./bin/database_pull.sh
💾 Creating production dump
Password:
🎡 Loading dump on localhost
Password:
✨ Success!
```
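If you'd rather skip the interactive prompts: all the libpq-based tools used here (`pg_dump`, `pg_restore`, `dropdb`, `createdb`) honor the standard `PGPASSWORD` environment variable, so exporting it lets you drop the `-W` flags entirely. A sketch, using the illustrative password from the env file above (the command is printed rather than executed, since there is no server to talk to here):

```shell
# PGPASSWORD is read automatically by libpq clients; with it exported,
# pg_dump no longer needs -W and never prompts.
export PGPASSWORD=somepassword   # illustrative value from .env.development

# The dump step would then become:
echo 'pg_dump -U "$PGUSER" -h "$PGHOST" -p "$PGPORT" -F t "$PGDATABASE" > dump.tar'
```

An alternative to exporting passwords into the environment is a `~/.pgpass` file, which the same tools also consult.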
## Conclusion
Developer Experience is an important part of every developer's life. Being able to use a standard configuration and run one command to get my hands on the data I need, in the format I prefer, is IMO a huge DX improvement.
*Image generated by Midjourney, prompt: railway hosting; blog post; downloading database; coding; ruby on rails --ar 16:9*