Quick Reference Table
| # | Operation | Makefile Command | Direct Command (Shortened) | Safety | Confirmation |
|---|---|---|---|---|---|
| 1 | Copy staging → local | `make local-import-staging` | `npx supabase db reset --local` + dumps | ✅ | None |
| 2 | Copy production → local | `make local-import-production` | `npx supabase db reset --local` + dumps | ✅ | None |
| 3 | Create new migration | `make local-create-migration` | `npx supabase db diff -f name` | ✅ | None |
| 4 | Diff local to file | `make local-get-migration` | `npx supabase db diff -f name --local` | ✅ | None |
| 5 | Apply migrations locally | `make local-apply-migration` | `npx supabase db push --local` | ✅ | None |
| 6 | Reset local DB | `make local-reset` | `npx supabase db reset --local` | ⚠️ | `yes` |
| 7 | CLI authentication | `make supabase-login` | `npx supabase login` | ✅ | None |
| 8 | Diff staging to file | `make staging-get-migration` | `npx supabase db diff -f name --db-url "$STAGING_DB_URL"` | ✅ | None |
| 9 | Deploy to staging | `make staging-apply-migration` | `npx supabase db push --db-url "$STAGING_DB_URL"` | ⚠️ | `yes` |
| 10 | Reset staging DB | `make staging-reset` | `psql "$STAGING_DB_URL" -c "DROP SCHEMA..."` | 🔶 | `RESET-STAGING` |
| 11 | Diff production to file | `make prod-get-migration` | `npx supabase db diff -f name --db-url "$PROD_DB_URL"` | ✅ | None |
| 12 | Deploy to production | `make prod-apply-migration` | `npx supabase db push --db-url "$PROD_DB_URL"` | 🔴 | `APPLY-TO-PRODUCTION` |
| 13 | Promote staging → prod | `make prod-apply-staging-migration` | `npx supabase db dump` + `psql "$PROD_DB_URL"` | 🔴 | `APPLY-TO-PRODUCTION` |
| 14 | Reset production DB | `make prod-reset` | `psql "$PROD_DB_URL" -c "DROP SCHEMA..."` | 🔴🔴 | `DELETE-ALL-PRODUCTION-DATA` |
| 15 | Verify tools installed | `make check-deps` | `node --version && npx supabase --version` | ✅ | None |
| 16 | Remove dump files | `make clean` | `rm -f *.sql` | ✅ | None |
| 17 | Show DB URLs | `make check-env` | `echo $STAGING_DB_URL` | ✅ | None |
| 18 | Clear migration history | `make delete-migrations` | `psql -c "TRUNCATE supabase_migrations..."` | ⚠️ | `yes` |
Introduction
Managing database migrations across multiple environments (local, staging, production) can be challenging. This guide covers two approaches: a Makefile-based workflow for automation and safety, and direct commands for finer control.
Choose your approach:
- 🔧 With Makefile: Automated, safe, with built-in confirmations (requires setup files)
- ⚡ Direct Commands: Manual, flexible, no setup required (jump to the "Direct Commands" section)
Why Use a Migration Management System?
Traditional database migration management often leads to:
- 🔴 Accidental production changes
- 🔴 Manual command errors
- 🔴 Inconsistent environments
- 🔴 Lost migration history
A structured approach provides:
- ✅ Safety confirmations for destructive operations
- ✅ Automated workflows
- ✅ Clear environment separation
- ✅ Version-controlled migrations
Prerequisites
Before getting started, ensure you have:
- Node.js and npm installed
- Supabase CLI installed:
npm install supabase --save-dev
- Docker Desktop running (required for local Supabase)
- PostgreSQL client tools (psql) - optional but recommended
Initialize your project:
npx supabase init
npx supabase start
If using the Makefile approach, additionally run:
chmod +x scripts/*.sh # Make scripts executable
Two Approaches to Choose From
Approach 1: Using Makefile (Requires Setup Files)
If you have access to the Makefile and helper scripts, this approach provides automation and safety features.
Pros:
- ✅ Simple, memorable commands
- ✅ Built-in safety confirmations
- ✅ Automated workflows
- ✅ Less chance of errors
Cons:
- ❌ Requires Makefile and script files
- ❌ Less flexibility for custom operations
Continue reading the sections below for Makefile commands.
Approach 2: Direct Commands (No Setup Required)
If you don't have the Makefile or prefer manual control, you can run Supabase CLI and psql commands directly.
Pros:
- ✅ No setup files needed
- ✅ Full control over operations
- ✅ Easy to integrate into CI/CD
- ✅ Works anywhere
Cons:
- ❌ More verbose commands
- ❌ Manual safety checks
- ❌ Higher chance of typos
Skip to the "Direct Commands (Without Makefile)" section below.
Project Structure (For Makefile Approach)
your-project/
├── Makefile # Main command interface
├── .env # Environment variables (gitignored)
├── scripts/
│ ├── apply-to-local.sh # Apply dumps to local
│ ├── dump-staging-migrations.sh # Export staging schema
│ ├── apply-migrations-to-prod.sh # Push to production
│ └── reset-remote-db.sh # Reset remote databases
└── supabase/
└── migrations/ # Migration files
Note: If you don't have these files, use the Direct Commands approach instead.
Environment Setup
Create a `.env` file in your project root:
# Local Environment (default)
LOCAL_DB_URL=postgresql://postgres:postgres@localhost:54322/postgres
# Staging Environment
STAGING_DB_URL=postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres
STAGING_PROJECT_REF=your-staging-ref
# Production Environment
PROD_DB_URL=postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres
PROD_PROJECT_REF=your-production-ref
⚠️ Security Note: Never commit `.env` to version control! Add it to `.gitignore`.
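Whether or not you use the Makefile, you can load everything from `.env` into your current shell session instead of exporting each URL by hand. A small convenience sketch (the guard and echo are illustrative):

```shell
# Export every variable defined in .env into the current shell session.
# "set -a" marks all variables defined while it is active for export.
if [ -f .env ]; then
  set -a
  . ./.env
  set +a
fi

# The URLs are now available to every subsequent command
echo "Staging: ${STAGING_DB_URL:-<not set>}"
```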
Complete Command Reference
🖥️ Local Environment Commands
1. Import Staging to Local
With Makefile:
make local-import-staging
Direct Command:
# Set staging URL
STAGING_DB_URL="postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres"
# Reset and import
npx supabase db reset --local
npx supabase db dump --db-url "$STAGING_DB_URL" --schema public > staging_schema.sql
npx supabase db dump --db-url "$STAGING_DB_URL" --data-only -f staging_data.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < staging_schema.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < staging_data.sql
What it does:
- Resets local database (drops existing data)
- Dumps staging schema and data
- Applies both to local environment
- Perfect for testing staging data locally
Use case: You want to debug an issue reported in staging with real data.
2. Import Production to Local
With Makefile:
make local-import-production
Direct Command:
# Set production URL
PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
# Reset and import
npx supabase db reset --local
npx supabase db dump --db-url "$PROD_DB_URL" --schema public > production_schema.sql
npx supabase db dump --db-url "$PROD_DB_URL" --data-only -f production_data.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < production_schema.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < production_data.sql
What it does:
- Resets local database
- Dumps production schema and data
- Applies both to local environment
Use case: Investigating a production bug or testing a hotfix with production data.
⚠️ Warning: Production data may contain sensitive information. Handle with care!
3. Create New Migration
With Makefile:
make local-create-migration
# Enter migration name when prompted
Direct Command:
npx supabase db diff -f add_user_profiles_table
What it does:
- Creates a new timestamped migration file in
supabase/migrations/
- Captures your local database changes
Example workflow:
# Make changes to your local database via Supabase Studio
# Then capture them:
npx supabase db diff -f add_user_profiles
# Creates: 20251004123045_add_user_profiles.sql
4. Get Migration from Local Changes
With Makefile:
make local-get-migration
# Enter migration name when prompted
Direct Command:
npx supabase db diff -f capture_all_changes --local
What it does:
- Diffs your local database against the migration history
- Generates a new migration file with the differences
Use case: You've made multiple changes and want to capture them all as one migration.
5. Apply Local Migrations
With Makefile:
make local-apply-migration
Direct Command:
npx supabase db push --local
What it does:
- Applies all pending migrations to local database
Use case: After pulling new migrations from Git, sync your local database.
6. Reset Local Database
With Makefile:
make local-reset
# Type 'yes' to confirm
Direct Command:
npx supabase db reset --local
What it does:
- Drops all local data
- Reapplies all migrations from scratch
Use case: Clean slate for testing or recovering from a corrupted local state.
7. Supabase Login
With Makefile:
make supabase-login
Direct Command:
npx supabase login
What it does:
- Authenticates with Supabase CLI
- Required before running remote operations
🔶 Staging Environment Commands
8. Get Migration from Staging
With Makefile:
make staging-get-migration
# Enter migration name when prompted
Direct Command:
STAGING_DB_URL="postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres"
npx supabase db diff -f migration_from_staging --db-url "$STAGING_DB_URL"
What it does:
- Diffs staging database against local migrations
- Creates a migration file capturing staging-specific changes
Use case: Someone made manual changes in staging that need to be migrated to code.
9. Apply Migration to Staging
With Makefile:
make staging-apply-migration
# Type 'yes' to confirm
Direct Command:
STAGING_DB_URL="postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres"
npx supabase db push --db-url "$STAGING_DB_URL"
What it does:
- Pushes local migrations to staging
Use case: Deploying tested changes from local to staging for QA testing.
Safety level: ⚠️ Medium - affects staging environment
10. Reset Staging Database
With Makefile:
make staging-reset
# Type 'RESET-STAGING' to confirm
Direct Command:
STAGING_DB_URL="postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres"
# Execute reset
psql "$STAGING_DB_URL" -c "DROP SCHEMA IF EXISTS public CASCADE; CREATE SCHEMA public; GRANT ALL ON SCHEMA public TO postgres; GRANT ALL ON SCHEMA public TO public;"
What it does:
- Drops public schema
- Recreates empty schema
Use case: Starting fresh after testing or clearing corrupted staging data.
Safety level: 🔶 High - destroys staging data
🔴 Production Environment Commands
11. Get Migration from Production
With Makefile:
make prod-get-migration
# Enter migration name when prompted
Direct Command:
PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
npx supabase db diff -f migration_from_production --db-url "$PROD_DB_URL"
What it does:
- Diffs production database against local migrations
- Creates migration file capturing production state
Use case: Documenting manual production hotfixes in code.
12. Apply Migration to Production
With Makefile:
make prod-apply-migration
# Type 'APPLY-TO-PRODUCTION' to confirm
Direct Command:
PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
npx supabase db push --db-url "$PROD_DB_URL"
What it does:
- Pushes local migrations to production
Use case: Final deployment after successful staging tests.
Safety level: 🔴 Critical - affects production
13. Apply Staging Migration to Production
With Makefile:
make prod-apply-staging-migration
# Type 'APPLY-TO-PRODUCTION' to confirm
Direct Command:
STAGING_DB_URL="postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres"
PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
# Dump staging to a timestamped file
dump_file="staging_to_prod_$(date +%Y%m%d_%H%M%S).sql"
npx supabase db dump --db-url "$STAGING_DB_URL" --schema public > "$dump_file"
# Review the file first!
cat "$dump_file"
# Apply to production
psql "$PROD_DB_URL" < "$dump_file"
What it does:
- Dumps current staging schema
- Creates timestamped migration file
- Applies staging state to production
Use case: Promoting a tested staging environment to production.
Safety level: 🔴 Critical - affects production
14. Reset Production Database
With Makefile:
make prod-reset
# Type 'DELETE-ALL-PRODUCTION-DATA' to confirm
Direct Command:
PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
# ⚠️⚠️⚠️ EXTREMELY DANGEROUS ⚠️⚠️⚠️
psql "$PROD_DB_URL" -c "DROP SCHEMA IF EXISTS public CASCADE; CREATE SCHEMA public; GRANT ALL ON SCHEMA public TO postgres; GRANT ALL ON SCHEMA public TO public;"
What it does:
- Drops public schema in production
Use case: Rare! Only for complete environment resets or catastrophic recovery.
Safety level: 🔴🔴 Extremely Dangerous - destroys all production data
🔧 Utility Commands
15. Check Dependencies
With Makefile:
make check-deps
Direct Command:
node --version
npx --version
npx supabase --version
psql --version
What it does:
- Verifies Node.js, npx, the Supabase CLI, and psql are installed
16. Clean Dump Files
With Makefile:
make clean
Direct Command:
rm -f staging_schema.sql staging_data.sql production_schema.sql production_data.sql staging_migration_*.sql
What it does:
- Removes generated SQL dump files
17. Check Environment Configuration
With Makefile:
make check-env
Direct Command:
echo "Local: postgresql://postgres:postgres@localhost:54322/postgres"
echo "Staging: $STAGING_DB_URL"
echo "Production: $PROD_DB_URL"
What it does:
- Displays configured database URLs
- Helps verify environment setup
18. Delete Migration Records
With Makefile:
make delete-migrations
# Type 'yes' to confirm
Direct Command:
# View current migrations
psql "postgresql://postgres:postgres@localhost:54322/postgres" -c "SELECT * FROM supabase_migrations.schema_migrations;"
# Delete all migration records
psql "postgresql://postgres:postgres@localhost:54322/postgres" -c "TRUNCATE supabase_migrations.schema_migrations;"
# Or delete specific migration
psql "postgresql://postgres:postgres@localhost:54322/postgres" -c "DELETE FROM supabase_migrations.schema_migrations WHERE version = '20251004123045';"
What it does:
- Clears migration history from database
Use case: When migration history is corrupted or needs cleanup.
Real-World Workflows
Workflow 1: Feature Development
With Makefile:
# 1. Start with fresh staging data
make local-import-staging
# 2. Develop your feature locally
# Make database changes via Supabase Studio or SQL
# 3. Create migration
make local-create-migration
# Enter: add_comments_feature
# 4. Test locally
npm run dev
# 5. Deploy to staging
make staging-apply-migration
# 6. After QA approval, deploy to production
make prod-apply-migration
Direct Commands:
# 1. Import staging data
STAGING_DB_URL="postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres"
npx supabase db reset --local
npx supabase db dump --db-url "$STAGING_DB_URL" --schema public > staging_schema.sql
npx supabase db dump --db-url "$STAGING_DB_URL" --data-only -f staging_data.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < staging_schema.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < staging_data.sql
# 2. Develop feature (make changes)
# 3. Create migration
npx supabase db diff -f add_comments_feature
# 4. Test locally
npm run dev
# 5. Deploy to staging
npx supabase db push --db-url "$STAGING_DB_URL"
# 6. Deploy to production
PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
npx supabase db push --db-url "$PROD_DB_URL"
Workflow 2: Hotfix in Production
With Makefile:
# 1. Import production data locally
make local-import-production
# 2. Reproduce and fix the issue
# 3. Create migration for the fix
make local-create-migration
# Enter: hotfix_user_permissions
# 4. Test thoroughly locally
# 5. Apply directly to production (after approval)
make prod-apply-migration
Direct Commands:
# 1. Import production data
PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
npx supabase db reset --local
npx supabase db dump --db-url "$PROD_DB_URL" --schema public > prod_schema.sql
npx supabase db dump --db-url "$PROD_DB_URL" --data-only -f prod_data.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < prod_schema.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < prod_data.sql
# 2. Fix the issue
# 3. Create migration
npx supabase db diff -f hotfix_user_permissions
# 4. Test thoroughly
# 5. Apply to production
npx supabase db push --db-url "$PROD_DB_URL"
Workflow 3: Environment Synchronization
With Makefile:
# Sync staging to production
make prod-apply-staging-migration
# Or sync production back to staging
make staging-reset
make prod-get-migration
make staging-apply-migration
Direct Commands:
# Sync staging to production
STAGING_DB_URL="postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres"
PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
# Dump staging
npx supabase db dump --db-url "$STAGING_DB_URL" --schema public > staging_to_prod.sql
# Apply to production
psql "$PROD_DB_URL" < staging_to_prod.sql
# Or sync production to staging
npx supabase db dump --db-url "$PROD_DB_URL" --schema public > prod_to_staging.sql
psql "$STAGING_DB_URL" < prod_to_staging.sql
Workflow 4: Debugging with Real Data
With Makefile:
# Import production data for investigation
make local-import-production
# Investigate the issue
# No changes needed? Just clean up
make local-reset
Direct Commands:
# Import production data
PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
npx supabase db reset --local
npx supabase db dump --db-url "$PROD_DB_URL" --schema public > prod_schema.sql
npx supabase db dump --db-url "$PROD_DB_URL" --data-only -f prod_data.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < prod_schema.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < prod_data.sql
# Investigate
# Clean up
npx supabase db reset --local
Script Deep Dive (For Makefile Approach)
Note: This section is only relevant if you're using the Makefile approach with helper scripts. If you're using direct commands, skip this section.
apply-to-local.sh
This script intelligently applies SQL files to your local database:
Features:
- Accepts multiple SQL files as arguments
- Defaults to `staging_schema.sql` and `staging_data.sql`
- Prefers local `psql` if available
- Falls back to Docker exec if `psql` isn't installed
- Applies files in order with error handling
Usage examples:
# Apply default files
./scripts/apply-to-local.sh
# Apply specific files in order
./scripts/apply-to-local.sh schema.sql data.sql
# Apply single migration
./scripts/apply-to-local.sh supabase/migrations/20251004_my_migration.sql
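The script itself isn't reproduced in this guide; if you need to recreate it, a minimal sketch matching the behavior described above might look like this (the default file names, container filter, and function name are assumptions):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of scripts/apply-to-local.sh
LOCAL_DB_URL="postgresql://postgres:postgres@localhost:54322/postgres"

apply_sql_files() {
  # Default to the staging dump files when called with no arguments
  local files=("$@")
  [ ${#files[@]} -eq 0 ] && files=(staging_schema.sql staging_data.sql)

  local f
  for f in "${files[@]}"; do
    if [ ! -f "$f" ]; then
      echo "Skipping $f (not found)" >&2
      continue
    fi
    if command -v psql >/dev/null 2>&1; then
      # Prefer a local psql client; stop on the first SQL error
      psql "$LOCAL_DB_URL" -v ON_ERROR_STOP=1 < "$f" || return 1
    else
      # Fall back to exec-ing into the Supabase Postgres container
      local cid
      cid=$(docker ps -q --filter "name=supabase-db" | head -n 1)
      docker exec -i "$cid" psql -U postgres -d postgres -v ON_ERROR_STOP=1 < "$f" || return 1
    fi
    echo "Applied $f"
  done
}

apply_sql_files "$@"
```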
dump-staging-migrations.sh
Exports staging database schema to a timestamped file:
What it captures:
- Public schema structure
- Tables, columns, constraints
- Indexes and foreign keys
- Functions and triggers
Output example:
staging_migration_20251004_143052.sql
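A sketch of what this script likely does — the filename pattern matches the output example above, while the dump flags and function names are assumptions:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of scripts/dump-staging-migrations.sh
staging_dump_file() {
  # Timestamped output name, e.g. staging_migration_20251004_143052.sql
  echo "staging_migration_$(date +%Y%m%d_%H%M%S).sql"
}

dump_staging() {
  local out
  out=$(staging_dump_file)
  # Schema-only dump of the public schema from staging
  npx supabase db dump --db-url "$STAGING_DB_URL" --schema public > "$out" \
    && echo "Wrote $out"
}
```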
apply-migrations-to-prod.sh
Safely applies staging migrations to production:
Safety features:
- Finds latest staging migration file
- Displays file information
- Requires explicit confirmation
- Uses PostgreSQL transaction handling
- Reports success/failure clearly
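Reconstructed from the safety features listed above, a sketch of the core logic could look like this (the file-selection helper and `--single-transaction` usage are assumptions; the confirmation prompt is omitted for brevity):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of scripts/apply-migrations-to-prod.sh
latest_staging_dump() {
  # Newest staging_migration_*.sql by modification time, if any
  ls -t staging_migration_*.sql 2>/dev/null | head -n 1
}

apply_to_prod() {
  local f
  f=$(latest_staging_dump)
  if [ -z "$f" ]; then
    echo "No staging_migration_*.sql file found" >&2
    return 1
  fi
  echo "About to apply $f to PRODUCTION"
  # --single-transaction rolls everything back on the first error
  psql "$PROD_DB_URL" --single-transaction -v ON_ERROR_STOP=1 < "$f"
}
```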
reset-remote-db.sh
Resets remote databases with extreme caution:
How it works:
- Accepts DB URL and environment name
- Drops public schema entirely
- Recreates empty public schema
- Restores default permissions
- Uses Node.js pg client for reliability
Safety Features Explained
Confirmation Levels
| Safety Level | Confirmation Required | Use Case |
|---|---|---|
| ✅ Safe | None | Read-only or local operations |
| ⚠️ Medium | Type `yes` | Staging operations |
| 🔶 High | Type `RESET-STAGING` | Destructive staging operations |
| 🔴 Critical | Type `APPLY-TO-PRODUCTION` | Production modifications |
| 🔴🔴 Extreme | Type `DELETE-ALL-PRODUCTION-DATA` | Production destruction |
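The Makefile implements these gates with a typed-confirmation prompt. If you script your own workflow without the Makefile, the same guard is easy to reproduce in shell (the function name and wording are illustrative):

```shell
# Require the caller to type an exact confirmation phrase before proceeding.
confirm_or_abort() {
  local phrase="$1" answer
  printf 'Type %s to continue: ' "$phrase"
  read -r answer
  if [ "$answer" != "$phrase" ]; then
    echo "Aborted." >&2
    return 1
  fi
}

# Example: gate a production push behind the critical-level phrase
# confirm_or_abort "APPLY-TO-PRODUCTION" && npx supabase db push --db-url "$PROD_DB_URL"
```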
Best Practices
- Always Test Locally First
make local-apply-migration # Test here first
make staging-apply-migration # Then here
make prod-apply-migration # Finally here
- Review Migrations Before Applying
# Check what will be applied
cat supabase/migrations/20251004_my_migration.sql
- Backup Before Major Changes
# Makefile creates backups automatically during imports
# But you can also manually dump production
make prod-get-migration
- Version Control Everything
git add supabase/migrations/
git commit -m "feat: add user profiles table"
- Document Database Changes
- Use descriptive migration names
- Add comments in migration files
- Update README with schema changes
Troubleshooting
Common Issues
1. "Permission denied" on scripts
chmod +x scripts/*.sh
2. "psql: command not found"
# macOS
brew install postgresql
# Ubuntu/Debian
sudo apt-get install postgresql-client
# Windows
# Download from postgresql.org
3. "Connection refused" to local DB
# Check if Supabase is running
npx supabase status
# Start if not running
npx supabase start
4. "Docker container not found"
# Ensure Docker Desktop is running
# Restart Supabase
npx supabase stop
npx supabase start
5. Migration conflicts
# Reset local and reimport
make local-reset
make local-import-staging
# Or manually resolve conflicts in migration files
6. ".env variables not loading"
# Check file exists
ls -la .env
# Check variable names match exactly
make check-env
Advanced Tips
Custom Migrations
Create complex migrations with multiple statements:
-- supabase/migrations/20251004_custom.sql
BEGIN;
-- Add new table
CREATE TABLE user_profiles (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID REFERENCES auth.users(id),
display_name TEXT,
avatar_url TEXT,
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- Add RLS policies
ALTER TABLE user_profiles ENABLE ROW LEVEL SECURITY;
CREATE POLICY "Users can view all profiles"
ON user_profiles FOR SELECT
TO authenticated
USING (true);
CREATE POLICY "Users can update own profile"
ON user_profiles FOR UPDATE
TO authenticated
USING (auth.uid() = user_id);
-- Add indexes
CREATE INDEX idx_user_profiles_user_id ON user_profiles(user_id);
COMMIT;
Selective Data Import
Exclude sensitive tables when dumping, then apply the safer file with `apply-to-local.sh`:
# Dump without sensitive tables
npx supabase db dump \
--db-url "$STAGING_DB_URL" \
--exclude-table=user_secrets \
--exclude-table=payment_info \
-f staging_data_safe.sql
Automated Backups
Add to your deployment pipeline:
# Before production deployment
timestamp=$(date +%Y%m%d_%H%M%S)
npx supabase db dump \
--db-url "$PROD_DB_URL" \
-f "backups/prod_backup_${timestamp}.sql"
Quick Reference Table
| # | Command | Description | Safety | Confirmation |
|---|---|---|---|---|
| 1 | `local-import-staging` | Copy staging → local | ✅ | None |
| 2 | `local-import-production` | Copy production → local | ✅ | None |
| 3 | `local-create-migration` | Create new migration | ✅ | None |
| 4 | `local-get-migration` | Diff local to file | ✅ | None |
| 5 | `local-apply-migration` | Apply migrations locally | ✅ | None |
| 6 | `local-reset` | Reset local DB | ⚠️ | `yes` |
| 7 | `supabase-login` | CLI authentication | ✅ | None |
| 8 | `staging-get-migration` | Diff staging to file | ✅ | None |
| 9 | `staging-apply-migration` | Deploy to staging | ⚠️ | `yes` |
| 10 | `staging-reset` | Reset staging DB | 🔶 | `RESET-STAGING` |
| 11 | `prod-get-migration` | Diff production to file | ✅ | None |
| 12 | `prod-apply-migration` | Deploy to production | 🔴 | `APPLY-TO-PRODUCTION` |
| 13 | `prod-apply-staging-migration` | Promote staging → prod | 🔴 | `APPLY-TO-PRODUCTION` |
| 14 | `prod-reset` | Reset production DB | 🔴🔴 | `DELETE-ALL-PRODUCTION-DATA` |
| 15 | `check-deps` | Verify tools installed | ✅ | None |
| 16 | `clean` | Remove dump files | ✅ | None |
| 17 | `check-env` | Show DB URLs | ✅ | None |
| 18 | `delete-migrations` | Clear migration history | ⚠️ | `yes` |
Conclusion
This Makefile-based system provides a robust, safe, and efficient way to manage Supabase databases across multiple environments. Key benefits:
- Safety First: Multiple confirmation levels prevent accidents
- Automation: Reduces manual errors and saves time
- Clarity: Clear command names and workflows
- Flexibility: Supports complex migration scenarios
- Team-Friendly: Consistent commands across team members
Next Steps
- Set up your `.env` file with connection strings
- Make scripts executable: `chmod +x scripts/*.sh`
- Start with local operations to familiarize yourself
- Gradually adopt staging and production workflows
- Customize scripts to fit your specific needs
Direct Commands Reference (Complete List)
Below is a comprehensive reference of all commands you can use without the Makefile. These commands give you full control and can be used in any environment.
Setup & Authentication
# Install Supabase CLI
npm install supabase --save-dev
# Initialize Supabase
npx supabase init
# Start local Supabase
npx supabase start
# Stop local Supabase
npx supabase stop
# Check status
npx supabase status
# Login to Supabase
npx supabase login
# Check Supabase version
npx supabase --version
Local Database Operations
# Reset local database (applies all migrations from scratch)
npx supabase db reset --local
# Push local migrations to local database
npx supabase db push --local
# Create a new migration file
npx supabase db diff -f add_user_profiles
# Generate migration from local database changes
npx supabase db diff -f capture_changes --local
# Apply a specific migration file
psql "postgresql://postgres:postgres@localhost:54322/postgres" < supabase/migrations/20251004_your_migration.sql
# Dump local database schema
npx supabase db dump --local --schema public > local_schema.sql
# Dump local database data
npx supabase db dump --local --data-only -f local_data.sql
# Dump specific tables only
npx supabase db dump --local --table=users --table=posts > specific_tables.sql
Staging Database Operations
# Set your staging DB URL (do this once per session)
export STAGING_DB_URL="postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres"
# Dump staging schema
npx supabase db dump --db-url "$STAGING_DB_URL" --schema public > staging_schema.sql
# Dump staging data
npx supabase db dump --db-url "$STAGING_DB_URL" --data-only -f staging_data.sql
# Dump staging schema and data together
npx supabase db dump --db-url "$STAGING_DB_URL" > staging_full.sql
# Generate migration from staging database
npx supabase db diff -f migration_from_staging --db-url "$STAGING_DB_URL"
# Push local migrations to staging
npx supabase db push --db-url "$STAGING_DB_URL"
# Apply specific migration to staging
psql "$STAGING_DB_URL" < supabase/migrations/20251004_your_migration.sql
# Reset staging database (⚠️ drops all data)
psql "$STAGING_DB_URL" -c "DROP SCHEMA IF EXISTS public CASCADE; CREATE SCHEMA public; GRANT ALL ON SCHEMA public TO postgres; GRANT ALL ON SCHEMA public TO public;"
# Backup staging before changes
npx supabase db dump --db-url "$STAGING_DB_URL" > staging_backup_$(date +%Y%m%d_%H%M%S).sql
Production Database Operations
# Set your production DB URL (do this once per session)
export PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
# Dump production schema
npx supabase db dump --db-url "$PROD_DB_URL" --schema public > production_schema.sql
# Dump production data
npx supabase db dump --db-url "$PROD_DB_URL" --data-only -f production_data.sql
# Dump production schema and data together
npx supabase db dump --db-url "$PROD_DB_URL" > production_full.sql
# Generate migration from production database
npx supabase db diff -f migration_from_production --db-url "$PROD_DB_URL"
# Push local migrations to production (⚠️ CRITICAL)
npx supabase db push --db-url "$PROD_DB_URL"
# Apply specific migration to production (⚠️ CRITICAL)
psql "$PROD_DB_URL" < supabase/migrations/20251004_your_migration.sql
# Reset production database (⚠️⚠️⚠️ EXTREMELY DANGEROUS)
psql "$PROD_DB_URL" -c "DROP SCHEMA IF EXISTS public CASCADE; CREATE SCHEMA public; GRANT ALL ON SCHEMA public TO postgres; GRANT ALL ON SCHEMA public TO public;"
# Always backup production before changes!
npx supabase db dump --db-url "$PROD_DB_URL" > prod_backup_$(date +%Y%m%d_%H%M%S).sql
Import Database to Local
# Import staging to local
npx supabase db reset --local
npx supabase db dump --db-url "$STAGING_DB_URL" --schema public > staging_schema.sql
npx supabase db dump --db-url "$STAGING_DB_URL" --data-only -f staging_data.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < staging_schema.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < staging_data.sql
# Import production to local
npx supabase db reset --local
npx supabase db dump --db-url "$PROD_DB_URL" --schema public > production_schema.sql
npx supabase db dump --db-url "$PROD_DB_URL" --data-only -f production_data.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < production_schema.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < production_data.sql
# Import with error handling (stops on first error)
psql "postgresql://postgres:postgres@localhost:54322/postgres" -v ON_ERROR_STOP=1 < your_file.sql
Using Docker (if psql not installed locally)
# Get Docker container ID for Supabase DB
DB_CONTAINER_ID=$(docker ps -q --filter "label=com.docker.compose.service=db" --filter "name=supabase-db" | head -n 1)
# Check if container is found
echo $DB_CONTAINER_ID
# Apply SQL file via Docker
cat your_file.sql | docker exec -i $DB_CONTAINER_ID psql -U postgres -d postgres -v ON_ERROR_STOP=1
# Execute SQL command via Docker
docker exec -i $DB_CONTAINER_ID psql -U postgres -d postgres -c "SELECT * FROM your_table;"
# Dump database via Docker
docker exec $DB_CONTAINER_ID pg_dump -U postgres -d postgres > dump.sql
# Interactive psql session via Docker
docker exec -it $DB_CONTAINER_ID psql -U postgres -d postgres
Advanced Operations
# Dump specific tables only
npx supabase db dump --db-url "$STAGING_DB_URL" --table=users --table=posts -f specific_tables.sql
# Exclude specific tables from dump (useful for sensitive data)
npx supabase db dump --db-url "$STAGING_DB_URL" --exclude-table=sensitive_data --exclude-table=secrets --data-only -f safe_data.sql
# Dump with custom schema (not just public)
npx supabase db dump --db-url "$STAGING_DB_URL" --schema public --schema auth --schema storage > full_schema.sql
# Diff the local database against the linked remote project
npx supabase db diff --linked --schema public
# View migration history in database
psql "postgresql://postgres:postgres@localhost:54322/postgres" -c "SELECT * FROM supabase_migrations.schema_migrations ORDER BY version;"
# Manually add migration record (if migration was applied outside Supabase)
psql "postgresql://postgres:postgres@localhost:54322/postgres" -c "INSERT INTO supabase_migrations.schema_migrations (version) VALUES ('20251004123045');"
# Delete specific migration record
psql "postgresql://postgres:postgres@localhost:54322/postgres" -c "DELETE FROM supabase_migrations.schema_migrations WHERE version = '20251004123045';"
# Clear all migration records (⚠️ use with caution)
psql "postgresql://postgres:postgres@localhost:54322/postgres" -c "TRUNCATE supabase_migrations.schema_migrations;"
# View all migration records
psql "$STAGING_DB_URL" -c "SELECT version, inserted_at FROM supabase_migrations.schema_migrations ORDER BY version DESC LIMIT 10;"
Database Inspection Commands
```bash
# Test database connection
psql "$STAGING_DB_URL" -c "SELECT version();"
# List all databases
psql "$STAGING_DB_URL" -c "\l"
# List all tables in public schema
psql "$STAGING_DB_URL" -c "\dt public.*"
# Describe table structure
psql "$STAGING_DB_URL" -c "\d public.users"
# Get database size
psql "$STAGING_DB_URL" -c "SELECT pg_size_pretty(pg_database_size('postgres'));"
# Get table sizes (sorted by size)
psql "$STAGING_DB_URL" -c "
SELECT
  schemaname,
  tablename,
  pg_size_pretty(pg_total_relation_size(schemaname||'.'||tablename)) AS size,
  pg_total_relation_size(schemaname||'.'||tablename) AS bytes
FROM pg_tables
WHERE schemaname = 'public'
ORDER BY pg_total_relation_size(schemaname||'.'||tablename) DESC;
"
# Approximate row counts for all tables (n_live_tup is a planner estimate, not an exact count)
psql "$STAGING_DB_URL" -c "
SELECT
  schemaname,
  tablename,
  n_live_tup AS row_count
FROM pg_stat_user_tables
WHERE schemaname = 'public'
ORDER BY n_live_tup DESC;
"
# View active connections
psql "$STAGING_DB_URL" -c "
SELECT
  datname,
  usename,
  client_addr,
  state
FROM pg_stat_activity
WHERE datname = 'postgres';
"
# Check PostgreSQL version
psql "$STAGING_DB_URL" -c "SHOW server_version;"
```
### Utility Commands
```bash
# Check if required tools are installed
node --version
npm --version
npx --version
npx supabase --version
psql --version
docker --version
# Set environment variables temporarily
export LOCAL_DB_URL="postgresql://postgres:postgres@localhost:54322/postgres"
export STAGING_DB_URL="postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres"
export PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
# Display current environment variables
echo "Local: $LOCAL_DB_URL"
echo "Staging: $STAGING_DB_URL"
echo "Production: $PROD_DB_URL"
# Clean up dump files
rm -f staging_schema.sql staging_data.sql production_schema.sql production_data.sql staging_migration_*.sql
# Create backup directory
mkdir -p backups
# Backup with timestamp
npx supabase db dump --db-url "$PROD_DB_URL" > backups/prod_backup_$(date +%Y%m%d_%H%M%S).sql
```
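Timestamped backups accumulate quickly. The sketch below adds simple retention; `backup_prod`, `BACKUP_DIR`, `KEEP`, and the `DRY_RUN` switch are our own conventions, not Supabase features:

```bash
#!/usr/bin/env bash
# Hedged sketch: timestamped production backup plus retention of the newest N files.
backup_prod() {
  local dir="${BACKUP_DIR:-backups}" keep="${KEEP:-7}"
  mkdir -p "$dir"
  local file="$dir/prod_backup_$(date +%Y%m%d_%H%M%S).sql"
  if [ -n "${DRY_RUN:-}" ]; then
    touch "$file"   # stand-in for the real dump during a dry run
  else
    npx supabase db dump --db-url "$PROD_DB_URL" -f "$file"
  fi
  # The timestamp format sorts lexicographically, so the oldest files come first
  local backups=( "$dir"/prod_backup_*.sql )
  local excess=$(( ${#backups[@]} - keep ))
  if (( excess > 0 )); then
    rm -- "${backups[@]:0:excess}"
  fi
  echo "$file"
}

# Dry-run demo that never touches a database:
BACKUP_DIR=$(mktemp -d) KEEP=2 DRY_RUN=1 backup_prod
```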
### Complete Workflow Examples (No Makefile)
**Example 1: Create and Deploy a Migration**
```bash
# 1. Make changes locally (via Supabase Studio or direct SQL)
#    For example, create a table in Supabase Studio
# 2. Create migration from your changes
npx supabase db diff -f add_user_profiles
# 3. Review the generated migration file
cat supabase/migrations/*_add_user_profiles.sql
# 4. Apply to local (test it works)
npx supabase db push --local
# 5. Verify it worked
psql "postgresql://postgres:postgres@localhost:54322/postgres" -c "\d public.user_profiles"
# 6. Apply to staging
export STAGING_DB_URL="postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres"
npx supabase db push --db-url "$STAGING_DB_URL"
# 7. Test in staging environment (run your application tests)
# 8. Apply to production (after thorough testing)
export PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
npx supabase db push --db-url "$PROD_DB_URL"
```
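Step 8 is the dangerous one. Without the Makefile you can still reproduce its typed-confirmation gate with a small wrapper; `confirm_then_push` is our own helper, sketched under the assumption you run it interactively:

```bash
#!/usr/bin/env bash
# Hedged sketch of a typed-confirmation gate, like the Makefile's
# APPLY-TO-PRODUCTION prompt.
confirm_then_push() {
  local phrase="$1" url="$2" answer
  read -r -p "Type ${phrase} to continue: " answer
  if [ "$answer" != "$phrase" ]; then
    echo "Aborted." >&2
    return 1
  fi
  npx supabase db push --db-url "$url"
}

# Usage (interactive):
# confirm_then_push "APPLY-TO-PRODUCTION" "$PROD_DB_URL"
```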
**Example 2: Import Production Data Locally**
```bash
# 1. Set production URL
export PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
# 2. Reset local database
npx supabase db reset --local
# 3. Dump production schema
npx supabase db dump --db-url "$PROD_DB_URL" --schema public > prod_schema.sql
# 4. Dump production data
npx supabase db dump --db-url "$PROD_DB_URL" --data-only -f prod_data.sql
# 5. Apply schema to local
psql "postgresql://postgres:postgres@localhost:54322/postgres" < prod_schema.sql
# 6. Apply data to local
psql "postgresql://postgres:postgres@localhost:54322/postgres" < prod_data.sql
# 7. Verify import
psql "postgresql://postgres:postgres@localhost:54322/postgres" -c "SELECT COUNT(*) FROM users;"
# 8. Clean up dump files
rm prod_schema.sql prod_data.sql
```
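The eight steps above can be bundled into one function. This is a sketch under our own conventions - the localhost guard and `DRY_RUN` switch are not Supabase features:

```bash
#!/usr/bin/env bash
# Hedged sketch: import production schema and data into the local database.
LOCAL_DB_URL="postgresql://postgres:postgres@localhost:54322/postgres"

import_prod_to_local() {
  # Our own safety check: refuse to run unless the target really is local
  case "$LOCAL_DB_URL" in
    *localhost*|*127.0.0.1*) ;;
    *) echo "Refusing: target is not local" >&2; return 1 ;;
  esac
  if [ -n "${DRY_RUN:-}" ]; then
    echo "would import $PROD_DB_URL into $LOCAL_DB_URL"
    return 0
  fi
  npx supabase db reset --local
  npx supabase db dump --db-url "$PROD_DB_URL" --schema public > prod_schema.sql
  npx supabase db dump --db-url "$PROD_DB_URL" --data-only -f prod_data.sql
  psql "$LOCAL_DB_URL" < prod_schema.sql
  psql "$LOCAL_DB_URL" < prod_data.sql
  rm prod_schema.sql prod_data.sql
}

# Dry-run demo:
DRY_RUN=1 PROD_DB_URL="postgres://example" import_prod_to_local
```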
**Example 3: Sync Staging to Production**
```bash
# 1. Set environment variables
export STAGING_DB_URL="postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres"
export PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
# 2. Backup production first!
npx supabase db dump --db-url "$PROD_DB_URL" > prod_backup_$(date +%Y%m%d_%H%M%S).sql
# 3. Dump staging schema
npx supabase db dump --db-url "$STAGING_DB_URL" --schema public > staging_to_prod.sql
# 4. Review the file (IMPORTANT!)
cat staging_to_prod.sql
# or use a text editor:
# code staging_to_prod.sql
# 5. Apply to production (⚠️ CRITICAL OPERATION)
psql "$PROD_DB_URL" < staging_to_prod.sql
# 6. Verify production
psql "$PROD_DB_URL" -c "\dt public.*"
# 7. Clean up
rm staging_to_prod.sql
```
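Step 2's backup is easy to forget in the heat of a deploy. A hedged pre-flight check can gate the critical step; `require_fresh_backup` is our own helper, and the 60-minute window is an arbitrary choice:

```bash
#!/usr/bin/env bash
# Hedged sketch: refuse to proceed unless a recent, non-empty production
# backup exists in the current directory.
require_fresh_backup() {
  local newest
  # -mmin -60: modified within the last hour; -size +0c: not empty
  newest=$(find . -maxdepth 1 -name 'prod_backup_*.sql' -mmin -60 -size +0c | head -n 1)
  if [ -z "$newest" ]; then
    echo "No production backup newer than 60 minutes found - aborting." >&2
    return 1
  fi
  echo "Using backup: $newest"
}

# Gate the critical step on the check:
# require_fresh_backup && psql "$PROD_DB_URL" < staging_to_prod.sql
```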
**Example 4: Hotfix Production Issue**
```bash
# 1. Import production data locally to reproduce issue
export PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
npx supabase db reset --local
npx supabase db dump --db-url "$PROD_DB_URL" --schema public > prod_schema.sql
npx supabase db dump --db-url "$PROD_DB_URL" --data-only -f prod_data.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < prod_schema.sql
psql "postgresql://postgres:postgres@localhost:54322/postgres" < prod_data.sql
# 2. Reproduce the bug locally
# 3. Fix the issue (modify database via Studio or SQL)
# 4. Create migration
npx supabase db diff -f hotfix_user_permissions
# 5. Test the fix locally
npm run dev
# 6. Review migration file
cat supabase/migrations/*_hotfix_user_permissions.sql
# 7. Apply to production (after approval)
npx supabase db push --db-url "$PROD_DB_URL"
# 8. Verify fix in production
psql "$PROD_DB_URL" -c "SELECT * FROM your_affected_table LIMIT 5;"
# 9. Apply same fix to staging for consistency
export STAGING_DB_URL="postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres"
npx supabase db push --db-url "$STAGING_DB_URL"
```
### Environment Variables Setup

For easier command usage, create a `.env` file or add the variables to your shell profile.

**Option 1: Using a `.env` file**

Create a `.env` file in your project root:
```bash
# Local Environment
LOCAL_DB_URL=postgresql://postgres:postgres@localhost:54322/postgres
# Staging Environment
STAGING_DB_URL=postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres
STAGING_PROJECT_REF=your-staging-ref
# Production Environment
PROD_DB_URL=postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres
PROD_PROJECT_REF=your-production-ref
```
Then load it:
```bash
# Load the .env file (skips comment lines; simple values only)
export $(grep -v '^#' .env | xargs)
# Or, in a script, auto-export everything the file defines
set -a; source .env; set +a
```
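A slightly safer pattern wraps loading in a function that fails fast when a required URL is missing. `load_env` is our own helper; `set -a` auto-exports every variable the file assigns:

```bash
#!/usr/bin/env bash
# Hedged sketch: load .env if present, then fail fast on missing variables.
load_env() {
  if [ -f .env ]; then
    set -a        # auto-export every variable the file assigns
    . ./.env
    set +a
  fi
  # ${var:?message} aborts with the message when the variable is unset or empty
  : "${STAGING_DB_URL:?STAGING_DB_URL is not set}"
  : "${PROD_DB_URL:?PROD_DB_URL is not set}"
}
```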
**Option 2: Add to your shell profile**

Add to `~/.bashrc` or `~/.zshrc`:
```bash
# Supabase Database URLs
export LOCAL_DB_URL="postgresql://postgres:postgres@localhost:54322/postgres"
export STAGING_DB_URL="postgresql://postgres:YOUR_PASSWORD@db.YOUR_STAGING_REF.supabase.co:5432/postgres"
export PROD_DB_URL="postgresql://postgres.YOUR_PROD_REF:YOUR_PASSWORD@aws-1-ap-south-1.pooler.supabase.com:6543/postgres"
```
Reload your shell:
```bash
source ~/.bashrc
# or
source ~/.zshrc
```
### Tips for Direct Commands
- **Always use full DB URLs** - don't rely on shortcuts or aliases
- **Backup before destructive operations**:

  ```bash
  npx supabase db dump --db-url "$PROD_DB_URL" > backup_$(date +%Y%m%d).sql
  ```

- **Use transactions for multiple changes**:

  ```sql
  BEGIN;
  -- Your changes here
  ALTER TABLE users ADD COLUMN age INTEGER;
  UPDATE users SET age = 25 WHERE age IS NULL;
  COMMIT;
  -- Or ROLLBACK; if something goes wrong
  ```

- **Check migration history before applying**:

  ```bash
  psql "$DB_URL" -c "SELECT * FROM supabase_migrations.schema_migrations ORDER BY version DESC LIMIT 5;"
  ```

- **Review migration files before applying**:

  ```bash
  cat supabase/migrations/20251004_your_migration.sql
  ```

- **Use error handling**:

  ```bash
  psql "$DB_URL" -v ON_ERROR_STOP=1 < your_migration.sql
  ```

- **Test in local first, then staging, then production**:

  ```bash
  # Local
  npx supabase db push --local
  # Staging
  npx supabase db push --db-url "$STAGING_DB_URL"
  # Production (after testing)
  npx supabase db push --db-url "$PROD_DB_URL"
  ```

- **Create aliases for common commands** (add to your shell profile):

  ```bash
  alias sp-local="npx supabase db push --local"
  alias sp-staging="npx supabase db push --db-url \$STAGING_DB_URL"
  alias sp-prod="npx supabase db push --db-url \$PROD_DB_URL"
  alias sp-diff="npx supabase db diff -f"
  ```
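Several of these tips combine naturally into one wrapper. A hedged sketch: `safe_apply` is our own helper, while `--single-transaction` and `ON_ERROR_STOP` are real psql options that make the whole file apply atomically or not at all:

```bash
#!/usr/bin/env bash
# Hedged sketch: back up, then apply a SQL file atomically.
safe_apply() {
  local url="$1" file="$2"
  [ -s "$file" ] || { echo "No such SQL file (or empty): $file" >&2; return 1; }
  # Backup first (our own naming convention)
  npx supabase db dump --db-url "$url" > "backup_before_$(basename "$file")"
  # --single-transaction wraps the file in BEGIN/COMMIT;
  # ON_ERROR_STOP aborts (and rolls back) on the first error
  psql "$url" --single-transaction -v ON_ERROR_STOP=1 -f "$file"
}

# Usage:
# safe_apply "$STAGING_DB_URL" supabase/migrations/20251004_your_migration.sql
```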
### Quick Command Cheatsheet

| Task | Command |
|---|---|
| Start Supabase | `npx supabase start` |
| Stop Supabase | `npx supabase stop` |
| Check status | `npx supabase status` |
| Login | `npx supabase login` |
| Reset local | `npx supabase db reset --local` |
| Create migration | `npx supabase db diff -f name` |
| Apply locally | `npx supabase db push --local` |
| Apply to staging | `npx supabase db push --db-url "$STAGING_DB_URL"` |
| Apply to prod | `npx supabase db push --db-url "$PROD_DB_URL"` |
| Dump schema | `npx supabase db dump --db-url "$URL" --schema public > file.sql` |
| Dump data | `npx supabase db dump --db-url "$URL" --data-only -f file.sql` |
| Apply SQL | `psql "$DB_URL" < file.sql` |
| Execute SQL | `psql "$DB_URL" -c "SQL"` |
| View migrations | `psql "$DB_URL" -c "SELECT * FROM supabase_migrations.schema_migrations;"` |
---
## Conclusion
This guide provides two powerful approaches to managing Supabase databases across multiple environments:
### Choose Your Approach
**Use Makefile if you:**
- ✅ Want simplified, memorable commands
- ✅ Need built-in safety confirmations
- ✅ Prefer automation over manual control
- ✅ Work in a team with consistent workflows
**Use Direct Commands if you:**
- ✅ Don't have access to setup files
- ✅ Need maximum flexibility
- ✅ Integrate with CI/CD pipelines
- ✅ Prefer explicit control over each operation
### Key Benefits
**Safety First**:
- Multiple confirmation levels prevent accidents (with Makefile)
- Explicit commands reduce ambiguity (with direct commands)
**Automation**:
- Reduces manual errors and saves time
- Consistent workflows across team members
**Clarity**:
- Clear command names and purposes
- Easy to understand what each operation does
**Flexibility**:
- Supports complex migration scenarios
- Adaptable to different workflows
### Best Practices Summary
1. **Always Test Locally First**
   ```bash
   # Test here first
   npx supabase db push --local
   # Then staging
   npx supabase db push --db-url "$STAGING_DB_URL"
   # Finally production
   npx supabase db push --db-url "$PROD_DB_URL"
   ```
2. **Backup Before Major Changes**
   ```bash
   npx supabase db dump --db-url "$PROD_DB_URL" > backup_$(date +%Y%m%d).sql
   ```
3. **Review Migrations Before Applying**
   ```bash
   cat supabase/migrations/20251004_your_migration.sql
   ```
4. **Version Control Everything**
   ```bash
   git add supabase/migrations/
   git commit -m "feat: add user profiles table"
   git push
   ```
5. **Document Database Changes**
- Use descriptive migration names
- Add comments in migration files
- Update documentation with schema changes
6. **Use Transactions for Complex Changes**
   ```sql
   BEGIN;
   -- Multiple operations
   ALTER TABLE users ADD COLUMN status TEXT;
   UPDATE users SET status = 'active';
   COMMIT;
   ```
7. **Monitor Migration Status**
   ```bash
   psql "$DB_URL" -c "SELECT * FROM supabase_migrations.schema_migrations ORDER BY version DESC LIMIT 10;"
   ```
### Common Gotchas to Avoid
❌ **Don't skip environments** - Always test in local → staging → production order
❌ **Don't bypass confirmations** - They exist to prevent disasters
❌ **Don't forget backups** - Especially before production changes
❌ **Don't commit sensitive data** - Never commit `.env` files or connection strings
❌ **Don't apply untested migrations** - Review and test thoroughly
❌ **Don't ignore migration conflicts** - Resolve them properly
❌ **Don't modify existing migrations** - Create new ones instead
### Troubleshooting Quick Reference
| Problem | Solution |
|---------|----------|
| Permission denied on scripts | `chmod +x scripts/*.sh` |
| psql not found | Install PostgreSQL client tools |
| Connection refused | Check if Supabase is running: `npx supabase status` |
| Docker container not found | Ensure Docker Desktop is running |
| Migration conflicts | Reset and reimport: `npx supabase db reset --local` |
| .env not loading | Check file exists and variable names match |
| Slow migrations | Check database size and indexes |
| Authentication failed | Run `npx supabase login` |
### Next Steps
1. **Set up your environment**
- Install required tools
- Configure `.env` file (or export variables)
- Initialize Supabase locally
2. **Choose your approach**
- With Makefile: Set up project structure and scripts
- Direct commands: Bookmark this guide for reference
3. **Start with local operations**
- Practice creating and applying migrations
- Get comfortable with the workflow
4. **Gradually adopt staging workflow**
- Test deployments in staging first
- Build confidence with the process
5. **Implement production workflow**
- Always backup before changes
- Follow the testing pyramid: local → staging → production
6. **Customize for your needs**
- Adapt workflows to your team's requirements
- Add custom scripts or aliases
- Integrate with CI/CD pipelines
### Additional Resources
- **[Supabase CLI Documentation](https://supabase.com/docs/guides/cli)** - Official CLI reference
- **[PostgreSQL Documentation](https://www.postgresql.org/docs/)** - Database fundamentals
- **[Supabase Migration Guide](https://supabase.com/docs/guides/cli/local-development#database-migrations)** - Official migration docs
- **[Database Migration Best Practices](https://supabase.com/blog/database-migrations)** - Supabase blog
- **[GNU Make Manual](https://www.gnu.org/software/make/manual/)** - Makefile reference
### Getting Help
If you encounter issues:
1. **Check Supabase logs**:
   ```bash
   npx supabase logs
   ```
2. **Verify connection**:
   ```bash
   psql "$DB_URL" -c "SELECT version();"
   ```
3. **Check migration status**:
   ```bash
   psql "$DB_URL" -c "SELECT * FROM supabase_migrations.schema_migrations;"
   ```
4. **Review migration files**:
   ```bash
   ls -la supabase/migrations/
   ```
5. **Consult Supabase support**:
- Discord: [https://discord.supabase.com](https://discord.supabase.com)
- GitHub Discussions: [https://github.com/supabase/supabase/discussions](https://github.com/supabase/supabase/discussions)
- Documentation: [https://supabase.com/docs](https://supabase.com/docs)
---
## Final Thoughts
Database migrations don't have to be scary. With the right tools and workflows, you can safely manage changes across multiple environments with confidence.
**Remember the golden rules:**
1. 🧪 **Test thoroughly** - Local → Staging → Production
2. 💾 **Backup always** - Before any production change
3. 📝 **Document everything** - Future you will thank you
4. 🔒 **Review carefully** - Double-check before applying
5. 🚀 **Deploy confidently** - With proper testing and backups
Whether you use the Makefile approach for convenience or direct commands for flexibility, the most important thing is consistency and safety in your workflow.
**With great power comes great responsibility. Always verify which environment you're working with, especially when running production commands!** 🚀
---
*Happy migrating! Have questions or suggestions? Feel free to adapt this workflow to your specific needs!*
---
## Appendix: Command Comparison Table
### Side-by-Side Comparison
| Operation | Makefile | Direct Command |
|-----------|----------|----------------|
| **Import Staging** | `make local-import-staging` | `npx supabase db reset --local && npx supabase db dump --db-url "$STAGING_DB_URL" --schema public > staging_schema.sql && npx supabase db dump --db-url "$STAGING_DB_URL" --data-only -f staging_data.sql && psql "postgresql://postgres:postgres@localhost:54322/postgres" < staging_schema.sql && psql "postgresql://postgres:postgres@localhost:54322/postgres" < staging_data.sql` |
| **Create Migration** | `make local-create-migration` | `npx supabase db diff -f migration_name` |
| **Apply Locally** | `make local-apply-migration` | `npx supabase db push --local` |
| **Apply to Staging** | `make staging-apply-migration` | `npx supabase db push --db-url "$STAGING_DB_URL"` |
| **Apply to Production** | `make prod-apply-migration` | `npx supabase db push --db-url "$PROD_DB_URL"` |
| **Reset Local** | `make local-reset` | `npx supabase db reset --local` |
### Pros and Cons
| Aspect | Makefile | Direct Commands |
|--------|----------|-----------------|
| **Ease of Use** | ✅ Very simple | ⚠️ More complex |
| **Setup Required** | ⚠️ Yes (files needed) | ✅ No setup |
| **Safety** | ✅ Built-in confirmations | ⚠️ Manual checks |
| **Flexibility** | ⚠️ Fixed workflows | ✅ Full control |
| **CI/CD Integration** | ⚠️ Requires files | ✅ Easy integration |
| **Learning Curve** | ✅ Low | ⚠️ Medium |
| **Customization** | ⚠️ Requires editing files | ✅ Highly customizable |
| **Team Consistency** | ✅ Enforced | ⚠️ Manual |
---
**That's it! You're now equipped with everything you need to manage Supabase migrations like a pro!** 🎉