Before you begin, make sure you have a PostgreSQL Docker image and container ready, along with your SQL dump file. If you don’t already have a PostgreSQL container set up, you’ll need to create one with proper credentials.
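If you need to create one, a minimal example using the official postgres image looks like this; the container name, user, password, and database name below are placeholders, so adjust them to your own setup:

$ docker run --name exampledb -e POSTGRES_USER=root -e POSTGRES_PASSWORD=secret -e POSTGRES_DB=exampledb -p 5432:5432 -d postgres

Then confirm the container is running: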
$ docker ps
This command lists all running containers. You should see your PostgreSQL container in the list, for instance one named exampledb.
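The output will look roughly like this (the container ID, image, and timestamps will of course differ on your machine):

CONTAINER ID   IMAGE      COMMAND                  CREATED       STATUS       PORTS      NAMES
a1b2c3d4e5f6   postgres   "docker-entrypoint.s…"   2 hours ago   Up 2 hours   5432/tcp   exampledb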
In my case, a friend shared a large database dump with me, and I needed to explore and understand its contents. Since the dataset was quite large, I decided to work with Docker because it gives me a fast, persistent, and reusable environment for loading and managing SQL files.
To copy the dump file into your Docker container, use the following command:
$ docker cp path_of_dump_file.sql <container_name>:/tmp/<name_stored.sql>
For instance:
$ docker cp ~/Downloads/example_db.sql exampledb:/tmp/example_db.sql
If successful, you’ll see a confirmation message like:
Successfully copied 23MB to exampledb:/tmp/example_db.sql
At this point, your SQL file is available inside the container, and you can use it to populate your PostgreSQL database, as shown below. Once the data is loaded, you can start inspecting, querying, and debugging it as needed.
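The loading step itself can be done by running psql inside the container against the copied file. This is a minimal sketch assuming the root user and exampledb database from the earlier examples; substitute your own credentials and file name:

$ docker exec exampledb psql -U root -d exampledb -f /tmp/example_db.sql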
If you ever need to export (dump) your PostgreSQL database from Docker back to your local machine, use the following command:
$ docker exec <container_name> pg_dump -U <db_user> <db_name> > file.sql
For instance:
$ docker exec exampledb pg_dump -U root exampledb > example_db_dump.sql
This will create an example_db_dump.sql file on your local system containing a full backup of your database.
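To confirm the dump was written, you can check the file on your local machine, for example:

$ ls -lh example_db_dump.sql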
Thank you.
Remember to keep learning and exploring new things.