Andrew Atkinson

Dump and Restore PostgreSQL Tables

Sometimes it's helpful to dump an exact copy of a database table so it can be loaded elsewhere. How can you do that?

In this example, we will dump and restore a single table and assume it contains no sensitive content.

pg_dump can be used to dump table contents to a file. By using compression, the file size can be kept small even for large tables.

Specify the output format with --format (or -F). The default is plain-text SQL (p); the custom format (c) used here produces a compressed archive that can be loaded with pg_restore. Refer to the pg_dump Docs for each of the options and values.
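
For quick reference, these are the format values pg_dump accepts (a brief summary; see the docs for the full details):

# p (plain)     plain-text SQL script; the default
# c (custom)    compressed archive, restored with pg_restore
# d (directory) directory-format archive, compressed by default
# t (tar)       tar-format archive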

Without compression, dumping a users table with 1,042,368 rows produced a 144 MB file; with compression it is 19 MB. Compression keeps improving, so results may be even better in newer PostgreSQL versions or with external compression programs.

In the example below, the database name is rideshare_development.

Run pg_dump as follows.

pg_dump \
    --format=c \
    --host localhost \
    --table users \
    rideshare_development > users_table.dump
$ du -h users_table.dump
19M    users_table.dump
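
pg_dump also lets you set the compression level explicitly, or you can dump plain SQL and compress it with an external program. A rough sketch of both approaches (flag support varies by PostgreSQL version, and the output file names here are just examples):

# Custom format with an explicit compression level
pg_dump \
    --format=c \
    --compress=9 \
    --host localhost \
    --table users \
    rideshare_development > users_table_z9.dump

# Plain-text SQL piped through gzip
pg_dump \
    --format=p \
    --host localhost \
    --table users \
    rideshare_development | gzip > users_table.sql.gz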

From the .dump file, the table and all of its rows can be recreated in a database where neither yet exists.

Create a database called temp_load to hold the restored table. The first command below uses the createdb command line program included with PostgreSQL; the second loads the contents of the dump file with pg_restore.

createdb temp_load

pg_restore --dbname temp_load users_table.dump
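
To confirm the restore worked, connect to temp_load (shown here with psql, though any PostgreSQL client works) and count the rows:

psql --dbname temp_load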
-- Confirm all the rows are there
SELECT count(*) FROM users;
  count
---------
 1042368

That's it! Note that this is only one of the ways to dump and restore table rows. Explore the PostgreSQL documentation for more options.
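
If temp_load was only created for this exercise, it can be removed afterwards with the dropdb program that ships with PostgreSQL:

dropdb temp_load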
