Bolaji Wahab


Generating nicely parsed schema files with pg_dump_schema_parser

Ever had to go through thousands of lines of a schema file generated by pg_dump? It can be tedious and taxing. A little background: 4-thousand-lines-of-code-in-a-single-file

Proper database design and development should be version controlled, with proper schema migrations in place, but sometimes we may want a schema dump for various reasons.

pg_dump is a tool supplied with PostgreSQL, used for taking backups and schema dumps of a single database.
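As a quick sketch of taking a schema-only dump from Python (the database name `mydb` and the output path are placeholders, and running it requires a reachable PostgreSQL server):

```python
import subprocess

# Placeholder connection details; adjust for your environment.
cmd = [
    "pg_dump",
    "--schema-only",        # dump object definitions only, no data
    "--no-owner",           # skip ownership commands for portability
    "--file", "schema.sql", # write the dump to schema.sql
    "mydb",                 # hypothetical database name
]

# Uncomment to actually run the dump:
# subprocess.run(cmd, check=True)
```

The same flags work directly on the command line; `--schema-only`, `--no-owner`, and `--file` are standard pg_dump options.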

I once had to look through 7000 lines of a schema file, and it wasn't fun at all; I lost interest along the way. This prompted me to search for a tool I could use to split the schema dump into smaller pieces. I stumbled upon a few Stack Overflow questions around it, notably the following:

While the above questions were limited to tables, I needed a tool that targets all object types.

After an exhaustive search, I couldn't find a tool that did the job properly, so I decided to write one. This gave birth to pg_schema_dump_parser.

pg_schema_dump_parser is a tool written in Python which runs pg_dump in the background and splits the generated schema dump into separate files, one per object, organized by object type. Have a look at how to use pg_schema_dump_parser.
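To illustrate the idea (this is a minimal sketch, not the tool's actual implementation), a dump can be grouped into per-object chunks by keying on the `-- Name: …; Type: …; Schema: …` comment headers that pg_dump emits before each object:

```python
import re
from collections import defaultdict

# pg_dump precedes each object with a comment header of this shape.
HEADER = re.compile(r"^-- Name: (?P<name>.+?); Type: (?P<type>.+?); Schema:")

def split_schema(dump_text):
    """Group a pg_dump schema dump into chunks keyed by (object type, object name)."""
    sections = defaultdict(list)
    current = None
    for line in dump_text.splitlines():
        m = HEADER.match(line)
        if m:
            current = (m.group("type"), m.group("name"))
        if current:
            sections[current].append(line)
    return {key: "\n".join(lines) for key, lines in sections.items()}

# A tiny hand-written sample in pg_dump's output format:
sample = """--
-- Name: users; Type: TABLE; Schema: public; Owner: app
--

CREATE TABLE public.users (
    id integer NOT NULL
);

--
-- Name: users_pkey; Type: CONSTRAINT; Schema: public; Owner: app
--

ALTER TABLE ONLY public.users
    ADD CONSTRAINT users_pkey PRIMARY KEY (id);
"""

sections = split_schema(sample)
```

From here, each chunk could be written to its own file, e.g. `tables/users.sql`, which is the kind of layout the tool produces.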

Having a schema dump that is nicely parsed into smaller pieces can greatly improve developer experience. Imagine one developer having to go through a 7000-line schema dump just to find the definition of a particular table, while another simply opens the file containing the definition of the table of interest. The second developer has the nicer experience and finds the table's definition in less time.
