Joshua Matthews

Using Postgres Notifications To Enhance Dev Testing

tl;dr Link to Github repo

The Setup

Sometimes your database can feel like a black box, especially while your application is running and firing off call after call against it. With so many inserts and updates, it's hard to keep track of what's actually happening.

"What was the ID of that latest entry?"

"What does this foreign key point to again?"

All valid questions as you debug the latest changes to your application. The DB keeps its curtain drawn while you hunt for those new rows. Usually your front-end can surface some of them for you, but you won't always be able to view your DB changes easily through the front-end.

So what can you do?

The Investigation

In my case, our database is Postgres, and Postgres supports a nifty feature called pg_notify. I'm sure other DBs support something similar in one form or another, but for the purposes of this article I'll be focusing on my use case.

The basic usage of this function is as follows:

select pg_notify('channel', 'notification text');

You pass the channel you'd like to post the notification on, and the payload text you'd like to send on that channel.

So now we know how to notify, but how do we listen?

In Postgres you can subscribe to a channel with LISTEN channel. However, we don't want to listen from inside a psql session; we want to listen in our own listener application.
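
If you want to see the round trip in isolation first, you can do it with two psql sessions. Nothing here is specific to my setup; the channel name is arbitrary:

-- session 1: subscribe to a channel
listen my_channel;

-- session 2: send a notification on that same channel
select pg_notify('my_channel', 'hello from session 2');

-- back in session 1, psql reports the notification the next time it
-- talks to the server (run any command, even a bare ;), along the lines of:
-- Asynchronous notification "my_channel" with payload "hello from session 2" received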

Stomping on the Curtains

To setup our own notifications for our tables we just need to setup a reusable trigger.

create or replace function person_notif_fn() returns trigger as $f$
begin
    perform pg_notify('person', to_jsonb(new)::text);
    return new;
end $f$ language plpgsql;

Here we have a simple trigger function that performs a pg_notify call on our channel (person), taking the incoming new row and converting it to a JSON string for the payload. Then we return new back out.

Now that we have our trigger function, we need to create a trigger on our person table that calls it.

create trigger person_notif_tr after insert or update on person
for each row execute function person_notif_fn();

That's it! Now our database will notify us of any inserts or updates on our person table.
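
A quick way to sanity check the trigger is to listen and insert from the same psql session. The column names below are placeholders; use whatever your person table actually has:

-- subscribe to the channel the trigger notifies on
listen person;

-- insert a test row (columns are illustrative)
insert into person (first_name, last_name) values ('Ada', 'Lovelace');

-- psql should then report a notification on the person channel whose
-- payload is the new row serialized as JSON, something like:
-- {"id": 1, "first_name": "Ada", "last_name": "Lovelace"}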

The Hookup

Creating an application that can tap into this is pretty easy with certain libraries. I created my app in Zig, so I'll be using pg.zig, but you can use whatever you'd like.

First we establish our connection to our DB:

const db = try pg.Pool.init(std.heap.page_allocator, .{
    .size = 5,
    .connect = .{
        .port = 5432,
        .host = "127.0.0.1",
    },
    .auth = .{
        .username = "postgres",
        .password = "postgres",
        .database = "postgres",
        .timeout = 10_000,
    },
});

We've created our Pool; now we just need to create a listener and subscribe it to our channel.

const listener = db.newListener();
listener.listen("person");

Now our DB driver is listening on the person channel for notifications, so let's handle them as they come in.

// a small struct holding the fields we care about from each notification
// (shown here for completeness; name and shape it however you like)
const notification = struct {
    channel: []const u8,
    payload: []const u8,
};

// create a fixed buffer allocator for our JSON output
var out_buffer: [4096 * 6]u8 = undefined;
var fixed_alloc = std.heap.FixedBufferAllocator.init(&out_buffer);
// infinite loop to continually listen for updates
while (true) {
    // iterate over the notifications the listener has queued up
    while (listener.next()) |notif| {
        // reset the allocator so each notification starts with a clean buffer
        fixed_alloc.reset();
        // create a dynamic string backed by the fixed buffer
        var string_writer = std.ArrayList(u8).init(fixed_alloc.allocator());
        // copy the channel and payload into our own struct
        const info = notification{
            .channel = notif.channel,
            .payload = notif.payload,
        };
        // stringify the data
        try std.json.stringify(
            info,
            .{},
            string_writer.writer(),
        );
        // write out to buffer/websocket/etc.
    }
}

Here we've created an infinite loop to continually poll our listener. It's wrapped in an outer loop because the listener times out every now and then while polling for updates (there may be a way around this, but I couldn't find it quickly). Then we stringify the data, and from there we can send it off to a buffer, a websocket, or wherever else you want the data to go.

And that's pretty much it! From here we can let our imagination run wild with how we want to improve or customize on top of this.

The Next Level

Looking further into some of the niceties Postgres gives us with its built-in functions and the metadata in its schema tables, we can automate discovering every table in a DB and attaching triggers to them dynamically when we connect. This could streamline our listener application and turn it into a really powerful development tool.

I won't go into full detail because you can reference my Github project for the full implementation, but here is a simple snippet to grab all base tables in a Postgres DB.

SELECT table_schema || '.' || table_name
FROM information_schema.tables
WHERE table_type = 'BASE TABLE' AND table_schema NOT IN (
  'pg_catalog', 'information_schema', 'typeorm'
);

This query will return a list of tables in the format <table_schema>.<table_name> for the entire DB -- excluding pg_catalog, information_schema, and typeorm. Below is an example output.

public.accounts
public.mail_info
public.log_info
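
My project wires this up from the listener application, but here is a rough sketch of the idea in plain SQL, assuming an illustrative generic function name (table_notif_fn) and the _notif_tr suffix mirroring the person trigger from earlier: swap the hard-coded channel for TG_TABLE_NAME in one shared trigger function, then loop over the query above in a do block and attach a trigger to every table it finds.

-- one generic trigger function: the channel is simply the table's own name
create or replace function table_notif_fn() returns trigger as $f$
begin
    perform pg_notify(TG_TABLE_NAME, to_jsonb(new)::text);
    return new;
end $f$ language plpgsql;

-- attach it to every base table outside the excluded schemas
do $$
declare
    r record;
begin
    for r in
        select table_schema, table_name
        from information_schema.tables
        where table_type = 'BASE TABLE'
          and table_schema not in ('pg_catalog', 'information_schema', 'typeorm')
    loop
        execute format('drop trigger if exists %I on %I.%I',
            r.table_name || '_notif_tr', r.table_schema, r.table_name);
        execute format(
            'create trigger %I after insert or update on %I.%I
             for each row execute function table_notif_fn()',
            r.table_name || '_notif_tr', r.table_schema, r.table_name);
    end loop;
end $$;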

Now I'm going to do some hand-wavy magic and skip over setting up the pipeline that gets the Postgres metadata into my server and displays it on a webpage, since the purpose of this article was just to be an introduction to DB notifications.

Nevertheless, here is a demo of the final prototype of setting up notifications and grabbing the metadata of all tables in my DB.

The Conclusion

There's probably a lot of untapped potential in the tools you integrate with that you could be using to amplify your developer experience. Don't leave any stone unturned and always ask if your tool could do more, because it probably can.
