Vince Fulco (It / It's)
DBOS-Cloud: Minimal Effort Change Data Capture (CDC) Tool

This is a simple DBOS example focusing on remote deployment to DBOS Cloud, their hosted solution with a generous free tier for devs.

The GitHub repo sets up: 1) a workflow with an HTTP API endpoint that receives events from Supabase whenever an INSERT is made to a table, 2) the same workflow archives the event payload to a DBOS-hosted Postgres table, and 3) additional utility endpoints to view and delete all events during development.

Prerequisites

  1. Make sure you have Node.js 21.x installed

  2. Sign up for DBOS Cloud (https://www.dbos.dev/dbos-cloud)

  3. Have a Supabase account and a table you are inserting into.

Getting Started

  1. Clone the repository and navigate to the project directory

  2. Install the dependencies

    Be sure not to commit or hard-code your secrets to a public repo! This setup is meant for local development and direct deployment to the DBOS Cloud service.

  3. To deploy to DBOS Cloud, log in with `npx dbos-cloud login` and follow the instructions: confirm that the code shown in the console matches the one in the browser, then the standard username/password login applies.

  4. Next, provision a database instance: `npx dbos-cloud db provision database-instance-name -U database-username`

  5. Register your app with the database instance: `npx dbos-cloud app register -d database-instance-name`

  6. To use secrets in DBOS, add your variables in the cli like this:

export PGPASSWORD=put-the-password-you-created-when-you-setup-the-remote-database-here

These are picked up at build time and substituted into the corresponding ${PGPASSWORD} placeholders in the dbos-config.yaml fields.
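For context, the password field in dbos-config.yaml references the environment variable like this (a hypothetical excerpt; the exact field names may differ between DBOS versions, but the ${...} interpolation is the relevant part):

```yaml
# Hypothetical dbos-config.yaml excerpt: ${...} placeholders are
# replaced with environment variables at build time.
database:
  hostname: your-instance.cloud.dbos.dev
  port: 5432
  username: database-username
  password: ${PGPASSWORD}
```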

You will notice on line #56 of operations.ts there is a @PostApi decorator, which sets up a URL with a randomly generated endpoint. This endpoint is the receiver of events from Supabase. I used a randomly generated path to protect the API (sometimes called security by obscurity).

**Note: this is not a recommended method! It was done as a quick-and-dirty dev hack. For real production use, you will want to use DBOS's authentication / authorization features.**
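If you want to generate your own random path segment, something like this works (a minimal sketch; `randomEndpoint` is a hypothetical helper, not part of the repo):

```typescript
import { randomBytes } from "node:crypto";

// Generate a URL-safe random path segment, similar in spirit to the
// "OBstAqG6qOv7cWXCqgg"-style stub used in this example.
function randomEndpoint(length = 20): string {
  // base64url yields only [A-Za-z0-9_-], so the result is safe in a URL path
  return randomBytes(length).toString("base64url").slice(0, length);
}
```

Run it once, paste the result into the @PostApi decorator, and keep it out of version control if you are treating it as a secret.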

Deploy with `npx dbos-cloud app deploy`. Take the base URL that DBOS returns when the deploy finishes and append the randomly generated endpoint you created (instructions are in the code).

It should look something like this: https://foo-bar-dbos.cloud.dbos.dev/OBstAqG6qOv7cWXCqgg. You'll use this in the next step.


The Supabase settings

On the Supabase side, set up the trigger to publish changes externally:

  1. Choose the project you want to use, then in the left-hand menu go to "Database" --> "Webhooks".

  2. Create a new webhook: give it a name, choose the table to watch, and select the events (in our case INSERT, but UPDATE and DELETE are also available). Make sure the HTTP request method is set to POST, then enter the URL created by DBOS Cloud plus the randomly generated stub. Hit "Create Webhook".

Staying in the Supabase console, go to the "Table Editor", choose your table, and add a new row with the required inputs.

Almost immediately, you should be able to query your DBOS-supplied URL, e.g. https://foo-bar-dbos.cloud.dbos.dev/getEventData, with Insomnia, Postman, or curl and see the event.

For reference, here is the format of a sample event body sent to DBOS-Cloud:

```json
{
    "type": "INSERT",
    "table": "fooData",
    "record": {
        "id": "tempId-e71m1dtxan6b5c0dqxkbhoez",
        "email": "joe@company.com",
        "lastName": "Blow",
        "firstName": "Joe",
        "created_at": "2024-07-25T06:03:10.792595+00:00",
        "updated_at": "2024-07-25T06:03:10.792595"
    },
    "schema": "public",
    "old_record": null
}
```

Record fields will vary based on how your table is structured. For simplicity, in this example I archive the JSON object whole rather than parsing the record further.
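In TypeScript, the envelope of that payload can be modeled with a small interface (a sketch based only on the sample above; `SupabaseEvent` and `summarize` are hypothetical names, and your `record` keys will differ with your table schema):

```typescript
// Shape of the Supabase webhook envelope, inferred from the sample payload.
// The record itself stays untyped since its fields depend on your table.
interface SupabaseEvent {
  type: "INSERT" | "UPDATE" | "DELETE";
  table: string;
  schema: string;
  record: Record<string, unknown>;
  old_record: Record<string, unknown> | null;
}

// Example helper: build a short log line without parsing the record,
// mirroring the "archive the whole object" approach used in this post.
function summarize(event: SupabaseEvent): string {
  return `${event.type} on ${event.schema}.${event.table}`;
}
```

Typing only the envelope keeps the handler schema-agnostic: new columns in your Supabase table flow through untouched into the archived JSON.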


Reference Docs (From Official Repo)


Resources to learn more --

Awesome-dbos

The first article in this series uses cron and Postmark email service provider.

The second article in this series aggregates and archives data from Supabase tables.
