
RAXXO Studios

Posted on • Originally published at raxxo.shop

The 8 Drizzle ORM Patterns I Use in Every Shopify Backend

  • Schema-first pgTable definitions and drizzle-kit migrations keep my Shopify backend predictable across 6 projects.

  • Type-safe joins with the relations API and partial selects shaved 380ms off my product list endpoint.

  • Prepared statements and conditional and/or builders handle hot-path webhook traffic without query rewrites.

  • Transactions and JSONB metafield mirroring keep Shopify webhooks idempotent under retry storms.

  • Drizzle index hints plus the Neon serverless driver dropped my cold start from 1.2s to 180ms.

I run a one-person studio. My Shopify backends sit behind a Postgres database (Neon, mostly) and need to handle webhooks, mirror catalog data, and serve a few admin dashboards. After shipping six of these in the past year, I keep reaching for the same Drizzle ORM patterns. They are boring on purpose. Boring code survives webhook retries at 3am.

I picked Drizzle ORM after a year on Prisma and three months on raw pg. Prisma was nice until I tried to deploy to Vercel and watched my cold start balloon past two seconds. Raw pg was fast but I kept reinventing helpers and chasing typos. Drizzle sits in the middle. The query builder is thin enough that I can read the SQL it generates, and the type inference is good enough that refactors do not turn into bug hunts.

Here are the 8 patterns I use in every Shopify backend, grouped into the four areas where they actually pay off.

Schema and migrations with Drizzle

Every project starts with a schema.ts file. I never write SQL by hand for tables. The schema file is the source of truth, and drizzle-kit generates migrations from it.


import { pgTable, text, timestamp, jsonb, integer, index } from "drizzle-orm/pg-core";

export const products = pgTable("products", {
  id: text("id").primaryKey(),
  shopifyId: text("shopify_id").notNull().unique(),
  handle: text("handle").notNull(),
  title: text("title").notNull(),
  priceCents: integer("price_cents").notNull(),
  metafields: jsonb("metafields").$type<Record<string, unknown>>().default({}),
  createdAt: timestamp("created_at").defaultNow().notNull(),
  updatedAt: timestamp("updated_at").defaultNow().notNull(),
}, (table) => ({
  handleIdx: index("products_handle_idx").on(table.handle),
  shopifyIdIdx: index("products_shopify_id_idx").on(table.shopifyId),
}));


Two things matter here. The $type<>() annotation on JSONB gives me typed access to metafields without a separate type file. The index definitions live in the schema, so I never forget them when I rebuild a database.
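To make that concrete, here is the kind of typed access the annotation enables. The `ProductMetafields` shape is illustrative, not from the schema above, which uses a generic record:

```typescript
// Hypothetical shape for the metafields JSONB column; in schema.ts this
// would be wired up as: jsonb("metafields").$type<ProductMetafields>().default({})
type ProductMetafields = {
  featured?: boolean;
  careGuide?: string;
};

// Rows come back with metafields already typed, so a typo like
// metafields.featurd is a compile error instead of a runtime undefined.
export function isFeatured(metafields: ProductMetafields): boolean {
  return metafields.featured === true;
}
```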

Migrations are one command. bunx drizzle-kit generate writes a SQL file. I read it, commit it, then bunx drizzle-kit migrate applies it. No magic. No silent column drops.


// drizzle.config.ts
import { defineConfig } from "drizzle-kit";

export default defineConfig({
  schema: "./src/db/schema.ts",
  out: "./drizzle",
  dialect: "postgresql",
  dbCredentials: { url: process.env.DATABASE_URL! },
  verbose: true,
  strict: true,
});


Strict mode catches the dumb stuff. If I rename a column without telling drizzle-kit it is a rename, it asks me before generating a destructive diff. That single prompt has saved me twice.

I keep migration files in the repo and run them in CI before the deploy step. Vercel's build command on every project ends with bunx drizzle-kit migrate so the database schema and the deployed code are never out of sync. If a migration fails, the deploy fails, and the previous version stays live. That alone removed a class of incidents I used to dread on Friday afternoons.
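As a sketch, the build script looks something like this; the `next build` step here is a stand-in for whatever framework build the project actually uses:

```json
{
  "scripts": {
    "build": "bunx drizzle-kit migrate && next build"
  }
}
```

Because `&&` short-circuits, a failed migration never reaches the build step, which is exactly the fail-closed behavior described above.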

Type-safe joins and partial selects

Shopify endpoints are heavy. A product page query that returns 30 columns when I need 4 is the easiest 200ms to recover.

I use partial select on every query that hits a list endpoint:


import { eq } from "drizzle-orm";

const rows = await db
  .select({
    id: products.id,
    handle: products.handle,
    title: products.title,
    priceCents: products.priceCents,
  })
  .from(products)
  .where(eq(products.handle, handle))
  .limit(1);


The return type is inferred from the select object, so rows[0].metafields would be a TypeScript error. That stops me from accidentally serializing a 40KB JSONB blob into an API response.

For joins, the relations API is where Drizzle earns its place. I declare the relation once and the query code stops looking like SQL.


import { relations } from "drizzle-orm";

export const productsRelations = relations(products, ({ many }) => ({
  variants: many(variants),
}));

export const variantsRelations = relations(variants, ({ one }) => ({
  product: one(products, {
    fields: [variants.productId],
    references: [products.id],
  }),
}));

const product = await db.query.products.findFirst({
  where: eq(products.handle, handle),
  with: {
    variants: {
      columns: { id: true, title: true, priceCents: true, sku: true },
    },
  },
});


The nested columns object scopes the partial select to the joined table. On a product detail page, this dropped my response payload from 18KB to 2.4KB and shaved 380ms off the slowest endpoint.

The bit I appreciate most is that there is no separate generated client. The schema file is the type source. Rename a column in schema.ts, run bunx drizzle-kit generate, and TypeScript flags every call site that still uses the old name. No prisma generate step, no stale client cached in node_modules, no fight with Vercel build cache.

Prepared statements and conditional builders for hot paths

Webhook handlers run thousands of times a day. Every one re-parses the same query plan unless I tell Postgres otherwise. Prepared statements fix that.


import { eq, sql } from "drizzle-orm";

const findProductByShopifyId = db
  .select()
  .from(products)
  .where(eq(products.shopifyId, sql.placeholder("shopifyId")))
  .prepare("find_product_by_shopify_id");

// In the webhook handler:
const product = await findProductByShopifyId.execute({ shopifyId });


The query plan is cached on the Postgres side. On my busiest webhook (orders/create on a flash sale day), prepared statements cut average query time from 14ms to 3ms. At 80,000 webhooks that morning, the maths added up.

For admin dashboards I need conditional filters. Status, date range, search string, all optional. The and/or builders compose cleanly without a string concatenation mess.


import { and, or, eq, gte, ilike, type SQL } from "drizzle-orm";

function buildFilters(input: { status?: string; since?: Date; search?: string }) {
  const filters: (SQL | undefined)[] = [];
  if (input.status) filters.push(eq(orders.status, input.status));
  if (input.since) filters.push(gte(orders.createdAt, input.since));
  if (input.search) {
    filters.push(or(
      ilike(orders.email, `%${input.search}%`),
      ilike(orders.name, `%${input.search}%`),
    ));
  }
  return filters.length ? and(...filters) : undefined;
}

const result = await db.select().from(orders).where(buildFilters(input));


If no filters apply, where gets undefined and Drizzle skips the clause. No special case. No empty WHERE 1=1 hack.

A small thing I learned the hard way. Always wrap user input that hits ilike in length checks. A blank search string with a %% pattern will scan the whole table and lock up the dashboard. I cap the search to 64 characters and skip the filter entirely if the trimmed input is empty. That one guard turned a 9-second admin page into a 40ms one when an intern pasted a paragraph by mistake.
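That guard is a few lines. A sketch of it as a standalone helper; the name `normalizeSearch` is mine, and it returns null when the filter should be skipped entirely:

```typescript
// Returns a search term safe to pass to ilike, or null when the
// filter should be skipped (empty or whitespace-only input).
export function normalizeSearch(raw: string | undefined): string | null {
  if (!raw) return null;
  const trimmed = raw.trim();
  if (trimmed.length === 0) return null;
  // Cap the pattern length so a pasted paragraph cannot trigger a full scan.
  return trimmed.slice(0, 64);
}
```

Inside buildFilters, the search branch then starts with `const term = normalizeSearch(input.search);` and only pushes the ilike conditions when `term` is non-null.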

Transactions and JSONB for webhook safety

Shopify webhooks retry. Sometimes they retry while the original is still running. If my handler does three writes and crashes after the second, the database ends up in a state I cannot reason about. Transactions solve this.


import { eq, sql } from "drizzle-orm";

await db.transaction(async (tx) => {
  await tx.insert(orders).values(orderRow).onConflictDoUpdate({
    target: orders.shopifyId,
    set: { status: orderRow.status, updatedAt: new Date() },
  });

  await tx.insert(orderItems).values(itemRows).onConflictDoNothing();

  await tx.update(inventory)
    .set({ quantity: sql`${inventory.quantity} - ${decrement}` })
    .where(eq(inventory.sku, sku));
});


Three writes, one atomic unit. If any step throws, all three roll back. Combined with onConflictDoUpdate, the same webhook can fire ten times and leave the database identical every time.

For metafields I mirror the Shopify shape into a JSONB column. Querying inside JSONB beats joining a separate table for any data Shopify owns.


import { sql } from "drizzle-orm";

const featured = await db
  .select()
  .from(products)
  .where(sql`${products.metafields}->>'featured' = 'true'`)
  .limit(20);


I add a GIN index on the JSONB column when a specific key gets queried often:


gin: index("products_metafields_gin").using("gin", table.metafields),


The Neon serverless driver is the last piece. On Vercel my routes are short-lived. A traditional Postgres pool times out or leaks connections. Neon's HTTP driver makes each query a fetch call, which fits serverless cleanly.


import { drizzle } from "drizzle-orm/neon-http";
import { neon } from "@neondatabase/serverless";
import * as schema from "./schema";

const sql = neon(process.env.DATABASE_URL!);
export const db = drizzle(sql, { schema });


That swap alone dropped my cold start from 1.2 seconds to 180ms on the dashboard project.

For long-running scripts (backfills, nightly reconciliation jobs) I still use the pg driver because the HTTP route does not stream and a single Neon HTTP call has a 60 second budget. Drizzle ships both adapters from the same package, so the switch is one import line. I keep db.ts for the request path and db-pool.ts for jobs, and the schema imports stay identical.


// db-pool.ts
import { drizzle } from "drizzle-orm/node-postgres";
import { Pool } from "pg";
import * as schema from "./schema";

const pool = new Pool({ connectionString: process.env.DATABASE_URL!, max: 4 });
export const dbPool = drizzle(pool, { schema });


Cap the pool at 4 connections per worker. Neon's free tier sits at 100 total. Four backfill workers times four connections leaves headroom for the live API. I learned this when a backfill on a Sunday took down my own dashboard.

Bottom Line

Drizzle is not the fastest ORM. It is the one I trust at 3am when a webhook is looping. Pick the patterns that match your traffic shape, write them once into a db/ folder, and stop reinventing them per project. The whole point of a boring stack is that it leaves brain cycles for the parts that actually move the business.
