(This is part of my series on building scalable infrastructure. If you missed it, check out Part 1: Bridging Drizzle & TanStack).
If you use an ORM like Drizzle or Prisma, you eventually run into a wall: How do you safely convert dynamic URL query strings into type-safe SQL queries?
Imagine a user hits this endpoint from a data table on your frontend:
GET /api/users?filter[name][ilike]=%jack%&filter[age][gte]=18&sort=-createdAt
You need to convert that string into this Drizzle ORM execution:
db.select()
  .from(users)
  .where(
    and(
      ilike(users.name, "%jack%"),
      gte(users.age, 18)
    )
  )
  .orderBy(desc(users.createdAt));
Most developers write a giant, brittle switch statement for every single API endpoint. It's unscalable, error-prone, and a massive security risk if not sanitized properly.
In this post, we are going to look at the Dynamic Query Builder Architecture. This is the exact pattern I used to build the engine behind TableCraft, and you can use it to build your own generic API endpoints.
🧠 The Core Problem: String to AST
You cannot just pass user input into database functions. You need an intermediate layer. The architecture looks like this:
URL Query -> Parser -> Abstract Syntax Tree (AST) -> ORM Builder -> SQL
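Concretely, the AST in the middle doesn't need to be anything fancier than a list of column/operator/value triples plus a sort spec. Here is a hypothetical shape (the type names are illustrative, not TableCraft's actual internals):

```typescript
// Hypothetical intermediate representation for a parsed query.
interface FilterCondition {
  column: string;   // e.g. "name" — validated against the schema later
  operator: string; // e.g. "ilike" — validated against the operator map later
  value: unknown;   // e.g. "%jack%"
}

interface ParsedQuery {
  filters: FilterCondition[];
  sort?: { column: string; direction: "asc" | "desc" };
}

// The example URL from above, expressed as this AST:
const parsed: ParsedQuery = {
  filters: [
    { column: "name", operator: "ilike", value: "%jack%" },
    { column: "age", operator: "gte", value: 18 },
  ],
  sort: { column: "createdAt", direction: "desc" },
};
```

Everything downstream operates on this plain data structure, never on raw URL strings.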
Let's break down how to actually build this in TypeScript with Drizzle.
Step 1: The Operator Dictionary
The first step is mapping string-based operators from the URL (e.g., ilike, gte, eq) to actual Drizzle functions safely.
Instead of if/else hell, we use an Operator Dictionary.
import { eq, ne, gt, gte, lt, lte, like, ilike, inArray } from "drizzle-orm";
import type { AnyColumn } from "drizzle-orm";

// A secure map of allowed operations
export const operatorMap = {
  eq: (col: AnyColumn, val: any) => eq(col, val),
  ne: (col: AnyColumn, val: any) => ne(col, val),
  gt: (col: AnyColumn, val: any) => gt(col, val),
  gte: (col: AnyColumn, val: any) => gte(col, val),
  lt: (col: AnyColumn, val: any) => lt(col, val),
  lte: (col: AnyColumn, val: any) => lte(col, val),
  like: (col: AnyColumn, val: any) => like(col, val),
  ilike: (col: AnyColumn, val: any) => ilike(col, val),
  in: (col: AnyColumn, val: any) => inArray(col, Array.isArray(val) ? val : [val]),
} as const;
export type AllowedOperator = keyof typeof operatorMap;
If a user passes ?filter[name][DROP TABLE]=..., the engine immediately rejects it because DROP TABLE is not a valid key in operatorMap.
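To see that rejection mechanic in isolation, here is a minimal stand-in that swaps the Drizzle functions for plain predicates (`demoOperators` and `resolveOperator` are illustrative names, not part of Drizzle or TableCraft):

```typescript
// Stand-in for the Drizzle-backed map: plain predicates so the
// lookup-and-reject behavior can be demonstrated without a database.
const demoOperators = {
  eq: (a: unknown, b: unknown) => a === b,
  gte: (a: number, b: number) => a >= b,
} as const;

function resolveOperator(op: string) {
  // Unknown keys never reach a query — they fail the dictionary lookup.
  if (!(op in demoOperators)) {
    throw new Error(`Unsupported operator: ${op}`);
  }
  return demoOperators[op as keyof typeof demoOperators];
}

resolveOperator("gte");           // OK: returns the gte predicate
// resolveOperator("DROP TABLE") // throws: not a key in the dictionary
```

The user's string only ever selects from a closed allowlist; it is never interpolated into SQL.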
Step 2: Drizzle Schema Introspection
How do we know if users.age is actually a column in the database? We have to extract the metadata from the Drizzle table object dynamically.
Drizzle tables hold their column definitions internally. We can build a utility to extract them:
import { getTableColumns } from "drizzle-orm";
import type { AnyColumn } from "drizzle-orm";
import type { AnyPgTable } from "drizzle-orm/pg-core";

export function getColumnMap(table: AnyPgTable): Map<string, AnyColumn> {
  // Map each string key to the actual Drizzle column object
  const columns = getTableColumns(table);
  return new Map(Object.entries(columns));
}
Now, when the URL asks to filter by age, we check getColumnMap(users).has("age"). If it doesn't exist, we throw a 400 Bad Request. Zero SQL injection risk.
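Here is a sketch of how that lookup failure can surface as a 400, using a plain object map in place of real Drizzle columns (`BadRequestError` and `assertKnownColumn` are hypothetical helper names):

```typescript
// Hypothetical error type your HTTP layer can translate into a 400 response.
class BadRequestError extends Error {
  readonly status = 400;
}

// Reject any column name that isn't in the introspected schema map.
function assertKnownColumn(columnMap: Map<string, unknown>, name: string) {
  if (!columnMap.has(name)) {
    throw new BadRequestError(`Unknown column: ${name}`);
  }
  return columnMap.get(name);
}

// Stand-in for getColumnMap(users): keys are column names.
const demoColumns = new Map(Object.entries({ id: {}, age: {}, name: {} }));

assertKnownColumn(demoColumns, "age");       // OK
// assertKnownColumn(demoColumns, "passwd") // → 400 Bad Request
```

Because only values pulled from the map ever reach the query builder, a column name the schema doesn't know about can't touch the database.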
Step 3: The Filter Builder (The Magic)
Now we write the parser. It takes the raw JSON/Query object, validates it against the schema, and builds the Drizzle where array.
import { and, SQL } from "drizzle-orm";

function buildFilters(
  rawFilters: Record<string, Record<string, unknown>>,
  table: AnyPgTable
): SQL | undefined {
  const columnMap = getColumnMap(table);
  const conditions: SQL[] = [];

  for (const [columnName, operations] of Object.entries(rawFilters)) {
    const dbColumn = columnMap.get(columnName);
    // Security check: does the column exist?
    if (!dbColumn) continue;

    for (const [op, value] of Object.entries(operations)) {
      const dbOp = operatorMap[op as AllowedOperator];
      // Security check: is the operator allowed?
      if (!dbOp) continue;

      // Construct the Drizzle AST node
      conditions.push(dbOp(dbColumn, value));
    }
  }

  return conditions.length > 0 ? and(...conditions) : undefined;
}
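The example URL also carries `sort=-createdAt`, which the target query turns into `desc(users.createdAt)`. A minimal sketch of the companion sort parser, assuming the common convention that a leading `-` means descending:

```typescript
interface SortSpec {
  column: string;
  direction: "asc" | "desc";
}

// "-createdAt" → { column: "createdAt", direction: "desc" }
// "name"       → { column: "name",      direction: "asc"  }
function parseSort(raw: string): SortSpec {
  const descending = raw.startsWith("-");
  return {
    column: descending ? raw.slice(1) : raw,
    direction: descending ? "desc" : "asc",
  };
}
```

In the real builder you would validate `SortSpec.column` against `getColumnMap` exactly like a filter column, then hand the resolved column to Drizzle's `asc()`/`desc()`.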
Step 4: Connecting it to the ORM
With our buildFilters function, the API endpoint becomes incredibly clean. Whether there is 1 filter or 100, the code stays the same.
// Example inside an Express or Hono handler
const filters = parseUrlToFilters(req.url); // Extracts { name: { ilike: '%jack%' } }
const conditions = buildFilters(filters, schema.users);
const results = await db.select()
  .from(schema.users)
  .where(conditions) // Safely inject the dynamic AST
  .limit(50);
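`parseUrlToFilters` is a hypothetical helper in the snippet above. One possible sketch using the standard WHATWG `URL` API, matching the `filter[column][operator]=value` convention from the example URL:

```typescript
// Turn "?filter[name][ilike]=%jack%&filter[age][gte]=18" into
// { name: { ilike: "%jack%" }, age: { gte: "18" } }.
function parseUrlToFilters(url: string): Record<string, Record<string, string>> {
  const filters: Record<string, Record<string, string>> = {};
  // The base is only needed because req.url is usually path-relative.
  const params = new URL(url, "http://localhost").searchParams;

  for (const [key, value] of params) {
    const match = key.match(/^filter\[(\w+)\]\[(\w+)\]$/);
    if (!match) continue; // ignore sort, page, and other non-filter params
    const [, column, op] = match;
    (filters[column] ??= {})[op] = value;
  }
  return filters;
}
```

Note that query-string values arrive as strings (`"18"`, not `18`); a production engine should coerce and validate them based on the column's type before building conditions.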
🚀 Why This Matters
When you understand this pattern, you stop writing repetitive backend code. You stop writing custom endpoints for every table.
Instead, you write the Engine once, and it automatically handles filtering, sorting, and pagination for every table in your database.
This exact architecture is how I built TableCraft. It abstracts this entire engine so you can generate these dynamic, secure endpoints in one line of code:
const usersTable = defineTable(schema.users).search("email").sort("-createdAt");
If you want to see how deep this rabbit hole goes (handling relational joins recursively, cursor pagination, and input validation), dive into the open-source source code here:
👉 Explore the Architecture in TableCraft on GitHub (jacksonkasi1/TableCraft)
🔄 What's Next in the Series?
Now you know how to map URL queries to safe SQL ASTs. But what if you want to expose this logic across multiple frameworks like Express, Hono, and Next.js without rewriting the core engine?
In the next article, How to Build Framework-Agnostic Open Source Tools, I break down the Engine-Adapter pattern that makes TableCraft universally compatible.
Hit the Follow button so you don't miss the rest of the masterclass.