One of the core features of data-peek is supporting multiple database engines:
- PostgreSQL - My primary database
- MySQL - Still everywhere, especially in legacy systems
- Microsoft SQL Server - Enterprise environments
The challenge: each database has different drivers, APIs, SQL dialects, and quirks. How do you support them all without turning your codebase into spaghetti?
The answer: the Adapter Pattern.
The Problem
Without abstraction, multi-database support looks like this:
// DON'T DO THIS
async function executeQuery(connection, sql) {
  if (connection.dbType === 'postgresql') {
    const client = new pg.Client(connection)
    await client.connect()
    const result = await client.query(sql)
    await client.end()
    return result.rows
  } else if (connection.dbType === 'mysql') {
    const conn = await mysql.createConnection(connection)
    const [rows] = await conn.execute(sql)
    await conn.end()
    return rows
  } else if (connection.dbType === 'mssql') {
    await mssql.connect(connection)
    const result = await mssql.query(sql)
    return result.recordset
  }
}
Now multiply this by every operation: connect, query, execute, getSchemas, explain, getTableDDL...
You end up with:
- Massive if/else chains everywhere
- Duplicated logic
- Nightmare to add a new database
- Impossible to test properly
The Adapter Pattern
The solution is to define a common interface and implement it for each database:
// db-adapter.ts

/**
 * Database adapter interface - abstracts database-specific operations
 */
export interface DatabaseAdapter {
  /** Database type identifier */
  readonly dbType: DatabaseType

  /** Test connection */
  connect(config: ConnectionConfig): Promise<void>

  /** Execute a query and return results */
  query(config: ConnectionConfig, sql: string): Promise<AdapterQueryResult>

  /** Execute multiple SQL statements and return results for each */
  queryMultiple(
    config: ConnectionConfig,
    sql: string,
    options?: QueryOptions
  ): Promise<AdapterMultiQueryResult>

  /** Execute a statement (for INSERT/UPDATE/DELETE) */
  execute(
    config: ConnectionConfig,
    sql: string,
    params: unknown[]
  ): Promise<{ rowCount: number | null }>

  /** Execute multiple statements in a transaction */
  executeTransaction(
    config: ConnectionConfig,
    statements: Array<{ sql: string; params: unknown[] }>
  ): Promise<{ rowsAffected: number; results: Array<{ rowCount: number | null }> }>

  /** Fetch database schemas, tables, and columns */
  getSchemas(config: ConnectionConfig): Promise<SchemaInfo[]>

  /** Get query execution plan */
  explain(config: ConnectionConfig, sql: string, analyze: boolean): Promise<ExplainResult>

  /** Get table definition (reverse engineer DDL) */
  getTableDDL(config: ConnectionConfig, schema: string, table: string): Promise<TableDefinition>

  /** Get available sequences (PostgreSQL-specific) */
  getSequences(config: ConnectionConfig): Promise<SequenceInfo[]>

  /** Get custom types (enums, etc.) */
  getTypes(config: ConnectionConfig): Promise<CustomTypeInfo[]>
}
Now the rest of the application just uses this interface:
// Anywhere in the app
const adapter = getAdapter(connection)
const result = await adapter.query(connection, sql)
No if/else. No database-specific code leaking everywhere.
Implementing an Adapter
Each database gets its own adapter class. Here's a simplified PostgreSQL adapter:
// adapters/postgres-adapter.ts
import pg from 'pg'

export class PostgresAdapter implements DatabaseAdapter {
  readonly dbType = 'postgresql'

  async connect(config: ConnectionConfig): Promise<void> {
    // Opening and immediately closing a client validates the connection settings
    const client = new pg.Client({
      host: config.host,
      port: config.port,
      database: config.database,
      user: config.username,
      password: config.password,
      ssl: config.ssl ? { rejectUnauthorized: false } : false,
    })
    await client.connect()
    await client.end()
  }

  async query(config: ConnectionConfig, sql: string): Promise<AdapterQueryResult> {
    const client = new pg.Client(this.buildClientConfig(config))
    try {
      await client.connect()
      const result = await client.query(sql)
      return {
        rows: result.rows,
        fields: result.fields.map(f => ({
          name: f.name,
          dataType: this.mapOidToType(f.dataTypeID),
        })),
        rowCount: result.rowCount,
      }
    } finally {
      await client.end()
    }
  }

  async getSchemas(config: ConnectionConfig): Promise<SchemaInfo[]> {
    const client = new pg.Client(this.buildClientConfig(config))
    try {
      await client.connect()
      // Query information_schema for tables and columns
      const tablesResult = await client.query(`
        SELECT
          table_schema as schema,
          table_name as name,
          table_type as type
        FROM information_schema.tables
        WHERE table_schema NOT IN ('pg_catalog', 'information_schema')
        ORDER BY table_schema, table_name
      `)
      // ... fetch columns, primary keys, foreign keys
      return this.buildSchemaInfo(tablesResult.rows, columnsResult.rows, ...)
    } finally {
      await client.end()
    }
  }

  async explain(config: ConnectionConfig, sql: string, analyze: boolean): Promise<ExplainResult> {
    const explainSql = analyze
      ? `EXPLAIN (ANALYZE, FORMAT JSON) ${sql}`
      : `EXPLAIN (FORMAT JSON) ${sql}`
    const result = await this.query(config, explainSql)
    return {
      plan: result.rows[0]['QUERY PLAN'],
      durationMs: 0, // Extracted from ANALYZE output
    }
  }
}
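The query method above calls a mapOidToType helper that isn't shown. A minimal sketch, assuming you only care about the common built-in types (the OIDs come from PostgreSQL's pg_type catalog):

// Hypothetical sketch of the helper referenced above (lives inside PostgresAdapter).
// Maps common pg_type OIDs to readable names; anything else falls back to 'unknown'.
private mapOidToType(oid: number): string {
  const OID_TYPES: Record<number, string> = {
    16: 'boolean',
    20: 'bigint',
    23: 'integer',
    25: 'text',
    114: 'json',
    701: 'double precision',
    1043: 'varchar',
    1082: 'date',
    1114: 'timestamp',
    1184: 'timestamptz',
    2950: 'uuid',
    3802: 'jsonb',
  }
  return OID_TYPES[oid] ?? 'unknown'
}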
The MySQL adapter is similar but handles MySQL-specific quirks:
// adapters/mysql-adapter.ts
import mysql from 'mysql2/promise'

export class MySQLAdapter implements DatabaseAdapter {
  readonly dbType = 'mysql'

  async query(config: ConnectionConfig, sql: string): Promise<AdapterQueryResult> {
    const connection = await mysql.createConnection({
      host: config.host,
      port: config.port,
      database: config.database,
      user: config.username,
      password: config.password,
    })
    try {
      const [rows, fields] = await connection.execute(sql)
      return {
        rows: rows as Record<string, unknown>[],
        fields: fields?.map(f => ({
          name: f.name,
          dataType: this.mapMySQLType(f.type),
        })) || [],
        rowCount: Array.isArray(rows) ? rows.length : null,
      }
    } finally {
      await connection.end()
    }
  }

  async explain(config: ConnectionConfig, sql: string, analyze: boolean): Promise<ExplainResult> {
    // MySQL uses different EXPLAIN syntax
    const explainSql = analyze
      ? `EXPLAIN ANALYZE ${sql}`
      : `EXPLAIN FORMAT=JSON ${sql}`
    const result = await this.query(config, explainSql)
    return {
      plan: result.rows[0],
      durationMs: 0,
    }
  }
}
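The MSSQL adapter follows the same shape. Here's a simplified sketch of its query method using the mssql package (a sketch only: the config mapping is an assumption, and type metadata mapping is omitted):

// adapters/mssql-adapter.ts (simplified sketch)
import mssql from 'mssql'

export class MSSQLAdapter implements DatabaseAdapter {
  readonly dbType = 'mssql'

  async query(config: ConnectionConfig, sql: string): Promise<AdapterQueryResult> {
    const pool = new mssql.ConnectionPool({
      server: config.host, // the mssql package uses `server`, not `host`
      port: config.port,
      database: config.database,
      user: config.username,
      password: config.password,
    })
    try {
      await pool.connect()
      const result = await pool.request().query(sql)
      return {
        rows: (result.recordset ?? []) as Record<string, unknown>[],
        // result.recordset.columns maps column name -> metadata
        fields: Object.values(result.recordset?.columns ?? {}).map(c => ({
          name: c.name,
          dataType: 'unknown', // mapping MSSQL type metadata is omitted in this sketch
        })),
        rowCount: result.rowsAffected[0] ?? null,
      }
    } finally {
      await pool.close()
    }
  }
  // ... remaining DatabaseAdapter methods follow the same shape
}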
The Adapter Registry
A simple factory function returns the right adapter:
// Adapter instances (singletons)
const adapters: Record<DatabaseType, DatabaseAdapter> = {
  postgresql: new PostgresAdapter(),
  mysql: new MySQLAdapter(),
  mssql: new MSSQLAdapter(),
  sqlite: new PostgresAdapter(), // Placeholder
}

/**
 * Get the appropriate database adapter for a connection
 */
export function getAdapter(config: ConnectionConfig): DatabaseAdapter {
  const dbType = config.dbType || 'postgresql'
  const adapter = adapters[dbType]
  if (!adapter) {
    throw new Error(`Unsupported database type: ${dbType}`)
  }
  return adapter
}
Adding a new database is now trivial:
- Implement the DatabaseAdapter interface
- Add it to the registry
- Done.
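To make that concrete, here's a sketch of what a real SQLite adapter's query method could look like with better-sqlite3 (hypothetical: sqlite is still a placeholder in the registry above):

// adapters/sqlite-adapter.ts (hypothetical sketch)
import Database from 'better-sqlite3'

export class SqliteAdapter implements DatabaseAdapter {
  readonly dbType = 'sqlite'

  async query(config: ConnectionConfig, sql: string): Promise<AdapterQueryResult> {
    // better-sqlite3 is synchronous; the async signature just keeps the interface uniform
    const db = new Database(config.database)
    try {
      const stmt = db.prepare(sql)
      // Works for statements that return rows; INSERT/UPDATE would use stmt.run()
      const rows = stmt.all() as Record<string, unknown>[]
      return {
        rows,
        fields: stmt.columns().map(c => ({ name: c.name, dataType: c.type ?? 'unknown' })),
        rowCount: rows.length,
      }
    } finally {
      db.close()
    }
  }
  // ... remaining methods follow the same pattern
}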
The SQL Dialect Problem
The adapter pattern handles fetching data, but what about mutations?
When a user edits a row in the UI, I need to generate SQL:
-- PostgreSQL
UPDATE "users" SET "name" = $1 WHERE "id" = $2 RETURNING *
-- MySQL
UPDATE `users` SET `name` = ? WHERE `id` = ?
-- MSSQL
UPDATE [users] SET [name] = @p1 WHERE [id] = @p2
Same operation, different:
| Aspect | PostgreSQL | MySQL | MSSQL |
|---|---|---|---|
| Parameter placeholders | $1, $2, $3 | ?, ?, ? | @p1, @p2, @p3 |
| Identifier quotes | "table" | `table` | [table] |
| RETURNING clause | Yes | No | No (uses OUTPUT) |
SQL Dialect Configuration
I created a dialect system to handle these differences:
// sql-builder.ts
interface SqlDialect {
  /** Parameter placeholder format */
  parameterPlaceholder: (index: number) => string
  /** Identifier quote character */
  identifierQuote: string
  /** Whether RETURNING clause is supported */
  supportsReturning: boolean
}
const DIALECTS: Record<DatabaseType, SqlDialect> = {
  postgresql: {
    parameterPlaceholder: (i) => `$${i}`,
    identifierQuote: '"',
    supportsReturning: true
  },
  mysql: {
    parameterPlaceholder: () => '?',
    identifierQuote: '`',
    supportsReturning: false
  },
  sqlite: {
    parameterPlaceholder: () => '?',
    identifierQuote: '"',
    supportsReturning: true // SQLite 3.35+
  },
  mssql: {
    parameterPlaceholder: (i) => `@p${i}`,
    identifierQuote: '[',
    supportsReturning: false
  }
}
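One wrinkle: MSSQL's quoting is asymmetric ([users] opens with [ and closes with ]). The builder below relies on a quoteIdentifier helper; a minimal sketch that covers the bracket case plus escaping of embedded quote characters:

function quoteIdentifier(name: string, dialect: SqlDialect): string {
  const open = dialect.identifierQuote
  // MSSQL brackets are asymmetric; every other dialect closes with the same character
  const close = open === '[' ? ']' : open
  // Escape the closing character by doubling it: "wei""rd", [wei]]rd], `wei``rd`
  const escaped = name.split(close).join(close + close)
  return `${open}${escaped}${close}`
}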
Building Parameterized Queries
The SQL builder generates safe, parameterized queries:
function buildUpdate(
  operation: RowUpdate,
  context: EditContext,
  dialect: SqlDialect
): ParameterizedQuery {
  const params: unknown[] = []
  let paramIndex = 1
  const tableRef = buildTableRef(context, dialect)

  // SET clause
  const setClauses = operation.changes.map((change) => {
    const col = quoteIdentifier(change.column, dialect)
    const placeholder = dialect.parameterPlaceholder(paramIndex++)
    params.push(serializeValue(change.newValue, change.dataType))
    return `${col} = ${placeholder}`
  })

  // WHERE clause (primary keys)
  const whereClauses = operation.primaryKeys.map((pk) => {
    const col = quoteIdentifier(pk.column, dialect)
    const placeholder = dialect.parameterPlaceholder(paramIndex++)
    params.push(serializeValue(pk.value, pk.dataType))
    return `${col} = ${placeholder}`
  })

  let sql = `UPDATE ${tableRef} SET ${setClauses.join(', ')} WHERE ${whereClauses.join(' AND ')}`
  if (dialect.supportsReturning) {
    sql += ' RETURNING *'
  }

  return { sql, params }
}
Given this update operation:
{
  type: 'update',
  changes: [{ column: 'name', newValue: 'Alice', dataType: 'text' }],
  primaryKeys: [{ column: 'id', value: 42, dataType: 'integer' }]
}
The builder outputs:
// PostgreSQL
{ sql: 'UPDATE "users" SET "name" = $1 WHERE "id" = $2 RETURNING *', params: ['Alice', 42] }
// MySQL
{ sql: 'UPDATE `users` SET `name` = ? WHERE `id` = ?', params: ['Alice', 42] }
// MSSQL
{ sql: 'UPDATE [users] SET [name] = @p1 WHERE [id] = @p2', params: ['Alice', 42] }
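MSSQL has no RETURNING, but it can return the updated rows with an OUTPUT clause, which goes before WHERE. If the builder wanted parity there, it would emit something like:

-- MSSQL equivalent of RETURNING * (not currently generated)
UPDATE [users] SET [name] = @p1 OUTPUT INSERTED.* WHERE [id] = @p2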
Handling Special Types
Different databases have different data types. The builder handles serialization:
function serializeValue(value: unknown, dataType: string): unknown {
  if (value === null || value === undefined) {
    return null
  }
  // Handle JSON/JSONB - stringify objects
  if ((dataType === 'json' || dataType === 'jsonb') && typeof value === 'object') {
    return JSON.stringify(value)
  }
  // Handle arrays - pass the JS array through; the driver serializes it to PostgreSQL array syntax
  if (dataType.endsWith('[]') && Array.isArray(value)) {
    return value
  }
  // Handle booleans
  if (dataType === 'boolean' || dataType === 'bool') {
    return Boolean(value)
  }
  return value
}
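A few examples of what that serialization yields:

serializeValue({ theme: 'dark' }, 'jsonb')  // '{"theme":"dark"}'
serializeValue(1, 'boolean')                // true
serializeValue(undefined, 'text')           // null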
Preview vs. Execution
For the UI, I also need human-readable SQL (not parameterized):
function buildPreviewSql(
  operation: EditOperation,
  context: EditContext,
  dbType: DatabaseType
): string {
  const { sql, params } = buildQuery(operation, context, dbType)
  // Replace placeholders with actual values (for preview only)
  let preview = sql
  params.forEach((param, index) => {
    const placeholder = getPlaceholder(dbType, index)
    let displayValue: string
    if (param === null) {
      displayValue = 'NULL'
    } else if (typeof param === 'string') {
      displayValue = `'${param.replace(/'/g, "''")}'`
    } else if (typeof param === 'boolean') {
      displayValue = param ? 'TRUE' : 'FALSE'
    } else {
      displayValue = String(param)
    }
    preview = preview.replace(placeholder, displayValue)
  })
  return preview
}
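For the PostgreSQL update from earlier, the preview comes out as:

UPDATE "users" SET "name" = 'Alice' WHERE "id" = 42 RETURNING *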
This is only for display. Actual execution always uses parameterized queries to prevent SQL injection.
Transaction Support
For batch edits (multiple INSERTs/UPDATEs), transactions ensure atomicity:
async executeTransaction(
  config: ConnectionConfig,
  statements: Array<{ sql: string; params: unknown[] }>
): Promise<{ rowsAffected: number; results: Array<{ rowCount: number | null }> }> {
  const client = new pg.Client(this.buildClientConfig(config))
  try {
    await client.connect()
    await client.query('BEGIN')
    const results: Array<{ rowCount: number | null }> = []
    let totalRows = 0
    for (const stmt of statements) {
      const result = await client.query(stmt.sql, stmt.params)
      results.push({ rowCount: result.rowCount })
      totalRows += result.rowCount || 0
    }
    await client.query('COMMIT')
    return { rowsAffected: totalRows, results }
  } catch (error) {
    // Swallow rollback failures so the original error is the one that surfaces
    await client.query('ROLLBACK').catch(() => {})
    throw error
  } finally {
    await client.end()
  }
}
All edits succeed or none do.
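For example, a two-row batch edit translates to a single call (illustrative statements):

const adapter = getAdapter(connection)
const result = await adapter.executeTransaction(connection, [
  { sql: 'UPDATE "users" SET "name" = $1 WHERE "id" = $2', params: ['Alice', 42] },
  { sql: 'DELETE FROM "sessions" WHERE "user_id" = $1', params: [42] },
])
// result.rowsAffected is the total across both statements; on any failure, nothing is applied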
The Benefits
This architecture has paid off:
1. Clean Separation
Database-specific code lives in adapters only. The rest of the app is database-agnostic.
2. Easy Testing
I can mock the DatabaseAdapter interface for unit tests:
const mockAdapter: DatabaseAdapter = {
  dbType: 'postgresql',
  query: jest.fn().mockResolvedValue({ rows: [...], fields: [...] }),
  // ...
}
3. Type Safety
The interface enforces a consistent API. TypeScript catches mismatches at compile time.
4. Extensibility
Adding SQLite support means implementing one class. No changes to the rest of the codebase.
Lessons Learned
Start with the Interface
I designed the DatabaseAdapter interface before writing any implementation. This forced me to think about the common API across databases.
Don't Over-Abstract
I initially tried to make the SQL builder handle every edge case. That was a mistake. Some things (like EXPLAIN syntax) are so different that database-specific code is cleaner than complex abstractions.
Test the Dialects
Each dialect needs its own test suite:
describe('SQL Builder', () => {
  describe('PostgreSQL dialect', () => {
    it('uses $1, $2 placeholders', () => { ... })
    it('uses double quotes for identifiers', () => { ... })
    it('includes RETURNING clause', () => { ... })
  })

  describe('MySQL dialect', () => {
    it('uses ? placeholders', () => { ... })
    it('uses backticks for identifiers', () => { ... })
    it('omits RETURNING clause', () => { ... })
  })
})
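A filled-in case, assuming buildUpdate is exported and reusing the update operation from earlier (context here is a hypothetical EditContext pointing at the users table):

it('uses $1, $2 placeholders', () => {
  const { sql, params } = buildUpdate(operation, context, DIALECTS.postgresql)
  expect(sql).toBe('UPDATE "users" SET "name" = $1 WHERE "id" = $2 RETURNING *')
  expect(params).toEqual(['Alice', 42])
})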
What's Next
The adapter pattern handles query execution. But there's another SQL challenge: parsing.
When a user writes multiple statements in the editor:
SELECT * FROM users;
INSERT INTO logs (message) VALUES ('test');
UPDATE users SET name = 'Bob';
I need to split these into individual statements. Sounds easy... until you encounter:
- PostgreSQL dollar-quotes: $$ SELECT 'foo;bar' $$
- MySQL backtick identifiers: `table;name`
- Nested block comments: /* outer /* inner */ outer */
- String literals with semicolons: 'hello; world'
That's the subject of the next post.
Next up: Adding AI to a Desktop App: 5 Providers, 18+ Models, Zero Lock-in
GitHub: github.com/Rohithgilla12/data-peek
Web: datapeek.dev