PostgreSQL isn't just a database — it's a platform. With PostgREST, you can instantly turn any Postgres database into a RESTful API. Zero backend code, automatic CRUD, filtering, pagination, and JWT auth — all from your existing tables.
Why PostgREST?
- Instant API from any PostgreSQL database
- No backend code — the API is auto-generated from your schema
- Filtering, sorting, pagination built-in via URL parameters
- Row-level security using PostgreSQL policies
- Used by Supabase — it powers their entire API layer
Getting Started
# Install PostgREST
brew install postgrest # macOS
# or download from github.com/PostgREST/postgrest
# Create a config file
cat > postgrest.conf << 'EOF'
db-uri = "postgres://user:pass@localhost:5432/mydb"
db-schemas = "public"
db-anon-role = "web_anon"
server-port = 3000
EOF
# Start the API server
postgrest postgrest.conf
Now your tables are available as REST endpoints:
# List all products
curl -s http://localhost:3000/products | jq '.[0:3]'
# Filter by category
curl -s "http://localhost:3000/products?category=eq.electronics&price=lt.100" | jq .
# Pattern matching (case-insensitive LIKE; for true full-text search PostgREST offers the fts operator on tsvector columns)
curl -s "http://localhost:3000/products?name=ilike.*phone*" | jq .
# Pagination
curl -s "http://localhost:3000/products?limit=10&offset=20&order=price.desc" | jq .
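Besides limit/offset in the URL, PostgREST also paginates via the Range request header, and Prefer: count=exact asks it to report the total row count in the Content-Range response header. A stdlib-only sketch of building and parsing those headers; the helper names (range_headers, parse_content_range) are my own, not part of any client library:

```python
# Sketch: build PostgREST pagination headers and parse the Content-Range reply.
# range_headers / parse_content_range are hypothetical helpers for illustration.

def range_headers(offset, per_page, want_count=True):
    """Headers requesting rows [offset, offset + per_page), optionally with an exact total."""
    headers = {"Range-Unit": "items", "Range": f"{offset}-{offset + per_page - 1}"}
    if want_count:
        headers["Prefer"] = "count=exact"  # PostgREST then includes the total in Content-Range
    return headers

def parse_content_range(value):
    """Parse a Content-Range like '0-9/57' into (first, last, total); total may be '*'."""
    span, _, total = value.partition("/")
    first, _, last = span.partition("-")
    return int(first), int(last), None if total == "*" else int(total)
```

Exact counts run a COUNT(*) per request, so prefer count=planned or count=estimated on large tables.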
Python Client
import requests

class PostgrestClient:
    def __init__(self, base_url, token=None):
        self.base_url = base_url
        self.headers = {'Content-Type': 'application/json'}
        if token:
            self.headers['Authorization'] = f'Bearer {token}'

    def select(self, table, columns='*', filters=None, order=None, limit=None):
        params = {'select': columns}
        if filters:
            params.update(filters)
        if order:
            params['order'] = order
        if limit:
            params['limit'] = str(limit)
        resp = requests.get(f"{self.base_url}/{table}", params=params, headers=self.headers)
        return resp.json()

    def insert(self, table, data):
        resp = requests.post(f"{self.base_url}/{table}", json=data,
                             headers={**self.headers, 'Prefer': 'return=representation'})
        return resp.json()

    def update(self, table, filters, data):
        resp = requests.patch(f"{self.base_url}/{table}", params=filters, json=data,
                              headers={**self.headers, 'Prefer': 'return=representation'})
        return resp.json()

    def delete(self, table, filters):
        resp = requests.delete(f"{self.base_url}/{table}", params=filters, headers=self.headers)
        return resp.status_code == 204
# Usage
db = PostgrestClient('http://localhost:3000')

# Insert
new_product = db.insert('products', {
    'name': 'Widget Pro',
    'category': 'tools',
    'price': 29.99,
    'in_stock': True
})
print(f"Created: {new_product}")

# Query with filters
results = db.select('products',
                    filters={'category': 'eq.electronics', 'price': 'lt.500'},
                    order='price.asc',
                    limit=10)
for p in results:
    print(f"{p['name']:30s} ${p['price']:.2f}")
Advanced Filtering
# Comparison operators
curl "http://localhost:3000/products?price=gte.10&price=lte.100" # Between
curl "http://localhost:3000/products?category=in.(electronics,tools)" # IN list
curl "http://localhost:3000/products?name=ilike.*pro*" # Case-insensitive LIKE
curl "http://localhost:3000/products?tags=cs.{python,api}" # Array contains
curl "http://localhost:3000/products?metadata->>color=eq.red" # JSON field
# Logical operators
curl "http://localhost:3000/products?or=(price.lt.10,category.eq.free)" # OR
curl "http://localhost:3000/products?and=(price.gt.50,in_stock.is.true)" # AND
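The column=op.value syntax above is regular enough to generate. A small sketch (my own helpers, not part of PostgREST or any client library) that turns Python tuples into filter query parameters, including an or= group:

```python
# Sketch: compose PostgREST filter query parameters from Python values.
# build_filters / or_group are hypothetical helpers for illustration.

def build_filters(**conditions):
    """Map column=(op, value) pairs to PostgREST's column=op.value syntax."""
    return {col: f"{op}.{value}" for col, (op, value) in conditions.items()}

def or_group(*conditions):
    """Render an or=(col.op.value,...) parameter from (col, op, value) triples."""
    rendered = ",".join(f"{col}.{op}.{value}" for col, op, value in conditions)
    return {"or": f"({rendered})"}
```

The resulting dicts can be passed straight to the filters argument of the PostgrestClient.select method above.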
Row-Level Security
-- Create roles (each role also needs USAGE on the schema it queries)
CREATE ROLE web_anon NOLOGIN;
GRANT USAGE ON SCHEMA public TO web_anon;
GRANT SELECT ON products TO web_anon;

CREATE ROLE authenticated NOLOGIN;
GRANT USAGE ON SCHEMA public TO authenticated;
GRANT ALL ON products TO authenticated;

-- Row-level security: users see only their own orders
ALTER TABLE orders ENABLE ROW LEVEL SECURITY;
CREATE POLICY orders_policy ON orders
  USING (user_id = (current_setting('request.jwt.claims', true)::json ->> 'sub'));
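For the authenticated role to take effect, requests must carry a JWT signed with the jwt-secret from postgrest.conf (a key not shown in the config above); PostgREST switches to the role named in the token's role claim. A stdlib-only sketch of minting such an HS256 token; in practice you would use a library like PyJWT, and the secret below is a placeholder:

```python
import base64, hashlib, hmac, json

def b64url(data: bytes) -> str:
    """Base64url without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(claims: dict, secret: str) -> str:
    """Mint an HS256 JWT; PostgREST assumes the role named in the 'role' claim."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{payload}.{b64url(sig)}"

# Placeholder secret for illustration only -- must match jwt-secret in postgrest.conf
token = make_jwt({"role": "authenticated", "sub": "user-42"}, "a-32-char-demo-secret-not-for-prod")
```

Pass the token as a Bearer header, e.g. PostgrestClient('http://localhost:3000', token=token); the sub claim is what the row-level policy above compares against user_id.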
Database Functions as API Endpoints
-- Create a function
CREATE OR REPLACE FUNCTION search_products(search_term text, min_price numeric DEFAULT 0)
RETURNS SETOF products AS $$
SELECT * FROM products
WHERE (name ILIKE '%' || search_term || '%' OR description ILIKE '%' || search_term || '%')
AND price >= min_price
ORDER BY price ASC;
$$ LANGUAGE SQL;
# Call via API
curl -s "http://localhost:3000/rpc/search_products" \
-X POST -H "Content-Type: application/json" \
-d '{"search_term": "wireless", "min_price": 20}' | jq .
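Calling /rpc/ endpoints from Python is just a POST with a JSON body. A stdlib sketch using urllib; the build_rpc_request helper is my own, for illustration:

```python
import json
import urllib.request

def build_rpc_request(base_url, function, args):
    """Prepare a POST to PostgREST's /rpc/<function> endpoint with JSON arguments."""
    url = f"{base_url.rstrip('/')}/rpc/{function}"
    return urllib.request.Request(
        url,
        data=json.dumps(args).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_rpc_request("http://localhost:3000", "search_products",
                        {"search_term": "wireless", "min_price": 20})
# urllib.request.urlopen(req) would execute it against a running server
```

Function arguments map by name onto the JSON keys, so the body mirrors the function signature defined above.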
Real-Time with pg_notify
import select

import psycopg2
from psycopg2 import sql

def listen_for_changes(connection_string, channel='changes'):
    conn = psycopg2.connect(connection_string)
    conn.autocommit = True
    cur = conn.cursor()
    # Quote the channel as an identifier instead of interpolating it into the SQL string
    cur.execute(sql.SQL("LISTEN {};").format(sql.Identifier(channel)))
    print(f"Listening on channel '{channel}'...")
    while True:
        select.select([conn], [], [], 5)  # wait up to 5s for activity on the socket
        conn.poll()
        while conn.notifies:
            notify = conn.notifies.pop(0)
            print(f"Event: {notify.payload}")
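What arrives in notify.payload is whatever your triggers publish; PostgreSQL itself imposes no format. Assuming a hypothetical trigger that sends JSON like {"table": ..., "op": ..., "id": ...} via pg_notify, a small dispatch sketch for routing those events (handle_event is my own helper):

```python
import json

def handle_event(payload, handlers):
    """Route a JSON NOTIFY payload to a handler keyed by (table, op)."""
    event = json.loads(payload)
    handler = handlers.get((event["table"], event["op"]))
    if handler is None:
        return None  # no handler registered for this table/operation; ignore
    return handler(event)

# Example registry: react to product inserts only
handlers = {("products", "INSERT"): lambda e: f"new product id={e['id']}"}
```

Inside the listener loop above you would call handle_event(notify.payload, handlers) instead of printing.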
Real-World Use Case
A startup replaced their entire Node.js backend with PostgREST in a weekend. 50+ API endpoints, all auto-generated from their PostgreSQL schema. Development velocity tripled because adding a new endpoint meant adding a table or view — no backend code changes, no deployment. API response times improved 3x because PostgREST uses prepared statements and connection pooling.
What You Can Build
- Instant backend for any frontend app
- Admin panel API with full CRUD
- Data warehouse API for analytics dashboards
- Multi-tenant SaaS with row-level security
- Microservices gateway — one database, many schemas
Need custom database solutions? I build APIs, data pipelines, and automation tools.
Email me: spinov001@gmail.com
Check out my developer tools: https://apify.com/spinov001