Why?
ℹ️ A recent article on Hacker News (ycombinator) suggests using Postgres as a one-stop shop for all of an application's data-related needs; quoting from the article:
Use Postgres for caching instead of Redis with UNLOGGED tables and TEXT as a JSON data type.
The idea for this experiment originated from the above quote.
Objective
Compare query execution time between a Postgres unlogged table and Redis, using an API layer built in Golang.
Procedure
Readers interested in a codified version of the experiment can check out the repo.
steps
- set up the required data stores (Postgres and Redis)
- create an API layer in Golang for store interaction
- set up query performance measurement tools for the stores
- run benchmarks and operations on the stores
- collect and compare the results
pre-requisites
Thanks to Docker, both stores can be set up using the official images available on Docker Hub. Below are the commands used to set up the containers.
The pg_stat_statements extension can be used to monitor Postgres query performance; as far as Redis is concerned, the built-in redis-benchmark tool can be used from the command line.
docker run -d --name postgres \
-e POSTGRES_USER=postgres \
-e POSTGRES_PASSWORD=postgres \
-p 5432:5432 \
-v `pwd`/docker/init.sql:/docker-entrypoint-initdb.d/init.sql \
postgres:latest -c shared_preload_libraries='pg_stat_statements'
The above Docker command brings up a Postgres instance with the pg_stat_statements library preloaded.
init.sql contains the basic database setup code associated with the experiment.
Once the server is up and running, either pgAdmin or psql can be used to interact with it.
init_table.sql holds the statements required to create the table and the extension.
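The contents of init_table.sql are not reproduced in this post; the sketch below shows the kind of setup the experiment depends on, executed from Go with the lib/pq driver. The public.test table name comes from the monitoring query used later, while the key/value columns, connection string, and driver choice are assumptions rather than the repo's actual code.

```go
// Minimal setup sketch: enable pg_stat_statements and create an UNLOGGED table.
// Schema and connection string are illustrative assumptions.
package main

import (
	"database/sql"
	"log"

	_ "github.com/lib/pq" // Postgres driver
)

func main() {
	db, err := sql.Open("postgres", "postgres://postgres:postgres@localhost:5432/postgres?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	stmts := []string{
		`CREATE EXTENSION IF NOT EXISTS pg_stat_statements`,
		// UNLOGGED skips WAL writes, which is what makes the table cache-like.
		`CREATE UNLOGGED TABLE IF NOT EXISTS public.test (
			key   TEXT PRIMARY KEY,
			value TEXT
		)`,
	}
	for _, s := range stmts {
		if _, err := db.Exec(s); err != nil {
			log.Fatal(err)
		}
	}
}
```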
Redis setup is more straightforward:
docker run -d --name redis -p 6379:6379 redis:latest
approach
A naive and simple approach is adopted for the purpose of the experiment.
A one-line summary of the approach: build an API layer in Golang and measure query statistics using Go's built-in testing and benchmarking.
Code for the API layer is available in the repo.
How it works: isolated adapters are set up for both stores, and all operations required for populating data into a store and reading it back are carried out using go test.
Benchmarks for the get operation (Postgres) and the put operation (Redis) are used to gather query stats for the corresponding store.
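As a rough illustration of that layout, a minimal interface both adapters could satisfy is sketched below; the names (Store, Put, Get) are assumptions, not the repo's actual API.

```go
// Hypothetical shape of the store adapters used in the experiment.
package store

import "context"

// Store is the minimal contract both the Postgres and Redis adapters satisfy.
type Store interface {
	// Put writes a key/value pair into the backing store.
	Put(ctx context.Context, key, value string) error
	// Get reads the value previously stored under key.
	Get(ctx context.Context, key string) (string, error)
}
```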
To execute the store operations: go test ./... -tags=integration
which sets up the required data in the stores.
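For illustration, here is a minimal sketch of what such an integration test could look like, assuming a public.test (key, value) table in Postgres, the lib/pq driver, and the go-redis client; the connection strings, schema, and payload are assumptions, not the repo's actual code.

```go
//go:build integration

package store_test

import (
	"context"
	"database/sql"
	"testing"

	_ "github.com/lib/pq"
	"github.com/redis/go-redis/v9"
)

// TestPopulateStores seeds both stores so the benchmarks have data to read.
func TestPopulateStores(t *testing.T) {
	ctx := context.Background()

	// Seed Postgres: one small JSON payload stored as TEXT.
	db, err := sql.Open("postgres", "postgres://postgres:postgres@localhost:5432/postgres?sslmode=disable")
	if err != nil {
		t.Fatal(err)
	}
	defer db.Close()
	if _, err := db.ExecContext(ctx,
		`INSERT INTO public.test (key, value) VALUES ($1, $2) ON CONFLICT (key) DO NOTHING`,
		"key-1", `{"hello":"world"}`); err != nil {
		t.Fatal(err)
	}

	// Seed Redis with the same payload.
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})
	defer rdb.Close()
	if err := rdb.Set(ctx, "key-1", `{"hello":"world"}`, 0).Err(); err != nil {
		t.Fatal(err)
	}
}
```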
To measure query performance, a quick load test is carried out using the benchmark functions in the test suite.
The Postgres API query latency benchmark can be run using
go test -bench . ./adapter/store/pgsql -count 10
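The repo's benchmark is not reproduced in this post; below is a minimal sketch of a Postgres get benchmark under the same assumptions as above (database/sql with lib/pq, the public.test key/value table, and a seeded key named key-1). The actual query and adapter API in the repo may differ.

```go
package pgsql_test

import (
	"database/sql"
	"testing"

	_ "github.com/lib/pq"
)

// BenchmarkGet measures the latency of reading a value back from the
// UNLOGGED table. The DSN, table layout, and key are illustrative assumptions.
func BenchmarkGet(b *testing.B) {
	db, err := sql.Open("postgres", "postgres://postgres:postgres@localhost:5432/postgres?sslmode=disable")
	if err != nil {
		b.Fatal(err)
	}
	defer db.Close()

	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		var value string
		if err := db.QueryRow(`SELECT value FROM public.test WHERE key = $1`, "key-1").Scan(&value); err != nil {
			b.Fatal(err)
		}
	}
}
```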
In a similar manner, the Redis benchmark can be run using
go test -bench . ./adapter/store/redis -count 10
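Similarly, here is a minimal sketch of a Redis put benchmark, assuming the go-redis client and the same illustrative key and payload; it is not the repo's actual benchmark.

```go
package redis_test

import (
	"context"
	"testing"

	"github.com/redis/go-redis/v9"
)

// BenchmarkPut measures the latency of writing a small JSON payload to Redis.
// The address, key, and payload are illustrative assumptions.
func BenchmarkPut(b *testing.B) {
	ctx := context.Background()
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})
	defer rdb.Close()

	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		if err := rdb.Set(ctx, "key-1", `{"hello":"world"}`, 0).Err(); err != nil {
			b.Fatal(err)
		}
	}
}
```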
Postgres query performance can be monitored via the pg_stat_statements extension using the SQL query below:
select calls,total_exec_time,min_exec_time,max_exec_time,mean_exec_time,rows,query
from pg_stat_statements where query='select * from public.test';
The Redis benchmark can also be observed using Redis's built-in tool:
redis-benchmark -h localhost -p 6379 -n 10000
Query latency can therefore be measured at both the data layer (via the extension and redis-benchmark) and the API layer (via the Go benchmarks).
observation
| max exec time | db | no of rows |
| --- | --- | --- |
| 0.013056668 | postgres-unlogged | 27570000 |
| 0.029535 | redis | 27570000 |
conclusion
The original article covers multiple aspects of Postgres; combined with this naive experiment comparing Postgres and Redis, it serves as additional evidence that Postgres can be a one-stop shop for all of an application's data-related needs.
Bibliography
Some Self promotion