When I work with an API that touches many tables in the database, it's frustrating to track them one by one in my backend code. This solution saves me a lot of time.
First, you need to get into a running Postgres container. Use
docker ps
to find your container name, then run:
docker exec -it NAME_OF_CONTAINER bash
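For example, assuming the container is named my-postgres (a placeholder; use whatever appears in the NAMES column of docker ps for yours):
$ docker ps
CONTAINER ID   IMAGE      ...   NAMES
f3a9c1d2b4e5   postgres   ...   my-postgres
$ docker exec -it my-postgres bash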
Now you have a root shell inside the container. Open a Postgres session by running:
psql -U YOUR_POSTGRES_USERNAME
Locate the config file with this command, then exit psql with \q:
postgres=# SHOW config_file;
postgres-# \q
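On the official Postgres image, the output should look something like this:
             config_file
------------------------------------------
 /var/lib/postgresql/data/postgresql.conf
(1 row)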
I am going to use vim to edit the file; you can use nano if you prefer. Install it with:
apt-get update
apt-get install vim
After you exit psql you are back in the container shell. Go to the config file you located. Mine was
/var/lib/postgresql/data/postgresql.conf
cd /var/lib/postgresql/data/
vi postgresql.conf
Press / in vim to search for ERROR REPORTING AND LOGGING.
In the "Where to Log" section, uncomment the following lines by removing the leading # and set:
log_destination = 'csvlog'
logging_collector = on
log_directory = 'pg_log'
log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log'
in "What to Log" section
log_statement=all
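For reference, log_statement accepts four values (the comments below paraphrase the stock postgresql.conf ones); 'all' is the most verbose:
log_statement = 'none'   # log no statements
log_statement = 'ddl'    # log only DDL: CREATE, ALTER, DROP, ...
log_statement = 'mod'    # log DDL plus INSERT, UPDATE, DELETE, TRUNCATE, COPY FROM
log_statement = 'all'    # log every statement, including SELECT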
You can read more about the "Where to Log", "When to Log", and "What to Log" settings in the PostgreSQL documentation on error reporting and logging.
Save the file, then restart Postgres. The simplest way is to exit the container by typing exit and restart the whole container:
docker restart NAME_OF_CONTAINER
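Once the container is back up, you can confirm the new setting took effect straight from your host shell:
docker exec NAME_OF_CONTAINER psql -U YOUR_POSTGRES_USERNAME -c "SHOW log_statement;"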
Now it's time to test with your API!
The CSV log files should be in the pg_log directory under /var/lib/postgresql/data/.
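Note that because log_destination is 'csvlog', Postgres replaces the .log suffix in log_filename with .csv, so the files end in .csv. You can list them without re-entering the container:
docker exec NAME_OF_CONTAINER ls /var/lib/postgresql/data/pg_log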
To copy the CSV files from the Docker container to your computer, use docker cp. For example, I am going to copy them to the ~/Downloads/docker-log directory:
docker cp NAME_OF_CONTAINER:/var/lib/postgresql/data/pg_log ~/Downloads/docker-log
I recommend adding a header row to the CSV files, since Postgres does not write one; that makes them much easier to work with in a tool like csvkit for filtering the data.
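Here is a minimal sketch of that workflow. It assumes a Postgres 12-era CSV log layout (newer versions add extra columns such as backend_type, so check the CSV logging docs for your version), csvkit installed via pip install csvkit, and a hypothetical table named orders:
# Prepend a header row matching the CSV log column order, then append the logs
echo "log_time,user_name,database_name,process_id,connection_from,session_id,session_line_num,command_tag,session_start_time,virtual_transaction_id,transaction_id,error_severity,sql_state_code,message,detail,hint,internal_query,internal_query_pos,context,query,query_pos,location,application_name" > queries.csv
cat ~/Downloads/docker-log/*.csv >> queries.csv
# Show only the logged statement text
csvcut -c message queries.csv
# Keep only rows whose message mentions the orders table
csvgrep -c message -m "orders" queries.csv | csvcut -c log_time,message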
Thanks for reading.