(Video is available at the end of this post, be sure to check it out for further details)
First off, check out this post if you're curious how I put together the csv file. There is a ton of data out there that you can just download in csv format, but I wanted data that was specific to a DDL (Data Definition Language) I wrote. Once I had a file, I went through the following steps to make the process easily repeatable through automation.
With that file in hand, make sure (at least for this example) that the first row contains the column names as comma-separated values. It should look something like this (per my example mentioned above), though your file will obviously have far more data than the example below.
```
id,country,ip,created_at,updated_at,project_id
4cf25606-5f3e-4575-86ea-c585513fcc39,Solomon Islands,184.108.40.206,2020-12-25T05:04:26.124Z,2020-12-30T11:04:26.124Z,d6ef3dfd-9596-4391-b0ef-3d7a8a1a6d10
cbeaf6ab-23cc-43c5-a740-7191f59d8cc5,Solomon Islands,220.127.116.11,2020-12-25T05:04:26.124Z,2020-12-30T11:04:26.124Z,d6ef3dfd-9596-4391-b0ef-3d7a8a1a6d10
17a7ec0b-2758-4721-b104-f368c8109b6a,Solomon Islands,248.8.206.229,2020-12-25T05:04:26.124Z,2020-12-30T11:04:26.124Z,d6ef3dfd-9596-4391-b0ef-3d7a8a1a6d10
```
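For reference, a table matching those columns might be defined like this. This is a hypothetical sketch with assumed types; the actual DDL from the post linked above may differ.

```sql
-- Hypothetical table definition matching the CSV columns above;
-- the actual DDL in the linked post may differ.
CREATE TABLE the_table_where_the_data_will_go (
    id uuid PRIMARY KEY,
    country text,
    ip text,
    created_at timestamptz,
    updated_at timestamptz,
    project_id uuid
);
```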
Now my next step was to get this file copied to the server where the database is running. To check which containers I have running, a quick `docker ps` gets me the list.
In this list, the PostgreSQL server is the container named `pgdb`. To copy the file to that server, the following command will get the job done.
```shell
docker cp ./name_of_file.csv pgdb:/name_of_file.csv
```
If you'd like the destination file name to be different, this command supports that too; just change the name after the colon. I like to keep the names the same to prevent confusion. For example, if I log into the server with a bunch of terminals open and the file is named one thing in one place and something else in another, that'll throw a wrench into things.
With the file copied, I can open a psql session in the container and run the PostgreSQL command to bulk copy the data into a table. The command to get that done looks like this.
```sql
COPY the_table_where_the_data_will_go FROM '/name_of_file.csv' CSV HEADER;
```
That's it! Well, OK, not really; this is one of the simplest use cases. For instance, if the csv file's columns don't map one-to-one to the table's column names, this command won't work as is. Check out the COPY documentation for the details on how to use the command if you have various things to tweak for your own work. However, for this basic premise, and for what I want to talk about next, that's it! 😁
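As one example of that tweaking, COPY accepts an explicit column list after the table name. If the csv file only contained a subset of the table's columns, something like this (a hypothetical subset of the columns above) would map them:

```sql
-- Hypothetical: the csv only has these three columns, in this order.
COPY the_table_where_the_data_will_go (id, country, ip)
FROM '/name_of_file.csv' CSV HEADER;
```

Note that `HEADER` just skips the first line; the column list tells PostgreSQL which table columns the csv's columns line up with, in order.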
To get into more details about how I automated this process, check out the video.
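Before you do, the whole flow above can be sketched as one small shell script. This is a dry-run sketch under a few assumptions (the `pgdb` container name, a `postgres` superuser, and the file sitting in the current directory): by default it only prints the commands so you can review them, and running it with `DRY_RUN=0` would actually execute them.

```shell
#!/bin/sh
# Dry-run automation sketch for the steps in this post.
# Assumptions: container is named pgdb, psql connects as the
# postgres superuser, and the csv sits in the current directory.
set -eu

FILE="name_of_file.csv"
CONTAINER="pgdb"
TABLE="the_table_where_the_data_will_go"

# Print the command instead of running it unless DRY_RUN=0.
run() {
  if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "+ $*"
  else
    "$@"
  fi
}

# Step 1: copy the csv into the container.
run docker cp "./$FILE" "$CONTAINER:/$FILE"

# Step 2: bulk load it with COPY inside the container.
run docker exec "$CONTAINER" psql -U postgres -c \
  "COPY $TABLE FROM '/$FILE' CSV HEADER;"
```

Flipping `DRY_RUN` is a cheap way to sanity-check the generated commands before pointing the script at a real container.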