Image credit to Sies Kranen
Hi, I have some new discoveries to share with you on Docker & SQLAlchemy; we should be done before our second round of beer 🍺.
This is a series where I document discoveries in my Journey to SQLAlchemy. If you haven't seen my previous posts, check them out - they are somewhere at the top and bottom of this page.
To follow along, you need basic knowledge of Python, databases and Docker. HOLD ON, I'll be explaining each step, so you should still be able to follow along.
Outline
- Docker Installation
- Postgres Engine Setup
- Database Setup
- Getting Started Dependencies Installation
Please note, this is a bit long, but it'll be worth it. Do follow along; you might as well discover something new. Yoo, Barkeep!! A bottle of beer for my friend here, please.
Docker Installation
Do we actually need Docker to journey to SQLAlchemy? No. The only reason I chose Docker is that it's fast to set up, lightweight, plus Docker is also famous.
Docker provides a platform that helps us deliver software packages in containers. The software package we need from Docker is a database, Postgres to be precise.
Installing Docker is simple; if you don't have it installed, see how to. I'll wait for you.
Alright, I'll assume you've already installed Docker. Congratulations, you now have access to all the software packages available on Docker Hub, and with a simple command you can make whichever you want available to you, just like we'll do now.
Postgres Engine Setup
docker pull postgres:12-alpine
Run the above command in your terminal and a Postgres database image should automatically be available for use. An image is to Docker what an executable (those icons you click on your PC) is to your desktop. Now that we have an image, let's launch our Postgres program by running the command below
docker run --env POSTGRES_USER=admin --env POSTGRES_PASSWORD=password -d -p 5432:5432 postgres:12-alpine
I know, I know!!! An executable file is way easier to click on; I agree with you too. Let's try to understand what is happening though. Here are the bits and their explanations:
docker run
: is the main command, saying we want to run a Docker image available on our machine or on Docker Hub.
--env POSTGRES_USER=admin --env POSTGRES_PASSWORD=password
: every database needs a user to access it and a password to identify that user, so we're creating one here.
-d
: d for Disappear. Wow, but in kindergarten I was taught d is for Dog. Well, well, this is version 2.0.
-p 5432:5432
: provisions the port through which our database engine will be accessed, and also makes that port reachable from your PC. There are two statements there, hence the two 5432 values, one for each statement respectively. You're free to liken a port to that single socket where you fit your phone charger, if it makes the concept clearer. Remember, only one charger can fit in there at a time; so it is for systems (another program cannot use port 5432 once it has been provisioned to Postgres). We just plugged our system into the database engine's port with 5432:5432.
postgres:12-alpine
: this is the name identifying the Docker image we are trying to run.
Kindly Note:
For simplicity, I bent some concepts in the explanations above, e.g. -d actually stands for detached mode, not Disappear. When -d is used, the container's output is detached from your terminal, so nothing gets printed there. Disappear just simplifies all of that into one word.
That was a lot. Feel free to give it another glance; no rush, my friend.
To confirm we have an instance of the postgres:12-alpine image running, use the command below
docker ps -a
You should see a container whose IMAGE column holds the value postgres:12-alpine and whose PORTS column holds 0.0.0.0:5432->5432/tcp. Pay attention to either the NAMES or the CONTAINER ID column and take note of the value there, as the command below will need one of them.
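If you'd like a second opinion straight from Python, a quick socket check confirms something is actually listening on the port we published. This is just a minimal sanity-check sketch; it assumes the container is running with the -p 5432:5432 mapping from earlier:

```python
import socket

# Try to open a TCP connection to the port we published with -p 5432:5432.
# If the Postgres container is up, this succeeds; otherwise it raises OSError.
with socket.create_connection(("localhost", 5432), timeout=3):
    print("Port 5432 is open - our Postgres engine is reachable")
```

With the container confirmed, let's get a shell inside it.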
docker exec -it (NAME value | CONTAINER_ID value ) bash
Running the above command will give you bash access to the database engine's container. Kindly replace NAME or CONTAINER_ID with whatever value your Postgres container had when you ran docker ps -a, as this is different for everyone. Your command should look like this: docker exec -it replace_me_with_the_value bash.
Database Setup
To set up the database we'll be working with, we need to access the default database environment using the user details we provided when creating our Postgres instance.
psql --username=admin --password
Running the above command activates an interactive prompt where the password given to the admin user when the Postgres instance was created (which should be password) should be entered.
To create a database, assign a user to it, and grant the assigned user all privileges on that database (this is similar to what we did with --env POSTGRES_USER & --env POSTGRES_PASSWORD), run the next commands
CREATE DATABASE learnsqlalchemy;
CREATE USER learner WITH PASSWORD 'StrongPassword123';
GRANT ALL PRIVILEGES ON DATABASE learnsqlalchemy TO learner;
The above commands, when run, give us the following configuration
DATABASE_NAME = learnsqlalchemy
DATABASE_USER = learner
DATABASE_PASSWORD = StrongPassword123
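To keep those values handy for the Python side of this journey, you could park them in a small module. A minimal sketch; the file name config.py and the variable names are my own picks, not something SQLAlchemy requires:

```python
# config.py - hypothetical helper holding the setup values from above
DATABASE_NAME = "learnsqlalchemy"
DATABASE_USER = "learner"
DATABASE_PASSWORD = "StrongPassword123"  # learning-only value, see the disclaimer below
DATABASE_HOST = "localhost"  # the container's port is mapped to this PC
DATABASE_PORT = 5432
```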
Disclaimer:
This is just a learning exercise, therefore these values are not to be used in a real project. Kindly ensure you use a stronger value for each setting. Something only you know.
Getting Started Dependencies Installation
To get started, we need some dependencies installed, but before we install them, kindly note we'll be working in the directory structure set up for this journey. Navigate to the directory (set it up if you haven't) before running any of the commands below, just to keep things organized.
NOTE:
The commands below should be run in a different terminal, separate from the one used to create and set up the database. You can as well close the database-setup terminal, as we're done with it.
As a best practice, we'll need an environment that isolates any tools/dependencies we install on this journey from those already on our PC or in other future journeys, to avoid conflicts. To do this, Python comes to the rescue. Run the command below
python3 -m venv env
venv is the module that creates a virtual environment; we pass it to python3 with -m. In this case, we call the environment env (you could name it whatever pleases you, but we'll keep it as env for this journey).
You've only created the virtual environment; it needs to be activated before we proceed to installing dependencies. Run the first command if you're on Windows and the second if you're on Mac/Linux.
first command
env\Scripts\activate
second command
source env/bin/activate
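Not sure the activation took? Here's a quick standard-library check, a minimal sketch that works the same on Windows and Mac/Linux:

```python
import sys

# Inside an activated virtual environment, sys.prefix points at the env
# directory (e.g. .../env) instead of the system Python installation.
print(sys.prefix)
```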
Now that the virtual environment is created and activated, run the command below to install the dependencies.
pip3 install psycopg2-binary==2.9.3 SQLAlchemy==1.4.41
The dependencies installed are SQLAlchemy, which is our focal point, and psycopg2-binary, the Postgres database adapter through which SQLAlchemy can interact with a Postgres database. SQLAlchemy can also interact with MySQL, MS SQL Server, etc.; for each of these databases, a different adapter is needed to tell SQLAlchemy which one it'll be interacting with.
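As a quick taste of what's ahead, here is how the pieces plug together. This is a minimal sketch, assuming the Postgres container from earlier is still running on localhost:5432 and the learnsqlalchemy database exists; the postgresql+psycopg2:// prefix in the URL is what tells SQLAlchemy to use the psycopg2 adapter:

```python
from sqlalchemy import create_engine, text

# Connection URL format: dialect+adapter://user:password@host:port/database
engine = create_engine(
    "postgresql+psycopg2://learner:StrongPassword123@localhost:5432/learnsqlalchemy"
)

# Open a connection and run a trivial query to prove the whole stack is wired up
with engine.connect() as connection:
    result = connection.execute(text("SELECT 1"))
    print(result.scalar())  # prints 1 if the round trip succeeded
```

If that prints 1, then Docker, Postgres, psycopg2-binary and SQLAlchemy are all talking to each other.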
If you've made it this far, you definitely need another beer; may I place another request?
We've covered a lot and here's a summary:
We started by installing Docker, which gives us access to a multitude of software packages. We pulled the postgres:12-alpine image (docker pull postgres:12-alpine) and launched an instance of it with an open port connected to our PC and a default user credential (docker run --env POSTGRES_USER=admin --env POSTGRES_PASSWORD=password -d -p 5432:5432 postgres:12-alpine). The database for our journey was created by accessing the running Postgres instance with those default user credentials. We then created and activated a virtual environment, within which we installed our getting-started dependencies.
Thanks for sticking with me on this one. If you spotted any errors or have any questions, do leave them in the comments.
Like, follow my journey, and share it if you find it useful. Cheers 🍻. Ohh!! I will answer you now: yes, I take coffee too.