
Carlos Armando Marcano Vargas

Originally published at carlosmv.hashnode.dev

Dockerizing a Django Application with Postgres as database | Compose

For this article, we are going to build two container images, one for a Django service and another for the Postgres database, and run them together using Compose.

Requirements

  • Docker and Docker Compose installed (Docker Desktop includes both)

We are not going to build the Django application from scratch. We will use a blank project from this repository. Or, you can create a new Django project if you want.

We can clone the app using GitHub CLI:

gh repo clone carlosm27/django-postgres-demo


Or using Git:

git clone https://github.com/carlosm27/django-postgres-demo

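Either way, we then move into the project directory (the directory name below assumes the default name from the clone):

cd django-postgres-demo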

Project structure:

django_auth/
django_postgres/
manage.py
Dockerfile
docker-compose.yml
requirements.txt


What is Compose?

According to the Docker documentation:

Docker Compose is a tool that was developed to help define and share multi-container applications. With Compose, we can create a YAML file to define the services and, with a single command, spin everything up or tear it all down.

With Compose we can define the whole application stack in one file, keep it at the root of our project repository, and make it easy for someone else to contribute to the project.

Adding Postgres

We need to add a Postgres adapter to our Django application.

pip install psycopg2
pip freeze > requirements.txt

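After these commands, requirements.txt lists Django, psycopg2, and their dependencies, pinned to whatever versions pip installed. As a rough illustration only (the version numbers on your machine will differ), it might look like this:

asgiref==3.7.2
Django==4.2.7
psycopg2==2.9.9
sqlparse==0.4.4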

Inside the django_postgres/django_auth/settings.py file, we go to the database section, which looks like this:

DATABASES = {
    'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}


Our Django project is configured to use SQLite as its database. We change the configuration to use Postgres instead.

DATABASES = {
    'default':{
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': '<YOUR PASSWORD>',
        'HOST': 'localhost',
        'PORT': 5432,
    }
}


Dockerfile

FROM python:3.11

# Create and use /code as the working directory inside the image.
RUN mkdir /code
WORKDIR /code
RUN pip install --upgrade pip

# Install dependencies first so this layer is cached between builds.
COPY requirements.txt /code/
RUN pip install -r requirements.txt

# Copy the rest of the project into the image.
COPY . /code/

EXPOSE 8000

CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]


Instead of writing our own Dockerfile from scratch, we copied this one from the Django sample in the Docker samples repository.
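
Before wiring the image into Compose, we can optionally check that it builds on its own; the tag used here is just an example:

docker build -t django-postgres-demo .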

Docker Compose file

As the documentation says, the Docker Compose file is a YAML file that defines the services, networks, and volumes of a Docker application. It lets us describe a platform-agnostic, container-based application.

The computing components of an application are defined as Services. A Service is an abstract concept implemented on platforms by running the same container image (and configuration) one or more times. Services store and share persistent data in Volumes.

We are going to use volumes later in this tutorial to store our Postgres data.

For more information about the Docker Compose file, visit its documentation.

docker-compose.yml

version: "3.8"
services:
  web:
    build: .

    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code 
    ports:
      - "8000:8000"   
    depends_on:
      - db
  db:
    image: postgres:13
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - "POSTGRES_HOST_AUTH_METHOD=trust"

volumes:
  postgres_data:


Here we declare two services, a web service and a db service. This means we will have one container for the Django application and another for the Postgres database.

We also have to declare that our web service depends on the db service to run; the depends_on key specifies this relationship.
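
One caveat worth knowing: depends_on only controls startup order; it does not wait until Postgres is actually ready to accept connections. If the web container races ahead of the database, one common option (supported by recent Compose versions, and not part of the file above) is to give the db service a healthcheck and make web wait for it, roughly like this:

  db:
    image: postgres:13
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s
      timeout: 5s
      retries: 5
  web:
    # ...
    depends_on:
      db:
        condition: service_healthy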

While containers can create, update, and delete files, those changes are lost when the container is removed, because all changes are isolated to that container. Volumes change this.

As the Docker documentation says:

Volumes provide the ability to connect specific filesystem paths of the container back to the host machine. If a directory in the container is mounted, changes in that directory are also seen on the host machine. If we mount that same directory across container restarts, we'd see the same files.

As Will Vincent explains in this article, we need to create a volume called postgres_data in our docker-compose.yml and then bind it to the dedicated directory inside the container where Postgres stores its data, /var/lib/postgresql/data/.
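
Later, once the stack has been started, we can confirm that the named volume exists. Compose prefixes the volume name with the project name, which defaults to the directory name, so the exact name on your machine may differ from the example below:

docker volume ls
docker volume inspect django-postgres-demo_postgres_data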

Before we build the images, we have to modify the settings.py file of our Django service: the database host must be db, the name of the Postgres service in docker-compose.yml, instead of localhost, because Compose makes each service reachable by its service name.

settings.py

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': "<YOUR DATABASE NAME>",
        'USER': "<YOUR USER>",
        'PASSWORD': "<YOUR PASSWORD>",
        'HOST': 'db',
        'PORT': 5432,
    }
}

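Hard-coding credentials in settings.py is fine for a demo, but a common variation is to read them from environment variables, which Compose can then inject through an environment section on the web service. Below is a minimal sketch of that idea; the variable names (POSTGRES_DB, POSTGRES_USER, and so on) are just an assumption and would have to be defined in docker-compose.yml:

import os  # make sure os is imported at the top of settings.py

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        # Fall back to the values used in this tutorial when a variable is not set.
        'NAME': os.environ.get('POSTGRES_DB', 'postgres'),
        'USER': os.environ.get('POSTGRES_USER', 'postgres'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD', ''),
        'HOST': os.environ.get('POSTGRES_HOST', 'db'),
        'PORT': os.environ.get('POSTGRES_PORT', '5432'),
    }
}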

Now, from the directory where the docker-compose.yml file is located, we execute this command to build the images and start the containers:

docker-compose up -d --build


When the containers are up and running, we go to http://localhost:8000/. If we see the Django welcome page, everything worked.
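
If the page does not load, we can check the state of the containers and follow the logs of the web service:

docker-compose ps
docker-compose logs -f web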

Whenever we add models (tables) to the application, we need to run migrations. Since the Django application is running inside a container, we execute the command through Compose:

 docker-compose exec web python manage.py migrate


The difference from running manage.py directly is that we have to specify which service the command should run in; in this case, the web service.
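
The same pattern works for any other manage.py command. For example, to create an admin user for the Django admin site:

docker-compose exec web python manage.py createsuperuser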

Conclusion

We only built and ran two services here, but it is possible to add many more with Compose.

This was the first time I used Compose. It lets us build images and run containers for more robust applications that need many services running together: databases, caches, message brokers, event streamers, and so on.

Thank you for taking the time to read this article.

If you have any recommendations about other packages, architectures, how to improve my code, my English, or anything else, please leave a comment or contact me on Twitter or LinkedIn.
