Setting up a Django project with Docker involves containerizing your application and its dependencies, which ensures consistency across different environments. Here's a concise guide.
1. Create a Dockerfile in your Django project root
Basically, this is the initial setup of the project's image:
# Use Python 3.11 (or whichever version you need)
FROM python:3.11
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
# Set work directory
WORKDIR /app
# Install system dependencies (build-essential and libpq-dev are needed to
# compile psycopg2; postgresql-client provides psql for debugging)
RUN apt-get update \
    && apt-get install -y --no-install-recommends \
        postgresql-client \
        build-essential \
        libpq-dev \
    && rm -rf /var/lib/apt/lists/*
# Install Python dependencies
COPY requirements.txt /app/
RUN pip install --no-cache-dir -r requirements.txt
# Copy project
COPY . /app/
# Create a non-root user
RUN adduser --disabled-password --gecos '' appuser && chown -R appuser /app
USER appuser
# Expose port
EXPOSE 8000
# Run the application
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
2. Create a docker-compose.yml file for managing multiple services
For example, you may need a database that is only accessible over a private network. With Compose, you can achieve that in just a few lines.
❗ Remember: the ports keyword maps container ports to host machine ports, enabling external access to your services from the host or from networks outside Docker. The expose keyword only declares which ports a service listens on, making them available for communication between containers on the same Docker network; it does not publish those ports to the host machine.
version: '3.8'  # The top-level "version" key is obsolete in Compose v2 and can simply be omitted
services:
  db:
    image: postgres:15
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      POSTGRES_DB: django_db
      POSTGRES_USER: django_user
      POSTGRES_PASSWORD: django_password
    ports:
      - "5432:5432"
  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    environment:
      - DEBUG=1
      - DATABASE_URL=postgresql://django_user:django_password@db:5432/django_db
      - REDIS_URL=redis://redis:6379/0
    depends_on:
      - db
      - redis
volumes:
  postgres_data:
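Inside the Compose network, containers reach each other by service name (db and redis), exactly as the DATABASE_URL and REDIS_URL above assume. If you want to double-check that the web container can actually see both services, here is a small, hedged smoke test (the file name check_services.py and the direct use of the psycopg2 and redis client libraries are my assumptions, not part of the original setup); run it with docker-compose exec web python check_services.py:
# check_services.py - hypothetical helper for a quick connectivity check.
import os

import psycopg2  # provided by psycopg2-binary
import redis     # installed alongside django-redis

# psycopg2 accepts a libpq connection URI, so DATABASE_URL can be passed as-is.
conn = psycopg2.connect(os.environ["DATABASE_URL"])
conn.close()
print("PostgreSQL: OK")

# redis-py builds a client straight from the REDIS_URL connection string.
client = redis.Redis.from_url(os.environ["REDIS_URL"])
client.ping()
print("Redis: OK")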
3. Add a .dockerignore (this step is optional!)
But if you want to ship only the essential files into the image, it helps! Spoiler: it works the same way as .gitignore. WOW, I know! ;)
# Git
.git
.gitignore
README.md
# Docker
.docker
Dockerfile*
docker-compose*
# Python
__pycache__
*.pyc
*.pyo
*.pyd
.Python
env
pip-log.txt
pip-delete-this-directory.txt
.tox
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.log
.mypy_cache
.pytest_cache
.hypothesis
# Django
local_settings.py
db.sqlite3
db.sqlite3-journal
media
# OS
.DS_Store
.vscode
.idea
*.swp
*.swo
# Node (if using frontend tools)
node_modules/
npm-debug.log*
4. Create a docker_settings.py or modify your existing settings
import os
from pathlib import Path
import dj_database_url
# Build paths inside the project like this: BASE_DIR / 'subdir'.
BASE_DIR = Path(__file__).resolve().parent.parent
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = os.environ.get('SECRET_KEY', 'your-default-secret-key-change-in-production')
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = os.environ.get('DEBUG', 'False').lower() == 'true'
ALLOWED_HOSTS = ['localhost', '127.0.0.1', '0.0.0.0']
# Application definition
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    # Add your apps here
]
MIDDLEWARE = [
    'django.middleware.security.SecurityMiddleware',
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.common.CommonMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
ROOT_URLCONF = 'your_project.urls'
TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [],
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.debug',
                'django.template.context_processors.request',
                'django.contrib.auth.context_processors.auth',
                'django.contrib.messages.context_processors.messages',
            ],
        },
    },
]
WSGI_APPLICATION = 'your_project.wsgi.application'
# Database
# Uses the dj-database-url package (make sure it is in requirements.txt)
# to parse the DATABASE_URL environment variable
DATABASES = {
    'default': dj_database_url.parse(
        os.environ.get('DATABASE_URL', 'sqlite:///db.sqlite3')
    )
}
# Cache (Redis) - requires the django-redis package in requirements.txt
CACHES = {
    'default': {
        'BACKEND': 'django_redis.cache.RedisCache',
        'LOCATION': os.environ.get('REDIS_URL', 'redis://localhost:6379/0'),
        'OPTIONS': {
            'CLIENT_CLASS': 'django_redis.client.DefaultClient',
        }
    }
}
# Static files (CSS, JavaScript, Images)
STATIC_URL = '/static/'
STATIC_ROOT = os.path.join(BASE_DIR, 'staticfiles')
# Media files
MEDIA_URL = '/media/'
MEDIA_ROOT = os.path.join(BASE_DIR, 'media')
# Default primary key field type
DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'
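If you keep these settings in a separate docker_settings.py instead of modifying settings.py, Django still needs to be told which module to load. The standard mechanism is the DJANGO_SETTINGS_MODULE environment variable: either add DJANGO_SETTINGS_MODULE=your_project.docker_settings to the web service's environment in docker-compose.yml, or change the default in manage.py and wsgi.py (your_project is the same placeholder used above):
# Excerpt from manage.py / wsgi.py - point the default at the Docker settings module.
import os

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'your_project.docker_settings')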
5. Useful commands
Once you have these files set up:
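Note: recent Docker releases ship Compose V2, which is invoked as docker compose (with a space) instead of the standalone docker-compose binary. The commands below work the same way with either form.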
# Build and start all services
docker-compose up --build
# Run in detached mode
docker-compose up -d
# If you have multiple compose files, point at a specific one
docker-compose -f /path/to/file up -d
# Get a shell inside a running container
docker exec -it {container id | container name} bash
# Create migrations
docker-compose exec web python manage.py makemigrations
# Run migrations
docker-compose exec web python manage.py migrate
# Create superuser
docker-compose exec web python manage.py createsuperuser
# View logs
docker-compose logs web
# Stop all services
docker-compose down
# Stop and remove volumes
docker-compose down -v
6. Production considerations
For production, consider:
Using environment variables for sensitive data
Using a production application server such as Gunicorn behind a proper web server like Nginx or Apache, instead of runserver
Setting up proper logging
Using Docker secrets for sensitive information
Implementing health checks (see the sketch after this list)
Setting up proper backup strategies for your database
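For the health-check point above, one lightweight option is a tiny Django view that a Docker healthcheck or load balancer can poll. This is just a sketch under my own naming assumptions (the healthz name and module path are not from the original setup):
# health/views.py - hypothetical module name for illustration.
from django.db import connection
from django.http import HttpResponse, HttpResponseServerError


def healthz(request):
    """Return 200 if the database is reachable, 500 otherwise."""
    try:
        with connection.cursor() as cursor:
            cursor.execute("SELECT 1")
    except Exception:
        return HttpResponseServerError("database unreachable")
    return HttpResponse("ok")

# Wire it into your_project/urls.py with path("healthz/", healthz) and point a
# Docker/Compose healthcheck or your load balancer at /healthz/.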
This setup provides a complete Docker environment for Django development with PostgreSQL and Redis support. The configuration is flexible and can be easily modified for different environments.
And, as always, feel free to share your thoughts and critique this article!