How to Deploy a Production Django 4 App Using Google Cloud Run, Cloud SQL, and Cloud Storage

In this tutorial, I will show you how to deploy a Django 4 application on Google Cloud using Cloud Run.

We will cover the following:

  1. Create and connect to a Cloud SQL database.
  2. Build a Docker image from a Dockerfile and push it to Artifact Registry.
  3. Deploy the app on Cloud Run using the image from Artifact Registry.
  4. Use Secret Manager to handle environment variables securely.
  5. Serve static files from Cloud Storage.

For this tutorial I will be using Django 4.2 and Python 3.10. If you are using different versions, things may be slightly different. Also, I will be using the Polls app from the official Django 4.2 tutorial for demonstration, but feel free to use your own.

This is not a tutorial on Django, Docker, command line, or GCP. I will assume you know the basics, but I will show you how to put them together! :)

Create Cloud SQL Database (PostgreSQL)

For this tutorial, we will use a PostgreSQL database. Go to Cloud SQL and create an instance.

Cloud SQL

From the options, choose PostgreSQL. Give it an instance ID and password, and set the region.

Under "Customize your instance" > "Connections" make sure you select "Public IP" (we will use public ip for this tutorial)

Instance IP assignment

Also, Google your IP address and add it as an authorized network. (This is only needed for accessing the database from your local machine. You can use the Cloud SQL Proxy as an alternative.)

Leave the rest as defaults and create the instance. It will take a minute or so. Once it completes, you will be taken to an Overview page. Take note of the "Public IP address" and "Connection name". We will need them to connect to the database locally and from Cloud Run, respectively.

Next, we need to create users. Go to "Users", click "ADD USER ACCOUNT", and enter a name and password:

Users

Create Users

Write down this name and password somewhere; these will be DB_USER and DB_PASSWORD, respectively.

Lastly, go to "Databases" and click "CREATE DATABASE". Give the database a name. (This will be our DB_NAME.)

And that is it! Now we are ready to connect to the Cloud SQL database!

Connect to Cloud SQL Database (Local)

In order to connect to PostgreSQL from Django, we need to install the psycopg2-binary library. Type the following in the terminal:

$ pip install psycopg2-binary==2.9.9

or
$ pip3 install psycopg2-binary==2.9.9 if you are not using a virtual environment.

In settings.py, configure Django to use our Cloud SQL database instead of the default sqlite3 database:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'HOST': env("DB_HOST"),
        'USER': env("DB_USER"),
        'PASSWORD': env("DB_PASSWORD"),
        'NAME': env("DB_NAME"),
    }
}

Make sure DB_HOST, DB_USER, DB_PASSWORD, and DB_NAME are set in the .env file. (You can just hard-code them, but using environment variables is recommended for production.)

DB_HOST: the instance's public IP address
DB_USER: the user you created ('Admin' for me)
DB_PASSWORD: the password you created for that user
DB_NAME: the database name ('django-demo-db' for me)

(I used environs. Check it out if you want to set your project up to use environment variables.)
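If you go the environs route, the settings.py setup is only a few lines. Here is a minimal sketch (the .env keys are the same ones listed above):

# settings.py (near the top)
from environs import Env

env = Env()
env.read_env()  # loads variables from the .env file in the project root

After that, calls like env("DB_HOST") in the DATABASES setting above resolve to the values from your .env file.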

Before we start our application, we need to migrate and create our superuser. Type the following in the terminal:

$ python manage.py migrate
$ python manage.py createsuperuser

(If you cannot connect, check that you added your IP address as an authorized network in the previous section.)

Now when we run our app locally, we should be connected to the Cloud SQL database!

$ python manage.py runserver
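If you want a quick sanity check that Django is really talking to Cloud SQL, you can run a tiny query from python manage.py shell (a minimal sketch, nothing project-specific):

# run inside `python manage.py shell`
from django.db import connection

with connection.cursor() as cursor:
    cursor.execute("SELECT version();")
    print(cursor.fetchone())  # prints the PostgreSQL version string reported by Cloud SQL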

Now let's deploy our app on Cloud Run.

Push Docker Image to Artifact Registry

Before we go any further, make sure you have Docker installed. And if you are using Docker Desktop, make sure you start it.

Create a Dockerfile and a .dockerignore in your project root folder (right beside manage.py).

This is what I used, but you can create your own. (For the gunicorn CMD, make sure you replace mysite with your own project name. For me, mysite is the folder that contains the settings.py and wsgi.py files; yours will probably be different.)

# Dockerfile
FROM python:3.10.4-slim-bullseye

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

# Set the working directory in the container
WORKDIR /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
    netcat postgresql-client mime-support \
    && rm -rf /var/lib/apt/lists/*

# Install Python dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the Django project code into the container
COPY . .

# Expose the port (Cloud Run uses the PORT environment variable)
ENV PORT 8080
EXPOSE 8080

# Run gunicorn (production WSGI server), binding to the port Cloud Run provides
CMD exec gunicorn mysite.wsgi:application --bind :$PORT --workers 1 --threads 8

.dockerignore:

.env
venv/
db.sqlite3

For production, we are using gunicorn, so let's install that:

$ pip install gunicorn==21.2.0

We also need to create a requirements.txt file:

$ pip freeze > requirements.txt

Make sure you update the requirements.txt file every time you install a new pip package.

Ok, now let's create our Docker repository in Artifact Registry on GCP.

Artifact Registry

Give it a name and region:

name

location

Once you create your repo, click it in the list. You may need to initialize gcloud and configure Docker. Take a look at "SETUP INSTRUCTIONS" for more details.

Setup Instructions

Configure Docker by typing the following in the terminal:

$ gcloud auth configure-docker <<your-region>>-docker.pkg.dev

Now that we have set up our repo, let's build our Docker image. Make sure you are in the same directory as your Dockerfile, and type the following in the terminal:

$ docker build -t <<image-name>> .

We also need to tag it before we push:

$ docker tag <<image-name>> <<region>>-docker.pkg.dev/<<project-id>>/<<repo-name>>/<<image-name-you-want>>

Make sure you replace things with your own.

For example, if I used the following:
image-name: django-demo
region: us-central1
project-id: ch-project
repo-name: django-test-repo
image-name-you-want: django-demo

$ docker tag django-demo us-central1-docker.pkg.dev/ch-project/django-test-repo/django-demo

Finally, we can push our tagged image to the repo on Artifact Registry:

$ docker push <<region>>-docker.pkg.dev/<<project-id>>/<<repo-name>>/<<image-name-you-want>>

In my case:

$ docker push us-central1-docker.pkg.dev/ch-project/django-test-repo/django-demo

If you go to GCP Artifact Registry, you should see your image:

Repositories

Deploy using Cloud Run

Finally, we are ready to deploy our application on Cloud Run.

We are going to use the image that we pushed to Artifact Registry. We also need a service account to use with Cloud Run. This service account will need the following IAM role at a minimum:

  • Cloud SQL Client

Now go to Cloud Run, and click CREATE SERVICE.

Under "Deploy one revision from an existing container image", select image from Artifact Registry. Select the latest.

latest image

Under Authentication, select "Allow unauthenticated invocations".

Authentication

Select the SQL instance you created from the drop-down menu:

Cloud SQL connections

Also make sure you include the environment variables from your .env file (make sure the names match the ones you use in settings.py).

The exception is DB_HOST. We used the public IP address when we connected locally, but to connect from Cloud Run we have to use a different value.

DB_HOST: /cloudsql/<<instance-connection-name>>

For example:
/cloudsql/ch-project:us-central1:django-demo-instance

We can get the connection name from the Overview page of the SQL instance we created. Don't forget the /cloudsql/ prefix.

Environment variables
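On Cloud Run, that value is the path of a Unix socket directory (mounted by Cloud Run when you add the Cloud SQL connection), and Django's PostgreSQL backend handles it without any code change, because a HOST starting with "/" is treated as a socket directory. Here is a sketch of what the DATABASES setting effectively resolves to, using the example values from this tutorial (the real values still come from env() as before):

# Illustration only: what DATABASES resolves to on Cloud Run
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "HOST": "/cloudsql/ch-project:us-central1:django-demo-instance",  # Unix socket directory
        "USER": "Admin",
        "PASSWORD": "<your DB_PASSWORD>",
        "NAME": "django-demo-db",
    }
}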

In a later section I will show you a better way to manage environment variables using Secret Manager. For now, you can leave the rest as default and create the Cloud Run service.

It will take a few seconds, but once it completes, you should see a URL:

Cloud Run URL

If you click it now, an error message will show. That is because we have to add our URL to ALLOWED_HOSTS in settings.py (make sure you use your own URL instead):

# settings.py
ALLOWED_HOSTS = ["127.0.0.1", "localhost", "django-demo-3parkuk6ra-uc.a.run.app"]

Make sure you don't include "https://" or a trailing "/" at the end of the URL.

If you have forms, you will have to include this as well (you may have to comment these lines out when you run locally):

# settings.py
CSRF_TRUSTED_ORIGINS = ["https://django-demo-3parkuk6ra-uc.a.run.app"]
SECURE_SSL_REDIRECT = True
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
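If toggling those lines by hand gets tedious, one option is to guard them with your DEBUG flag so they only apply in production. This is just a sketch and assumes you already have a DEBUG setting, for example DEBUG = env.bool("DEBUG", default=False):

# settings.py -- only enforce the HTTPS-related settings outside local development
if not DEBUG:
    CSRF_TRUSTED_ORIGINS = ["https://django-demo-3parkuk6ra-uc.a.run.app"]
    SECURE_SSL_REDIRECT = True
    SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")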

Now we need to build, tag, and push the image again (make sure you save settings.py first).

$ docker build -t <<image-name>> .
$ docker tag <<image-name>> <<region>>-docker.pkg.dev/<<project-id>>/<<repo-name>>/<<image-name-you-want>>
$ docker push <<region>>-docker.pkg.dev/<<project-id>>/<<repo-name>>/<<image-name-you-want>>

Now go back to Cloud Run and select "EDIT & DEPLOY NEW REVISION". Under "Container image URL", select the new image you just pushed.

New container image

Making sure "Serve this revision immediately" is selected, Deploy:

Deploy Cloud Run instance

Congrats! If you check the URL, your app should be working now!

Use Secret Manager (Better Way to Manage Environment Variables)

Alright, our app is up and running, but we can make it better by using Secret Manager to keep things more secure. You should create a new secret for each environment variable value that you want to keep secure, such as tokens and passwords. For this tutorial, I will create secrets for SECRET_KEY and DB_PASSWORD.
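Note that nothing changes in settings.py: whether a value is set directly as an environment variable on the service or injected from Secret Manager, Django reads it the same way. A quick sketch using the same env() helper as before:

# settings.py -- unchanged; env() does not care where the value came from
SECRET_KEY = env("SECRET_KEY")
# ... and 'PASSWORD': env("DB_PASSWORD") inside DATABASES stays exactly as it was.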

Go to the Cloud Run instance and click "EDIT & DEPLOY NEW REVISION".

First, delete the environment variables that you want to convert to secrets (you can't have duplicates).

Next, scroll down until you see "Secrets" and click "ADD A SECRET REFERENCE":

New Secret reference

Click the Secret drop-down menu and select "CREATE NEW SECRET" to make the "Create Secret" modal appear.

Fill in a name (it can be anything, as long as it is unique within the project), and under "Secret value", enter just the value of the environment variable:

Create Secret

You can leave the rest as default and create the secret. Then change the "Reference method" to "Exposed as environment variable", enter the environment variable name, and choose the version. It should look like this:

How the new secret reference should look

You can choose "latest" as the version, but choosing a specific version is recommended. Click "Done" to create it.

Now we just need to repeat the same thing for the password. It should look something like this at the end:

Final secrets list

And that is it, now just Deploy the change. If you go to the URL, the app should work exactly the same! Yay!

Setup Cloud Storage

We are almost at the end. The application should be running fine on Cloud Run. However, our application is currently serving its own static files, so as more and more people use the app, more instance resources are used up. We will solve this by serving static files from Cloud Storage instead.

First, let's create a Cloud Storage bucket. We will create a public bucket that is available to everyone.

Go to Cloud Storage and create a bucket. Give it a unique name, and choose the same region as the Cloud Run instance. Make sure "Enforce public access prevention on this bucket" is unchecked, and choose "Fine-grained" as the access control:

Create a Bucket

Leave the rest as default and create the bucket. Inside the bucket details, go to the "PERMISSIONS" tab. We need to grant access to "allUsers" and give them the Storage Object Viewer role:

Grant Access

Save and press the "ALLOW PUBLIC ACCESS" button. Now we are ready to serve static files!

Serve Static Files on Cloud Storage

We need to install the django-storages library and update our requirements.txt file:

$ pip install "django-storages[google]==1.14.1"
$ pip freeze > requirements.txt

We also need to configure settings.py to use our storage bucket:

If you are using Django >=4.2, type:

# settings.py
STORAGES = {
    "default": {
        "BACKEND": "storages.backends.gcloud.GoogleCloudStorage",
    },
    "staticfiles": {
        "BACKEND": "storages.backends.gcloud.GoogleCloudStorage",
    }
}
Enter fullscreen mode Exit fullscreen mode

If you are using Django < 4.2, use this instead (I am using 4.2, so I haven't tested this myself; it is from the documentation):

# settings.py
DEFAULT_FILE_STORAGE = "storages.backends.gcloud.GoogleCloudStorage"
STATICFILES_STORAGE = "storages.backends.gcloud.GoogleCloudStorage"

Also, for all Django versions, add:

# settings.py
# Google Cloud Storage for Static File Serve
GS_PROJECT_ID = env("GS_PROJECT_ID")
GS_BUCKET_NAME = env("GS_BUCKET_NAME")
GS_AUTO_CREATE_BUCKET = True
GS_DEFAULT_ACL = 'publicRead'

STATIC_URL = 'https://storage.googleapis.com/{}/'.format(GS_BUCKET_NAME)

(You can check the django-storages Google Cloud documentation to learn more.)

In your .env file, make sure to include GS_PROJECT_ID and GS_BUCKET_NAME.

Don't forget to also set STATICFILES_DIRS and STATIC_ROOT properly in the settings file. Check the documentation for more details.
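For reference, here is a minimal sketch of those two settings, assuming the standard BASE_DIR from a generated settings.py and a project-level static/ folder (adjust the paths to your layout):

# settings.py
STATIC_ROOT = BASE_DIR / "staticfiles"    # where collectstatic gathers files
STATICFILES_DIRS = [BASE_DIR / "static"]  # your project-level static folder(s), if you have one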

Ok, now type the following into the terminal to collect static files:

$ python manage.py collectstatic

Type "yes" to overwrite existing files. It may take few minutes. When it completes, go to Cloud Storage bucket, and you will see your static files! (Notice that there is no 'staticfiles' folder created in your project :))

storage bucket

Congratulations! In the next section, we will update the Cloud Run instance. (You should know how to do it by now, so go ahead and try!)

Update Cloud Run instance to use Storage Bucket

First, make sure everything is saved, and let's build our image again and push it to the repository (if you have a local 'staticfiles' folder, you can delete it and add 'staticfiles' to .dockerignore):

$ docker build -t <<image-name>> .
$ docker tag <<image-name>> <<region>>-docker.pkg.dev/<<project-id>>/<<repo-name>>/<<image-name-you-want>>
$ docker push <<region>>-docker.pkg.dev/<<project-id>>/<<repo-name>>/<<image-name-you-want>>

Now go to the Cloud Run instance and click "EDIT & DEPLOY NEW REVISION", making sure you use the new image.

Also, don't forget to add GS_PROJECT_ID and GS_BUCKET_NAME to the environment variables.

Now Deploy! Everything should work as intended now!

And that is everything guys! It was a long article, but hopefully it was helpful. It was my first article, and writing isn't exactly my strength. I hope everything was clear, but if you have any questions, please leave comments.
