In this article, you will learn how to build an automated GitHub Actions pipeline that deploys a Django application to a DigitalOcean droplet.
Prerequisites:
A GitHub account.
A Docker Hub account.
A DigitalOcean account.
1. Base application.
For this guide, we shall be using a base project that you can clone as shown below.
Navigate to your desired folder/directory.
cd Desktop
Clone the project
git clone https://github.com/manulangat1/github-actions-do-article.git
Create and activate a virtual environment (use whatever tool you are comfortable with).
python3 -m virtualenv venv
source venv/bin/activate
# install the required dependencies
pip install -r requirements.txt
Run the Django development server to confirm that all is well.
python3 manage.py runserver
2. Setting up the GitHub Actions file.
At the root of the project, create a folder named .github. Inside it, create another folder named workflows, and inside workflows create a file named integrations.yaml, so the structure is .github/workflows/integrations.yaml.
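From the project root, you can create this structure from the terminal:
mkdir -p .github/workflows
touch .github/workflows/integrations.yaml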
Paste the following code into the integrations.yaml file:
name: Continuous Integration and Delivery # workflow name
on: # the events that trigger the workflow
  push:
    branches: [develop] # specify which branch triggers the workflow on push
  pull_request:
    branches: [develop] # specify which branch triggers the workflow on pull requests
jobs:
  testing-docker-compose-auto-deploy-digital-ocean: # name of the job
    runs-on: ubuntu-latest # run the job on the latest Ubuntu runner
    steps: # steps followed while building and deploying the image
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v2
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      - uses: actions/checkout@v2
3. Dockerize the application.
Create a new file named Dockerfile at the root of the project and paste in the following lines of code.
# FROM python:3.9.6-alpine
FROM python:3.10.1-slim-buster
# WORKDIR
ENV APP_HOME=/app
RUN mkdir $APP_HOME
RUN mkdir $APP_HOME/staticfiles
WORKDIR $APP_HOME
LABEL maintainer='manulangat1'
LABEL description="This is an application that shows how to create an automatic CI pipeline that deploys the Django app to a DigitalOcean droplet"
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
ENV DEBUG=False
ENV ENVIRONMENT=staging
RUN apt-get update \
    && apt-get install -y build-essential \
    && apt-get install -y libpq-dev \
    && apt-get install -y gettext \
    && apt-get -y install netcat gcc postgresql \
    && apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \
    && rm -rf /var/lib/apt/lists/*
RUN pip3 install --upgrade pip
COPY requirements.txt $APP_HOME/requirements.txt
COPY . $APP_HOME/
RUN pip3 install -r $APP_HOME/requirements.txt
COPY /entrypoint /entrypoint
RUN sed -i 's/\r$//g' /entrypoint
RUN chmod +x /entrypoint
ENTRYPOINT ["/entrypoint"]
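The Dockerfile copies an entrypoint script named entrypoint from the project root and runs it when the container starts. The cloned base project should already ship this file; if you need to write one yourself, a minimal sketch (applying migrations and collecting static files before handing over to the container command) could look like this:
#!/bin/sh
# Stop on the first failing command.
set -e

# Apply pending database migrations.
python3 manage.py migrate --noinput

# Collect static files into $APP_HOME/staticfiles.
python3 manage.py collectstatic --noinput

# Hand over to the command defined in docker-compose.yaml (gunicorn).
exec "$@"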
Once done, create a new file named docker-compose.yaml at the root of the project and paste in the following lines of code.
version: '3.8'
services:
  web:
    build:
      context: .
      dockerfile: Dockerfile
    command: gunicorn do-guide.wsgi:application --bind 0.0.0.0:8000
    volumes:
      - static_volume:/app/staticfiles
    image: manulangat/auto-deploy-do-github-actions:latest
    ports:
      - "8000:8000"
    networks:
      - bms_staging
volumes:
  static_volume:
networks:
  bms_staging:
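Before wiring this into the pipeline, you can optionally confirm that the stack builds and runs locally, using the same commands the workflow will run:
docker-compose -f docker-compose.yaml up -d --build   # build the image and start the container
docker-compose -f docker-compose.yaml down --volumes  # tear it down again when done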
With this in place, you can now add the steps that build the Docker images to the GitHub Actions workflow file:
      - uses: actions/checkout@v2
      - name: Build the stack
        run: docker-compose -f docker-compose.yaml up -d --build
4. Deploy to Docker Hub.
We want the latest image to be pushed to Docker Hub. Log in to your Docker Hub account and create a new repository.
Grab the image name, e.g. manulangat/auto-deploy-do-github-actions:tagname, and head over to the workflow file. The login step needs your Docker Hub username and password, so add these as repository secrets on GitHub named DOCKER_USERNAME and DOCKER_PASSWORD.
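If you have the GitHub CLI installed and authenticated, you can also set the secrets from the terminal instead of the web UI:
gh secret set DOCKER_USERNAME --body "your-dockerhub-username"
gh secret set DOCKER_PASSWORD  # gh prompts for the value, so the password stays out of your shell history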
Once that is done, add a step to the workflow file that logs in to Docker Hub.
      - name: Login to DockerHub
        uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}
The workflow file should, at this point, resemble the one below:
name: Continuous Integration and Delivery
on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main, develop]
jobs:
  testing-docker-compose:
    runs-on: ubuntu-latest
    steps:
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v2
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      - uses: actions/checkout@v2
      - name: Build the stack
        run: docker-compose -f docker-compose.yaml up -d --build
      - name: Get docker logs
        run: docker ps
      - name: Login to DockerHub
        uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}
At this point, the step that pushes the built Docker image can be added after the Docker Hub login step.
      - name: Push to dockerhub
        run: |
          docker-compose push # this pushes the image to the repo that we have defined in the docker-compose.yaml file
And a step that stops the containers:
      - name: stop containers
        run: docker-compose -f docker-compose.yaml down --volumes
5. Auto-deploy to the DigitalOcean droplet.
Head over to the DigitalOcean console and log in. Once logged in, create a Docker droplet; for this tutorial, a shared CPU with 2 GB of RAM and a 50 GB SSD disk is sufficient. Give the droplet whatever name you see fit and hit the Create Droplet button.
Next, configure a firewall to open ports 8000 and 7000, which our Django app will use to communicate with the outside world. Go to the Networking tab, then the Firewalls option. Fill in a name, add custom TCP inbound rules for ports 8000 and 7000, attach the firewall to the droplet created above, and hit the Create Firewall button.
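If you prefer the command line, roughly the same setup can be done with doctl (the DigitalOcean CLI). The image slug, size slug, region and names below are assumptions for illustration; list the current valid values with doctl compute image list --public and doctl compute size list.
# Create a Docker droplet (slug, size, region and name are illustrative).
doctl compute droplet create django-ci-droplet \
  --image docker-20-04 \
  --size s-2vcpu-2gb \
  --region nyc1 \
  --ssh-keys <your-ssh-key-fingerprint>

# Open ports 22, 8000 and 7000 and attach the firewall to the droplet.
doctl compute firewall create \
  --name django-ci-firewall \
  --inbound-rules "protocol:tcp,ports:22,address:0.0.0.0/0 protocol:tcp,ports:8000,address:0.0.0.0/0 protocol:tcp,ports:7000,address:0.0.0.0/0" \
  --outbound-rules "protocol:tcp,ports:all,address:0.0.0.0/0 protocol:udp,ports:all,address:0.0.0.0/0" \
  --droplet-ids <your-droplet-id>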
With the droplet and firewall up and running, you can now add a step that automatically deploys the Django app to the DigitalOcean droplet, but before that you need to add a few secrets to the GitHub repo.
SSH into your DigitalOcean droplet:
ssh root@YOUR_DO_IP
Run the following command to generate a new SSH key pair that the pipeline will use to connect to the droplet:
ssh-keygen
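ssh-keygen prompts for a file location and a passphrase; accept the default location and leave the passphrase empty, since the deploy step as written does not supply one. If you prefer a non-interactive equivalent:
# Generate a 4096-bit RSA key at the default path with no passphrase.
ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/id_rsa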
To get the value of the generated private key, navigate to the .ssh folder:
cd ~/.ssh
Then print the private key:
cat id_rsa
Make sure you copy the entire contents, including the BEGIN and END lines, and add them to your repository secrets as DO_PRIVATE_KEY.
With that in place, print the public key:
cat id_rsa.pub
and add it to the authorized_keys file:
nano authorized_keys
Paste the contents, then save and close the file.
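Alternatively, you can append the public key in a single command instead of editing the file by hand:
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys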
Still on the droplet, go to the root user's home directory and create a new folder named django-ci-example (the folder the deploy step below will cd into). Inside it, create a .env file if need be, plus a docker-compose.staging.yaml file (the name the deploy step references).
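For example, from the root user's home directory:
mkdir django-ci-example
cd django-ci-example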
nano docker-compose.staging.yaml
And paste in the following lines of code. Note that, unlike the compose file in the repository, this copy has no build section: the droplet only pulls and runs the image that the pipeline pushes to Docker Hub.
version: '3.8'
services:
  web:
    command: gunicorn do-guide.wsgi:application --bind 0.0.0.0:8000
    volumes:
      - static_volume:/app/staticfiles
    image: manulangat/auto-deploy-do-github-actions:latest
    ports:
      - "8000:8000"
    networks:
      - django-ci-example
volumes:
  static_volume:
networks:
  django-ci-example:
Head over to your workflow file and add the following step:
      - name: Executing remote command and deployment to DigitalOcean for dev environment
        uses: appleboy/ssh-action@master
        with:
          host: "YOUR_DO_IP"
          username: "root"
          port: 22
          key: ${{ secrets.DO_PRIVATE_KEY }}
          script: |
            cd django-ci-example/
            docker system prune -af
            docker compose -f docker-compose.staging.yaml down --volumes
            echo "${{ secrets.DOCKER_PASSWORD }}" | docker login -u ${{ secrets.DOCKER_USERNAME }} --password-stdin
            docker system prune -af
            docker compose -f docker-compose.staging.yaml pull
            docker compose -f docker-compose.staging.yaml up --build --remove-orphans -d --force-recreate
At this point, your actions file should be similar to the one below.
name: Continuous Integration and Delivery
on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main, develop]
jobs:
  testing-docker-compose:
    runs-on: ubuntu-latest
    steps:
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v2
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      - uses: actions/checkout@v2
      - name: Build the stack
        run: docker-compose -f docker-compose.yaml up -d --build
      - name: Get docker logs
        run: docker ps
      - name: Login to DockerHub
        uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}
      - name: Push to dockerhub
        run: |
          docker-compose push
      - name: stop containers
        run: docker-compose -f docker-compose.yaml down --volumes
      - name: Executing remote command and deployment to DigitalOcean for dev environment
        uses: appleboy/ssh-action@master
        with:
          host: "YOUR_DO_IP"
          username: "root"
          port: 22
          key: ${{ secrets.DO_PRIVATE_KEY }}
          script: |
            cd django-ci-example/
            docker system prune -af
            docker compose -f docker-compose.staging.yaml down --volumes
            echo "${{ secrets.DOCKER_PASSWORD }}" | docker login -u ${{ secrets.DOCKER_USERNAME }} --password-stdin
            docker system prune -af
            docker compose -f docker-compose.staging.yaml pull
            docker compose -f docker-compose.staging.yaml up --build --remove-orphans -d --force-recreate
To test this out, commit and push your work; the workflow should be triggered. Once the run finishes, navigate to http://YOUR_DO_IP:8000 (the port mapped in the compose file) and you should be welcomed by the application's landing page.
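You can also check from the terminal that the droplet is serving the app:
curl http://YOUR_DO_IP:8000/   # should return the page's HTML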
The code for this project can be found in this GitHub repo: https://github.com/manulangat1/github-actions-do-article.
Kindly follow me to get instant notifications whenever I post articles and guides on DevOps and Django.
Happy hacking.