parmeshwar rathod
😎 Scale your Django App with Celery

Here, I will demonstrate how I integrated Celery into my Django app after reading many frustrating documentation pages. Trust me, you will find this is the easiest way.

Requirements:

  • a running Django app (if you don't have one, clone this repo)
  • celery
  • celery-beat
  • docker(optional) or redis-server

Celery in a nutshell:

Celery-flow-diagram

Celery is a distributed task queue that helps manage and execute tasks in the background.

To send and receive tasks, Celery requires a message broker such as RabbitMQ or Redis. Celery worker nodes are used for offloading data-intensive processes to the background, making applications more efficient. Celery is highly available, and a single Celery worker can process millions of tasks a minute. As Celery workers perform critical tasks at scale, it is also important to monitor their performance. Each worker continuously looks for tasks in the broker queue, picks one up, and spins up a child process to execute it.
We will be using Redis as the message broker because it is easy to set up and popular.

Celery beat is a scheduler: it kicks off tasks at regular intervals, which are then executed by the available worker nodes in the cluster.

So let's install celery and django_celery_beat:

$ pip install celery django_celery_beat

After installing celery and django_celery_beat, let's do some configuration in the Django project.

First, create a 'celery.py' file at the root of your project app. For illustration, if 'myapp' is your root project, then create 'celery.py' under the 'myapp' folder.

# celery.py

import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myapp.settings")

app = Celery("myapp")

# Read all CELERY_*-prefixed settings from Django's settings.py.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Auto-discover tasks.py modules in all installed apps.
app.autodiscover_tasks()

# Store schedules in the database via django_celery_beat.
app.conf.beat_scheduler = 'django_celery_beat.schedulers:DatabaseScheduler'

@app.task(bind=True, ignore_result=True)
def debug_task(self):
    print(f'Request: {self.request!r}')


In the above file, we have set some configurations that are important for the Celery app to run. We have used 'DatabaseScheduler' as the scheduler, which stores all schedules in the database. Alternatively, you can use 'PersistentScheduler', which stores them in a local file instead.

Make the following change in the __init__.py file, as below:

# myapp/__init__.py

from .celery import app as celery_app

__all__ = ("celery_app",)

Now add some variables and the installed app to the settings.py file:

#settings.py
...
INSTALLED_APPS = [
...
    'django_celery_beat',
]
...

CELERY_BROKER_URL = os.environ.get("CELERY_BROKER", "redis://127.0.0.1:6379/0")
CELERY_RESULT_BACKEND = os.environ.get("CELERY_BACKEND", "redis://127.0.0.1:6379/0")
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'UTC'        
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60


After all the configuration, apply the migrations using the command below (tables will be created to store tasks and schedules):

$ python manage.py migrate

Now start Redis. With Docker, the command below pulls the Redis image and runs a container in detached mode on port 6379:

$ docker run -d -p 6379:6379 redis

If you don't have Docker, you can simply download Redis from its official site.

To start the redis-server:
$ sudo service redis-server restart

Check that it is running:
$ redis-cli ping
The output will be:
PONG

All configurations are done, so let's start Celery and beat.
To start a Celery worker node:
$ celery -A <app-name> worker -l INFO
output:

Celery worker started

Start beat for the scheduler:
$ celery -A <app-name> beat -l INFO
output:
Celery beat started

OR

You can run both the worker and beat simultaneously using one command:
$ celery -A <app-name> worker --beat -l info

Let's schedule a task using the Django admin panel. Add a new task under the 'Periodic Tasks' table.
Task schedule

Specify the date and time at which you would like the trigger to run
Datetime schedule

Finally, we have scheduled and executed the debug task.

Task execution
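The admin clicks above can also be done programmatically with the django_celery_beat models. A sketch to run in `python manage.py shell`; the 10-second interval, the task's display name, and the "myapp.celery.debug_task" registered name are illustrative:

```python
# Run in `python manage.py shell` -- creates a periodic task in the database
from django_celery_beat.models import IntervalSchedule, PeriodicTask

# An interval row: every 10 seconds (illustrative)
schedule, _ = IntervalSchedule.objects.get_or_create(
    every=10,
    period=IntervalSchedule.SECONDS,
)

# The periodic task itself, pointing at the debug task by its registered name
PeriodicTask.objects.get_or_create(
    interval=schedule,
    name="Debug task every 10s",
    task="myapp.celery.debug_task",
)
```

Beat's DatabaseScheduler picks up new rows automatically, so the task starts firing without a restart.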

Hurray!
