In earlier articles I wrote about using Celery, but I never described how to install and configure it.
Prepare
- You have a Django project
- You want to connect the project with `celery`
Short description
- Install a broker (`redis`) globally
- Install the `celery` Python library
- Configure Celery in `settings`
- Write an `async` function
- Run Celery
Detail
- Install `Redis`:

```shell
sudo apt install redis-server
```
Check:

```shell
➜ ~ redis-cli
127.0.0.1:6379> ping
PONG
```
- Install `celery`:

```shell
pip install celery
```
- Create a config file. Don't name it `celery.py`, because that can cause name conflicts. For example, `run_celery.py`:
```python
from __future__ import absolute_import, unicode_literals

import os

from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'settings')

app = Celery('my_celery_application')
app.config_from_object('django.conf:settings')
# Let Celery find @app.task functions defined in your installed apps
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
```
- `settings.py`:

```python
BROKER_URL = 'redis://127.0.0.1:6379/0'
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
```
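With `config_from_object('django.conf:settings')` and no namespace, Celery reads the old-style uppercase names such as `BROKER_URL` above. A common alternative, and the convention in the Celery 4+ documentation, is to pass a `CELERY_` namespace so Celery settings are clearly separated from Django's own; a minimal sketch:

```python
# run_celery.py — read only settings prefixed with CELERY_
app.config_from_object('django.conf:settings', namespace='CELERY')

# settings.py — note the prefix on every key
CELERY_BROKER_URL = 'redis://127.0.0.1:6379/0'
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379'
```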
- Check everything:

```shell
(ienv) ➜ project git:(master) ✗ celery -A run_celery worker -l info

 -------------- celery@MacBook-Pro-User1.local v5.0.2 (singularity)
--- ***** -----
-- ******* ---- macOS-10.16-x86_64-i386-64bit 2021-02-14 13:26:14
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         ptt:0x7fe9f133c2e0
- ** ---------- .> transport:   redis://127.0.0.1:6379/0
- ** ---------- .> results:     redis://127.0.0.1:6379/
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . apps.centrifuge.centrifuge.send_sms_async
```
- Write your async task:

```python
from run_celery import app

from .models import PhoneCode


@app.task
def send_sms_async(identifier: int):
    code = PhoneCode.objects.filter(pk=identifier).first()
    if code:
        # SMSProviderBase and Test are assumed to be your own provider
        # classes, imported from elsewhere in the project
        provider: SMSProviderBase = Test()
        provider.send_private_sms(code.phone, code.code)
```
Conclusion
- Celery with Redis (or another backend) is a good way to add asynchronous work to your Django projects
- You may create as many tasks as you wish
- You may choose a class-based style for Celery tasks
- Use `flower` to monitor and control your asynchronous tasks
- Run Celery via `supervisor` (`-B` enables periodic tasks):

```ini
[program:example-celery]
command = <path/to/your/project/ienv>/bin/celery --app=run_celery:app worker --loglevel=INFO -B
directory = <path/to/your/project/>
user = <user>
stdout_logfile = <path/to/your/project/ienv>/logs/celery.log
redirect_stderr = true
environment = LANG=en_US.UTF-8,LC_ALL=en_US.UTF-8
```
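For the `flower` monitoring UI mentioned above, a typical setup looks like this (the port is just an example):

```shell
pip install flower
celery -A run_celery flower --port=5555
# then open http://localhost:5555 in your browser
```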
Thanks for reading