Lukas Klein

Counting the queued Celery tasks

If you are using Redis as the broker for your Celery queues, there's a way to count the number of queued tasks grouped by task name without any external tool. Note that this approach assumes you're using the JSON serializer for your tasks.
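
For context, this is the kind of setup the post assumes. A minimal sketch (the project name and broker URL are placeholders, adjust them to your deployment):

from celery import Celery

# Redis as the broker and JSON as the serializer -- the messages in the
# "celery" list are then plain JSON strings, which is what jq parses below.
app = Celery("myproject", broker="redis://localhost:6379/0")
app.conf.update(
    task_serializer="json",
    accept_content=["json"],
)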

Using a Redis client of your choice (I chose redli, which supports TLS), use LRANGE to get a list of all queued messages, pipe them through jq to extract headers.task, then sort and count the unique task names:

> redli [your connection parameters] lrange celery 0 30000 | jq '.headers.task' | sort | uniq -c

 533 "devices.adapters.lorawan.tasks.run_lora_payload_decoder"
   2 "devices.adapters.particle.tasks.run_particle_payload_decoder"
  92 "devices.tasks.call_device_function"
8556 "devices.tasks.ping_device"
9682 "devices.tasks.process_device_field_rules"
   5 "devices.tasks.send_device_offline_email"
   2 "dzeroos.tasks.call_command"
   8 "dzeroos.tasks.flush_command_queue"
   1 "dzeroos.tasks.publish_device_config"
   1 "dzeroos.tasks.publish_protobuf_config"
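
If you'd rather do the same thing from Python, here's a rough equivalent of the pipeline above using redis-py and collections.Counter. It assumes the default "celery" queue name and a locally reachable Redis; adjust the connection URL to your setup:

import json
from collections import Counter

import redis

r = redis.Redis.from_url("redis://localhost:6379/0")

# Count the queued messages grouped by the task name in the JSON headers.
counts = Counter()
for raw in r.lrange("celery", 0, 30000):
    message = json.loads(raw)
    counts[message["headers"]["task"]] += 1

for task, n in counts.most_common():
    print(f"{n:>6} {task}")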

Bonus: statistics on task details

In my case, I needed some insight into one of the parameters of a particular task (I was investigating a loop that caused a long queue). This required some more jq and bash magic and probably doesn't fit your use case, but I'll paste it here for reference:

> redli [your connection parameters] lrange celery 0 1000 | jq -c '. | select(.headers.task | contains("taskname")) .body' | while read -r line; do echo "$line" | tr -d '"' | base64 -D | jq '.[0][0]'; done | sort | uniq -c

  15 "04576f6e-d5d1-45f4-8eef-a17e015335f4"
   9 "05264cc7-ae60-4f4f-9a18-2451e8d83f65"
  25 "4e240129-b84e-4e70-9f85-0e06f7a01875"
 224 "6c6a9aeb-10c7-417f-a928-791399d8adb9"
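
For reference, the same analysis as a Python sketch. "taskname" is a placeholder for the task you're inspecting, and the body decoding assumes Celery's protocol v2 layout, where the base64-encoded body is a JSON array of [args, kwargs, options] (hence args[0] below, mirroring .[0][0] in the jq query):

import base64
import json
from collections import Counter

import redis

r = redis.Redis.from_url("redis://localhost:6379/0")

# Tally the first positional argument of every queued message for one task.
first_args = Counter()
for raw in r.lrange("celery", 0, 1000):
    message = json.loads(raw)
    if "taskname" not in message["headers"]["task"]:
        continue
    args, _kwargs, _options = json.loads(base64.b64decode(message["body"]))
    first_args[args[0]] += 1

for value, n in first_args.most_common():
    print(f"{n:>5} {value}")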
