<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Mehedee Siddique</title>
    <description>The latest articles on DEV Community by Mehedee Siddique (@mehedees).</description>
    <link>https://dev.to/mehedees</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1139536%2Fde80b8be-ef81-4b6b-9edb-827cdf701204.jpg</url>
      <title>DEV Community: Mehedee Siddique</title>
      <link>https://dev.to/mehedees</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/mehedees"/>
    <language>en</language>
    <item>
      <title>Configuring The Celery App</title>
      <dc:creator>Mehedee Siddique</dc:creator>
      <pubDate>Mon, 18 Sep 2023 17:07:17 +0000</pubDate>
      <link>https://dev.to/mehedees/configuring-celery-app-and-tasks-44g3</link>
      <guid>https://dev.to/mehedees/configuring-celery-app-and-tasks-44g3</guid>
      <description>&lt;p&gt;This post is the 2nd of a multipart series on Celery, find the first one here &lt;a href="https://dev.to/mehedees/introduction-to-celery-57l7"&gt;Introduction to Celery&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this post we'll explore Celery &lt;a href="https://docs.celeryq.dev/en/stable/userguide/configuration.html"&gt;configurations&lt;/a&gt; and in the process learn a bit about how tasks should be designed. &lt;/p&gt;

&lt;p&gt;To recap, let's first set up the Celery app instance:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;celery&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Celery&lt;/span&gt;


&lt;span class="c1"&gt;# Broker(RabbitMQ) connection str
&lt;/span&gt;&lt;span class="n"&gt;CELERY_BROKER&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"pyamqp://user:Pass1234@rabbitmq:5672//"&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Result Backend(Redis) connection str
&lt;/span&gt;&lt;span class="n"&gt;CELERY_BACKEND&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"redis://redis:6379"&lt;/span&gt;


&lt;span class="c1"&gt;# Celery App instance
&lt;/span&gt;&lt;span class="n"&gt;celery_app&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Celery&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;__name__&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;broker&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;CELERY_BROKER&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;backend&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;CELERY_BACKEND&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;include&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;'app.tasks'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's talk about some configurations now. &lt;/p&gt;

&lt;h4&gt;
  
  
  task_acks_late (&lt;code&gt;boolean&lt;/code&gt;)
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;celery_app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;conf&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;task_acks_late&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;False&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;Disabled&lt;/code&gt; by default, meaning the worker acknowledges the task message right before it starts executing the task. So, if the worker crashes in the middle of executing a task, that task won't be redelivered to a worker again. If this setting is &lt;code&gt;enabled&lt;/code&gt;, the worker acknowledges the message only after the task completes, so any task lost to a worker crash mid-execution will be redelivered to a worker by the broker. &lt;br&gt;
This begs the question: why isn't the setting enabled by default? To answer that, we need to understand what an ideal task looks like. Ideally, a task should be &lt;code&gt;idempotent&lt;/code&gt;: no matter how many times the task is delivered to the worker for execution (due to a worker crash or a retry on failure), the end result is always the same. It is on us developers to write idempotent tasks, because Celery can't detect whether a task is idempotent, and in reality it might not always be possible. If we enable &lt;code&gt;task_acks_late&lt;/code&gt; on a non-idempotent task, we might face unwanted consequences from that task executing multiple times. For this reason, assuming our tasks may not be idempotent, Celery keeps the setting disabled by default. If we make sure our tasks are idempotent, we can benefit from enabling this setting, and it is recommended to do so. Read more on this &lt;a href="https://docs.celeryq.dev/en/stable/userguide/tasks.html"&gt;here&lt;/a&gt;.&lt;br&gt;
Applying this setting at the application level applies it to all tasks, but we can also set it at the task level with the &lt;code&gt;@celery_app.task(acks_late=True)&lt;/code&gt; parameter, which then takes precedence over the app-level setting for that task only. &lt;br&gt;
My preference is to keep this setting disabled, as it is by default, and to enable it at the task level for tasks that we are sure are idempotent and that must be redelivered to the worker after a crash. I also like to explicitly add this config to the app as &lt;code&gt;disabled&lt;/code&gt; to make the codebase more self-explanatory. &lt;/p&gt;
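
&lt;p&gt;To illustrate idempotency outside Celery, here is a minimal plain-Python sketch (the function names and data are hypothetical, not from any library). Re-running the idempotent operation leaves the same end state; the non-idempotent one does not:&lt;/p&gt;

```python
# Hypothetical illustration of why acks_late needs idempotent tasks:
# if the broker redelivers a task after a worker crash, the task body
# may run more than once.

subscribers = {}

def mark_subscribed(email):
    # Idempotent: writing the same key twice yields the same state.
    subscribers[email] = "subscribed"

credits = {"alice@example.com": 0}

def grant_signup_credit(email):
    # NOT idempotent: a redelivery grants the credit twice.
    credits[email] += 1

# Simulate a redelivery: each task body runs twice.
for _ in range(2):
    mark_subscribed("alice@example.com")
    grant_signup_credit("alice@example.com")

print(subscribers["alice@example.com"])  # subscribed
print(credits["alice@example.com"])      # 2, not the intended 1
```

&lt;p&gt;A task that is safe for &lt;code&gt;acks_late=True&lt;/code&gt; should behave like the first function: a redelivery after a crash must not corrupt state.&lt;/p&gt;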

&lt;h4&gt;
  
  
  task_ignore_result (&lt;code&gt;boolean&lt;/code&gt;)
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;celery_app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;conf&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;task_ignore_result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;False&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;Disabled&lt;/code&gt; by default, meaning task results &lt;em&gt;are&lt;/em&gt; saved to the result backend if you have one configured. Enabling this setting means results won't be stored for any task. &lt;br&gt;
Saving task results to the backend adds extra performance, memory and storage overhead. In &lt;em&gt;my opinion&lt;/em&gt;, we don't need results for most tasks and, if your code depends heavily on task results, you might not be getting the asynchronous-execution benefit of Celery. For example, say you are handling a newsletter signup POST request: you trigger an email-sending task from your web application and then wait for the task to finish in the worker before moving on with the request. In that case your code still behaves as if the email-sending task were executed in the current application process, not in a separate worker. Having said that, there will still be many valid cases where keeping and tracking task results is necessary or required (some Celery Canvas workflows depend on task results). &lt;/p&gt;

&lt;p&gt;This setting is also available at the task level, and the task-level setting takes precedence for that task. &lt;/p&gt;

&lt;p&gt;My preference is to store results only where necessary: enable &lt;code&gt;task_ignore_result&lt;/code&gt; at the app level and override it with &lt;code&gt;@celery_app.task(ignore_result=False)&lt;/code&gt; for the specific tasks whose results we need. We should also check &lt;a href="https://docs.celeryq.dev/en/stable/userguide/configuration.html#task-store-errors-even-if-ignored"&gt;task_store_errors_even_if_ignored&lt;/a&gt;. &lt;/p&gt;

&lt;h4&gt;
  
  
  result_expires (&lt;code&gt;int(seconds) | timedelta&lt;/code&gt;)
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;celery_app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;conf&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;result_expires&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;300&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;24 hours&lt;/code&gt; by default. After the configured time, the result is deleted from the result backend. We should set this to the minimum time for which we need to access results; setting an unnecessarily large value adds performance, memory and storage overhead. &lt;/p&gt;
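
&lt;p&gt;The setting accepts either an integer number of seconds or a &lt;code&gt;datetime.timedelta&lt;/code&gt;; the two forms below are equivalent (5 minutes):&lt;/p&gt;

```python
from datetime import timedelta

# Equivalent ways to express a 5-minute result expiry, e.g.
#   celery_app.conf.result_expires = 300
#   celery_app.conf.result_expires = timedelta(minutes=5)
expires_seconds = 300
expires_delta = timedelta(minutes=5)

print(expires_delta.total_seconds())  # 300.0
```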

&lt;h4&gt;
  
  
  worker_concurrency (&lt;code&gt;int&lt;/code&gt;)
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;celery_app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;conf&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;worker_concurrency&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;Number of CPU cores&lt;/code&gt; by default. This setting defines how many concurrent processes the worker uses to execute tasks when no concurrency pool is specified or &lt;code&gt;prefork&lt;/code&gt; is used explicitly. If the concurrency pool is either &lt;code&gt;eventlet&lt;/code&gt; or &lt;code&gt;gevent&lt;/code&gt;, this setting denotes the number of green threads spawned. &lt;/p&gt;

&lt;p&gt;Celery's concurrency pools are a separate discussion, and there is a nice &lt;a href="https://distributedpython.com/posts/celery-execution-pools-what-is-it-all-about/"&gt;article&lt;/a&gt; about them. The rule of thumb: for CPU-bound tasks use the &lt;code&gt;prefork&lt;/code&gt; pool, and for I/O-bound tasks use &lt;code&gt;gevent/eventlet&lt;/code&gt;. Don't use more processes than available CPU cores with the &lt;code&gt;prefork&lt;/code&gt; (default) pool but, with the &lt;code&gt;gevent/eventlet&lt;/code&gt; pools, you can use hundreds or even thousands of green threads. &lt;/p&gt;
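
&lt;p&gt;The pool is normally chosen when starting the worker. As a sketch (the app path follows this series' example; the concurrency numbers are illustrative only):&lt;/p&gt;

```shell
# CPU-bound tasks: prefork (the default), capped at the CPU core count
celery -A app.worker.celery_app worker --pool=prefork --concurrency=4

# I/O-bound tasks: a green-thread pool can go much higher
# (requires the gevent package to be installed)
celery -A app.worker.celery_app worker --pool=gevent --concurrency=200
```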

&lt;h4&gt;
  
  
  broker_pool_limit (&lt;code&gt;int | None&lt;/code&gt;)
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;celery_app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;conf&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;broker_pool_limit&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;30&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;10&lt;/code&gt; connections by default. This setting denotes the maximum number of connections that can be open in the broker connection pool. The number should be sized according to how many threads/green threads (eventlet/gevent) are accessing a connection.&lt;/p&gt;

&lt;h4&gt;
  
  
  worker_prefetch_multiplier (&lt;code&gt;int&lt;/code&gt;)
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;celery_app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;conf&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;worker_prefetch_multiplier&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;4&lt;/code&gt; by default, meaning the worker will prefetch 4 task messages per unit of concurrency; e.g. if &lt;code&gt;worker_concurrency&lt;/code&gt; is 4 and &lt;code&gt;worker_prefetch_multiplier&lt;/code&gt; is 5, then 20 tasks in total will be prefetched by the worker. &lt;br&gt;
Prefetching tasks reduces the time a worker sits idle between completing one task and starting the next but, prefetching too many tasks can cause a performance overhead.&lt;/p&gt;
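
&lt;p&gt;The arithmetic above is simple multiplication; a quick check of the example numbers:&lt;/p&gt;

```python
# A worker reserves roughly concurrency * prefetch_multiplier task
# messages from the broker at a time.
worker_concurrency = 4
worker_prefetch_multiplier = 5

prefetch_total = worker_concurrency * worker_prefetch_multiplier
print(prefetch_total)  # 20
```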

&lt;p&gt;These are most of the configs we need in most cases, and I hope I didn't miss any. Let me know if you think I should have mentioned other configs. We should also go through &lt;a href="https://docs.celeryq.dev/en/stable/userguide/configuration.html"&gt;Celery's complete list of configs&lt;/a&gt; and play with them a bit to understand them better. &lt;/p&gt;

&lt;p&gt;See you in the next post. &lt;/p&gt;

</description>
      <category>python</category>
      <category>celery</category>
      <category>tutorial</category>
      <category>conf</category>
    </item>
    <item>
      <title>Introduction to Celery</title>
      <dc:creator>Mehedee Siddique</dc:creator>
      <pubDate>Wed, 13 Sep 2023 19:38:53 +0000</pubDate>
      <link>https://dev.to/mehedees/introduction-to-celery-57l7</link>
      <guid>https://dev.to/mehedees/introduction-to-celery-57l7</guid>
      <description>&lt;p&gt;&lt;strong&gt;&lt;a href="https://docs.celeryq.dev/en/stable/index.html"&gt;Celery&lt;/a&gt;&lt;/strong&gt; is an &lt;a href="https://docs.celeryq.dev/en/stable/getting-started/introduction.html#what-s-a-task-queue"&gt;asynchronous task queue&lt;/a&gt; for python. We mostly use it to run tasks outside the cycle of our regular application, e.g. &lt;a href="https://en.wikipedia.org/wiki/Request%E2%80%93response"&gt;HTTP request-response cycle&lt;/a&gt;. Also, we can use Celery to schedule tasks for a specific time or run periodic tasks, as in &lt;a href="https://en.wikipedia.org/wiki/Cron"&gt;cron-jobs&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;A classic example is sending an email on user signup. You want to send users an on-boarding email when they sign up, but that means you'd need to hold the request, post the email, wait for a success response for the email sending and only then return a response to the user. Sending an email is a network call, so it might take some time, during which your application and the request sit idle, and so does the user, wondering why the hell they are signing up for this shitty application! &lt;/p&gt;

&lt;p&gt;Here comes Celery to the rescue. With Celery you can just publish an email-sending task to a &lt;code&gt;Celery Worker&lt;/code&gt; (a Celery application containing your email-sending task's code, more on this later), finish the signup process and return a response to the user, all while the Celery worker does the work of sending the email. This way your request-response lifecycle is cut short and everyone is happy. &lt;/p&gt;
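
&lt;p&gt;Celery aside, the hand-off pattern can be sketched with a plain thread pool (all names here are hypothetical): the handler returns immediately while the "worker" finishes in the background.&lt;/p&gt;

```python
from concurrent.futures import ThreadPoolExecutor
import time

# A stand-in for the email-sending task.
def send_welcome_email(email):
    time.sleep(0.1)  # pretend this is a slow network call
    return f"sent to {email}"

executor = ThreadPoolExecutor(max_workers=1)  # our "worker"

def signup(email):
    # Hand the slow work off and return to the user right away,
    # just like publishing a task to a Celery worker.
    future = executor.submit(send_welcome_email, email)
    return {"success": True, "pending": future}

response = signup("alice@example.com")
print(response["success"])           # True, returned without waiting
print(response["pending"].result())  # sent to alice@example.com
```

&lt;p&gt;Celery generalizes this idea: the "pool" is a separate worker process (possibly on another machine), and the hand-off goes through a message broker instead of an in-process queue.&lt;/p&gt;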

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EN5HPENF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lgrk7b51aeoo3caci453.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EN5HPENF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lgrk7b51aeoo3caci453.png" alt="Async Job Execution" width="800" height="529"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Celery has two main components and one optional component:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Broker&lt;/strong&gt; - as the name suggests, an intermediary between the client application/s and worker/s.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Worker&lt;/strong&gt; - where the task actually executes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Result Backend aka. Backend&lt;/strong&gt; (Optional) - where the results of executed tasks are stored for retrieval.&lt;/p&gt;

&lt;h3&gt;
  
  
  Broker
&lt;/h3&gt;

&lt;p&gt;The &lt;a href="https://docs.celeryq.dev/en/stable/getting-started/backends-and-brokers/index.html"&gt;Broker&lt;/a&gt; is responsible for receiving task messages from the client application/s, queue them and deliver them to available worker/s. It works as an intermediary. Broker is also responsible for maintaining multiple queues according to your need and priority. It also does the job of redelivering failed tasks to workers for retrying(if configured) and more. The most popular broker used with Celery is &lt;code&gt;RabbitMQ&lt;/code&gt;. &lt;code&gt;Redis&lt;/code&gt; can also be used as broker. &lt;/p&gt;

&lt;h3&gt;
  
  
  Worker
&lt;/h3&gt;

&lt;p&gt;The &lt;a href="https://docs.celeryq.dev/en/stable/getting-started/first-steps-with-celery.html#application"&gt;worker&lt;/a&gt; is the executor of your tasks. There can be multiple workers running at once. Workers constantly monitor broker for new tasks. Whenever a worker has a free slot, it checks with broker for new tasks and picks them if available. Worker then executes the tasks and checks with broker for new tasks again. &lt;/p&gt;

&lt;h3&gt;
  
  
  Result Backend
&lt;/h3&gt;

&lt;p&gt;Celery gives you the option to &lt;a href="https://docs.celeryq.dev/en/stable/getting-started/backends-and-brokers/index.html"&gt;store the result&lt;/a&gt; of your tasks. This is achieved with a result backend, which stores the task's execution state along with the return value of your task. A backend is also necessary for some of the Canvas workflows (more on this later). One of the most popular choices for the result backend is &lt;code&gt;Redis&lt;/code&gt;, because its super fast key-value storage makes fetching a task's result very efficient. &lt;code&gt;RabbitMQ&lt;/code&gt; can also be used as a backend. &lt;/p&gt;

&lt;p&gt;Let's jump into some actual code now. We will use RabbitMQ as the broker and Redis as the backend. Let's create an &lt;code&gt;app&lt;/code&gt; directory inside our content root directory and create a Python file &lt;code&gt;worker.py&lt;/code&gt; in it. &lt;/p&gt;

&lt;p&gt;First we create &lt;code&gt;connection strings&lt;/code&gt; for the RabbitMQ broker and the Redis backend. Then we pass them to create a &lt;code&gt;celery_app&lt;/code&gt; instance.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from celery import Celery


# Broker(RabbitMQ) connection str
CELERY_BROKER: str = (
    f"pyamqp://user:Pass1234@rabbitmq:5672//"
)

# Result Backend(Redis) connection str
CELERY_BACKEND: str = f"redis://redis:6379"


# Celery App instance
celery_app = Celery(
    __name__, broker=CELERY_BROKER, backend=CELERY_BACKEND
)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Our Celery application is ready. Now let's make a task for it to execute. &lt;/p&gt;

&lt;p&gt;First, we will define a function &lt;code&gt;send_newsletter_welcome_email_task&lt;/code&gt; that accepts an email address string and sends a welcome email to that address. Let's import and use &lt;code&gt;smtplib&lt;/code&gt; and &lt;code&gt;loguru&lt;/code&gt; for email sending and logging. Then we will import the &lt;code&gt;celery_app&lt;/code&gt; instance from &lt;code&gt;worker.py&lt;/code&gt; and decorate our email-sending function with &lt;code&gt;@celery_app.task()&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from loguru import logger
from smtplib import SMTP

from .worker import celery_app


@celery_app.task()
def send_newsletter_welcome_email_task(email: str):
    logger.info(f"Send welcome email task received")
    with SMTP(
            host="smtp.freesmtpservers.com",
            port=25,
            timeout=60,
    ) as smtp:
        from_addr = "newsletter@mehedees.dev"
        smtp.sendmail(
            from_addr=from_addr,
            to_addrs=email,
            msg=f"To:{email}\nFrom: {from_addr}\r\nSubject: Welcome\n\nWelcome to the newsletter!",
        )
        logger.info("Email successfully sent")
    logger.info(f"Send welcome email task finished")
    return email
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Our task is ready. Now we will define a mock web app to trigger our task.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from fastapi import FastAPI, Body
from loguru import logger

from .tasks import send_newsletter_welcome_email_task


app = FastAPI(
    debug=True,
    title="Test Python Celery",
    description="Test basics of Python Celery",
    openapi_url="/openapi.json",
)


@app.post(path='/newsletter/signup')
async def newsletter_signup(email: str = Body(embed=True)):
    logger.info(f"Received newsletter signup request from {email}")
    # Doing some processing bla bla bla
    logger.info("Initiating welcome email sending task")
    send_newsletter_welcome_email_task.delay(email)
    # Return response now, celery will take care of sending the welcome mail
    return {
        'success': 'True',
        'code': 200,
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We have created a quick &lt;code&gt;FastAPI&lt;/code&gt; web application with a newsletter signup route and imported our task into it. The view function does some dummy processing and then calls the email-sending task. &lt;br&gt;
To &lt;a href="https://docs.celeryq.dev/en/stable/userguide/calling.html#guide-calling"&gt;call the task&lt;/a&gt; we call &lt;code&gt;.delay()&lt;/code&gt; on it: &lt;code&gt;send_newsletter_welcome_email_task.delay(email)&lt;/code&gt;. When triggered, the task message is sent to the broker (RabbitMQ), the broker delivers it to an available worker, and the worker executes the task, resulting in the welcome email being sent. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;We could also trigger the task with &lt;code&gt;.apply_async([email,])&lt;/code&gt;. &lt;a href="https://docs.celeryq.dev/en/stable/reference/celery.app.task.html#celery.app.task.Task.delay"&gt;.delay()&lt;/a&gt; is actually a shortcut for &lt;a href="https://docs.celeryq.dev/en/stable/reference/celery.app.task.html#celery.app.task.Task.apply_async"&gt;.apply_async()&lt;/a&gt;. &lt;/p&gt;
&lt;/blockquote&gt;
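
&lt;p&gt;As a sketch of what the longer form buys us, &lt;code&gt;.apply_async()&lt;/code&gt; also accepts execution options that &lt;code&gt;.delay()&lt;/code&gt; can't pass; &lt;code&gt;countdown&lt;/code&gt; and &lt;code&gt;expires&lt;/code&gt; below are options from Celery's calling API (this fragment assumes the task and &lt;code&gt;email&lt;/code&gt; variable from the example above and needs a running broker to actually execute):&lt;/p&gt;

```python
# .delay(email) is shorthand for .apply_async(args=[email]).
# apply_async additionally accepts execution options, e.g.:
send_newsletter_welcome_email_task.apply_async(
    args=[email],
    countdown=10,  # don't execute before 10 seconds from now
    expires=300,   # discard the task if not started within 5 minutes
)
```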

&lt;p&gt;Our code is now almost ready. We have defined the Celery app instance, defined a task, and created a web app to trigger the task. One modification is left for the Celery app instance: we need to introduce our task to the application beforehand, otherwise our worker won't recognize the task that is delivered to it for execution. Let's do it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from celery import Celery


# Broker(RabbitMQ) connection str
CELERY_BROKER: str = (
    f"pyamqp://user:Pass1234@rabbitmq:5672//"
)

# Result Backend(Redis) connection str
CELERY_BACKEND: str = f"redis://redis:6379"


# Celery App instance
celery_app = Celery(
    __name__, broker=CELERY_BROKER, backend=CELERY_BACKEND
)

# Autodiscovery of defined tasks
celery_app.autodiscover_tasks(packages=['app'])
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The last line discovers modules named &lt;code&gt;tasks.py&lt;/code&gt; inside the package paths provided to the &lt;code&gt;autodiscover_tasks&lt;/code&gt; method. &lt;/p&gt;

&lt;p&gt;Our code is ready now. Let's run our Celery worker with the following command:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;celery -A app.worker.celery_app worker --loglevel=INFO&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;We will see something like&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/usr/local/lib/python3.11/site-packages/celery/platforms.py:829: SecurityWarning: You're running the worker with superuser privileges: this is
2023-09-13T18:28:38.979250505Z absolutely not recommended!
2023-09-13T18:28:38.979365023Z 
2023-09-13T18:28:38.979406969Z Please specify a different user using the --uid option.
2023-09-13T18:28:38.979557797Z 
2023-09-13T18:28:38.979587460Z User information: uid=0 euid=0 gid=0 egid=0
2023-09-13T18:28:38.979641167Z 
2023-09-13T18:28:38.979666256Z   warnings.warn(SecurityWarning(ROOT_DISCOURAGED.format(
2023-09-13T18:28:39.124459733Z  
2023-09-13T18:28:39.124513560Z  -------------- celery@e96b71c99d22 v5.3.1 (emerald-rush)
2023-09-13T18:28:39.124523657Z --- ***** ----- 
2023-09-13T18:28:39.124527360Z -- ******* ---- Linux-5.15.49-linuxkit-x86_64-with 2023-09-13 18:28:39
2023-09-13T18:28:39.124530601Z - *** --- * --- 
2023-09-13T18:28:39.124533566Z - ** ---------- [config]
2023-09-13T18:28:39.124536551Z - ** ---------- .&amp;gt; app:         app.worker:0x7fcc3b5f9cd0
2023-09-13T18:28:39.124539840Z - ** ---------- .&amp;gt; transport:   amqp://user:**@rabbitmq:5672//
2023-09-13T18:28:39.124542931Z - ** ---------- .&amp;gt; results:     redis://redis:6379/
2023-09-13T18:28:39.124546424Z - *** --- * --- .&amp;gt; concurrency: 6 (prefork)
2023-09-13T18:28:39.124549579Z -- ******* ---- .&amp;gt; task events: OFF (enable -E to monitor tasks in this worker)
2023-09-13T18:28:39.124552584Z --- ***** ----- 
2023-09-13T18:28:39.124555464Z  -------------- [queues]
2023-09-13T18:28:39.124558372Z                 .&amp;gt; celery           exchange=celery(direct) key=celery
2023-09-13T18:28:39.124561364Z                 
2023-09-13T18:28:39.124564209Z 
2023-09-13T18:28:39.124567139Z [tasks]
2023-09-13T18:28:39.124570067Z   . app.tasks.send_newsletter_welcome_email_task

[2023-09-13 18:29:10,192: INFO/MainProcess] mingle: searching for neighbors
2023-09-13T18:29:11.223127806Z [2023-09-13 18:29:11,222: INFO/MainProcess] mingle: all alone
2023-09-13T18:29:11.268981135Z [2023-09-13 18:29:11,268: INFO/MainProcess] celery@e96b71c99d22 ready.
2023-09-13T18:29:14.031251535Z [2023-09-13 18:29:14,030: INFO/MainProcess] Events of group {task} enabled by remote.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Our worker is now running. Let's now run our FastAPI app on port &lt;code&gt;12345&lt;/code&gt;: &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;uvicorn app.app:app --reload --workers 1 --host 0.0.0.0 --port 12345&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;We will see something like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;INFO:     Will watch for changes in these directories: ['/app']
2023-09-13T18:28:36.833486177Z INFO:     Uvicorn running on http://0.0.0.0:12345 (Press CTRL+C to quit)
2023-09-13T18:28:36.833500622Z INFO:     Started reloader process [1] using WatchFiles
2023-09-13T18:28:39.603132930Z INFO:     Started server process [8]
2023-09-13T18:28:39.603200518Z INFO:     Waiting for application startup.
2023-09-13T18:28:39.604262220Z INFO:     Application startup complete.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;All is ready now. Let's finally test whether Celery can execute our task asynchronously. We will now post an email address to our newsletter signup route:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;curl -d '{"email":"&lt;a href="mailto:test.celery@mehedees.dev"&gt;test.celery@mehedees.dev&lt;/a&gt;"}' -H "Content-Type: application/json" -X POST &lt;a href="http://0.0.0.0:12345/newsletter/signup"&gt;http://0.0.0.0:12345/newsletter/signup&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Our mock app logs:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;2023-09-13 18:31:28.498 | INFO     | app.app:newsletter_signup:17 - Received newsletter signup request from test.celery@mehedees.dev
2023-09-13T18:31:28.499976407Z 2023-09-13 18:31:28.499 | INFO     | app.app:newsletter_signup:19 - Initiating welcome email sending task
2023-09-13T18:31:29.099523900Z INFO:     172.18.0.1:61574 - "POST /newsletter/signup HTTP/1.1" 200 OK
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The mock app has triggered the email-sending task (sent it to the broker) and returned a response without waiting for the blocking email-sending work to finish. Meanwhile, the Celery worker logs:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[2023-09-13 18:31:29,103: INFO/MainProcess] Task app.tasks.send_newsletter_welcome_email_task[cb2f6812-16d6-48ec-9e04-fa20b57bfead] received
2023-09-13T18:31:29.141559704Z 2023-09-13 18:31:29.129 | INFO     | app.tasks:send_newsletter_welcome_email_task:9 - Send welcome email task received
2023-09-13T18:31:31.923187151Z 2023-09-13 18:31:31.922 | INFO     | app.tasks:send_newsletter_welcome_email_task:21 - Email successfully sent
2023-09-13T18:31:32.237598103Z 2023-09-13 18:31:32.237 | INFO     | app.tasks:send_newsletter_welcome_email_task:22 - Send welcome email task finished
2023-09-13T18:31:32.319138219Z [2023-09-13 18:31:32,313: INFO/ForkPoolWorker-4] Task app.tasks.send_newsletter_welcome_email_task[cb2f6812-16d6-48ec-9e04-fa20b57bfead] succeeded in 3.189990901999977s: 'test.celery@mehedees.dev'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The worker has successfully received the task from the broker and executed it asynchronously.&lt;/p&gt;

&lt;p&gt;You can find the fully functional and dockerized codebase &lt;a href="https://github.com/mehedees/test-python-celery/tree/basic-usage"&gt;here&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;I originally started this article to share my experience with Celery Canvas workflows and concurrency options, but then realized an introductory article might be helpful first. So, I'll be writing at least two more articles: one focusing on production-ready Celery app development and another on advanced features and issues. &lt;/p&gt;

&lt;p&gt;Constructive criticisms and feedback are welcome! &lt;/p&gt;

&lt;p&gt;Keep learning, never settle (sorry, OnePlus) and, till I see you again!&lt;/p&gt;

&lt;p&gt;Edit: The 2nd post on &lt;a href="https://dev.to/mehedees/configuring-celery-app-and-tasks-44g3"&gt;Configuring Celery&lt;/a&gt; is now available. &lt;/p&gt;

</description>
      <category>python</category>
      <category>celery</category>
      <category>beginners</category>
      <category>tutorial</category>
    </item>
  </channel>
</rss>
