
Fatjon Lala

Celery worker on Kubernetes error (permission denied)

Hi everyone,

I am trying to set up Airflow on Kubernetes without the Helm chart, using plain deployment manifests instead. I have successfully got the webserver, scheduler, and triggerer running, talking to each other and to the PostgreSQL database.

Currently I am trying to set up the Celery worker, but I am getting a permission-denied error from the billiard library, as shown below:

[2023-12-08 09:36:50,561: CRITICAL/MainProcess] Unrecoverable error: FileNotFoundError(2, 'No such file or directory')
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/worker/worker.py", line 202, in start
    self.blueprint.start(self)
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/bootsteps.py", line 116, in start
    step.start(parent)
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/bootsteps.py", line 365, in start
    return self.obj.start()
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/concurrency/base.py", line 130, in start
    self.on_start()
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/concurrency/prefork.py", line 109, in on_start
    P = self._pool = Pool(processes=self.limit,
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/concurrency/asynpool.py", line 464, in __init__
    super().__init__(processes, *args, **kwargs)
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/pool.py", line 1045, in __init__
    self._create_worker_process(i)
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/concurrency/asynpool.py", line 482, in _create_worker_process
    return super()._create_worker_process(i)
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/pool.py", line 1141, in _create_worker_process
    on_ready_counter = self._ctx.Value('i')
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/context.py", line 179, in Value
    return Value(typecode_or_type, *args, lock=lock,
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/sharedctypes.py", line 81, in Value
    lock = ctx.RLock()
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/context.py", line 108, in RLock
    return RLock(ctx=self.get_context())
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/synchronize.py", line 206, in __init__
    SemLock.__init__(self, RECURSIVE_MUTEX, 1, 1, ctx=ctx)
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/synchronize.py", line 70, in __init__
    sl = self._semlock = _billiard.SemLock(
FileNotFoundError: [Errno 2] No such file or directory

Since no file or directory is actually named in the error, and based on some GitHub issues I found, I suspect it is related to /dev/shm, even though I have already given the user running the Celery worker access to it. Has anyone had the same problem? If so, can you share some hints on a possible solution? Note that the default behaviour in our Kubernetes cluster is a read-only root filesystem; we mount an emptyDir at /dev/shm so that the worker can read and write there. Thanks in advance!
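To illustrate the setup, here is a minimal sketch of the relevant part of my worker Deployment; the resource names, image tag, UID, and size limit below are placeholders rather than my exact values:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: airflow-worker
spec:
  selector:
    matchLabels:
      app: airflow-worker
  template:
    metadata:
      labels:
        app: airflow-worker
    spec:
      containers:
        - name: airflow-worker
          image: apache/airflow:2.7.3-python3.9   # placeholder tag
          args: ["celery", "worker"]
          securityContext:
            readOnlyRootFilesystem: true          # cluster default: read-only root filesystem
            runAsUser: 50000                      # default airflow UID in the official image
          volumeMounts:
            - name: dev-shm
              mountPath: /dev/shm                 # writable shared memory for billiard/Celery
      volumes:
        - name: dev-shm
          emptyDir:
            medium: Memory                        # tmpfs-backed, like a regular /dev/shm
            sizeLimit: 512Mi                      # placeholder size
```

The emptyDir with medium: Memory is tmpfs-backed, so the container gets a writable /dev/shm even though the root filesystem stays read-only.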
