DEV Community

Fatjon Lala

Celery worker on Kubernetes error (permission denied)

Hi everyone,

I am trying to set up Airflow on Kubernetes without using the Helm chart, using plain Deployment manifests instead. I have successfully got the webserver, scheduler, and triggerer running, talking to each other and to the PostgreSQL database.

Currently I am trying to set up the Celery worker, but it crashes on startup with the following error from the billiard library:

```
[2023-12-08 09:36:50,561: CRITICAL/MainProcess] Unrecoverable error: FileNotFoundError(2, 'No such file or directory')
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/worker/worker.py", line 202, in start
    self.blueprint.start(self)
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/bootsteps.py", line 116, in start
    step.start(parent)
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/bootsteps.py", line 365, in start
    return self.obj.start()
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/concurrency/base.py", line 130, in start
    self.on_start()
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/concurrency/prefork.py", line 109, in on_start
    P = self._pool = Pool(processes=self.limit,
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/concurrency/asynpool.py", line 464, in __init__
    super().__init__(processes, *args, **kwargs)
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/pool.py", line 1045, in __init__
    self._create_worker_process(i)
  File "/home/airflow/.local/lib/python3.9/site-packages/celery/concurrency/asynpool.py", line 482, in _create_worker_process
    return super()._create_worker_process(i)
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/pool.py", line 1141, in _create_worker_process
    on_ready_counter = self._ctx.Value('i')
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/context.py", line 179, in Value
    return Value(typecode_or_type, *args, lock=lock,
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/sharedctypes.py", line 81, in Value
    lock = ctx.RLock()
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/context.py", line 108, in RLock
    return RLock(ctx=self.get_context())
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/synchronize.py", line 206, in __init__
    SemLock.__init__(self, RECURSIVE_MUTEX, 1, 1, ctx=ctx)
  File "/home/airflow/.local/lib/python3.9/site-packages/billiard/synchronize.py", line 70, in __init__
    sl = self._semlock = _billiard.SemLock(
FileNotFoundError: [Errno 2] No such file or directory
```
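For context on what I have tried so far: the failing call, `_billiard.SemLock(...)`, allocates a POSIX semaphore, which on Linux lives in the tmpfs mounted at `/dev/shm`. A quick check I have been running inside the container to see whether the process can create one at all (a minimal sketch, not Airflow-specific — `multiprocessing` uses the same `SemLock` mechanism billiard does):

```python
# Minimal diagnostic: multiprocessing locks are built on the same POSIX
# semaphores as billiard's SemLock, so if /dev/shm is missing or not
# writable, this raises the same FileNotFoundError/PermissionError.
import multiprocessing

try:
    lock = multiprocessing.RLock()
    print("POSIX semaphore created OK")
except OSError as exc:  # FileNotFoundError is a subclass of OSError
    print(f"semaphore creation failed: {exc!r}")
```

If this fails the same way, the problem is with the shared-memory mount itself rather than anything Celery-specific.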

Since the error does not name a directory, and following some related GitHub issues, I suspect it is related to `/dev/shm`, to which I have granted access for the user running the Celery worker. Has anyone had the same problem? If so, can you share some hints on a possible solution? Note that our Kubernetes pods run with a read-only root filesystem by default; we mount an emptyDir at `/dev/shm` so the worker can read and write there. Thanks in advance!
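To make the setup concrete, this is roughly the shape of the `/dev/shm` mount in our worker Deployment (container and volume names below are illustrative, not the exact manifest):

```yaml
# Excerpt from the Celery worker Deployment: a tmpfs-backed emptyDir
# mounted at /dev/shm, since the root filesystem is read-only.
spec:
  containers:
    - name: airflow-worker
      securityContext:
        readOnlyRootFilesystem: true
      volumeMounts:
        - name: dshm
          mountPath: /dev/shm
  volumes:
    - name: dshm
      emptyDir:
        medium: Memory
        sizeLimit: 512Mi
```

If anyone spots something missing here (permissions on the mount, `fsGroup`, etc.), that would be very helpful.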
