celery: daemonic processes are not allowed to have children
I got a similar error trying to call a multiprocessing method from a Celery task in Django. I solved it by using billiard instead of multiprocessing:
import billiard as multiprocessing
Hope it helps.
billiard and multiprocessing are different libraries: billiard is the Celery project's own fork of multiprocessing. You will need to import billiard and use it in place of multiprocessing.
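As a minimal sketch of what that swap looks like inside a task (the task and helper names here are invented for illustration):

import billiard as multiprocessing
from celery import shared_task

def square(n):
    # Runs in the child processes; defined at module level
    # so it can be pickled and sent to the pool workers.
    return n * n

@shared_task
def parallel_squares(numbers):
    # billiard's Pool can create child processes inside a daemonized
    # Celery worker, where the stdlib multiprocessing.Pool raises
    # "daemonic processes are not allowed to have children".
    pool = multiprocessing.Pool(processes=4)
    try:
        return pool.map(square, numbers)
    finally:
        pool.close()
        pool.join()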
However, the better answer is probably that you should refactor your code so that you spawn more Celery tasks instead of using two different ways of distributing your work. You can do this using Celery canvas:
import time
from random import randint

from celery import group

# Assumes a configured Celery app instance named `app`
@app.task
def sleepawhile(t):
    print("Sleeping %i seconds..." % t)
    time.sleep(t)
    return t

def work(num_procs):
    return group(sleepawhile.s(randint(1, 5)) for x in range(num_procs))

def test():
    my_group = group(work(randint(1, 5)) for x in range(5))
    result = my_group.apply_async()
    result.get()
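Note that result.get() blocks until every task in the group has finished. Celery discourages waiting on subtasks synchronously from inside another task (recent versions raise an error for this by default), so test() is assumed to run outside the worker, e.g. from a shell or view.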
I've attempted to make a working version of your code that uses canvas primitives instead of multiprocessing. However, since your example was quite artificial, it's not easy to come up with something that makes sense.
Update:
Here is a translation of your real code that uses Celery canvas:
tasks.py:

from celery import shared_task

@shared_task
def run_training_method(saveindex, embedder_id):
    # Imported here to avoid a circular import with models.py
    from models import Embedder
    embedder = Embedder.objects.get(pk=embedder_id)
    embedder.training_method(saveindex)
models.py:

from django.db.models import Model

from celery import group
from tasks import run_training_method

class Embedder(Model):

    def embedder_update_task(self):
        my_group = []
        for saveindex in range(self.start_index, self.start_index + self.nsaves):
            self.create_storage(saveindex)
            # Queue one subtask per process for this save index
            my_group.extend([run_training_method.subtask((saveindex, self.id))
                             for i in range(self.nproc)])
        result = group(my_group).apply_async()
        return result
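Since embedder_update_task() now returns the GroupResult from apply_async(), the caller can poll or collect the results from outside the workers. A small usage sketch, assuming embedder is a saved Embedder instance:

embedder = Embedder.objects.get(pk=1)
result = embedder.embedder_update_task()
if result.ready():           # True once every subtask has finished
    outputs = result.get()   # list of per-subtask return values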
If you are using a submodule/library with multiprocessing already baked in, it may make more sense to set the -P threads argument of the worker:
celery worker -P threads
https://github.com/celery/celery/issues/4525#issuecomment-566503932
Update: There was a bug in command-line parsing in Celery < v5.1.1 that did not allow -P threads even though it was supported. It is fixed in >= v5.1.1. It has been officially supported since v4.4.
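For reference, a full invocation with the thread pool might look like this (proj and the concurrency value are placeholders for your own app module and worker count):

celery -A proj worker -P threads --concurrency=8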
I got this error when using multiprocessing with Celery 4.2.0 and Python 3.6. I solved it by using billiard.
I changed my source code from
from multiprocessing import Process
to
from billiard.context import Process
and that solved the error.
Note that the import source is billiard.context, not billiard.process.
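As a minimal sketch of the working import inside a task (the function and task names are invented for illustration):

from billiard.context import Process
from celery import shared_task

def train(index):
    # Runs in the child process
    print("Training %d" % index)

@shared_task
def spawn_training(index):
    # billiard's Process can be started from within a Celery worker,
    # where multiprocessing.Process would raise the daemonic error.
    p = Process(target=train, args=(index,))
    p.start()
    p.join()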