Calling async_result.get() from within a celery task
Here is a fragment that silences the warning, if you are sure that what you are doing is safe:
from celery.result import allow_join_result

with allow_join_result():
    result.get()
There is the solution of chaining, of course, except it's impossible to chain remote tasks. The only way to call a remote task is app.send_task, which returns an AsyncResult, and I can't chain that because I would need the task function itself.
No, it is possible to chain remote tasks. I've just tried it in a project of mine and it works. I suggest you try it with a trivial test task first to make sure you got the basics down before moving to something more complex. I've created these tasks:
@app.task
def foo(arg):
    return arg + 1

@app.task
def bar(arg):
    return "I barred " + str(arg)
The two tasks are held in a module named app.tasks. (It is part of a Django project.)
Then I wrote a command that does:
import celery
print((celery.signature("app.tasks.foo", args=(1,)) |
       celery.signature("app.tasks.bar")).delay().get())
And I got on the screen:
I barred 2
If you want your task to be synchronous, you can poll ready() in a loop:
import time

while not result_from_remote.ready():
    time.sleep(5)
return result_from_remote.get()
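Note that AsyncResult.get() already accepts a timeout argument, so a bare get() or get(timeout=...) usually suffices. If you do want a hand-rolled loop (say, to do other work between polls), a bounded sketch looks like this; the 60-second default is an arbitrary choice:

```python
import time

def wait_for(result, timeout=60.0, poll_interval=5.0):
    """Poll result.ready() until the task finishes or `timeout` elapses."""
    deadline = time.monotonic() + timeout
    while not result.ready():
        if time.monotonic() >= deadline:
            raise TimeoutError("task did not finish in time")
        time.sleep(poll_interval)
    return result.get()
```

The bound matters: an unbounded while-loop will spin forever if the task is lost or its worker dies.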