Multithreading for Python Django
I've continued using this implementation at scale and in production with no issues.
Decorator definition:
from threading import Thread

def start_new_thread(function):
    def decorator(*args, **kwargs):
        t = Thread(target=function, args=args, kwargs=kwargs)
        t.daemon = True
        t.start()
    return decorator
Example usage:
@start_new_thread
def foo():
    # do stuff
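As a quick sketch of how this might be wired into a view (the view, task, and import path below are illustrative assumptions, not part of the answer above):

# views.py - hypothetical usage sketch
from django.http import HttpResponse
from myapp.utils import start_new_thread  # wherever the decorator lives (assumption)

@start_new_thread
def send_welcome_email(user_id):
    # long-running work runs on a daemon thread
    ...

def signup(request):
    # fire and forget; the response returns immediately
    send_welcome_email(request.user.id)
    return HttpResponse("OK")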
Over time, the stack has been upgraded without issue: originally Python 2.4.7, Django 1.4, and Gunicorn 0.17.2; now Python 3.6, Django 2.1, and Waitress 1.1.
If you are using any database transactions, Django will create a new connection for each thread, and it needs to be closed manually:

from django.db import connection

@start_new_thread
def foo():
    # do stuff
    connection.close()
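One refinement worth considering (my suggestion, not part of the original answer): close the connection in a finally block, so it is released even if the job raises:

from django.db import connection

@start_new_thread
def foo():
    try:
        # do stuff
        ...
    finally:
        # always release the per-thread DB connection
        connection.close()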
Celery is an asynchronous task queue/job queue. It's well documented and well suited to this problem. I suggest you start here.

The most common way to do asynchronous processing in Django is to use Celery and django-celery.
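As a rough illustration of the Celery approach (assuming a Celery app is already configured for the project; the task and argument names below are made up):

# tasks.py
from celery import shared_task

@shared_task
def process_upload(upload_id):
    # runs on a Celery worker, outside the request/response cycle
    ...

# views.py
from django.http import HttpResponse
from myapp.tasks import process_upload  # hypothetical import path

def handle(request):
    # enqueue the job; .delay() returns immediately
    process_upload.delay(request.POST["upload_id"])
    return HttpResponse("queued")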
tomcounsell's approach works well as long as there are not too many incoming jobs. If many long-running jobs are started in a short period of time, spawning a lot of threads, the main process will suffer. In that case, you can use a thread pool driven by a coroutine:
# in my_utils.py
from concurrent.futures import ThreadPoolExecutor

MAX_THREADS = 10

def run_thread_pool():
    """
    Note that this is not a normal function, but a coroutine.
    All jobs are enqueued before being executed, and no more
    than MAX_THREADS threads run at any point in time.
    """
    with ThreadPoolExecutor(max_workers=MAX_THREADS) as executor:
        while True:
            func, args, kwargs = yield
            executor.submit(func, *args, **kwargs)

pool_wrapper = run_thread_pool()

# Advance the coroutine to the first yield (priming)
next(pool_wrapper)
# in views.py
from my_utils import pool_wrapper

def job(*args, **kwargs):
    # do something
    ...

def handle(request):
    # make args and kwargs
    pool_wrapper.send((job, args, kwargs))
    # return a response
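One caveat (my addition, not the original answer's): generator .send() is not thread-safe, so if the server dispatches requests on multiple threads, concurrent sends can raise "ValueError: generator already executing". Since ThreadPoolExecutor.submit is itself thread-safe, a simpler sketch under that assumption drops the coroutine and submits to a module-level executor directly:

# in my_utils.py - alternative sketch for threaded servers
from concurrent.futures import ThreadPoolExecutor

MAX_THREADS = 10
executor = ThreadPoolExecutor(max_workers=MAX_THREADS)

# in views.py
from my_utils import executor

def handle(request):
    # make args and kwargs (placeholders here)
    args, kwargs = (), {}
    executor.submit(job, *args, **kwargs)
    # return a response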