How to log memory usage of a Django app per request

A Django middleware that tracks memory usage and produces a usable result immediately needs to hook both process_request and process_response. In other words, it measures the difference between the start and the end of the request and logs a warning if that difference exceeds some threshold.

A complete middleware example is:

import os
import sys

import psutil

THRESHOLD = 2 * 1024 * 1024  # warn when a request grows RSS by more than 2 MB

class MemoryUsageMiddleware(object):

    def process_request(self, request):
        # Record the process's memory counters when the request comes in.
        request._mem = psutil.Process(os.getpid()).memory_info()

    def process_response(self, request, response):
        # Compare the resident set size (RSS) at the end of the request
        # with the value recorded in process_request and warn if the
        # growth exceeds the threshold.
        mem = psutil.Process(os.getpid()).memory_info()
        diff = mem.rss - request._mem.rss
        if diff > THRESHOLD:
            print('MEMORY USAGE %r' % ((diff, request.path),), file=sys.stderr)
        return response

This requires the psutil module to be installed to do the memory calculation.

This approach is brute force and can lead to false positives in a multithreaded setup, because RSS is measured for the whole process rather than per request. Because of lazy loading, you will also see it trigger on the first few requests against a new process as modules and caches are loaded.
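For reference, the hooks above are the old-style middleware API; when the class is listed in the MIDDLEWARE setting (Django 1.10+), it needs to inherit from django.utils.deprecation.MiddlewareMixin for those hooks to be called. Below is a minimal sketch of the same idea written as a new-style middleware instead; the module path myproject/middleware.py used in the settings comment is hypothetical.

import os
import sys

import psutil

THRESHOLD = 2 * 1024 * 1024

class MemoryUsageMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        process = psutil.Process(os.getpid())
        before = process.memory_info().rss  # RSS before the view runs
        response = self.get_response(request)
        diff = process.memory_info().rss - before
        if diff > THRESHOLD:
            print('MEMORY USAGE %r' % ((diff, request.path),), file=sys.stderr)
        return response

# settings.py (hypothetical module path):
# MIDDLEWARE = [
#     ...,
#     'myproject.middleware.MemoryUsageMiddleware',
# ]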


This may not fully cover your question, but I recommend trying nginx + uWSGI instead of Apache2 + mod_wsgi. In my tests it turned out to be much more stable (mod_wsgi eventually choked completely), much faster, and it uses a lot less memory (it may just fix all your issues altogether).

As for tracking memory usage, you can create a simple middleware:

class SaveMemoryUsageMiddleware(object):
    def process_response(self, request, response):
        # track memory usage here and append to file or db
        return response

and add it to your MIDDLEWARE setting.

For memory tracking code I recommend checking out: Total memory used by Python process?
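As a rough illustration, here is one way to fill in that placeholder using the standard-library resource module, which is one of the approaches discussed in that question. Note that ru_maxrss is the peak RSS of the whole process (not a per-request figure) and is reported in kilobytes on Linux but bytes on macOS; the log path is just a placeholder.

import resource

class SaveMemoryUsageMiddleware(object):
    def process_response(self, request, response):
        # Peak resident set size of the process so far (kilobytes on Linux).
        peak_rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
        with open('/tmp/memory_usage.log', 'a') as f:  # hypothetical log path
            f.write('%s %s\n' % (request.path, peak_rss))
        return response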

However, it would probably be better to avoid doing this in production; use it only in development and testing to track down the real problem.
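If you want to keep the middleware around without running it in production, one option is to enable it only when DEBUG is on, e.g. in settings.py (again assuming a hypothetical module path myproject/middleware.py):

# settings.py: enable the tracking middleware only while debugging.
if DEBUG:
    MIDDLEWARE += ['myproject.middleware.SaveMemoryUsageMiddleware']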