Auto-reloading code changes when developing Django in Docker with Gunicorn

Thanks to kikicarbonell, I looked into using a volume for my code. After looking at the Docker Compose recommended Django setup, I added `volumes: - .:/code` to my web container in docker-compose.yml, and now any code changes I make are applied automatically.

## docker-compose.yml:
web:
  restart: always
  build: .
  expose:
    - "8000"
  links:
    - postgres:postgres
  volumes:
    - /usr/src/app/static
    - .:/code  # mount the project source so edits on the host are visible inside the container
  env_file: .env
  command: /usr/local/bin/gunicorn myapp.wsgi:application -w 2 -b :8000 --reload  # --reload restarts workers when code changes
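
To try this out, a minimal workflow (assuming the service is named web, as in the file above) is to start the stack in the background and follow the web container's logs; saving a file in the mounted directory should cause Gunicorn's --reload to restart its workers:

# Start (or restart) the containers in the background.
docker-compose up -d

# Follow the web container's logs; after editing a file on the host,
# gunicorn should log that it is reloading its workers.
docker-compose logs -f web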

Update: for a thorough example of using Gunicorn and Django with Docker, check out this example project from Rackspace, which also shows how to use docker-machine to launch the setup on remote servers such as Rackspace Cloud.
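
The general docker-machine workflow looks roughly like this (the machine name below is a placeholder, and the rackspace driver also needs provider-specific credential flags that are omitted here):

# Provision a remote Docker host (credential/region flags omitted).
docker-machine create --driver rackspace my-remote-host

# Point the local docker and docker-compose clients at the remote daemon.
eval "$(docker-machine env my-remote-host)"

# Build and start the stack on the remote host.
docker-compose build
docker-compose up -d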

Caveat: currently, this method does not work when your code is stored locally and the Docker host is remote (e.g., on a cloud provider like DigitalOcean or Rackspace). The same applies to virtual machines that do not mount your local file system. There are separate volume drivers (e.g., Flocker), and there may be something out there that addresses this need, but for now the "fix" is to rsync/scp your files up to a directory on the remote Docker host; the --reload flag will then auto-reload Gunicorn after each scp/rsync.

Update: if pushing code to a remote Docker host, I find it far easier to just rebuild the Docker container (e.g., docker-compose build web && docker-compose up -d). This can be slower than the rsync approach, though, if your src folder is large.
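
A rough sketch of both options (the remote user, host, and target directory below are placeholders):

# Option 1: sync local code to the directory the remote containers mount;
# gunicorn's --reload then picks up the changed files.
rsync -az --delete ./ user@remote-docker-host:/srv/myapp/

# Option 2: skip syncing and rebuild/restart the image instead.
docker-compose build web && docker-compose up -d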


I faced a very similar problem trying to configure auto-reload of the project with a slightly different setup. I set up volumes, but it did not work anyway. After an hour of googling and a thorough examination of my code, I figured out that the volume paths in the Dockerfile and docker-compose.yml simply did not match. Make sure that they are the same.

My Dockerfile

FROM python:3.6.9-alpine3.10

COPY ./requirements/local.txt /app/requirements/local.txt

# Install build dependencies, create a virtualenv and install the requirements,
# then keep only the shared libraries the virtualenv actually needs at runtime.
RUN set -ex \
    && apk add --no-cache --virtual .build-deps postgresql-dev git gcc libgcc musl-dev jpeg-dev zlib-dev build-base \
    && python -m venv /env \
    && /env/bin/pip install --upgrade pip \
    && /env/bin/pip install --no-cache-dir -r /app/requirements/local.txt \
    && runDeps="$(scanelf --needed --nobanner --recursive /env \
        | awk '{ gsub(/,/, "\nso:", $2); print "so:" $2 }' \
        | sort -u \
        | xargs -r apk info --installed \
        | sort -u)" \
    && apk add --virtual rundeps $runDeps \
    && apk del .build-deps

### Here is the path to the project
COPY . /app

WORKDIR /app/project

ENV VIRTUAL_ENV /env
ENV PATH /env/bin:$PATH
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

EXPOSE 8088

My docker-compose.yml

version: '3'


services:

  web:
    build:
      context: ../..
      dockerfile: compose/local/Dockerfile
    restart: on-failure
    command: python manage.py runserver 0.0.0.0:8088 --settings=project.settings.local
    volumes:
      # - .:/var/www/app  # messed up path
      - .:/app  # correct path
    env_file:
      - ../../.env.local
    depends_on:
      - db
    ports:
      - "8000:8000"

You have another problem: Docker caches each layer that it builds. You shouldn't have to re-run pip install every time!

ADD . /code/
RUN pip install -r /code/requirements/docker.txt

This is your problem: Docker checks every ADD statement to see if any files have changed, and if they have, it invalidates the cache for that step and every later step. The correct way to do this is...

ADD ./requirements/docker.txt /code/requirements/
RUN pip install -r /code/requirements/docker.txt
ADD . /code/

This way, the pip install layer is only invalidated when your requirements file changes!
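
To see the caching behavior, a quick check (the image tag and file name below are just placeholders) is to build twice, then change an application file and build again: the pip install layer stays cached, and only the final ADD layer and later steps are rebuilt.

# First build: every layer is built.
docker build -t myapp .

# Rebuild with no changes: all layers come from the cache.
docker build -t myapp .

# Change an application file (but not the requirements file): only the final
# "ADD . /code/" layer and any steps after it are rebuilt; pip install stays cached.
echo "# trivial change" >> some_app_file.py
docker build -t myapp .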