How do I reduce a Python (Docker) image size using a multi-stage build?
I recommend the approach detailed in this article (section 2). The author uses virtualenv, so pip install puts all the Python code, binaries, etc. under one folder instead of spreading them across the file system. That makes it easy to copy just that one folder into the final "production" image. In summary (a minimal sketch follows the list):
Compile image
- Create and activate a virtualenv in a path of your choosing.
- Prepend that virtualenv's bin path to PATH with a Docker ENV instruction. That is all the "activation" virtualenv needs for every later Docker RUN and CMD action.
- Install system dev packages and pip install xyz as usual.
Production image
- Copy the virtualenv folder from the Compile Image.
- Prepend the virtualenv folder to the image's PATH (again via ENV).
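Here is a minimal sketch of that layout. The /opt/venv path, the python:3.11-slim base image, and the requirements.txt file name are my own assumptions, not details taken from the article:

FROM python:3.11-slim AS compile
# Build tools live only in this stage (assumed package set)
RUN apt-get update && apt-get install -y --no-install-recommends build-essential
RUN python -m venv /opt/venv
# Prepending the venv's bin dir to PATH is the "activation" for later RUN/CMD steps
ENV PATH="/opt/venv/bin:$PATH"
COPY requirements.txt .
RUN pip install -r requirements.txt

FROM python:3.11-slim
# Only the virtualenv folder is copied into the production image
COPY --from=compile /opt/venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
CMD ["python"]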
OK, so my solution uses wheel: it lets us compile in the first image, create wheel files for all the dependencies, and install them in the second image without installing the compilers.
FROM python:2.7-alpine as base

RUN mkdir /svc
COPY . /svc
WORKDIR /svc

# Build tools are only needed in this stage, to compile the wheels
RUN apk add --update \
    postgresql-dev \
    gcc \
    musl-dev \
    linux-headers

# Build wheel files for the project and all of its dependencies
RUN pip install wheel && pip wheel . --wheel-dir=/svc/wheels

FROM python:2.7-alpine

# Only the source tree and the pre-built wheels are copied over; no compilers
COPY --from=base /svc /svc
WORKDIR /svc

# Install purely from the local wheel directory, never from the network
RUN pip install --no-index --find-links=/svc/wheels -r requirements.txt
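To build it, something like the following should do (the svc:latest tag is just a placeholder of mine, not from the post):

docker build -t svc:latest .

One caveat: dependencies compiled against postgresql-dev (psycopg2, for example) still need the runtime shared library in the final image, so if imports fail there, adding RUN apk add --update postgresql-libs to the second stage is the usual fix.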
You can see a more detailed walkthrough of this approach in the following blog post:
https://www.blogfoobar.com/post/2018/02/10/python-and-docker-multistage-build