How do I return an image in FastAPI?
If you already have the bytes of the image in memory
Return a fastapi.responses.Response with your custom content and media_type.
You'll also need to muck with the endpoint decorator to get FastAPI to put the correct media type in the OpenAPI specification.
from fastapi import FastAPI
from fastapi.responses import Response

app = FastAPI()

@app.get(
    "/image",

    # Set what the media type will be in the autogenerated OpenAPI specification.
    # fastapi.tiangolo.com/advanced/additional-responses/#additional-media-types-for-the-main-response
    responses={
        200: {
            "content": {"image/png": {}}
        }
    },

    # Prevent FastAPI from adding "application/json" as an additional
    # response media type in the autogenerated OpenAPI specification.
    # https://github.com/tiangolo/fastapi/issues/3258
    response_class=Response,
)
def get_image():
    image_bytes: bytes = generate_cat_picture()
    # media_type here sets the media type of the actual response sent to the client.
    return Response(content=image_bytes, media_type="image/png")
See the Response documentation.
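As a quick sanity check (not part of the original snippet), FastAPI's TestClient can confirm the endpoint returns the right media type. This assumes the app and get_image endpoint above, plus a real implementation of the placeholder generate_cat_picture:

from fastapi.testclient import TestClient

# Assumes `app` and `get_image` from the snippet above are importable here.
client = TestClient(app)
response = client.get("/image")
assert response.status_code == 200
assert response.headers["content-type"] == "image/png"
assert isinstance(response.content, bytes)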
If your image exists only on the filesystem
Return a fastapi.responses.FileResponse.
See the FileResponse documentation.
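For example, a minimal sketch (the "cat.png" path and the "image/png" media type are placeholders for illustration, not from the original answer):

from fastapi import FastAPI
from fastapi.responses import FileResponse

app = FastAPI()

@app.get("/image", response_class=FileResponse)
def get_image():
    # FileResponse reads the file from disk and sets Content-Length
    # from the file's size; "cat.png" is a placeholder path.
    return FileResponse("cat.png", media_type="image/png")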
Be careful with StreamingResponse
Other answers suggest StreamingResponse. StreamingResponse is harder to use correctly, so I don't recommend it unless you're sure you can't use Response or FileResponse.
In particular, code like this is pointless. It will not "stream" the image in any useful way.
@app.get("/image")
def get_image()
image_bytes: bytes = generate_cat_picture()
# ❌ Don't do this.
image_stream = io.BytesIO(image_bytes)
return StreamingResponse(content=image_stream, media_type="image/png")
First of all, StreamingResponse(content=my_iterable) streams by iterating over the chunks provided by my_iterable. But when that iterable is a BytesIO, the chunks will be \n-terminated lines, which won't make sense for a binary image.
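Here's a quick illustration of that behavior (not from the original answer; the byte string is arbitrary):

import io

# Iterating a BytesIO yields b"\n"-terminated "lines", not fixed-size chunks.
buffer = io.BytesIO(b"\x89PNG\r\n\x1a\nrest-of-binary-data")
for chunk in buffer:
    print(chunk)
# b'\x89PNG\r\n'
# b'\x1a\n'
# b'rest-of-binary-data'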
And even if the chunk divisions made sense, chunking is pointless here because we had the whole image_bytes bytes object available from the start. We may as well have just passed the whole thing into a Response from the beginning. We don't gain anything by holding data back from FastAPI.
Second, StreamingResponse corresponds to HTTP chunked transfer encoding. (This might depend on your ASGI server, but it's the case for Uvicorn, at least.) And this isn't a good use case for chunked transfer encoding.
Chunked transfer encoding makes sense when you don't know the size of your output ahead of time and don't want to wait to collect it all before you start sending it to the client. That can apply to stuff like serving the results of slow database queries, but it doesn't generally apply to serving images.
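For contrast, here's a sketch (not from the original answers) of a case where StreamingResponse is the right tool: the chunks are produced incrementally, so the total size isn't known up front. The slow_report generator and fetch_rows_from_database are hypothetical names for illustration:

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

async def slow_report():
    # Hypothetical: rows arrive gradually, so the total size isn't known up front.
    async for row in fetch_rows_from_database():
        yield row.encode() + b"\n"

@app.get("/report")
async def get_report():
    # Chunked transfer encoding is appropriate here: we can start sending
    # data before the whole output exists.
    return StreamingResponse(slow_report(), media_type="text/plain")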
Unnecessary chunked transfer encoding can be harmful. For example, it means clients can't show progress bars when they're downloading the file. See:
- Content-Length header versus chunked encoding
- Is it a good idea to use Transfer-Encoding: chunked on static files?
I had a similar issue but with a cv2 image. This may be useful for others. It uses StreamingResponse.
import io

import cv2
from fastapi import FastAPI
from starlette.responses import StreamingResponse

app = FastAPI()

@app.post("/vector_image")
def image_endpoint(*, vector):
    # Returns a cv2 image array from the document vector
    cv2img = my_function(vector)
    res, im_png = cv2.imencode(".png", cv2img)
    return StreamingResponse(io.BytesIO(im_png.tobytes()), media_type="image/png")
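Note that since im_png is already fully in memory at that point, the same endpoint could instead return a plain Response, as the first answer recommends. A sketch, keeping the hypothetical my_function from above:

from fastapi.responses import Response

@app.post("/vector_image")
def image_endpoint(*, vector):
    cv2img = my_function(vector)  # my_function is the hypothetical helper from above
    res, im_png = cv2.imencode(".png", cv2img)
    # The bytes already exist in memory, so a plain Response is simpler
    # and lets the server send a Content-Length header.
    return Response(content=im_png.tobytes(), media_type="image/png")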
All the other answers are on point, but now it's this easy to return an image:
from fastapi import FastAPI
from fastapi.responses import FileResponse

app = FastAPI()

@app.get("/")
async def main():
    return FileResponse("your_image.jpeg")
It's not properly documented yet, but you can use anything from Starlette.
So, you can use a FileResponse if it's a file on disk with a path: https://www.starlette.io/responses/#fileresponse
If it's a file-like object created in your path operation, in the next stable release of Starlette (used internally by FastAPI) you will also be able to return it in a StreamingResponse.