Python requests: POST and big content
Using an open file object as the data parameter ensures that requests will stream the data for you.
If the file size can be determined (via the OS filesystem), the file object is streamed using an 8 KB buffer. If no file size can be determined, a Transfer-Encoding: chunked request is sent instead, with the data transmitted per line (the object is treated as an iterable).
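A minimal sketch of both cases (the URL and file path are placeholders):

import requests

# Case 1: the file size is known from the filesystem, so requests
# streams the open file object and sets a Content-Length header.
with open('/path/file.csv', 'rb') as f:
    requests.post('https://example.com/upload', data=f)

# Case 2: a generator has no determinable length, so requests falls
# back to Transfer-Encoding: chunked and sends each yielded piece.
def generate():
    with open('/path/file.csv', 'rb') as f:
        for line in f:
            yield line

requests.post('https://example.com/upload', data=generate())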
If you were to use the files= parameter for a multipart POST, on the other hand, the file would be loaded into memory before sending. Use the requests-toolbelt package to stream multipart uploads:
import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder

csvfile = '/path/file.csv'
with open(csvfile, 'rb') as f:  # open in binary mode for uploading
    m = MultipartEncoder(fields={'csv_field_name': ('file.csv', f, 'text/csv')})
    headers = {'Content-Type': m.content_type}  # includes the multipart boundary
    r = requests.post(url, data=m, headers=headers)  # url is your upload endpoint
This will not load the entire file into memory; the body is generated and transmitted in chunks, a little at a time. You can verify this in the requests-toolbelt source code.
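For intuition, MultipartEncoder behaves like a file object: it only produces body bytes when read from, which is why requests can stream it. A quick sketch, using the same placeholder field name and path as above:

from requests_toolbelt.multipart.encoder import MultipartEncoder

m = MultipartEncoder(
    fields={'csv_field_name': ('file.csv', open('/path/file.csv', 'rb'), 'text/csv')}
)
print(m.content_type)   # e.g. multipart/form-data; boundary=...
print(len(m.read(64)))  # only the next 64 bytes are produced on demand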