How to gzip while uploading into S3 using boto
I implemented the solution hinted at in the comments of the accepted answer by garnaat:
import cStringIO
import gzip

def sendFileGz(bucket, key, fileName, suffix='.gz'):
    key += suffix
    mpu = bucket.initiate_multipart_upload(key)
    stream = cStringIO.StringIO()
    compressor = gzip.GzipFile(fileobj=stream, mode='w')

    def uploadPart(partCount=[0]):
        partCount[0] += 1
        stream.seek(0)
        mpu.upload_part_from_file(stream, partCount[0])
        stream.seek(0)
        stream.truncate()

    with file(fileName) as inputFile:
        while True:  # until EOF
            chunk = inputFile.read(8192)
            if not chunk:  # EOF?
                compressor.close()
                uploadPart()
                mpu.complete_upload()
                break
            compressor.write(chunk)
            if stream.tell() > 10 << 20:  # min size for multipart upload is 5242880
                uploadPart()
It seems to work without problems. And after all, streaming is in most cases just chunking of the data. Here the chunks are about 10 MB each, but who cares? As long as we aren't talking about several-gigabyte chunks, I'm fine with this.
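For reference, a minimal usage sketch with boto 2 (the bucket name and file path below are placeholders, not part of the original answer):

import boto

conn = boto.connect_s3()
bucket = conn.get_bucket('my-bucket-name')
sendFileGz(bucket, 'reports/big.log', '/tmp/big.log')
# the object ends up in S3 as 'reports/big.log.gz'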
Update for Python 3:
from io import BytesIO
import gzip

def sendFileGz(bucket, key, fileName, suffix='.gz'):
    key += suffix
    mpu = bucket.initiate_multipart_upload(key)
    stream = BytesIO()
    compressor = gzip.GzipFile(fileobj=stream, mode='w')

    def uploadPart(partCount=[0]):
        partCount[0] += 1
        stream.seek(0)
        mpu.upload_part_from_file(stream, partCount[0])
        stream.seek(0)
        stream.truncate()

    with open(fileName, "rb") as inputFile:
        while True:  # until EOF
            chunk = inputFile.read(8192)
            if not chunk:  # EOF?
                compressor.close()
                uploadPart()
                mpu.complete_upload()
                break
            compressor.write(chunk)
            if stream.tell() > 10 << 20:  # min size for multipart upload is 5242880
                uploadPart()
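If you are on boto3 rather than boto 2, the same streaming idea can be expressed with the S3 client's low-level multipart calls. This is only a sketch under that assumption; the function name and the bucket/key arguments are illustrative:

import gzip
from io import BytesIO

import boto3

def send_file_gz_boto3(bucket_name, key, file_name, suffix='.gz'):
    s3 = boto3.client('s3')
    key += suffix
    upload_id = s3.create_multipart_upload(Bucket=bucket_name, Key=key)['UploadId']
    parts = []
    stream = BytesIO()
    compressor = gzip.GzipFile(fileobj=stream, mode='wb')

    def upload_part():
        part_number = len(parts) + 1
        stream.seek(0)
        resp = s3.upload_part(Bucket=bucket_name, Key=key, PartNumber=part_number,
                              UploadId=upload_id, Body=stream.read())
        parts.append({'PartNumber': part_number, 'ETag': resp['ETag']})
        stream.seek(0)
        stream.truncate()

    with open(file_name, 'rb') as input_file:
        while True:  # until EOF
            chunk = input_file.read(8192)
            if not chunk:
                compressor.close()
                upload_part()
                s3.complete_multipart_upload(
                    Bucket=bucket_name, Key=key, UploadId=upload_id,
                    MultipartUpload={'Parts': parts})
                break
            compressor.write(chunk)
            if stream.tell() > 10 << 20:  # every part except the last must be >= 5 MiB
                upload_part()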
You can also compress bytes with gzip and upload them as follows:
import gzip
import boto3

cred = boto3.Session().get_credentials()

s3client = boto3.client('s3',
                        aws_access_key_id=cred.access_key,
                        aws_secret_access_key=cred.secret_key,
                        aws_session_token=cred.token
                        )

bucketname = 'my-bucket-name'
key = 'filename.gz'

s_in = b"Lots of content here"
gzip_object = gzip.compress(s_in)

s3client.put_object(Bucket=bucketname, Body=gzip_object, Key=key)
It is possible to replace s_in with any bytes object, an io.BytesIO, a pickle dump, a file, etc.
If you want to upload compressed JSON, here is a nice example: Upload compressed Json to S3
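As a rough sketch of that idea, reusing the s3client and bucketname from the snippet above (the payload and key name are made up):

import gzip
import json

payload = {'hello': 'world', 'count': 3}
gzip_object = gzip.compress(json.dumps(payload).encode('utf-8'))
s3client.put_object(Bucket=bucketname, Body=gzip_object, Key='payload.json.gz',
                    ContentType='application/json', ContentEncoding='gzip')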
There really isn't a way to do this, because S3 doesn't support true streaming input (i.e. chunked transfer encoding). You must know the Content-Length prior to upload, and the only way to know that is to have performed the gzip operation first.
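In other words: compress first (into a buffer or a temporary file), then upload the finished object. A minimal sketch of that approach with boto 2, where the bucket name and file path are placeholders:

import gzip
import shutil
from io import BytesIO

import boto
from boto.s3.key import Key

buf = BytesIO()
with open('/tmp/big.log', 'rb') as f, gzip.GzipFile(fileobj=buf, mode='wb') as gz:
    shutil.copyfileobj(f, gz)
buf.seek(0)  # only now is the compressed size (Content-Length) known

bucket = boto.connect_s3().get_bucket('my-bucket-name')
Key(bucket, 'big.log.gz').set_contents_from_file(buf)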