How could I use aws lambda to write file to s3 (python)?
I've had success streaming data to S3; the body has to be encoded as bytes to do this:
import boto3

def lambda_handler(event, context):
    string = "dfghj"
    encoded_string = string.encode("utf-8")

    bucket_name = "s3bucket"
    file_name = "hello.txt"
    s3_path = "100001/20180223/" + file_name

    s3 = boto3.resource("s3")
    s3.Bucket(bucket_name).put_object(Key=s3_path, Body=encoded_string)
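The encoding step matters because `put_object` expects `Body` to be bytes (or a file-like object), not a `str`. A minimal local check of just that step, with no AWS call involved:

```python
# Local check of the encoding step used above (no AWS involved):
# put_object's Body= accepts bytes, so the string is encoded first.
string = "dfghj"
encoded_string = string.encode("utf-8")

print(type(encoded_string).__name__)      # bytes
print(encoded_string.decode("utf-8"))     # dfghj (round-trips cleanly)
```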
If the data is in a file, you can read this file and send it up:
with open(filename) as f:
    string = f.read()

encoded_string = string.encode("utf-8")
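Alternatively, opening the file in binary mode (`"rb"`) gives you bytes directly, so the separate encode step isn't needed. A small sketch (the file path here is made up for illustration, using the temp directory so it stays self-contained):

```python
import os
import tempfile

# Hypothetical file path for illustration only.
path = os.path.join(tempfile.gettempdir(), "example.txt")
with open(path, "w", encoding="utf-8") as f:
    f.write("hello")

# Reading in binary mode yields bytes, ready to pass as Body=
with open(path, "rb") as f:
    encoded_string = f.read()

print(encoded_string == b"hello")  # True
```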
My answer is very similar to Tim B's, but the most important part is:
1. Go to S3 and create the bucket you want to write to.
2. Follow the steps below; otherwise your Lambda will fail due to permission/access errors. I've copied the linked content here for you too, in case they change the URL or move it to another page.
a. Open the roles page in the IAM console.
b. Choose Create role.
c. Create a role with the following properties.
- Trusted entity – AWS Lambda.
- Permissions – AWSLambdaExecute.
- Role name – lambda-s3-role.
The AWSLambdaExecute policy has the permissions that the function needs to manage objects in Amazon S3 and write logs to CloudWatch Logs.
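For reference, choosing "Trusted entity – AWS Lambda" in the console corresponds to attaching the standard Lambda trust policy document to the role. A sketch of that document built in Python (this is the well-known trust-policy shape, not something specific to this thread):

```python
import json

# The trust policy that lets the Lambda service assume the role.
# This is the standard document the console generates for you.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

print(json.dumps(trust_policy, indent=2))
```

If you script role creation (e.g. with `boto3`'s IAM client), this JSON is what you would pass as `AssumeRolePolicyDocument`.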
Copy and paste this into your Lambda Python function:
import json
import os

import boto3

def lambda_handler(event, context):
    # Put the bucket name you created in step 1 here
    bucket_name = "my_buck_name"
    file_name = "my_test_file.csv"
    lambda_path = "/tmp/" + file_name
    s3_path = "output/" + file_name

    # Write a test file to /tmp (the only writable path in Lambda)
    os.system('echo testing... >' + lambda_path)

    # Upload the local file to the S3 key we report back
    s3 = boto3.resource("s3")
    s3.meta.client.upload_file(lambda_path, bucket_name, s3_path)

    return {
        'statusCode': 200,
        'body': json.dumps('file is created in:' + s3_path)
    }
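A note on the `/tmp` step: shelling out with `os.system('echo ...')` works, but plain Python file I/O does the same thing without spawning a shell and is easier to debug. A sketch of just that local-write step, with no S3 call:

```python
import os

# /tmp is the only writable filesystem path inside a Lambda container.
file_name = "my_test_file.csv"
lambda_path = "/tmp/" + file_name

# Plain-Python replacement for os.system('echo testing... >' + lambda_path):
with open(lambda_path, "w") as f:
    f.write("testing...\n")

print(os.path.exists(lambda_path))  # True
```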