How to cp a file only if it does not exist, and throw an error otherwise?
You could test for the existence of a file by listing it and seeing whether the command returns any output. For example:
aws s3 ls s3://bucket/file.txt | wc -l
This would return a zero (no lines) if the file does not exist.
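As a rough sketch (the bucket and key names here are placeholders), that line count can drive the copy-or-fail behaviour the question asks for:

# copy only when the listing comes back empty; otherwise fail loudly
# note: `aws s3 ls` matches by key prefix, so e.g. file.txt.bak would also count
if [ "$(aws s3 ls s3://bucket/file.txt | wc -l)" -eq 0 ]; then
  aws s3 cp file.txt s3://bucket/file.txt
else
  echo "s3://bucket/file.txt already exists" >&2
  exit 1
fi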
If you only want to copy a file if it does not exist, try the sync command, e.g.:
aws s3 sync . s3://bucket/ --exclude '*' --include 'file.txt'
This will synchronize the local file with the remote object, copying it only if the remote object does not exist or differs from the local file.
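If you want to see what sync would decide before it actually copies anything, its --dryrun flag prints the planned actions without uploading (same placeholder bucket as above):

# prints an "upload:" line if sync would copy the file; prints nothing if the remote copy already matches
aws s3 sync . s3://bucket/ --exclude '*' --include 'file.txt' --dryrun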
You can do this by listing the object and copying it only if the listing fails (that is, the object does not exist).
aws s3 ls "s3://my-bucket/production/myfile" || aws s3 cp "dist/myfile" "s3://my-bucket/production/myfile"
Edit: replaced && with || to get the desired effect: copy only if the listing fails.
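If you also need the question's "throw error otherwise" behaviour, the same exit-status check works as an explicit if/else in a script (a sketch only, using the same placeholder paths as the one-liner above):

# the listing succeeds only when an object with that key (prefix) exists
if aws s3 ls "s3://my-bucket/production/myfile" > /dev/null 2>&1; then
  echo "s3://my-bucket/production/myfile already exists, refusing to copy" >&2
  exit 1
else
  aws s3 cp "dist/myfile" "s3://my-bucket/production/myfile"
fi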
So, it turns out that "aws s3 sync" doesn't do files, only directories. If you give it a file, you get... interesting... behavior, since it treats anything you give it like a directory and appends a slash to it. At least aws-cli/1.6.7 Python/2.7.5 Darwin/13.4.0 does.
%% date > test.txt
%% aws s3 sync test.txt s3://bucket/test.txt
warning: Skipping file /Users/draistrick/aws/test.txt/. File does not exist.
So, if you really only want to sync a single file (upload it only if it's missing from the bucket or has changed), you can do it:
file="test.txt"
aws s3 sync --exclude '*' --include "$file" "$(dirname $file)" "s3://bucket/"
Note the exclude/include order: if you reverse it, nothing will be included. Your source path and --include pattern also need to match up sensibly, so if you're passing full paths, a $(basename $file) for --include may be in order. aws --debug s3 sync is your friend here for seeing how the includes are evaluated.
And don't forget the target is a directory key, not a file key.
Here's a working example:
%% file="test.txt"
%% date >> $file
%% aws s3 sync --exclude '*' --include "$file" "$(dirname $file)" "s3://bucket/"
upload: ./test.txt to s3://bucket/test.txt
%% aws s3 sync --exclude '*' --include "$file" "$(dirname $file)" "s3://bucket/"
%% date >> $file
%% aws s3 sync --exclude '*' --include "$file" "$(dirname $file)" "s3://bucket/"
upload: ./test.txt to s3://bucket/test.txt
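To keep the dirname/basename bookkeeping from the note above in one place, a small wrapper is an option (just a sketch; the function name and bucket are made up):

# sync a single file, which may be given as a full path, into a bucket prefix
# usage: sync_one_file /path/to/test.txt s3://bucket/
sync_one_file() {
  local file="$1" dest="$2"
  aws s3 sync --exclude '*' --include "$(basename "$file")" "$(dirname "$file")" "$dest"
}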
(Now, if only there were a way to ask aws s3 to just validate the checksum, since it always seems to do multipart-style checksums... oh well, maybe some --dryrun and some output scraping around sync...)
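A rough sketch of that --dryrun scraping idea (bucket and file names are placeholders; keep in mind sync decides based on size and timestamp, not a real checksum):

file="test.txt"
# an "upload:" line in the --dryrun output means the remote object is missing or differs
if aws s3 sync --dryrun --exclude '*' --include "$file" "$(dirname "$file")" "s3://bucket/" | grep -q 'upload:'; then
  echo "remote copy is missing or out of date"
else
  echo "remote copy already matches"
fi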