How do I create a download link for an object in an Amazon S3 bucket?
The following patterns are valid for constructing S3 URLs (virtual-hosted-style and path-style, respectively):
http(s)://<bucket>.s3.amazonaws.com/<object>
http(s)://s3.amazonaws.com/<bucket>/<object>
If you want the public to be able to access the object, it is as simple as
http://[YourBucketName].s3.amazonaws.com/[YourFileName]
so long as you set the permissions correctly.
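For instance, here is a minimal sketch using the AWS SDK for PHP v2 that uploads an object with a public-read ACL so the plain URL above works; the credentials, bucket, key, and file path are all placeholders:
require 'vendor/autoload.php';
use Aws\S3\S3Client;

$client = S3Client::factory(array(
    'key'    => 'YOUR_AWS_ACCESS_KEY_ID',
    'secret' => 'YOUR_AWS_SECRET_KEY',
));
// Upload with a public-read ACL so anyone can fetch the plain URL.
$client->putObject(array(
    'Bucket'     => 'YourBucketName',
    'Key'        => 'YourFileName',
    'SourceFile' => '/local/path/to/file.ext',
    'ACL'        => 'public-read',
));
echo "http://YourBucketName.s3.amazonaws.com/YourFileName";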
If you're worried about download abuse, you'll want an authenticated URL (which, judging from your code sample, is what you're after). In that case, I suggest you use the Amazon SDK: http://aws.amazon.com/sdkforphp/ as it contains examples of what you need.
$s3->getObjectUrl($bucket, $filename, '5 minutes');
Docs: http://docs.aws.amazon.com/aws-sdk-php/latest/class-Aws.S3.S3Client.html#_getObjectUrl
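Fleshed out into a runnable sketch (credentials, bucket, and key are placeholders; the third argument accepts anything strtotime() understands, or a timestamp):
require 'vendor/autoload.php';
use Aws\S3\S3Client;

$s3 = S3Client::factory(array(
    'key'    => 'YOUR_AWS_ACCESS_KEY_ID',
    'secret' => 'YOUR_AWS_SECRET_KEY',
));
// Returns a pre-signed URL that stops working after 5 minutes.
$url = $s3->getObjectUrl('YourBucketName', 'YourFileName', '+5 minutes');
echo $url;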
I see that people have already responded to this, but I wanted to add some more context for those who may have a secured bucket (one that requires access credentials). Note that you don't have to generate URLs at all: if you talk directly to the S3 bucket you can use file_get_contents() and the like, but it is much slower because you can't batch requests with multi curl (for speed). If you are on a newer PHP release, you could use pthreads instead.
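For comparison, the slow single-request route looks like this ($signedUrl is assumed to come from one of the URL-generating calls in this post):
// One request at a time; fine for a single file, slow for many.
$contents = file_get_contents($signedUrl);
if ($contents === false) {
    // download failed -- handle/log the error here
}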
INSTALLATION: Install the S3 class file for Amazon; there are easy ways to add it, either via Composer or by downloading the S3.php file manually.
NOT SECURED: (see the other answers on this; basically, just use the URL)
http(s)://<bucket>.s3.amazonaws.com/<object>
http(s)://s3.amazonaws.com/<bucket>/<object>
SECURED HTTPS (when you have your bucket protected):
https://<bucket>.s3.amazonaws.com/<object>?AWSAccessKeyId=XXXXX&Expires=NNNNNNNNNN&Signature=YYYYY
(1) Create an https:// URL for each object and use the multi curl tool to fetch them all at the same time (recommended).
A simplistic example:
// Note: check Amazon to confirm the path; it will contain only "_" and no spaces.
$uri = 'path/to_the/file_name/file.ext';
$s3 = new S3($awsAccessKeyID, $awsSecretKey);
// getAuthenticatedURL() is the method in the undesigned.org.za S3.php class;
// its lifetime argument is in seconds (3600 = 1 hour).
$curls[] = $s3->getAuthenticatedURL($bucketName, $uri, 3600);
var_dump($results = multiCurlRequest($curls));
more info:
http://docs.aws.amazon.com/aws-sdk-php/v2/api/class-Aws.S3.S3Client.html#_getObjectUrl
http://undesigned.org.za/2007/10/22/amazon-s3-php-class/documentation
FYI:
function multiCurlRequest($curlList = array(), $user = '', $pass = '', $timeout = 30, $retTxfr = true) {
    // $timeout default was a class constant (self::MULTI_REQ_TIMEOUT_SECS) in the original class.
    if (empty($curlList)) return false;
    $master = curl_multi_init();
    $node_count = count($curlList);
    $ch = array();
    for ($i = 0; $i < $node_count; $i++) {
        $ch[$i] = curl_init($curlList[$i]);
        curl_setopt($ch[$i], CURLOPT_TIMEOUT, $timeout); // -- time out after X seconds
        curl_setopt($ch[$i], CURLOPT_RETURNTRANSFER, $retTxfr);
        curl_setopt($ch[$i], CURLOPT_HTTPAUTH, CURLAUTH_ANY);
        curl_setopt($ch[$i], CURLOPT_USERPWD, "{$user}:{$pass}");
        curl_multi_add_handle($master, $ch[$i]);
    }
    // -- run all requests at once; finish when done or when the timeout is hit --
    do {
        curl_multi_exec($master, $running);
        curl_multi_select($master); // wait for activity instead of busy-looping
    } while ($running > 0);
    $results = array();
    // -- collect the results --
    for ($i = 0; $i < $node_count; $i++) {
        $results[$i] = curl_multi_getcontent($ch[$i]);
        if ((int) curl_getinfo($ch[$i], CURLINFO_HTTP_CODE) > 399 || empty($results[$i])) {
            // flag failed/empty responses; adapt this to your own error handling
            error_log("multiCurlRequest: request failed or returned empty: {$curlList[$i]}");
            unset($results[$i]);
        }
        curl_multi_remove_handle($master, $ch[$i]);
        curl_close($ch[$i]);
    }
    curl_multi_close($master);
    if (empty($results)) return false;
    //$results = array_values($results); // -- removed as we want the original positions
    return $results;
}
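And a hypothetical call site, assuming $s3 and $bucketName were set up as in the example above (the file names are placeholders):
$curls = array();
foreach (array('file_one.ext', 'file_two.ext') as $name) {
    $curls[] = $s3->getAuthenticatedURL($bucketName, "path/{$name}", 3600);
}
if (($results = multiCurlRequest($curls)) !== false) {
    foreach ($results as $i => $body) {
        file_put_contents("/tmp/download_{$i}.ext", $body); // save each object locally
    }
}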