PHP cURL, read remote file and write contents to local file
Actually, using fwrite is only part of the answer. To avoid memory-exhaustion problems with large files (PHP's "Allowed memory size exhausted" fatal error), you'll need to set up a callback function that writes each chunk to the file as it arrives, instead of buffering the whole response.
NOTE: I would recommend creating a class specifically to handle file downloads and file handles etc. rather than EVER using a global variable, but for the purposes of this example, the following shows how to get things up and running.
So, do the following:
# Set up a global file pointer
$GlobalFileHandle = null;

function saveRemoteFile($url, $filename) {
    global $GlobalFileHandle;
    set_time_limit(0);

    # Open the file for writing...
    $GlobalFileHandle = fopen($filename, 'w+');

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FILE, $GlobalFileHandle);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_USERAGENT, "MY+USER+AGENT"); # Make this valid if possible
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); # optional
    curl_setopt($ch, CURLOPT_TIMEOUT, 0); # optional: 0 = unlimited, 3600 = 1 hour
    curl_setopt($ch, CURLOPT_VERBOSE, false); # Set to true to see all the innards

    # Note: do NOT set CURLOPT_RETURNTRANSFER here -- it buffers the response
    # in memory, which is exactly what we're trying to avoid.

    # Only if you need to bypass SSL certificate validation:
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);

    # Assign a callback function to the cURL write function
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, 'curlWriteFile');

    # Execute the download - note we DO NOT put the result into a variable!
    curl_exec($ch);

    # Close cURL
    curl_close($ch);

    # Close the file pointer
    fclose($GlobalFileHandle);
}

function curlWriteFile($cp, $data) {
    global $GlobalFileHandle;
    # Must return the number of bytes written, or cURL aborts the transfer
    return fwrite($GlobalFileHandle, $data);
}
You can also create a progress callback to show how much / how fast you're downloading; however, that's another example, as it can be complicated when outputting to the CLI.
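For completeness, here is a minimal sketch of such a progress callback, to be added alongside the other curl_setopt() calls above before curl_exec(). The callback signature shown is the one cURL uses as of PHP 5.5+; the percentage formatting is just an illustration:

```php
# Enable progress reporting and hook in a callback.
curl_setopt($ch, CURLOPT_NOPROGRESS, false);
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION,
    function ($ch, $downloadSize, $downloaded, $uploadSize, $uploaded) {
        if ($downloadSize > 0) {
            # \r rewrites the same CLI line on each update
            printf("\rDownloaded %d of %d bytes (%.1f%%)",
                $downloaded, $downloadSize, 100 * $downloaded / $downloadSize);
        }
        return 0; # returning a non-zero value aborts the transfer
    }
);
```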
Essentially, this will take each block of data downloaded, and dump it to the file immediately, rather than downloading the ENTIRE file into memory first.
This is a much safer way of doing it! Of course, you must make sure the URL is correct (convert spaces to %20 etc.) and that the local file is writable.
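For the space-to-%20 conversion, a simple approach is a straight replacement (the example URL is made up; for full percent-encoding you'd want to rawurlencode() each path segment instead):

```php
<?php
# Percent-encode spaces in a URL before handing it to cURL.
$url = 'http://example.com/some file.zip';
$safeUrl = str_replace(' ', '%20', $url);
echo $safeUrl; # http://example.com/some%20file.zip
```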
Cheers, James.
Let's try sending a GET request to http://facebook.com:
$ curl -v http://facebook.com
* Rebuilt URL to: http://facebook.com/
* Hostname was NOT found in DNS cache
*   Trying 69.171.230.5...
* Connected to facebook.com (69.171.230.5) port 80 (#0)
> GET / HTTP/1.1
> User-Agent: curl/7.35.0
> Host: facebook.com
> Accept: */*
>
< HTTP/1.1 302 Found
< Location: https://facebook.com/
< Vary: Accept-Encoding
< Content-Type: text/html
< Date: Thu, 03 Sep 2015 16:26:34 GMT
< Connection: keep-alive
< Content-Length: 0
<
* Connection #0 to host facebook.com left intact
What happened? It appears that Facebook redirected us from http://facebook.com to the secure https://facebook.com/. Note the response body length:

Content-Length: 0

It means that zero bytes will be written to xxxx--all_good.txt. This is why the file stays empty.
Your solution is absolutely correct:
$fp = fopen('file.txt', 'w');
curl_setopt($handle, CURLOPT_FILE, $fp);
curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
All you need to do is change the URL to https://facebook.com/.
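Putting it together, a minimal sketch (the file.txt filename comes from your snippet; using CURLOPT_FOLLOWLOCATION instead of hard-coding the https URL is my suggestion, so cURL follows the 302 redirect itself):

```php
$fp = fopen('file.txt', 'w');

$handle = curl_init('https://facebook.com/');
curl_setopt($handle, CURLOPT_FILE, $fp);
# Alternative: keep the http:// URL and let cURL follow the redirect
curl_setopt($handle, CURLOPT_FOLLOWLOCATION, true);

curl_exec($handle);
curl_close($handle);
fclose($fp);
```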
Regarding other answers:
- @JonGauthier: No, there is no need to use fwrite() after curl_exec()
- @doublehelix: No, you don't need CURLOPT_WRITEFUNCTION for such a simple operation, which is copying contents to a file.
- @ScottSaunders: touch() creates an empty file if it doesn't exist. I think that was the OP's intention.
Seriously, three answers and every single one is invalid?