Forum Moderators: coopster
If I do this myself, I will use exec('wget http://www.example.com/test.pdf --output-document=something.pdf'). Is this reliable enough?
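If you do go the exec() route, it is worth escaping the arguments and checking wget's exit status rather than firing and forgetting. A rough sketch, assuming wget is installed on the server and exec() is not disabled in php.ini:

```php
<?php
// Hedged sketch: shelling out to wget a little more defensively.
$url  = "http://www.example.com/test.pdf";
$dest = "something.pdf";

// escapeshellarg() guards against shell metacharacters in the URL or path.
$cmd = "wget --output-document=" . escapeshellarg($dest) . " " . escapeshellarg($url);
exec($cmd, $output, $status);

if ($status !== 0) {
    // A non-zero exit status means wget failed (bad URL, network error, etc.).
    echo "wget exited with status $status\n";
}
```

Even so, this depends on an external binary being present, which is one reason people prefer doing the download inside PHP itself.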
Also, PHP ships with a cURL extension. It covers the same ground as wget; the extension takes a bit more code to use, but is probably more robust.
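A minimal cURL sketch, assuming the example URL and filename from above and that the cURL extension is enabled:

```php
<?php
// Hedged sketch: stream a remote file to disk with the cURL extension.
$url = "http://www.example.com/test.pdf";
$fp  = fopen("something.pdf", "wb");            // destination, binary mode

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);            // write the body straight to the file
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow HTTP redirects
curl_setopt($ch, CURLOPT_FAILONERROR, true);    // treat HTTP >= 400 as an error

if (!curl_exec($ch)) {
    echo "Download failed: " . curl_error($ch) . "\n";
}
curl_close($ch);
fclose($fp);
```

Because cURL writes directly to the file handle, the whole download never has to sit in PHP's memory at once.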
$destination = fopen("files/localfile.dat", "wb"); // "b" flag for binary safety
$source = fopen($url, "rb"); // requires allow_url_fopen to be enabled
while ($a = fread($source, 1024)) fwrite($destination, $a);
fclose($source);
fclose($destination);
The advantage of this method is that you can enforce a maximum download size without having to fetch the whole file before checking how big it is:
$destination = fopen("files/localfile.dat", "wb");
$source = fopen($url, "rb");
$maxsize = 3000; // limit in bytes
$length = 0;
while (($a = fread($source, 1024)) && ($length < $maxsize))
{
    $length = $length + strlen($a); // count bytes actually read, not the buffer size
    fwrite($destination, $a);
}
fclose($source);
fclose($destination);
Imagine your server has 3 GB of free space and someone supplies a 3.1 GB file by URL. Your server is going to have serious problems pretty soon if you swallow the whole thing first and only check afterwards.
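When the remote server sends a Content-Length header, you can often reject an oversized file before transferring a single byte. A rough sketch, reusing the $url and $maxsize from above; note the header is optional and can be wrong, so keep the in-loop byte count as a backstop:

```php
<?php
// Hedged sketch: inspect Content-Length before downloading anything.
$url = "http://www.example.com/test.pdf";
$maxsize = 3000; // bytes

// get_headers() makes its own request; passing 1 keys the result by header name.
$headers = get_headers($url, 1);
if (isset($headers["Content-Length"]) && (int)$headers["Content-Length"] > $maxsize) {
    echo "File too large, refusing to download\n";
} else {
    // Fall through to the size-limited read loop; the header may be
    // missing or inaccurate, so the loop's own check still matters.
}
```

This is a cheap first line of defence; the streaming check is what actually guarantees the limit.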