Forum Moderators: coopster
I have been programming PHP for a little while and I have come across a conundrum. What I need to do for a site I am building is transfer an image from one server to another: basically, put in the web address and the path to the pic, download it to my server for a local copy, and display it from my server to save the other guy bandwidth.
I have made up one script, but it only seems to be able to handle pics smaller than 60k or so. It basically just opens the file, puts the contents into a string, and saves it as a JPG.
I do not have access to GD or anything, so is there a way I can do it effectively?
Instead, read a chunk then write a chunk, and repeat till the file is done.
i.e.
while (!feof($inFp)) {
    $data = fread($inFp, 4096);  // fread is binary-safe; fgets is line-oriented
    fwrite($outFp, $data);
}
Also beware of magic_quotes_runtime adding slashes to your files.
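On affected PHP setups, magic_quotes_runtime escapes quotes and backslashes in data read from streams, which corrupts binary files like JPEGs. A minimal sketch of disabling it before the transfer (this is an ini setting on PHP 4/5; on later versions the call is simply a no-op):

```php
<?php
// magic_quotes_runtime (PHP 4/5) adds slashes to data read via
// fread()/fgets(), mangling binary data. Disable it for the copy.
// On PHP versions without the setting, ini_set just returns false.
ini_set("magic_quotes_runtime", "0");
```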
$fd = fopen( $webpage, "r" );
$contents = fread($fd, 99999999999);
fclose( $fd );
$File = fopen("images/latest/latest.jpg","w");
fwrite($File,"$contents" );
fclose($File);
echo "..Done";
This is what I have ... how would I make it write in chunks?
You want something more like this..
[pre]
$webpage = " location of image here ";
$fd = fopen($webpage, "rb")
    or die("couldn't open webpage");
$File = fopen("images/latest/latest.jpg", "wb");
while (!feof($fd)) {
    $contents = fread($fd, 4096);
    fwrite($File, $contents);
}
fclose($fd);
fclose($File);
echo "..Done";
[/pre]
You might want to think about adding some more error handling. If someone gets disconnected while they are transferring the file, you might end up with half a file. So it might be better to write it into a temporary file and only replace the main file once you have definitely transferred a complete file.
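The temp-file idea above could be sketched like this. It is only an illustration, not a drop-in: `fetch_image()` and its arguments are made-up names, and the rename-into-place step assumes source and target sit on the same filesystem.

```php
<?php
// Sketch: download to a temporary name, then rename into place only
// after the whole file has arrived, so readers never see half a file.
function fetch_image($webpage, $target) {
    $in = fopen($webpage, "rb");
    if (!$in) {
        return false;
    }
    $tmp = $target . ".tmp";
    $out = fopen($tmp, "wb");
    if (!$out) {
        fclose($in);
        return false;
    }
    $ok = true;
    while (!feof($in)) {
        $data = fread($in, 4096);          // copy in 4k chunks
        if (fwrite($out, $data) === false) {
            $ok = false;                   // disk full, etc.
            break;
        }
    }
    fclose($in);
    fclose($out);
    if ($ok) {
        return rename($tmp, $target);      // swap in the complete file
    }
    unlink($tmp);                          // discard the partial download
    return false;
}
```

The same function works for a local path or a URL, since fopen handles both (URL wrappers must be enabled for the remote case).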