Force-download script not reliable

would like help understanding why

         

pixeltierra

7:37 am on Sep 12, 2008 (gmt 0)

I have written a force-download script that reads large video files (30-50 MB) and outputs them to the browser, using header() to prompt a Save As dialog.

I have gone to great pains to track the progress of every download: what percentage downloads successfully, whether the download fails, and whether a failure comes from user cancellation, a time-out, or something else.

I have users downloading around the world with this script, and it seems to fail about as often as it succeeds. If it succeeds for one user, it almost always keeps succeeding; if it fails for a user, it almost always keeps failing, though not without exception. My data seems to indicate this is independent of OS, browser, and location. Usually when the download fails, it dies directly after the first chunk is flush()ed to the browser.

I've considered max_execution_time, session_write_close(), browser blocks, and a thousand other things, but I can't understand why this script consistently fails to even get going for some users (without any cancellation), fails in the middle for others, and works perfectly for the rest.

Oddly, the only thing I've found that influences whether this script tends to work for a given user is the $chunksize passed to fread(). Too small seems to encourage failure, as does too big. I've settled on a 'sweet spot' of 1024 * 1024 bytes, but there is still an unacceptably high failure rate.

So the input I would love to have from you wonderful folks is what might cause such erratic behavior?

Below is a short excerpt of the code:

// Dump the file, keeping track of how much has been downloaded.
while (!feof($handle)) {

    set_time_limit(30 * 60);

    $buf = fread($handle, $chunksize);

    echo $buf;

    // Exit if the user cancels or the connection times out
    // (connection_status() returns CONNECTION_ABORTED == 1 or
    // CONNECTION_TIMEOUT == 2; both branches did the same thing,
    // so they are merged here).
    if (connection_status() !== CONNECTION_NORMAL) {
        fclose($handle);
        $this->make_transcript($db, $smarty);
        exit;
    }

    flush();

    $bytes_sent += strlen($buf); /* We know how many bytes were sent to the user */

    $update_data = array(
        'percent_downloaded' => round(100 * $bytes_sent / $bytes_total),
        'elapsed' => (time() - $this->download_id) / 60,
    );

    // Store percent downloaded (also keeps the MySQL connection alive)
    $r2 = $db->update('downloads', $update_data, "download_id = $this->download_id");

    if (PEAR::isError($r2)) {
        $this->msg .= "Tracking code query failed \n";
        trigger_error(MYSQL_NOW . " db query that tracks download progress failed: " . serialize($r2), E_USER_WARNING);
    }
} // this closing brace was missing from the original excerpt

fclose($handle);
$this->make_transcript($db, $smarty);
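For what it's worth, one frequent cause of streams that die right after the first flush() is PHP's own output buffering or zlib compression sitting between echo and the client. A hedged sketch of the same kind of loop with those layers disabled first; $path, $filename, and the header values are illustrative assumptions, not anything from your code:

<?php
// Hedged sketch, assuming a plain file on disk. Disable PHP's buffering
// layers before streaming, then detect a client abort explicitly.
$path      = '/var/media/video.mp4';   // illustrative path
$filename  = 'video.mp4';
$chunksize = 1024 * 1024;              // the 1 MB "sweet spot" mentioned above

ignore_user_abort(true);               // keep running so the abort can be logged
set_time_limit(0);
ini_set('zlib.output_compression', 'Off');
while (ob_get_level() > 0) {           // drop every output-buffer layer
    ob_end_clean();
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Length: ' . filesize($path));

$handle = fopen($path, 'rb');
while (!feof($handle)) {
    echo fread($handle, $chunksize);
    flush();
    if (connection_aborted()) {        // true once the client is gone
        // log the failure / write the transcript here
        break;
    }
}
fclose($handle);

With ignore_user_abort(true), the script survives the disconnect long enough to record it, which matters for the tracking described above.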

tfk11

8:29 pm on Sep 12, 2008 (gmt 0)

I also encountered some odd issues while writing a thumbnail generator.

I tracked the issue down to a call to readfile(). I replaced the function with a homegrown solution:

for ($i = 0, $max = (int) ceil($file_length / $chunk_size); $i < $max; ++$i) {
    echo fread($file, $chunk_size);
    flush();
}

I first tried a while (!feof()) loop, but it didn't help. The for loop seemed to do the trick for me, though I have no idea why.
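The chunked readfile() replacement that usually gets passed around for this problem is written as a function; a hedged sketch of that pattern (the function name and default chunk size are conventions, not anything from this thread):

<?php
// Hedged sketch of the commonly circulated readfile_chunked() pattern:
// same idea as the for loop above, but bounded by feof() and returning
// the byte count so the caller can verify the whole file went out.
function readfile_chunked($path, $chunksize = 1048576) {
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        return false;
    }
    $sent = 0;
    while (!feof($handle)) {
        $buf   = fread($handle, $chunksize);
        $sent += strlen($buf);
        echo $buf;
        flush();
    }
    fclose($handle);
    return $sent; // compare against filesize($path) afterwards
}

Returning the byte count gives you a cheap sanity check: if it doesn't match filesize(), the transfer was cut short.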

pixeltierra

8:48 pm on Sep 12, 2008 (gmt 0)

That's quite odd. It looks like you're just reading the file out chunk by chunk with fread(), but that is essentially what readfile() should be doing internally anyway...

Either way, I'm already using fread(). I've made the following test files to check the influence of chunk size and of various headers being sent or not. Neither seems to have an influence.

[domain.com...]
[domain.com...]
[domain.com...]
[domain.com...]
[domain.com...]
[domain.com...]

[domain.com...]
[domain.com...]
[domain.com...]
[domain.com...]
[domain.com...]
[domain.com...]
[domain.com...]

Since these are big files, I'm considering using something like Amazon's S3 service. Any good experiences there?
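Before moving to S3, another option for big files is to let the web server stream the file instead of PHP. With Apache's mod_xsendfile (or nginx's equivalent X-Accel-Redirect), the script only emits headers and exits; a hedged sketch, assuming the module is installed and enabled, with an illustrative path:

<?php
// Hedged sketch: hand the transfer off to the web server via
// mod_xsendfile, so PHP never holds the file open for the whole
// download. Requires mod_xsendfile with XSendFile On.
$path     = '/var/media/video.mp4';  // illustrative path outside the docroot
$filename = 'video.mp4';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('X-Sendfile: ' . $path);      // Apache serves the file itself
// The script can end immediately; per-chunk progress tracking like
// yours would then have to come from the server's access logs instead.

The trade-off is exactly the tracking you've built: once the server owns the transfer, PHP no longer sees each chunk go out.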