


readfile timeout, possible? how?

     
9:32 am on Mar 3, 2005 (gmt 0)
Junior Member · joined Nov 29, 2003 · posts: 94


Hi. I'm reading an external http:// file into my page via readfile(). Is there any way to set a timeout on the operation? Sometimes the external file can be very slow and it holds up my page.

What I'd ideally like is: if the external file has not responded or been read completely within, say, 1 second, then give up and continue with my page. As it stands it waits about 60 seconds.

The only way I can do it at the moment is with an iframe, so the frame can load late (or not at all) while my page completes, but I need the contents IN my page.
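In rough terms, this is the sort of behaviour I'm hoping is possible (an untested sketch on my part; I'm only guessing that a per-request timeout option like this exists, and example.com stands in for the real address):

// Untested sketch: give the remote fetch roughly 1 second, then fall back
// to nothing so the rest of the page carries on. example.com is a stand-in.
$context = stream_context_create(array(
    'http' => array('timeout' => 1)   // seconds before giving up (my guess at the option)
));
$related = @file_get_contents('http://www.example.com/foo.html', false, $context);
if ($related === false) {
    $related = '';                    // remote too slow: carry on without it
}
echo $related;

If the fetch fails or takes too long, the rest of the page should just carry on without the remote content.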

Many, many thanks for any help you can offer.

12:30 pm on Mar 3, 2005 (gmt 0)
Preferred Member · joined May 12, 2004 · posts: 533


You could set up a cron job to run in the background every ten minutes or so, pulling the information from that page into a database; your page then just reads the latest data from the database, which gets rid of the annoying wait.

I've done this on a few sites when pulling in stock prices or similar.
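Something along these lines (a rough sketch only; I'm writing to a flat file rather than a database just to keep it short, and the URL and cache path are placeholders):

// fetch_remote.php -- run from cron every ten minutes or so, e.g.:
//   */10 * * * * php /path/to/fetch_remote.php
$url   = 'http://www.example.com/foo.html';
$cache = '/tmp/remote_cache.html';

$data = @file_get_contents($url);       // the slow fetch happens here, not in your page
if ($data !== false && strlen($data) > 0) {
    file_put_contents($cache, $data);   // only overwrite the cache on success
}

Your page then just does readfile('/tmp/remote_cache.html'), which is local and effectively instant, instead of hitting the remote server on every request.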

Cheers,
hughie

8:36 pm on Mar 3, 2005 (gmt 0)
Junior Member · joined Nov 29, 2003 · posts: 94


Thanks for the suggestion, but I'd rather not go down that route if at all possible.

I have been playing around with stream_set_timeout() without luck; it just doesn't seem to do anything, whatever timeout I set:


$fp = fopen("http://www.example.com", "r");
stream_set_timeout($fp, 1);          // hoping this caps the wait at 1 second
$res = fread($fp, 2000);
$info = stream_get_meta_data($fp);
fclose($fp);
if ($info['timed_out']) {
    $related = "unavailable";
} else {
    $related = file_get_contents("http://www.example.com");
}

Can anyone show me what I'm doing wrong? Thanks.

11:06 pm on Mar 6, 2005 (gmt 0)
Junior Member · joined Nov 29, 2003 · posts: 94


Well, I'm still tearing my hair out here; has anyone got any ideas, please?

I have now tried the following, but it still just times out after 60 seconds.

$file = "http://www.example.com/foo.html";

$fp = fopen($file, 'rb');
if (!$fp) {
    die("Failed to open $file<br>\n");
}

$timelimit = 2;                      // seconds I'm prepared to wait
stream_set_blocking($fp, false);     // non-blocking reads
stream_set_timeout($fp, 1);
$start = microtime(true);
usleep(100);

$chunk = fread($fp, 10);             // just probe for the first few bytes

if (microtime(true) - $start > $timelimit) {
    $related = "";                   // took too long, skip it
} else {
    $related = file_get_contents($file);
}

Thanks

11:41 pm on Mar 6, 2005 (gmt 0)
Preferred Member · joined May 12, 2004 · posts: 533


Can you give us the context you're using this in? We might be able to come up with an alternative.

Hughie

12:06 am on Mar 7, 2005 (gmt 0)
Junior Member · joined Nov 29, 2003 · posts: 94


I'm trying to include a page of information from a remote server (just tables of content, no <head> info). The information is HTML, and I'm formatting it myself before output in my page. The text on that page is constantly changing, so I'm trying to avoid caching. The content is from an old employer's website that I struck a deal with, but their server sometimes slows to a crawl.

There's not a lot more I can say, but why my script won't work is beyond me.
I must be doing something wrong, as I never seem to get stream_set_timeout() to do anything.
I have also tried fsockopen() with a timeout, but the server does respond; it's just slow once the stream is open.
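For reference, this is roughly the fsockopen() variant I mean (a sketch only; www.example.com and /foo.html are placeholders for the real host and path, and the raw response would still include the HTTP headers):

$host = 'www.example.com';
$path = '/foo.html';

$fp = @fsockopen($host, 80, $errno, $errstr, 1);   // 1 second connect timeout
$related = '';
if ($fp) {
    stream_set_timeout($fp, 1);                    // 1 second read timeout once connected
    fwrite($fp, "GET $path HTTP/1.0\r\nHost: $host\r\nConnection: close\r\n\r\n");
    while (!feof($fp)) {
        $related .= fread($fp, 4096);
        $meta = stream_get_meta_data($fp);
        if ($meta['timed_out']) {                  // reads stalled: give up and move on
            $related = '';
            break;
        }
    }
    fclose($fp);
}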

Thanks