Forum Moderators: coopster

Message Too Old, No Replies

file_get_contents limitation?


chessmotifs

1:56 pm on Jul 4, 2007 (gmt 0)

10+ Year Member



Hi, I've got another problem...
See this simple file_get_contents code

[codes]
<?php
$link = 'http://www.example.com';
$content = file_get_contents($link);

echo $content;
?>
[/codes]

I tried the code with these links:
1. http://www.example.com (...worked OK. Also tried google.com, yahoo.com,... all OK.)
2. http://www.example.com/buy/chess (...$content captured an error page instead of the result)
3. http://www.example.com/chessmotifs (...$content returned nothing at all. This is the one I need the code to work for, btw.)

Could this be because the site (in this case CafePress) uses some sort of script to generate its sub-pages?
If yes, is there an alternative to "file_get_contents" that waits for a page to finish generating before capturing it?

Thanks for your advice!

[edited by: dreamcatcher at 1:59 pm (utc) on July 4, 2007]
[edit reason] Use example.com, thanks. [/edit]

Habtom

2:04 pm on Jul 4, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Could this be because the site (in this case CafePress) uses some sort of script to generate its sub-pages?
If yes, is there an alternative to "file_get_contents" that waits for a page to finish generating before capturing it?

It could be for a number of reasons. Since it is a dynamic page, you might be losing some of the parameters or inputs the page expects.
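One way to narrow this down is a sketch like the following: build the full URL (query string included) explicitly, and inspect the status line the server sends back, so an error page or a redirect is not mistaken for the real content. The URL and parameters below are placeholders, not CafePress's actual ones.

```php
<?php
// Sketch: pass a dynamic page its full query string and check the
// HTTP status line before trusting $content.

function build_url($base, $params) {
    // http_build_query() url-encodes each parameter for us
    return $params ? $base . '?' . http_build_query($params) : $base;
}

function fetch_with_status($url) {
    $content = @file_get_contents($url);
    if ($content === false) {
        echo "Request failed entirely\n";
        return;
    }
    // $http_response_header is populated by the http:// wrapper,
    // e.g. "HTTP/1.1 200 OK" or "HTTP/1.1 404 Not Found"
    if (isset($http_response_header[0])) {
        echo $http_response_header[0], "\n";
    }
    echo $content;
}

// Hypothetical usage:
// fetch_with_status(build_url('http://www.example.com/buy/chess', array('page' => 1)));
```

If the status line says 403 or 404, you are getting an answer from the server, just not the one you wanted; if the request fails entirely, something is blocking the connection itself.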

chessmotifs

7:44 am on Jul 6, 2007 (gmt 0)

10+ Year Member



Hi Habtom, thanks for your reply.
Just wondering then,...

Is there a way for me to save/dump DHTML output to a file?
(using php or other scripts)
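Saving what the server returns to a file can be sketched as below, with one caveat: PHP only ever receives the raw HTML the server sends. Anything that DHTML/JavaScript builds in the browser afterwards never reaches the script, so it cannot be dumped this way. The URL and filename are placeholders.

```php
<?php
// Sketch: dump a fetched page to disk with file_put_contents().
// Only the server's raw HTML is captured, never browser-side DHTML.

function save_page($url, $file) {
    $html = @file_get_contents($url);
    if ($html === false) {
        return false;                       // fetch failed; nothing written
    }
    return file_put_contents($file, $html) !== false;
}

// Hypothetical usage:
// save_page('http://www.example.com', 'dump.html');
```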

UserFriendly

11:43 pm on Jul 7, 2007 (gmt 0)

10+ Year Member



The webmasters for those sites may be blocking anything they suspect of being a script, to stop screen scraping, web-form abuse, etc.
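If the block is based on request headers, sending browser-like headers via a stream context sometimes gets past it. This is only a sketch under that assumption (the User-Agent string is made up), and scraping may well be against the site's terms of use.

```php
<?php
// Sketch: file_get_contents() accepts a stream context, which lets us
// send a User-Agent and extra headers like a browser would.

$context = stream_context_create(array(
    'http' => array(
        'user_agent' => 'Mozilla/5.0 (compatible; MyFetcher/1.0)',
        'header'     => "Accept: text/html\r\n",
    ),
));

// Hypothetical usage:
// $content = file_get_contents('http://www.example.com/chessmotifs', false, $context);
```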