Forum Moderators: coopster
The structure of the code looks something like this:
<?php
$result = mysql_query("select url from table where 1");
while ($row = mysql_fetch_array($result)) {
    $myurl = $row['url'];
    // fetch the remote page
    $rawdata = file_get_contents($myurl);
    // (some processing to grab data I want)
    mysql_query("insert into table values (...)");
}
?>
If I run this for each individual URL separately, it works fine. However, inside the loop it starts pulling incorrect data: it looks like "real data", but from the wrong file.
I'm guessing this has something to do with a timing issue with the server.
Any suggestions?
1 - Display the URL inside the loop so you can see which URL is currently being fetched and where it breaks.
2 - If possible, use the cURL library, which gives you more control for spidering complex websites that keep passing control from page to page.
3 - Do not store data BLINDLY; print the INSERT query on each loop run too, so you can see exactly what you are sending to the database.
4 - Do I need to tell you about the usage of magic_quotes?
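To make suggestions 1-3 concrete, here is a minimal sketch that echoes each URL before fetching, swaps file_get_contents() for cURL, and prints every INSERT before running it. The table/column names are the placeholders from the original post, and $value stands in for whatever your processing step actually extracts:

```php
<?php
$result = mysql_query("select url from table where 1");
while ($row = mysql_fetch_array($result)) {
    $myurl = $row['url'];
    echo "Fetching: $myurl\n";          // suggestion 1: show which URL is in flight

    $ch = curl_init($myurl);            // suggestion 2: cURL gives finer control
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body as a string
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects between pages
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);           // don't hang on a slow server
    $rawdata = curl_exec($ch);
    if ($rawdata === false) {
        echo "cURL error for $myurl: " . curl_error($ch) . "\n";
        curl_close($ch);
        continue;                       // skip this URL instead of inserting garbage
    }
    curl_close($ch);

    // (some processing to grab the data you want into $value)
    $sql = "insert into table values ('" . mysql_real_escape_string($value) . "')";
    echo "Query: $sql\n";               // suggestion 3: see exactly what you store
    mysql_query($sql);
}
?>
```

The mysql_real_escape_string() call also covers point 4: escape the data yourself rather than relying on magic_quotes.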