Forum Moderators: coopster


check website/url availability


sumobaby

4:07 pm on May 15, 2005 (gmt 0)

10+ Year Member



Hi there, we have a links page that is pulled out of a DB for ease of maintenance, updating, ordering, etc.

To avoid having too many dead links to sites, I wrote a quick-and-dirty script to loop through the URLs in the DB, but it doesn't work all the time, and I was wondering if there's a better, more precise way of checking.

Here's what I currently do:

while ($row = mysql_fetch_assoc($result)) {
    // fopen() returns a stream on success, false on failure;
    // @ suppresses the warning for unreachable URLs
    $fp = @fopen($row["url"], "r");

    if ($fp) {
        echo $row["title"] . " working<br>";
        fclose($fp);
    } else {
        echo $row["title"] . " <strong>not working</strong><br>";
    }
}
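One caveat with fopen(): it only tells you whether *something* answered, so a 404 page or an error document can still count as "working". A sketch of a stricter check using PHP's cURL extension and the HTTP status code; the >= 400 cutoff in statusLooksDead() is my own assumption about what counts as dead:

```php
<?php
// Treat connection failures (code 0) and HTTP errors (4xx/5xx) as dead.
// The >= 400 cutoff is an assumption; some live sites answer 403 to bots.
function statusLooksDead($code)
{
    return $code === 0 || $code >= 400;
}

// Fetch only the headers of $url and return the HTTP status code.
function fetchStatusCode($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request: headers only
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);           // don't hang on slow hosts
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $code;
}

// Usage inside the loop above:
// $code = fetchStatusCode($row["url"]);
// echo $row["title"]
//     . (statusLooksDead($code) ? " <strong>not working</strong>" : " working")
//     . "<br>";
```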

dreamcatcher

4:29 pm on May 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi,

I use Xenu for all my link checking. Seems to be pretty effective. You might also find other link checking solutions on Hotscripts.com.

dc

sumobaby

4:49 pm on May 15, 2005 (gmt 0)

10+ Year Member



That's interesting, thank you; I hadn't thought of that. I thought that perhaps I could ping the URLs on our links page, but then realised that sites displaying a registrar's holding page will not show as dead / unavailable. So I'm not sure there's a good solution anymore.

Also, a built-in script would be better, as what I planned to do is display all the sites the script says are dead, with the URL and name clickable, so I can review each one and either delete it or change its status to "under review", which would keep the URL in our links database but take it off the front end.
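The status change described above is just an UPDATE against the links table. A minimal sketch, assuming a `status` column and an integer `id` primary key (both my invention; adjust to the real schema), using the same mysql_* API as the script earlier in the thread:

```php
<?php
// Build the UPDATE that flags a link as under review without deleting it.
// Table/column names (links, status, id) are assumptions about the schema.
function buildReviewQuery($id)
{
    // (int) cast keeps untrusted ids out of the SQL string
    return sprintf("UPDATE links SET status = 'under_review' WHERE id = %d", (int) $id);
}

// Usage: run it when the reviewer clicks "under review" next to a dead link.
// mysql_query(buildReviewQuery($_GET['id']));
// The front-end listing then simply adds: WHERE status <> 'under_review'
```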

ergophobe

8:40 pm on May 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I don't know about Xenu, but if you used something like wget to grab the page and then grep for (registrars|register)\.com or something, and flag the ones where that comes up, you might be able to figure something out.
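That wget-and-grep idea translates to PHP fairly directly: fetch the page body and pattern-match for parking-page phrases. The phrase list below is an illustrative guess, not a tested registrar fingerprint, so expect both false positives and false negatives:

```php
<?php
// Return true if $html looks like a registrar holding / parked-domain page.
// The patterns are guesses; real parking pages vary a lot between registrars.
function looksParked($html)
{
    return (bool) preg_match(
        '/domain (is )?(parked|for sale)|this domain (was recently )?registered/i',
        $html
    );
}

// Usage sketch, combined with the DB loop from earlier in the thread:
// $html = @file_get_contents($row["url"]);
// if ($html !== false && looksParked($html)) {
//     echo $row["title"] . " looks like a holding page<br>";
// }
```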