Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Crawling old pages


zantoro

2:35 pm on Aug 3, 2009 (gmt 0)

10+ Year Member



Hello,
Google keeps crawling nonexistent pages of my website (it was replaced with a new one 2 years ago), which in some cases is fine, but in other cases my web server tries to answer anyway, causing performance problems.
Is there a way to tell Google those pages aren't there anymore?

Thanks,
Roberto.

tedster

10:25 pm on Aug 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A 404 or 410 http response from the server tells Google that the url isn't there anymore. Google will continue to request those urls for a long time, with declining frequency - after all, webmasters could replace the url with new content. In fact, I've had good luck for clients by finding 404 urls with significant backlinks and putting new content there. They can rank rather quickly!
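If the site runs on Apache, one way to send an explicit 410 for retired urls is mod_alias's `Redirect gone` directive - a minimal sketch, where the paths are made-up examples standing in for your own old urls:

```
# .htaccess - hypothetical paths, substitute your own retired urls
Redirect gone /old-catalog/
Redirect gone /products/old-widget.html
```

A 410 signals the removal is permanent, which can get the urls dropped from the index a little faster than a plain 404.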

A "Disallow" rule in robots.txt will tell Google not to request the url.

in other cases my web server tries to answer anyway causing performance problems

To be short about it - you need to fix that. If it's extremely complex, then these urls are immediate candidates for a robots.txt disallow.