I still see people trying to access pages from search engine results years after the original pages have gone.
I've got a custom error page that will redirect them to the correct area for most of them.
But yes, disallowing those pages in robots.txt will stop the search engine bots from requesting them; a minimal sketch follows after this post.
Everybody but Inktomi Slurp and Jeeves used to handle 410 responses correctly, stopping requests for the file as soon as every spider host with that URL on its crawl list had seen the 410 response.
Jim
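For reference, here is a minimal robots.txt sketch of the disallow approach Jim mentions. The paths are hypothetical examples, not taken from the thread; substitute the URLs of your own removed pages:

```
# robots.txt - hypothetical example paths for pages that no longer exist.
# Compliant crawlers will stop requesting disallowed URLs altogether.
User-agent: *
Disallow: /old-section/
Disallow: /removed-page.htm
```

Note that a Disallow rule only stops compliant bots from fetching the URL; it does not by itself tell them the page is permanently gone.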
This is what I am talking about: "widgets.htm" is a page I removed from my website a month ago. In spite of that, msnbot keeps coming back again and again to fetch it. My server replies with a 404, as you can see.
A 410 Gone response is what should be returned.
What do I have to do to make the server return a 410 response instead of the 404 response?
Thanks.
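One way to do this, sketched here under the assumption that the site runs on Apache with mod_alias available (the thread does not say which server is in use), is to map the removed URL to a 410 explicitly:

```
# .htaccess (or virtual host config) - return 410 Gone for the removed page.
# The "gone" keyword tells mod_alias to answer 410 instead of the default 404.
Redirect gone /widgets.htm

# Alternative using mod_rewrite, if you prefer pattern matching:
# RewriteEngine On
# RewriteRule ^widgets\.htm$ - [G,L]
```

With a rule like this in place, requests for that URL receive 410 Gone, which well-behaved crawlers should treat as a signal to drop the URL from their crawl lists, whereas a 404 leaves them free to keep retrying.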