I think this will work, but I thought I'd get a few more opinions to be sure.
A robots.txt file is used to stop search engine spiders from crawling pages you don't want them to see.
Does this mean that if a site has been updated and some old pages no longer exist, I could in theory list those old URLs in my robots.txt, and they would eventually drop out of the search engines' listings?
Am I thinking about this correctly, or do robots not work like that?
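For what it's worth, here's a minimal sketch of how a well-behaved spider reads Disallow rules, using Python's standard `urllib.robotparser`. The paths and domain are made up for illustration; the point is just that a Disallow rule makes a compliant crawler skip the URL.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents listing a page that no longer exists
robots_lines = [
    "User-agent: *",
    "Disallow: /old-page.html",
]

rp = RobotFileParser()
rp.parse(robots_lines)  # normally fetched from /robots.txt, parsed here directly

# A compliant spider checks each URL against the rules before crawling it
blocked = rp.can_fetch("*", "http://example.com/old-page.html")
allowed = rp.can_fetch("*", "http://example.com/new-page.html")
print(blocked)  # False - the spider should not fetch this URL
print(allowed)  # True
```

Note that Disallow only stops the crawler from *fetching* the page; engines can take a long time to actually drop a blocked URL from their index.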
The problem is not the robots.txt file, but the fact that many SEs don't update their database often enough, resulting in stale SERPs.
[edited by: engine at 10:45 am (utc) on July 22, 2002]
Personally, I wouldn't do that, because over time your robots.txt file would become rather large and unwieldy. Just let the spiders generate 404s and they'll get the message that the pages have gone.
As far as I know, there are no detrimental effects to doing it this way.
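To illustrate the 404 approach: a server that simply no longer knows about the old path will answer with 404, which is the signal the spiders pick up on. Here's a toy sketch with Python's stdlib `http.server` (the page names are hypothetical; a real site would just delete the files and let the web server do this automatically).

```python
import http.server
import threading
import urllib.request
import urllib.error

# Hypothetical site: only these pages still exist after the update
EXISTING_PAGES = {"/new-page.html"}

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in EXISTING_PAGES:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
        else:
            # Removed pages fall through to a plain 404 - no robots.txt entry needed
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

status = None
try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/old-page.html")
except urllib.error.HTTPError as e:
    status = e.code  # a spider hitting the stale URL sees this code

print(status)  # 404
server.shutdown()
```

If you want to be more explicit that a page is gone for good, returning 410 Gone instead of 404 is another option some engines treat as a stronger removal hint.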