Forum Moderators: goodroi
With sites expanding and changing, many a webmaster will be unhappy to find unwanted material appearing in the SERPs, and will go hunting for the URL removal tool or a way to ban the bot altogether.
So my question to you is:
How often (in hours, days, bursts or whatever measurement) do you feel it would be appropriate for the robots.txt to be checked?
You get a lot of 404 errors from bots trying to find it, cluttering up your error logs and hiding real errors!
Other than that, the lack of a robots.txt file is interpreted by robots to mean, "request anything you like."
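That "request anything you like" default can be demonstrated with Python's standard `urllib.robotparser`: a parser fed no rules at all (the effective meaning of a missing robots.txt) allows every URL, and an explicit allow-all file behaves identically. The example.com URL here is just a placeholder.

```python
from urllib.robotparser import RobotFileParser

# No rules at all -- the effective meaning of a missing robots.txt.
empty = RobotFileParser()
empty.parse([])
print(empty.can_fetch("Googlebot", "https://example.com/private/page"))

# An explicit allow-all file (empty Disallow) gives the same answer.
allow_all = RobotFileParser()
allow_all.parse(["User-agent: *", "Disallow:"])
print(allow_all.can_fetch("Googlebot", "https://example.com/private/page"))
```

Both `can_fetch` calls return True: with nothing disallowed, every user-agent may request every path.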
A good default robots.txt file which allows unlimited access but prevents all those 404s is:
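Assuming standard robots.txt syntax, that allow-all default is the familiar two-line file (an empty Disallow directive blocks nothing):

```
User-agent: *
Disallow:
```

Serving this at /robots.txt returns a 200 to every crawler that asks for it, so the 404s disappear from your logs while access stays unrestricted.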