Forum Moderators: goodroi
It wouldn't be logical for a missing robots.txt to be interpreted as "disallow all," because a large number of web sites would then never get a chance to be spidered: people who have never heard of robots.txt (personal home pages, etc.) would never make it onto the web. On the other hand, that doesn't mean it isn't true.
There seems to be no single answer to the robots.txt question.
There is one answer ;) Just put up a blank robots.txt file and everything will be crawled fine. And since we are in the Google forum: the most amazingly witty, funny and intelligent GoogleGuy told us to do it!
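For reference, a truly empty file and a file with an empty Disallow line are treated the same way by well-behaved crawlers: nothing is blocked. A minimal allow-all robots.txt looks like this:

```
# Allow all robots to crawl everything
User-agent: *
Disallow:
```

An empty value after `Disallow:` means "nothing is disallowed," which is the standard way to spell out unrestricted access explicitly.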
[edited by: nutsandbolts at 10:42 am (utc) on June 17, 2002]
I always list all of the important spiders and robots in my robots.txt and then specify whether each one gets full access, partial access, or no access at all.
In my opinion, this makes the file easier to change in the future.
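A sketch of that per-robot layout might look like the following (the specific paths are illustrative, not a recommendation; Googlebot, Slurp, and ia_archiver are real user-agent tokens used here only as examples):

```
# Full access for Googlebot
User-agent: Googlebot
Disallow:

# Partial access: keep Slurp out of a hypothetical /private/ directory
User-agent: Slurp
Disallow: /private/

# No access at all for ia_archiver
User-agent: ia_archiver
Disallow: /

# Default rule for every other robot
User-agent: *
Disallow:
```

Each robot obeys the most specific `User-agent` group that matches it, so listing the named bots first and ending with a catch-all `*` group keeps the file easy to extend later.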