Forum Moderators: goodroi

Message Too Old, No Replies

Robots.txt

newbie question


playadonna

2:31 pm on Sep 18, 2005 (gmt 0)



If I don't want to disallow any URLs, do I still need a robots.txt file?
What I mean is: is robots.txt only used to stop the bots from going to pages
that are not necessary?

jdMorgan

2:51 pm on Sep 18, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You are correct. robots.txt is used to exclude robots from files or directories.

If you have no robots.txt file and you monitor your server error log, you will find many errors logged for the missing robots.txt file. These errors can be eliminated -- making the error log much more useful for finding real errors -- by uploading a blank robots.txt file to your server. Alternatively, a robots.txt file containing only

User-agent: *
Disallow:

will serve the same purpose -- to allow all robots access to all resources, but eliminate 404 errors on robots.txt fetches.
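If you want to confirm that those two lines really allow everything, Python's standard-library robots.txt parser can evaluate them directly. A small sketch (the user-agent name and path below are just placeholders):

```python
# Parse the allow-all robots.txt rules with Python's built-in parser.
# parse() accepts the file's lines directly, so no HTTP fetch is needed.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow:"])

# An empty Disallow line permits every path for every user-agent.
print(rp.can_fetch("Googlebot", "/any/page.html"))  # True
print(rp.can_fetch("SomeOtherBot", "/"))            # True
```

The same parser can also be pointed at a live site with `set_url()` and `read()` to check the rules you have actually deployed.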

Jim