Forum Moderators: goodroi
There isn't anything on the site that I want to keep the spiderbots away from, so I haven't bothered to put a robots.txt at the site root. I was under the impression that skipping robots.txt would just let the bots crawl unhindered, but they all still request the file anyway, and I'm getting tons of 404 errors in my logs as a result.
Anyone have an idea why this is happening? Should I put a basic allow-all robots.txt on the site just to keep the bots happy?
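For reference, I'm guessing a minimal allow-all file like this, served as /robots.txt from the site root, would stop the 404s while still letting every bot crawl everything (an empty Disallow blocks nothing):

```
# Minimal allow-all robots.txt
User-agent: *
Disallow:
```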