Most robots follow the robots exclusion standard (robots.txt).
The idea of limiting a robot's crawling is that you might not want all pages indexed (any indexed page can be an entry point to your site). You wouldn't want visitors to start their visit at the feedback page, or perhaps you don't want robots to index the PDF docs you keep in a certain directory. Or there might be a section that's password protected, e.g.
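To illustrate, a minimal robots.txt covering those cases might look like this (the paths here are just made-up examples, not anything specific to a real site):

```
User-agent: *
Disallow: /feedback.html
Disallow: /pdfs/
Disallow: /members/
```

Keep in mind robots.txt goes in the site root and is purely advisory — well-behaved crawlers honor it, but it is not access control, so a password-protected section still needs real authentication.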
*shrug* That's why WE used it.