I have heard that most people use robots.txt just to disallow specific spiders/bots, and all the information about the robots.txt file only ever talks about disallowing, e.g.:

    User-agent: grub
    Disallow: /
But I wasn't sure how to use it to actually help your site. Maybe you could use it to direct the spiders to specific pages, or even somehow as a site map or link list of all your pages?
What are the benefits of the robots.txt file aside from banning certain spiders, and how would you set it up? Thanks a lot for any help, just trying to clear things up a bit :)
The robots.txt file is useful for disallowing specific robots from specific pages, scripts, images, etc. Like your meta description (in some engines), it gives you additional control over the "presentation" of your site: by keeping certain pages out of the index, you can prevent visitors from entering your site on some random page.
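For example, a minimal robots.txt might look something like this (the bot name and the paths are just placeholders for whatever you actually want to block):

    # Block one specific crawler from the entire site
    User-agent: grub
    Disallow: /

    # Keep all other crawlers out of scripts and unfinished pages
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /drafts/

A crawler obeys the first record whose User-agent matches it, so the specific record for grub takes precedence over the * record.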
Disallowing pest bots is a secondary function, although it is a much-discussed subject.
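As for the site-map idea: most of the major engines also support a Sitemap: line in robots.txt, which points crawlers at an XML sitemap listing your pages (the URL below is just an example):

    Sitemap: http://www.example.com/sitemap.xml

The sitemap file itself follows the sitemaps.org format; the robots.txt line just tells crawlers where to find it.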