What follows is a "back to the basics" on getting good rankings
"Having a robots.txt to include the pages that you want the search engines to include"
My understanding is that you can ban unwanted bots, and any bot you don't mention will crawl anyway. What I read there is different: "include those you want to crawl your site". Is that really necessary?
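That understanding is correct: robots.txt is deny-by-default-allow, so a bot that isn't matched by any User-agent record may fetch everything. A quick sketch with Python's standard-library parser illustrates this (the hostname and bot names here are hypothetical, for illustration only):

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt that bans a single bot and says nothing about the rest.
robots_txt = """\
User-agent: BadBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The named bot is banned everywhere...
print(parser.can_fetch("BadBot", "http://example.com/page.html"))      # False
# ...but any bot not mentioned is allowed by default.
print(parser.can_fetch("Googlebot", "http://example.com/page.html"))   # True
```

So you only need to list the bots you want to restrict; there is no need to enumerate the ones you want to allow.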
One of the most overlooked points to keep in mind:
Note also that regular expressions are not supported in either the User-agent or Disallow lines. The '*' in the User-agent field is a special value meaning "any robot". Specifically, you cannot have lines like "Disallow: /tmp/*" or "Disallow: *.gif".
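To make that concrete, here is a minimal sketch of a robots.txt under the original standard; the paths are hypothetical examples, and the commented-out lines show the wildcard patterns the point above warns against:

```
# Valid: literal path prefixes only
User-agent: *
Disallow: /tmp/
Disallow: /private/

# Invalid under the original standard (wildcards not supported):
# Disallow: /tmp/*
# Disallow: *.gif
```

Because Disallow matches by prefix, "Disallow: /tmp/" already covers everything under that directory, so the wildcard form is unnecessary as well as invalid.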