Question 1: Here's the robots.txt file I'm currently using. To be perfectly honest, I can't explain the reasoning behind every exclusion like a pro would. :(
Does anyone see anything wrong with this list?
Robots.txt
User-agent: *
Disallow: /cgi-bin/
Disallow: /cgibin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/
Disallow: /wp-login.php
Disallow: /wp-register.php
Disallow: /images/
Disallow: /go/
Disallow: /privacy-policy/
Disallow: /comment-policy/
Disallow: /terms-of-service/
Disallow: /faq/
Disallow: /contact-form/
Disallow: /iframes/
Disallow: /*?*

User-agent: psbot
Disallow: /

User-agent: Xenu
Disallow: /

User-agent: ia_archiver
Disallow: /

User-agent: Baiduspider
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: Googlebot-Image
Disallow: /
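For what it's worth, here's how I've been sanity-checking which URLs these rules actually block, using Python's built-in robotparser. The domain and test paths are just placeholders, and note that the stdlib parser follows the original robots.txt convention, so it treats Google-style wildcards like /*?* as literal text; wildcard rules are better verified with Google Search Console's robots.txt tester.

from urllib.robotparser import RobotFileParser

# Paste the full robots.txt from above into this string (abbreviated here).
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /images/

User-agent: Googlebot-Image
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Each tuple: (crawler name, URL to test). example.com is a placeholder.
tests = [
    ("Googlebot", "https://example.com/wp-admin/"),       # blocked by the * group
    ("Googlebot", "https://example.com/some-post/"),      # allowed
    ("Googlebot-Image", "https://example.com/anything"),  # blocked by Disallow: /
]

for agent, url in tests:
    allowed = parser.can_fetch(agent, url)
    print(f"{agent:18} {url:40} {'ALLOWED' if allowed else 'BLOCKED'}")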
Question 2: Is a robots.txt file for a WordPress site not really all that important for indexing or SEO purposes? I see so many different versions of robots.txt, even among "the pros". (I also wonder whether I'm actually seeing what the bots see, or whether the pros cloak their robots.txt files; there's a rough way to test that at the end of this post.)
Is it more a matter of "Don't be a dumbass by excluding bots from sections/content that OUGHT to be indexed"?
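In case anyone else wants to test the cloaking suspicion, a rough check is to fetch robots.txt twice with different User-Agent headers and compare the responses. This is only a sketch (example.com is a placeholder for the site you're checking), and real cloaking is usually keyed to crawler IP ranges rather than the User-Agent string, so identical responses here don't completely rule it out.

import urllib.request

ROBOTS_URL = "https://example.com/robots.txt"  # placeholder domain

def fetch_robots(user_agent: str) -> str:
    """Fetch robots.txt while identifying as the given user agent."""
    req = urllib.request.Request(ROBOTS_URL, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

browser_view = fetch_robots("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
bot_view = fetch_robots("Googlebot/2.1 (+http://www.google.com/bot.html)")

# Identical output only suggests no User-Agent-based cloaking.
print("Identical" if browser_view == bot_view else "Different -- possible cloaking")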