Forum Moderators: goodroi
User-agent: Googlebot
User-agent: Googlebot-Mobile
Noindex: /*?
Disallow: /*%23
Disallow: *bots=nocrawl
Noindex: *bots=nocrawl
Noindex: /*~r
Noindex: *bots=noindex
Noindex: /includes/
User-agent: AdsBot-Google
User-agent: Mediapartners-Google
Disallow: /
User-agent: Googlebot-Image
Disallow: /images/
Allow: /*.jpg
Allow: /*.jpeg
Allow: /*.png
Allow: /*.gif
User-agent: WDG_SiteValidator
Disallow: /neuterale.php
User-Agent: MJ12bot
Disallow:
User-agent: Slurp
Crawl-delay: 300
User-agent: Msnbot
Crawl-delay: 120
User-agent: Teoma
Crawl-delay: 240
User-agent: *
Disallow: /*?
Disallow: /*%23
Disallow: /*~r
Disallow: *bots=nocrawl
Disallow: *bots=noindex
Disallow: /includes/
Is that valid? Will it work? If not, what do you see wrong?
No worries about the "Noindex" directive; it is unofficially supported by Googlebot.
Thanks in advance for your kind support,
John
I don't understand why you are trying to compress your robots.txt file. It seems you are trying too hard to save a few KB.
I would worry more about making sure the robots.txt is accessible and understandable to search bots. Search robots can easily get confused. Your robots.txt should be formatted exactly as the search engines specify if you want to make sure the robots follow your instructions.
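To see how differently parsers can read the same rules, here is a sketch using Python's standard-library robots.txt parser on a trimmed-down version of your file (the example.com URLs are hypothetical). Note that the stdlib parser does not implement Googlebot-style `*` wildcards, which is exactly the kind of divergence between parsers to watch out for:

```python
from urllib.robotparser import RobotFileParser

# Trimmed-down version of the rules in question (hypothetical site).
RULES = [
    "User-agent: *",
    "Disallow: /*?",
    "Disallow: /includes/",
]

parser = RobotFileParser()
parser.modified()   # mark the rules as loaded so can_fetch() will answer
parser.parse(RULES)

# A plain prefix rule works as expected...
print(parser.can_fetch("*", "https://example.com/includes/header.php"))  # False (blocked)

# ...but this parser treats "/*?" as a literal prefix, not a wildcard, so a
# query-string URL that Googlebot would block is reported as fetchable.
print(parser.can_fetch("*", "https://example.com/page.php?id=1"))        # True
```

In other words, only the plain-path rules behave the same everywhere; the wildcard and Noindex lines depend entirely on which bot is reading them.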