
Compressing Robots.txt

Is this robots.txt compression valid?

5:24 pm on May 15, 2009 (gmt 0)

I wanted to compress my robots.txt, and this is what I came up with:

User-agent: Googlebot
User-agent: Googlebot-Mobile
Noindex: /*?
Disallow: /*%23
Disallow: *bots=nocrawl
Noindex: *bots=nocrawl
Noindex: /*~r
Noindex: *bots=noindex
Noindex: /includes/

User-agent: AdsBot-Google
User-agent: Mediapartners-Google
Disallow: /

User-agent: Googlebot-Image
Disallow: /images/
Allow: /*.jpg
Allow: /*.jpeg
Allow: /*.png
Allow: /*.gif

User-agent: WDG_SiteValidator
Disallow: /neuterale.php

User-Agent: MJ12bot
Disallow:

User-agent: Slurp
Crawl-delay: 300
User-agent: Msnbot
Crawl-delay: 120
User-agent: Teoma
Crawl-delay: 240
User-agent: *
Disallow: /*?
Disallow: /*%23
Disallow: /*~r
Disallow: *bots=nocrawl
Disallow: *bots=noindex
Disallow: /includes/

Is that valid? Will it work? If not, what do you see wrong?
Don't worry about the "Noindex" directive; it is unofficially supported by Googlebot.
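For what it's worth, the grouping can be sanity-checked with Python's standard urllib.robotparser. Note it is a strict parser that only implements the original robots.txt spec, and the trimmed file and example.com URLs below are just for illustration:

import urllib.robotparser

# Trimmed-down, illustrative version of the file above.
robots_txt = """\
User-agent: Googlebot
User-agent: Googlebot-Mobile
Disallow: /includes/

User-agent: Slurp
Crawl-delay: 300

User-agent: *
Disallow: /includes/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Both agents named in the shared group pick up the same rule.
print(rp.can_fetch("Googlebot", "http://example.com/includes/x"))         # False
print(rp.can_fetch("Googlebot-Mobile", "http://example.com/includes/x"))  # False

# The Crawl-delay value is readable too (crawl_delay() needs Python 3.6+).
print(rp.crawl_delay("Slurp"))  # 300

So at least a spec-following parser accepts several User-agent lines in one group.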

Thanks in advance for your kind support,

John

12:46 pm on May 18, 2009 (gmt 0)

goodroi (WebmasterWorld Administrator)

It might work, but you are probably going to have problems with at least one bot.

I don't understand why you are trying to compress your robots.txt file. It seems like you are trying too hard to save a few KB.

I would worry more about making sure the robots.txt is accessible and understandable to search bots. Search robots can easily get confused. Your robots.txt should be formatted exactly as the search engines request if you want to be sure the robots follow your instructions.
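For example, wildcard patterns like "/*?" are a Google extension rather than part of the original robots.txt standard, so a plain spec-only parser reads them as literal text. A quick demonstration with Python's standard urllib.robotparser (the bot name and URLs are just placeholders):

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /*?
Disallow: /includes/
""".splitlines())

# A wildcard-aware bot like Googlebot would block this URL, but the strict
# parser reads "/*?" literally, never matches it, and allows the fetch.
print(rp.can_fetch("SomeBot", "http://example.com/page?x=1"))    # True (allowed)

# A plain prefix rule behaves the same everywhere.
print(rp.can_fetch("SomeBot", "http://example.com/includes/a"))  # False (blocked)

That is exactly the kind of mismatch that will trip up at least one bot.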