Forum Moderators: goodroi

Compressing Robots.txt

Is this robots.txt compression valid?

5:24 pm on May 15, 2009 (gmt 0)

New User

10+ Year Member

joined:June 14, 2006
posts: 31
votes: 0

I wanted to compress my robots.txt and came up with the version below:

User-agent: Googlebot
User-agent: Googlebot-Mobile
Noindex: /*?
Disallow: /*%23
Disallow: *bots=nocrawl
Noindex: *bots=nocrawl
Noindex: /*~r
Noindex: *bots=noindex
Noindex: /includes/

User-agent: AdsBot-Google
User-agent: Mediapartners-Google
Disallow: /

User-agent: Googlebot-Image
Disallow: /images/
Allow: /*.jpg
Allow: /*.jpeg
Allow: /*.png
Allow: /*.gif

User-agent: WDG_SiteValidator
Disallow: /neuterale.php

User-Agent: MJ12bot

User-agent: Slurp
Crawl-delay: 300
User-agent: Msnbot
Crawl-delay: 120
User-agent: Teoma
Crawl-delay: 240
User-agent: *
Disallow: /*?
Disallow: /*%23
Disallow: /*~r
Disallow: *bots=nocrawl
Disallow: *bots=noindex
Disallow: /includes/

Is that valid? Will it work? If not, what do you see wrong?
No worries about the "Noindex" directive — it is unofficially supported by Googlebot.
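One way to sanity-check the grouping is Python's standard-library parser. Caveat: urllib.robotparser only does plain prefix matching — it ignores the "Noindex" lines and does not expand "*" wildcards — so this sketch (with a trimmed-down copy of the rules and a made-up example.com URL) only verifies that stacked User-agent lines and the plain Disallow rules behave as intended:

```python
# Sketch: check stacked User-agent groups with Python's stdlib robots.txt parser.
# Note: urllib.robotparser does simple prefix matching; it skips unknown
# directives like "Noindex" and does not interpret "*" path wildcards.
from urllib import robotparser

RULES = """\
User-agent: AdsBot-Google
User-agent: Mediapartners-Google
Disallow: /

User-agent: *
Disallow: /includes/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(RULES)

# Both agents stacked in the shared group should be blocked everywhere.
print(rp.can_fetch("Mediapartners-Google", "http://example.com/page.html"))  # False
print(rp.can_fetch("AdsBot-Google", "http://example.com/"))                  # False
# The catch-all group blocks only /includes/.
print(rp.can_fetch("SomeOtherBot", "http://example.com/includes/x.php"))     # False
print(rp.can_fetch("SomeOtherBot", "http://example.com/page.html"))          # True
```

If the stacked User-agent lines parse correctly here, at least one mainstream parser accepts the compressed form; the wildcard rules still need checking against each engine's own documentation.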

Thanks in advance for your kind support,


12:46 pm on May 18, 2009 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:June 21, 2004
votes: 380

It might work but I think you are probably going to have problems with at least one bot.

I don't understand why you are trying to compress your robots.txt file. It seems you are trying too hard to save a few KB.

I would worry more about making sure the robots.txt is accessible & understandable to search bots. Search robots can get easily confused. Your robots.txt should be formatted exactly as the search engines request if you want to make sure the robots follow your instructions.
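For example, if a crawler chokes on the stacked User-agent lines, the conservative fallback is to repeat the whole group once per agent, with a blank line between groups — something like this (using two of the directives from the original post, purely as an illustration):

```
User-agent: Googlebot
Disallow: /*%23
Disallow: /includes/

User-agent: Googlebot-Mobile
Disallow: /*%23
Disallow: /includes/
```

It is more verbose, but every robots.txt parser understands one-agent-per-group formatting.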

