


Varying results with Google and robots.txt

10:33 pm on May 28, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:May 18, 2006
votes: 0

After submitting to Sitemaps, I am noticing a difference between my sites that have a disallow on a particular directory and those that have no exclusion.

The sites with:
User-agent: *
Disallow: /print/

are not as well indexed as sites that have plain old:
User-agent: *

Does anyone think Google is beginning to take robots.txt rules more seriously, and that this is an effect of that? Has anyone else noticed similar results?

It does make me want to remove all my exclusions.
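
For anyone who wants to sanity-check how a standards-compliant parser reads the two files above, here is a minimal sketch using Python's standard urllib.robotparser module. The example.com URLs are hypothetical placeholders; only the /print/ rule comes from the thread.

from urllib.robotparser import RobotFileParser

# robots.txt with the /print/ exclusion described above
with_disallow = RobotFileParser()
with_disallow.parse([
    "User-agent: *",
    "Disallow: /print/",
])

# "plain old" robots.txt record with no exclusion
no_exclusion = RobotFileParser()
no_exclusion.parse([
    "User-agent: *",
])

# Hypothetical URLs: one ordinary page, one page under /print/
urls = [
    "http://example.com/article.html",
    "http://example.com/print/article.html",
]

for url in urls:
    print(url)
    print("  with Disallow: /print/ :",
          "crawlable" if with_disallow.can_fetch("Googlebot", url) else "blocked")
    print("  no exclusion           :",
          "crawlable" if no_exclusion.can_fetch("Googlebot", url) else "blocked")

Only the /print/ URL should come back blocked under the first file; every other URL is treated identically by both, so the rule by itself should not affect how the rest of a site is crawled.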

4:34 pm on May 29, 2006 (gmt 0)

Senior Member

tedster: WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member

joined:May 26, 2000
votes: 0

That wouldn't make any sense to me -- I think there are other factors, such as inbound links, that are more important for depth of indexing.

Removing the exclusion for /print/ could open you up to duplicate content problems. I would not suggest that.

