Varying results with Google and robots.txt

10:33 pm on May 28, 2006 (gmt 0)

Junior Member (joined: May 18, 2006; posts: 68)

After submitting to Sitemaps, I am noticing a difference between my sites that disallow a particular directory and those that have no exclusions.

The sites with:
User-agent: *
Disallow: /print/

Are not as well indexed as sites that have plain old:
User-agent: *
Disallow:

Does anyone think that Google is beginning to take robots.txt rules more seriously, and that this is an effect of that?
Has anyone else noticed similar results?

It does make me want to remove all exclusions.
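(If you want to double-check what a given robots.txt actually blocks before changing it, here is a minimal sketch using Python's standard urllib.robotparser -- the domain and URLs are hypothetical, substitute your own.)

from urllib import robotparser

# Hypothetical site -- point this at your own robots.txt.
rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# With "Disallow: /print/" only the print copies should come back blocked;
# every other URL should still be crawlable.
for url in ("http://www.example.com/article.html",
            "http://www.example.com/print/article.html"):
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked")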

4:34 pm on May 29, 2006 (gmt 0)

Senior Member tedster (joined: May 26, 2000; posts: 37301)


That wouldn't make any sense to me -- I think there are other factors, such as inbound links, that are more important for depth of indexing.

Removing the exclusion for /print/ could open you up to duplicate content problems. I would not suggest that.
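(As a rough illustration of why: a print version usually serves nearly the same text as the main page. The sketch below is hypothetical -- example URLs, standard library only -- and just measures how similar the two copies are.)

import difflib
import urllib.request

# Hypothetical URLs for an article and its print version.
PAGE = "http://www.example.com/article.html"
PRINT_PAGE = "http://www.example.com/print/article.html"

def fetch(url):
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

# A ratio near 1.0 means the two pages are close to identical --
# exactly the near-duplicate the Disallow is keeping out of the index.
ratio = difflib.SequenceMatcher(None, fetch(PAGE), fetch(PRINT_PAGE)).ratio()
print("similarity:", round(ratio, 2))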