
Varying results with Google and robots.txt


frozenpeas

10:33 pm on May 28, 2006 (gmt 0)




After submitting to Google Sitemaps, I am noticing a difference between my sites that disallow a particular directory and those sites that have no exclusions.

The sites with:
User-agent: *
Disallow: /print/

are not as well indexed as sites that have plain old:
User-agent: *
Disallow:

Does anyone think that Google is beginning to take robots.txt rules more seriously, and that this is an effect of that?
Has anyone else noticed similar results?

It does make me want to remove all excludes.
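
For anyone who wants to compare, here is a quick way to double-check what a Disallow rule actually blocks before deciding to remove it. This is only a sketch using Python's standard-library robots.txt parser, and the domain and paths are made up, not from any of my sites:

from urllib.robotparser import RobotFileParser

# Parse the same rules in memory instead of fetching a live robots.txt
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /print/",
])

# The /print/ directory should be blocked; everything else allowed
for url in ("http://www.example.com/print/page.html",
            "http://www.example.com/page.html"):
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked")

It prints "blocked" for the /print/ URL and "allowed" for the other, which is exactly what the rule is supposed to do, so any indexing difference outside /print/ would not be explained by the rule itself.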

tedster

4:34 pm on May 29, 2006 (gmt 0)




That wouldn't make any sense to me -- I think there are other factors, such as inbound links, that are more important for depth of indexing.

Removing the exclusion for /print/ could open you up to duplicate content problems. I would not suggest it.
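
One nuance worth checking while you keep the rule: with the trailing slash, Disallow: /print/ blocks only that directory, while Disallow: /print (no slash) is a prefix match that would also block paths like /printable.html. A small sketch along the same lines as above, again with made-up URLs:

from urllib.robotparser import RobotFileParser

# Without the trailing slash, the rule matches any path beginning with /print
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /print",
])

# Blocks /print/ pages, but also catches /printable.html
print(rp.can_fetch("*", "http://www.example.com/printable.html"))  # prints False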

 
