
Google SEO News and Discussion Forum

    
Varying results with Google and robots.txt
frozenpeas
10:33 pm on May 28, 2006 (gmt 0)
After submitting to Google Sitemaps, I am noticing a difference between my sites that disallow a particular directory and those that have no exclusions.

The sites with:
User-agent: *
Disallow: /print/

Are not as well indexed as sites that have plain old:
User-agent: *
Disallow:

Does anyone think that Google is beginning to take robots.txt rules more seriously, and that this difference is an effect of that?
Has anyone else noticed similar results?

It does make me want to remove all excludes.
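To sanity-check how a compliant crawler reads the stricter file, here is a minimal sketch using Python's standard urllib.robotparser (the example.com URLs are placeholders, not my actual sites):

from urllib.robotparser import RobotFileParser

# The rules from the first group of sites: /print/ is excluded.
rules = """User-agent: *
Disallow: /print/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Placeholder URLs, for illustration only.
print(rp.can_fetch("Googlebot", "http://example.com/print/article.html"))  # False: blocked
print(rp.can_fetch("Googlebot", "http://example.com/article.html"))        # True: allowed

Both files parse cleanly; the only difference a compliant crawler should see is that /print/ URLs are off limits in the first case.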

 

tedster
4:34 pm on May 29, 2006 (gmt 0)

That wouldn't make any sense to me -- I think there are other factors, such as inbound links, that are more important for depth of indexing.

Removing the exclusion for /print/ could open you up to duplicate content problems. I would not suggest that.
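To make the risk concrete, here is a rough sketch of what the crawler would see if /print/ were crawlable (the URLs and page text are invented for illustration, not taken from anyone's site):

from hashlib import md5

# Hypothetical example: the same article body served at two addresses.
article = "<p>The same article text on both pages...</p>"
pages = {
    "http://example.com/widgets.html": article,
    "http://example.com/print/widgets.html": article,
}

# Identical fingerprints mean identical pages -- exactly the duplicate
# pair that Disallow: /print/ keeps out of the crawl.
fingerprints = {url: md5(body.encode("utf-8")).hexdigest() for url, body in pages.items()}
print(len(set(fingerprints.values())))  # 1 -> the two URLs duplicate each other

Two URLs returning the same content is the situation you want to avoid exposing to the crawler, and the /print/ exclusion is doing that job for you now.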
