Varying results with Google and robots.txt
frozenpeas

Msg#: 34536 posted 10:33 pm on May 28, 2006 (gmt 0)

After submitting to Sitemaps, I am noticing a difference between my sites that disallow a particular directory and those that have no exclusion.

The sites with:
User-agent: *
Disallow: /print/

Are not as well indexed as sites that have plain old:
User-agent: *
Disallow:

Does anyone think Google is beginning to take robots.txt rules more seriously, and that this is an effect of that?
Has anyone else noticed similar results?

It does make me want to remove all the exclusions.
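
For anyone who wants to verify which URLs a given robots.txt actually blocks, here is a minimal sketch using Python's standard urllib.robotparser module (the example.com URLs and the Googlebot user-agent string are placeholders, not taken from the post above):

from urllib.robotparser import RobotFileParser

# Fetch and parse the site's live robots.txt (placeholder domain).
parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

# Check a normal page and a page under the disallowed /print/ directory.
for url in ("https://www.example.com/article.html",
            "https://www.example.com/print/article.html"):
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "crawlable" if allowed else "blocked")

With the "Disallow: /print/" rule in place, only the /print/ URL should report as blocked; with an empty Disallow, both report as crawlable.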

 

tedster

Msg#: 34536 posted 4:34 pm on May 29, 2006 (gmt 0)

That wouldn't make any sense to me -- I think there are other factors, such as inbound links, that are more important for depth of indexing.

Removing the exclusion for /print/ could open you up to duplicate content problems. I would not suggest that.
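
One alternative, not suggested in the reply above but worth noting, is to leave /print/ crawlable and mark those pages with a robots "noindex" meta tag instead, which also keeps them out of the index without creating duplicates. A rough sketch (placeholder URL, and a deliberately naive regex that assumes the name attribute appears before content) for checking whether a print page carries such a tag:

import re
from urllib.request import urlopen

def has_noindex(url):
    # Fetch the page and look for <meta name="robots" content="...noindex...">.
    html = urlopen(url).read().decode("utf-8", errors="replace")
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None

print(has_noindex("https://www.example.com/print/article.html"))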
