Interesting Panda Problem - Google is ranking my blocked pages
Uber_SEO
3:37 pm on Jun 2, 2011 (gmt 0)

So I've got a website that was affected by Panda. I identified pages that I considered weak, and I blocked them using the robots.txt file.
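
For context, the block is just the standard robots.txt form, roughly like this (a minimal sketch; /weak-pages/ is a stand-in path, not the real structure):

    User-agent: *
    Disallow: /weak-pages/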

Guess what - although Google has dropped our cached copies from the index, they're continuing to rank the URLs, in some cases better than before I blocked them! Presumably this means these pages will still be contributing poor user metrics to whatever Google is using to calculate Panda?

I realise I've taken a few shortcuts here (I probably should've noindexed rather than robots-blocked), but I kinda presumed a robots.txt block would be enough to get poor-quality pages to drop out of the SERPs.

What do you guys reckon? Is this counter-intuitive on Google's part?

 

netmeg
3:43 pm on Jun 2, 2011 (gmt 0)

The robots.txt will not remove pages from the index. You've actually just told Google not to look at them; you haven't told them to remove them. You're better off using NOINDEX (and taking them out of robots.txt so that Google will actually see the NOINDEX). If you really want them out, remove them in GWT. THEN make sure there are no links to them, and put them back in robots.txt. At least, that's how I'd do it.
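
For anyone following along, the tag goes in the <head> of each page you want dropped - a minimal sketch, and it only does anything once the page is crawlable again:

    <!-- tells compliant crawlers to drop this page from the index;
         Google has to be able to fetch the page to see it -->
    <meta name="robots" content="noindex">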

indyank
4:54 pm on Jun 2, 2011 (gmt 0)

Wow, I was waiting for someone to experiment with this, and yours seems to be a good test.

So,
1) Did the rankings revive (or improve further) after you blocked them with robots.txt, or
2) were they ranking well even after Panda, but you blocked them because you considered them weak, and now they appear to have improved further in ranking after you implemented the block?

walkman
7:51 pm on Jun 2, 2011 (gmt 0)

netmeg is 100% right.
Noindex them and, if possible, even link to them so Google sees the change as fast as possible. Once they are gone from the index, then you can robots.txt them out.
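
If editing the page templates isn't practical, the same noindex signal can go out as an HTTP header instead - a sketch assuming Apache with mod_headers enabled, where /weak-pages/ is just a placeholder path:

    # Send noindex for everything under the placeholder /weak-pages/ directory.
    # Same caveat as the meta tag: Google must be allowed to crawl the URL to see it.
    <LocationMatch "^/weak-pages/">
        Header set X-Robots-Tag "noindex"
    </LocationMatch>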

suggy
9:33 pm on Jun 2, 2011 (gmt 0)

+1 walkman. The exact same thing happened to me when I mistakenly blocked first, before noindexing.

Uber_SEO
8:28 am on Jun 3, 2011 (gmt 0)

In most cases the robots.txt file has done the job - I've seen 99% of traffic to the pages I'm blocking drop off. It's just in some strange cases (presumably where I've got good links coming into the page) that these URLs are still showing up in the SERPs.

@indyank - we haven't seen aggregate rankings return to what they were pre-Panda. I've got one specific case where a page that wasn't ranking in the top 100 was blocked via robots.txt, and now that Google has dropped the cache, it's ranking on the 2nd page for its core term. Pretty random.
