Interesting Panda Problem - Google is ranking my blocked pages

3:37 pm on Jun 2, 2011 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 7, 2005
posts: 86
votes: 1

So I've got a website that was affected by Panda. I identified pages that I considered weak, and I blocked them using the robots.txt file.

Guess what - although Google has dropped our cache from their index, they're continuing to rank the URLs, and in some cases, better than before I blocked them! Presumably this means that these pages will still be contributing poor user metrics to whatever Google is using to calculate Panda?

I realise I've taken a few shortcuts here (I probably should've noindexed rather than robots blocked), but I kinda presumed a robots.txt file would be enough to get poor quality pages to drop out of the SERPs.

What do you guys reckon? Is this counter-intuitive on Google's behalf?
3:43 pm on June 2, 2011 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member (netmeg) — Top Contributor of All Time, 10+ Year Member, Top Contributor of the Month

joined:Mar 30, 2005
votes: 143

The robots.txt will not remove pages from the index. You've actually just told Google not to look at them; you haven't told it to remove them. You're better off using NOINDEX (and taking them out of robots.txt so that Google will actually see the NOINDEX). If you really want them out, remove them in GWT. THEN make sure there are no links to them, and put them back in robots.txt. At least, that's how I'd do it.
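A minimal sketch of the mechanics behind this advice, using Python's stdlib robots.txt parser (the `/weak-pages/` path is hypothetical): a Disallow rule stops a compliant crawler from fetching the page at all, so any NOINDEX tag on that page is never seen, and the URL can keep ranking on link signals alone.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks the "weak" section of a site.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /weak-pages/",
])

# The crawler honours the block, so it never fetches this page --
# which means a <meta name="robots" content="noindex"> there is
# never read, and the URL itself can stay in the index.
print(rp.can_fetch("Googlebot", "https://example.com/weak-pages/thin.html"))  # False

# An unblocked page is fetched normally, so its meta tags are seen.
print(rp.can_fetch("Googlebot", "https://example.com/good-page.html"))  # True
```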
4:54 pm on June 2, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:Mar 9, 2010
votes: 9

Wow, I was waiting for someone to experiment with this, and yours seems to be a good test.

1) Did the rankings revive (or improve further) after you blocked them with robots.txt, or
2) were they ranking well even after Panda, but you blocked them because you considered them weak, and they now appear to have improved further in ranking after you implemented the block?
7:51 pm on June 2, 2011 (gmt 0)

Senior Member

joined:Dec 29, 2003
votes: 0

netmeg is 100% right.
Noindex them, and if possible even link to them so Google sees them as fast as possible. Once they are gone from the index, you can then robots.txt them out.
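For reference, the tag in question goes in the `<head>` of each weak page while it is still crawlable (i.e. not yet blocked in robots.txt):

```html
<head>
  <!-- Tells crawlers to drop this URL from the index.
       This only works while the page is NOT blocked in
       robots.txt, since the crawler must be able to fetch
       the page to see the tag. -->
  <meta name="robots" content="noindex">
</head>
```

Once the pages have dropped out of the index, the tag has done its job and the robots.txt block can go back in.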
9:33 pm on June 2, 2011 (gmt 0)

Preferred Member

10+ Year Member

joined:Apr 1, 2003
votes: 0

+1 walkman. Exact same thing happened to me when I mistakenly blocked first, before noindexing.
8:28 am on June 3, 2011 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 7, 2005
posts: 86
votes: 1

In most cases the robots.txt file has done the correct job - I've seen 99% of traffic to the pages I'm blocking drop off. It's just in some strange cases (presumably where I've got good links coming into the page) that these URLs are still showing up in the SERPs.

@indyrank - we haven't seen aggregate rankings return to what they were pre-Panda. I've got one specific case where a page wasn't ranking in the top 100; we blocked it with robots.txt, and now that Google has dropped the cache, the page in question is ranking on the 2nd page for its core term. Pretty random.
