Interesting Panda Problem - Google is ranking my blocked pages


Uber_SEO

3:37 pm on Jun 2, 2011 (gmt 0)

So I've got a website that was affected by Panda. I identified pages that I considered weak, and I blocked them using the robots.txt file.
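
For illustration, the block was nothing fancy - just standard Disallow rules (the /weak-pages/ path is a placeholder here, not my actual directory):

    # robots.txt - tells compliant crawlers not to fetch these URLs
    User-agent: *
    Disallow: /weak-pages/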

Guess what - although Google has dropped our cache from their index, they're continuing to rank the URLs, and in some cases, better than before I blocked them! Presumably this means that these pages will still be contributing poor user metrics to whatever Google is using to calculate Panda?

I realise I've taken a few shortcuts here (I probably should've noindexed rather than robots blocked), but I kinda presumed a robots.txt file would be enough to get poor quality pages to drop out of the SERPs.

What do you guys reckon? Is this counter-intuitive on Google's behalf?

netmeg

3:43 pm on Jun 2, 2011 (gmt 0)

The robots.txt will not remove pages from the index. You've actually just told Google not to look at them; you haven't told them to remove them. You're better off using NOINDEX (and taking them out of robots.txt so that Google will actually see the NOINDEX). If you really want them out, remove them in GWT. THEN make sure there are no links to them, and put them back in robots.txt. At least, that's how I'd do it.
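
For anyone unsure what the NOINDEX looks like, it's just a robots meta tag in the head of each page (a generic example - adjust it to your own templates, and remember Google has to be able to crawl the page to see it):

    <!-- in the <head> of each weak page -->
    <meta name="robots" content="noindex">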

indyank

4:54 pm on Jun 2, 2011 (gmt 0)

Wow, I was waiting for someone to experiment with this, and yours seems to be a good test.

So,
1) Did the rankings revive (or improve further) after you blocked the pages with robots.txt, or
2) were they ranking well even after Panda, but you blocked them because you considered them weak, and they now appear to have improved further in ranking after you implemented the block?

walkman

7:51 pm on Jun 2, 2011 (gmt 0)



netmeg is 100% right.
Noindex them and, if possible, even link to them so Google sees the noindex as fast as possible. Once they are gone from the index, then you can robots.txt them out.
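
If editing the page templates is a pain, you can also serve the noindex as an HTTP header instead - a sketch, assuming Apache with mod_headers enabled and that your weak pages share a filename pattern (both are assumptions, adjust to your setup):

    # .htaccess - requires mod_headers; the filename pattern is hypothetical
    <FilesMatch "^weak-page-.*\.html$">
        Header set X-Robots-Tag "noindex"
    </FilesMatch>

Google honors X-Robots-Tag the same way it honors the meta tag.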

suggy

9:33 pm on Jun 2, 2011 (gmt 0)

+1 walkman. The exact same thing happened to me when I mistakenly blocked pages first, before noindexing them.

Uber_SEO

8:28 am on Jun 3, 2011 (gmt 0)

In most cases the robots.txt file has done the job correctly - I've seen 99% of traffic to the pages I'm blocking drop off. It's just in some strange cases (presumably where I've got good links coming into the page) that these URLs are still showing up in the SERPs.

@indyank - we haven't seen aggregate rankings return to what they were pre-Panda. I've got one specific case, though, where a page that wasn't ranking in the top 100 was blocked via robots.txt, and now that Google has dropped the cache, it's ranking on the 2nd page for its core term. Pretty random.
 
