Someone put forth the idea on the forum that Google somehow builds a score of what it considers "low quality" pages and dings the site as a whole once that score reaches a threshold.
[edited by: tedster at 2:32 am (utc) on Apr 22, 2011]
It's interesting that you can get individual pages to recover, since Panda is a site-wide penalty. That flies in the face of what we've heard and seen about it.
2. Then comes a site-wide calculation of how much those page scores are going to influence scores for other pages on the site.
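To make the threshold idea concrete, here is a minimal sketch of the speculated two-step model. This is not Google's actual algorithm; the page scores, the cutoff, the trigger fraction, and the damping factor are all invented for illustration.

```python
# Hypothetical sketch of the speculated site-wide threshold model.
# All numbers and names here are assumptions, not Google's algorithm.

def site_quality_penalty(page_scores, low_quality_cutoff=0.4,
                         trigger_fraction=0.3, damping=0.5):
    """If too large a fraction of pages score below the cutoff,
    return a site-wide multiplier that drags every page down."""
    low = [s for s in page_scores.values() if s < low_quality_cutoff]
    if len(low) / len(page_scores) >= trigger_fraction:
        return damping  # site crossed the threshold: demote everything
    return 1.0          # under the threshold: no site-wide effect

pages = {"/home": 0.9, "/guide": 0.8, "/thin-1": 0.2, "/thin-2": 0.1}
penalty = site_quality_penalty(pages)
adjusted = {url: s * penalty for url, s in pages.items()}
```

Under this toy model, two thin pages out of four trip the threshold, so even the strong /home page gets demoted, which matches the "dings the site as a whole" behavior people are reporting.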
do the pages that are linked directly from a bad page (or a group of bad pages) suffer more than pages that are more distantly linked from the bad page(s)?
I think it's clear that Panda applies a two-part filter:
(1) All your #*$!ty pages get hammered.
(2) Your main pages, usually the not-#*$!ty ones, get hammered too.
What if the same type of idea is now applied negatively? The more internal anchor text links a site points at bad pages, the more likely, Google estimates, users will have a bad experience on that site.
I'm still studying this area, but the strongest correlation I see so far involves collateral demotion for pages that link TO the "bad" page, not pages that are linked FROM it.
It seems reasonable that externally linking to a poor quality page/site may also have an effect.
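The direction effect described above can be caricatured in a few lines: pages are demoted in proportion to how many "bad" pages they link OUT to, while pages merely linked FROM a bad page are untouched. The graph, the bad-page set, and the per-link penalty are all invented for illustration.

```python
# Toy illustration of collateral demotion flowing to pages that
# link TO bad pages (not pages linked FROM them). Hypothetical only.

def outlink_demotion(links, bad_pages, per_link_penalty=0.15):
    """links maps each page to the list of pages it links to.
    Each outlink pointing at a bad page costs the LINKING page."""
    demotion = {}
    for page, targets in links.items():
        hits = sum(1 for t in targets if t in bad_pages)
        demotion[page] = min(hits * per_link_penalty, 1.0)
    return demotion

links = {
    "/hub":    ["/thin-1", "/thin-2", "/guide"],  # links to two bad pages
    "/guide":  ["/hub"],                          # links to no bad pages
    "/thin-1": ["/guide"],                        # bad, but that alone
}                                                 # doesn't hurt /guide
print(outlink_demotion(links, bad_pages={"/thin-1", "/thin-2"}))
```

In this sketch /hub takes a hit for its two bad outlinks, while /guide escapes even though a bad page links to it, which is the direction of the correlation being reported.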
Which perhaps suggests that one of the main reasons I am Pandalized is that my content has been so widely scraped, since my most heavily penalized pages have the most content?