Swanson - 11:18 pm on Jun 6, 2011 (gmt 0)
londrum, that is what I suggested at the beginning of the Panda updates. I still believe this is exactly what is happening to a large number of sites.
Although there is no way to test this, it just makes sense: if you grade a page and/or site as lower quality, then the links from those pages and sites MUST also be downgraded in quality. That is the only logical way to apply the "quality" portion of the algorithm - surely?
We now know that pages have a hidden quality score that acts in addition to PageRank and other signals - it makes complete sense that Google now uses a "hybrid" score to work out the quality of a page, and therefore the quality of the links from that page.
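Just to make that "hybrid score" idea concrete, here's a toy sketch of how a per-page quality score could be folded into a PageRank-style iteration. This is pure speculation on my part - Google has never published how a Panda-type score combines with PageRank, and every name and number below is made up for illustration. The point is simply that a low quality score on a source page shrinks the value of the links leaving it:

```python
# Toy "quality-weighted PageRank" - speculative illustration only.
# The idea: a page's outgoing links pass on rank scaled by that
# page's (hypothetical) quality score, so links from a page graded
# as low quality are worth less to the pages they point to.

DAMPING = 0.85
ITERATIONS = 50

def quality_weighted_pagerank(links, quality):
    """links: {page: [pages it links to]}; quality: {page: 0.0-1.0}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(ITERATIONS):
        new = {p: (1 - DAMPING) / n for p in pages}
        for src, outs in links.items():
            if not outs:
                continue
            # A low-quality source passes less rank to each target.
            share = rank[src] * quality[src] / len(outs)
            for dst in outs:
                new[dst] += DAMPING * share
        rank = new
    return rank

graph = {"a": ["b"], "b": ["c"], "c": ["a"]}
scores = quality_weighted_pagerank(graph, {"a": 1.0, "b": 0.2, "c": 1.0})
# Page "c", fed only by the low-quality page "b", ends up with the
# lowest rank even though the link structure is symmetric.
```

In that toy graph, nothing about "c" itself changed - only the quality grade of the page linking to it - which is exactly the kind of overnight devaluation people are describing.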
There is an interview with Dani Horowitz of Daniweb where she pretty much says she feels a lot of her links were devalued overnight, and that this is what contributed to a huge loss of traffic. These links came from content syndicated from Daniweb to other sites (via RSS etc.) - i.e. duplicate and/or shallow content.
All I know is that I have a site with only unique content that was destroyed in the Panda update. I have managed to get it back by continuing to build links to it - I have made zero changes to the site itself.