>> My sense is that this is more a measure of PAGE quality than site quality.
Tedster, I agree with you on this. On our main e-commerce site (not affiliate) we continue to rank for the major terms which are super competitive and have *high* quality inbounds.
What we've lost out on are the 10-15% of our users who used long tail terms to find us ... think product titles, "buy whacky green widget in location". Those product pages are algorithmically weak - duplicate descriptions with little value-add content (SKUs, weights, descriptions, manufacturers, etc.). Over 3 million of these pages.
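For what it's worth, here's a rough sketch (hypothetical function names, not our actual tooling) of how you might flag exact-duplicate descriptions at that scale - normalize the text, hash it, and group URLs that share a fingerprint:

```python
import hashlib
import re


def normalize(text):
    # Lowercase and collapse whitespace so trivial variations hash identically
    return re.sub(r"\s+", " ", text.lower()).strip()


def description_fingerprint(text):
    # Coarse fingerprint: hash of the normalized description.
    # Catches exact dupes only; near-dupes would need shingling/MinHash.
    return hashlib.sha256(normalize(text).encode("utf-8")).hexdigest()


def find_duplicate_groups(pages):
    # pages: {url: description}; returns groups of URLs sharing a fingerprint
    groups = {}
    for url, desc in pages.items():
        groups.setdefault(description_fingerprint(desc), []).append(url)
    return [urls for urls in groups.values() if len(urls) > 1]


pages = {
    "/widgets/green": "Whacky green widget. SKU 123. Weight 2kg.",
    "/widgets/green-uk": "Whacky  green widget.  SKU 123. Weight 2kg.",
    "/widgets/blue": "Whacky blue widget. SKU 124. Weight 2kg.",
}
dupes = find_duplicate_groups(pages)
```

On millions of pages you'd run this in batches against the product feed, but the idea is the same: the hash groups tell you which SKU pages the algorithm likely sees as interchangeable.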
To the person looking for that widget in our specific location, the page is VERY helpful and we converted at a significantly higher rate on those searches than others. However, I can see how they're algorithmically weak.
Another site that has lost a few percentage points of traffic is a large forum that we run. Again, it's highly relevant content for someone looking for something obscure, but these pages are not encyclopaedic in nature - neither in their content nor in their deep inbound links, which vary widely. A forum topic would be something like "where can I buy a NTSC video convertor in location" and would rank for "ntsc video convertor location". It's hard to describe this, as we see over 100,000 similar long tail searches in a month, and most of them are so low frequency that they're not worth tracking individually.
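You can't track those queries one by one, but you can measure the tail in aggregate. A quick sketch (assuming you can export referring search queries from your analytics; example data is made up) that counts what share of visits came from queries seen only once:

```python
from collections import Counter


def long_tail_share(query_log, threshold=1):
    # query_log: list of referring search query strings, one per visit.
    # Returns (number of distinct tail queries, fraction of visits they drove).
    counts = Counter(query_log)
    tail = [q for q, n in counts.items() if n <= threshold]
    tail_visits = sum(counts[q] for q in tail)
    return len(tail), tail_visits / len(query_log)


log = [
    "ntsc video convertor london",
    "buy whacky green widget",
    "buy whacky green widget",
    "pal to ntsc convertor cheap",
    "widget store",
]
unique_tail, share = long_tail_share(log)
```

Run over a month of logs, that single "share" number tells you how exposed you are when the long tail drops, even though no individual query is worth watching.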
A new inbound and onsite strategy needs to be put in place to attract external juice and to distribute internal juice better than we have in the past.
I should add that there is very little or no change on our smaller sites (100-2000 pages).