Consider what follows as perhaps one way to implement such a feedback cycle.
• Pages are ranked according to whatever secret herbs and spices Google deems appropriate to decide rankings – standard stuff. This rank can then be influenced by the following.
• As a separate process, using data from AdWords, Google groups pages (or more likely entire sites) into cohorts based on the occurrence of moneyed phrases (e.g. all the websites about widgets in one group, all the websites about gadgets in one group).
• A quick analysis then counts occurrences of the moneyed phrases (as internal links and headings, methinks) in each website. That count can then be expressed as a percentage of the total links/headings within that website.
• Then Google could simply calculate the average of those percentages across the entire cohort.
• Any site above the average gets zapped. Anything under it passes muster and ranks.
• Zapped webmasters, upon studying competitive sites, think "Oh, it looks like I've optimized too much. I'll just fix that."
• The cycle runs again and this time - thanks to the newly de-optimized websites - produces an even lower average, hitting a whole bunch of previously unaffected websites.
• The newly affected webmasters study their competitors and conclude they've optimized too much...
• Repeat ad infinitum, with the effect that sites are continually bouncing in and out of the SERPs (a toy simulation of this loop follows the list).
OK, clearly, traditional factors do stave off the new algo at some point; how that works I couldn't guess. However, the new technique, in one fell swoop, effectively puts an end to website optimization.
Actually, I think the changes are just Google tweaking the filter.
But at this point I just don't care. I was down to just a few unimportant pages left at 950. I've tried everything I can think of to get them back, to no avail, so I just did the ultimate deoptimization: I noindexed them.
Out of sight, out of mind. ;) More time to work on content.