Well, these changes may pose problems for us -- but I can hardly imagine that this much disruption is some unanticipated accident or "bad data push" that just needs to be fixed. This seems especially so because, in the midst of it all, we're seeing some reports of search traffic coming back for sites where it had dried up last summer. In some cases, the report is that the site made no changes and the bad situation just turned around.
Who knows, things may get even stranger -- but something intentional sure seems to be brewing, by my read. We've seen reports of apparently good sites getting hurt in past months. We've seen reports of unusually successful spamming that sort of faded in and out.
Here are some of the ideas that have been tossed around lately:
1. A huge chunk of the Google team has really lost it.
2. They're intentionally obscuring more types of data that can be used in reverse engineering
3. Something new is being rolled out that can't be done all at once. This would most likely be some factor that makes use of the newly gained "Big Daddy" elbow room. A new type of spam detection? Some new kind of filter against scrapers? More in-depth use of historical measures for site-wide changes, link growth, and manipulation of various kinds? Better trust testing for new subdomains?
Any other ideas? It's just too soon to tell, I suppose, but what else is there to do except guess at areas to watch? The picture is not yet in focus.