I want to know why there had to be an update at all. After all, Google has:
- User SERP behavior data (searches/views/clicks/search sessions).
- What I have heard called "waterfall data": the data from Autocomplete and from Instant Search (they know keyword streams).
- Site analytics data from all over the web (time on page, time on site, user click path, site-to-site travel).
- Toolbar data: raw, full clickstream data - the whole enchilada.
- Data from Android devices and the Chrome browser, which only they can see (mucho data).
All of which adds up to what I call the USSR (User Search Success Rate): the rate at which any given keyword search/SERP ends with a happy Google user.
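The "USSR" idea boils down to a simple per-query aggregation over click logs. Here is a minimal sketch of what such a metric could look like; the field names, the dwell-time heuristic, and the 30-second "long click" threshold are all my own assumptions for illustration, not anything Google has disclosed:

```python
# Hypothetical sketch of a per-SERP "User Search Success Rate".
# Assumption: a click counts as "successful" (a happy user) if the user
# dwells on the clicked result for a while instead of bouncing straight
# back to the results page. Google's real signals are unknown.

from collections import defaultdict

def success_rate_per_serp(events, long_click_secs=30):
    """events: iterable of (query, dwell_seconds) pairs, one per clicked result.

    Returns {query: fraction of clicks whose dwell time met the threshold}.
    """
    totals = defaultdict(int)
    successes = defaultdict(int)
    for query, dwell in events:
        totals[query] += 1
        if dwell >= long_click_secs:
            successes[query] += 1
    return {q: successes[q] / totals[q] for q in totals}

log = [
    ("best blender", 120),  # long click: user stayed, call it a success
    ("best blender", 4),    # quick bounce back to the SERP
    ("python csv", 300),
]
rates = success_rate_per_serp(log)
# rates["best blender"] == 0.5, rates["python csv"] == 1.0
```

With a table like this per SERP, the "corrective action" I describe below would just mean demoting results for queries whose rate stays low.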
Since they have these USSR-per-SERP rates, why was this "update" not processed right in the stream, from the click and other analytics data? If G has the smarts to build an intelligent, self-learning algo, how come it didn't learn what people like and take corrective action all along, filtering out the "content farms" one SERP at a time?