anand84 - 8:37 am on May 13, 2010 (gmt 0)
I'm currently thinking of something more along the lines of waveforms and continuous statistical testing... with near-real-time adjustments and experimentation in a kind of feedback loop. Something like automated algorithm evolution.
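For what it's worth, the loop being described above (continuous statistical testing with near-real-time adjustment) is roughly what a bandit-style experiment does. Here's a minimal illustrative sketch in Python; the variant click-through rates, epsilon value, and the whole setup are made up for illustration, not anything Google has confirmed:

import random

def run_feedback_loop(true_ctr, epsilon=0.1, rounds=10_000):
    """Epsilon-greedy feedback loop over hypothetical ranking variants.

    true_ctr: assumed click-through rate per variant (purely illustrative).
    Mostly serves the best-performing variant so far, but keeps
    exploring so the estimates stay current.
    """
    n = len(true_ctr)
    pulls = [0] * n          # times each variant was served
    rewards = [0.0] * n      # total simulated clicks per variant
    for _ in range(rounds):
        if random.random() < epsilon:
            arm = random.randrange(n)  # explore: try a random variant
        else:
            # exploit: serve the variant with the best observed CTR
            arm = max(range(n),
                      key=lambda i: rewards[i] / pulls[i] if pulls[i] else 0.0)
        pulls[arm] += 1
        rewards[arm] += random.random() < true_ctr[arm]  # simulated click
    return pulls

# Variant 2 has the best hypothetical CTR, so the loop converges
# toward serving it most often.
print(run_feedback_loop([0.02, 0.03, 0.05]))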
Interesting point about the continuous-testing idea. I'm just playing devil's advocate here and asking: what if this is how the new algo works? (Ignoring the "GOOG is broken" discussion for now.)
Maybe Google treats big and small websites differently. Small websites with only a few dozen pages are mostly static, which could be a signal that they contain evergreen content. Big websites, by contrast, may be adding several dozen pages every single day, which could be a signal that the older content on those sites goes stale after a while. Here's a rough sketch of what I mean:
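This is a toy model of the heuristic I'm speculating about, nothing more. Every name, threshold, and formula here (the Site fields, the 100-page cutoff, the churn math) is invented for illustration; nobody outside Google knows the real signals:

from dataclasses import dataclass

@dataclass
class Site:
    total_pages: int
    pages_added_per_day: float
    avg_page_age_days: float

def staleness_penalty(site: Site) -> float:
    """Toy model: fast-growing big sites age their back catalog quickly,
    while small static sites are treated as evergreen."""
    if site.total_pages < 100 and site.pages_added_per_day < 1:
        return 0.0  # small and static: assume evergreen content
    # bigger, faster sites: penalty grows with churn and average page age
    churn = site.pages_added_per_day / max(site.total_pages, 1)
    return min(1.0, churn * site.avg_page_age_days)

# A large news site adding 50 pages/day accrues a penalty...
print(staleness_penalty(Site(total_pages=50_000,
                             pages_added_per_day=50,
                             avg_page_age_days=400)))
# ...while a 40-page static site does not.
print(staleness_penalty(Site(total_pages=40,
                             pages_added_per_day=0.01,
                             avg_page_age_days=900)))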
With Caffeine focusing on real-time content, could this be why small static websites are getting a lot of Google love while big websites are seeing drops?