aakk9999 - 3:24 am on Nov 18, 2013 (gmt 0)
True, we do not know how the percentages are calculated.
Google themselves say that 15% of queries submitted have never been seen before [google.com...] , so it is obviously impossible to measure algo changes on those queries. Hence any percentage of change given by Google can apply to no more than the remaining 85% of searches.
But given the sheer volume of daily searches, I think it is impossible to calculate the percentage of change with any precision. I suspect these percentages are extrapolated from the search control sets Google uses to test algo changes. I also suspect that when calculating them, Google compares only the first x results before and after, rather than the whole result set for each query.
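To make that guess concrete, here is a minimal sketch of how such a measurement *could* work. Everything here is my assumption, not Google's published method: a fixed control set of queries, an order-sensitive comparison of only the top-x results, and a query counting as "changed" if those top results differ at all.

```python
# Hypothetical sketch: "percentage of queries affected" measured on a
# control set, comparing only the top-x results before/after a change.
# This is speculation about the methodology, not Google's actual code.

def top_x_changed(before, after, x=10):
    """True if the first x results differ (order matters)."""
    return before[:x] != after[:x]

def percent_queries_affected(results_before, results_after, x=10):
    """Share of control-set queries whose top-x results changed."""
    queries = results_before.keys()
    changed = sum(
        top_x_changed(results_before[q], results_after.get(q, []), x)
        for q in queries
    )
    return 100.0 * changed / len(queries)

# Tiny made-up control set of two queries
before = {"blue widgets": ["a", "b", "c"], "red widgets": ["x", "y", "z"]}
after  = {"blue widgets": ["a", "b", "c"], "red widgets": ["y", "x", "z"]}
print(percent_queries_affected(before, after, x=3))  # 50.0
```

Note how sensitive the number is to the choice of x and to whether re-ordering alone counts as a change; with a rank-weighted comparison instead, the same data could yield a very different percentage.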
An additional complication is personalisation and localisation. Personalisation cannot realistically be taken into account when calculating these percentages; localisation may be, but if it is, it adds yet more complexity and a mass of data.
Has anyone considered that this constant tweaking of the algo might be because Google is slowly losing the battle against the speed at which the web is growing?