AlyssaS - 7:21 pm on Sep 21, 2010 (gmt 0)
As with any good webmaster, I have understood the power of a raw log file from the beginning. I know what I am looking at, and it is simply jaw-dropping. The level of computational power Google must be using to throttle us boggles my mind.
Entire sets of randomly rotating SERPs that are tailored throughout the day to throttle us.
Like you say, they would have to use stupendous resources to throttle your traffic - and you have to ask yourself why they would go to that expense. What's the payoff from their point of view?
The real answer might be a lot simpler: they are rotating databases (perhaps they need to take one out of service while working on another, or perhaps the load at certain times of day is such that they need to bring another one in). If the rankings on these databases are out of sync - say you are at #1 on one but #9 on another - that would account for the fluctuations in traffic. This is especially likely if they've introduced a database from another country: you may have been at #1 in the USA but #9 in Germany for reasons of relevancy, and the German database may have been brought in to take the load off the American searches at certain points of the day.
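To see how rotation alone could look like throttling, here's a minimal sketch. The click-through rates, search volume, and rotation window are all assumed for illustration - they aren't Google's figures - but they show how swapping in an out-of-sync database during peak hours produces a daily traffic dip with no throttling involved:

import itertools

# Assumed (illustrative) click-through rates by ranking position.
CTR = {1: 0.30, 9: 0.03}

SEARCHES_PER_HOUR = 1000  # hypothetical query volume for your keyword

def hourly_clicks(rank):
    """Expected clicks in one hour if every searcher sees this rank."""
    return SEARCHES_PER_HOUR * CTR[rank]

# Suppose database A (where you rank #1) serves most of the day, but
# database B (where you rank #9) is swapped in during a peak-load window.
for hour in range(24):
    rank = 9 if 8 <= hour < 12 else 1  # hypothetical rotation window
    print(f"{hour:02d}:00  rank #{rank}  ~{hourly_clicks(rank):.0f} clicks")

Run that and the 08:00-12:00 block drops from roughly 300 clicks an hour to 30 - exactly the kind of mid-day dive people attribute to throttling.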
There is an easy way to prove or disprove this theory: check your cache dates at various times of the day. If the cache date remains the same throughout the day, but your traffic gets dialled down during part of the day, then they are throttling your traffic. If the cache dates vary along with your traffic, you know they are simply rotating databases.
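If you want to automate the check rather than eyeballing it, something like the sketch below would do. The cache URL pattern and the "as it appeared on" banner wording are what Google's cache pages show at the moment - both could change, and the page name is just a placeholder - so treat this as a starting point, not gospel:

import re
import time
import urllib.request

# Hypothetical page to monitor; swap in one of your own URLs.
PAGE = "example.com/some-page"
CACHE_URL = "http://webcache.googleusercontent.com/search?q=cache:" + PAGE

def fetch_cache_date():
    """Fetch the cached copy and extract the snapshot date from the banner.

    Assumes the banner reads '...as it appeared on <date>'; adjust the
    regex if Google words it differently for you.
    """
    req = urllib.request.Request(CACHE_URL,
                                 headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req).read().decode("utf-8", "replace")
    m = re.search(r"as it appeared on ([^.<]+)", html)
    return m.group(1).strip() if m else "date not found"

# Log the reported cache date once an hour; compare against your traffic graph.
while True:
    print(time.strftime("%H:%M"), fetch_cache_date())
    time.sleep(3600)

If the logged dates jump around while your traffic swings, that's the database-rotation signature; a constant date alongside swinging traffic points the other way.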
FWIW, from my observations, cache dates are all over the place. On Monday it may say the page was cached a day ago, and on Tuesday it may report that it was cached a week ago - which makes no sense unless they rotated to a more out-of-date database.