I think we were saying something similar, only yours is much more readable and well thought out...
I'm thinking along the lines of a capacity restraint too.
But then, "my" capacity restraint is a bit different in nature...
I'm not sure that the sites that are getting the "slow treatment" ATM will suddenly be updated at "hi-speed" ...
Let me clarify a bit, because I think what's being seen goes along with what you and tedster are saying...
Anyway, in my mind it's the same only a bit different:
The 'capacity restraint' (to borrow from Claus) I was thinking about was not necessarily 'site specific' but rather specific to 'data increase & update by necessity'. As a byproduct of slowing the rate at which data gets updated in the 'mostly seen index' (Big Daddy), some sites and pages seem to be left out of the index, which may or may not be the case, and that seems in some ways to fall in line with what the two of you are saying:
tedster
In other words, at certain times, having fresh results for certain topics is more important to Google than other topical areas or taxonomies.
Claus
And it depends on some kind of page classification scheme (like you've seen with the "show options" option on the SERPS).
Here's another version of my thoughts: News, tweets, and other 'extremely time sensitive' (er, politely, 'stuff') or 'very fresh + important' pages (stale v. fresh, PR, TR, etc.) would IMO be the emphasis (priority) for crawling, scoring and updating on the Big Daddy index (AFAIK the results most people currently see). The pages 'nearby' the (for lack of a better phrase) 'super fresh + important' pages would IMO see some benefit from the 'freshness priority crawling', because they would be closer to the 'current crawl priority' and therefore 'fresher', since in some ways freshness 'cascades' like PR.
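Just to make the 'cascading freshness' idea concrete, here's a rough back-of-the-napkin sketch of what I'm imagining. To be clear, this is pure speculation and not Google's actual code; the names and numbers (freshness_score, DECAY, the hop count) are all made up for illustration:

```python
# Pure speculation: a toy "freshness-priority" crawl scheduler where
# crawl priority cascades from hot pages to their neighbors, a bit
# like PR does. All names and numbers here are invented for illustration.

DECAY = 0.5  # assumed: how much freshness priority "cascades" per hop

def crawl_priorities(hot_pages, link_graph, hops=2):
    """hot_pages: {url: freshness_score} for news/tweets/etc.
    link_graph: {url: [linked urls]}.
    Returns {url: crawl_priority}; higher means crawl sooner."""
    priority = dict(hot_pages)
    frontier = dict(hot_pages)
    for _ in range(hops):
        next_frontier = {}
        for url, score in frontier.items():
            for linked in link_graph.get(url, []):
                passed = score * DECAY  # neighbors get a decayed share
                if passed > priority.get(linked, 0.0):
                    priority[linked] = passed
                    next_frontier[linked] = passed
        frontier = next_frontier
    return priority

# Pages "nearby" the super fresh + important ones end up with a higher
# crawl priority than the rest of the index, so they get re-crawled sooner.
```

That's the whole point of the 'cascade': you don't need to classify every page, you just seed the hot ones and let the priority bleed outward a couple of hops.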
I'm thinking they are trying to keep new data insertion and 'obsolete index' (Big Daddy) updates down during the changeover, since the storage method is going to the recycle bin soon anyway...
Why would they keep updating Big Daddy, even at the old crawl rate, when it's not going to be used? It could even be they are storing the new crawl data for all pages, including 'non-priority pages' (not 'super fresh' results), in the Caffeine data structure and 'pushing' it (on a 'fresh dependent basis') to Big Daddy's index after a period of time... Or only updating the Big Daddy index from certain crawl cycles. (Or something to the same effect; go with the point.)
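Again, just guessing at the mechanics, but the 'write to the new store, push to the old one on a fresh-dependent basis' setup I'm describing would look roughly like this. Every name and threshold below is hypothetical, not anything Google has confirmed:

```python
# Speculative sketch of the "dual write" idea: every crawl result goes
# straight into the new (Caffeine-style) store, but only "fresh enough"
# or sufficiently overdue results get pushed to the old (Big Daddy-style)
# index that most people actually see. Names and thresholds are invented.

import time

FRESH_THRESHOLD = 0.8       # assumed: hot pages go to the old index right away
MAX_STALENESS = 21 * 86400  # assumed: everything else waits up to ~21 days

def handle_crawl_result(page, caffeine_store, bigdaddy_index, now=None):
    now = now or time.time()
    caffeine_store[page["url"]] = page  # new infrastructure always gets the data

    last_push = bigdaddy_index.get(page["url"], {}).get("pushed_at", 0)
    is_hot = page.get("freshness_score", 0.0) >= FRESH_THRESHOLD
    is_overdue = (now - last_push) >= MAX_STALENESS

    # The old index only gets updated when freshness demands it or the page
    # is overdue, which from the outside would look like a crawl-to-visible-index
    # "slowdown" for everything that isn't time sensitive.
    if is_hot or is_overdue:
        bigdaddy_index[page["url"]] = {"doc": page, "pushed_at": now}
```

If something even vaguely like that is going on, the spidering can carry on as usual while the index most people see only picks up the time-sensitive stuff promptly.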
IMO, if they are doing one of the preceding and using the crawl data to update Caffeine directly rather than Big Daddy, it would explain quite a bit of the 'crawl to seen index' slowdown, even though pages and new sites are still being spidered. The spidering may not be as fast either, because they are probably using some of the crawling resources for the changeover, which IMO would slow down Caffeine's indexing a bit too...
I guess my thoughts are: the slowdown for 'fresh is not critical for a period of N weeks or so' sites might only be in the 'seen by most people' (Big Daddy) indexing and updating of pages, but not in the Caffeine index...
IOW, they could be indexing sites and pages in the new infrastructure most can't see yet (Caffeine), and the slowdown is 'more relative' to the updating of the old index (Big Daddy), which happens to be what most people see most of the time.
I think I've rambled my point out somewhere in the preceding, and the short version might best be put as a question and answer:
Would you keep updating non-critical pages (sites) on a data storage mechanism you are replacing (pages that don't need to be 'super timely' within a 7 to 21 day window, because no one except webmasters cares or notices), or would you use your resources to change over to and update the new data storage system that's replacing it? Personally, I would work on getting the new one in place and keep the old, soon-to-be-replaced storage system updated on an 'as needed' basis.
Here's another question, in basic terms, to make my point about the slowdown (or perceived slowdown) in indexing, putting it in an 'everyday webmaster' situation... Would you keep updating the ASP version of a page you were about to convert to PHP, or the HTML 4 version of a page you're converting to HTML5, or would you concentrate on getting the new version in place? (IMO they're doing relatively the same thing on a much larger scale.)