Getting back to the AI topic for a moment here:
First, let's bring the sci-fi excitement down a notch: it's not AI, as it will never, say, write poems (or start a nuclear war, for that matter) after a year or so of sorting search results. It will always just sort search results. So it's specialized software with a self-adjusting algo - not unlike your favorite antivirus software.
Anyway, what I'm trying to say is that such software requires lots of data points to come up with a reasonable change to the algo. Your antivirus software looks at EVERY email you receive and EVERY file you run/open, so it has plenty of data to crunch. Most searches, however (90%+ in my experience), simply don't provide enough data points, because a given query may be run only once or twice a month.
This thread has centered on problems with long-tail searches, and I'd like to bring it all back by tying the pseudo-AI and the long tail together. They simply don't mix: how do you vary SERPs and analyze the resulting CTR if all you've got is a single search over a considerably long period? On many of those extra-long-tail searches (again, 90%+ of traffic on some sites), you literally need to wait years to accumulate a statistically significant amount of data. And if you let your AI loose on a thin data-point diet, you're going to get wild swings on the output end. Maybe that's what explains the yo-yo traffic some people here have reported?
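To put rough numbers on the "wait years" claim, here's a back-of-envelope two-proportion sample-size calculation. All the figures (baseline CTR, challenger CTR, searches per month) are hypothetical, just to illustrate the order of magnitude:

```python
import math

def impressions_needed(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Standard two-proportion sample-size estimate: impressions per SERP
    variant needed to tell CTR p1 from CTR p2 apart at roughly 95%
    confidence and 80% power."""
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Hypothetical numbers: a 30% baseline CTR vs. a 35% challenger layout.
n = impressions_needed(0.30, 0.35)   # ~1,400 impressions per variant

# A long-tail query seen twice a month would take decades to get there.
searches_per_month = 2
years_to_significance = n / searches_per_month / 12
```

With these made-up inputs you end up needing on the order of a thousand-plus impressions per SERP variant - which, at one or two searches a month, really is decades, not months.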
My point is that there's still an important role in this new Google for the good old fixed algo. If you're trying to rank for "buy [your favorite ED remedy brand]", may the Force be with you - you're going to need it to fool HAL 9000 ("Sorry, Dave, I can't let you search for that brand. Remember last time you took it and it lasted more than four hours?").
For the rest of us - I think it will be business (almost) as usual once the move to the new servers completes. Another reason to keep using the fixed algo (however complex, with its 201 variables) is that long-tail searches usually aren't that important, both from an AdWords revenue standpoint (most have no ads) and from a user-experience one. Most reasonable people understand that feeding an entire page of text into Google will produce unexpected results, and they'll manually adjust their query. So you don't need your CPU-intensive pseudo-AI on unimportant searches - just route those to the old algo.
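Purely speculative sketch of that routing idea - nothing here reflects how Google actually works, and the volume threshold is made up - but it shows how cheap the dispatch decision would be:

```python
# Hypothetical cutoff: below this many searches per month there isn't
# enough CTR data for a self-adjusting ranker to learn anything useful.
MIN_MONTHLY_SEARCHES = 100

def choose_ranker(monthly_search_volume: int) -> str:
    """Toy dispatcher: thin-data long-tail queries go to the fixed algo,
    high-volume queries get the self-adjusting ('pseudo-AI') ranker."""
    if monthly_search_volume < MIN_MONTHLY_SEARCHES:
        return "fixed-algo"       # not enough impressions to A/B the SERP
    return "self-adjusting"       # plenty of CTR data to crunch

print(choose_ranker(2))      # a long-tail query
print(choose_ranker(5000))   # a head term
```

The cheap path handles the bulk of queries; the expensive learner only runs where it has enough data to earn its CPU cost.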
What do you guys think?