Forum Moderators: open
Each has to ship some algo improvement, drawn from suggestions made at the "Friday happy-hour, feel-free-to-mention-anything" evening and from the top three most irritating items on the spam-report list.
Full moon is the deadline.
Failing to finish by the full moon means that next month you get to answer PR0 emails or scan catalogs.
(This month, many emails will get answered ;)).
GoogleGuy posting at WebmasterWorld at two AM local time is a good sign the deadline is approaching or has already passed.
At least, that's how I picture it.
(ID) Index Date = (P/N) + (P/O) / (#servers + (?)) ... and so on ...
This may be total crap, but it's maybe worth looking into, since everyone is so keen on finding out when this happens. I'll try to dig up more about it as I can get info.
l8r
For the algorithm piece, for example, I'm thinking that they must have certain new features that they want to add before doing an update. So the developers each work on their own piece. And then the testers (Quality Assurance or QA) test the software to see if it is producing the expected results. The QA team then lets the developers know what problems they see, the developers fix them, QA tests again, etc.
It is not unusual for a software product to take 50% (or more) longer than the projected time. In the case of Google, they don't have that kind of leeway.
If they're getting close to the deadline and a certain feature can't be made to work as well as they'd like, it needs to be "backed out" of the code. By this time, the developer is probably sleeping in his office at night - so as not to waste time going home and back. And then, of course, the code needs to be tested again after the changes have been backed out.
And when software is ready, they apply it to the data and roll out the update.
I may be way off base, but this is my simplistic take on what the process might be.
Beth
So my theory: Google doesn't just flip the switch. Instead, Google has to check each of the 2.4 billion pages against the other 2.4 billion pages. If that's true, then we are dealing with an enormous number of comparisons - roughly 2,400,000,000 × 2,399,999,999 / 2 distinct pairs, on the order of 2.9 × 10^18.
(Note: I first wrote this as 2,400,000,000! - that exclamation point is called a factorial - but a factorial counts orderings of all the pages, which is astronomically bigger than the number of pairs.)
[edited by: dvduval at 4:40 pm (utc) on Sep. 25, 2002]
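For fun, here's a quick back-of-the-envelope check of those numbers. The 2.4 billion figure is just the index size quoted in the post above; everything else in this sketch is illustrative, not anything Google actually runs:

```python
import math

n = 2_400_000_000  # assumed index size from the post: 2.4 billion pages

# Number of unordered pairwise comparisons: "n choose 2" = n * (n - 1) / 2
pairs = n * (n - 1) // 2
print(f"{pairs:.3e}")          # prints 2.880e+18
assert pairs == math.comb(n, 2)  # same answer via the combinatorics helper

# n! itself is far too large to compute directly, but Stirling's
# approximation gives its order of magnitude:
#   log10(n!) ≈ n * log10(n / e) + 0.5 * log10(2 * pi * n)
log10_fact = n * math.log10(n / math.e) + 0.5 * math.log10(2 * math.pi * n)
print(f"n! has about {log10_fact:.0f} digits")  # about 21.5 billion digits
```

So even the pairwise count (~2.9 quintillion) dwarfs the page count, and the factorial isn't just bigger - merely writing it down would take tens of billions of digits.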
And when you start to think about the complex interactions that could occur between the "main" algorithm software, the bot software (including its own algorithm, I suppose), the data, and who knows how many other pieces, it sure makes you appreciate what goes into the whole process.
Not to say that I, like everyone else, wouldn't like to see the update happen right NOW. : )
Beth
Google operates in a place beyond space and time; trying to find a linear logic to the process is clearly missing the point. What we see every month (or so) are merely the offshoots of a complex multi-dimensional space-time continuum. I think it probably has something to do with the emergence of new black holes in the universe.