Forum Moderators: Robert Charlton & goodroi
There are a large number of Data Centres that feed into Google.com. These are in a state of flux following a major infrastructure upgrade called Big Daddy. Depending on where you are in the world and which ISP you are using, you can receive SERPs from a number of different Data Centres, each of which has somewhat different results.
What seems to happen when Google does a major upgrade, whether to their algorithm or their infrastructure, is that they cause all sorts of unexpected problems and then spend the next three or four months trying to clean up the mess they have inadvertently caused. It's almost as though their data is a big fan and they occasionally chuck a huge turd at it because they like cleaning up the mess. During the clean-up period the changes migrate across the Data Centres and you see the situation that you describe.
I hope that this helps
Best wishes
Sid
Best explanation yet. Do you think they're backing out Big Daddy? It sure looks like a complete rebuild. All manner of odd things going on.
I could tell them how to save some bandwidth. Downloading the robots.txt and the sitemap.xml in one pass, ignoring the pages that have changed (as per the sitemap), and then downloading the home page (which HASN'T changed) twice in three seconds doesn't seem a good use of resources to me.
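The bandwidth-saving crawl described above can be sketched roughly: read the sitemap once, compare each URL's `<lastmod>` against the time of the previous crawl, and only re-fetch pages that have actually changed. This is just an illustrative sketch, not how Googlebot actually works; the sitemap XML and dates below are invented example data.

```python
# Sketch: use sitemap <lastmod> stamps to decide which URLs need re-fetching,
# instead of re-downloading unchanged pages. Example data only.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_to_refetch(sitemap_xml: str, last_crawl: datetime) -> list[str]:
    """Return the URLs whose lastmod is newer than the previous crawl."""
    root = ET.fromstring(sitemap_xml)
    changed = []
    for url in root.iter(SITEMAP_NS + "url"):
        loc = url.findtext(SITEMAP_NS + "loc")
        lastmod = url.findtext(SITEMAP_NS + "lastmod")
        if loc is None:
            continue
        if lastmod is None:
            # No lastmod given: be conservative and re-fetch.
            changed.append(loc)
            continue
        modified = datetime.fromisoformat(lastmod.replace("Z", "+00:00"))
        if modified > last_crawl:
            changed.append(loc)
    return changed

# Invented two-page sitemap: the home page is old, one page is new.
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc><lastmod>2006-05-01T00:00:00Z</lastmod></url>
  <url><loc>http://example.com/new-page</loc><lastmod>2006-05-20T00:00:00Z</lastmod></url>
</urlset>"""

last_crawl = datetime(2006, 5, 10, tzinfo=timezone.utc)
print(urls_to_refetch(sitemap, last_crawl))  # only the page changed since last crawl
```

Under this scheme the unchanged home page would not be downloaded at all, let alone twice in three seconds.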
I wonder if Google runs some phantom datacentres with some phantom webmasters who can feed back what nonsense they're currently up to.