outland88 - 6:12 pm on Apr 11, 2010 (gmt 0) [edited by: outland88 at 6:16 pm (utc) on Apr 11, 2010]
One of the many ironies here is Google's own statement in WMT: "These estimates are of low accuracy." Well, if it's of low accuracy, why are they using it? Plus I'm now also seeing them measure the load speeds of my mail servers. Why does it matter to Google how fast or slow my personal mail is? I'm not broadcasting my mail, unless Google has found a way to slip in and do that.
Plus the big "iffy" to me is using any of the tools Google provides in WMT to check load times. Is this just another Google gambit to gather (snoop) even more data from webmasters, or a surefire way to penalize sites?
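If you'd rather not hand Google more data just to see your own load times, you can time fetches yourself. A minimal sketch below, using only the Python standard library; the throwaway local server just stands in for a real site so the example is self-contained, and in practice you'd point `url` at your own pages instead.

```python
import http.server
import threading
import time
import urllib.request

# Throwaway local server standing in for a real site (illustrative only).
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "2")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

def fetch_time(url, attempts=3):
    """Return the fastest of several timed fetches, in seconds."""
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as resp:
            resp.read()  # include the body transfer in the timing
        timings.append(time.perf_counter() - start)
    return min(timings)

elapsed = fetch_time(url)
print(f"fastest of 3 fetches: {elapsed:.4f}s")
server.shutdown()
```

Taking the fastest of several attempts filters out one-off network hiccups; swap in your own URL and run it from a few locations to get numbers you actually control.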
Third, will this policy mainly be enforced against the rank and file, or will Google's "fat cat" buddies be exempt from any penalization? In other words, has Google deemed some sites so indispensable to its own interests that they'll feel no wrath?
I also get quite a kick out of Google's note in WMT that the estimate is based on "fewer than 100 data points." Most people who do any testing will tell you that with a margin of error beyond 4-5 points you can't predict much of anything, and above 10 you might as well be flipping a coin.
Plus I wonder if this has anything to do with Google's rollout of ultra-high-speed internet in places where people will do just about anything to get it.