Andylew - 6:56 pm on May 26, 2010 (gmt 0)
If you're worried about the speed factor causing a massive traffic loss - I'd say relax about that one. First, it's a minor factor. And second, server speed is only a small part of the site speed metric. The bigger part is all about the page's rendering speed in a browser, and those fixes are within your ability to change without moving servers.
I partly agree; however, basic bandwidth throttling on a shared server would restrict Google's ability to spider an entire site within a given time period.
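Just to put rough numbers on that throttling point - a back-of-envelope sketch in Python, where the 0.4 fetches/second cap is a made-up figure for illustration, not anything Google publishes:

# Rough upper bound on how many pages Googlebot can fetch per month
# when a shared host throttles its request rate. Figures are hypothetical.
SECONDS_PER_MONTH = 30 * 24 * 3600  # ~2.59 million seconds

def pages_crawlable_per_month(max_requests_per_sec):
    """Pages fetched in a month at a capped request rate (one page per request)."""
    return int(max_requests_per_sec * SECONDS_PER_MONTH)

print(pages_crawlable_per_month(0.4))  # ~1,036,800 - roughly the 1 mil pcm in the example below

So a throttle of well under one request per second is already enough to stop a 5 mil page site from being fully crawled in a month.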
A very simple way of comparing sites with similar content, and I would say of improving their algo, would be to add a ratio: site pages : max crawlable pages pcm.
The logic is quite straightforward:
Site 1: 5 million pages, of which Google can only spider 1 mil pcm because of the server's limitations (not Google's), but all pages are updated once per month. That means some pages may only be crawled once every 5 months, so Google's index could have pages listed that are 4-5 months out of date.
Site 2: Same site, but because of better server resources Google can crawl all 5 mil pages per month. That means Google would have pages that were never more than a month out of date.
Site 3: Same site, but enough server resources for Google to crawl 5 mil pages per day, resulting in near real-time indexing of new pages.
If they were the only three sites in the world, IMO they would rank: site 3, site 2, site 1.
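To make the freshness point concrete, here's a quick sketch, assuming uniform crawling and the hypothetical crawl budgets from the three scenarios above:

def worst_case_staleness_months(site_pages, crawl_budget_pcm):
    """Months needed to revisit every page once, i.e. the oldest a cached copy can get."""
    return site_pages / crawl_budget_pcm

SITE_PAGES = 5_000_000
crawl_budgets = {
    "site 1": 1_000_000,        # server can only sustain 1 mil pages pcm
    "site 2": 5_000_000,        # whole site crawlable each month
    "site 3": 5_000_000 * 30,   # whole site crawlable every day
}

for name, budget in crawl_budgets.items():
    print(name, round(worst_case_staleness_months(SITE_PAGES, budget), 2))
# site 1 -> 5.0 months, site 2 -> 1.0 month, site 3 -> 0.03 months (about a day)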
Now, on a large scale, the same applies. If their latest infrastructure and algo mean they can now, in effect, list results in real time, then the time it takes to spider an entire site to find new content is going to be paramount, or they will never be able to find the new content. Previously, Google's own infrastructure limitations would probably have meant that they couldn't spider a 5 mil page site in a month (or pre-defined period), so they couldn't use this to penalise such a site. Now that they have more resource, they have removed their own limitations, so can they now use webmasters' hardware limitations as a direct ranking factor?