Server speed is a factor that helps produce a better crawl, potentially more frequent visits per page and deeper penetration into the site. But it's not a factor in staying in the index.
The number of pages that actually stay in the index, rather than just get spidered, relates very strongly to the backlink strength of the home page and to how that link juice is distributed internally. Backlinks to deeper pages are also a big help. Many webmasters have noticed that there seems to be a formula (admittedly a moving target of a formula) for how many pages will be retained in the main index for any site. It's not a simplistic formula, however. It's not a flat percentage, nor is it a fixed hard number.
Page load speed is still not an active factor in ranking (or crawling or indexing), but it is definitely on the horizon as a ranking factor, possibly this year. The early notices and the push from Google to inform webmasters and give them some tools and education began last year.
I'm following up from an earlier 2010 thread where it was speculated that Google would start using this factor in its algo. Has anyone noticed whether site speed is now a ranking factor?
Late last year, Matt Cutts indicated in a video that page load speed was live as a ranking factor.
.... interest me a lot more. Fast sites set off a cascade of effects that can indirectly improve rankings.
Wish I could - it's not easy to find specific videos, especially when it's Matt Cutts and could be on any of several websites. I've already spent 30 minutes looking, and that's enough for now.
Several websites I have worked with have seen significant conversion and subsequent revenue increases by making speed changes that most webmasters would deem unnecessary.
They would likely target websites where a significantly slower site speed is a detriment to user value.
so that wouldn't be fair
I think if you want to play this game long-term, you should probably up the ante a bit
And now today, every time a page on Wired or whatever site grabs my browser and won't let go for 25 seconds, I want to grab their web team and scream at them.
I remember an annual contest in the 90s run by the Bandwidth Preservation Society - the idea was to build an entire site with less than 5kb of code. Oh yeah, that was a discipline! In those days, people used to say that "www" stood for "world wide wait". Of course many were on 8kb dial-up modems, too.
There are other factors in making a page load fast for visitors as well.
Some sites are so bloated that they take just as long to load today as pages did in the dial-up days.
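To put some rough numbers behind that point, here is a minimal back-of-the-envelope sketch. It only estimates raw transfer time from page weight and link speed (ignoring latency, rendering, and compression), and the 3 MB "bloated page" weight and 10 Mbps broadband figure are illustrative assumptions, not measurements from this thread:

```python
def transfer_seconds(page_bytes: int, link_kbps: float) -> float:
    """Estimate raw transfer time: convert bytes to bits,
    then divide by the link speed in bits per second."""
    return (page_bytes * 8) / (link_kbps * 1000)

# A 5 KB page (the old contest budget) over a 56 kbps dial-up modem:
lean = transfer_seconds(5 * 1024, 56)                 # roughly 0.7 s

# A hypothetical 3 MB modern page over a 10 Mbps broadband link:
bloated = transfer_seconds(3 * 1024 * 1024, 10_000)   # roughly 2.5 s
```

Under these assumptions, the bloated page over broadband actually takes longer to transfer than the lean page did over dial-up, which is the complaint in a nutshell.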