| 5:44 am on Nov 19, 2009 (gmt 0)|
|Good news for crappy pages with stolen text and no images + a load of Adsense ads on it! |
| 6:16 am on Nov 19, 2009 (gmt 0)|
|<snip> ... those with more dollars who can afford faster servers and associated hardware would "win" |
Not especially. Fast hosting isn't a bank-breaking expense. When you want speed, shop around for good quality hosting and don't just take the cheapest crappy discount plan you can find. I run dozens of blazing fast sites on a VPS, and it costs me less per month than my coffee habit.
The other factors are where you'll make or break your speed - good code, good optimization, small files, slim design. Even on a blazing fast host it's easy to make a site that loads slowly.
| 8:14 am on Nov 19, 2009 (gmt 0)|
Let's take a newspaper site as an example...isn't the browser just grabbing THAT page?
The delays on the user end are from the site then loading 18 other third party things that we will (eventually, if none of them upchuck) see.
| 9:22 am on Nov 19, 2009 (gmt 0)|
Yes, there are tons of things we can do to speed up a website, including sacrificing design for speed or paying for super-fast hosting, but none of this really matters. The point is that by including the speed of a site in its algorithms, Google is once again dictating how it wants the web to be and forcing site owners to comply with its ever-growing list of demands.
| 10:31 am on Nov 19, 2009 (gmt 0)|
|The point is that by including the speed of a site in its algorithms, Google is once again dictating how it wants the web to be and forcing site owners to comply with its ever-growing list of demands. |
Or, realistically, it's reflecting the desires of its user base, and promoting sites that meet those desires.
If Users feel Google are promoting crap sites due to speed, they'll leave. If Users like the sites being promoted, they'll stay.
Google might have undue influence on technical development (see the SPDY thread [webmasterworld.com], or nofollow, the canonical tag, etc.), but ranking is their secret sauce. If users like it, it sticks. If users don't, it doesn't.
Sounds like some on this thread want Webmasters to have undue influence on Google's algorithm, but see it through the prism of "Google has undue influence on the web".
| 10:42 am on Nov 19, 2009 (gmt 0)|
After my initial distaste at Google trying to lead us in one direction rather than, in effect, reporting on what is happening naturally out there, I'm starting to warm to this idea.
I don't like the idea of speed in isolation getting a boost, but as one of 200 factors compared against your competition, perhaps it's a good thing. What's really important is that you can compare your site against the competition, and in most cases you can do something about what you find. If you can't beat your competition on speed, you either have to try harder or do more in other SEO areas. Overall I've done a U-turn and think that if it's done right it could be a good thing.
| 12:50 pm on Nov 19, 2009 (gmt 0)|
|Or, realistically, it's reflecting the desires of its user base, and promoting sites that meet those desires. |
|I don't like the idea of speed in isolation getting a boost, but as one of 200 factors compared against your competition, perhaps it's a good thing. What's really important is that you can compare your site against the competition, and in most cases you can do something about what you find. If you can't beat your competition on speed, you either have to try harder or do more in other SEO areas. Overall I've done a U-turn and think that if it's done right it could be a good thing. |
Thanks for the voices of reason.
|The point is that by including the speed of a site in its algorithms, Google is once again dictating how it wants the web to be and forcing site owners to comply with its ever-growing list of demands. |
The people at Google did a study on page load time.
(Outcome linked in this thread.)
The study suggested a longer page load time resulted in decreased visitor satisfaction.
They want to return results their visitors enjoy and perceive as better than other search engines.
The people at Google decided that, since page load time appears to be important to their visitors (according to information gained from testing a hypothesis on a live website, not 'I asked the neighborhood' or 'this is how I feel, so everyone must'), they would include speed as one of the 200 or so factors used to determine the order in which links appear in their results.
Then the people at Google decided to give webmasters a heads-up, so that while building and managing websites they would know ahead of time that another factor, one that could be tough to definitively detect in the ordering of results, was most likely coming. That way webmasters could start preparing for it, rather than it taking six months and 100 'mysterious rankings disappearance' threads for people to even guess at what the new factor is. And the people building websites decided to complain about it?
If you want to be upset at someone or something be p***** at the liars who tell you to your face, 'Speed doesn't matter too much... A 1 or 2 second load time is ok, as long as it's not really slow.', then go and demonstrate exactly the opposite behavior.
Some basic reasoning in the place of 'bashing Google for the sake of bashing Google' would probably lead to the conclusion the behavior of Google.Com's visitors is what dictated speed as a probable factor in the rankings, not Google trying to dictate how it wants the web to be. And, they aren't forcing anyone to comply, they're letting you know with a fair degree of certainty (as much as we're likely to get) speed is probably going to be a factor in the ordering of results, so build (or adjust) your site accordingly.
They're giving us all some valuable information about the effects of site speed on visitor behavior, and letting us know that, based on the behavior they observed, Google will probably use speed as a factor in the ordering of results, in an effort to show their visitors what their behavior suggests they like to see in a website.
@ tedster: There I go earning my nickname again. :) Maybe some of it has to do with the lack of objectivity, common sense and reasoning displayed at times, like complaining about a ranking factor based on what the gathered data seems to suggest, rather than saying, 'Thanks for letting us know. Not only do we avoid having to test extensively to determine another one of the factors involved in the ordering of results, we also have an idea of something we can (all probably) easily and inexpensively do with our site(s) to provide a better visitor experience...'
BTW: Thanks to the Yahoo @ Google (is that an oxymoron?) who decided to tell us about this...
Some of us appreciate the info, and even if our sites don't appear faster to a bot, they may to our visitors.
| 1:53 pm on Nov 19, 2009 (gmt 0)|
Yes, this isn't a ranking issue, it's a $ issue. I'm sure I've seen studies in years past where there's an increase in sales for decreased page load time.
Two takeaways: 1) do this for your pocketbook, not the rankings. It's good business sense. (and you don't have to sacrifice design. You may have to sacrifice huge graphics - but that's 'poor' design in most cases anyway). 2) We've got Google telling us relatively concretely that here's a factor you can use to boost your serps. This is a good thing. They don't normally tell us this.
| 2:20 pm on Nov 19, 2009 (gmt 0)|
|The study suggested a longer page load time resulted in decreased visitor satisfaction. |
i can understand that in relation to google's search page, because people basically just come for the results. so the quicker they appear the better.
but i don't think you can apply the same rule to every site out there. people will quite happily wait for an extra second or two if they know what's coming. how many of us long-time users would desert webmasterworld if it suddenly started taking one second longer to load? probably none of us, but presumably it would suffer a hit to its rankings.
| 3:17 pm on Nov 19, 2009 (gmt 0)|
|people will quite happily wait for an extra second or two if they know what's coming. how many of us long-time users would desert webmasterworld if it suddenly started taking one second longer to load? probably none of us, but presumably it would suffer a hit to its rankings. |
Yes, it might, but if Webmaster World's owner felt that the benefits of slower page-loading outweighed the benefits of ranking 2nd instead of 3rd for "widget anchor text" or "AdSense whatsit glitch," he might find the tradeoff acceptable.
Just as important, Google searchers who weren't Webmaster World regulars wouldn't necessarily be missing anything, because:
- Webmaster World wouldn't disappear from the rankings, and...
- Searchers who never got beyond the top 1, 2, or 3 search results would be directed to similar information on sites that matched or outscored Webmaster World pages for Google's other 199 or so ranking factors.
There's zero chance that Google will rank pages solely on how long they take to load, so why obsess about that one possible ranking factor? Especially when you should be able to fix the problem (and serve users better in the process) if it does turn out to affect your Google Search rankings?
| 3:29 pm on Nov 19, 2009 (gmt 0)|
|how many of us long-time users would desert webmasterworld if it suddenly started taking one second longer to load? probably none of us, but presumably it would suffer a hit to its rankings. |
Desert? Maybe not.
Visit less frequently? Absolutely.
If there was a one second delay for every page load I wouldn't visit, read or post anywhere near as much. (Don't give Brett any ideas...) There are days now when my visit ends when the site does the 'new post stall out' on me...
You might think that 'consciously' a second or two of load time doesn't matter that much, because it doesn't sound like it should, but IMO behavior 'subconsciously' changes toward a website that is slower to load. (I don't have any data to back the preceding statement up, but my gut tells me people say it doesn't matter, yet their behavior changes when faced with the situation.)
I surf my sites all the time to see how they load and where the slowdowns are, so I can try to eliminate them. I'm on a fast connection, so I know that if I can see the page change, I'm probably not quite as fast as I want to be for dial-up visitors.
I can't always get to the point where, like here, I can't tell the page has reloaded without looking, but I'm usually close...
Anyway, I know from surfing my own sites, even the graphics that take a bit of 'extra time' to load (not the entire page) get to me after a while, so I try to find a way to eliminate the hesitations from the page displaying as much as possible.
There's one I work on, and according to YSlow the home page is at about 79.4% for speed / optimization / load time (whatever they base that percentage on... I don't have it installed right now or I'd be more precise).
Anyway, I also know that's a script telling me about it, and I've opened it on dial-up a couple of times. The real-time 'request to display' on 56K is less than two seconds every time, and usually under one second. (Uncached, the graphics take about 1.5-1.75 seconds from page-open to full load. So without the graphics cached, under two seconds is the norm; with them in the cache, under a second.) It's got a graphic header about the size of the header here, a graphic footer about the size of the crumb line at the bottom of the page, and a banner... Most of the rest of the page is designed using CSS rather than images, but it's a dynamic directory, so this includes all DB connections, page parsing, mod_rewrite, and everything else that makes a PHP-driven site with friendly URLs work.
If you want to have some fun and see my point sometime, back up one of your homepages, upload it to a test server, and use JS or a server-side language to slow it down. Then refresh it a few times and sit there and watch it load... Empty your cache, refresh some more, empty some more, and when you get bored, go back to your 'regular speed' home page and do the same thing. (This is what happens when someone visits that page frequently and doesn't have a cache of it... You're just compressing their visits into a shorter time span to see what they go through to open and view the page.)
When you're done with that comparison, take the test page back to 'regular speed' and start optimizing for speed. Pay careful attention to which graphics load first, which delay, what you can see before the graphics are loaded, and where the bottlenecks might be, then start playing with load order, text length (within the source code), etc. to see how much faster you can make it actually appear in the browser. (Using the suggestions from some of the tools is cool, but IMO the only way to really tell what difference there is, and what impact it may have, is to sit and watch it load over and over.)
When you've got it as fast as you can make it, go back and surf your home page again... My guess is you'll optimize your entire site and think speed is highly important to visitors, even if they tell you it isn't.
EDITED: Added clarification to some points.
| 6:19 pm on Nov 19, 2009 (gmt 0)|
I know I leave a site that starts to load slowly; I'm gone in a flash.
Most of the time it's because not all of the data is in the same place. We've got the page loading from the server, the pictures the site is trying to show you loading from another server for a Flickr account, and then the ads displaying on the site coming from yet another server.
I've seen full pages get hung because the ads took 5 seconds to load.
Sorry, I'm not going to sit around and wait for your mix-and-match website to load.
| 7:58 pm on Nov 19, 2009 (gmt 0)|
At the end of the day, what we think doesn't matter. In absolute terms it doesn't matter what Google thinks. In comparative terms it does matter what Google thinks, because being speedier can give you an edge over your competition.
It doesn't even matter if the speed measurements are real, what matters is finding out what works in Google's algorithms.
All this stuff about returning visitors, the odd second here or there is for another discussion and is in my opinion of absolutely no relevance to this discussion. They are important considerations but not in Google ranking terms.
The questions are:
Can you make your pages speedier without detracting from user experience?
Are you prepared to do a bit of analysis and work to have a chance of boosting your rankings?
If the answer is yes and yes, then download the Firefox extension and start analysing your own pages and your competitors'. You will find loads of things you can do to improve speed and, more importantly, gain an edge over lazy competitors.
The tool is far from perfect but it is a good resource and it does provide many avenues for improvement.
| 8:07 pm on Nov 19, 2009 (gmt 0)|
What I don't understand is... how does the speed of any particular site speak to its relevance?
The great thing about Google, in fact the best thing about Google, is the relevance of its results.
I don't think speed speaks to relevance at all.
I would hate to see the more relevant site on a subject get pushed down below a less relevant site on the same subject just because the less relevant one has better performance.
| 8:10 pm on Nov 19, 2009 (gmt 0)|
|They want to return results their visitors enjoy and perceive as better than other search engines. |
Many users could have the perception, even subliminally, that Google speeds up their computer (or other SE's slow it down). "Whenever I use Google, my computer just seems to work better. That's all."
| 8:10 pm on Nov 19, 2009 (gmt 0)|
|What I don't understand is... how does the speed of any particular site speak to its relevance? |
It doesn't speak to its relevance. It probably speaks to its 'usefulness' from a visitor's perspective. As I mentioned, I'm sure I've seen studies showing that very small delays in page loads mean a noticeable decrease in conversions. In large numbers, your visitors CARE about page load speed and leave if there's any kind of delay.
In addition, and I speculate here, I bet there's a correlation between speed and spam. In other words, it's not just speed alone but a bunch of other factors AND slowness; slow may be another marker of low-quality sites. (My main sites are ripping fast. My crap test sites are slow. So for me, this new test is accurate.)
| 12:05 am on Nov 20, 2009 (gmt 0)|
|It doesn't speak to its relevance. It probably speaks to its 'usefulness' from a visitor's perspective. |
Yes, and any number of sites might be equally relevant for a given term. Joe's green widgets page at Joes-thin-affiliate-site.com might be as relevant for "green widgets" as a Wikipedia article on green widgets is, but it won't rank as high on a SERP after the search engine factors in things like trust, quality of inbound links, amount of content on the page, etc.
| 6:29 pm on Nov 23, 2009 (gmt 0)|
Folks, some of you were in the session I did at Pubcon about loading speed and how to achieve it on a shoestring budget.
Let me summarize some of the points that I hope are helpful to anyone concerned about "loading speed":
The loading speed of a page depends on various factors - some are easy to control from your side, some are not.
Tools to find out about the network profile of your hosting company are available on Linux servers, and also on sites that provide a web-page interface to these tools.
Test your route to the domain/IPs your pages are on from different websites with these tools. A good round-trip time for a packet of data sent there and back is measured in milliseconds, and should be below 100 ms!
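If you don't have shell access to run those network tools, a rough substitute is to time a bare TCP connect, which approximates one network round trip. This is just a sketch, and 'example.com' is a placeholder; the dedicated tools report far more detail.

```python
# Rough round-trip probe: time how long a plain TCP connect takes.
# A connect completes in roughly one network round trip, so the best of a
# few attempts approximates the RTT the post describes.
import socket
import time

def connect_rtt(host, port=80, attempts=3):
    """Return the best TCP connect time in milliseconds over a few attempts."""
    best = float("inf")
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass  # connection established; close immediately
        best = min(best, (time.perf_counter() - start) * 1000.0)
    return best

# Example (assumes outbound network access; hostname is a placeholder):
# print(f"{connect_rtt('example.com'):.1f} ms")  # aim for well under 100 ms
```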
If you are with an unstable host or on a blade server with your VPS that is constantly overloaded there is NO way to improve that other than MOVING!
Then configure your webserver right. With Apache there are 2 basic configuration modes and 1 very important config option:
a) deliver on persistent connections or not
b) maximum concurrent connections
The best values differ from site to site. There is NO perfect default value!
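For reference, the Apache directives behind points (a) and (b) look roughly like this in httpd.conf. The values shown are placeholders to illustrate the knobs, not recommendations; as said above, there is no perfect default.

```apache
# (a) Persistent connections: whether one TCP connection serves many requests.
KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 5          # a short timeout frees workers quickly

# (b) Maximum concurrent connections (prefork MPM shown; tune per site).
<IfModule mpm_prefork_module>
    MaxClients 150
</IfModule>
```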
Once you've made sure you are with the right hosting company (pricing is NO indication of quality here, by the way), you may start to work on the real factors that are within your reach.
First understand the number of requests sent to your webserver when a single page is requested: one HTML page normally also triggers requests for the stylesheets, scripts and images it references.
So a single page may trigger 20 requests to your webserver, and your webserver does not like that at all! 20 requests per second could easily make the next visitor feel an additional delay of 1 second!
Install "firebug" for Firefox and watch the net statistics while you reload pages. Hold the shift key while doing so, or you will hit your local cache!
If your page takes more than 5 seconds on a DSL or cable modem line - you have to do something!
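If you want a number you can script rather than eyeball, here is a small sketch that times individual requests, roughly the way Firebug's Net panel displays them per row. The URLs in the usage comment are placeholders for your own page and assets.

```python
# Time each HTTP request separately: elapsed seconds plus bytes received,
# roughly one row of Firebug's Net panel per URL.
import time
import urllib.request

def timed_get(url):
    """Fetch a URL and return (elapsed_seconds, bytes_received)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        body = resp.read()
    return time.perf_counter() - start, len(body)

# Placeholder usage -- substitute the page and assets you actually serve:
# for url in ["http://www.example.com/", "http://www.example.com/style.css"]:
#     elapsed, size = timed_get(url)
#     print(f"{url}: {elapsed * 1000:.0f} ms, {size} bytes")
```

Note this fetches sequentially, while browsers parallelize, so treat the total as a worst case rather than what a visitor actually experiences.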
This is just a first step; getting a really fast-loading site is a process of optimization over time. I look at speed about once a week and try to come up with something to improve it. After a few months you will be there, and you will see your conversions increase and more people coming to your site!
| 7:48 pm on Nov 23, 2009 (gmt 0)|
Here's something I've been mulling over in terms of whether speed is helpful to relevance or utility (of course, if you run a website, you should aim to make it as fast as possible).
For some Google searches you have sorting options, for instance "by date". If you had the sorting option "by fastest sites" would you choose it?
Personally, I think I would choose that option if my internet was uncomfortably slow, or I had experienced a number of slow sites in the results. By default, I think I'd prefer "sort by most relevant".
| 9:10 pm on Nov 23, 2009 (gmt 0)|
|if my internet was uncomfortably slow |
I think you're mixing two different things. If I'm on a slow connection, I care more about page weight than about response times. If I'm on a fast connection, response times matter more than page weight.
Similarly, if I'm on a satellite connection, the number of objects (independent of page weight) matters more because of latency than it does if I'm on a wired connection, either slow or fast.
But to answer your question: if I were on a slow connection and every page had a note that said "Objects 32; Size 200kb" I would use that data in deciding where to click. If it said "Avg response: 100ms" I'm not sure I would.
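That distinction can be put into a toy formula: total load time is roughly the bytes divided by bandwidth, plus one round trip per object. A sketch, using the "Objects 32; Size 200kb" example above; the link numbers are illustrative only, and real browsers pipeline and parallelize requests, so this overstates the latency cost.

```python
# Toy model: transfer time (bytes / bandwidth) plus one round trip per object.
def load_time(size_kb, objects, bandwidth_kbps, rtt_ms):
    """Crude page load estimate in seconds."""
    transfer = (size_kb * 8) / bandwidth_kbps   # seconds to move the bytes
    latency = objects * (rtt_ms / 1000.0)       # one round trip per request
    return transfer + latency

# The "Objects 32; Size 200kb" page on two very different links:
slow_dsl = load_time(200, 32, bandwidth_kbps=512, rtt_ms=50)     # bytes dominate
satellite = load_time(200, 32, bandwidth_kbps=5000, rtt_ms=600)  # latency dominates
print(f"slow DSL: {slow_dsl:.1f} s, satellite: {satellite:.1f} s")
```

Even this crude model shows why the two connection types would want different labels next to a search result.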
| 12:31 am on Nov 24, 2009 (gmt 0)|
How about another extreme? Let's say a result was labelled "this site has exactly the information you need but will take 5 seconds to load".
Quicker sites provide a better "user experience", I don't dispute that. Encouraging site owners to make their sites quicker is all well and good. Encouraging users to prefer quicker sites is counter-productive. I don't think that helps quality of information at all.
| 2:22 am on Nov 24, 2009 (gmt 0)|
That is a very astute observation, Andy. And it may be why Google is sort of "pushing from both sides" on this speed issue.
Yes, they are sending out advance warnings about site speed as a potential ranking factor in the algo, but they are also releasing tools and information that help sites speed up, and even working on the SPDY protocol to make speed more possible in a global way.
| 8:20 am on Nov 24, 2009 (gmt 0)|
I wonder if our distaste for being told what to do by Google is causing knee-jerk reactions; I know it did with me. There are a few things that we need to remember.
1. In Google terms speed is not absolute, you just need to be faster than the competition.
2. It is one of 200 factors and may be a minor factor.
3. We don't know what Google will be measuring.
4. We don't know if it will be page speed or site speed.
Assuming you and your competitors have every other optimisation maxed out, then it may be that the page that is speediest in terms of what Google measures will be given a boost compared with that competition. If the only factor you have left to out-compete on is speed (as measured by Google), then you are in a very unusual neck of the woods.
What I am saying is that in all probability there will be other things that you should be doing to your site that are more important than speed. Do those things first then use the tools provided to take out all of the speed mistakes you have made when designing your site.
Once this thing kicks in it will be interesting to see if site owners develop speedy gateway pages that then link to the real meat of information or entertainment. It seems to me that this would be an excellent solution, give Google what it wants, be one click away from what users want and have a very low bounce rate.
| 8:32 am on Nov 24, 2009 (gmt 0)|
Thinking about this, I don't think the rule would apply equally to everyone. With the kind of broadband available in the developed world, I don't think anyone would mind loading a slower page, because it is just a matter of waiting that extra second.
But for countries like here in India, the internet is still way slower. That means it makes sense for Google to show me text-based websites over Flash presentations, which would obviously take a lot more time to load.
| 4:44 pm on Nov 24, 2009 (gmt 0)|
For years we've talked about Google using user behavior in the SERPs. And of course, there's personalized search.
So for people like anand84 and me (supposedly in the developed world, but often in places with very slow connections), I would think personalized search might kick in over time.
All of the 200 factors used in SERPs are at least once removed from the ideal case (Google knows what you need) and response times and page load times are just another example. From Google's perspective, I would frame the question like this:
*Do we have data to show that all other things being equal, users prefer faster pages?
*If yes, how important is that factor relative to other factors and how does this vary with speed of connection?
If you can answer those questions, and answer yes to the first one, you have to slide that factor in. You can argue that the faster page isn't the better page, and of course it would have to be a close race for that to be the deciding factor.
But until some search engine can actually understand natural language and know the user so well that it knows the current state of knowledge of the user, it can't serve the "best" page based solely on quality of information. It can only know that these 28 pages seem to fit pretty well and might all be adequate, therefore let's go looking at tertiary factors.
With more data and more personalized search, I could see speed, page width, fixed versus fluid layouts, overall visual appeal, and rockin' Lawrence Welk music in the background all becoming factors in ranking pages.
Currently, all measures of quality in search are mere proxies for meaning and quality. We are in the infancy of search and no SE is capable of truly understanding a user's question and going out and finding the ten best pages for that user (because "best" is very user specific).
I would say that favoring pages with keywords that match the search keywords (matched at best by context, not by meaning, and without knowing the state of the user's knowledge) is a "first-order proxy" of meaning that lets the search engine guess at the meaning of the user's question.
Inbound links with anchor text are a secondary proxy of meaning, and also a proxy of quality, but currently preferred because the first order proxy is so easy to manipulate.
But inbound links can be manipulated too and furthermore, these are just proxies of meaning and Google is miles from understanding meaning itself, of either the question or the page.
So it brings in the other 200 tertiary proxies, which are mostly measures of quality, rather than meaning. And from Google's perspective "quality" means "the user prefers it" not "the best available information".
And that's as it should be.
| 5:11 pm on Nov 24, 2009 (gmt 0)|
Great Post, Thanks!
| 8:56 pm on Dec 1, 2009 (gmt 0)|
My colleagues and I have this thought:
Google's new OS is going to be web-dependent. With a web-based architecture, it will be imperative for the browser not to be weighed down by slow-loading sites.
So once again Google is only thinking about Google.
| 10:24 pm on Dec 1, 2009 (gmt 0)|
|Google's new OS is going to be web dependent. With a web based architecture it will be imperative for the browser to not be weighed down by slow loading sites. |
I'm not sure what you mean by "web dependent," but I doubt that the functionality of Chrome OS will be compromised by Joe Blow's slow-loading site.
IMHO, to whatever extent speed may be a factor in Google's complex ranking algorithm, it's likely to be about providing a good experience for the Google Search user. If statistics indicate that slow-loading sites dissatisfy searchers, why wouldn't Google search want to include "molasses factor" in its ranking algorithm?
| 9:49 pm on Dec 2, 2009 (gmt 0)|
There is now a new item in WMT, under Labs: Site Performance.
I'd say the presence of this section confirms site speed will soon be a factor in Google's algo.
| 6:23 am on Dec 3, 2009 (gmt 0)|
At my end, the Site Performance item started showing this morning.
| 7:45 am on Dec 3, 2009 (gmt 0)|
The only one who really benefits from millions of web pages being optimized for speed is the entity that wants to index those millions of pages. As a USER who opens several hundred pages/images a day, a few milliseconds lost is no big deal. That metric only becomes important if the USER is attempting to view millions of pages per day...
That said, you still want to avoid delay in delivery of content to the user.