| 12:32 pm on Nov 14, 2009 (gmt 0)|
Whether they do it or not, what's the betting that negative SEOs are going to play around with near-DoS attacks as a way of harming their clients' competitors?
| 12:38 pm on Nov 14, 2009 (gmt 0)|
That's a much scarier thought.
| 1:14 pm on Nov 14, 2009 (gmt 0)|
I doubt they will use speed as a factor on its own - it depends how the speed is factored in. If, for example, Google couples the speed factor with page bounce rate, they might conclude that a particular page's high bounce rate is influenced by its loading speed, and demote that page in the SERPs.
Or instead they could promote pages with high speed loading rather than demote slow loading pages.
| 6:19 pm on Nov 14, 2009 (gmt 0)|
I think this will be a good move.
There are so many database-driven sites out there which need more than two seconds to load because people don't know how to optimize database queries.
Our sites average 40ms. (WMT)
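As an aside, the query optimization mentioned above often comes down to nothing more than adding an index on the column you filter by. A minimal sketch in Python with SQLite (the table and column names here are hypothetical, just for illustration):

```python
import sqlite3

# Hypothetical example: without an index, a WHERE lookup forces a full
# table scan; with one, the database jumps straight to matching rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (id INTEGER PRIMARY KEY, slug TEXT, body TEXT)")
conn.executemany(
    "INSERT INTO pages (slug, body) VALUES (?, ?)",
    ((f"page-{i}", "content") for i in range(10000)),
)

# Before indexing: the query plan reports a scan of the whole table.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT body FROM pages WHERE slug = ?", ("page-42",)
).fetchall()
print(plan)  # plan detail mentions a SCAN

# Add an index on the column used in the WHERE clause.
conn.execute("CREATE INDEX idx_pages_slug ON pages (slug)")

# After indexing: the plan switches to a search using the index.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT body FROM pages WHERE slug = ?", ("page-42",)
).fetchall()
print(plan)  # plan detail mentions the index
```

On a table of any real size that one change can turn a multi-second page into a fast one.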
| 6:38 pm on Nov 14, 2009 (gmt 0)|
I think it is already a factor. Try moving your site to a fast server and you will see a rise in the SERPs within a few weeks, or even a few days.
| 7:55 pm on Nov 14, 2009 (gmt 0)|
Hum, define "rise in SERPs". Years ago, when the world was on dialup, small pages were in vogue, so this would be nothing new.
| 2:45 am on Nov 15, 2009 (gmt 0)|
|I think blindly taking speed into account might not be appropriate, since that would mean giving preference to a small two-page site over a comprehensive Wikipedia article. Don't you think? |
I doubt if Google does much of anything "blindly." There might be some sense in using pageload time as one of several possible tiebreakers in cases where all other ranking scores were equal, but I can't see Google ranking Bob's widget affiliate pitch ahead of a 2,000-word Wikipedia article on widgets just because Bob's page loads more quickly than Wikipedia's does.
| 3:01 am on Nov 15, 2009 (gmt 0)|
Soon we're going to see a new Google announcement:
"Sign up for Google Hosting, the fastest servers on the web"
| 3:16 am on Nov 15, 2009 (gmt 0)|
LOL - A fast loading page of crap, what a nice user experience.
| 3:23 am on Nov 15, 2009 (gmt 0)|
Speed is already a factor in AdWords.
| 5:09 am on Nov 15, 2009 (gmt 0)|
Sluice through goose... still end up with goose sluice.
Only pages that will get caught in this are flash. G has already dropped off "dialup" (remember that?) as a category. NOTE: No hard evidence in that regard, but do you think I'm wrong?
| 7:03 am on Nov 15, 2009 (gmt 0)|
Matt Cutts did a video interview about site speed [videos.webpronews.com] with Mike MacDonald of WebProNews. He puts more emphasis on giving fast sites a rankings boost, but he does mention that "really slow sites" are probably not what users want to see.
| 9:25 am on Nov 15, 2009 (gmt 0)|
Good news for crappy pages with stolen text and no images + a load of Adsense ads on it!
| 10:36 am on Nov 15, 2009 (gmt 0)|
I would imagine load speed won't be their only ranking factor.
I don't consider Wikipedia or my own very long pages stuffed full of images slow to load.
| 9:05 am on Nov 16, 2009 (gmt 0)|
How can we measure the speed of our website?
| 12:20 pm on Nov 16, 2009 (gmt 0)|
serenoo, try this tool [code.google.com...]
| 8:36 am on Nov 17, 2009 (gmt 0)|
Is there another way without installing anything?
| 5:28 pm on Nov 17, 2009 (gmt 0)|
|Is there another way without installing anything? |
Mouse button click and a stopwatch ;-)
The Firefox extension just takes you to a script called Web Page Analyzer 0.98, so you may find you don't need to install the extension at all.
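For a rough number without installing anything, a few lines of Python will time a page fetch for you. A single-sample sketch (the URL is a placeholder; real tools average many samples and also count images, CSS, JS and rendering time):

```python
import time
import urllib.request

# Rough, single-sample measurement: time to first byte (TTFB) and
# total download time for one HTML document only.
url = "http://www.example.com/"  # placeholder URL - substitute your own

start = time.perf_counter()
with urllib.request.urlopen(url, timeout=10) as resp:
    first_byte = resp.read(1)          # server + network latency
    ttfb = time.perf_counter() - start
    body = first_byte + resp.read()    # rest of the document
    total = time.perf_counter() - start

print(f"TTFB: {ttfb * 1000:.0f} ms, full page: {total * 1000:.0f} ms, {len(body)} bytes")
```

Run it a handful of times and take the median, since any one sample can be skewed by a network hiccup.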
| 5:54 pm on Nov 17, 2009 (gmt 0)|
Sorry to do an immediate follow-up post, but I wanted to say (rant) something about the general topic of this thread.
This is like newspapers stopping reporting the news and trying to create it instead. Google's sheer might can steer the web where it wants it to go, rather than in the direction its users want. They seem to think that search engines are about finding answers. One of their senior people even said recently that he could envisage a day when you would do a search and Google would return one correct answer.
I happen to think that much of the massive increase in use of the Web is not about finding answers but rather about hobbies and entertainment (and other stuff). The Web, it seems to me, is shifting emphasis away from text and onto media. People are using the web for entertainment, and search engines give them a starting point on a voyage of exploration, questionable habits and obsessions.
I have been doing web sites since most people had 14K modems, so out of habit I do all I can to keep my code and images as small as possible. Having looked at research on ADSL usage, I recently started giving myself the luxury of larger, less-compressed images on sites where it seemed to me users would appreciate better visual content. Now Google is in effect saying they care about three things - text, links and speed - and anything produced by an Adobe application just slows down your site, so it's a waste of time.
I agree with others here that sites that are very unresponsive and those where a database back end slows them down to a trickle deserve to be penalised but pages that start to load quickly but have lots of content shouldn't suffer.
| 6:17 pm on Nov 17, 2009 (gmt 0)|
I reckon the topic is fair. This and other recent concerns motivated me to do a Foo post whinging about software developers: [webmasterworld.com...]
Do you know it took me 76 seconds to download a page this afternoon? Fair enough, I'm on an overly slow connection (and I don't want to move to a big city for a super speedy one).
But things are getting really bad out there. Some sites are worse than they were in the times of dialup.
A kick up the arse is needed and I'm behind Google 100% on this however they tackle it.
I could take part of that hit if they do a silly implementation. Most of my images are in unnecessarily high resolution. I'm prepared to take a hit on that if they can't figure out that enough of the page loads quickly to let visitors start reading it.
| 6:30 pm on Nov 17, 2009 (gmt 0)|
There's a lot of help from both Google Page Speed and Y! Slow that goes beyond image compression and database optimization. Rather than resist, I suggest at least reading the help files.
For my entire online career I've felt that page speed is an important and almost secret weapon. Now it's becoming an open secret, apparently. I'll bet lots of people will still ignore it.
| 6:35 pm on Nov 17, 2009 (gmt 0)|
The only thing to come out of all this is that the purported social media consultants of today will also double up as website speed consultants from next year...
Talk about creating jobs!
| 9:09 pm on Nov 17, 2009 (gmt 0)|
|I'll bet lots of people will still ignore it. |
And still loads more will not understand what it means.
I didn't mean to suggest any form of resistance; I was just trying to point out that Google is pushing the web in a particular direction which isn't necessarily in the interests of users. Perhaps I'm becoming cynical in my old age, but I suspect there's more to this than the official line suggests.
| 9:40 pm on Nov 17, 2009 (gmt 0)|
There are different versions of speed: the overall load time, and the load time per KB of viewable content. There's also page rendering time.
Personally I think overall load time is a flawed measure, because it fails to take into account just how much useful content you might be trying to display. So you may have an 80kb image, but is it an intricate and attractive picture or just some badly-optimised logo?
|Perhaps I'm becoming cynical in my old age but I suspect there's more to this than what the official line suggests. |
Maybe the true agenda here is to get us to ditch all of those bloaty, useless nofollows.
| 9:48 pm on Nov 17, 2009 (gmt 0)|
When it comes down to it, loading speed has zero effect on relevancy - none whatsoever. The stated intent is to measure user experience, and somehow incorporate that into an algorithm that's supposed to be centred on relevancy.
I think Ted's spot on to say that speed is highly important. All that faster internet means is less patience ;)
There's, of course, an element of Google scaring people into speeding up their websites for the "greater good". The outcome is hard to criticise, although the method is not necessarily to my tastes.
| 10:52 pm on Nov 17, 2009 (gmt 0)|
If you've been following Google's arc on this, last year they did a lot of user testing around the speed of their own pages. They mentioned in a blog article that they were surprised that even fractions of a second were making a statistically significant difference in user abandonment rates.
- They were pushing sites to respond to HTTP/1.1 If-Modified-Since requests for quite a few years.
- They hired Steve Souders (the force behind the Y!Slow tool) away from Yahoo - and some of Steve's research uncovered very surprising data and new best practices.
- They created their own Page Speed tool and publicized it rather heavily.
- They moved into testing AJAX on their own SERPs - precipitously as it turned out - essentially to improve the speed of the user experience.
- They just recently announced the ongoing development of SPDY networking protocol.
Essentially they would like the Internet to run like a local application! Speeding up the web is a major and long-term goal for Google. Even though there may be a self-interest factor, there is definitely a strong and altruistic component, geek-style. They really want the world to have a faster Internet.
My take is that, at least in the beginning, the algo may reward fast sites, rather than penalizing unusually slow sites. They already have a pile of server response data from a decade-plus of spidering. They know how fast the server side is already, and it's probably not all that good. Some of the AJAX bloat out there is becoming horrendous - and that's especially true on techie sites!
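Incidentally, the If-Modified-Since support mentioned above is easy to check for your own server: send the header back and see whether you get a 304 instead of the full page. A quick sketch in Python (the URL is a placeholder, and your server must send a Last-Modified header for this to work):

```python
import urllib.request
from urllib.error import HTTPError

url = "http://www.example.com/"  # placeholder URL - substitute your own

# First request: note the Last-Modified header the server sends.
with urllib.request.urlopen(url, timeout=10) as resp:
    last_modified = resp.headers.get("Last-Modified")
    print("Last-Modified:", last_modified)

if last_modified:
    # Conditional request: a well-behaved server answers
    # "304 Not Modified" with an empty body, saving bandwidth.
    req = urllib.request.Request(url, headers={"If-Modified-Since": last_modified})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print("Server re-sent the full page, status", resp.status)
    except HTTPError as e:
        if e.code == 304:
            print("304 Not Modified - the cached copy is still good")
        else:
            raise
else:
    print("No Last-Modified header; conditional GET isn't possible here")
```

Note that urllib treats a 304 as an HTTPError, which is why the check sits in the except branch.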
| 11:06 pm on Nov 17, 2009 (gmt 0)|
|the algo may reward fast sites, rather than penalizing unusually slow sites |
It's the same thing, isn't it? It still favours a particular subset of sites at the expense of others, and has no relationship with relevancy.
When it comes to usability, speed is a massively significant factor. There's no argument from me on that and I guess I would stand to gain from speed in the algorithm.
But I can't get away from the idea that this has nothing to do with relevancy at all. Is it better to have a faster site, or a more relevant one?
| 11:38 pm on Nov 17, 2009 (gmt 0)|
No one seems to have asked WHY some web sites are slow to load.
Early this year I was forced to upgrade my 2-year-old server to a faster one with a bigger pipe. The reason? Higher-speed broadband coupled, to a degree, with high-speed accelerator (aka scraper) addons in browsers.
The speed of download has at least quadrupled over the past couple of years (note: this is the UK). It now takes only a dozen or so simultaneous visitors to overload a small virtual server, even one running at 100Mbps. Add a few videos, the odd megabyte Flash page or PDF, and speed is soon dragged down.
It's not the fact that there is more traffic, although there is to a certain extent (a high and increasing proportion of it bots from google et al). It's that scraper-browsers pull in a lot of pages and images VERY fast. Hit peak time with several doing this and splatt! An hour later, back to a trickle.
Of course, it's not always browsers at fault. I have education-usage sites that are plagued by so-called "security proxies" that take it upon themselves to scrape every page of a site "just in case" - and then come back again an hour later, regardless of caching directives. I can't block them or my client complains.
I'm not even going to go into "illegal" hits from content thieves and their like - I kill most of those at the door.
How many small-scale business and hobby sites can afford to shop around for higher speed hosting? Indeed, how many would even suspect it is necessary? Most, if told, would say, "Ok, it takes two seconds to show a page. So what? I can't afford to pay twice as much to halve the display time. Especially in this recession!"
Indeed, it doesn't matter that much. Most people are happy with a second or two of waiting: in my experience that is commonplace anyway. So Google gets upset if it takes a few hundred milliseconds: that's THEIR problem, not ours. The REAL visitors are quite content. If SEs want faster access then a) hit less frequently and less violently; and b) visit when the site isn't busy, like 1 am (and don't let them say that can't be done!).
The sites I host now are small. A year or so ago I moved my highest bandwidth site to its own high-speed server just to get ANY usable bandwidth back on the virtual server. It gave me a respite of about 12 months before I had to upgrade the VS. And charge customers for the extra. Which they are not entirely happy about but put up with it when I explain.
When it comes down to it, for whom do we build web sites? Search engines or customers?
SEs are dictating far too much. They want our content because it makes them money. I don't really think they care over-much if it makes us money - in several cases companies have been bankrupted or closed due to an SE screwing around with its algorithms (one of them was almost one of my clients!) and never an apology seen.
I know my rant will not affect the outcome of this - I doubt very much google will even see it. I just wanted to make the (long-winded) point. Sorry.
| 11:56 pm on Nov 17, 2009 (gmt 0)|
|I can't get away from the idea that this has nothing to do with relevancy at all. Is it better to have a faster site, or a more relevant one? |
Sort of like asking whether your food should taste good or be nutritious, isn't it?
I'm into SEO because I realized when I built my first website that it did no good if people didn't visit it. And if more people will come and stay when a site is fast, then I'll find a way to give them what they want.
| This 131 message thread spans 5 pages. |