How do you think it affects us? I think blindly factoring in speed might not be appropriate, since that would mean giving preference to a small two-page site over a comprehensive Wikipedia article. Don't you think?
Or instead they could promote fast-loading pages rather than demote slow-loading ones.
I think blindly factoring in speed might not be appropriate, since that would mean giving preference to a small two-page site over a comprehensive Wikipedia article. Don't you think?
I doubt if Google does much of anything "blindly." There might be some sense in using pageload time as one of several possible tiebreakers in cases where all other ranking scores were equal, but I can't see Google ranking Bob's widget affiliate pitch ahead of a 2,000-word Wikipedia article on widgets just because Bob's page loads more quickly than Wikipedia's does.
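To make the tiebreaker idea concrete, here's a purely hypothetical sketch in Python - nothing to do with how Google actually ranks, and the URLs, scores and load times are all invented - where load time only matters once relevance has already had its say.

```python
# Hypothetical illustration of speed as a tiebreaker, NOT Google's algorithm.
# URLs, relevance scores and load times are all invented for the example.
results = [
    {"url": "wikipedia.org/wiki/Widget",  "relevance": 0.95, "load_secs": 2.8},
    {"url": "bobs-widgets.example.com",   "relevance": 0.40, "load_secs": 0.3},
    {"url": "widget-guide.example.com",   "relevance": 0.95, "load_secs": 0.9},
]

# Primary key: relevance (descending). Secondary key: load time (ascending).
ranked = sorted(results, key=lambda r: (-r["relevance"], r["load_secs"]))

for r in ranked:
    print(r["url"], r["relevance"], r["load_secs"])

# Bob's fast affiliate page still trails both relevant pages;
# speed only breaks the tie between the two equally relevant ones.
```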
"Sign up for Google Hosting, the fastest servers on the web"
The only pages that will get caught in this are Flash. G has already dropped "dialup" (remember that?) as a category. NOTE: no hard evidence on that, but do you think I'm wrong?
I don't consider Wikipedia or my own very long pages stuffed full of images slow to load.
This is like newspapers giving up on reporting the news and trying to create it instead. Google's sheer might can steer the web where it wants it to go rather than in the direction its users want. They seem to think that search engines are about finding answers. One of their senior people even said recently that he could envisage a day when you would do a search and Google would return one correct answer.
I happen to think that much of the massive increase in use of the Web is not about finding answers but rather about hobbies and entertainment (and other stuff). The Web, it seems to me, is shifting emphasis away from text and onto media. People are using the web for entertainment, and search engines give them a starting point on a voyage of exploration, questionable habits and obsessions.
I have been doing web sites since most people had 14K modems, so habitually I do all I can to keep my code and images as small as possible. Having looked at research on ADSL usage, I recently started giving myself the luxury of larger, less-compressed images on sites where it seemed to me users would appreciate better visual content. Now Google is in effect saying we care about 3 things - text, links and speed. Anything produced by an Adobe application just slows down your site, so it's a waste of time.
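For what it's worth, that trade-off can be made concrete with a few lines of Python using the Pillow library - a minimal sketch only; the filenames, dimensions and quality settings are just examples, not recommendations.

```python
# Minimal sketch of the size/quality trade-off using Pillow (pip install Pillow).
# Filenames, dimensions and quality values are illustrative only.
from PIL import Image

img = Image.open("photo-original.jpg")

# Cap the longest side so a huge camera original never ships as-is.
img.thumbnail((1200, 1200))

# Lower quality = smaller file = faster page; higher quality = better visuals.
img.save("photo-web-lean.jpg", "JPEG", quality=70, optimize=True)
img.save("photo-web-rich.jpg", "JPEG", quality=90, optimize=True)
```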
I agree with others here that sites that are very unresponsive and those where a database back end slows them down to a trickle deserve to be penalised but pages that start to load quickly but have lots of content shouldn't suffer.
Do you know it took me 76 seconds to download a page this afternoon? Fair enough, I'm on an overly slow connection (and I don't want to move to a big city for a super-speedy connection).
But things are getting really bad out there. Some sites are worse than they were in the times of dialup.
A kick up the arse is needed and I'm behind Google 100% on this however they tackle it.
I could take part of that hit if they do a silly implementation. Most of my images are in unnecessarily high resolution. I'm prepared to take a hit on that if they can't figure out that enough of the page loads to let visitors start reading quickly.
For my entire online career I've felt that page speed is an important and almost secret weapon. Now it's becoming an open secret, apparently. I'll bet lots of people will still ignore it.
I'll bet lots of people will still ignore it.
And still loads more will not understand what it means.
I didn't mean to suggest any form of resistance; I was just trying to point out that Google is trying to push the web in a particular direction, which isn't necessarily in the interests of users. Perhaps I'm becoming cynical in my old age but I suspect there's more to this than what the official line suggests.
Personally I think overall load time is a flawed measure, because it fails to take into account just how much useful content you might be trying to display. So you may have an 80kb image, but is it an intricate and attractive picture or just some badly-optimised logo?
Perhaps I'm becoming cynical in my old age but I suspect there's more to this than what the official line suggests.
I think Ted's spot on to say that speed is highly important. All that faster internet means is less patience ;)
There's, of course, an element of Google scaring people into speeding up their websites for the "greater good". The outcome is hard to criticise, although the method is not necessarily to my taste.
Essentially they would like the Internet to run like a local application! Speeding up the web is a major and long-term goal for Google. Even though there may be a self-interest factor, there is definitely a strong and altruistic component, geek-style. They really want the world to have a faster Internet.
My take is that, at least in the beginning, the algo may reward fast sites rather than penalizing unusually slow sites. They already have a pile of server response data from a decade-plus of spidering. They know how fast the server side is already, and it's probably not all that good. Some of the AJAX bloat out there is becoming horrendous - and that's especially true on techie sites!
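For anyone curious what that server-side number looks like for their own site, a few lines of Python give a crude approximation - just a sketch, not how Googlebot actually measures anything, and example.com is a placeholder URL.

```python
# Crude server response timing, a rough stand-in for the figure a spider sees.
# Not how Googlebot measures anything; example.com is a placeholder.
import time
import urllib.request

url = "http://example.com/"

start = time.time()
with urllib.request.urlopen(url) as response:
    headers_done = time.time()   # response headers received
    body = response.read()       # full body downloaded
finished = time.time()

print(f"time to first response: {headers_done - start:.3f}s")
print(f"full download:          {finished - start:.3f}s ({len(body)} bytes)")
```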
the algo may reward fast sites, rather than penalizing unusually slow sites
It's the same thing, isn't it? It still favours a particular subset of sites at the expense of others, and has no relationship with relevancy.
When it comes to usability, speed is a massively significant factor. There's no argument from me on that and I guess I would stand to gain from speed in the algorithm.
But I can't get away from the idea that this has nothing to do with relevancy at all. Is it better to have a faster site, or a more relevant one?
Early this year I was forced to upgrade my 2-year-old server to a faster one with a bigger pipe. The reason? Higher-speed broadband coupled, to a degree, with high-speed accelerator (aka scraper) addons in browsers.
The speed of download has at least quadrupled over the past couple of years (note: this is UK). It now takes only a dozen or so simultaneous visitors to overload a small virtual server, even one running at 100Mbps. Add a few videos, the odd Mbyte Flash page or PDF, and speed is soon dragged down.
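The back-of-an-envelope arithmetic shows why - the page size here is purely illustrative:

```python
# Back-of-an-envelope arithmetic for a small 100 Mbps virtual server.
# The page size is illustrative; real figures vary wildly.
pipe_mbps = 100       # server uplink
visitors = 12         # simultaneous downloads
page_mb = 2.5         # page plus images, in megabytes

per_visitor_mbps = pipe_mbps / visitors            # roughly 8.3 Mbit/s each
seconds_per_page = (page_mb * 8) / per_visitor_mbps

print(f"{per_visitor_mbps:.1f} Mbit/s per visitor")
print(f"{seconds_per_page:.1f} s to deliver a {page_mb} MB page")
```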
It's not the fact that there is more traffic, although there is to a certain extent (a high and increasing proportion of it bots from google et al). It's that scraper-browsers pull in a lot of pages and images VERY fast. Hit peak time with several doing this and splatt! An hour later, back to a trickle.
Of course, it's not always browsers at fault. I have education-usage sites that are plagued by so-called "security proxies" that take it upon themselves to scrape every page of a site "just in case" - and then come back again an hour later, regardless of caching directives. I can't block them or my client complains.
I'm not even going to go into "illegal" hits from content thieves and their like - I kill most of those at the door.
How many small-scale business and hobby sites can afford to shop around for higher speed hosting? Indeed, how many would even suspect it is necessary? Most, if told, would say, "Ok, it takes two seconds to show a page. So what? I can't afford to pay twice as much to halve the display time. Especially in this recession!"
Indeed, it doesn't matter that much. Most people are happy with a second or two wait: in my experience that is commonplace anyway. So google gets upset if it takes a few hundred milliseconds: that's THEIR problem not ours. The REAL visitors are quite content. If SEs want faster access then a) hit less frequently and violently; and b) visit when the site isn't busy, like 1am in the morning (and don't let them say that can't be done!).
The sites I host now are small. A year or so ago I moved my highest bandwidth site to its own high-speed server just to get ANY usable bandwidth back on the virtual server. It gave me a respite of about 12 months before I had to upgrade the VS. And charge customers for the extra. Which they are not entirely happy about but put up with it when I explain.
When it comes down to it, for whom do we build web sites? Search engines or customers?
SEs are dictating far too much. They want our content because it makes them money. I don't really think they care over-much if it makes us money - in several cases companies have been bankrupted or closed due to an SE screwing around with its algorithms (one of them was almost one of my clients!) and never an apology seen.
I know my rant will not affect the outcome of this - I doubt very much google will even see it. I just wanted to make the (long-winded) point. Sorry.
I can't get away from the idea that this has nothing to do with relevancy at all. Is it better to have a faster site, or a more relevant one?
Sort of like asking whether your food should taste good or be nutritious, isn't it?
I'm into SEO because I realized when I built my first website that it did no good if people didn't visit it. And if more people will come and stay when a site is fast, then I'll find a way to give them what they want.