Google Adds Site Speed To Web Search Ranking Algorithm
vandread
msg:4113121 - 6:34 pm on Apr 9, 2010 (gmt 0)

Google has started to include site speed in web search rankings. They enabled the feature a few weeks ago, and it is currently active only for English queries on Google.com.

According to the post on the Google Webmaster Central blog, fewer than 1% of search queries are currently affected by the site speed signal.

[googlewebmastercentral.blogspot.com...]

What's your opinion on that? I personally think speed is not everything. A complex site with lots of photos or design elements is obviously going to load more slowly than a plain HTML site with no graphics at all.

 

tedster
msg:4113324 - 11:39 pm on Apr 9, 2010 (gmt 0)

Well, it's only a factor in 1% of the English queries done on google.com. And it's only ONE factor at that - kind of a tie-breaker, I'd say.

But I hope this pushes some of the hosting services that are just coasting right now (especially those with big names) to get off their butts and offer something a bit more like "service".

TheMadScientist
msg:4113340 - 12:09 am on Apr 10, 2010 (gmt 0)

Would you use it as a 'tie-breaker', or as a signal of 'non-quality' when a site is 'super-slow' compared to other results, or maybe both?

I ask because couldn't it be that only 1% of queries are affected since, for sites within the 'acceptable speed range' of the sites they compete against, speed is a non-factor? In other words, maybe there are only enough sites with a load speed slow enough to trigger a 'distinct and obvious lack of speed' filter for 1% of the queries done.

crobb305
msg:4113341 - 12:16 am on Apr 10, 2010 (gmt 0)

I have observed some very odd spikes in the past 2 weeks on my PHP redirects. This is a file that houses my affiliate links. According to GWT (which explicitly names this redirect file as slow-loading, despite the fact that it is blocked by robots.txt), the load times for this PHP file had been consistently under 2 seconds for 3 months, but suddenly there are multiple spikes ranging from 9 to 15 seconds!

All of the links within the file work fine, and there are only about 5 of them. So this is a very small file. Any ideas what would cause these sudden spikes in slow load times? Anyone else see this? I fear this is going to have a negative effect on my rankings.

BillyS
msg:4113347 - 12:34 am on Apr 10, 2010 (gmt 0)

I tweaked our site exactly two weeks ago. According to GWT we were slower than 70% of sites; now we're at 50%. The only recommendations I have left involve getting rid of AdSense...

Interesting development for search; it should send a lot of panic and speculation through the ranks.

BTW - anyone run any speed tools against Wikipedia? I have. I wonder if they'll drop?

micklearn
msg:4113405 - 4:18 am on Apr 10, 2010 (gmt 0)

I realize that the stats shown on Alexa.com are generally considered suspect, but has anyone else noticed that GWT shows the *exact* same numbers as Alexa for "Average Load Time"? (On Alexa.com it's under "Traffic Stats" >> "Traffic Rank", near the bottom of the page.)

Also, could "less than 1% of search queries are affected by the site speed signal *currently*" mean that it's in test mode and will eventually have an effect on all search queries?

@BillyS - As long as there is a newly noticed and direct link to Wikipedia on AOL's homepage, I don't think that they'll ever have any problems with rankings. But, then again, maybe you were being sarcastic...

smallcompany
msg:4113416 - 4:48 am on Apr 10, 2010 (gmt 0)

Any ideas what would cause these sudden spikes in slow load times? Anyone else see this?


I posted a question about this some time ago in this thread, as well as in the PHP forum here. No real answer.

I thought about it quite a bit, and I figure Google simply waits for the destination page to load and counts that toward the load time of the PHP redirect script (BTW, I block this in robots.txt too). That was the only way I could justify Google's measurements of anywhere from 3 to 15 seconds. Some of my partners' sites really do take time to load (and it varies).
And yes, I have sites with the same redirect scripts that are in the green zone in WMT. It could easily be that if the partner's site is fast, my redirect script measures fast as well.

Other than that, I could not figure out anything else. I tried using an array of links instead of separate lines (as suggested in the PHP thread); no difference. I need PHP, as my whole tracking setup is PHP-based.
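
To make the setup concrete, here is a minimal sketch of the kind of redirect script I'm describing - every key and URL here is hypothetical:

    <?php
    // Minimal sketch of an affiliate-redirect script like the one
    // discussed above. All keys and URLs are hypothetical.
    $links = array(
        'widget-store' => 'http://affiliate.example.com/track?id=12345',
        'gadget-shop'  => 'http://partner.example.org/ref/67890',
    );

    $key = isset($_GET['go']) ? $_GET['go'] : '';

    if (isset($links[$key])) {
        // The script's own work ends at this header. Any time beyond
        // this point is the partner page loading - which may be what
        // the Toolbar folds into the "load time" of this URL.
        header('Location: ' . $links[$key], true, 302);
    } else {
        header('HTTP/1.1 404 Not Found');
    }
    exit;

If the Toolbar really does time the whole navigation, a script like this will always measure as fast or as slow as the partner site it points at.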

icedowl
msg:4113424 - 5:21 am on Apr 10, 2010 (gmt 0)

Any ideas what would cause these sudden spikes in slow load times? Anyone else see this?


Yes. In fact, in early February I was on vacation and could only get online over a dial-up connection, and during that time mine spiked way up. Since they say "Data may also not be available for your site if not enough users (with Google Toolbar and the PageRank feature turned on) have visited your site's pages during the time period displayed.", I assume that at least part of the speed measurement is based on information gathered by the Toolbar. I do use the toolbar personally, and I was using it while on dial-up.

idolw
msg:4113432 - 5:29 am on Apr 10, 2010 (gmt 0)

What do they consider when they calculate page load time?
Just the code itself, or the server location too?

tedster
msg:4113473 - 7:56 am on Apr 10, 2010 (gmt 0)

The complete details are not published, but we do know that they are using Toolbar data. Here's another thread on the topic - Google site performance measurements for international sites [webmasterworld.com]

Also - see the new blog article from Matt Cutts:

The main thing I want to get across is: don't panic... we still put much more weight on factors like relevance, topicality, reputation, value-add, etc. — all the factors that you probably think about all the time. Compared to those signals, site speed will carry much less weight.

www.mattcutts.com/blog/site-speed/ [mattcutts.com]

idolw
msg:4113547 - 1:48 pm on Apr 10, 2010 (gmt 0)

Thanks for that, tedster.
Our sites score over 90/100 in Google Page Speed for all pages. We host them on our own servers in our country but target global visitors.
I wonder whether we should go for a content delivery network to make sure the sites load quickly enough everywhere in the world. I guess the answer is yes.
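
What I have in mind is nothing fancy - just pointing the static assets at a CDN edge hostname instead of our origin server, so visitors fetch them from a node near them. The hostnames below are hypothetical:

    <!-- Origin-hosted assets, all served from one server in one country: -->
    <link rel="stylesheet" href="http://www.example.com/css/site.css">
    <script type="text/javascript" src="http://www.example.com/js/site.js"></script>

    <!-- The same assets on a CDN hostname, served from an edge node
         near the visitor: -->
    <link rel="stylesheet" href="http://cdn.example.com/css/site.css">
    <script type="text/javascript" src="http://cdn.example.com/js/site.js"></script>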

roodle
msg:4113554 - 2:54 pm on Apr 10, 2010 (gmt 0)

That Matt Cutts quote contradicts what he said when interviewed about Caffeine (I'm sure most of you have seen it), where he said speed wasn't going to be a ranking factor, though they still thought it was good for users when webmasters improved this aspect of their sites (of course).

tedster
msg:4113572 - 3:36 pm on Apr 10, 2010 (gmt 0)

Right, the site speed factor is not the purpose of Caffeine. It is integrated even into the non-Caffeine results that are still kicking around.

tedster
msg:4113591 - 4:26 pm on Apr 10, 2010 (gmt 0)

If you are using Google Analytics on a site, consider the new Asynchronous Tracking Code [google.com] that became available to all GA users last December.

The first benefit Google mentioned about this new code is "Faster overall page load time". Given the slow-down that is sometimes caused by original Google Analytics code, this seems like a good thing to look at. It also allows you to put the Analytics snippet higher in the page's source code without delaying the rest of the content.
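
The async snippet sits just before the closing </head> tag and looks like this - swap in your own UA-XXXXX-X account ID:

    <script type="text/javascript">
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXX-X']);  // your GA account ID
      _gaq.push(['_trackPageview']);

      (function() {
        // Inject ga.js with the async attribute set, so the browser
        // keeps rendering the page while the script downloads.
        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.async = true;
        ga.src = ('https:' == document.location.protocol ?
            'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(ga, s);
      })();
    </script>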

ChicagoFan67
msg:4113740 - 3:12 am on Apr 11, 2010 (gmt 0)

2 or 3 weeks ago I experimentally added two 400kb images (HTML-scaled) toward the top of my home page in place of their thumbnails. My header and logo image is tied up in the CSS of the page and was the last thing to load, taking forever. It really looked awful, and I did experience about a 30% drop in page views.

Google updated its cache of the page yesterday and this page is now pushed back a page or 2 or 3 or 4 in the SERPs for keywords that are either associated with these pictures or fall near them. Most keywords that were present on the page before this update are unaffected except for some that are also toward the top of the page near the images.

tedster
msg:4113742 - 3:24 am on Apr 11, 2010 (gmt 0)

That does sound suspiciously like a site speed demotion, ChicagoFan - and the timing would be right, too. If you take some steps to speed that up (I think a good compression utility should be able to shrink those image sizes by 90%) I'd be interested to hear the results.
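
And rather than HTML-scaling the full-size files down to thumbnail dimensions, serve true thumbnail files and link them to the big images, so the 400kb downloads only happen on demand. A sketch, with hypothetical filenames:

    <!-- A real ~5kb thumbnail file; the full-size image is only
         downloaded when the visitor clicks through: -->
    <a href="/images/photo-full.jpg">
      <img src="/images/photo-thumb.jpg" width="150" height="100"
           alt="Photo description">
    </a>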

ChicagoFan67
msg:4113750 - 4:04 am on Apr 11, 2010 (gmt 0)

I have replaced the images with custom thumbnails of about 5kb each. Will let you know, tedster.

tangor
msg:4113787 - 9:12 am on Apr 11, 2010 (gmt 0)

Another article with additional information at The Register: [theregister.co.uk...]

outland88
msg:4113935 - 6:12 pm on Apr 11, 2010 (gmt 0)

One of the many ironies here is Google's statement in WMT: "These estimates are of low accuracy." Well, if it's of low accuracy, why are they using it? Now I'm also seeing that they're measuring the load speeds of my mail servers. What does it matter to Google how fast or slow my personal mail is? I'm not broadcasting my mail, unless Google has found a way to slip in and do that.

The other big "iffy" to me is using any of the tools Google provides in WMT to check load times. Is this just another Google gambit to gather (snoop) even more data from webmasters, or to have a surefire way to penalize sites?

Thirdly, will this policy be implemented mainly against the rank and file, or will Google's "fat cat" buddies be exempt from any penalization? In other words, has Google deemed some sites so indispensable to its own interests that they'll feel no wrath?

I also get quite a kick out of Google's note in WMT that the estimate is based on "(fewer than 100 data points)". Most people who do any testing will tell you that anything falling outside of 4-5 points is not accurate enough to predict from, and anything above 10 is about as measurable as a coin flip.

I also wonder if this has anything to do with Google's introduction of ultra-high-speed internet in places where people will do just about anything to have it.

[edited by: outland88 at 6:16 pm (utc) on Apr 11, 2010]

g1smd
msg:4113936 - 6:14 pm on Apr 11, 2010 (gmt 0)

Well, when I look in Webmaster Tools and it says pages of a new client's site take 9 seconds to load, and that the site is in the worst 7% of sites queried, I'm guessing that's a very big hint to do something about it. :)

outland88
msg:4113938 - 6:31 pm on Apr 11, 2010 (gmt 0)

Well, when I look in Webmaster Tools and it says pages of a new client's site take 9 seconds to load, and that the site is in the worst 7% of sites queried, I'm guessing that's a very big hint to do something about it. :)


You're telling me, with the experience you have, that it takes WMT to tell you how slow a site might be? Beyond that, there are quite a few variables that can't be controlled.

tangor
msg:4113956 - 7:12 pm on Apr 11, 2010 (gmt 0)

I've mentioned this elsewhere... it is NOT about the "happy user"; it is about google being happy about not having to wait for your server to send material to their bots! Do it quicker and they can crawl more without expending more resources (time really is money for google).

I wonder if crawl-delay specifications in robots.txt are also included in these page load calculations?

tedster
msg:4113976 - 8:17 pm on Apr 11, 2010 (gmt 0)

I see things differently, tangor. If that were 100% true, then Google would be using googlebot crawl data rather than toolbar data. So I don't think this is some kind of mythology being fudded at us just to improve crawl times for googlebot. Many of the speed factors being promoted don't even affect googlebot one bit, because it doesn't render the page - it just takes the code itself and sends it back to Google's "universal cache".

My feeling, in this case, is what we see is pretty much what we get. Google has hitched their vision for the web's future to their considerable power to "persuade."

So I do think it's about happy users - and especially happy users of Google. I don't think Google cares quite as much whether our own users are happy with our particular site or not - that has always been our job. But if the top search results also load fast, then the average user will like their Google experience more.

Ever since the demise of the Bandwidth Preservation Society (anyone else remember those 5kb contests?) more and more site developers are forgetting about the end user experience. Why should any page take ten or fifteen seconds to load over broadband connection speeds? That's web developer/designer arrogance, or possibly (being very kind) ignorance.

So this is now what we live with -- and ignore at our own peril. Site speed, measured on the user side, is a new ranking factor, folded in with something like 200 other factors. I'm happy about it, because I can use it to push certain reluctant clients into improving their site performance - which will improve things for THEIR users and THEIR bottom line.

A heavy-hitter company with bajillions of backlinks may not see much ranking improvement, but a speedier site will still impact their bottom line. And it does take such companies a lot of work to speed things up - that's why they've been reluctant. When every page calls 20 external CSS files and 30 external JavaScript files, combining, minifying and maintaining that code mess requires a major effort and, in some cases, a restructuring of responsibilities.
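
The target state is simple enough to sketch, even if getting there isn't - filenames here are hypothetical:

    <!-- Before: every page pays a round trip for each of these -->
    <link rel="stylesheet" href="/css/reset.css">
    <link rel="stylesheet" href="/css/layout.css">
    <link rel="stylesheet" href="/css/widgets.css">
    <!-- ...seventeen more, plus 30 script tags... -->

    <!-- After: one combined, minified file of each type -->
    <link rel="stylesheet" href="/css/site.min.css">
    <script type="text/javascript" src="/js/site.min.js"></script>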

Remember this? Last year Google tested their own property and found a remarkable increase in user satisfaction from only a fraction of a second speed improvement. Yahoo also went through many of their properties recently and re-did the coding mess -- with solid results.

Speed really does matter and it always did.

tangor
msg:4114011 - 9:51 pm on Apr 11, 2010 (gmt 0)

Wasn't wearing my tinfoil hat, and some of my comment was speculation... but we are stuck with that, since there is little transparency from google. Comments on webmaster arrogance/ignorance are right on target, as I agree there is a "happy medium" of speed vs. content that should be observed.

OTOH, I'm not all that thrilled that an outside third party will be ranking "my" content against others in a speed contest, particularly since we don't know what kind of race is being run. There's a great deal of difference in a race between Nimitz class aircraft carriers and canoes. A canoe, of course, can take off faster than a carrier, it just can't transport as much. :)

Further commentary from the Register article mentioned above:

When Google first hinted it would make the change, Phil Payne, of web consultancy ishram research, argued it would completely change the world of web design. "For experienced Google-watchers, this means Google has thrown web design as we know it into the trashcan," he told The Reg. "Web design as currently practiced is hereby DEAD. Flash becomes poison - lots of funny little blank pictures to build up a page's appearance will ensure no one ever sees it."

Payne also points out that the change may impact the webhosts as well. "Free gigabytes are no longer enough - 2 millisecond response times will be demanded. How many can control service time on virtualised servers?

"Adding a performance requirement to the web as we know it stands it on its head."

[theregister.co.uk...]

tedster
msg:4114013 - 10:07 pm on Apr 11, 2010 (gmt 0)

Well, thankfully our sites only need to compete against relatively few other sites - and not the whole ball of wax represented by the "% of all sites" number we're shown in Webmaster Tools.

In the past few months I've been running both the Page Speed [code.google.com] and Y!Slow [developer.yahoo.com] tools against quite a few sites - and I've run into very few that rate well at all. So my advice is to be an early adopter here - site speed has long been a secret weapon in web success, but now it's not so secret anymore.

TheMadScientist
msg:4114014 - 10:15 pm on Apr 11, 2010 (gmt 0)

OTOH, I'm not all that thrilled that an outside third party will be ranking "my" content against others in a speed contest, particularly since we don't know what kind of race is being run.

Couldn't the same argument be made about the other 195 or so variables they don't explicitly tell us about? At least they came out and said, 'Hey, you don't have to speculate or guess, we're using this...', which seems rare indeed. I'm not sure why people who happily accept that their content will be ranked by a third party, based on variables they are never explicitly told about, have any more of a problem when that third party explicitly tells us about a variable it will use, even if not exactly how.

I know I'm picking on you a bit, tangor, but my point is more general than that, and IDK why people are whining and complaining when, for once, the people who have used keywords for ranking purposes for years are telling us about a factor, rather than leaving us to speculate and guess about whether it's a factor at all. That means we can jump directly to figuring out how, doesn't it? Now go get your foil out, make yourself a shiny new hat, and figure out how they are calculating speed, then let us know when you have it figured out, alright? Thanks! LOL.

tangor
msg:4114032 - 10:51 pm on Apr 11, 2010 (gmt 0)

@TheMadScientist: Pick on me! I've felt a bit left out you haven't looked my way before. :)

Of the "200" methods of ranking, none of which are transparent, this one takes the cake. What google is doing with this speedometer flag is shutting out, killing, obstructing, preventing, misdirecting the "happy user" from content "because the site is slow".

On one of my sites, a 128,000-word novel is offered on a single page (some like it all, not in bits and pieces), and thus it will be "slow". One of my client's pages has very instructive flash tutorials (with a goldurned flash front end I tried to talk them out of) that are not the most fleet of foot.

At some point it MUST be the Content that is of value, not the horse and buggy that brought it to town. Unless I've misread the Cutts comments, and those of others on this topic, the "speed factor" is server response time, and that is a whole different kettle of fish. I know too many grand sites (many of them authorities) that run on workhorse servers with a bit of tooth but are still robust and capable. And I have no doubt that those operating their own servers on the best pipes they can afford aren't going to be happy, either. </rant>

TheMadScientist
msg:4114035 - 11:03 pm on Apr 11, 2010 (gmt 0)

@TheMadScientist: Pick on me! I've felt a bit left out you haven't looked my way before. :)

LMAO... I'll try to do better from now on!

I see what you're saying about how it could easily be misapplied, but I think it's more of a comparative measure. If your site with the book on one page is the only site presenting it, there is no comparatively faster site for that content to be delivered from, so yours is comparatively (relatively) the fastest. The same goes for the flash tutorial: it's probably unique, so comparative analysis would determine it's relatively the fastest site delivering the desired content, and speed would not remove it from the rankings or 'penalize' it in any way. But for sites serving 'dictionary definitions' or other 'facts', where there are likely to be a large number of results to choose from, the visitor would probably rather see the information sooner than wait for a slower site to load, since they can easily 'get the facts' faster from one site than from the other...

The preceding is why I think only a limited % of queries are affected at all, and why Cutts and Co. are saying it will not be as important as relevance and other factors... When relevance is a tie between sites presenting factual information that can be freely reproduced, IMO it does no harm to show the fastest sites first, which means tedster's 'tie-breaker' theory could be correct. I also think that if a site takes 10 seconds to load and there are other relatively equal sites in terms of information (relevance to the query), it could tank in the rankings. But I think those are the two most likely situations where speed would be used, rather than as an 'overall factor' that could replace a relevant site with a less relevant resource based on speed.

Remember, one of the biggest things they want is to be the most-used search engine. If they removed the most relevant site from the results based on load speed, they would lose relevance as a search engine and drive visitors away; but when the information is freely available, they can make their visitors (and yours) happier by showing them the fastest version of the 'facts' they were searching for.

tangor
msg:4114044 - 11:19 pm on Apr 11, 2010 (gmt 0)

Though I am from Texas, I have a Missouri mentality. I'll believe it when I see it.

As for "relative" or "comparative", who gets to make that determination? In this case it is google. Not an argument against google, but a strong suggestion that we should be glad there is Bing and a few others out there who... at this time... are serving serps based on content, not speed. Sadly, most "netizens" of the "google states of egalitarianism" blindly tromp to the trough for "freebies" and don't know how much they miss because of the corporate brainwashing.

TheMadScientist
msg:4114045 - 11:28 pm on Apr 11, 2010 (gmt 0)

Yeah, but if relevance really comes first, what's even funnier is how many people are changing servers and hosts to try to speed up an irrelevant (or spam) site, thinking it will help them in the rankings... LOL... It may be spam, but it's fast spam; why isn't it number one yet? Google is broken... LOL.

Can you imagine if they do apply it the way it's suggested and people make a big deal about it thinking speed will trump PR and relevance, so they go change hosts? And, what really makes me laugh is people who change hosts or servers rather than just figuring out how to make their site load faster where it is...

I actually contacted a host one day about moving from their shared box to dedicated (this was a couple (few?) years ago, because I've always thought speed is important) and they recommended against it, because their shared cluster is basically a 'hot box' and they said there was no way I could compete with the speed of their shared box for less than $2000 (or something outrageous) a month... I decided shared is cool.

Let me add: most of the site speed problems I've seen are onsite, not the server directly, and there are times when shared can be faster than dedicated if you cannot afford a 'super fast' dedicated box. Yes, if you're on a free or $2.95-a-month shared box it's probably overloaded, and that could be a speed issue; but for most of the smaller (lower traffic) sites I've worked with, a good $10-to-$12-a-month shared plan with a reputable host can be as fast as or faster than a dedicated box if the site is 'tuned' for speed.

The host I was referring to said that if there was a security concern I might want to move to the dedicated box either way, but the shared cluster(s) they have are faster than most people can afford from a dedicated box, so as long as bandwidth did not become an issue, the best answer for speed was their cluster. (I followed the advice partly because they talked me out of the more expensive route, where they would have made more money, just to make sure I was happy.)

There are some great onsite speed tips and tricks in the original 'site speed' thread here: Site Speed May Become a Ranking Factor in 2010 [webmasterworld.com]
