
Google SEO News and Discussion Forum

'Speed of Site' May Become Ranking Factor in 2010
anand84




msg:4024764
 4:58 am on Nov 14, 2009 (gmt 0)

Googler Matt Cutts recently said in an interview that the company may start giving weight to how quickly a site loads as a ranking factor going into 2010.

[webpronews.com...]

How do you think it affects us? I think blindly factoring in speed might not be appropriate, since that would mean giving preference to a small two-page site over a comprehensive Wikipedia article. Don't you think?

 

Receptional Andy




msg:4026968
 12:05 am on Nov 18, 2009 (gmt 0)

Sort of like asking whether your food should taste good or be nutritious, isn't it?

Perhaps, but I don't quite agree with the analogy. Do you want fast food or to wait for something to be prepared and cooked?

My analogy is flawed also, but the internet is not made up of good, fast cooks. To my mind, technical proficiency is not a good measure of the quality of a website's content, although, of course, I've been exploiting that advantage wherever I can ;)

TheMadScientist




msg:4027122
 5:12 am on Nov 18, 2009 (gmt 0)

Indeed, it doesn't matter that much. Most people are happy with a second or two wait: in my experience that is commonplace anyway. So google gets upset if it takes a few hundred milliseconds: that's THEIR problem not ours. The REAL visitors are quite content.

That's not what the numbers Google shared say:

At Google, we've gathered hard data to reinforce our intuition that "speed matters" on the Internet. Google runs experiments on the search results page to understand and improve the search experience.

...

Our experiments demonstrate that slowing down the search results page by 100 to 400 milliseconds has a measurable impact on the number of searches per user of -0.2% to -0.6% (averaged over four or six weeks depending on the experiment).

...

Similarly, users exposed to a 400 ms delay since the beginning of the experiment did 0.44% fewer searches during the first three weeks, but 0.76% fewer searches during the second three weeks.

[googleresearch.blogspot.com...]

They slowed result delivery by less than half a second and, by the second three weeks of the experiment, were seeing close to one percent fewer searches than usual... I'm in the 'speed matters' crowd personally.

ergophobe




msg:4027131
 6:09 am on Nov 18, 2009 (gmt 0)

I'm surprised nobody has mentioned Matt Cutts's extended comments on this at the last session of PubCon. A few takeaways I got:

1. He mentioned that page speed is already used as a metric for ads on the AdWords side of things, but said that it is not yet a factor in search.

2. He said he would not rule out that it might become one of the 200 factors used in determining ranking in the future.

3. He made some comment like "2010 is the year to speed up your site".

I have a site that is overloaded and slow and needs tuning. I'm thinking that, if nothing else, a slow site looks bad to Google in two ways:

a. If they are taking bounce rate and page views per visitor into account, based on toolbar data, GA data and logged-in users, a slow site would tend to fare worse in those respects.

b. I imagine that a slow site is harder to crawl, and therefore to index properly.

I'm sure other people here can think of other consequences of slow sites that lead to them being indirectly hurt in the SERPs without speed being a first-order factor in determining ranking.

FranticFish




msg:4027175
 8:31 am on Nov 18, 2009 (gmt 0)

I think of speed this way:

Call a company offline, or email them. How fast do they answer the phone / respond? Do they sound alert and interested or half asleep?

I think it's rare for there to be a 'best' provider of any service. There's just the one that suits you best.

Whenever I'm shopping around and making enquiries, the companies that get back to me quickly or deal with my enquiries promptly and efficiently make an impression.

sem4u




msg:4027201
 9:19 am on Nov 18, 2009 (gmt 0)

I think it is fair to use 'speed of site' in the algo as one of the many factors. Not many people want to sit there and wait a long time (i.e. a few seconds) for a site to load completely.

Hissingsid




msg:4027223
 10:34 am on Nov 18, 2009 (gmt 0)

I wonder if one motivation in this is to encourage us all to do some spring cleaning of our sites. I've just spent a couple of hours using the optimization tool and found some real howlers crying out to be fixed. The main issue was redundant rules in my CSS files, some of which pointed to background images for non-existent divs. The tool spiders all of the images in your .css files and counts them in your page size, even if some of those images are only used on inner pages, not on the page being analysed.

On some sites I use a core CSS file that is loaded by the home page and cached by the browser. It contains some rules not used by the home page but that are used by inner pages. I guess as a result the home page loads marginally slower and inner pages load marginally faster. The overall burden on bandwidth is lower, especially where background images are reused on inner pages. My point is that analysing individual pages is not necessarily indicative of the performance of the site as a whole, in either bandwidth or user experience (speed) terms.
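A rough way to start that kind of audit is a throwaway script that simply lists every image a stylesheet pulls in, so each reference can be checked against the pages that actually use it. A minimal sketch in Python; the file name and regex are only illustrative:

import re

# Throwaway audit: list every image referenced from a stylesheet so each
# url(...) can be checked against the pages that actually use it.
# The file name is only an example.
CSS_FILE = "core.css"

# Matches url(...) values, quoted or not, e.g. url("images/header-bg.png")
URL_PATTERN = re.compile(r'url\(\s*[\'"]?([^\'")]+)[\'"]?\s*\)')

with open(CSS_FILE, encoding="utf-8") as f:
    stylesheet = f.read()

references = sorted(set(URL_PATTERN.findall(stylesheet)))
print(f"{len(references)} image reference(s) in {CSS_FILE}:")
for ref in references:
    print(" ", ref)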

Cheers

Sid

mysticalsock




msg:4027341
 2:56 pm on Nov 18, 2009 (gmt 0)

I think it's about time Google stopped thinking that they rule the internet.

Hissingsid




msg:4027352
 3:07 pm on Nov 18, 2009 (gmt 0)

I think it's about time Google stopped thinking that they rule the internet.

It is the one-eyed man in the land of the blind.

anand84




msg:4027383
 3:55 pm on Nov 18, 2009 (gmt 0)

I think it's about time Google stopped thinking that they rule the internet.

They indeed do.

oddsod




msg:4027402
 4:32 pm on Nov 18, 2009 (gmt 0)

Fast pages were a recommendation in Brett's original 30-day article, if anyone remembers that.

ergophobe




msg:4027431
 5:11 pm on Nov 18, 2009 (gmt 0)

I was thinking about how lax I've gotten. I used to religiously keep HTML clean and run xDebug/Cachegrind to profile PHP scripts for bottlenecks.

I took Matt's comments at Pubcon as a call to action to all of us who've gotten lazy.

maximillianos




msg:4027435
 5:13 pm on Nov 18, 2009 (gmt 0)

We have utilized server-side caching solutions to greatly speed up our popular pages. Memcached is what we used. Great little piece of software.
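For anyone curious what that looks like in practice, here is a minimal cache-aside sketch, assuming Python and the pymemcache client; the host, key scheme and render function are made-up placeholders, not a description of the setup above:

from pymemcache.client.base import Client

# Illustrative cache-aside pattern: serve popular pages from memcached and
# only rebuild them on a miss. Everything here is a placeholder.
cache = Client(("localhost", 11211))

def render_page(page_id):
    # Stand-in for the expensive part: database queries, templating, etc.
    return f"<html><body>Page {page_id}</body></html>".encode("utf-8")

def get_page(page_id, ttl=300):
    key = f"page:{page_id}"
    html = cache.get(key)          # None on a cache miss
    if html is None:
        html = render_page(page_id)
        cache.set(key, html, expire=ttl)   # serve from memory for ttl seconds
    return html

print(get_page(42))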

explorador




msg:4027450
 5:29 pm on Nov 18, 2009 (gmt 0)

It would be amazing to see Google taking more into account the sites that aren't blogs. </rant>

johnnie




msg:4027478
 6:20 pm on Nov 18, 2009 (gmt 0)

I don't think you should take 'page speed' as an absolute measure. I'm inclined to think of it as a 'speed per letter of readable content' kind of thing. This would not only prevent a bias towards small sites, but would also stimulate site owners to think twice before adding that huge Flash animation, etc.

Bewenched




msg:4027488
 6:29 pm on Nov 18, 2009 (gmt 0)

Whether they do it or not, what's the betting the negative SEOs are going to play around with near-DOS as a way of harming their clients' competitors?

I think it's already a factor. We recently had this happen to us ... first in spurts, then two days ago a full-blown DoS attack against not only us but our entire ISP.

It has affected our rankings... nothing else has changed significantly.

Whether it was a competitor or a ticked-off customer... who knows... but it happened just the same.

wheel




msg:4027536
 8:00 pm on Nov 18, 2009 (gmt 0)

small two-page site over a comprehensive Wikipedia article

Size of text on the page has little if anything to do with speed. Take a page with, say, 300 words at 5 chars each - that's 1500 characters, so say 1.5k very roughly.

Boost that to 5000 words. That takes your page to a 25k download.

Add in background noise of, say, 50k and a small 300-word page versus a monster 5000-word article comes out at roughly 50k versus 75k total. Add in gzip and the noticeable difference in loading speed between those two pages is quite simply nonexistent.
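A quick way to sanity-check the gzip point is to compress two blobs of text and compare. A throwaway Python sketch; the filler sentence and word counts are invented, and repeated filler compresses better than real prose, so treat the output as a best case:

import gzip

# Back-of-the-envelope check: once gzip is applied, the size gap between a
# 300-word page and a 5000-word page shrinks a lot. The filler sentence is
# invented and compresses better than real copy would.
sentence = "Widgets come in many shapes, sizes and colours, and picking the right one takes time. "

small_copy = (sentence * 22).encode("utf-8")    # roughly a 300-word page
large_copy = (sentence * 360).encode("utf-8")   # roughly a 5000-word page

for label, body in (("300-word page", small_copy), ("5000-word page", large_copy)):
    packed = gzip.compress(body)
    print(f"{label}: {len(body) / 1024:.1f}k of text -> {len(packed) / 1024:.1f}k gzipped")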

Speed of your webpages is dependent on two things (and neither of them is the amount of content): server response time, and how much graphics crap you've got on your page. Those are the two things you need to address. And both of those, while they seem like small things to me, are 'fair enough' reasons to spank or reward (though I'm not sure if they're indicative of content quality. We'll take Google's word for it).

I can say that I'm one data point on this - my main sites rock speed-wise. My crap sites are on $5-a-year hosting plans, which are frequently slow to load.

httpwebwitch




msg:4027550
 8:24 pm on Nov 18, 2009 (gmt 0)

Ditto wheel. Speed is achieved by using lean markup, optimized graphics, compression... and having a rocking good hosting plan on a fast server.

I've been accused of having very fast-loading sites. Here are my tips:

If you are doing complex data manipulation at runtime, stop it. Precompute and denormalize your data.
Benchmark.
Simplify your code.
Look for places where redundant or repetitive file I/O or SQL commands are happening. Kill them.
Using an external 3rd party API to get content for the page? Don't. Find another way.
Apply caching liberally.
If you have to, remove features.
If you can't remove a slow-loading feature, pull it out of the initial page response and load it asynchronously with AJAX.
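As one small illustration of the 'precompute' and 'apply caching liberally' points above, a minimal Python sketch; the related-items lookup is a made-up stand-in for whatever slow query or API call runs per request:

from functools import lru_cache

# Illustrative memoization: do the expensive lookup once per category and
# serve repeat requests from memory. The lookup itself is a placeholder.
@lru_cache(maxsize=1024)
def related_items(category):
    # Imagine a slow SQL query or third-party API call here.
    return tuple(f"{category}-widget-{n}" for n in range(5))

print(related_items("blue"))        # does the work
print(related_items("blue"))        # served from the cache
print(related_items.cache_info())   # hits=1, misses=1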

Hissingsid




msg:4027567
 8:55 pm on Nov 18, 2009 (gmt 0)

Have any of you used the websiteoptimization tool that Google links to?

It doesn't matter what we think is important; what matters for Google ranking is what Google thinks is important. Could this tool be an indicator of what they think is important, I wonder?

yaix2




msg:4027636
 10:21 pm on Nov 18, 2009 (gmt 0)

This is really good news. I always wondered why, over the past 20 years, computers have got so much faster, yet programs still take seconds to load and PCs even longer to boot. Same on the web: with Mb-scale bandwidth, some pages still take seconds to load.

I usually check WMT to see that load times are OK; they are usually around half a second (~500 ms). What is an acceptable value?

coachm




msg:4027653
 10:36 pm on Nov 18, 2009 (gmt 0)

Quick naive take on this: My concern would be that those with more dollars who can afford faster servers and associated hardware would "win", and I don't like to see any other blows to small business people in terms of search engine results.

I understand the user experience issue. I just don't want to have to "compete" on speed, since honestly, it's something where the competition is won with money.

tedster




msg:4027659
 10:48 pm on Nov 18, 2009 (gmt 0)

I can think of one SEO oriented site (it's really two) that loads v--e--r--y--s--l--o--w--l--y. But I put up with it from time to time because I really want to read something there. If your market is highly motivated, they'll put up with your slow pages too, to a degree at least.

But there are so many things you CAN do to speed up a site, even on el-cheapo hosting. They really don't take money, they just take the desire and caring to do it.

So don't do it for Google if that doesn't float your boat. But do it for your visitors. I can tell you, your business will do better because of the extra speed.

Hissingsid




msg:4027662
 10:55 pm on Nov 18, 2009 (gmt 0)

Tedster,

I just think this is one more factor to check against the competition. If you have more keyword density, more prominence, more relevant backlinks, more PR and a faster site than the current #1, then you ought to be able to take that top slot. Or perhaps, if you are only matching the #1 on most SEO factors, speed could be the thing that gives you a slight edge.

Cheers

Sid

tedster




msg:4027670
 11:30 pm on Nov 18, 2009 (gmt 0)

I agree Sid. And once I waded into these speed tools and associated best practices [developer.yahoo.com], I was amazed at the number of things I never considered that can make a difference in how fast a page loads. CSS sprites, gzip, number of HTTP requests, script minification, ETags, reducing the number of DOM elements... the list goes on for quite a bit.
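For the 'number of HTTP requests' item, here is a rough Python sketch that counts the extra requests a page triggers; the URL is a placeholder and it only looks at images, external scripts and stylesheets:

from html.parser import HTMLParser
from urllib.request import urlopen

# Rough request counter: tally <img>, <script src> and <link rel=stylesheet>
# tags in a page's HTML. It ignores anything loaded later by CSS or scripts.
class RequestCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and attrs.get("src"):
            self.urls.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.urls.append(attrs["href"])

html = urlopen("http://www.example.org/").read().decode("utf-8", errors="replace")
counter = RequestCounter()
counter.feed(html)
print(f"{len(counter.urls)} extra request(s) found in the HTML")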

DonMateo




msg:4027717
 12:52 am on Nov 19, 2009 (gmt 0)

I'm a bit concerned about how this will play out for international sites:

Take two sites, both targeting the same local demographic, for argument's sake let's say Tasmania, Australia.

Site 1 is hosted in Hobart, Tasmania
Site 2 is hosted in San Francisco, California.

Which site is more responsive for Tasmanian residents? (1)

Which site appears faster to googlebot? (2)

It would be a pity if 2 got a rankings boost over 1 because it appears faster to the crawler, when in reality it's slower for the majority of its users.

Or perhaps the speed will be measured in other ways... Chrome, etc.

ken_b




msg:4027718
 12:59 am on Nov 19, 2009 (gmt 0)

Hmm... somewhere around here I think I read that calling your images from a separate domain, or a subdomain would lead to faster loading.

Is that right?

And would it help in relation to this thread?

TheMadScientist




msg:4027761
 3:07 am on Nov 19, 2009 (gmt 0)

Uh, no, it won't help a bit... Don't do it.

That's my trick, and it works because you can force a browser to open more connections than it normally would to a single site (domain) - but don't tell anyone, alright? Thanks! (It might or might not help in relation to GBot, but I build sites for the real visitors, and if GBot doesn't pick up on it, I couldn't care less, because IMO load speed matters.)

tedster




msg:4027768
 3:28 am on Nov 19, 2009 (gmt 0)

It doesn't even need to be separate domains - just separate hostnames (subdomains) will allow more simultaneous browser connections, too. The link I gave above for "best practices" says this:

Split Components Across Domains


Splitting components allows you to maximize parallel downloads. Make sure you're using not more than 2-4 domains because of the DNS lookup penalty. For example, you can host your HTML and dynamic content on www.example.org and split static components between static1.example.org and static2.example.org
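A minimal sketch of how a template might apply that split, in Python, using the example.org placeholder hostnames from the quote; hashing the path keeps each asset pinned to one hostname so it stays cacheable:

from zlib import crc32

# Map each static asset to one of a small, fixed set of hostnames so the
# browser can download in parallel. Hostnames are the placeholders above.
STATIC_HOSTS = ("static1.example.org", "static2.example.org")

def asset_url(path):
    # Hash the path so the same file always maps to the same hostname
    # (switching hosts between pages would defeat browser caching).
    host = STATIC_HOSTS[crc32(path.encode("utf-8")) % len(STATIC_HOSTS)]
    return f"http://{host}/{path.lstrip('/')}"

print(asset_url("/images/logo.png"))
print(asset_url("/css/core.css"))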


TheMadScientist




msg:4027773
 3:45 am on Nov 19, 2009 (gmt 0)

Alright, if we're going to have a 'spill the beans' thread, I'll play along for another post...

If you (or your host) have wildcard subdomains turned on, all you have to do is link to the subdomain and not canonicalize your images.

They still sit on the same server in the same directory as they do now, but the browser opens extra connections to the 'other' domain...

* If you do this, make sure you exclude your images from any canonicalization rules first (at the top of your file, if you use a negative match); otherwise you'll be redirecting the browser back to your main domain and adding time to the process.

I usually serve from only two 'domains' in total, because when I tested with 3 my browser visibly slowed down, even on cable with a high-end host - but test for yourself, because your results may vary...

walkman




msg:4027782
 4:04 am on Nov 19, 2009 (gmt 0)

Probably means that from 0 to x it's OK, but if it takes too long you get a strike against you.

TheMadScientist




msg:4027783
 4:15 am on Nov 19, 2009 (gmt 0)

I'll throw one more out there, just to pad my post count...

You'll have to test for yourself, but depending on how many images (items) you have to load, you can speed up actual page load time by using fewer, larger images rather than more, smaller ones. Contrary to what seems to be popular belief, browsers usually stall out on the upstream requests, because service providers allow a much faster download rate than upload rate to make their connections appear faster for a given amount of bandwidth.

The average browser only opens 2 connections to a given domain (unless the user knows how to adjust the settings). So if you have 12 images (slices) and can decrease that number, your actual time to display will decrease: with only 2 connections, a browser finishes the first download, then makes another request, finishes the second download, then makes another request, and so on. Your 3rd and 4th images don't even begin to download until the first and second downloads (respectively) are complete and another upstream request has been made...

The overall image size is going to be about the same, so all you really do by slicing is force the browser to make more upstream requests, wait for a response from your server and the delivery of the next file, then complete the download and repeat the process until all the image (or whatever) requests are complete.

I personally won't slice images. Each slice may appear, on paper, to 'load faster' individually, but most of the time making a single request for the same amount of data is faster, because you eliminate steps from the process and the overall amount of data transferred is generally the same.
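To put rough numbers on that, a toy Python calculation; the round-trip time, bandwidth and page weight are invented purely for illustration:

import math

# Toy model: the bytes are the same either way, but each extra image slice
# adds a request/response round trip on one of the two connections.
RTT_SECONDS = 0.10        # one request/response round trip
BANDWIDTH_KBPS = 200.0    # shared download rate in KB/s
CONNECTIONS = 2           # typical per-hostname limit in 2009-era browsers

def rough_load_time(total_kb, num_images):
    transfer = total_kb / BANDWIDTH_KBPS                 # same bytes either way
    round_trips = math.ceil(num_images / CONNECTIONS)    # sequential requests per connection
    return transfer + round_trips * RTT_SECONDS

for n in (1, 4, 12):
    print(f"{n:2d} image(s), 60 KB total -> ~{rough_load_time(60, n):.2f} s")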

IMO and in agreement with the others who have posted something similar, there is usually something you can do to decrease your page load time without changing hosts or paying more or doing anything more than a bit of research into the best way to structure things WRT speed.

I actually don't remember where I read the information I just posted, but there's a link here at WebmasterWorld to the original source, because a few years ago I followed it... Happy hunting. :) I honestly think it might have been in the supporters forum, but really don't remember or I would post it to back up my statements.

EDITED: I just plain can't type sometimes. LOL. You'd think someone who works with a keyboard would figure out how at some point in time, but it's still a struggle for me on certain days of the week! LOL.

Cancellara




msg:4027805
 5:44 am on Nov 19, 2009 (gmt 0)

Good news for crappy pages with stolen text, no images and a load of AdSense ads on them!

+1
