Speed as a ranking factor?
Whitey




msg:4280938
 6:06 am on Mar 13, 2011 (gmt 0)

CRAWLING
Server speed is a factor that helps for a better crawl, potentially more frequent per page, as well as deeper into the site. But it's not a factor for staying in the index.

INDEXING
The number of pages that actually stay in the index, rather than just get spidered, is related very strongly overall to the backlink strength of both the home page and how that link juice is distributed internally. Backlinks to deeper pages are also a big help. Many webmasters have noticed that there seems to be a formula (admittedly a moving target of a formula) for how many pages will be retained in the main index for any site. It's not a simplistic formula, however. It's not a flat percentage, nor is it a certain hard number.

RANKING
Page load speed is still not an active factor in ranking (or crawling or indexing) but it is definitely on the horizon as a ranking factor, possibly this year. The early notices and the push from Google to inform webmasters and give them some tools and education began last year.

[webmasterworld.com...]


I'm following up on an earlier 2010 thread where it was speculated that Google would commence using this factor in its algo. Has anyone noticed whether site speed is now a ranking factor?

 

tedster




msg:4280940
 6:21 am on Mar 13, 2011 (gmt 0)

Late last year Matt Cutts indicated in a video that page load speed was live as a ranking factor. He said it would influence about 10% of all searches in a relatively small way - as a sort of tie breaker in close call rankings.

I can't say I have any hard data to confirm this, but I also have no reason to doubt that there can be a direct effect.

The secondary ranking effects of a faster page load speed interest me a lot more. Fast sites set off a cascade of effects that can indirectly improve rankings. Happy visitors have a lot of behaviors that help improve rankings, even if it's clicking on an internal link instead of hitting the back button in frustration.

Whitey




msg:4280942
 6:30 am on Mar 13, 2011 (gmt 0)

Late last year Matt Cutts indicated in a video that page load speed was live as a ranking factor

Could you reference that video?

.... interest me a lot more. Fast sites set off a cascade of effects that can indirectly improve rankings

Me too. What would the likely list be, in order of importance, of effects it flows through to?

tedster




msg:4280955
 7:42 am on Mar 13, 2011 (gmt 0)

Wish I could - it's not easy to find specific videos, especially when it's Matt Cutts and could be on any of several websites. I've already spent 30 minutes looking, and that's enough for now.

Trying to research this, I found that my memory was off on the timing. Speed became a small ranking factor in March 2010: [webmasterworld.com...] At that time, it was affecting about 1% of searches, not 10%.

I doubt that there will be official announcements if and when Google turns up that dial. This 2009 video on WebProNews [videos.webpronews.com] was the storm warning.

-----

Three page speed effects I see that can affect rankings indirectly:

More page-views per visit
Fewer immediate click-backs to the SERPs
Happy visitors and customers posting links and/or social mentions

I have always - and I mean ALWAYS - been a fanatic about fast pages. I considered it my secret weapon when developing a site because so few webmasters seemed to care - even in the dial-up days.

I've been helping some clients with page speed problems in recent months. The most common issues seem to be:
  1. not using compression for ALL text assets (html, css, js)

  2. sub-optimal image compression (JPGs should usually be compressed to around 40% quality in Photoshop; PNGs should usually be 8-bit with a lookup table of 16 or 32 colors, etc.)

  3. Too many separate CSS and JS files

  4. Using the browser to resize large image files

  5. Not allowing browsers to leverage their cache

Just fixing that much can bring MAJOR speed gains (a starting-point sketch for items 1 and 5 follows below). Google's research showed that speed differences of as little as half a second on their own search results page affected the number of searches per session.
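For anyone who wants a starting point on items 1 and 5, here's a minimal .htaccess sketch - assuming an Apache server with mod_deflate and mod_expires enabled (other servers and hosts need different directives, so treat this as an illustration rather than a drop-in):

  # 1. Compress all text assets (HTML, CSS, JS)
  <IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css
    AddOutputFilterByType DEFLATE application/javascript application/x-javascript
  </IfModule>

  # 5. Let browsers leverage their cache for static files
  <IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType image/gif "access plus 1 month"
    ExpiresByType text/css "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
  </IfModule>

Combining CSS and JS files (item 3) and resizing images before upload (item 4) still have to be done by hand or in your build process.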

Sgt_Kickaxe




msg:4280966
 10:28 am on Mar 13, 2011 (gmt 0)

I'm more worried about the effects of changing anything to speed up a site than I am about everyone else getting faster.

I can't tell you how many times I've read about someone making (what they believe to be) a positive change and losing their rankings as a result.

Whitey




msg:4280970
 11:26 am on Mar 13, 2011 (gmt 0)

When you spoke of a "cascade of effects" I was thinking along the lines of A leads to B leads to C, perhaps in the indexing and scoring of pages, and how that leads to a stronger site in general terms.

vivalasvegas




msg:4280972
 11:58 am on Mar 13, 2011 (gmt 0)

I've done everything I could - compression, image optimization, browser caching, etc. My pages load fast when I access them. However, Google Webmaster Tools reports a steady page load of close to 3 seconds for my pages, which is not exactly low. It seems that this is mostly due to the one AdSense unit I have, especially when the ad is an image - these can take forever to load completely. That being said, I haven't noticed any change in rankings in the past 12 months.

mercedesP




msg:4280974
 12:08 pm on Mar 13, 2011 (gmt 0)

As of last Friday (11th of March) I've removed the background photo from my site's CSS, due to the lag in loading time and to explore the possibility of lowering the site's bounce rate (70%-ish).

After Panda was implemented (on the 22nd) I've noticed in WMT a sharp drop in impressions, but not in visits or ranking positions for my keywords. Traffic (according to Analytics) has slowly been dropping, but nothing "too noticeable" to blame Panda for. Everything on the site - content, graphics, photos - is unique, and I believe it to be of quality as well (I get natural links, mostly to inner pages, and over the last year I have had a few visits to inner pages from Google's IPs, after which those pages showed up in the first positions; that led me to believe it was a "manual revision").
I wanted to make clear to my visitors that my ecommerce site "was different" right from the very first glance, but I've been wondering whether the loading overhead of that background image made visitors bounce before they could properly judge its content (in-depth explanations of categories and their products).
Although this experiment is aimed at lowering the bounce rate (as I'm not changing anything about its content), I may have the chance to observe Google's reaction to it. Improving rankings is not what I would expect, as I'm quite happy with its ranking positions.

We'll see if this is a "positive" change.

Note: the site has 45 pages, no adverts, and the domain (.com) is 2 years old, aimed at the Spanish market.

pageoneresults




msg:4280981
 12:46 pm on Mar 13, 2011 (gmt 0)

If you were an SEO of a large company, what would you include in your 2011 strategy?
Mar 7, 2011 - [YouTube.com...]

Matt Cutts: One thing I would pay attention to is speed.

It's the first thing Matt discusses in the above video.

Personally? I think speed has been a determining factor for quite some time. If it were not, they surely wouldn't have included the speed stuff in GWT.

Remember when we used to optimize images for modem speeds? Same still applies today! All you folks running around with your phat arse pngs. What were you thinking when you added 100 images in your CSS file that add up to .5MB+?

wheel




msg:4280996
 1:34 pm on Mar 13, 2011 (gmt 0)

Wish I could - it's not easy to find specific videos, especially when it's Matt Cutts and could be on any of several websites. I've already spent 30 minutes looking, and that's enough for now.

SEO linkbait. Have an intern transcribe all the MC Google videos they can find and publish them (if that's allowed).

CainIV




msg:4281130
 9:24 pm on Mar 13, 2011 (gmt 0)

I think it's always worth spending considerable time and resources ensuring websites are up to speed.

Several websites I have worked with have seen significant conversion and subsequent revenue increases by making speed changes that most webmasters would deem unnecessary.

Whitey




msg:4281141
 9:49 pm on Mar 13, 2011 (gmt 0)

Have you seen any lifts in traffic to those sites?

dickbaker




msg:4281154
 11:30 pm on Mar 13, 2011 (gmt 0)

Several websites I have worked with have seen significant conversion and subsequent revenue increases by making speed changes that most webmasters would deem unnecessary.


I've increased my page speed score to anywhere between 83 and 91, but my rankings haven't changed. In fact, I was hit hard by the Panda update, so page speed isn't a factor for me right now. Maybe it will be when all of the other changes are made.

Whitey




msg:4281168
 12:45 am on Mar 14, 2011 (gmt 0)

Surely sites in the 90+%-ish percentile are subject to penalties. Y/N?

It's a bad user experience for Google to be serving, unless there are no content alternatives.

BTW - what are the estimated load times for each percentile band? Any references out there?

dickbaker




msg:4281200
 2:30 am on Mar 14, 2011 (gmt 0)

Surely sites in the 90+%-ish percentile are subject to penalties. Y/N?


With the Firefox page speed tool, 91 is good, 100 is better, and 10 is horrible. My page speed is extremely good to excellent, better than most of my competitors.

CainIV




msg:4281221
 3:53 am on Mar 14, 2011 (gmt 0)

I haven't seen traffic lifts, and I would guess that Google likely isn't going after websites that have good site speed. They would likely target websites where a significantly slower site speed is a detriment to user value.

I was merely pointing out that from a marketing expenditure standpoint, it makes complete sense to optimize site performance for both Google and visitors.

Whitey




msg:4281233
 4:21 am on Mar 14, 2011 (gmt 0)

They would likely target websites where a significantly slower site speed is a detriment to user value.


Part of me says this makes sense; the other half of me says - well, some sites can't load that quickly since they may depend on complex database retrieval with other 3rd party integrations - so that wouldn't be fair.

TheMadScientist




msg:4281243
 4:41 am on Mar 14, 2011 (gmt 0)

so that wouldn't be fair

It's not a fair game we play ... Whose site is higher quality: the one that needs the slowed-down 3rd party software, or the one built for speed serving 'essentially the same' information?

IMO the days of the little guy buying (finding) 'off the shelf' software and 'slapping' [term used loosely] a site together and ranking are on the decline ... I think if you want to play this game long-term, you should probably up the ante a bit, and that's probably going to require some expense, either through having a better system built or an investment of time (bunches of time) to build a better one yourself ... The preceding is my opinion only of course, so YMMV by using the concepts.

I haven't ever seen a Google 'be fair' to all sites and pages motto or statement anywhere ... And it really is fair ... If your site is significantly slower than someone else's it's not (imo) the same quality, so the slower site is scored 'fairly' by not ranking as high ... It's actually much more fair to the people who invest considerable time (or pay others to invest considerable time) in developing a top-notch system ... In fact, how is that at all unfair?

tedster




msg:4281248
 4:59 am on Mar 14, 2011 (gmt 0)

I think if you want to play this game long-term, you should probably up the ante a bit

I've had such wake-up calls several times every year, beginning with my first year earning a living via the web. And it just keeps happening.

When Google first began talking about The Need For Speed [webmasterworld.com] in 2009, and Steve Souders published two books about his research, that was just such a moment. And I went down the rabbit hole - learning about all kinds of things I had been ignoring, such as ETags and how browsers parallelize their HTTP requests.
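For reference, the ETag item usually boils down to switching them off when they aren't configured consistently (e.g. across a server farm), so that browsers fall back to Expires / Cache-Control instead - a minimal sketch, assuming Apache with mod_headers loaded:

  # Stop sending ETags so caching relies on Expires / Cache-Control
  Header unset ETag
  FileETag None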

Even so, the basics I learned in the 1990s are still the core of faster pages for me. Image compression is one very big deal, as is understanding how browsers cache various objects. This information is well worth putting into practice, and it leads to much better comprehension of how these interwebs work.

TheMadScientist




msg:4281250
 5:07 am on Mar 14, 2011 (gmt 0)

Yeah, I think the 'little things' some of us do are finally going to be rewarded ... Example: one of the sites I work on is hosted with a host that doesn't serve ETag or Expires headers, and I get an Internal Server Error when I try the 'standard, serve the stinking headers' code, so I spent about 3 hours yesterday finding a way to serve them manually (for images). I still don't have it all the way implemented, but it's something I'm going to finish soon, and it's also something IMO most will not do - but I also think that's one of the things that separates the sites I build from most people's ... And I think it's absolutely fair for Google to take 'the little things' into account, because they really can and do add up to a big difference in overall site performance.
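In case it helps anyone in the same boat: when the usual mod_expires directives throw an Internal Server Error (module not loaded), mod_headers can sometimes stand in for it - a rough sketch, assuming mod_headers is available on the host; the 30-day lifetime is just an example value:

  <IfModule mod_headers.c>
    # Cache images for 30 days (2592000 seconds)
    <FilesMatch "\.(gif|jpe?g|png|ico)$">
      Header set Cache-Control "max-age=2592000, public"
    </FilesMatch>
  </IfModule>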

tedster




msg:4281254
 5:23 am on Mar 14, 2011 (gmt 0)

I remember an annual contest in the 90s run by the Bandwidth Preservation Society - the idea was to build an entire site with less than 5kb of code. Oh yeah, that was a discipline! In those days, people used to say that "www" stood for "world wide wait". Of course many were on 8kb dial-up modems, too.

And now today, every time a page on Wired or whatever site grabs my browser and won't let go for 25 seconds, I want to grab their web team and scream at them.

Anyone notice how fast mobile's share is growing these days? If you don't have a dedicated mobile site, then paying attention to page speed is especially important.

TheMadScientist




msg:4281260
 5:38 am on Mar 14, 2011 (gmt 0)

And now today, every time a page on Wired or whatever site grabs my browser and won't let go for 25 seconds, I want to grab their web team and scream at them.

Sometimes I think 'broadband' is the new 'lazy' ... lol

I actually just now installed the Expires header piece, and I found one of the includes I pull from an external site is not serving Expires headers either, so now I'm about to serve 'em for them, because I'm crazy like that ... It's a bit of work to make happen on my side, but I don't like to see the status bar stuck on anything but 'Done' for more than half a second or so, especially when I should have everything cached.

Whitey




msg:4281348
 12:46 pm on Mar 14, 2011 (gmt 0)

so that wouldn't be fair

Just qualifying this a little more: I meant it in the context of Google not really knowing when to prefer a slow site over a fast one. E.g. a fast one may show largely static data, whereas a very slow one may be pulling lots of dynamic info onto the page to make it more "up to date".

TheMadScientist




msg:4281357
 1:08 pm on Mar 14, 2011 (gmt 0)

Ah, got it, thanks for the clarification ... I thought it was the 'but I can't afford to (or don't want to)...' not fair road again ... I've seen it around here once or twice ... lol

AussieDave




msg:4281392
 1:54 pm on Mar 14, 2011 (gmt 0)

Hi all,

Although Google has numerous data centers, all my sites reside in Australia. Considerably more hops, in turn producing much higher latency than if Google were accessing the pages/sites on the other side of the Big Pond.

If speed is now a factor in the algo - even if it only affects 10% of searches - wouldn't this have an effect on my sites in the SERPs?


Cheers

Dave

wheel




msg:4281396
 2:06 pm on Mar 14, 2011 (gmt 0)

Dave, it shouldn't. What I've been told is that Google doesn't use crawling speed for this measure; they use visitor data (probably from toolbars and the like). So it's how fast your pages load for visitors that counts.

In my case, and talking of data centers, I'm hosted in a center that has traffic sharing with most of the large residential ISP's in my country. So there's like no distance. When I traceroute from my house to my host, it goes:
- my isp
- my isp in the city that has my servers
- the traffic sharing facility
- my data center
- my server.

Google apparently is also part of the traffic sharing arrangement so I'm one hop away from their crawler too - but that's apparently irrelevant.

There are other factors in making a page load fast for visitors as well.

AussieDave




msg:4281401
 2:14 pm on Mar 14, 2011 (gmt 0)


I remember an annual contest in the 90s run by the Bandwidth Preservation Society - the idea was to build an entire site with less than 5kb of code. Oh yeah, that was a discipline! In those days, people used to say that "www" stood for "world wide wait". Of course many were on 8kb dial-up modems, too.


I got hooked up in '95 with a P1 120 and a 28.8kb modem... Best d/l speed I got was 2.4Kb/sec. Times have certainly changed since then. Though... some sites are so bloated they take just as long to load.


Cheers

Dave

AussieDave




msg:4281403
 2:17 pm on Mar 14, 2011 (gmt 0)


There are other factors in making a page load fast for visitors as well.


Not wanting to derail, I'll take a look at other threads here and hopefully be able to throw some input in them, whilst getting some info in return.

Thanks for the heads up wheel!


Cheers

Dave

pageoneresults




msg:4281408
 2:24 pm on Mar 14, 2011 (gmt 0)

Some sites are so bloated they take just as long to load.


I see that quite a bit these days, and I'm running 6 Mbps down / 1 Mbps up. I SHOULDN'T notice any delays with most websites, but I do. Then I take a look at the makeup of those documents and find out why.

200+ http requests
100+ images (60 in CSS)
1MB+ total file sizes

I still hear clients today use the justification that if someone "doesn't have broadband yet," then our content probably doesn't appeal to them. That's just the wrong way to think. Even with broadband, when your documents are as fat as the example above, your site is still slow. Just imagine what it's like for someone who has broadband but is only getting 1 Mbps down or less.

Whitey




msg:4281695
 12:06 am on Mar 15, 2011 (gmt 0)

I'll take the scenario I raised a bit further. Say a typical DB-driven site has this choice:

1. Load pages with static or cached content in 0.5-1.5 secs - bounce rate is around normal for the industry - maybe top 10-20 percentile for speed

2. Load pages with dynamic content in 10-15 seconds - always fresh and relevant, users tolerate the load, bounce rate is around what competitors experience - bottom 10 percentile for speed, i.e. 90%+. Ouch.

I think the pros and cons can be expanded into a fuller SWOT, including the associated DB performance issues, but how do you make a choice on this basis, and how does Google consider the dilemma from a SERPs standpoint? (One possible middle ground is sketched below.)

This is an aspect where I find it difficult to see the fairness.
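For what it's worth, one common middle ground between those two options is caching the dynamic output at the server for a short TTL, so pages stay reasonably fresh without the 10-15 second render on every request. A minimal sketch, assuming Apache with mod_cache and mod_disk_cache loaded - the /products path and the 5-10 minute lifetimes are only examples:

  <IfModule mod_cache.c>
    <IfModule mod_disk_cache.c>
      CacheRoot /var/cache/apache2
      CacheEnable disk /products
      # Serve cached copies for 5 minutes by default, 10 minutes at most
      CacheDefaultExpire 300
      CacheMaxExpire 600
    </IfModule>
  </IfModule>

A reverse proxy (Squid, Varnish) or application-level caching of the DB queries gets to the same place; the point is that "dynamic" and "fast" don't have to be mutually exclusive.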
