
Google SEO News and Discussion Forum

Speed as a ranking factor?
Whitey
msg:4280938 · 6:06 am on Mar 13, 2011 (gmt 0)

CRAWLING
Server speed is a factor that helps produce a better crawl: potentially more frequent crawling per page, as well as deeper crawling into the site. But it's not a factor in whether pages stay in the index.

INDEXING
The number of pages that actually stay in the index, rather than just get spidered, is related very strongly to the backlink strength of the home page and to how that link juice is distributed internally. Backlinks to deeper pages are also a big help. Many webmasters have noticed that there seems to be a formula (admittedly a moving target of a formula) for how many pages will be retained in the main index for any site. It's not a simplistic formula, however: it's not a flat percentage, nor a fixed hard number.

RANKING
Page load speed is still not an active factor in ranking (or crawling or indexing), but it is definitely on the horizon as a ranking factor, possibly this year. The early notices, and the push from Google to inform webmasters and give them some tools and education, began last year.

[webmasterworld.com...]


I'm following up on an earlier 2010 thread where it was speculated that Google would commence using this factor in its algo. Has anyone noticed whether site speed is now a ranking factor?

 

Sgt_Kickaxe
msg:4281698 · 12:21 am on Mar 15, 2011 (gmt 0)

...and how does Google consider the dilemma from a SERP standpoint.


Google doesn't differentiate between your two examples. Each site is scored on 100+ factors and they rank according to their totals, with some priority given to certain factors, no doubt. Actually, I think exact-match page titles just took a hit, but that's for another thread.

TheMadScientist
msg:4281703 · 12:38 am on Mar 15, 2011 (gmt 0)

Yeah, I think Sgt_Kickaxe is probably right on there ... There are also many ways the dynamic site could be sped up to compete with the static site(s) ... Mine are actually all dynamic and they put many static sites to shame as far as speed goes. I see you were talking about a different type of 'fairness', but as far as the situation above goes, the faster site will probably rank higher, and again, there are all kinds of ways to speed up a dynamic site, especially with the speed of processors these days.

If you're asking because you own a dynamic site that's slower, I'd recommend taking a long look at the select statements and storage ... That's where I've found the most speed ... Normalization is something I'd stay away from when it comes to storage for speed ... Data storage is cheap and the multiple 'chained' or 'sub' selects required to access normalized data can be very expensive from a speed perspective. ;)
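
A minimal sketch of the trade-off TheMadScientist describes, using Python's sqlite3 with a hypothetical articles/authors schema (the table and column names are illustrative assumptions, not anything from this thread):

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
    -- Normalized layout: every read pays for a join.
    CREATE TABLE authors  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE articles (id INTEGER PRIMARY KEY, title TEXT,
                           author_id INTEGER REFERENCES authors(id));

    -- Denormalized layout: the author name is copied onto each row,
    -- trading cheap storage for a single-table read.
    CREATE TABLE articles_flat (id INTEGER PRIMARY KEY, title TEXT,
                                author_name TEXT);
""")

# The "chained" select a normalized layout forces on every page view:
cur.execute("""SELECT a.title, au.name
               FROM articles a JOIN authors au ON au.id = a.author_id""")

# The denormalized read: one table, no join, typically faster under load.
cur.execute("SELECT title, author_name FROM articles_flat")

The denormalized copy does mean an author rename touches many rows, which is exactly the storage-for-speed trade being described.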

Whitey
msg:4281705 · 12:55 am on Mar 15, 2011 (gmt 0)

many ways the dynamic site could be sped up to compete with the static site(s)


Sometimes they can, I guess, but sometimes they can't - for example, if you're pulling in live content via XML calls that you can't cache, you'll be restricted by the speed of the provider.

If half your site behaves like this, that's a big drain on the site's average speed stats. So again, not all sites are equal in terms of how they might be considered by Google.

Another example might be a site that is having technical problems causing time-outs. How tolerant is Google going to be with this before it flips its lid?

TheMadScientist
msg:4281707 · 1:01 am on Mar 15, 2011 (gmt 0)

Sometimes they can, I guess, but sometimes they can't - for example, if you're pulling in live content via XML calls that you can't cache, you'll be restricted by the speed of the provider.

Then a static site won't have the same content, so the competition would be with the source imo ... Increased speed could probably be accomplished by being 5 seconds behind too ... I can actually think of a few ways to 'speed it up' in this type of situation.
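
A minimal Python sketch of that "5 seconds behind" idea: cache the provider's XML for a short TTL so most page views are served from local memory instead of waiting on the provider (the timeout and the 5-second TTL are illustrative assumptions):

import time
import urllib.request

_cache = {}   # url -> (fetched_at, body)
TTL = 5.0     # seconds we're willing to be "behind" the live source

def fetch_xml(url):
    """Return the provider's XML, at most TTL seconds stale."""
    now = time.time()
    hit = _cache.get(url)
    if hit and now - hit[0] < TTL:
        return hit[1]                      # served locally: no provider wait
    with urllib.request.urlopen(url, timeout=3) as resp:
        body = resp.read()
    _cache[url] = (now, body)
    return body

Only the first visitor in any 5-second window waits on the provider; everyone else gets the cached copy.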

Another example might be a site that is having technical problems causing time-outs. How tolerant is Google going to be with this before it flips its lid?

How 'high quality' is a destination that has time-outs? I would guess a 'technical difficulty' once in a while is alright, but if it's 'often' the site may have issues.

zerillos
msg:4281712 · 1:11 am on Mar 15, 2011 (gmt 0)

"Another example might be a site that is having technical problems causing time outs. How tolerant is Google going to be with this until it flips it's lid?"

Quite a bit. I had some technical issues a while back and nothing happened.

BillyS
msg:4281718 · 1:29 am on Mar 15, 2011 (gmt 0)

There are a lot of lazy webmasters out there (thankfully). Over a year ago, I spent about two months looking at page speed and cut page load times by 25%. Our YSlow score was around 70 when we started, and it's around 90 now. I have one last step to take, and then it will be as close to perfect as possible.

Personally, I think speed is a ranking factor. It improves the user experience and that tells me it's important.
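
For anyone wanting a rough self-check along the lines BillyS describes, a small Python sketch that times one fetch and prints two of the headers YSlow-style audits reward - compression and caching (the URL is a placeholder):

import time
import urllib.request

def quick_speed_check(url):
    """Time one fetch and report two headers speed audits care about."""
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    start = time.time()
    with urllib.request.urlopen(req, timeout=10) as resp:
        resp.read()
        print("fetched in %.2fs" % (time.time() - start))
        print("Content-Encoding:", resp.headers.get("Content-Encoding"))  # gzip?
        print("Cache-Control:  ", resp.headers.get("Cache-Control"))      # cacheable?

quick_speed_check("http://www.example.com/")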

rhornsby
msg:4281741 · 3:21 am on Mar 15, 2011 (gmt 0)

It would seem to me that Google has a vested interest in pages loading faster with the advent of Google Instant, as it causes less load on their servers. So it seems logical to reward faster sites that make Google's work easier.

Whitey
msg:4281757 · 4:20 am on Mar 15, 2011 (gmt 0)

as it causes less load on their servers

I'm not sure that Google Instant relies on site speed - correct me if I'm wrong.

tedster
msg:4281763 · 4:59 am on Mar 15, 2011 (gmt 0)

I agree, Whitey. Google Instant is not in any way affected by the servers of the websites that are shown in the results. Instant is only showing data that resides on Google's own server farms.

TheMadScientist
msg:4281766 · 5:11 am on Mar 15, 2011 (gmt 0)

An interesting takeaway for me from tedster's post is: Google relies on their own servers for as much as they can ... IMO it's an interesting aspect to make note of and possibly implement ... One of the speed questions in this thread is about the reliability and speed of 3rd-party sites, and that question itself, combined with how 'the big boys' do things, makes me wonder about the plan for such a site.

I understand there may be cases where it can't be avoided, but by itself, the dependence of one site on another to serve the information visitors are looking for makes me wonder about the underlying plan, because there are so many variables outside of direct on-site control when you depend on a 3rd-party site to serve your own ... One thing I think is important to keep in mind is that the possibility of down-time or non-connectivity is effectively doubled for a site relying on a 3rd-party site to function correctly ... IDK if I would want that as part of my business model.
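
A quick back-of-envelope version of that "effectively doubled" point: if the failures are independent, availabilities multiply, so two 99% uptimes leave you noticeably worse off than either alone (the figures here are illustrative only):

# Availability of a page that needs BOTH your server and a
# 3rd-party provider to respond (illustrative figures only).
site_uptime = 0.99
provider_uptime = 0.99

combined = site_uptime * provider_uptime   # 0.9801
downtime = 1 - combined                    # ~2%: roughly double either site's 1%
print("combined availability: %.2f%%" % (combined * 100))
print("expected downtime:     %.2f%%" % (downtime * 100))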

Whitey
msg:4281775 · 5:31 am on Mar 15, 2011 (gmt 0)

but if it's 'often' the site may have issues.

Exactly how 'often' is an issue for me. Anyone out there know?

I've just witnessed a site lose its meta titles due to some time-out issues, and it almost instantly tanked in the SERPs on all affected pages. The expectation is that it will revive, now that it's fixed.

The implication might be that pages that can't be indexed due to time-outs can't stand by themselves in the rankings, or support other URLs through the internal navigation structure - and those effects may be instant. This isn't really what I was talking about, but I guess it does point to a systematic, causal detriment to ranking caused by a slow site.

So when tedster earlier spoke of a "cascading effect", I really was thinking along these lines. There must be a myriad of effects out there, Y/N?

AussieDave
msg:4281801 · 8:00 am on Mar 15, 2011 (gmt 0)

Another example might be a site that is having technical problems causing time-outs. How tolerant is Google going to be with this before it flips its lid?



I've just witnessed a site lose its meta titles due to some time-out issues, and it almost instantly tanked in the SERPs on all affected pages. The expectation is that it will revive, now that it's fixed.


Funny you should mention that Whitey...

To reiterate, my sites all target Australian traffic. Being in a highly competitive vertical, maybe that's the reason I keep getting hit by scrapers and whatnot. I made the decision to implement an Apache geo-IP country block (yes, I still allowed the US in).

One of my managed VPS boxes got upgraded, and for some reason the upgrade took out the Apache mod. The site rendered a 500 error for about 72 hours. In that time the site using the geo-IP blocking got slaughtered. It dropped from decent page 1 Google SERPs to page oblivion...

Since then there have been a few more outages, and I'm now in the process of changing hosts.

I think it's imperative, if you don't want to end up like this, to make sure you're using a top-notch host with multiple high-end peering.
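
One practical lesson from that 72-hour 500: have something outside the box poll the site so an outage can't go unnoticed. A minimal Python sketch (the URL, interval, and print-based alert are placeholders; in practice you'd email or SMS yourself):

import time
import urllib.request
import urllib.error

URL = "http://www.example.com/"   # placeholder: your own homepage
INTERVAL = 300                    # poll every 5 minutes

def check_once(url):
    """Return the HTTP status code, or None if the request failed outright."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code             # e.g. the 500 from the broken Apache mod
    except (urllib.error.URLError, OSError):
        return None

while True:
    if check_once(URL) != 200:
        print(time.strftime("%Y-%m-%d %H:%M:%S"), "ALERT: site is unhealthy")
    time.sleep(INTERVAL)

Run it from a machine on a different network than the web server, or the monitor can go down with the site.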
