'Speed of Site' May Become Ranking Factor in 2010
anand84




msg:4024764
 4:58 am on Nov 14, 2009 (gmt 0)

Googler Matt Cutts recently said in an interview that the company may start giving weight to how fast a site loads as a ranking factor in 2010.

[webpronews.com...]

How do you think it will affect us? Blindly factoring in speed might not be appropriate, since that could mean giving preference to a small two-page site over a comprehensive Wikipedia article. Don't you think?

 

tangor




msg:4035992
 7:45 am on Dec 3, 2009 (gmt 0)

The only party that really benefits from millions of web pages being optimized for speed is the entity that wants to index those millions of pages. As a USER who opens several hundred pages/images a day, a few milliseconds lost is no big deal. That metric only becomes important if the USER is attempting to view millions of pages per day...

That said, you still want to avoid delay in delivery of content to the user.

levo




msg:4035997
 8:02 am on Dec 3, 2009 (gmt 0)

Wow, according to Site Performance, I should gzip "www.google-analytics.com/ga.js"

thayer




msg:4036135
 1:12 pm on Dec 3, 2009 (gmt 0)

< moved from another location >

In Webmaster Tools, G is now showing some of their ideas about site speed. I'd like to hear others' ideas about (1) choices that affect page/site speed significantly and (2) possible ranking choices by G.

There are many questions/ideas/choices that the WMT info raises for me...

1. Site speed vs. page speed vs. reload page speed vs. 2nd page of site speed.

2. Comparison with the sites I compete with, rather than with G's 1.5 sec idea of what a "fast page" is.

3. Pages that are download-intensive (with many or hefty-sized javascript, graphics, and/or iframes) are not going to rank well for speed.

For example, if I have a "widget game", then for speed ranking it is best to have a page primarily of text describing the game to try to rank in G. The widget game itself, with its sizeable downloads, needs to be placed on another page. This is not what I think the average surfer wants. Optimally, G will be able to algorithmically downgrade the speed factor for searches where surfers want/expect/need heavy downloads.

[edited by: tedster at 2:32 pm (utc) on Dec. 3, 2009]

johnnie




msg:4036226
 3:52 pm on Dec 3, 2009 (gmt 0)

Hmm... Time to gzip google analytics?

Structuralist




msg:4036264
 4:16 pm on Dec 3, 2009 (gmt 0)

Johnnie, funny you should say that. Yesterday Google released asynchronous tracking for Analytics [analytics.blogspot.com].

londrum




msg:4036268
 4:18 pm on Dec 3, 2009 (gmt 0)

Maybe I'm a cynic, but I reckon this has just as much to do with the fact that it will save Google a pile of money. If they can convince the web as a whole to reduce page weight by as little as 1/50th, that still saves them a colossal amount of money every time they send the spiders out.
It might save them a packet in storage costs as well, especially when the web is growing at the rate it is.

anand84




msg:4036284
 4:32 pm on Dec 3, 2009 (gmt 0)

Maybe I'm a cynic, but I reckon this has just as much to do with the fact that it will save Google a pile of money.

Part of me agrees with you (the other part loves Google so much that it can't imagine them being evil).

Could that mean this talk about site speed is all just talk, and Google simply wants people to focus on making their sites faster? That is, while people go about improving their site speed, the SERPs might not actually be influenced much by it. I wish that would happen.

Simsi




msg:4036343
 5:41 pm on Dec 3, 2009 (gmt 0)

Ranking on speed doesn't seem to make much sense to me, not unless you are talking about several seconds. It would surely make more sense to see how a user behaves when visiting the site. If they return to the SERPs pretty quickly, then whether it was a speed issue or a content issue, clearly the page hasn't done the trick. If they stick on the page and don't come back for another option, then it more than likely has. Monitoring user behaviour negates the need to measure speed IMO.

There are certain technologies and online applications where a second or two's delay will actually result in more accurate content being delivered, so penalising these sites is presumptuous and not necessarily an improvement for the user. Most sites where a delay is necessary would also display "loading" messages too.

It seems logical therefore that as speed does not necessarily reflect quality of content, if there is an effect, it will be minimal.

Hissingsid




msg:4036351
 5:48 pm on Dec 3, 2009 (gmt 0)

It might save them a packet in storage costs as well, especially when the web is growing at the rate it is.

Google wins, you win, and your users win. I can't see anyone losing unless you decide to remove or degrade real content.

Since this is going to be a ranking factor, and ranking is a competitive issue, it is up to you to decide whether you want to compete on this one factor. If you don't, that's fine. If you choose not to take simple steps to make your site faster, you can still compete by doing more on the other 199 factors.

I wonder how many of us, before seeing Matt's comments, knew how to implement gzip on our servers, or how easy it is to add a Cache-Control line to our .htaccess file. I'm sure some did, but I for one had not even considered these things, yet they make a massive site-wide difference for very little effort. I didn't realise that an HTTP request tells the server whether the browser can deal with gzip, so the server sends the page components zipped up only if the browser can unzip them.
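For the Cache-Control side of it, here's the sort of thing I mean: a minimal sketch for .htaccess, assuming mod_expires and mod_headers are available on your server (adjust the file types and lifetimes to suit your own site):

# Tell browsers they can cache static files (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/gif "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/x-javascript "access plus 1 week"
</IfModule>

# Or set the Cache-Control header directly (requires mod_headers)
<IfModule mod_headers.c>
  <FilesMatch "\.(gif|jpe?g|png|css|js)$">
    Header set Cache-Control "max-age=604800, public"
  </FilesMatch>
</IfModule>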

Basically, there is a shedload of easy-to-implement technology already available to us that the vast majority of us are not using, and Google has data that shows this. Google also knows that most of us have made a few stupid errors in our coding or image compression at some point in the lives of our sites, errors that sit there costing bandwidth every day while we are blissfully unaware.

On my main $ site I've made one pass with the Firebug speed tool and found enough little inefficiencies and unimplemented speed technologies to make a massive difference to my users and, hopefully, to help cement that top slot on Google. This is one Google algo change that I applaud loudly. Whoever suggested this deserves a medal. Well done Google!

Cheers

Sid

vordmeister




msg:4036356
 5:55 pm on Dec 3, 2009 (gmt 0)

I think we are talking about several seconds here.

The problem is that webmasters all live in big cities and have 100Mb/second connections. You ask what difference half a second makes?

For me, on my 0.5Mb connection, it adds 100 seconds to the loading time. During a recent limited study I found an online news site that took 76 seconds to load. Do you really expect me to hang around waiting for that? If I'm interested, I'll find a different news site.

I'm fairly sure my connection is average when compared with the rest of the world. For new sites I log on to my 28k dialup, clear my cache, then make sure they load sensibly.

I can understand that other Google searchers might get annoyed and click their back button just as often as I do. It's about time they stopped showing me sites I can't get to unless I have a tea break while they are loading.

Hissingsid




msg:4036364
 6:08 pm on Dec 3, 2009 (gmt 0)

It's about time they stopped showing me sites I can't get to unless I have a tea break while they are loading.

That's not the point. If a site/page is the best for a particular search, it will get listed no matter what its speed. But if there's one that is almost as good in every way and is also faster, then speed may give it just enough of an edge to bump it up one place.

This is not about absolute speed; it is about "speed as a ranking factor", i.e. your speed compared with the speed of your competitors, as one of 200 factors.

Cheers

Sid

vordmeister




msg:4036403
 6:37 pm on Dec 3, 2009 (gmt 0)

Agreed, Sid. I believe what you said is their intention for the moment. I can't see Google knocking that newspaper site out of their search completely, as it posts great stuff and isn't slow for people with quick connections.

If personal search develops in a way that suits me I suspect they will start offering me newspaper sites with similar content a little more often than the ones I can't reach. There's still a long way to go.

Shouldn't be a problem for anyone. There's no reason why a couple of kb of words should take anyone more than a second or two to download.

gouri




msg:4036447
 7:39 pm on Dec 3, 2009 (gmt 0)

WMT says that using gzip can make websites faster, but I have a question.

Is gzip code that you have to include in your CSS files and/or in the code of individual pages, or is it an add-on feature for your browser?

I watched the video that Google has on gzip but I am not sure.

TheMadScientist




msg:4036482
 9:03 pm on Dec 3, 2009 (gmt 0)

GZip happens between your server and the browser. You do not need to include any code in your files, but it needs to be enabled on your server for you to use it, and the browser requesting the page has to be able to handle the compression for the page & images to display properly.

Here's some more info:
Mod_Deflate on Apache [httpd.apache.org]
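For anyone wondering what that actually looks like, here's a rough sketch of a mod_deflate block for .htaccess, along the lines of the example in those docs (worth checking against your own Apache version before relying on it):

# Compress text responses with mod_deflate (Apache 2.x)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css
  AddOutputFilterByType DEFLATE application/x-javascript application/javascript
  # A few old browsers mishandle compressed content
  BrowserMatch ^Mozilla/4 gzip-only-text/html
  BrowserMatch ^Mozilla/4\.0[678] no-gzip
  BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
</IfModule>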

Hissingsid




msg:4036568
 12:11 am on Dec 4, 2009 (gmt 0)

There's also mod_gzip.

On a shared server you enable it using your .htaccess file. If you don't have mod_gzip or mod_deflate available on your server but do have PHP, you may be able to get gzip working by putting this in your .htaccess:

php_value output_handler ob_gzhandler

If you want to know what modules are enabled on your server, try making an info.php file with this in it:

<?php phpinfo(); ?>

Put the file in your web space and navigate to it with your browser. Once you have finished, delete the file.
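And if it's mod_gzip rather than mod_deflate that your host provides, the .htaccess version looks roughly like this (written from memory, so check it against your host's documentation before relying on it):

# mod_gzip (common on older or shared Apache setups)
<IfModule mod_gzip.c>
  mod_gzip_on Yes
  mod_gzip_dechunk Yes
  mod_gzip_item_include file \.(html?|txt|css|js|php)$
  mod_gzip_item_include mime ^text/.*
  mod_gzip_item_include mime ^application/x-javascript.*
  mod_gzip_item_exclude mime ^image/.*
</IfModule>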

Cheers

Sid

gouri




msg:4036571
 12:16 am on Dec 4, 2009 (gmt 0)

First, thank you to both of you for that information.

For the things that you guys mention, do you need access to the root host file?

TheMadScientist




msg:4036586
 12:44 am on Dec 4, 2009 (gmt 0)

Do you mean the httpd.conf or the root directory of your website?

The .htaccess solutions will work as long as the necessary module(s) are loaded in the httpd.conf, by your host if you do not have access to it yourself... An .htaccess file goes in the root directory of your website, not the root directory of your server.

IMO: Since this is outside your area of expertise, the best thing for you to do is probably to contact your host, ask them what the best practices are for your server configuration, and then do exactly what they say. The .htaccess file, which you can either create or already have access to, is a hidden file on your server precisely because it matters not only to the basic operation of your site but also to its overall efficiency. If you are even a character or two off in the file (even whitespace), you can generate a site-wide 500 Internal Server Error, or, even worse, have an error that goes unnoticed and has a negative impact on visitors or search engine rankings.

All that said, my advice is to make the necessary changes, but:
1.) Make sure you know the proper configuration for your specific host and server.

2.) Make sure you triple-check the end result on multiple days in multiple browsers on multiple pages in multiple directories to ensure there are no situations where things do not work as expected. (Make sure you empty your browser cache in between checks to ensure you are getting a 'fresh' version of the page, too.)

gouri




msg:4036591
 12:55 am on Dec 4, 2009 (gmt 0)

This is great information. I am going through it to understand the different things that you have mentioned.

TheMadScientist




msg:4036599
 1:01 am on Dec 4, 2009 (gmt 0)

I forgot to mention:

The .htaccess or httpd.conf MUST be edited in a plain text editor, such as Notepad on a PC or TextWrangler on a Mac. If even just the line breaks are not correct (they should be UNIX-style), you can generate a 500 Internal Server Error... Compression is good. Knowing what you are doing to make it happen in the files you work with is critical.

Reno




msg:4036678
 3:15 am on Dec 4, 2009 (gmt 0)

Very useful thread -- thanks to MadScientist and everyone else for sharing the server config info. I ran an environment-variable Perl script and (amongst other things) saw this:

HTTP_ACCEPT_ENCODING: gzip, deflate

So can I assume from that line that my hosting account's server handles gzip OK?

If so, what would a person put in their htaccess file to trigger its use?

.......................

TheMadScientist




msg:4036700
 3:51 am on Dec 4, 2009 (gmt 0)

Thanks Reno... I've found it useful too.

Here's a bit more info:
Actually, that is the header sent to your server by your browser, so your browser will accept those encodings. To know what is available on your server you will have to check the installed modules...

Your host should be able to tell you if you cannot find out easily.

Reno




msg:4036729
 5:24 am on Dec 4, 2009 (gmt 0)

That same perl script I mentioned says there are 2587 installed modules, including:

IO::Compress::Gzip::Constants

MIME::Decoder::Gzip64

... so I'm hopeful!

.................

Hissingsid




msg:4036776
 8:49 am on Dec 4, 2009 (gmt 0)

Reno,

Those are Perl modules; you are looking for Apache modules. If you make an info.php file with this in it:

<?php phpinfo(); ?>

you will find a section listing the loaded Apache modules.

The version of perldiver that I use does not show you this.
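If you'd rather not leave a full phpinfo() page lying around, a smaller script along these lines should do the same job (just a sketch; apache_get_modules() only works when PHP runs as an Apache module, so it won't help on every host):

<?php
// List the Apache modules loaded on this server.
// apache_get_modules() only exists when PHP runs as an Apache module
// (mod_php), not under CGI or FastCGI.
if (function_exists('apache_get_modules')) {
    foreach (apache_get_modules() as $module) {
        echo $module . "<br>\n";
    }
} else {
    echo 'apache_get_modules() is not available here; use phpinfo() instead.';
}
?>

As with info.php, delete it from your web space once you have finished.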

If you are going to play around with this, you will need something to check that it is working as expected. I've been using the Firefox extension "Live HTTP Headers", which prints out the HTTP messages between browser and server so you can check whether gzip is being used, whether your images are cached, etc.

Cheers

Sid

thayer




msg:4036887
 12:28 pm on Dec 4, 2009 (gmt 0)

The two threads below on Content Delivery Networks from earlier this year focused on possible effects on rankings (none seemed to be noted):
[webmasterworld.com...]
[webmasterworld.com...]

BUT what about speed for users? How much was that affected? And given the purported higher costs of the leading CDNs, has anyone here had experience with any of the smaller CDNs?

c41lum




msg:4036910
 1:07 pm on Dec 4, 2009 (gmt 0)

Slightly concerned about this one.
We have an offer page that sends users on to one of three other websites, depending on what the user is looking for. The page is restricted in robots.txt, and the script checks (as best we can) that the visitor is a real user; if so, it sends them on to the correct site.

It's a high-traffic page that sends users through in half a second at most, yet Google's new Site Performance feature has been logging it as taking 8.1 seconds on average.

Is this going to be taken into account and push our site down the SERPs? Surely Google shouldn't even be looking at that page!

Worrying!

londrum




msg:4036934
 1:51 pm on Dec 4, 2009 (gmt 0)

Maybe they are getting some of the data from Google Toolbar users. That will likely give differing times depending on the user's connection.

I don't see how they can gauge the true speed of a page anyway. If you block your images, for example, and they don't spider those, then that's probably a huge chunk of time they won't ever see.
And maybe they do download JavaScript files, but I'm pretty sure they don't actually run them. That is another huge chunk gone.

signor_john




msg:4037412
 1:28 am on Dec 5, 2009 (gmt 0)

I finally got around to adding a "deflate" line to my .htaccess file. Result: A little over 70 percent compression of text and HTML code, and pages that load noticeably faster. The cost to me? A few minutes of my time. I can thank Google and Webmaster World for the wake-up call. :-)

gouri




msg:4037439
 2:09 am on Dec 5, 2009 (gmt 0)

Is there a threshold such that if your site loads in less than that amount of time (e.g. less than 10 seconds), its speed is considered good?

levo




msg:4037465
 3:14 am on Dec 5, 2009 (gmt 0)

The line between fast and slow is ~1.3 seconds on the Site Performance chart.

gouri




msg:4038442
 1:11 am on Dec 7, 2009 (gmt 0)

In WMT, I saw that the new Site Performance load-time figures for a couple of sites I am working on have increased, but I have not made changes to the sites in a couple of weeks. The load time is now slower than it was several days ago.

Can anyone tell me why this might happen?

signor_john




msg:4038462
 2:26 am on Dec 7, 2009 (gmt 0)

The load time is now slower than it was several days ago.

I'm seeing the same thing in Webmaster Tools, even though my text and HTML are now "deflated" by 70+ percent compared with a few days ago, and text on pages is displaying perceptibly faster.
