
Google SEO News and Discussion Forum

    
Googlebot Crawl Speed is NOT the same as Site Performance
tedster




msg:4178899
 11:42 pm on Jul 29, 2010 (gmt 0)

There seems to be growing confusion, and even mythology, around Google and site speed. It's becoming painful to read some of the junk around the web on this topic - and also the stories from people who tried to improve their site speed and hurt their rankings instead.

Here's my take on it all. First, notice that there are two very different reports in Webmaster Tools: Crawl Speed and Site Performance.

1. CRAWL SPEED - under "Diagnostics > Crawl Stats"
This is a report on googlebot's experience requesting URLs from your server. The graph shows "time spent downloading a page" - in milliseconds.

2. SITE PERFORMANCE - under "Labs > Site Performance"
This is a report on the average user's experience rendering pages on your site. The graph is shown with a scale of seconds.

#1 is all about your server - its efficiency, how fast database calls are returned, things like that. #2 is about a whole bunch of other things - everything that affects how long it takes to put the finalized page on a user's monitor.

#1 - your own server
#2 - the visitor's toolbar

Want to improve #1? You may need to move to a better server or hosting service. You may need to optimize your database calls. But there's not a lot more you can do here.

Want to improve #2? There are a whole lot of things you can do, and they were spelled out over a year ago when Google started talking about The Need for Speed [webmasterworld.com].

But looking further, why would we want to do any of this? Is it because Google said speed might be used as a ranking factor? FAIL! I say don't be Google's puppet. I want to be a web-MASTER, and not a web-lemming. If Google is using speed at all, it's still a very minor part of the algorithm. In fact, this is something Matt Cutts has reinforced several times. As a straight-up ranking factor, site speed is extremely minor.

Google started all this fuss mostly to put the rendering speed issue back on the table, to begin raising awareness of the issue - and it is an important issue. As broadband spread around the globe, some developers forgot that speed still does matter. Heck, I've seen graphic designers who save every jpg at 100% and every png as 24-bit!

If your site speed metric is so bad that you are losing ranking position because of it, then your visitors have already been hating your site anyway. It's not just Google's algo that's punishing you, it's that the visitors that do manage to come in are having a sub-par experience.

So I say improve your site speed because you don't want to crap all over your visitors - just the same way you would fix a leaky pipe over the front door of a street store. But don't run willy-nilly making big changes when you don't really understand what you're affecting or why.

The simplest speed improvement many sites can make is to activate gzip. Just that much is simple and can make a huge difference with little risk. I'm always looking for the best returns for the least resources spent, and gzip is low-hanging fruit. But you're not likely to find me playing around with e-tags in the near future.
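For anyone wondering what "activate gzip" looks like in practice, here's a minimal sketch of the content negotiation at the HTTP level, written as a bare Node/TypeScript server purely for illustration - on Apache you would normally just enable mod_deflate rather than write anything yourself:

import http from "node:http";
import zlib from "node:zlib";

http.createServer((req, res) => {
  const body = "<html><body><h1>Hello</h1></body></html>";
  // only compress when the browser says it can handle gzip
  const acceptsGzip = /\bgzip\b/.test(req.headers["accept-encoding"] ?? "");

  if (acceptsGzip) {
    const compressed = zlib.gzipSync(body);
    res.writeHead(200, {
      "Content-Type": "text/html",
      "Content-Encoding": "gzip",          // tells the browser to decompress
      "Content-Length": compressed.length,
    });
    res.end(compressed);
  } else {
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(body);
  }
}).listen(8080);

Text compresses extremely well, which is why this one switch pays off so much for HTML, CSS and JavaScript.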

[edited by: tedster at 9:51 pm (utc) on Jul 30, 2010]

 

tedster




msg:4178981
 4:36 am on Jul 30, 2010 (gmt 0)

As you can probably tell, I think graphic compression is also low-hanging fruit. It's a rare gif file that needs 256 colors (on small images, 16 can look amazing); it's a rare online jpg that needs more than 40% quality when compressed; and only png files that truly need an alpha channel should be saved at 24-bit.

I'm amazed at how graphics are abused online. The Photoshop option is called "Save for Web" - which, by the way, also removes some of the embedded XML metadata and trims the file even further - so remember the "web" part.
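To make the graphics point concrete, here is a minimal sketch using the sharp npm package - my choice of tool for the example, not something mentioned in the thread; Photoshop's Save for Web or ImageMagick get you the same result. File names are placeholders.

import sharp from "sharp";

// recompress a JPEG at 40% quality - rarely distinguishable on screen
await sharp("hero.jpg")
  .jpeg({ quality: 40 })
  .toFile("hero-web.jpg");

// quantise a PNG to a 16-colour palette instead of 24-bit RGB(A)
await sharp("icon.png")
  .png({ palette: true, colours: 16 })
  .toFile("icon-web.png");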

Has anyone got any other low-hanging fruit for improving site speed?

levo




msg:4178998
 5:02 am on Jul 30, 2010 (gmt 0)

Set Expires & Cache-Control headers for jpg/gif/js/css/ico/png files to 1 year. If you set them shorter, say 1 week, Internet Explorer sends an If-Modified-Since request every single time once that week is up. You won't notice the difference on a low-latency connection, but the whole page flickers on high-latency connections.
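Here's a rough sketch of what that header combination amounts to, written as a bare Node/TypeScript server just for illustration - on Apache you would normally use mod_expires/mod_headers instead. One caveat worth adding: with a one-year lifetime you need to rename or version a file when it changes, or returning visitors will keep the stale copy.

import http from "node:http";
import fs from "node:fs";
import path from "node:path";

const ONE_YEAR_SECONDS = 365 * 24 * 60 * 60;
const STATIC_EXT = new Set([".jpg", ".gif", ".js", ".css", ".ico", ".png"]);

http.createServer((req, res) => {
  // sketch only: no path sanitising, no MIME types
  const filePath = path.join("./public", req.url ?? "/");

  if (STATIC_EXT.has(path.extname(filePath)) && fs.existsSync(filePath)) {
    res.writeHead(200, {
      // far-future caching: the browser won't even ask again for a year
      "Cache-Control": `public, max-age=${ONE_YEAR_SECONDS}`,
      "Expires": new Date(Date.now() + ONE_YEAR_SECONDS * 1000).toUTCString(),
    });
    fs.createReadStream(filePath).pipe(res);
  } else {
    res.writeHead(404).end();
  }
}).listen(8080);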

phranque




msg:4179015
 6:36 am on Jul 30, 2010 (gmt 0)

decreasing the number of http requests can make a difference in site performance.
possible techniques for this include:
- combining style sheets and/or javascripts into a smaller number of files and/or specifying the css or javascript source within the document itself.
this will be a tradeoff with software packaging issues, reusability, configuration management and version control, etc.
- using css sprites where multiple images such as a set of icons or menu items are combined into a single image file and only the required part of the image is displayed where necessary.

another important factor for the user experience is providing the correct cache control headers, so that subsequent page views don't re-request resources that have already been downloaded - the images, favicons, style sheets or scripts that are common across multiple pages on a site, or the same page when it is requested again later.
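phranque's first suggestion - fewer HTTP requests by combining files - can be as small as a one-file build step. A minimal sketch, with hypothetical file names:

import fs from "node:fs";

const parts = ["menu.js", "carousel.js", "analytics.js"];   // separate source files
const bundle = parts
  .map((file) => `/* ${file} */\n${fs.readFileSync(file, "utf8")}`)
  .join("\n;\n");                                           // defensive semicolon between files

fs.writeFileSync("site.js", bundle);   // one request instead of three; minify afterwards

The same idea works for stylesheets; the tradeoff he mentions is that the bundle has to be rebuilt whenever any one of the pieces changes.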

tedster




msg:4179021
 7:07 am on Jul 30, 2010 (gmt 0)

Good one phranque. And since site performance uses toolbar data, those cache related actions will definitely make a difference. Some of the best site performance tests I've seen make use of a follow-up request for the same URL, just to check on caching issues.
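That follow-up-request test is easy to script. A minimal sketch, assuming Node 18+ for the global fetch API (the URL is a placeholder): if the second, conditional request comes back 304 Not Modified, the caching headers are doing their job.

const url = "https://www.example.com/css/site.css";

const first = await fetch(url);
const etag = first.headers.get("etag");
const lastModified = first.headers.get("last-modified");

// repeat the request, but tell the server what we already have
const second = await fetch(url, {
  headers: {
    ...(etag ? { "If-None-Match": etag } : {}),
    ...(lastModified ? { "If-Modified-Since": lastModified } : {}),
  },
});

console.log(`first: ${first.status}, second: ${second.status}`);   // 304 = cacheable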

yaix2




msg:4179022
 7:07 am on Jul 30, 2010 (gmt 0)

If your site speed metric is so bad that you are losing ranking position because of it, then your visitors have already been hating your site anyway.


That sums it up very well.

acemi




msg:4179034
 8:11 am on Jul 30, 2010 (gmt 0)

Putting static files such as CSS, scripts, graphics and Flash on one or more cookieless subdomains increases the parallel download rate, making the page render faster.

phranque




msg:4179058
 9:26 am on Jul 30, 2010 (gmt 0)

sorry, levo!
i submitted my post long after previewing it and just now noticed i almost had the same answer regarding caching as you.

i would like to add to that a reference to this "Enabling If-Modified-Since Headers" [webmasterworld.com] thread from the Apache forum.
as usual pay special attention to what jdMorgan writes in that thread.

iambic9




msg:4179065
 9:51 am on Jul 30, 2010 (gmt 0)

Don't be Google's puppet


I couldn't agree more with the above. If you're primarily motivated by anything other than giving your users the best experience possible, then I'd argue that "Site Performance" (as measured by Google) is the least of your problems. I'm completely obsessed with UI design and user experience. Not a single thing is more important than the experience a user has when they arrive on your site.

PRELOADING

Something I've been experimenting with recently is preloading content based on the path a user is most likely to take through our site.

Now this can be as simple or as complicated as you'd like to make it. If your site structure and content are well managed, a user's next click should be predictable with some degree of accuracy - and if you know where your user is going next, you're missing an opportunity to improve usability if you're not getting that content ready for them before they arrive.

OPTIMISED LANDING PAGES

A majority of our users arrive on our site via the home page. Our home page is static HTML (the rest of the site is a PHP+MySQL CMF), contains no Javascript, uses just enough inline CSS to render the content, and everything is compressed - we load literally the minimum amount of content. The average user spends approximately 30 seconds viewing the home page, and 90% of our users click on the main image that represents our newest content.

During that 30-second ponder, I start preloading the master CSS for the site, a small JS library, a CSS sprite and the main image for the significantly fatter linked page. The result is a home page that loads incredibly fast, and the second click from the user pretty much just snaps into place. It's particularly impressive because the image on the second page is usually 850 x 500 and it appears in under a second; the user has no idea they've been preloading our content in bite-size chunks and just assumes it's quick.
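For anyone wondering how to wire something like that up (the question comes up later in the thread), here is a minimal client-side sketch - the URLs and the delay are placeholders, not iambic9's actual setup:

const likelyNextAssets = [
  "/css/master.css",
  "/js/library.js",
  "/img/sprite.png",
  "/img/newest-product-850x500.jpg",
];

window.addEventListener("load", () => {
  // wait a moment so preloading never competes with the page the user is reading
  setTimeout(() => {
    for (const url of likelyNextAssets) {
      if (/\.(png|jpe?g|gif)$/.test(url)) {
        new Image().src = url;              // classic image-preload trick
      } else {
        const link = document.createElement("link");
        link.rel = "prefetch";              // a hint the browser may ignore
        link.href = url;
        document.head.appendChild(link);
      }
    }
  }, 2000);
});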

YMMV, this practice works well for us because we use a lot of large glossy images of products that have an emphasis on design, and we spend a lot of time looking at user data so we're pretty clear on what to preload and what not to. Of course now and again you're going to load content the user doesn't need, but it's a small price to pay if you can afford it.

fabulousyarn




msg:4179189
 2:17 pm on Jul 30, 2010 (gmt 0)

OOOOOhhh this is good - tedster, where can I find specs on this that you would recommend reading? 'correct cache control headers' - I have been working on load speeds from a user experience perspective and it has totally helped in terms of the big G.

bwnbwn




msg:4179208
 2:36 pm on Jul 30, 2010 (gmt 0)

Funny thing is, all of a sudden site owners begin worrying about their site speed, when for the last 10 years one of the most important parts of a site has been how fast you can get it to pop for a visitor.
If they had done their homework before launching the bloated sites we have now, all this would really be a moot subject.
It takes someone - in this case Google - to get the message across, and all of a sudden we have all these sites worrying how fast they are. Just kind of ironic to me.

StoutFiles




msg:4179209
 2:36 pm on Jul 30, 2010 (gmt 0)

YSlow is still my favorite tool to monitor what I need to do to improve page speed. It goes over almost everything mentioned above and more.

Globetrotter




msg:4179217
 2:48 pm on Jul 30, 2010 (gmt 0)

I've spent quite some time on this. Here are some of the things I've learned the hard way:

A browser will wait to render the rest of the page until the JavaScript is loaded. Put your JavaScript files at the bottom of the page; this way the browser will render your site much faster because it doesn't have to wait. You can also use "defer" (Google it) in the script tag to get the same result in some browsers. This doesn't make your site load faster, but it does render faster (which is a better user experience).

Make sure the browser doesn't send any cookies to the CDN or subdomain you are using. It's a common mistake and a quick win.

Try to minimize the use of inline JavaScript and CSS. If you need to use them, the order (JS or CSS first) matters.

Each file you download carries roughly 4KB of header data with the request and response, so each file you can eliminate saves you an extra few KB.

Use YUI Compressor to minify your CSS and JS files; in my opinion it's the best tool right now.

If possible, lazy-load JavaScript files: load the essential files first, then load the extras.
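A minimal sketch of that lazy-load idea (the file name is a placeholder):

window.addEventListener("load", () => {
  const script = document.createElement("script");
  script.src = "/js/extras.js";   // analytics, social widgets, anything non-essential
  script.async = true;            // downloads without blocking the page
  document.body.appendChild(script);
});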

If possible, flush your output buffer to the browser as early as you can, and keep doing so until the whole page has been sent.
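And a sketch of the early-flush idea, again as a bare Node/TypeScript server for illustration - in PHP this is flush() and output-buffering territory. The setTimeout stands in for a slow database call.

import http from "node:http";

http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "text/html" });
  // flush the <head> immediately so the browser can start fetching the CSS
  res.write("<html><head><link rel='stylesheet' href='/css/site.css'></head><body>");

  setTimeout(() => {
    res.end("<h1>Page body</h1></body></html>");   // send the rest when it's ready
  }, 300);
}).listen(8080);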

If you really want to get the max out of it, buy the book "building faster websites".

explorador




msg:4179260
 4:04 pm on Jul 30, 2010 (gmt 0)

Great thread

I'm using some JavaScript link blocks, such as footers. There are sites with a block of links repeated on every page; you could render that part with JavaScript at the end of the page. In many cases that part of the page is never used anyway, and the visitor will have plenty to do and see before reaching that point (while that part loads). This way the external JS is cached and not served by the server again on every page. Just remember, those blocks are hardly indexed as links by the SEs, but they can save a lot of time. (It's just another approach to the server-side includes that many use to add their footers.)

TheMadScientist




msg:4179273
 4:30 pm on Jul 30, 2010 (gmt 0)

One very easy one, and often missed I think, is to use www. for 'needs cookies' and put everything else on non-www without cookies... The browser sends any cookie(s) with every request for every file because it doesn't know any better, so if you set your cookies to be sent for www. and move everything else to non-www the browser only sends the cookie(s) when it's actually needed to make the site operate. (Upstream is the slowest part of the connection, because connections are optimized for downloading and uploading is always a slower stream than down.)

NOTE: This DOES NOT WORK the other way around! You cannot set your cookie on non-www and then serve cookieless files from www. because the cookie for non-www is sent by default to any sub-domain, even if you try to say otherwise when you set the cookie by setting it for example.com (without the preceding .example.com).
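A minimal sketch of the www-gets-the-cookies arrangement (host names are placeholders, and in practice the cookieless host would usually be a separate vhost or a CDN - one server is used here only to show the headers). The session cookie is set with no Domain attribute on the www host, so the browser treats it as host-only and sends nothing with requests to the other host.

import http from "node:http";

http.createServer((req, res) => {
  const host = req.headers.host ?? "";

  if (host.startsWith("www.")) {
    // host-only cookie: travels back only to www.example.com
    res.setHeader("Set-Cookie", "session=abc123; Path=/; HttpOnly");
    res.end("dynamic page - cookie bytes ride along with every request to this host");
  } else {
    // the static host never sets a cookie, so the browser has nothing to send upstream
    res.end("cookieless static content");
  }
}).listen(8080);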

tedster




msg:4179309
 5:36 pm on Jul 30, 2010 (gmt 0)

where can I find specs on this that you would recommend reading? 'correct cache control headers'

A good starting place is [developer.yahoo.com...] - this is the documentation for Yahoo's YSlow tool. It's interesting to note that Steve Souders [stevesouders.com] did most of the speed research while he was at Yahoo and was then hired at Google.

BillyS




msg:4179331
 5:57 pm on Jul 30, 2010 (gmt 0)

Here's one that's often ignored:

favicon.ico - Do you have one? How big is the file?

I have a favicon.ico in my browser cache that is 290KB (<-- Not a typo). I have many that are over 30KB in size. What a waste of bandwidth...

Mine is less than 1KB (and so is WebmasterWorld's). The original size of mine was 5KB; it took me 20 minutes to find a solution and fix the file size.

latimer




msg:4179435
 8:37 pm on Jul 30, 2010 (gmt 0)

What are some of you seeing when selecting "set custom crawl rate" under WMT - Site Configuration - Settings - Set Custom Crawl Rate? Do high numbers here indicate a site has a problem with speed?

The crawl rate affects the speed of Googlebot's requests during the crawl process. It has no effect on how often Googlebot crawls your site. Google determines the recommended rate based on the number of pages in your site.


We allow G to select the crawl rate. Based on the number of pages on our site, it would currently take about 2 months for them to crawl all the pages at the G crawl rate setting.

As previously discussed here:
URL 09 thread on WMT speed stats [webmasterworld.com]

NoahSEO




msg:4180397
 2:25 am on Aug 2, 2010 (gmt 0)

Very good post. I myself see website performance as a metric for user experience much more than Google ranking.
I mainly use CSS sprites to improve performance, because I am not so sure gzip is secure enough for most users or most browsers.

BTW, iambic9 provided a very good idea, but I don't know how to set up preloading.

dickbaker




msg:4180411
 3:15 am on Aug 2, 2010 (gmt 0)

I've gone through and made changes to my site to improve speed. According to the Firefox speed test, my site is an 84, and Google's is a 90. Webmaster Tools, though, shows that my site is slower than 54% of the sites on the 'net.

How much of the page load time, if any, is affected by the user's connection speed?

tedster




msg:4180412
 3:19 am on Aug 2, 2010 (gmt 0)

The Webmaster Tools report rests heavily on Toolbar data, so if your site has lots of users on slow connections, that might be an influence. Still, I would "hope" that Google sees which toolbars are on slow connections and normalizes that data.

Lots of people have mentioned that the WMT data seems funky to them. It's a good thing that it's not a big part of ranking right now!

physics




msg:4181494
 10:57 pm on Aug 3, 2010 (gmt 0)


I want to be a web-MASTER, and not a web-lemming.

This is a classic!

incrediBILL




msg:4210783
 6:57 pm on Oct 3, 2010 (gmt 0)

specifying the css or javascript source within the document itself.


Of course this actually bloats the individual page downloads - every page load appears larger and slower - and it loads junk into the page that the SEs typically don't need.

Combining into the minimum number of files is the best idea for overall performance.

PRELOADING - Something I've been experimenting with recently is preloading content based on the path a user is most likely to take through our site.


When you get too much preloading going on, it can impact overall server performance - not to mention you're just wasting bandwidth based on an assumption.
