Forum Moderators: Robert Charlton & goodroi


Site speed - what side effects exist?


Whitey

10:53 am on May 2, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Matt Cutts recently announced [mattcutts.com...] that Google would take site speed into account as a potential ranking factor. At the same time he says "don't panic," since Google considers many other elements in ranking sites.

But are there greater side effects, such as the non-indexing of pages on slow sites, that could cause concern? For example, would Google fail to index pages that don't load within a reasonable time frame, and could this drag a site backwards?

tedster

7:14 pm on May 2, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have never seen a page with good PR excluded from the index because its loading speed is slow - and I look at a lot of slow sites. It is "possible" I suppose, but the really important factors for exclusion seem to be a lack of link juice or of unique content.

The biggest side effect I can see that comes with slow pages is lower traffic and also less exploration of the site by the traffic the site does get. That kind of user/traffic data could have an indirect (but very real) effect on Google indexing and ranking.

TheMadScientist

7:44 pm on May 2, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



For example, would Google fail to index pages that don't load within a reasonable time frame, and could this drag a site backwards?

I would guess yes, but I would think it happens when they have to use a reduced crawl rate to keep from taking a site down, which limits their ability to crawl the site. In that respect, speed could be considered to have been an indirect part of the rankings for quite some time.

I think the user experience issues a slow-loading site causes, which tedster points out, are a much better reason to tackle the speed question than any other, including rankings. IOW: I make user experience the biggest reason I optimize for speed.

tedster

8:38 pm on May 2, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Note that crawl speed, on its own, is not the key factor in the Site Speed metric. Instead it includes everything about the speed with which a page renders - and googlebot does not render a page.

For instance, googlebot may be able to crawl the HTML page quite efficiently - the server is handling its load well, the database calls are cooking right along, or the source code is served from a cached version, etc.

But if that page makes 70 more HTTP requests for external files, images, etc., you may have a good crawl speed but still be in Site Speed trouble. The secondary HTTP requests don't enter into the crawl speed because googlebot is only requesting the HTML.

But the overall picture DOES enter into the Site Speed number, which comes predominantly from human visitors using the Google toolbar.
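Counting those secondary requests from the HTML alone makes the gap visible between what googlebot fetches and what a browser must fetch to render the page. A minimal sketch using Python's standard-library parser - the sample HTML and file names are invented for illustration:

```python
from html.parser import HTMLParser

class ResourceCounter(HTMLParser):
    """Counts external resources a browser would fetch beyond the HTML itself."""
    def __init__(self):
        super().__init__()
        self.requests = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Tags that trigger secondary HTTP requests when a browser renders the page
        if tag in ("img", "script") and "src" in attrs:
            self.requests.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.requests.append(attrs["href"])

html = """<html><head>
<link rel="stylesheet" href="a.css"><link rel="stylesheet" href="b.css">
<script src="app.js"></script></head>
<body><img src="hero.jpg"><img src="logo.png"></body></html>"""

counter = ResourceCounter()
counter.feed(html)
# googlebot requests only the HTML; a rendering browser also makes these:
print(len(counter.requests))  # → 5
```

A real audit would also catch `@import` rules inside CSS and resources added by scripts, which this sketch cannot see.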

With the growth of broadband, I've seen many sites begin to ignore the file size of images, or the sane creation of JS and CSS files. Just last week I was asked to look at a site where the users were complaining about slow pages. The JPG images were uploaded with no JPEG compression at all - and every page called 26 external .js files and 20 external .css files.
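The usual fix for dozens of external .js and .css files is to combine them, turning many HTTP requests into one per type. A minimal concatenation sketch - the file names are throwaway examples, and note that order can matter if one script depends on another:

```python
from pathlib import Path

def combine(sources, bundle):
    """Concatenate several text assets into one file, cutting N requests to 1."""
    parts = [f"/* --- {src} --- */\n" + Path(src).read_text() for src in sources]
    Path(bundle).write_text("\n".join(parts))

# Demo with throwaway files (names invented for illustration)
Path("menu.js").write_text("function menu(){}")
Path("tabs.js").write_text("function tabs(){}")
combine(["menu.js", "tabs.js"], "site.js")
print(Path("site.js").read_text().count("function"))  # → 2
```

In practice you would also minify the bundle, but even plain concatenation removes the per-request overhead that dominates with 26 separate files.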

I've also been seeing jpgs where the file has the same pixel dimensions that a digital camera originally created. The HTML is resizing the image on screen, but first the entire monster file must be downloaded.
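Auditing for those monster files doesn't need image tools at all; flagging anything over a byte budget already catches camera-resolution uploads. A minimal sketch - the directory name and the 200 KB budget are assumptions, not rules:

```python
import os

def oversized_images(root, budget_bytes=200_000):
    """Return (size, path) for image files over the byte budget, biggest first."""
    exts = (".jpg", ".jpeg", ".png", ".gif")
    hits = []
    for dirpath, _, files in os.walk(root):
        for name in files:
            if name.lower().endswith(exts):
                path = os.path.join(dirpath, name)
                size = os.path.getsize(path)
                if size > budget_bytes:
                    hits.append((size, path))
    return sorted(hits, reverse=True)

# Hypothetical usage against a document root:
# for size, path in oversized_images("htdocs/images"):
#     print(f"{size / 1024:.0f} KB  {path}")
```

The flagged files are the candidates for resizing to their actual display dimensions and recompressing before upload, rather than letting the HTML scale them down after the full download.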

A lot of good housekeeping for a website seems to have become more rare since broadband entered the picture. However, not everyone with the Google Toolbar installed is accessing your site via broadband.