
Google SEO News and Discussion Forum

Site speed and Panda
shallow




msg:4312415
 3:17 pm on May 14, 2011 (gmt 0)

According to Google Webmaster Tools, my site is slower than 76% of sites (slightly better stats than when I checked two weeks ago).

My site developer spent quite a bit of time speeding up the site (WordPress) early last year. His assessment, as well as Alexa's stats, is in direct opposition to Google's stats.

But we know whose stats really count, right?!

My site has been hit very badly by Panda and, after doing a lot of reading in these and other forums, I'm not convinced there is much I can do to change things without spending much more than I can now afford.

Will improving site speed in any way help some of the negative effects of Panda? If so, I'll invest the money.

Thank you.

 

tedster




msg:4312424
 4:23 pm on May 14, 2011 (gmt 0)

Right up front, I assume you understand that any discussion about what "will or won't help" is mostly theoretical. The overall level of solid certainty about Panda is still quite weak.

Site speed is only a minor ranking factor at this time, and I doubt that Panda changed that. However, if Panda looks at user stats (and that seems likely) and if visitors are bailing on the site because the pages are slow to load, then you have a good reason to address site speed. And giving visitors a good experience should be the main focus anyway, not Google.

Google thinks of its job as creating results that offer THEIR users the best experience available. And every site owner should know that their primary job is to offer the best experience they can to their users, too.

Site speed can't be the only factor influencing a Panda devaluation, no matter how indirectly. But doing what you can to improve it should only help.

shallow




msg:4312430
 5:01 pm on May 14, 2011 (gmt 0)

Right up front, I assume you understand that any discussion about what "will or won't help" is mostly theoretical.


Absolutely! I know everything is theoretical at this time.

and if visitors are bailing on the site because the pages are slow to load, then you have a good reason to address site speed. And giving visitors a good experience should be the main focus anyway,


I agree; it always has been.

But until recently, I was unaware of the stats provided by Google Webmaster Tools. I was going by what two web developers told me after testing my site, and by Alexa, which says 87% of sites are slower than mine.

Go figure.

I will ask my web developer to see what else he can do.

Tedster, thanks once again for your insights and help.

tedster




msg:4312434
 5:08 pm on May 14, 2011 (gmt 0)

He can do a lot better than looking up your site on Alexa. He can install the YSlow or Google Page Speed tools and run the tests himself. There are also free online tools that will run both for you and give you very thorough reports, including "waterfall graphs" of each HTTP request and how much time it adds.

Site Speed is an entire country, and an Alexa report is just a tiny, tiny entry port to that country.

shallow




msg:4312457
 6:58 pm on May 14, 2011 (gmt 0)

Sorry for the misunderstanding. He did not use Alexa; I did. I don't know what methods of testing he used. Whatever the case, there is obviously room for improvement, and I have faith that he will be able to eke more speed out of the site.

It is graphics-intensive and uses NextGEN Gallery, which I've recently read may slow things down. I've read about some WordPress solutions too.

shallow




msg:4312470
 7:32 pm on May 14, 2011 (gmt 0)

I just tried a few pages using Site Speed.

The heaviest page on my site contains several large JPEGs, AdSense ads and an embedded YouTube video. It loaded in 1.62 seconds.


The home page loaded in 0.36 seconds. Along with text, it has one large image and many thumbnail images. A list of recent comments, a most-commented list, and a NextGEN gallery run along the bottom of the home page (and every page in the site).

All other randomly tested pages (about 15) load in between 0.04 and 0.44 seconds.

If Site Speed is correct, then I'm dumbfounded why Google says that, on average, pages in the site take 5.0 seconds to load.

They must use a different method to calculate speed?

dataguy




msg:4312502
 8:40 pm on May 14, 2011 (gmt 0)

Google uses toolbar and Chrome data to measure page load time, so if your site has a high ratio of people visiting with slow Internet connections, your page load time, according to Google, will be slower.

I've prided myself on having a site that Webmaster Tools reported as being in the 90-95th percentile for speed. Since Panda I've lost 65% of my traffic, and WMT now shows my site in the 57th percentile. I still can't figure out how this can be accurate.

Broadway




msg:4312517
 9:29 pm on May 14, 2011 (gmt 0)

As part of my post-Panda website remake, I've improved my website's speed by a third (although it's still only at the 50th percentile of all websites).

I'm still in the Panda dog house.

koan




msg:4312521
 10:00 pm on May 14, 2011 (gmt 0)

If you have a lot of visitors from developing countries, your web site's performance may be slower for those with limited bandwidth, processing power or older software. If they use the Google Toolbar, that may skew the results. I know I have sites that average (off the top of my head) 4-5 second loads, but if I myself refresh a page with the cache disabled, it's always around 1-2 seconds tops. So Google's numbers are pretty unscientific.

Sgt_Kickaxe




msg:4312522
 10:00 pm on May 14, 2011 (gmt 0)

I think "Panda side effects" might be coming into play for some lucky (unlucky?) sites. When a site owner, such as yourself, is hit by the bear they make changes which may be helping them rank better and possibly at your expense.

I'm seeing one particular site writing articles with the intentional goal of taking my site's rankings. The site is writing a series of very similar articles with very similar titles to try and "blanket" the sought-after keyword. Like a machine gun they keep going and going... think "big blue widgets", then "blue widgets big", then "widgets big and blue", etc. Just what the internet needs...

shallow




msg:4312651
 1:21 pm on May 15, 2011 (gmt 0)

Google uses toolbar and Chrome data to measure page load time, so if your site has a high ratio of people visiting with slow Internet connections, your page load time, according to Google, will be slower.


Is a site like Site Speed a reliable way to measure page load time? And, if so, why are the page load times so much faster than what Google is telling me?

If Google measures site speed based on the Internet connections used by site visitors, well, there is not much I can do about that.

So what are some of the ways to increase site speed in the eyes of Google?

dickbaker




msg:4312659
 2:02 pm on May 15, 2011 (gmt 0)

So what are some of the ways to increase site speed in the eyes of Google?


Google lays out suggestions in Webmaster Tools under the site speed heading. It's a pretty comprehensive list.

My site speed according to Google is faster than 68-72% of all sites. According to the speed tool in Firefox, it scores 85-90; Google's tool puts it at 90.

I haven't changed anything recently that would affect site speed. I think what's happened is that the considerably lower traffic after Panda has resulted in less demand on the server, so pages load a bit faster.

pageoneresults




msg:4312674
 3:03 pm on May 15, 2011 (gmt 0)

If Site Speed is correct, then I'm dumbfounded why Google says that, on average, pages in the site take 5.0 seconds to load.


From Google...

Site Performance shows Page Speed suggestions based on content served to Googlebot (as opposed to a regular user's browser).


More...

Site Performance attempts to show you the best estimate of the page load time. It often represents an aggregate of thousands of data points, collected from all around the world, over various network connections, browsers and computer configurations. It's quite possible that any one user might experience your site significantly faster or slower than this aggregate. Site Performance data works best when it has lots of data points to aggregate over. If your site is small, or doesn't attract a lot of traffic, the results may be slightly skewed.


Page Speed suggestions as shown in Site Performance are based on the version of your page as seen by Googlebot, Google's crawler. For various reasons—for example, if your robots.txt file blocks Googlebot from crawling CSS or other embedded content— these may differ slightly from the suggestions you get when you run the Page Speed extension for Firefox.


Webmaster Tools › Help Articles › Using Webmaster Tools › Labs › Site Performance
[Google.com...]

Site speed is only a minor ranking factor at this time.


I'm going to disagree. I'd say it is in the top 5 list. In fact, it should be the first thing that is addressed. It's kind of difficult to do all the other things required if the site is not performing at its best.

shallow




msg:4312726
 5:49 pm on May 15, 2011 (gmt 0)

Thanks, Pageone. I tried Page Speed Online because I don't know how to add the Page Speed browser extensions. I'm going to forward the info to my web developer and ask him to continue optimizing the site for speed.

[pagespeed.googlelabs.com...]

I do believe you are absolutely right that site speed is in the top five list!

pageoneresults




msg:4312741
 6:42 pm on May 15, 2011 (gmt 0)

I had this really long, well-thought-out technical reply and just nixed it. Why? Because it really does get technical, and some of this stuff is still over my head. I'm learning every day.

I do know that HTTP Requests are a key factor in the equation. Not only that, but how many Round Trips are made to the server for each HTTP Request.

One of the commonalities amongst many of the sites I saw as examples that were affected by Panda was document performance. A large percentage of them were downright abusive, making 300-400+ HTTP Requests and weighing in at 2-5MB per document. Some of those requests were cached; many were not.

Now, to counter that, sites that improved also "appear" to be abusive with the requests. But, if you compare the technologies being used and the performance of the server networks involved, certain sites can serve larger documents more efficiently than others.

One of the largest contributing factors to the sheer volume of HTTP Requests came from image references within CSS files. It appears that many designers are not using CSS Sprites and will gladly serve up 250 to 400 images via CSS. That's not going to work in your favor from a performance perspective.

I looked at the makeup of those images for one site. There were 225 being referenced via CSS. They could have been combined into 3 or 4 CSS Sprites, reducing the number of HTTP Requests from 225 to 3-4. That's a major improvement in document performance. Yes, the CSS Sprite images themselves can get rather large, but if planned properly, they can still be served at a reasonable file size. You really have to focus on optimizing your PNG images in this scenario. Every single byte counts.
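
To picture what that looks like in code, here's a minimal sketch of the technique (the file name, class names and offsets are hypothetical): one combined image is fetched once, and each icon is cropped out of it with background-position.

    /* one HTTP Request for the combined image instead of dozens */
    .icon {
        background-image: url(/images/sprite.png); /* hypothetical combined file */
        background-repeat: no-repeat;
        width: 16px;
        height: 16px;
    }
    /* each class picks its icon out of the sprite by offset */
    .icon-home   { background-position: 0 0; }
    .icon-search { background-position: -16px 0; }
    .icon-rss    { background-position: -32px 0; }

In the markup you then use both classes together, e.g. <span class="icon icon-home"></span>.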

Popular web sites spend between 5% and 38% of the time downloading the HTML document. The other 62% to 95% of the time is spent making HTTP requests to fetch all the components in that HTML document (i.e. images, scripts, and stylesheets). The impact of having many components in the page is exacerbated by the fact that browsers download only two or four components in parallel per hostname, depending on the HTTP version of the response and the user's browser.


CSS Sprites: What They Are, Why They’re Cool, and How To Use Them
[CSS-Tricks.com...]

I found the above site to be quite informative when it comes to CSS Sprites. Easy to understand too.

dataguy




msg:4312760
 7:18 pm on May 15, 2011 (gmt 0)

I highly recommend converting to sprites. It seems this is one of the easiest things to do and gives the most bang for the buck.

dataguy




msg:4312765
 7:33 pm on May 15, 2011 (gmt 0)

Here's another theory:

Site speed may be one of the top 5 ranking criteria, but probably more indirectly than one might think. For years (the first I recall is Pubcon New Orleans, 2005) Google representatives have told stories about how site speed is incredibly important to the end-user experience. So much so that it overshadows other aspects of a website. Many of the tests they perform for other quality factors using human judges are disproportionately affected by site speed: great sites are often rated low quality because they are slow, and bad sites are often rated high quality because they are fast. Anyone who follows the search team's engineers has heard these stories.

My theory is that they had another 'aha!' moment while working on Panda when they realized that shiny websites are rated disproportionately higher by human judges, too. This is backed up by Matt Cutts suggesting that every website should look like an Apple product, and by some SEOs finding that sites with divs containing background images have generally fared better with Panda than sites without. Background images generally give beveled edges, rounded corners and smooth shadows, just like Apple products.

At that 2005 Pubcon, a search team engineer told me that if he were me, he'd invest in the fastest servers and the fattest pipe I could afford. His advice increased my revenue by $300/day.

Today, next to fundamental on-page SEO like removing stubs and making sure your content is unique and compelling, I think the search team would advise hiring the best graphic artist we can afford to design the coolest website possible. That's what I'm doing.

shallow




msg:4312781
 8:37 pm on May 15, 2011 (gmt 0)

I read the information about sprites and SpriteMe. It doesn't sound like something I can do myself because I don't know CSS. I'll have to ask my web developer to do that too.

I had my site moved to a Virtual Private Server last March, and I thought that was supposed to improve things. It did. Traffic grew, income improved. Then, boom: Panda.

Slowly but surely I'm learning to accept what has happened. I must cut back expenses a lot, but I will spend money on improving site speed.

advise hiring the best graphic artist we can afford to design the coolest website possible.


I had planned to do that this year but can no longer afford it.

shallow




msg:4313614
 1:26 pm on May 17, 2011 (gmt 0)

My site has sped up according to Google:

May ? - 8.6 seconds to load; slower than 91% of sites
May 7th - 7.1 seconds to load; slower than 86% of sites
May 10th - 5.0 seconds to load; slower than 76% of sites
May 14th - 4.9 seconds to load; slower than 74% of sites
May 15th - 4.6 seconds to load; slower than 71% of sites

Nothing has been done to the back end of the site yet, but I've been redirecting many crawl errors (404 Not Found). There were about 1,500 when I first started and now there are "only" 1,030.

Am I right that all the redirects have been a factor in Google rating my site slow?

I've been redirecting URLs with 10 or more errors. So far I've manually redirected about 240.

But I'm confused, because I've read that a site shouldn't have too many redirects. But how many is too many?

Some of the errors go back years, when the site was managed with FrontPage (html) and then Coranto (php).
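
For reference, redirects like these are usually plain 301 rules in .htaccess on an Apache server. A minimal sketch, with made-up paths standing in for the old FrontPage and Coranto URLs:

    # send an old FrontPage-era .html page to its current address
    Redirect 301 /news/old-article.html /articles/new-article/

    # map an old Coranto viewnews.php?id=NNN URL and drop the query string
    RewriteEngine On
    RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
    RewriteRule ^viewnews\.php$ /news/%1/? [R=301,L]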

pageoneresults




msg:4313631
 2:05 pm on May 17, 2011 (gmt 0)

Nothing has been done to the back end of the site yet, but I've been redirecting many crawl errors (404 Not Found). There were about 1,500 when I first started and now there are "only" 1,030.


Site Performance shows Page Speed suggestions based on content served to Googlebot (as opposed to a regular user's browser).


Am I right that all the redirects have been a factor in Google rating my site slow?


Yes, from a crawling perspective.

But I'm confused, because I've read that a site shouldn't have too many redirects. But how many is too many?


The site shouldn't have any hard-coded internal redirects. Redirects to capture potential 404s, etc. don't count.

Some of the errors go back years, when the site was managed with FrontPage (html) and then Coranto (php).


There's a good chance you'll see some improvements, but that is just my personal experience based on correcting errors over the years. 70%+ of most webmaster woes can be attributed to technical errors somewhere in the pipeline.

Edited to correct incorrect information. See tedster's next reply. He busted me. ;)

[edited by: pageoneresults at 2:52 pm (utc) on May 17, 2011]

tedster




msg:4313648
 2:33 pm on May 17, 2011 (gmt 0)

Site Performance is based on Googlebot's crawl times

Sorry, PageOne - not true. Site Performance is based on data from browsers with the Google Toolbar installed. It has to be, because it's "the time it takes to load in a browser" and a crawler does not render the page; it just downloads the HTML source code.

[edited by: tedster at 3:26 pm (utc) on May 17, 2011]

scooterdude




msg:4313667
 3:08 pm on May 17, 2011 (gmt 0)

I buy dataguy's theory.

Now all I need is that increase in my income :)

reblaus




msg:4313668
 3:10 pm on May 17, 2011 (gmt 0)

Did anybody try the new Site Speed feature in Google Analytics that Google launched about a week ago?
I get weird results: several of the pages take something like 200 seconds according to it, while similar pages are listed at under 10 seconds.
If Google takes site speed into account, it would be good to know what information they use. In WMT the average site speed is much lower, and closer to my local tests, than what I see in Google Analytics.

Do you also see strange long load times with your pages in Google Analytics?
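
For anyone who hasn't turned it on: at launch, the Analytics Site Speed report only recorded timings from pages that opted in with one extra call in the async ga.js snippet, something like this (placeholder account ID):

    var _gaq = _gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXXX-1']); // placeholder account ID
    _gaq.push(['_trackPageview']);
    // opts this page into Site Speed timing collection
    _gaq.push(['_trackPageLoadTime']);

It also only samples a fraction of pageviews, so lightly trafficked pages can show very noisy numbers.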

johnmoose




msg:4313682
 3:47 pm on May 17, 2011 (gmt 0)

Yeah, I did see that too, including the weird page load times.

pageoneresults




msg:4313683
 3:47 pm on May 17, 2011 (gmt 0)

Site Performance is based on data from browsers with the Google Toolbar installed. It has to be, because it's "the time it takes to load in a browser" and a crawler does not render the page; it just downloads the HTML source code.


My mistake, I've edited that reply and removed that portion.

Page Speed suggestions as shown in Site Performance are based on the version of your page as seen by Googlebot, Google's crawler. For various reasons—for example, if your robots.txt file blocks Googlebot from crawling CSS or other embedded content— these may differ slightly from the suggestions you get when you run the Page Speed extension for Firefox.


A crawler does not render the page.


Argh! I always get busted when discussing this; there's even a topic I started years ago verifying this language. :)

How Do Search Engine Robots Work?
Jan 3, 2007 - [WebmasterWorld.com...]

I'll be back later with more information. I believe Googlebot Crawl Times and Site Performance are directly related.

londrum




msg:4313704
 4:29 pm on May 17, 2011 (gmt 0)

When Googlebot can't reach pages because they hang, or whatever, I'm sure that gets incorporated into site speed, because the user wouldn't be able to reach them either.
E.g. if the user returns a time of 3 secs for page load, but Googlebot finds the page unavailable 5 times out of 10, they would be nuts to go ahead with the value of 3 secs. So Googlebot must have some kind of effect on the site speed.

Some sites have millions of pages too... and a fair amount of them will only be visited once a week, I reckon, if that.
E.g. how many people will search Amazon for an obscure book that's been out of print for 10 years? Not many. But Googlebot will still crawl it.
If Google has no data from users, it makes sense that they would estimate site speed by other means.

tedster




msg:4313706
 4:31 pm on May 17, 2011 (gmt 0)

And I am certain there is a correlation, but not a direct relationship - not a cause. I can easily code a very fast HTML page that takes a very long time to actually render. All it takes is a couple of big images, or a complex <canvas> element.

And if your site doesn't make intelligent use of the browser cache, then no matter how well optimized you are for a bot crawl you will not achieve good Site Performance stats.

tedster




msg:4313707
 4:33 pm on May 17, 2011 (gmt 0)

if the user returns a time of 3 secs for page load, but Googlebot finds the page unavailable 5 times out of 10, they would be nuts to go ahead with the value of 3 secs.

I'm also sure it's not ONLY the browser data, but rather some complex compilation that is mostly based on the browser/toolbar.

shallow




msg:4313730
 5:34 pm on May 17, 2011 (gmt 0)

My web developer is going to cut down the number of HTTP requests by combining as many of the JS and CSS files as possible and by combining as many of the images as possible using CSS sprites. He's also going to try another form of lazy loading for images.

I'm going to continue checking crawl errors weekly and redirect those as mentioned above.

Don't know if any of this will help take a nibble out of the effect of Panda but, as has been suggested, at the very least it will give site visitors a better experience.

tedster




msg:4313733
 5:50 pm on May 17, 2011 (gmt 0)

Look at the image files themselves, too. Many graphics workers don't want to compress JPEGs below 60% quality in Photoshop, but 40% is just fine with today's compression algorithms. Similarly, browser resizing of larger images is way too common, as is using a 24-bit PNG when you only need 8-bit, and so on.
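
If the re-compression is being scripted rather than done by hand in Photoshop, ImageMagick can do the same thing from the command line; a one-line sketch with made-up file names:

    # re-save a JPEG at 40% quality
    convert photo-original.jpg -quality 40 photo-web.jpg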

Gzip compression for text assets can be a biggie, too. HTML, JavaScript and CSS can all be zipped up for a good bit of savings.

Then there are the browser cache issues I mentioned in passing above: Cache-Control, ETags, all that good stuff. If images can't be cached, you just took a big performance hit. I would check into all of the above before worrying about CSS sprites or the number of HTTP requests.
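
Both the gzip and the caching pieces are usually just a few lines of Apache config. A sketch, assuming mod_deflate and mod_expires are available (not a complete setup):

    # gzip text assets
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css application/javascript text/javascript
    </IfModule>

    # let browsers cache static files instead of re-fetching them
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/jpeg "access plus 1 month"
        ExpiresByType image/png  "access plus 1 month"
        ExpiresByType text/css   "access plus 1 week"
    </IfModule>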

Google introduced an Apache module called mod_pagespeed that will automatically handle many common issues. It's well worth testing out if you're on Apache.
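
Once the module is installed, enabling it is only a few directives; a sketch (the .so path varies by system, and these are just two of the stock filters):

    LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so
    ModPagespeed on
    # e.g. merge CSS files and extend cache lifetimes
    ModPagespeedEnableFilters combine_css,extend_cache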
