| 12:37 am on Apr 12, 2010 (gmt 0)|
Here's a speed tip not in the other thread when thinking about shared v dedicated... Think about how fast a shared box (cluster) has to be at its core, and what type of resources it has to have available, for users (and revenue) to be maximized; then think of a way to take advantage of those resources without hogging them...
I'd wager there are ways on most websites to take better advantage of the resources available on a shared box than most people think (it's definitely true for database access, IMO). You may even be able to do things to the extent that a shared box serves your site faster than most 'basic' dedicated boxes, as long as the shared server is not overloaded.
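For example (my own sketch, not anything from the post above): one way to go easier on a shared box's most contended resource, the database, is to cache repeated query results in the application layer. A minimal illustration in Python, with made-up function names and an arbitrary TTL:

```python
import time

_cache = {}

def cached_query(sql, fetch, ttl=60):
    """Serve repeated identical queries from memory for `ttl` seconds
    instead of hitting the shared database server every time.
    `fetch` is whatever function actually runs the query."""
    now = time.time()
    hit = _cache.get(sql)
    if hit is not None and now - hit[0] < ttl:
        return hit[1]          # cache hit: no database round trip
    rows = fetch(sql)          # cache miss: run the real query
    _cache[sql] = (now, rows)
    return rows
```

On a shared box this trades a little memory for far fewer trips to the contended database server, which is usually the bottleneck there.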
| 2:46 am on Apr 12, 2010 (gmt 0)|
|Google has started to include site speed in web search rankings. |
They better measure site speed without AdSense, then. And I suspect it's the same for Google Analytics (I don't use that product).
Google scripts such as AdSense make pages s-l-o-w as noted in this thread [webmasterworld.com]
For Google to use page speed in ranking when their own scripts are as heavy as lead seems ironic at best, and hypocritical at worst.
| 3:23 am on Apr 12, 2010 (gmt 0)|
Related to Google Analytics, I mentioned above the new Asynchronous Tracking Code [google.com] version that became available to all GA users last December. Sounds like a good idea to me.
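For readers who haven't come across it, the asynchronous snippet Google documented at the time looked roughly like this (the `UA-XXXXX-X` account ID is a placeholder). Because the script element is created with `async` set, ga.js downloads without blocking the rest of the page:

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);
  _gaq.push(['_trackPageview']);
  (function() {
    // Create the ga.js script element with async set, so it does
    // not block parsing and rendering of the rest of the page.
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```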
| 12:29 pm on Apr 12, 2010 (gmt 0)|
I wish they would take code validation into account.
| 2:14 pm on Apr 12, 2010 (gmt 0)|
A lot of .gov and .edu sites haven't been SEO'd, and as a result are poorly-designed and slow.
| 3:28 pm on Apr 12, 2010 (gmt 0)|
You're telling me, with the experience you have, that it takes WMT to tell you how slow a site might be?
No. I am saying that client only woke up and began to smell the coffee after Google told him about the problem. Before that he was like "so what, I type example.com in my browser and the site appears. No problem. There's nothing to fix."
Now that Google says the site is in the lowest 7%, he's changed his tune. The site is a mess of PHP scripts and cobbled-together bits and pieces, but re-organising the code should shave a chunk or two off the times.
Best tools for the job are YSlow (a Firebug extension) and the timeline viewer (in the hidden 'developer' section) in Safari.
| 3:30 pm on Apr 12, 2010 (gmt 0)|
The only thing that's left to add is automation.
| 3:32 pm on Apr 12, 2010 (gmt 0)|
I don't like this system at all: it doesn't really measure site SPEED so much as page weight. Even if you have an excellent host with broad bandwidth and fast ping, if your site is rich in images you will get penalized and rated 'poor performance' because the system thinks your page is too heavy.
So even if you have a lightning-fast host, if your frontpage navigation elements take a little while to load entirely (without creating a negative experience for the user), and even if those elements get cached and load instantly as the user moves from page to page, you can still be penalized as a "slow site".
| 4:01 pm on Apr 12, 2010 (gmt 0)|
We've been tweaking our speed as well since the announcement from Google at Pubcon about this. We are now in the top 40% in regards to speed... so hopefully that's acceptable, because as others have stated, all we have left to drop is Google's scripts themselves... =)
| 4:23 pm on Apr 12, 2010 (gmt 0)|
|Though I am from Texas, I have a Missouri mentality. I'll believe it when I see it. |
It's been in place since mid-January...
What are you seeing WRT how your pages are ranking? ;) LOL
The point is, there is really nothing definitive yet about how (or even where) they're using it, so most of this thread is about what could happen and is simply speculation... They could start using keywords again at any time, but I don't see too many people suggesting huge changes there.
There are a number of reasons it may only be affecting 1% of searches in the US, and one I have not seen mentioned (I hope I don't cause panic, because this is speculation too) is it may only be applied on one dataset that is not routed to very often for testing...
| 5:50 pm on Apr 12, 2010 (gmt 0)|
I do not know if this is appropriate but I have been discussing a tip to speed up a site here:
| 5:52 pm on Apr 12, 2010 (gmt 0)|
BillyS and others - I wouldn't necessarily worry about the speed of your site versus Google's declared overall averages.
I'd worry about being faster than your competitors.
You really only have to outperform them - different types of sites will naturally have different average site speeds. For instance, travel industry sites that have to poll current databases from many airlines typically have extremely slow page speeds. If you were in that industry, I think you just need to be faster than your competitors to achieve ranking benefit.
| 6:00 pm on Apr 12, 2010 (gmt 0)|
The point I want to keep hammering home is that Google confesses in WMT that the measurement tool is of low accuracy. Duh, why use it to begin with?
Secondly this 1 percent figure that keeps getting bandied around as a small figure is a whole lot of searches. Likely millions every day.
Thirdly, these 200 other factors Google takes into account don't mean squat if site speed becomes a primary figure in site evaluation, as it did in AdWords.
| 6:34 pm on Apr 12, 2010 (gmt 0)|
|Duh, why use it then to begin with. |
Does it matter why? They are...
|Likely millions every day. |
It's reported as < 1%, so any guess as to the actual number is purely speculative...
|Thirdly these 200 other factors Google takes into account don't mean squat if site speed becomes a primary figure in site evaluation as it did in Adwords. |
That's true of any factor, isn't it? None of the other factors matter if any one becomes the predominant factor, and IMO page speed is gaining PR-type hype (and mythology) only weeks after its introduction, without a single post I've seen reporting a definitive change in the results a page is ranked for...
Fascinating, isn't it? We don't even know how or where it will be used, and it's already being touted as 'the next big thing' and alluded to as a factor which will trump relevance, PR, TR, and all other factors combined (even though G states the opposite, if I remember correctly).
It's only been around for weeks and only affects < 1% of queries, yet IMO it's well on its way to being a 'legendary' and 'critical' factor in SEO lore, without a single bit of evidence having been produced (or reported, that I've seen anyway) to support the supposition...
Can anyone offer a single case or a single bit of evidence of a change in the rankings of a page or site which can be directly attributed to speed?
| 8:28 pm on Apr 12, 2010 (gmt 0)|
|Related to Google Analytics, I mentioned above the new Asynchronous Tracking Code [google.com] version that became available to all GA users last December. Sounds like a good idea to me. |
Okay, so Matt talks about speed later this year. It happens to be incorporated sooner. But we all knew what he said - right?
So I went looking and found nearly a second of savings so far (from 4.3 s to 3.4 s). I put the Asynchronous Tracking Code in over the weekend and cleaned up some unused CSS.
|BillyS and others - I wouldn't necessarily worry about the speed of your site versus Google's declared overall averages. |
Silvery - you're right on with that comment. I look at competition, not averages.
| 9:46 pm on Apr 12, 2010 (gmt 0)|
Does the page load time include AJAX calls that fire when the document onload event triggers? Anybody know?
| 10:55 pm on Apr 12, 2010 (gmt 0)|
I did some checking on pages from my site versus the competitors pages that outrank mine.
My pages run about 50-80K. The competitors who are ahead of me have pages that are 300K to as much as 900K.
I've noticed that one major competitor has been slipping in the rankings, and I thought it might be because of the size of their pages. However the sites that are now outranking that competitor (and me) have much heavier pages.
Should be interesting to see what happens.
| 11:33 pm on Apr 12, 2010 (gmt 0)|
I'd rather see good sites rank first than sites that load quickly. Speed doesn't matter if the content is what I'm looking for; speed only matters when I'm driving my V8.
| 11:42 pm on Apr 12, 2010 (gmt 0)|
I bet it's for contention in the top few spots on popular queries. A slow site that gets lots of visitors will tend to provide a poor visitor experience, and reflect poorly on Google for ranking them so highly.
| 1:17 am on Apr 13, 2010 (gmt 0)|
|I've mentioned this elsewhere... it is NOT about the "happy user" it is about google being happy with not having to wait for your server to send material to their bots! Do it quicker, they can do more without expending more resources (time really is money for google). |
I raised this issue months ago in supporters, talking about being hosted somewhere that had peer sharing with Google. I was told (tedster perhaps) quite specifically that Google used user data to measure the speed, not their bot speed.
However, I would note the following as being relative common sense:
1) Speed isn't the biggest ranking factor so don't get too bent unless your page is very slow.
2) speed is the easiest thing to fix
3) host somewhere that has peer sharing with whatever ISP your visitors use. In the US don't many people have broadband over cable? Then find a hosting provider that peer shares with AOL or roadrunner or whatever it is. Being physically close helps.
4) gzip your website (have apache compress the pages)
5) take the whitespace out of your site - that'll decrease your page size by 30%. I'm not even kidding.
6) take the whitespace out of your css too. big savings there.
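For item 4, assuming Apache 2.x with mod_deflate loaded, the usual form is a single directive (exactly which MIME types to compress is your call):

```apache
# Compress text responses on the fly (assumes mod_deflate is enabled)
AddOutputFilterByType DEFLATE text/html text/plain text/css text/javascript application/javascript application/x-javascript
```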
Do all that and I wouldn't worry about it beyond that.
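To put rough numbers on items 5 and 6, here is a hedged sketch (a crude minifier of my own invention, not production-grade) that strips inter-tag whitespace and then gzips the result to compare sizes:

```python
import gzip
import re

def strip_whitespace(markup):
    """Crude minifier: collapse whitespace between tags and squeeze
    runs of spaces/tabs. Real minifiers are more careful (e.g. they
    leave <pre> blocks and inline scripts alone)."""
    markup = re.sub(r'>\s+<', '><', markup)
    return re.sub(r'[ \t]{2,}', ' ', markup)

# A toy page: lots of indentation, lots of repetition.
page = ('<html>\n  <body>\n'
        + '    <p>Some widget text here.</p>\n' * 200
        + '  </body>\n</html>\n')
minified = strip_whitespace(page)
compressed = gzip.compress(minified.encode('utf-8'))

print(len(page), len(minified), len(compressed))
```

The savings from whitespace removal depend heavily on how your markup is indented; gzip typically dwarfs it, which is why item 4 comes first in the list.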
What Google's looking to hammer, I think, are my 100 spammier sites on my lovely $5-a-year shared hosting plan that's down on the first of every month and super slow the other 20 days. That's the crap they probably want to devalue - and that's fair enough.
| 2:40 am on Apr 13, 2010 (gmt 0)|
Ironically Google's maps and adsense are the slowest part of my site, according to their own tool(page speed).
| 3:36 am on Apr 13, 2010 (gmt 0)|
If a site has less content, it loads faster. Is that favored by Google when serving SERPs? It is difficult to understand G, whatever we discuss here. Every now and then their algo changes, and sometimes it goes against the user experience. We need other contenders to compete with them.
| 9:23 am on Apr 13, 2010 (gmt 0)|
I'm really hoping the site speed they are using in their algorithm is not the figure that shows up in Google Webmaster Tools, as this figure is so inaccurate. Also, if you have an upload facility on the website, the upload time is included in your overall website speed.
Generally though, I like the concept, as page loading time is part of the quality experience. I hate long pages and large flash content that I have to sit and wait to load.
| 9:42 am on Apr 13, 2010 (gmt 0)|
|Does the page load time include AJAX calls that fire when the document onload event triggers? |
Interesting question -- and potentially a bit sneaky thinking as well. You'd think that the onload event itself would be the end of the measurement, wouldn't you?
But we'd really need to be inside Google to give a definitive answer on that one, and I certainly am not. Might be an interesting thing to test, eh?
| 10:40 am on Apr 13, 2010 (gmt 0)|
> The main thing I want to get across is: don't panic... we still put much more weight on factors like relevance, topicality, reputation, value-add, etc. - all the factors that you probably think about all the time. Compared to those signals, site speed will carry much less weight.
Huge understatement, IMO.
Yes, Google's own study of its site showed differences when the site was faster, but they were small. The only reason they did anything is scale. Unless your site gets about as much traffic as Google, it could be a huge waste of time and money to chase faster hosting now.
One of the best ways to speed up your site is not faster hosting but making the ads load last.
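A sketch of one way to do that (the ad URL below is hypothetical, and real ad networks may have their own supported deferred-load mechanisms): inject the ad script only after onload fires, so it cannot block the main content:

```html
<script type="text/javascript">
  // Sketch only: defer a (hypothetical) ad script until the page has
  // loaded. Chaining any pre-existing onload handler is left out for
  // brevity; a real page should preserve it.
  window.onload = function() {
    var s = document.createElement('script');
    s.src = 'http://ads.example.com/show_ads.js';  // hypothetical URL
    document.getElementsByTagName('head')[0].appendChild(s);
  };
</script>
```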
Google could have made site speed a ranking factor a long time ago. It would have made much more sense ten years ago. Now its relevance is minuscule. I'd guess of the 400+ factors in the Google algo it's No. 401... and 0.00001% of ranking relevance.
| 12:16 pm on Apr 13, 2010 (gmt 0)|
As always, we can never know for sure, but my rankings started dropping a LOT two weeks ago. I didn't do anything to the site. So I wondered if my host had changed something or had been down. Nothing reported on webmaster tools... Hmmm.
So I go and read the news: WOW site speed is "1%" of ranking. Then I go and fix what was wrong. Site is "fast" now according to google. Bam, rankings back. That 1% sure made a 99% difference in my case....
| 12:38 pm on Apr 13, 2010 (gmt 0)|
Ranking sites on speed is like ranking footballers on how fast they can run: it doesn't tell you anything. I agree with the guy who said it's more about saving Google money. The quicker our sites, the quicker they can download them; and the smaller our pages, the less space they need to store them.
| 1:11 pm on Apr 13, 2010 (gmt 0)|
Rankings have gone up on a site I've been paying particular attention to. Standard stuff - link building, better internal linking using keywords, better directory structure, 301 redirects for defunct pages etc. Only 1 page has updated content once a month (not homepage). Webmaster Tools shows much improved relevance for our main keywords but we're still 50% slower than all the rest!
I think less speculation and more work based on what we know plus some common sense will help. Just make your sites a better experience for anyone and anything. Of course a faster site is better for everyone, bots included, but let's not lose sleep over it?
| 4:59 pm on Apr 13, 2010 (gmt 0)|
|As always, we can never know for sure, but my rankings started dropping a LOT two weeks ago. I didn't do anything to the site. So I wondered if my host had changed something or had been down. Nothing reported on webmaster tools... Hmmm. |
It's been in place for more than two weeks, so if you had dropped when it was initially installed, then I would see more of a direct relationship, but it could be there... My traffic has changed over the same two weeks too (for the better), and I didn't adjust anything, not even speed.
|So I go and read the news: WOW site speed is "1%" of ranking. Then I go and fix what was wrong. Site is "fast" now according to google. Bam, rankings back. That 1% sure made a 99% difference in my case.... |
Let me clarify for readers, because this is a mis-statement... (AFAIK)
It's a factor only affecting < 1% of queries.
It is not stated anywhere to be 1% of your ranking score.
If anyone has seen otherwise and an official Google statement has said it's 1% of the overall score of a site (or page), please let me (us) know and cite your source, because what I've read says it's a factor only affecting < 1% of queries. I have not seen the number of sites or pages affected in any way stated officially, nor seen a percentage of the overall score of a page (or site) officially stated by Google at all.
@ js2k9 ... I'm more trying to keep the mythology down than anything else. I thought it would be good to point out the difference between the wording you used and what the official sources I've seen are stating, because IMO Site Speed has already replaced PageRank as the most over-hyped and over-rated factor in the entire algo, and it's still a 'brand new' factor. Hopefully we can dispel some of the misconceptions and mis-statements now and let it gain infamy, legend, and mythological misnomers over time, as it should. ;) Besides, how do you think PageRank feels about being replaced as the 'gotta have it' factor by some brand new factor called speed? It's probably all bent out of shape...
| 6:55 pm on Apr 13, 2010 (gmt 0)|
|These suggestions are based on the Page Speed Firefox / Firebug plugin. In order to find the details for these sample URLs, we fetch the page and all its embedded resources with Googlebot. If we are not able to fetch all of embedded content with Googlebot, we may not be able to provide a complete analysis. |
When looking at flagged issues regarding common third-party code such as website analytics scripts, one factor that can also play a role is how wide-spread these scripts are on the web. If they are common across the web, chances are that the average user's browser will have already cached the DNS lookup and the content of the script. While these scripts will still be flagged as separate DNS lookups, in practice they might not play a strong role in the actual load time.
Site Performance in G's WMT [googlewebmastercentral.blogspot.com]
I thought it might be good for people to see what they have to say about site speed in their Webmaster Tools, since from some of the posts here it seems many people have not read how the scores are derived... IOW, they know the actual load time displayed is probably not accurate in all cases, but rather relative.
| 7:22 pm on Apr 13, 2010 (gmt 0)|
|Yes, Google's own study of its site showed differences when there was a faster site but it was so small. The only reason they did anything is because of scale. Unless your site gets about as much traffic as Google it could be a huge waste or time and money to now try faster hosting. |
Pretty sure studies have shown that increasing page speed increases conversions. So faster hosting, or faster whatever, makes a difference to your bottom line even without a change in ranking. You should speed up your site no matter what Google says.
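As a back-of-envelope illustration only (the ~1% of conversions per 100 ms figure is an often-cited number from industry testing, not from this thread, and linear scaling is purely an assumption):

```python
def conversion_uplift(ms_saved, pct_per_100ms=0.01):
    """Back-of-envelope only: assumes conversion rate scales linearly
    with latency, using an often-cited ~1% per 100 ms figure. Real
    effects are site-specific and not linear."""
    return (ms_saved / 100.0) * pct_per_100ms

# Shaving 0.9 s off load time, as BillyS reported above:
print('{:.1%}'.format(conversion_uplift(900)))  # prints 9.0%
```

Even if the real coefficient for your site is a fraction of that, the point stands: the speed work pays for itself regardless of any ranking change.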