|Yes, Google's own study of its site showed differences when there was a faster site, but they were so small. The only reason they did anything is because of scale. Unless your site gets about as much traffic as Google, it could be a huge waste of time and money to try faster hosting now. |
Pretty sure that studies have shown that increasing page speed increases conversions. So faster hosting or faster whatever makes a difference to your bottom line even without a change in ranking. You should speed up your site no matter what Google says.
What seems to be missing here is: it's not the server so much as the punter.
IF the speed info is being collected by GTB (the Google Toolbar) then there are SO many factors that can influence the actual download speed, including:
1. Several downloads going on simultaneously. E.g. I AND my wife both loading pages at the same time - happens a lot - and some of her files can be large MOV files (a dozen or more today) and some of mine are zips. At the same time I might be running something on a third computer, and I run a backup mail server locally on a fourth machine that processes quite a bit of spam. Etc. What we don't do is run stupid GTB, but if we did... A lot of people who view TV programmes online or download videos, music etc. probably DO have GTB installed. And probably a load of other rubbish such as Alexa, FunWeb and similar report-home tools. What speed do they get when looking at a mere web page from the other side of the world?
2. A slow ISP (some people in the UK are still running 512K broadband download, some are STILL using dial-up!). How does G figure that into the equation? Even with "fast" broadband, it's well known that in the UK the actual speed is far slower than the ISP states. Contention ratios also apply: 10% of the users on MY contention may be downloading videos or watching TV: bang goes my own speed.
3. At one time some broadband caps on "cheap" lines were in the nature of a slow-down rather than complete turn-off. Not sure if that still applies but another factor if true.
4. In the UK we ALMOST got clobbered with Phorm's DPI by BT and potentially Virgin (BT did run it illegally for a while). If that ever gets off the ground it will slow downloads, possibly seriously (it did in the BT trial). Similar thing happened in the US (Nebuad). Also, the UK Gov's attempt to ingratiate themselves with media companies by running DPI on downloads for "piracy detection" (currently, I think, in limbo pending the election).
I'm sure others can think of similar reasons why downloads could be slow. I'm sure the list above applies to a lot of users. So is it any wonder that G's WMT shows longer delays than the sites' owners can prove?
Site content is (or should be) the most important thing. If I really want to see something I will tolerate a half-second or so extra download time - the browser may well take a week to render the downloaded page anyway (well, a long time anyway: this happens on a lot of GOOD sites I visit)! If I'm not too fussed about which site I view and the time delay becomes too annoying then I'll dump the site and try another.
I personally don't care one bit about G's own speed problem nor how they are perceived by their "customers". My HOPE is that so many good sites fail to show up on G due to their recent round of anti-website activity that people desert G in droves.
No idea where they'd go - Bing will doubtless follow suit soon. Perhaps Cuil? Or better still a meta engine that gets the best from a dozen SEs.
Great post OnlineContent... I sooooo wish I could post my affiliate link, because I'd be stinkin' rich after the plug I just gave the cloud host I'm referring to, but since I already gave the farm away I'll have to just hope people who know how visit my site and leave it at: Personally, I definitely recommend against dedicated moving forward, especially so if you're concerned about speed, but as always, DYODD.
With site speed being so important, what about sending the 304 Not Modified response for pages that aren't updated? Will that make Google view the site as super fast?
Whenever Google fetches an existing page that has not been modified, which is often the case on most sites, it should just receive the header-only "304 Not Modified" response.
This happens by default with static HTML pages, but the problem is with almost all CMSes, including WordPress, which never respond correctly to this.
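The conditional-GET decision those posts describe can be sketched as a small helper. This is a minimal sketch in Python; the function name and dates are illustrative, not from WordPress or any particular CMS. The server compares the client's If-Modified-Since header against the page's real last-modified time and replies 304 (headers only, no body) when the cached copy is current.

```python
from email.utils import parsedate_to_datetime, format_datetime
from datetime import datetime, timezone

def should_send_304(if_modified_since, last_modified):
    """Return True if the client's cached copy (per its If-Modified-Since
    header) is at least as new as the resource's last-modified time."""
    if not if_modified_since:
        return False  # no conditional header: send the full 200 response
    try:
        cached = parsedate_to_datetime(if_modified_since)
    except (TypeError, ValueError):
        return False  # unparseable header: fall back to a full response
    return last_modified <= cached

# Example: page last changed 1 May 2010, client cached it 2 May 2010
last_mod = datetime(2010, 5, 1, 12, 0, tzinfo=timezone.utc)
header = format_datetime(datetime(2010, 5, 2, 12, 0, tzinfo=timezone.utc),
                         usegmt=True)
print(should_send_304(header, last_mod))  # → True: reply "304 Not Modified"
```

A CMS that can look up a page's last update time could run exactly this check before rendering anything, skipping the whole page build when a 304 is enough.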
If not directly, returning a 304 correctly could definitely benefit a site's speed indirectly. Even if GBot does not send a conditional GET, most browsers do; and even if G did not factor a 304 response into its speed calculation, lightening the load on your server with a 304, when applicable, by default speeds up the serving of the full content to those who do not already have the most recent version in their cache...
IOW: Yes, it's probably a good idea. I would not recommend doing it all the time for all visitors unless you can serve it correctly (meaning they have the most recent version), but if you can find when the page was last updated or what the most recent version of the page is, then it would probably be a good hack to put in place. I don't work with WP, and it might be a fairly extensive change on some CMS systems, but I would probably do it, because if even 10% of your visitors have the most recent version of everything, then it would seem to lighten the load on your server quite a bit.
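A minimal WSGI sketch of that hack, assuming the CMS can tell you when a page was last updated. The `make_app` factory and its arguments are hypothetical names for illustration, not a real CMS API: the app serves the full page normally, but answers a conditional GET with a header-only 304 whenever the client's cached copy is current.

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

def make_app(last_modified, body):
    """Tiny WSGI app factory: serves `body`, but answers a conditional GET
    with "304 Not Modified" when the client's cached copy is current.
    `last_modified` is the page's real update time (assumed to be known)."""
    lm_header = format_datetime(last_modified, usegmt=True)

    def app(environ, start_response):
        ims = environ.get("HTTP_IF_MODIFIED_SINCE")
        if ims:
            try:
                if last_modified <= parsedate_to_datetime(ims):
                    # Client copy is fresh: headers only, no body to build.
                    start_response("304 Not Modified",
                                   [("Last-Modified", lm_header)])
                    return [b""]
            except (TypeError, ValueError):
                pass  # unparseable header: fall back to a full response
        start_response("200 OK", [("Content-Type", "text/html"),
                                  ("Last-Modified", lm_header)])
        return [body]

    return app
```

Always sending Last-Modified on the 200 response is what lets the next visit be conditional in the first place; requests with no header, or a stale one, still get the full page.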
Google has been requesting that sites send proper 304 responses for quite a while. We really don't know if it will ever be a ranking factor (my guess is probably not) but it is still a darned good practice.
|With site speed being so important... |
Nobody has any proof that it's so important. Didn't witch hunting start like this?
I agree with Wheel. Speed up your site irrespective of Google. Faster sites are primarily a better experience for the user.
There is still such a thing as site appearance. Sometimes, the look of the site can be as important as its content to visitors. Making images tiny so they load fast (and can't be viewed easily by visitors) won't encourage them to stay on your site.
What was it Google used to tell us? Build your site for your visitors, not for Google?
I agree with you about appearance AndyA, it's very important IMO. The trick is to balance looks and speed. If G says to build sites for users, then we'd need to know whether users prefer speed or looks. Is Google's logic that a better site for users is a faster site, and therefore faster sites should rank better? I personally prefer speed over looks in general, but this whole subject is maybe for another thread...
|Making images tiny so they load fast (and can't be viewed easily by visitors) won't encourage them to stay on your site. |
Sometimes a good image will cause someone to create a link to one of your pages, especially if you have hotlink protection. If you reduce the size of your images, you might lose some backlinks that you would have otherwise gotten.
I try to keep larger images on inner pages and not on landing pages. I'm sure we have all gotten bored waiting for a page to load and just gone elsewhere, same as flash sites that take forever to start displaying. Unless the page is a unique resource I am out of there after 15 or 20 seconds or even less if it looks like it is doing nothing at all.
I've been working on speed/performance issues and something I've noticed that others will likely want to keep in mind when making their changes:
A faster site doesn't necessarily mean better visitor retention as suggested elsewhere. I've trimmed my design and cut my page loading time nearly in half, results? Pages/visitor and time on site rates have dropped a smidge for me and daily new subscription rates have dropped as well (the biggest bummer).
It's nice having a faster loading site but some of the bling I've removed is hurting my visitor retention rates (the bling obviously did its job).
There are a few small tweaks I can make yet to trim the bloat but the biggest speed booster will be to start removing images or reduce their sizes...some pages have several images for detailed articles. I'm not going there, they're necessary or helpful for readers and if a site takes a second longer to load, so be it.
Another option I could try is splitting longer articles or articles with lots of comments into more and more pages for visitors to click over to and read. Not something I relish doing since the setup I have now is more user friendly.
If however, it's clear that fast loading text-only sites with articles spread out over 2, 3 and 4 clicks are winning the race to the top in google, I'll have to reconsider options.