|The load time is now slower than it was several days ago. |
I'm seeing the same thing in Webmaster Tools, even though my text and HTML are now "deflated" by 70+ percent compared with a few days ago, and text on pages is displaying perceptibly faster.
I just checked. My page load speed got 1/10 of a second faster on average in the last few days, with no help from me.
I may have missed it, but I don't get where they are getting the speed from.
I'm pretty sure that on a 56k dial-up connection, speeds would be a lot slower than on high-speed broadband.
So the speeds they use are on what kind of connection?
And is their connection actually related to anything near what the average surfer uses?
I think someone posted earlier that the speed is calculated from users' toolbar data, so it is an average from users all over the world on all kinds of connections. I also think it takes a few days for changes to filter through. At least that is what I found with crawl stats: about a five-day delay between reducing page sizes and seeing lower crawl times in WMT.
|I think someone posted earlier that the speed is calculated from users' toolbar data, so it is an average from users all over the world on all kinds of connections. |
I think this is what you mean, but I just wanted to ask: users access a page on a site over different connection speeds, and an average is calculated from all of those load times?
We have been looking further into an alarming load time of 14 seconds (listed as 8 seconds last Friday) on our outbound page.
From previous post "We have an offer page that sends users onto 1 of 3 other websites depending on what the user is looking for."
The page listed with that load time actually loads and passes the user on to the affiliate website in less than 1/10th of a second.
That third-party site displays a 'thank you' page while it assesses exactly where to send the user. The user is no longer under our control and the address bar shows a different URL.
Would I be right in thinking that until the user's toolbar registers a 200 OK response header from the penultimate website, the timer continues to run, logging the time against us?
I had a problem enabling gzip compression on one of my sites because the server does not have mod_gzip or mod_deflate on it. I put this in the top of each php page instead:
<?php if (isset($_SERVER['HTTP_ACCEPT_ENCODING']) && substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) ob_start("ob_gzhandler"); else ob_start(); ?>
This checks whether the browser can deal with gzip; if it can, the page is sent compressed, otherwise it is sent uncompressed. I tested it with GIDZipTest (just Bing for it) and it shows a 60% saving.
PS: Forgot to say, you can prepend this to each file via your .htaccess file. I'm just playing with this on one site to check for possible problems.
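For anyone wondering what that .htaccess prepend looks like, here is a minimal sketch, assuming the gzip check above is saved in a file called gzip_start.php (a hypothetical name) and that your host allows PHP directives in .htaccess:

# Run gzip_start.php before every PHP script on the site
# (the path is an assumption; adjust it to your own docroot)
php_value auto_prepend_file /path/to/gzip_start.php

That way you don't have to paste the snippet into the top of every page by hand.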
I was having some problems with image files being compressed by gzip on top of them already being JPEGs etc. In the end I am using the following code in my .htaccess file, which applies gzip compression via PHP only on files with the following extensions: htm|html|css|js|php. This seems to be working well and is showing around a 70% reduction in file size.
php_flag zlib.output_compression on
php_value zlib.output_compression_level 6
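The extension filtering mentioned above isn't shown in the two lines as posted; one way it might look is wrapped in a FilesMatch block, sketched here on the assumption that your host allows these directives in .htaccess (and note zlib compression only applies to files actually parsed by PHP, so .css/.js would also need to be routed through the PHP handler):

# Only apply PHP zlib compression to these extensions (sketch, not tested on every host)
<FilesMatch "\.(htm|html|css|js|php)$">
php_flag zlib.output_compression on
php_value zlib.output_compression_level 6
</FilesMatch>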
I agree @dstiles. A friend of mine's website was recently sandboxed/penalised. The reasons? Who knows. What it did mean was that he was paying Google around £500 per day in PPC to keep his business going. For 3 months!
My point is this: what's stopping Google from penalising whoever they want, whenever they want? My friend didn't agree with my point of view; however, I think it is too much of a cash generator for Google to overlook.
If Google considers 1.5 seconds to be a fast page, isn't this something that can be done only if a website is accessed with a T1 internet connection?
If a website is accessed with a dial-up, for instance, then I don't think most sites can load in 1.5 seconds.
I think that 1.5 seconds should be taken with a pinch of salt.
It seems to me that they have totalled up all the websites they spider and averaged it out, and if you are in the top whatever percent then you have a fast site.
Maybe I've missed something, but I don't think it takes any account of your niche. If you have a gaming site, for example, then none of your competitors are likely to be anywhere near that. Trying to compete on speed with the rest of the web is a totally meaningless exercise; you need to compete with your competitors.
Instead of Google determining what constitutes a fast site using the internet connection speeds of a website's visitors, couldn't they determine site speed by looking at the load time alone?
If you look at connection speeds, the visitors to your site might be using slower connections than the visitors to a competitor's site, but on load time alone your website could still be the faster one.