| 12:47 am on Nov 17, 2009 (gmt 0)|
Matt Cutts said about three times during Pubcon that the page load time will become important sometime in 2010. He didn't mention whether this would be averaged across a site or whether it would be evaluated per page. My guess is the former. I don't read his blog every day so I don't know if he has elaborated there.
We turned on gzip compression at our site a few months ago, which is something most sites can achieve easily.
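To give a rough feel for why gzip helps, here is a minimal sketch (using Python's standard library; the markup string is just an illustrative stand-in for a real page) showing how well repetitive HTML compresses:

```python
import gzip

# A repetitive chunk of markup, standing in for a typical HTML page.
html = ("<div class='item'><a href='/page'>Link text</a></div>\n" * 200).encode("utf-8")

compressed = gzip.compress(html)
print(f"original: {len(html)} bytes")
print(f"gzipped:  {len(compressed)} bytes "
      f"({len(compressed) / len(html):.1%} of the original)")
```

In practice you would enable compression in the web server itself (e.g. mod_deflate in Apache) rather than in application code, but markup-heavy pages shrink on this order, which is why it is such an easy win.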
| 4:35 am on Nov 17, 2009 (gmt 0)|
Ok... thanks for the tip, anallawalla :)
BTW, I think the speed they are talking about would be the number of seconds a webpage takes to load completely.
| 4:55 pm on Nov 17, 2009 (gmt 0)|
I've theorized for quite a few years that Google could easily use a number of factors to assess the quality of webpages, similar to the Quality factors they use in AdWords. So I've been proven right to some degree by the recent announcements.
VirginTech - there are a number of factors affecting webpage loading speed, and you are right that Google appears to be focusing more on factors that webmasters can influence on their sites and pages than on network latency between the webserver and end users. To that end, they released an add-on for Firefox that works in combination with Firebug -- this Google diagnostic tool tests a webpage's delivery and construction and suggests ways to improve its speed. Google's application is called "Page Speed".
The Google application is fairly imitative of an earlier Yahoo! application that did the same sort of thing - Yahoo's app is named "YSlow", and Google's docs even recommend using it as one of many performance-improvement tools.
Page Speed and YSlow report on things like how compact the JS and CSS files are, whether a page's images are compressed as much as they could be, whether elements are cacheable, and so on.
And, Page Speed also assesses whether a page is delivered in gzip compressed format or not -- so anallawalla's mention is apt. Gzip is one way that many web applications can improve performance.
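You can do this particular check yourself without either tool: send an Accept-Encoding: gzip request header and look at the Content-Encoding of the response. Here is a sketch using Python's standard library; the local test server is purely illustrative (a stand-in for your real site), so in practice you would point the client at your own URL:

```python
import gzip
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative stand-in for a real site: a local server that gzips its
# response when the client advertises gzip support.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>Hello</body></html>"
        self.send_response(200)
        if "gzip" in self.headers.get("Accept-Encoding", ""):
            body = gzip.compress(body)
            self.send_header("Content-Encoding", "gzip")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the example quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The actual check: request the page with Accept-Encoding: gzip and
# inspect the Content-Encoding header of the response.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/", headers={"Accept-Encoding": "gzip"})
resp = conn.getresponse()
print("Content-Encoding:", resp.getheader("Content-Encoding"))
server.shutdown()
```

If the header comes back empty rather than "gzip", the server is sending pages uncompressed and there is an easy saving on the table.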
| 6:45 pm on Nov 17, 2009 (gmt 0)|
Oh, that's a good piece of information, silvery!
I am going to give Page Speed a try.
| 6:36 pm on Nov 30, 2009 (gmt 0)|
Without going into too much detail, here are some factors that hurt page load time:
navigation built with drop-down menus
body content that exceeds the suggested limit, i.e. 40k
images with too high a dpi (around 200 dpi is recommended)
Use GIF images; these are compatible with most browsers.
Use an 11px font size, in Arial Normal.
Try to put fewer images on the home page.
Anchor links and the pages associated with them should not exceed the given body limit.
Lots of other factors can be involved in hurting page load time. One more thing: your server response time. Try to get hosting on a static IP rather than a dynamic one; most hosting services run many websites on a single IP, and this can also hurt page load time.
| 6:56 pm on Nov 30, 2009 (gmt 0)|
I noticed in Google Webmaster Tools there is a graph of average crawl time. Mine displays around 200ms, and given that the data comes from Googlebot, I guess they are looking at server response time and the size of the HTML part of the page only. Googlebot doesn't grab all the images at the same time.
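You can get a rough feel for that HTML-only measurement yourself: time the fetch and note the byte count, ignoring images, CSS, and JS just as a crawler's HTML fetch would. A minimal sketch follows (the local server and its 5000-byte filler body are illustrative assumptions; in practice you would point the client at one of your own pages):

```python
import http.client
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for a real site: serves a fixed HTML body of known size.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>" + b"x" * 5000 + b"</body></html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the example quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Time the request and measure the size of the HTML alone --
# roughly what an HTML-fetch crawl-time graph would reflect.
start = time.perf_counter()
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
html = conn.getresponse().read()
elapsed_ms = (time.perf_counter() - start) * 1000

print(f"fetched {len(html)} bytes of HTML in {elapsed_ms:.1f} ms")
server.shutdown()
```

Note that this captures only server response plus HTML transfer; the full load time a visitor experiences also includes every image, stylesheet, and script the page references.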
If they are giving us that info and genuinely want to increase the speed of the web (I believe they are genuine), then I guess that's the info they will use in their algorithm. The data from their speed tester might suggest other factors they are concerned about and could add in the future.
I think they might also take server downtime into account. My datacentre caught fire recently and the crawl times displayed in Webmaster Tools increased by 30%. It should be interesting to see whether that reduces if we can stay fire-free for a few weeks.
| 7:26 pm on Nov 30, 2009 (gmt 0)|
Googlebot automatically "throttles up" and "throttles down" its request rate, dynamically adjusting so that its requests do not negatively impact server performance. I think that automatic throttling is likely unrelated to the page load time experienced by end users -- a page could be delivered quickly from the server and still take a long time to render in a browser if it is designed non-optimally. (Of course, it should go without saying that if your server becomes so slow that Googlebot cannot effectively crawl the site, that will eventually hurt search performance and the attendant rankings.)
vordmeister, care to share who your data center host is?!? Having it catch fire is a pretty dramatic contretemps! Was it in Europe?
| 8:11 pm on Nov 30, 2009 (gmt 0)|
I won't share the location on this occasion. The fire was very unfortunate. One of their many transformers went down in a dead short and that caused all the other power supplies and the emergency generators to shut down to protect themselves. Then the faulty transformer caught fire and the fire brigade wouldn't let anyone back inside the building to sort things out. It's an excellent datacentre and I wouldn't want to criticise them in any way. I assume they will have changed transformer suppliers by now.
Google will go on the data they have. That was my point. They will improve their data if that doesn't work.