We turned on gzip compression on our site a few months ago; it's something most sites can enable easily.
VirginTech - there are a number of factors affecting webpage loading speed, and you are right that Google appears to be focusing more on factors that webmasters can influence on their sites and pages than on network lag time between the web server and end users. To that end, they released a Firefox add-on that works in combination with Firebug -- this diagnostic application tests a webpage's delivery and construction and reports suggestions for how to improve its speed. Google's application is called "Page Speed".
The Google application is fairly imitative of an earlier Yahoo! application that does much the same thing - Yahoo's app is named "YSlow", and Google's docs even recommend it as one of many performance improvement tools.
Page Speed and YSlow report on things like how compact the JS and CSS files are, whether a page's images are compressed as much as they could be, whether elements are cached, and so on.
Page Speed also assesses whether a page is delivered gzip-compressed, so anallawalla's mention is apt. Gzip is one way that many web applications can improve performance.
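If you want a quick check without installing either add-on, here's a minimal Python sketch (the URL is just a placeholder) that requests a page with gzip allowed and prints the compression and caching headers the server sends back:

```python
# Check whether a server delivers gzip-compressed responses and sets
# cache headers -- two of the things Page Speed and YSlow look at.
import urllib.request

def check_response(url):
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        print("Content-Encoding:", resp.headers.get("Content-Encoding", "(none)"))
        print("Cache-Control:  ", resp.headers.get("Cache-Control", "(none)"))

check_response("http://www.example.com/")
```

A "Content-Encoding: gzip" line means compression is on; "(none)" means the server sent the page uncompressed.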
Plenty of factors can hurt page load time. A few things to watch:

- Navigation built with heavy drop-down menus.
- A page body that exceeds the recommended limit, i.e. about 40k.
- Images with unnecessarily high DPI; around 200 DPI is the recommended maximum.
- Image format: use GIF images, as these are compatible with most browsers.
- Fonts: use an 11px font size, in plain Arial.
- Try to put fewer images on the home page.
- Anchor links and the pages associated with them should also stay within that body limit.

Your server response time matters too. Try to get a hosting service on a static IP rather than a dynamic one; most hosting services run many websites on a single IP, and that can also hurt page load time. For a rough way to measure response time, see the sketch below.
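Here's a minimal Python sketch (the URL is just a placeholder) for timing how long a server takes to start responding:

```python
# Rough time-to-first-byte measurement; the URL is a placeholder.
import time
import urllib.request

def time_to_first_byte(url):
    start = time.time()
    with urllib.request.urlopen(url) as resp:
        resp.read(1)  # read one byte to force the first chunk of the response
    return time.time() - start

print(f"TTFB: {time_to_first_byte('http://www.example.com/'):.3f} s")
```

Run it a few times and average, since a single request can be skewed by DNS lookups or caching.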
thanks,
bilal
If they are giving us that info and genuinely want to speed up the web (I believe they are genuine), then I'd guess that's the info they will use in their algorithm. The data from their speed tester might also suggest other factors they are concerned about and could add in the future.
I think they might also take server downtime into account. My datacentre caught fire recently and the crawl times displayed in Webmaster Tools increased by 30%. It should be interesting to see whether that comes back down if we can stay fire-free for a few weeks.
vordmeister, care to share who your data center host is?!? Having it catch fire is a pretty dramatic contretemps! Was it in Europe?
Google will go on the data they have. That was my point. They will improve their data if that doesn't work.