Forum Moderators: Robert Charlton & goodroi
Right up front, I assume you understand that any discussion about what "will or won't help" is mostly theoretical.
And if visitors are bailing on the site because the pages are slow to load, then you have a good reason to address site speed. Giving visitors a good experience should be the main focus anyway.
Google uses toolbar and Chrome data to measure page load time, so if your site has a high ratio of people visiting with slow Internet connections, your page load time, according to Google, will be slower.
So what are some of the ways to increase site speed in the eyes of Google?
If the Site Speed report is accurate, then I'm dumbfounded why Google says that, on average, pages on the site take 5.0 seconds to load.
Site Performance shows Page Speed suggestions based on content served to Googlebot (as opposed to a regular user's browser).
Site Performance attempts to show you the best estimate of the page load time. It often represents an aggregate of thousands of data points, collected from all around the world, over various network connections, browsers and computer configurations. It's quite possible that any one user might experience your site significantly faster or slower than this aggregate. Site Performance data works best when it has lots of data points to aggregate over. If your site is small, or doesn't attract a lot of traffic, the results may be slightly skewed.
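A tiny Python sketch (with made-up load-time samples) of why a small number of data points skews the aggregate, which is the caveat Google is making here:

```python
import statistics

# Hypothetical load-time samples in seconds, standing in for the
# data points Site Performance aggregates across visitors.
busy_site = [1.8, 2.1, 2.0, 1.9, 2.2, 2.0, 1.7, 2.3, 2.1, 1.9]
small_site = [1.9, 2.0, 14.5]  # one visitor on a slow connection dominates

print(round(statistics.mean(busy_site), 2))   # averages out to ~2.0 s
print(round(statistics.mean(small_site), 2))  # dragged to ~6.13 s by one outlier
print(statistics.median(small_site))          # the median, 2.0 s, is more robust
```

With thousands of samples, one slow visitor barely moves the number; with three, it defines it.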
Page Speed suggestions as shown in Site Performance are based on the version of your page as seen by Googlebot, Google's crawler. For various reasons—for example, if your robots.txt file blocks Googlebot from crawling CSS or other embedded content— these may differ slightly from the suggestions you get when you run the Page Speed extension for Firefox.
Site speed is only a minor ranking factor at this time.
Popular web sites spend between 5% and 38% of the time downloading the HTML document. The other 62% to 95% of the time is spent making HTTP requests to fetch all the components in that HTML document (i.e. images, scripts, and stylesheets). The impact of having many components in the page is exacerbated by the fact that browsers download only two or four components in parallel per hostname, depending on the HTTP version of the response and the user's browser.
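To make that request count concrete, here's a small Python sketch (standard library only; the sample HTML is made up) that tallies the embedded components a browser would have to fetch on top of the HTML document itself:

```python
from html.parser import HTMLParser

class ComponentCounter(HTMLParser):
    """Counts the page components a browser must fetch with extra HTTP requests."""
    def __init__(self):
        super().__init__()
        self.counts = {"images": 0, "scripts": 0, "stylesheets": 0}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.counts["images"] += 1
        elif tag == "script" and "src" in attrs:
            self.counts["scripts"] += 1
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.counts["stylesheets"] += 1

sample = """
<html><head>
<link rel="stylesheet" href="main.css">
<script src="app.js"></script>
</head><body>
<img src="logo.png"><img src="banner.jpg">
</body></html>
"""
parser = ComponentCounter()
parser.feed(sample)
print(parser.counts)  # {'images': 2, 'scripts': 1, 'stylesheets': 1}
```

Every one of those components is another HTTP request, and with only a handful of parallel downloads per hostname, they queue up.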
My advice is to hire the best graphic artist we can afford to design the coolest website.
Nothing has been done to the back end of the site yet, but I've been redirecting many crawl errors (404 Not Found). There were about 1500 when I first started and now there are "only" 1030.
Am I right that all the redirects have been a factor in Google rating my site slow?
But I'm confused, because I've read that a site shouldn't have too many redirects. How many is too many?
Some of the errors go back years, when the site was managed with FrontPage (html) and then Coranto (php).
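Generally it's not the total number of redirects on a site that hurts so much as chained redirects (A → B → C), since each hop costs the visitor, and Googlebot, a full round trip. A quick Python sketch (the URLs and redirect map are hypothetical, roughly what an .htaccess might encode) to spot chains like the FrontPage-to-Coranto-to-current migrations described above:

```python
# Hypothetical redirect map: old URL -> target.
redirects = {
    "/old-page.html": "/old-page.php",   # FrontPage-era URL
    "/old-page.php": "/new-page/",       # Coranto-era URL
    "/legacy.html": "/about/",
}

def chain(url, redirects, limit=10):
    """Follow redirects to the final URL, returning every hop along the way."""
    hops = [url]
    while url in redirects and len(hops) <= limit:
        url = redirects[url]
        hops.append(url)
    return hops

# Flag any starting URL that takes more than one redirect to resolve.
for start in redirects:
    hops = chain(start, redirects)
    if len(hops) > 2:
        print(f"chain of {len(hops) - 1} redirects: {' -> '.join(hops)}")
```

Collapsing each chain so the oldest URL points straight at the final page removes the extra round trips without reducing how many redirects you keep.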
Site Performance is based on Googlebot's crawl times
Site Performance is based on data from browsers with the Google Toolbar installed. It has to be, because it's "the time it takes to load in a browser" and a crawler does not render the page, it just downloads the HTML source code.
Page Speed suggestions as shown in Site Performance are based on the version of your page as seen by Googlebot, Google's crawler. For various reasons—for example, if your robots.txt file blocks Googlebot from crawling CSS or other embedded content— these may differ slightly from the suggestions you get when you run the Page Speed extension for Firefox.
A crawler does not render the page.