I've read mixed things in the past about how a slow page ultimately affects Googlebot's ability to properly crawl your site. Not sure how much of that is true?
I did notice last week in Google Webmaster Tools that 20+ URLs from my sitemap were submitted but not indexed. I then ran a report on my sitemap.xml file using HEADMasterSEO software, which showed that all of my URLs had a 200 status code, which was good, but about 20+ of them (the same number that were not indexed by Googlebot) had a response time of 7.50 to 24.49 seconds. I don't know what a good response time is, but I assume that 24.49 is very high?
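In case it helps anyone check the same thing without HEADMasterSEO, here's a rough Python sketch of what I understand that report to be doing: fetch the sitemap, then time a full GET of each URL in it. The sitemap address below is a placeholder, and the function names are just mine:

```python
import time
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Pull every <loc> entry out of a standard sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def time_fetch(url):
    """Return (status_code, seconds) for a full GET of the URL, body included."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        status = resp.status
        resp.read()  # read the whole body so the timing covers the full download
    return status, time.monotonic() - start

if __name__ == "__main__":
    # Placeholder address -- swap in your own sitemap URL.
    with urllib.request.urlopen("https://www.example.com/sitemap.xml") as resp:
        for url in sitemap_urls(resp.read()):
            status, secs = time_fetch(url)
            print(f"{status}  {secs:6.2f}s  {url}")
```

Running it a few times at different hours would show the same fluctuation I mention below.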
Now, I built my site using Magento, and I notice that almost all of the slow response times are coming from category and product URLs. Yet when I take one of the category URLs that showed a response time of 25.05 in the report and run that same URL through bytecheck.com, I see that the "time to first byte" is 0.440. Does this tell me that the problem is not necessarily with the server or the host, but with the page itself? Because 0.440 means the server responds to the request pretty quickly, but then the page itself takes forever to completely load? The bytecheck report also shows a "speed download" of 5568.00, which I believe is the total time to load the page from start to finish.
Below is what bytecheck actually showed me, with my URL edited out.
Time To First Byte: 0.440
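The way I understand the distinction: "time to first byte" stops the clock as soon as the server starts answering, while the full response time includes downloading the entire body. A rough Python sketch of that split (the function name is my own, not anything bytecheck provides):

```python
import time
import urllib.request

def ttfb_and_total(url):
    """Rough split of a request: time until the headers/first byte arrive
    vs. total time to download the whole response body."""
    start = time.monotonic()
    resp = urllib.request.urlopen(url)  # returns once the headers are in
    ttfb = time.monotonic() - start     # ~ time to first byte
    resp.read()                         # pull down the rest of the page
    total = time.monotonic() - start
    resp.close()
    return ttfb, total
```

If the first number is small but the second is huge, the server answers quickly and the slowness is in generating/transferring the page body itself, which is what I suspect is happening here.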
I know these results will fluctuate as I request them at different times. In fact, I ran the same sitemap.xml through HEADMasterSEO multiple times, and the response times for the exact same category URL were 25.05 the first time, 23.60 the second time, and 14.60 the third time. I think the shift downward might have something to do with the fact that I use a content delivery network that caches my site for faster loading. Maybe the site is still being cached, and the more of it that is cached, the faster it will load?
Still, the response time for products and categories is substantially higher than for CMS pages. Is there a certain time I should be aiming for? I'm not sure what role this plays in Googlebot's ability to crawl or index certain pages, but looking at the percentage of indexed vs. crawled URLs in Google Webmaster Tools' sitemap report, I do believe there is some truth to it? Although I don't think a product or category URL will ever load as fast as a CMS page, I would like to get to a point where Google indexes 90%+ of my URLs, if not 100%.
Like I said, the numbers are much better on CMS pages but very bad on products and categories. I didn't know if there are other tools that would let me see exactly which elements on the page have trouble loading? Maybe I'm running a module that takes a lot of resources to load?