Home / Forums Index / WebmasterWorld / Webmaster General
Forum Library, Charter, Moderators: phranque

Webmaster General Forum

Client-side caching - how?
Urgently need to fix but no idea how - guidance much appreciated!

5+ Year Member

Msg#: 4374838 posted 9:18 am on Oct 15, 2011 (gmt 0)

I've just found the following message in my Adsense dashboard:

We have analysed some of the pages on your sites that serve the most ads, and have detected Page Speed problems that create a highly negative user experience for some users. Frequently viewed pages on your sites are not making full use of client-side caching. Learn how to fix this problem:


I've visited the help page provided and it might as well be Russian as far as I am concerned! I have no idea what to do next, whether it is something I can do myself or whether I need to outsource (and if so, where to find someone who can help me). My site runs on a dedicated server at Hostgator and I can find my way round WHM and cPanel to an extent.

Can anyone point me in the right direction? Any help would be much appreciated. Traffic has dropped 45% year on year since Thursday (having crept back to near-normal after April Panda hit) and I'm wondering whether this is why.



5+ Year Member

Msg#: 4374838 posted 4:36 am on Oct 16, 2011 (gmt 0)

Presumably your content is dynamic; if so, you need to confirm what it's running on. Apache and PHP?

Dynamic content that changes a lot or varies with user interaction can't be cached, for obvious reasons. If this is the case then you need to re-work your scripts and databases, or upgrade to faster hardware, to fix the speed issue.

If it's dynamically generated content that nonetheless isn't different for every visitor, you can implement the HTTP caching that would otherwise be taken care of by Apache defaults (for static content). In this case you need to confirm that URLs correspond to content that wants caching, and that trivial URL parameters aren't creating multiple versions of essentially the same content.
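For concreteness, the response headers that allow a client to cache a page look something like this (illustrative values, not from any real server):

```
HTTP/1.1 200 OK
Content-Type: text/html
Cache-Control: public, max-age=3600
Last-Modified: Sat, 15 Oct 2011 09:18:00 GMT
```

The max-age tells the browser it may reuse its copy for an hour; Last-Modified lets it revalidate cheaply after that.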

Let's hear answers to the above and go from there.


WebmasterWorld Senior Member sgt_kickaxe us a WebmasterWorld Top Contributor of All Time

Msg#: 4374838 posted 9:32 pm on Oct 16, 2011 (gmt 0)

The client-side caching warning means your content is not being cached by browsers, causing a long "reload" time on repeat visits.

Run your site (index page AND a typical page with Adsense) through a free service like webpagetest.org and/or gtmetrix.com for a report, and pay particular attention to the SECOND load times, i.e. the time it takes for the page to load from browser cache. It should be near instant and there should not be as many requests made as on first load.

You'll most likely find that you need to gzip your content, allow images to be cached and set proper headers. That sounds complicated but it can all be done via .htaccess, depending on what your reports say.
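For what it's worth, the .htaccess side of that usually boils down to something like the sketch below. This assumes mod_deflate and mod_expires are enabled on your server (your host can confirm), so treat it as a starting point rather than a drop-in fix:

```apache
# Compress text-based content on the fly (mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript
</IfModule>

# Tell browsers how long they may cache static files (mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType image/gif  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
</IfModule>
```

The lifetimes are examples only - pick values to suit how often each file type actually changes on your site.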

If you use Firefox there is an add-on called "Live HTTP Headers" that will let you see what headers your pages are sending and whether those allow for proper client-side caching.
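If you'd rather not install an add-on, curl from a command line shows the same thing. A small sketch, assuming curl and grep are available - the sample headers below are illustrative, not from a real site:

```shell
# In practice you'd fetch headers from your own page, e.g.:
#   curl -sI http://www.example.com/somepage.html
# and filter for the caching-related ones. Here the same filter is run over
# a sample response, so you can see what a well-cached page should include:
sample='HTTP/1.1 200 OK
Content-Type: text/html
Content-Encoding: gzip
Last-Modified: Sat, 15 Oct 2011 09:18:00 GMT
Cache-Control: max-age=604800'
printf '%s\n' "$sample" | grep -iE 'cache-control|expires|last-modified|etag|content-encoding'
```

If that filter comes back empty when run against your real pages, nothing is telling the browser it may cache them.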

Another great pair of free tools is the PageSpeed and Firebug extensions, each giving you even more data.

note: all of the sites I've mentioned here are linked to from various Google pages and are free, I'm not playing favorites.


WebmasterWorld Administrator phranque us a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

Msg#: 4374838 posted 7:29 am on Oct 17, 2011 (gmt 0)

what type of server are you running?

if it's apache you may find some helpful guidance in this thread.
Cache control in .htaccess:
http://www.webmasterworld.com/apache/4108533.htm

the tools mentioned above by badbadmonkey and Sgt_Kickaxe are very helpful but they are only going to give you essentially the same information provided generally on the adsense help page, although with specific requirements suggested for each resource requested on the page being tested.


5+ Year Member

Msg#: 4374838 posted 11:13 am on Oct 17, 2011 (gmt 0)

Thank you all for your replies.

badbadmonkey: No, sadly my site is a very old, out of date Frontpage-built static html site. I started it as a hobby site many years ago and it took off in popularity, but being self-taught I think I did pretty much everything along the way wrong! I've muddled along so far but I think this is beyond my capabilities.

Sgt-Kickaxe: Thank you for your suggestions. I've run my home page and a typical Adsense page through webpagetest.org and gtmetrix.com (the latter I found particularly useful, with its downloadable report) and the results are not good. For the home page: 8.126 seconds first load, 6.542 seconds second load. And for the typical Adsense page: 8.126 seconds first and 6.542 seconds second load. I also ran a newly designed page, which I am slowly rolling out across the site, and that performed better: 5.787 and 3.643, although obviously it still needs improvement.

I can cope with some of the recommendations made by gtmetrix.com myself - minimising HTML, shrinking images further, etc. But many of their recommendations seem to relate to javascript from my advertising agency and Adsense, and I don't understand what I can do about that. With browser caching and gzipping both given a high priority and being out of my comfort zone, I think it might be time for me to seek some outside help with .htaccess, especially as I've found in the past that Frontpage complicates .htaccess.

phranque: Yes, apache. Thank you for that link - I understand more about what I am supposed to be doing now - and it confirms that I am not confident about doing it myself!


5+ Year Member

Msg#: 4374838 posted 12:07 pm on Oct 17, 2011 (gmt 0)

Well implementing HTTP caching for static content on Apache is quite easy.

Firstly, Apache is usually configured to do most of the hard work for you, especially with images. Get Firebug for Firefox or use the MSIE developer tools (F12) to inspect the headers on all your content. Identify whether you're always getting 200 responses, or whether caching is already working for some elements.

Confirm that Apache is sending Last-Modified and/or ETag headers.
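The exchange you're looking for is a conditional request being answered with a short 304 rather than the full page. Illustrative values only (the host and date are placeholders):

```
GET /logo.jpg HTTP/1.1
Host: www.example.com
If-Modified-Since: Sat, 15 Oct 2011 09:18:00 GMT

HTTP/1.1 304 Not Modified
```

If the server sends a full 200 with the whole file every time, conditional caching isn't working.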

Then have a think about a policy: default expiry times, whether you want the client to always go back to the server to check for updates, etc. If you rarely update the site you can be liberal with expiry times.

Adding site-wide default caching headers to build on the above is not at all difficult.
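As one possible policy - again assuming mod_expires is available - a short default for pages plus a long lifetime for images might look like this:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # conservative default: browsers revalidate after an hour
  ExpiresDefault "access plus 1 hour"
  # images rarely change, so cache them for a month
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
</IfModule>
```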

Gzipping can sometimes be done in .htaccess too, but depending on the server it might be disallowed, in which case you'd have to fall back to PHP to do it manually.
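If mod_deflate turns out to be unavailable but PHP is, one line in .htaccess can sometimes switch on PHP's own output compression - assuming your host permits php_flag, and bearing in mind this only covers pages actually served through PHP, so on a static HTML site it won't help on its own:

```apache
php_flag zlib.output_compression On
```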

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved