|Need a way to always show the current site|
Looking for a way to keep visitors from seeing a cached copy of the site
I had a client ask me to change the phone numbers on her site. I did so immediately, yet she phoned me to ask why I hadn't done it. I explained that I had made the change within an hour, but that her browser was showing her a cached page, and I showed her how she could change her browser settings to always load the current site. Now she is worried that people who have visited the site before won't see the current version. So, is there a way for me to change the meta tags so the page always shows the current version and is never cached? I realize the site will load a little slower, but if that's the price I need to pay, so be it. I need to get this naive client off my back. Any suggestions would be appreciated.
CACHE-CONTROL & EXPIRES are good for that sort of thing.
<META HTTP-EQUIV="CACHE-CONTROL" CONTENT="NO-CACHE">
<META HTTP-EQUIV="EXPIRES" CONTENT="Mon, 22 Jul 2002 13:15:21 GMT">
Thanks, I'll give that a try. And an EXPIRES date back-dated that far will work?
If the expiry date is before the current date, then it is expired immediately.
But that was just an example ;-)
... but note that this won't do you a particle of good if the user's ISP does their own remote caching. Once they've decided that a given page is current enough, no power on earth can make them reload before they are good and ready. A local provider recently got some bad press (which they completely ignored) thanks to caching anything and everything, including things like registration forms for popular social sites with the most recent visitor's information all nicely filled in.
Naive "webmasters" are just as bad. I recall one guy on a different forum repeatedly posting abuse at his hosting service for supposedly failing to publish his recent uploads, despite earlier replies explaining how caching works.
I used to have an ISP that did their own caching; you could opt out, but you needed to be savvy enough to understand what they were doing and what your choice meant. For most locations I would have thought this level of caching would now be unnecessary.
|For most locations I would have thought this level of caching would now be unnecessary.|
It still goes on... proxy servers, etc. The downside is that not only do customers not understand this stuff, they don't want to know.
This is a never-ending battle with customers. Page headers **usually** work, but they rely on the client (browser) doing what you tell it, and in the case of <cough> "some browsers" it doesn't always. The only sure way is to configure the server to send cache-expiry headers, which a) is not always possible and b) forces all pages to load uncached, which may slow the browsing experience for your visitors. Lose-lose.
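A minimal sketch of the "configure the server" approach, using Python's stdlib http.server as a stand-in for whatever server you actually run (the handler name and port are made up for illustration). Every response gets headers telling browsers and intermediate proxies not to reuse a stored copy:

```python
# Sketch only: attach no-cache headers to every response the server sends.
from http.server import HTTPServer, SimpleHTTPRequestHandler

class NoCacheHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        # no-cache/no-store: don't serve a stale copy;
        # must-revalidate: check back with the origin before reusing anything.
        self.send_header("Cache-Control", "no-cache, no-store, must-revalidate")
        self.send_header("Pragma", "no-cache")  # fallback for HTTP/1.0 proxies
        self.send_header("Expires", "0")        # a past/invalid date = already expired
        super().end_headers()

# To serve the current directory with these headers (blocks forever):
# HTTPServer(("", 8000), NoCacheHandler).serve_forever()
```

This is the server-wide version of the same trade-off described above: every page is marked stale, so nothing is reused, but nothing is cached either.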
For images, CSS and other included resources, it's easy.
<link rel="stylesheet" type="text/css" href="mycss.css?67876858">
You just add a query string with some unique number after it and the browser is forced to download a new resource. Many dynamically output sites use this method and the number is always different on every page load. Since these are static resources, nothing really happens with the query string, it just gets tossed out.
However, this is a very bad idea for the page itself: it will affect how search engines index it, and any links to it would now point to a different URL.
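The query-string trick for static resources can be sketched as below, deriving the version number from the file's modification time so the URL changes whenever the file does and the browser fetches a fresh copy (the file name is illustrative, not from the thread):

```python
# Sketch: build a cache-busting URL for a static resource.
import os

def busted_url(path):
    version = int(os.path.getmtime(path))  # changes whenever the file changes
    return f"{path}?{version}"

# e.g. busted_url("mycss.css") might give something like "mycss.css?1027343721"
```

Using the mtime (or a content hash) rather than a random number means the URL only changes when the file actually changes, so visitors still benefit from caching between updates.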
When I make updates to static files, almost without thinking, I include this line: "... if you don't see the changes, hold down the CTRL key on your keyboard (Apple for Mac) and press F5, this will force an uncached download of the page."
Path of least resistance. :-)
Earlier this summer I had such an issue (with web host proxy caching) after a site move from an old server to a new server.
The feedback from the SysAdmin at that time was:
"Well, you're seeing caching the way it is supposed to work. If you do not want us to cache your site at the perimeter then I can set you up with a non-caching proxy. Another alternative is to set a no-cache header or simply set a cookie. Any of the above would work."
My 'problem' was in/with a .php request, and I simply sent my own "non-cache header" at the top of the script. WFM.
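The same fix sketched in Python rather than PHP: emit the no-cache headers before any body output, at the top of the request handler. This is a bare WSGI app with made-up content, not the poster's actual script:

```python
# Sketch: a WSGI app that sends its own no-cache headers before the body,
# the equivalent of calling PHP's header() before any echo.
def app(environ, start_response):
    start_response("200 OK", [
        ("Content-Type", "text/html; charset=utf-8"),
        ("Cache-Control", "no-cache, no-store, must-revalidate"),
        ("Expires", "0"),
    ])
    return [b"<p>Always the current page.</p>"]
```

Any dynamic page can do this per-request, which is why it works even when you can't change the server or proxy configuration.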