Home / Forums Index / Code, Content, and Presentation / HTML
Forum Library, Charter, Moderators: incrediBILL

HTML Forum

    
Need a way to always show the current site
Looking for a way to keep visitors from seeing cached pages
masterwebman

Msg#: 4375560 posted 6:59 pm on Oct 17, 2011 (gmt 0)

I had a client ask me to change the phone number on her site. I did so immediately, and she phoned me to ask why I hadn't done it. I explained that I had made the change within an hour, but that her browser was showing her a cached page, and explained how she could change her browser settings to always show the current site.

Now she is worried that people who have been to the site before won't see the current version. So, is there a way for me to change the meta tags so the page always shows the current version and is never cached? I realize the site will load a little slower, but if that's the price I need to pay, so be it. I need to get this naive client off my back. Any suggestions would be appreciated.

 

Dijkgraaf

Msg#: 4375560 posted 2:46 am on Oct 18, 2011 (gmt 0)

CACHE-CONTROL & EXPIRES are good for that sort of thing.

<META HTTP-EQUIV="CACHE-CONTROL" CONTENT="NO-CACHE">
<META HTTP-EQUIV="EXPIRES" CONTENT="Mon, 22 Jul 2002 13:15:21 GMT">
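Worth noting: `http-equiv` meta tags only emulate real HTTP response headers, and intermediate caches generally ignore them because they never parse the HTML. If you control the server, the same directives sent as actual response headers are far more reliable. A sketch of the equivalent raw headers (the Expires date is just the example date from above):

```
Cache-Control: no-cache
Expires: Mon, 22 Jul 2002 13:15:21 GMT
```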

masterwebman

Msg#: 4375560 posted 2:50 am on Oct 18, 2011 (gmt 0)

Thanks, I'll give that a try. And an Expires date back-dated that far will work?

Dijkgraaf

Msg#: 4375560 posted 2:54 am on Oct 18, 2011 (gmt 0)

If the expiry date is before the current date, the page is treated as expired immediately.
But that was just an example ;-)

lucy24

Msg#: 4375560 posted 5:14 am on Oct 18, 2011 (gmt 0)

... but note that this won't do you a particle of good if the user's ISP does its own remote caching. Once they've decided that a given page is current enough, no power on earth can make them reload before they are good and ready. A local provider recently got some bad press (which they completely ignored) for caching anything and everything, including things like registration pages for popular social sites with the most recent visitor's information all nicely filled in.

piatkow

Msg#: 4375560 posted 1:10 pm on Oct 18, 2011 (gmt 0)

Naive "webmasters" are just as bad. I recall one guy on a different forum repeatedly posting abuse about his hosting service for supposedly not publishing his recent uploads, despite previous responses explaining how caching works.

I used to have an ISP that did its own caching. You could opt out, but you needed to be savvy enough to understand what they were doing and what your choice meant. For most locations I would have thought this level of caching would now be unnecessary.

rocknbil

Msg#: 4375560 posted 4:49 pm on Oct 18, 2011 (gmt 0)

For most locations I would have thought this level of caching would now be unnecessary.


It still goes on (proxy servers, etc.). The downside is that not only do customers not understand this stuff, they don't want to know.

This is a never-ending battle with customers. Page headers **usually** work, but they rely on the client (browser) to do what you tell it, and in the case of <cough> "some browsers" it doesn't always. The only sure way is to configure the server to send cache-expiry headers, which a) is not always possible and b) forces all pages to load uncached, which may slow the browsing experience for your visitors. Lose-lose.
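As a sketch of that server-side approach, assuming Apache with mod_headers enabled, an .htaccess fragment like the following would send no-cache headers for HTML pages only, leaving images and CSS cacheable (the file pattern is just an example; adjust it to your site):

```apache
# Hypothetical fragment: force revalidation of HTML pages only.
<FilesMatch "\.html?$">
  Header set Cache-Control "no-cache, must-revalidate"
  Header set Expires "Mon, 22 Jul 2002 13:15:21 GMT"
</FilesMatch>
```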

For images, CSS and other included resources, it's easy.

<img src="myimage.jpg?435345435345">
<link rel="stylesheet" type="text/css" href="mycss.css?67876858">

You just add a query string with some unique number after it, and the browser is forced to download a fresh copy of the resource. Many dynamically generated sites use this method, and the number is different on every page load. Since these are static resources, nothing actually happens with the query string; it just gets ignored.
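One way to automate that, as a sketch in PHP: use the file's last-modified time as the version number, so the query string changes only when the file actually changes (rather than on every page load). `filemtime()` is the standard PHP function; the file name is just an example:

```php
<?php
// Sketch: cache-bust a static resource with its modification time.
// Browsers re-download only after the file is actually updated.
function versioned($file) {
    return $file . '?' . filemtime($file);
}
?>
<link rel="stylesheet" type="text/css" href="<?php echo versioned('mycss.css'); ?>">
```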

However, this is a very bad idea for the page itself, as it will affect how the page is indexed: any links to it would now point to a different URL.

When I make updates to static files, almost without thinking, I include this line: "... if you don't see the changes, hold down the CTRL key on your keyboard (Apple for Mac) and press F5, this will force an uncached download of the page."

Path of least resistance. :-)

Jonesy

Msg#: 4375560 posted 5:05 pm on Oct 22, 2011 (gmt 0)

Earlier this summer I had such an issue (with web host proxy caching) after a site move from an old server to a new server.

The feedback from the SysAdmin at that time was:
"Well your seeing caching the way it is supposed to work. If you do not want us to cache your site at the perimeter then I can setup you up with a non caching proxy. Another alternative is to set a non-cache header or simply set a cookie. Any of the above would work."

My 'problem' was in/with a .php request, and I simply sent my own "non-cache header" at the top of the script. WFM.

HTH,
Jonesy
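For anyone wanting to do the same, a minimal sketch of such a non-cache header block at the top of a PHP script. The `header()` calls are standard PHP, and the header values are standard HTTP; the exact combination here is just one common choice:

```php
<?php
// Must run before any output is sent, or header() will fail.
header('Cache-Control: no-cache, no-store, must-revalidate');
header('Pragma: no-cache');                        // for old HTTP/1.0 caches
header('Expires: Thu, 01 Jan 1970 00:00:00 GMT');  // any date in the past
```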

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved