Forum Moderators: open


How to suppress browser caching?

the method I am using is not search engine friendly


john5

5:37 pm on Jun 6, 2002 (gmt 0)

10+ Year Member



Hi everybody

I have a small auction site written in ASP. It is paramount for me to suppress caching on some pages. I am using random numbers in the URL, like www.mydomain.com/mypage.asp?r=[randomnumber]. It works perfectly, but when Google spiders these pages, it follows the same URLs for days, again and again. Although I have not yet had any problems with Google, and Brett Tabke assured me that I should not be worried as long as it does not exceed a certain percentage, I would prefer to replace the random-number method with one that is more search engine friendly. I also use the following code, but I do not have much trust in its effectiveness.

Response.Expires = -1
Response.AddHeader "anyname", "no-cache"
Response.AddHeader "cache-control", "no-store"
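For readers outside classic ASP, the random-number trick described above can be sketched in Python; the domain, page name, and `r` parameter mirror the example URL and are just placeholders:

```python
import uuid

def cache_busted_url(base_url):
    """Append a random query parameter so every request looks unique
    to caches -- and, unfortunately, to crawlers as well, which is
    exactly the search engine problem described above."""
    return f"{base_url}?r={uuid.uuid4().hex}"

# Each call yields a different URL for the same underlying page.
print(cache_busted_url("http://www.mydomain.com/mypage.asp"))
```

Because the URL changes on every request, a crawler has no way to tell it has already seen the page, which is why it keeps re-fetching it.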

Does anybody have a solution that works 100%?

Thank you in advance

Xoc

7:09 pm on Jun 6, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



On those pages, place the following in the head section of the page:

<meta name="GOOGLEBOT" content="NOARCHIVE,NOFOLLOW">

john5

7:46 pm on Jun 6, 2002 (gmt 0)

10+ Year Member



Xoc

I do not want to do this. I would lose a few hundred product pages that are well indexed and bring me some good traffic.

Brett_Tabke

7:54 pm on Jun 6, 2002 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



recap:
You want to suppress browser caching, but do it in a search engine friendly manner?

How about going with a "no-cache" HTTP header? It's cross-browser friendly. I don't know how you would do that on IIS, though.

john5

8:07 pm on Jun 6, 2002 (gmt 0)

10+ Year Member



Brett

"no cache" http header - I am not familiar with it. Is it an additional information in the http header that has to be set somewhere?

Xoc

8:33 pm on Jun 6, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't get what it is you are trying to do. Are you trying to keep the browser from caching the pages, keep Google from caching the pages, or keep Google from following the links to the random-numbered pages? Or some combination of the above? I'm assuming you are talking about ASP, not ASP.NET.

To stop the browser from caching the pages, you need to follow the directions in RFC 2616 [ietf.org] and add the following lines to your ASP before anything else:

<%
Response.AddHeader "Pragma", "no-cache"
Response.AddHeader "Cache-Control", "no-cache"
%>
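For readers not on IIS, the same two headers can be sent from any server-side environment. Here is a minimal, hypothetical sketch using Python's standard library; the handler class and port are assumptions for illustration only:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def no_cache_headers():
    """The headers from RFC 2616 that tell browsers and proxies
    not to cache the response."""
    return [("Pragma", "no-cache"), ("Cache-Control", "no-cache")]

class NoCacheHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Send the no-cache headers before the body, just as the ASP
        # snippet above sets them before any output.
        self.send_response(200)
        for name, value in no_cache_headers():
            self.send_header(name, value)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>fresh every time</body></html>")

# To try it locally:
# HTTPServer(("localhost", 8000), NoCacheHandler).serve_forever()
```

Because the headers travel with the HTTP response rather than in the URL, the page keeps a single stable address, which is what makes this approach search engine friendly.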

To keep Google from caching the pages, add the following meta tag to the HTML head:

<meta name="GOOGLEBOT" content="NOARCHIVE">

To keep Google from following links to the random-numbered pages, add the following to the HTML head:

<meta name="GOOGLEBOT" content="NOFOLLOW">

[edited by: Xoc at 4:37 am (utc) on June 16, 2002]

john5

9:13 pm on Jun 6, 2002 (gmt 0)

10+ Year Member



Xoc

Thank you. You gave me the answer.

<%
Response.AddHeader "Pragma", "no-cache"
Response.AddHeader "Cache-Control", "no-cache"
%>

Sorry if I expressed myself badly. As Brett Tabke said, I want to suppress browser caching, but in a search engine friendly manner. The way I do it now, the Google crawler follows the same URL for days in a near-endless loop. Googlebot thinks each one is a different page because of the random number attached to the URL, although it is not, so it gets taken to the same pages maybe several hundred times. And I certainly do not want to drive the good little guy nuts! At least 50% of my traffic depends on him.

Anyway, thank you! You guys are really good on this forum.