
Enabling If-Modified-Since Headers



9:42 pm on Mar 26, 2010 (gmt 0)

10+ Year Member

How do I enable If-Modified-Since headers?

On my phpinfo() page I found this:


Directive | Local Value | Master Value
engine | 1 | 1
last_modified | 0 | 0
xbithack | 0 | 0

In order to enable If-Modified-Since headers, do I need to change the value of last_modified to 1? If so, how do I go about this? If not, what other steps do I need to take to enable If-Modified-Since headers?

I didn't have any luck finding my answer on Google, but feel free to point me in the right direction if the answer is already out there somewhere.



12:13 am on Mar 27, 2010 (gmt 0)

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

If-Modified-Since is an HTTP request header that the client sends to your server along with a request for content. Last-Modified is an HTTP response header that your server returns to the client along with the requested content.
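To illustrate the exchange (all the values here are made up), a conditional request and a "not modified" reply look roughly like this:

GET /some-page.php HTTP/1.1
Host: www.example.com
If-Modified-Since: Fri, 26 Mar 2010 09:00:00 GMT

HTTP/1.1 304 Not Modified
Date: Sat, 27 Mar 2010 00:13:00 GMT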

I can't answer PHP-specific questions -- wrong forum and lack of expertise, I'm afraid. But there are two parts to this:

For your server to send Last-Modified (which is what makes it possible for a client to later send If-Modified-Since), you do need to "enable Last-Modified" somehow -- though again, I cannot tell you whether your PHP tweak is what is required.
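If that last_modified directive from your phpinfo() output is indeed the relevant switch (an assumption on my part -- check the PHP manual's Apache-module section), it would be flipped on in php.ini or, where your host allows per-directory overrides, with something like this in .htaccess:

# Assumption: last_modified is the Apache-handler directive shown in phpinfo()
php_flag last_modified on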

But in addition, you will also need to check (in your script) the date/time in the If-Modified-Since header sent by the client against the Last-Modified date of the file -- or the date of the last *significant* change to the content produced by your script. If there have been no *significant* changes since the If-Modified-Since date/time sent by the client, then send a 304 Not Modified status response with no content-body. If there have been changes, then send a 200 OK response status, a new Last-Modified header, and the updated content.

This function may be available as a library function -- I do not know.
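As a rough sketch only -- the timestamp source here is hypothetical, and your script would substitute whatever marks its last *significant* change (a database column, a data-file mtime, etc.) -- the check might look like this in PHP:

<?php
// Hypothetical source of the "last significant change" timestamp.
$lastChanged = filemtime('content-data.txt');

if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])) {
    $since = strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']);
    if ($since !== false && $since >= $lastChanged) {
        // Nothing significant has changed: send 304 with no content-body and stop.
        header('HTTP/1.1 304 Not Modified');
        exit;
    }
}

// Changed (or no conditional header at all): 200 OK with a fresh Last-Modified.
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastChanged) . ' GMT');
// ... generate and output the page as usual ...
?>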

Your definition of *significant* changes will greatly affect the efficiency of this HTTP feature: If you consider minor changes to be important, then you will end up sending new content almost every time a request is received, and the page-loading speed and bandwidth-reduction advantages of client-side and network caching will be lost. If, on the other hand, you set the bar too high, then visitors may see old or stale content for a long time. The basic problem is that the Last-Modified date/time of script-generated content is a very "fuzzy" concept.

If your content doesn't change often, then also look into server-side "caching" -- saving static-file copies of dynamically-generated pages and returning those, instead of running the PHP script at all, when those files are still "current." Also look into the "Expires" and "Cache-Control" headers. It can get very complicated, but all of these headers can 'play together' to significantly reduce the load on your server and to improve the page-load time and visitor experience on your site.
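For example (the one-hour lifetime below is purely illustrative -- pick a value that matches how often your content really changes), a PHP script can add those freshness headers like so:

<?php
// Illustrative only: allow browsers and caches to reuse this page for one hour.
$ttl = 3600;
header('Cache-Control: public, max-age=' . $ttl);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $ttl) . ' GMT');
?>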



4:56 pm on Mar 27, 2010 (gmt 0)

10+ Year Member

Thanks for the great information, Jim. This is much more technical than I had assumed. I just want to prevent search engines from crawling the same pages over and over again when the content hasn't changed, which is a big problem on a site with thousands of pages.

Maybe I'll just play with the meta robots tag to tell the search engines when to crawl each page, instead of messing around with If-Modified-Since/Last-Modified, in order to avoid accidentally breaking something.


6:39 pm on Mar 27, 2010 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

The robots meta tag will not do what you want. Search engines still have to fetch the page to see the tag; only then do they decide whether or not to show that page in the SERPs.

If-Modified-Since actually reduces the served bandwidth by returning a shorter response instead of the whole page.


9:15 pm on Mar 30, 2010 (gmt 0)

10+ Year Member

I'm using less than half a percent of my monthly bandwidth allowance. My biggest concern at the moment is getting my content-rich pages indexed in the search engines, so I think putting a noindex on all my pages that don't have content will be a good idea, even if it doesn't save any bandwidth for me.


11:25 pm on Mar 30, 2010 (gmt 0)

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

In addition to saving your server bandwidth, implementing Last-Modified and correct Cache-Control headers will save your users' time by speeding up page loading (any shared objects such as images, CSS, and external JS will not have to be re-loaded). And since Google has declared that page-load speed is now a ranking factor, this is still something to consider...

