Why should you care about "IMS" (the If-Modified-Since request header)? When a smart spider like Googlebot comes around, IMS lets you tell the spider that a page hasn't changed. Then Googlebot can use the old copy of the page. That frees up the bot to download more pages while saving bandwidth. Because of the bandwidth savings, IMS hits are almost "free" in terms of server load. Plain Apache can serve _lots_ of IMS queries per second before slowing a machine down.
IMS can work for dynamically generated pages too. Someone posted how to do it for PHP-generated pages, for example. The bottom line is that if your server supports IMS correctly, you can tell Googlebot about more pages without as much server load or bandwidth on your part. As Google crawls more often to make the web a fresher place, adding this flag will help you and search engines.
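The handshake described above can be sketched in a few lines. This is an illustration, not from the thread; the `respond` helper and the sample page body are made up, but the comparison logic is the standard conditional-GET rule:

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

def respond(page_mtime, if_modified_since=None):
    """Decide between a full 200 response and an empty 304.

    page_mtime: when the page last changed (timezone-aware datetime).
    if_modified_since: value of the If-Modified-Since request header,
    or None if the client sent an unconditional GET.
    """
    if if_modified_since is not None:
        client_copy = parsedate_to_datetime(if_modified_since)
        if page_mtime <= client_copy:
            # The client's cached copy is still current:
            # send headers only, no body.
            return 304, ""
    # Page changed since the client's copy (or no IMS header at all):
    # send the full page.
    return 200, "<html>...full page...</html>"

mtime = datetime(2003, 1, 10, tzinfo=timezone.utc)
status, body = respond(mtime, format_datetime(mtime))  # status is 304
```

The bandwidth saving is the entire page body on every 304; the spider already has the content and only needs confirmation that it is still valid.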
If he really wanted to get webmasters to work on it, he would tell them they'd get some of that bandwidth back in the form of deeper crawls. Or drop some sort of hint that sites returning 304s get a very minor boost in the SERPs.
Imagine how much faster surfing would be if they were to do that!
Create a .htaccess file using Notepad with the following (this needs mod_headers and mod_expires enabled; note that ExpiresDefault does nothing without ExpiresActive On):
Header append Cache-Control "public"
ExpiresActive On
ExpiresDefault "modification plus 1 day"
Place the .htaccess file in your root directory. More info [linux.oreillynet.com]
This keeps the Last-Modified header for Googlebot while setting the Expires header one day ahead, so returning visitors still re-check pages after a day (most browsers check/store the Expires header).
Try the Cacheability Engine [mnot.net] (no affiliation) to check header results.
GG please let us know if this triggers any penalty filters?
I had no problem at all! I'm doing better than ever - although I lost some positions for general one-word searches, deeper, more specific pages have now moved into the top results - and I have a lot of them!
I was just joking a bit in my previous post ... :)
BTW: it's often misinterpreted when someone here at WebmasterWorld writes about irrelevant and/or spammy Google results. It's not always complaining - and I didn't complain at all. My critique has nothing to do with my own results ... although my "good positions" are sometimes surrounded by spam! ;) However, it's obvious that Google taught SEOs a few things with the latest changes, don't you think?
> ...adding this flag will help you...
How will it help webmasters?
> How will it help webmasters?
It reduces your bandwidth.
It frees up googlebot to crawl more pages. Some of those pages might be yours.
As this request mostly applies to dynamic pages, you will be reducing the resource loading on your system, giving a faster response time to your users.
Sending a 304 to a user will lead to a much faster loading page than having to transfer the page before rendering it.
Given some time, I'm sure that I would come up with more.
It also would not be difficult for Google to implement a system where they will watch sites that use IMS more closely with their freshbot. Crawling deeper on sites that implement IMS would also be a possibility.
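To put a rough number on the bandwidth point above: a 304 carries headers only, so the saving per hit is essentially the whole page body. A toy Python comparison (the 30 KB body size is illustrative, not from the thread):

```python
# A full 200 response carries headers plus the entire page body;
# a 304 Not Modified carries headers alone.
full_response = (
    b"HTTP/1.1 200 OK\r\n"
    b"Last-Modified: Fri, 10 Jan 2003 00:00:00 GMT\r\n"
    b"Content-Type: text/html\r\n\r\n"
    + b"x" * 30_000  # stand-in for a typical ~30 KB page body
)
not_modified = (
    b"HTTP/1.1 304 Not Modified\r\n"
    b"Last-Modified: Fri, 10 Jan 2003 00:00:00 GMT\r\n\r\n"
)
saving = len(full_response) - len(not_modified)  # nearly the full body
```

Multiply that saving by every page Googlebot re-checks per crawl and the "almost free" claim earlier in the thread makes sense.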
I pay for unlimited bandwidth, so bandwidth is not a concern for me.
Almost all my sites get a fresh tag everyday or every other day - so Google is crawling all my pages anyway.
I have very few dynamic pages, and what few I have all load in less than 5 seconds on a 56k dialup - faster response time? I don't think my users will care about a nanosecond or two.
As I said, perhaps some webmasters may benefit from using this flag - but I don't see any benefit to implementing it myself, unless using it is going to improve my rankings.
What language do you use for your website?
Straight HTML - The server is most likely already set up for this and you are fine. If you are on a virtual server, it is up to the administrator, but the default setting is to support IMS.
HTML + SSI - It is up to you to set it. There was mention in the first couple of pages on how to do this.
PHP - You are responsible for sending this header using the header() function.
ASP - works in a similar way to PHP I assume.
Perl, c, other cgi languages - Don't know.
Because it reduces network use, it would generally lower download time. Besides, if you are serving static files you can use Apache's mod_negotiation and pre-compress your files so your server's CPU usage doesn't go up.
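A sketch of that pre-compression setup (mod_negotiation's MultiViews picks the .gz variant for clients that send Accept-Encoding: gzip; the file names here are illustrative, not from the post):

```apache
# Compress once on disk: gzip -9 -k page.html produces page.html.gz
# Requests for /page then negotiate to the .gz file when the
# client advertises gzip support.
Options +MultiViews
AddEncoding x-gzip .gz
```

Check the result with the cacheability checker mentioned earlier; your mileage may vary depending on Apache version and existing negotiation settings.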
Indeed, but it perhaps raises costs on Google's side to compensate for the loss in performance. Though it saves traffic costs in the long run, it could require a fair investment in the short run.
> Besides if you are serving
> static files you can use Apache's mod_negotiation
> and pre-compress your files so your server's cpu
> usage don't go up.
I agree with you - we use mod_gzip ourselves - but on one hand it's a question of the mix of dynamic and static pages, and on the other hand, do you believe that everybody would implement it the smart way? ;-)
Nevertheless - I love google and mod_gzip :)