Now, now. All he was doing was passing on some info when I asked him about cutting down on the freshbot load.
If he really wanted to get webmasters to work on it, he would tell them they'd get some of that bandwidth back in the form of deeper crawls. Or drop some sort of hint that sites returning 304s get a very minor boost in the SERPs.
Imagine how much faster surfing would be if they were to do that!
I think Yidaki had a problem with the September Update :) !
For anyone on a hosted Apache web server with no access to the server config, this might save you a few hours.
Create an .htaccess file (Notepad will do) containing:
|ExpiresActive On
ExpiresDefault "modification plus 1 day"
Header append Cache-Control "public" |
Place the .htaccess in your root directory. Note that this needs mod_expires and mod_headers enabled on the server (without `ExpiresActive On`, the `ExpiresDefault` line is ignored). More info: [linux.oreillynet.com]
This sets the Last-Modified header for Googlebot, while the one-day Expires header makes sure returning visitors still re-check pages (most browsers store and honor Expires).
Try the Cacheability Engine [mnot.net] (no affiliation) to check header results.
GG, please let us know if this triggers any penalty filters.
Had no problem at all! I'm doing better than ever - although I lost some positions for general one-word searches, deeper, more specific pages have now moved into the top results - and I have a lot of them!
I was just joking a bit in my previous post ... :)
BTW: it's often misinterpreted when someone here at WebmasterWorld writes about irrelevant and/or spammy Google results. It's not always complaining - and I wasn't complaining at all. My critique has nothing to do with my own results ... although my "good positions" are sometimes surrounded by spam! ;) However, it's obvious that Google taught SEOs a few things with the latest changes, don't you think?
My mistake - I'm just a rookie on this forum. I agree wholeheartedly that Google has taught some SEOs a thing or two.
Does Google use ETags (with If-None-Match validation)?
|Does Google use ETags (with If-None-Match validation)? |
Good question! GG, please answer that.
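No answer from Google in the thread, but the server side of ETag / If-None-Match validation is simple enough to sketch. This is an illustrative Python sketch, not anything Google has confirmed it honors; `make_etag` and `respond` are hypothetical names, and hashing the body is just one common way to derive an ETag:

```python
import hashlib

def make_etag(body):
    # One common approach: derive a strong ETag from the body bytes.
    return '"%s"' % hashlib.md5(body).hexdigest()

def respond(body, if_none_match=None):
    """Handle a conditional GET: 304 if the client's cached copy is current."""
    etag = make_etag(body)
    if if_none_match == etag:
        # Client's ETag still matches: send headers only, no body.
        return 304, {"ETag": etag}, b""
    return 200, {"ETag": etag}, body

page = b"<html>example</html>"
status, headers, payload = respond(page)                   # first visit: 200, full page
status, headers, payload = respond(page, headers["ETag"])  # revisit: 304, empty body
```

Same bandwidth win as If-Modified-Since, but based on content identity rather than a timestamp.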
|...adding this flag will help you... |
How will it help webmasters?
|How will it help webmasters? |
It reduces your bandwidth.
It frees up googlebot to crawl more pages. Some of those pages might be yours.
As this request mostly applies to dynamic pages, you'll also reduce the load on your system, giving faster response times to your users.
Sending a 304 to a user will lead to a much faster loading page than having to transfer the page before rendering it.
Given some time, I'm sure that I would come up with more.
It also would not be difficult for Google to implement a system where they will watch sites that use IMS more closely with their freshbot. Crawling deeper on sites that implement IMS would also be a possibility.
I can see where adding this may help some webmasters, but for me I don't see an advantage on any of my sites.
I pay for unlimited bandwidth, so bandwidth is not a concern for me.
Almost all my sites get a fresh tag everyday or every other day - so Google is crawling all my pages anyway.
I have very few dynamic pages, and the few I have all load in less than 5 seconds on 56k dialup - faster response time? I don't think my users will care about a nanosecond or two.
As I said, perhaps some webmasters may benefit from using this flag - but I don't see any benefit in implementing it myself, unless using it is going to improve my rankings.
For hackers like me, this thread is hard to follow. GoogleGuy mentioned modifications to the server. Does this apply to folks who have a website hosted on a virtual server?
What language do you use for your website?
Straight HTML - The server is most likely already set up and you are fine. If you are on a virtual server, it's up to the administrator, but the default setting is to support IMS.
HTML + SSI - It is up to you to set it. There was mention in the first couple of pages on how to do this.
PHP - You are responsible for sending this header using the header() function.
ASP - works in a similar way to PHP I assume.
Perl, c, other cgi languages - Don't know.
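For the languages where you have to emit the headers yourself, the logic is the same everywhere: compare the client's If-Modified-Since value against the page's real modification time, and short-circuit with a 304 when nothing changed. A sketch in Python (the function name `conditional_get` is mine; in PHP you would emit the same headers with `header()`):

```python
from email.utils import formatdate, parsedate_to_datetime

def conditional_get(last_modified_ts, if_modified_since=None):
    """Decide between 200 and 304 for a request.

    last_modified_ts: Unix timestamp of the page's last change.
    if_modified_since: raw If-Modified-Since header value, or None.
    """
    lm_header = formatdate(last_modified_ts, usegmt=True)
    if if_modified_since:
        try:
            ims = parsedate_to_datetime(if_modified_since).timestamp()
        except (TypeError, ValueError):
            ims = None  # unparseable date: fall through to a full 200
        if ims is not None and last_modified_ts <= ims:
            # Page unchanged since the client's copy: headers only.
            return 304, {"Last-Modified": lm_header}
    return 200, {"Last-Modified": lm_header}
```

First request gets a 200 with a Last-Modified header; a revisit that echoes that header back gets the 304.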
I have seen no information on what Google thinks about deploying gzip support. And as far as I can tell, nobody has mentioned the tradeoff between bandwidth reduction and processor load. The (de)compression of zipped files could reduce the speed of the servers on both sides - bot server and customer web server - perhaps that's why Google has been so careful until now. But I am sure they will implement it :)
I have seen only one bot that fetched gzipped pages: the Openbot from the Openfind engine.
>I could see nobody mentioned the tradeoff between bandwidth reduction and processor load. The (de)compression of zipped files would/could reduce the speed of the servers on both sides
Because it reduces network use, it would generally lower download time. Besides, if you are serving static files you can use Apache's mod_negotiation and pre-compress your files, so your server's CPU usage doesn't go up.
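To make the pre-compression point concrete, here is a small Python sketch (nothing Apache- or Google-specific in it): the gzip work happens once, at publish time, and typical repetitive HTML shrinks a great deal.

```python
import gzip

# Typical HTML is highly repetitive, so it compresses very well.
html = b"<html><body>" + b"<p>Some repeated page content.</p>" * 200 + b"</body></html>"

# Compress once at publish time and write page.html.gz next to page.html.
# Apache (mod_negotiation / MultiViews) can then serve the .gz file
# directly, so no CPU is spent compressing on every request.
compressed = gzip.compress(html, compresslevel=9)

print("raw: %d bytes, gzipped: %d bytes" % (len(html), len(compressed)))
```

The per-request CPU cost only appears when you compress dynamic pages on the fly (mod_gzip style); static content can pay it once.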
> Because it reduces network use it would generally
> lower download time.
Indeed, but it perhaps raises costs on Google's side to compensate for the loss in performance. Though it saves traffic costs in the long run, it could require a sizable investment in the short run.
> Besides if you are serving
> static files you can use Apache's mod_negotiation
> and pre-compress your files so your server's cpu
> usage don't go up.
I agree with you - we use mod_gzip ourselves - but on one hand it's a question of the mix of dynamic and static pages, and on the other hand, do you believe everybody would implement it the smart way? ;-)
Nevertheless - I love google and mod_gzip :)