Forum Moderators: open
On the majority of my sites, Googlebot requests files over HTTP/1.0, but on one particular site I see Googlebot using HTTP/1.1.
Is there a particular reason why this would be, and given the choice, which one would you rather have crawling your site?
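For anyone curious what the actual difference on the wire looks like, here's a rough sketch of the raw GET request each protocol version sends (the hostname and path are just placeholders, not from my logs). The key practical difference is that HTTP/1.1 requires a Host header, which matters for name-based hosting:

```python
# Sketch: minimal raw GET requests as a crawler would send them in
# each protocol version. "example.com" is just a placeholder host.

def build_request(path, host, version="1.1"):
    """Build a minimal raw HTTP GET request string."""
    lines = ["GET %s HTTP/%s" % (path, version)]
    if version == "1.1":
        # HTTP/1.1 requires a Host header, which is what makes
        # name-based virtual hosting reliable.
        lines.append("Host: %s" % host)
        # 1.1 connections are persistent unless closed explicitly.
        lines.append("Connection: close")
    return "\r\n".join(lines) + "\r\n\r\n"

print(build_request("/index.html", "example.com", "1.0"))
print(build_request("/index.html", "example.com", "1.1"))
```

An HTTP/1.0 request can legally omit the Host header entirely, so a server doing name-based hosting has to guess which site an HTTP/1.0 client wants.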
I've also noticed, when using a certain 'whois' service, that in the detailed results it returns, all my sites give the 'whois' bot a 206 response code, while some of my competitors' sites return a 200.
Using a server header checking tool, my sites return a 200, and visitors (including Googlebot) get a 200 as well. I only wonder because some of my competitors' sites return a 200 through the same 'whois' tool. Could it be that they are running an older version of Apache that doesn't handle 'partial GET' requests?
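My guess at what's happening: the 'whois' tool probably sends a Range header (e.g. "Range: bytes=0-499") to fetch only the start of the page and save bandwidth. A server that supports partial GETs answers 206 with just that slice; a server that ignores the header sends the whole page with a 200. Here's a toy sketch of that server-side logic (the function and sample page are made up for illustration):

```python
# Sketch of server-side Range handling: a client sending
# "Range: bytes=0-499" gets a 206 with a byte slice from a server
# that supports partial GETs, while a server ignoring the header
# sends the full body with a 200.

def respond(body, range_header):
    """Return (status, payload) for a GET with an optional Range header."""
    if range_header and range_header.startswith("bytes="):
        start, _, end = range_header[len("bytes="):].partition("-")
        start = int(start or 0)
        end = int(end) if end else len(body) - 1
        return 206, body[start:end + 1]
    return 200, body

page = b"<html>hello world</html>"
print(respond(page, "bytes=0-9"))   # partial GET -> 206, first 10 bytes
print(respond(page, None))          # plain GET   -> 200, full body
```

So a 206 from your server to that tool isn't an error at all; it just means your Apache honors Range requests and your competitors' servers (or their configs) don't.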
Thanks in advance for any thoughts on this.
I'm a bit confused about what you mean regarding how this relates to name-based hosting, but I'm looking into it.
I have unique IPs for all my sites, but the control panel does list my hosting type somewhere as 'name based'.
From what I'm reading on the Apache site, name-based hosting is easier to set up unless you need IP-based hosting for SSL and the like.
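For reference, a name-based setup in Apache looks roughly like this (hypothetical domains and paths; this is the Apache 1.3/2.0-era syntax, so check your own version's docs):

```apache
# Hypothetical name-based setup: both sites can share one IP because
# Apache routes requests by the Host header the client sends.
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName www.example-one.com
    DocumentRoot /var/www/example-one
</VirtualHost>

<VirtualHost *:80>
    ServerName www.example-two.com
    DocumentRoot /var/www/example-two
</VirtualHost>
```

The connection to the HTTP version question: an HTTP/1.0 request may carry no Host header at all, in which case Apache can't tell which name-based site was meant and falls back to the first listed vhost. That may be why a crawler's choice of 1.0 vs 1.1 matters more on name-based hosts.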
Look at your logfiles. You may see the byte count on your pages is much smaller than usual.
All common web servers (Apache, IIS) support dynamic gzip compression now, but many, many web hosts don't turn it on (web hosts: your customers and their customers want gzip). 56K modem users would benefit tremendously, as would Google, which could crawl the web roughly three times faster. Web page text would load three times faster too!
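The "three times faster" figure is easy to sanity-check: HTML and other text compress very well under gzip. A quick illustration (the sample page is made up and very repetitive, so the exact ratio is only indicative; real pages typically land around 3x or better):

```python
import gzip

# Rough illustration of the bandwidth claim: text compresses well
# under gzip. The sample page is fabricated, so the printed ratio
# is only indicative of the effect, not a real-world benchmark.
html = ("<html><body>"
        + "<p>Some repetitive page text.</p>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(html)
ratio = len(html) / len(compressed)
print("original: %d bytes, gzipped: %d bytes, ratio: %.1fx"
      % (len(html), len(compressed), ratio))
```

In Apache this is what mod_deflate (or mod_gzip on older 1.3 installs) does per-request when the client sends "Accept-Encoding: gzip".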
Web servers, and therefore hosts, also support precompressed web pages, which are a potential avenue for 'cloaking'.
Googlebot could be checking for cloaking.
For a lot more see this link:
[webmasterworld.com...]