
http compression & Googlebot

         

NovaW

11:05 pm on Jun 17, 2003 (gmt 0)

10+ Year Member



This may be a dumb question - I have not been able to find a clear answer by searching.

If the server is set to use HTTP compression, is Google's spider capable of decompressing it, or will it see the junky compressed file?

The compression gives a good benefit, but it's no use if it makes pages unreadable by spiders.

Thanks!

nuhkweb

11:18 pm on Jun 17, 2003 (gmt 0)

10+ Year Member



Hi,

As far as I know, files are not stored compressed on the server. Only when a browser that can accept compression asks for a file does the server compress that file on the fly and send it compressed to the user.

If a browser that does not support compression asks for a file, the uncompressed file is sent.

So when Google's crawler asks for a file, the server will know whether or not the crawler can read compressed files and will send a version Google can read.
Don't worry.
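In code terms, the server's decision looks roughly like this (a minimal Python sketch of the idea; `negotiate_body` is just an illustrative name, not a real server API):

```python
import gzip

def negotiate_body(page, accept_encoding):
    """Return (body, content_encoding) based on the client's Accept-Encoding header."""
    if "gzip" in accept_encoding.lower():
        # Client advertised gzip support: compress on the fly.
        return gzip.compress(page), "gzip"
    # Otherwise send the page exactly as stored.
    return page, None

page = b"<html><body>Hello, crawler!</body></html>" * 40
compressed, enc = negotiate_body(page, "gzip, deflate")  # gzip-capable client
plain, no_enc = negotiate_body(page, "")                 # client sent no Accept-Encoding
```

The same file on disk, two different responses on the wire.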

Greetings

JasonHamilton

12:23 am on Jun 18, 2003 (gmt 0)

10+ Year Member



The web server only sends compressed pages if the browser explicitly states, via the Accept-Encoding request header, that it can accept them.

Googlebot doesn't appear to send that header, so the pages served to it are not compressed.

I wish the crawlers would use compression, since they use up a lot of bandwidth.
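For what it's worth, all a compression-aware crawler would have to do is send that one header and gunzip whatever comes back. A rough Python sketch (the "ExampleBot" user-agent and the `decode_body` helper are made up for illustration):

```python
import gzip
import urllib.request

# Request headers a compression-aware crawler might send (hypothetical bot name).
headers = {
    "User-Agent": "ExampleBot/1.0",
    "Accept-Encoding": "gzip",
}
req = urllib.request.Request("http://example.com/", headers=headers)

def decode_body(raw, content_encoding):
    # Undo the transfer compression, if the server applied any.
    if content_encoding == "gzip":
        return gzip.decompress(raw)
    return raw
```

The server is free to ignore the header and send the page uncompressed, which is why the client has to check Content-Encoding on the response rather than assume.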

NovaW

1:59 am on Jun 18, 2003 (gmt 0)

10+ Year Member



Thanks guys!