Forum Moderators: goodroi

Message Too Old, No Replies

Is the robots.txt header important?

It is being served with an HTML header instead of plain text

         

AjiNIMC

7:24 am on Jun 13, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The pages under domain X are not getting cached, and the header for its robots.txt says:


HTTP/1.x 200 OK
Date: Tue, 13 Jun 2006 07:06:47 GMT
Server: Apache/1.3.36 (Unix) mod_fastcgi/2.4.2 mod_auth_passthrough/1.8 mod_log_bytes/1.2 mod_bwlimited/1.4 PHP/4.4.2 FrontPage/5.0.2.2635.SR1.2 mod_ssl/2.8.27 OpenSSL/0.9.7a
X-Powered-By: PHP/4.4.2
Content-Type: text/html
Connection: close

The pages under domain Y are getting properly cached, and the header for its robots.txt says:


HTTP/1.x 200 OK
Date: Tue, 13 Jun 2006 07:07:19 GMT
Server: Apache/1.3.36 (Unix) mod_fastcgi/2.4.2 mod_auth_passthrough/1.8 mod_log_bytes/1.2 mod_bwlimited/1.4 PHP/4.4.2 FrontPage/5.0.2.2635.SR1.2 mod_ssl/2.8.27 OpenSSL/0.9.7a
X-Powered-By: PHP/4.4.2
Content-Type: text/xml
Connection: close

The only difference I can see is
Content-Type: text/html versus Content-Type: text/xml

Can this be a possible reason for Google not caching? What are the other possibilities for the pages not getting cached (or rather, losing their caches from Google)? I am assuming no penalty and no filters from Google, as these sites were doing well and are very helpful ones with no SEO done.

Thanks,
AjiNIMC

AjiNIMC

9:49 am on Jun 13, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There was a mistake in the header I posted. The corrected header for the working domain Y is:


HTTP/1.x 200 OK
Date: Tue, 13 Jun 2006 06:28:33 GMT
Server: Apache/1.3.36 (Unix) mod_fastcgi/2.4.2 mod_gzip/1.3.26.1a mod_auth_passthrough/1.8 mod_log_bytes/1.2 mod_bwlimited/1.4 PHP/4.4.2 FrontPage/5.0.2.2635.SR1.2 mod_ssl/2.8.27 OpenSSL/0.9.7a
Last-Modified: Tue, 02 May 2006 08:44:40 GMT
Etag: "3801cb-132-44571bf8"
Accept-Ranges: bytes
Content-Length: 306
Content-Type: text/plain
Age: 11710
Connection: keep-alive

Here the Content-Type is text/plain.

Thanks,
AjiNIMC

abates

9:51 pm on Jun 13, 2006 (gmt 0)

10+ Year Member



Why are you serving text files as text/html? You will find that many browsers will have problems and try to render the contents as HTML...

Dijkgraaf

7:28 am on Jun 14, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes. Try making sure it is serving robots.txt with
Content-Type: text/plain
and hopefully the problem will go away.
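
Since the Server headers above show Apache, one way to force the correct type is a small .htaccess rule. This is only a sketch, assuming .htaccess overrides are enabled for the directory containing robots.txt:

```apache
# Hypothetical .htaccess fragment: force robots.txt to be served as plain text.
# ForceType overrides whatever handler (e.g. PHP) is mangling the Content-Type.
<Files "robots.txt">
    ForceType text/plain
</Files>
```

After uploading, re-fetch the file headers to confirm the response now says Content-Type: text/plain.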

AjiNIMC

6:15 am on Jun 15, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Why are you serving text files as text/html? You will find that many browsers will have problems and try to render the contents as HTML...

This happened by mistake and I have corrected it.

Yes. Try making sure it is serving robots.txt with
Content-Type: text/plain
and hopefully the problem will go away.

Thanks, I have corrected it, and hopefully I will see some improvement with the cache. I will keep this thread updated with any developments.

Thanks,
AjiNIMC