Forum Moderators: phranque


issue with dynamic robots.txt

         

topr8

1:11 pm on Apr 16, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I've been thrown a bit by a dynamically written robots.txt file.

I've rewritten robots.txt to robotstxtprocessingfile.php,

and it all works, except that when I view robots.txt in the browser all the directives appear on one long line. However, if I view source, they are on separate lines as expected.
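For reference, a rewrite like the one described might be set up in .htaccess along these lines (assuming Apache mod_rewrite; the exact rule wasn't posted in the thread):

```apache
# Internally serve the PHP script whenever robots.txt is requested
RewriteEngine On
RewriteRule ^robots\.txt$ /robotstxtprocessingfile.php [L]
```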

Code in the file:
<?php
echo "User-agent: *\r\n";
echo "Allow: yadda\r\n";
echo "Disallow: nothere\r\n";
exit;

As it appears in the browser:
User-agent: * Allow: yadda Disallow: nothere

As it appears in view source
User-agent: *
Allow: yadda
Disallow: nothere

Any thoughts? I'm stumped. Thanks!

topr8

2:16 pm on Apr 16, 2010 (gmt 0)




For future reference, I resolved this in the following way:

I figured it was a headers problem: robots.txt was being sent as text/html, so the browser rendered it as HTML and collapsed the line breaks.

I fixed it by putting this at the top of robotstxtprocessingfile.php:

header('Content-type: text/plain');
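Putting the two pieces together, the complete contents of robotstxtprocessingfile.php would look something like this (a sketch assembled from the code posted in this thread):

```php
<?php
// Send the correct MIME type first, before any output,
// so browsers render the directives as plain text on
// separate lines instead of collapsing them as HTML.
header('Content-type: text/plain');

echo "User-agent: *\r\n";
echo "Allow: yadda\r\n";
echo "Disallow: nothere\r\n";
exit;
```

Note that header() must be called before anything is echoed, otherwise PHP will report that headers have already been sent.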

jdMorgan

3:24 pm on Apr 16, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Note that "just in case," you might want to make that "Content-Type" with a capital "T."

Headers are "supposed to be" case-insensitive, but...

Jim