
Google Sitemaps and "robots.txt too long" error

   
2:20 am on Jun 13, 2006 (gmt 0)

10+ Year Member



When logged into my Google Sitemaps account, I tried to test my robots.txt and got the error message

" Must be at most 5000 characters "

I'm blocking Google from about 600 pages to avoid the duplicate content filter. (I tried a meta tag block, but Yahoo responds to the "googlebot" meta tag as well.)
(<META NAME="GOOGLEBOT" CONTENT="NOINDEX, NOFOLLOW">)

Question:
Is this just an error that Google Sitemaps is reporting, or can a robots.txt actually be too long?
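For reference, here's a quick Python sketch I could run to see how long my robots.txt actually is compared to the 5000 characters the error mentions (example.com is just a stand-in for my own site):

import urllib.request

LIMIT = 5000  # character figure quoted in the Sitemaps error
URL = "http://www.example.com/robots.txt"  # stand-in for your own site

with urllib.request.urlopen(URL) as response:
    robots = response.read().decode("utf-8", errors="replace")

print("robots.txt is %d characters" % len(robots))
if len(robots) > LIMIT:
    print("over the limit by %d characters" % (len(robots) - LIMIT))
else:
    print("under the limit")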

3:25 am on Jun 13, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



I get the error too; I also have a long robots.txt file.
In the box that shows the contents of your robots.txt, clear half of it and test that half against the various user agents.
Then check the other half of the file the same way.

If there are more than 5000 characters in that box (which only shows what is contained in your robots.txt; edits made there are not saved to your actual robots.txt file), you will also get the error.

Hope that helps; you're not alone on that message.
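If you want to prepare the two halves ahead of time, here's a rough Python sketch of that split-and-test idea, assuming the rule groups in your file are separated by blank lines so each half you paste into the box is still a readable set of rules:

from pathlib import Path

# split a local copy of robots.txt into two halves at a blank-line
# boundary, so each half can be pasted into the test box on its own
text = Path("robots.txt").read_text()
groups = text.split("\n\n")       # rule groups separated by blank lines
midpoint = len(text) // 2

first, used = [], 0
while groups and used + len(groups[0]) <= midpoint:
    block = groups.pop(0)
    first.append(block)
    used += len(block) + 2        # +2 for the blank line split() removed

half_a = "\n\n".join(first)
half_b = "\n\n".join(groups)
print("first half: %d characters" % len(half_a))
print("second half: %d characters" % len(half_b))
Path("robots_half_a.txt").write_text(half_a)
Path("robots_half_b.txt").write_text(half_b)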
Jake

12:38 am on Jun 14, 2006 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



I remember the Google URL Console (Removal Tool) has a limit of 50 lines for the robots.txt file.

It would be nice if all of the Google Tools had a consistent limit that was well advertised.
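If that 50-line recollection is right, a quick sketch to check a local copy of the file against both figures mentioned in this thread (these numbers are just what members have reported, not anything Google documents):

from pathlib import Path

text = Path("robots.txt").read_text()
print("%d characters (the Sitemaps tester complains above 5000)" % len(text))
print("%d lines (the URL removal tool reportedly stops at 50)" % len(text.splitlines()))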

 
