Forum Moderators: goodroi


Can a robots.txt be too long?


gingerspice

7:58 pm on Oct 12, 2005 (gmt 0)

10+ Year Member



We have hundreds of folders and files on our web server that shouldn't be crawled, and I've been told by IT that they won't be cleaned up anytime soon. I've created a robots.txt file to disallow all of these files and directories. I end up disallowing about 200 files and top-level directories, leaving about 10 to be crawled.

Is there any chance crawlers could get bogged down with this amount of disallow instructions and leave the site altogether?

My company has never used a robots.txt file before, and there are many old, bad pages out there showing up in the search engines. I am trying to play clean-up.

Should I be concerned about using such a long robots.txt?

thank you,
gingerspice
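
Before deploying a file that long, it's easy to sanity-check that crawlers will still interpret it the way you intend. A minimal sketch using Python's standard-library robots.txt parser (the directory names here are made up for illustration; substitute your real paths):

```python
from urllib.robotparser import RobotFileParser

# Build a long rule set: one Disallow line per directory to block.
# "old-dir-N" is a hypothetical naming scheme, not from the thread.
rules = ["User-agent: *"]
rules += ["Disallow: /old-dir-%d/" % i for i in range(200)]

rp = RobotFileParser()
rp.parse(rules)

# Listed directories are blocked for all user agents...
print(rp.can_fetch("*", "/old-dir-5/page.html"))   # False
# ...while anything not listed remains crawlable.
print(rp.can_fetch("*", "/products/index.html"))   # True
```

The parser handles hundreds of Disallow lines without issue; the practical concern is keeping the file under crawlers' size limits and making sure no rule accidentally matches a directory you do want indexed.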

JerryOdom

8:00 pm on Oct 12, 2005 (gmt 0)

10+ Year Member



Check this out

[webmasterworld.com...]

crglmb

3:32 am on Oct 13, 2005 (gmt 0)

10+ Year Member



This is a good one also:

[whitehouse.gov]

Craig