
Testing robots.txt

Why won't my site get spidered?



7:51 pm on Feb 4, 2004 (gmt 0)

10+ Year Member

I have started working with robots.txt to try and understand what is being spidered and what I need to filter, etc.
While testing with the spider simulator on Search Engine World, I noticed that my site refuses the spider. Even if I don't have a robots.txt, I get a 403 denied error and it says that I have to allow directory browsing. I have other sites that don't have directory browsing allowed and they get spidered fine. I compared the folder permissions and they are the same, and even the IIS properties are the same. Just to test it, I turned on directory browsing and it worked, but obviously I don't want to leave it like that. Any suggestions as to what might be the issue?
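One way to see what a crawler actually receives, independent of any simulator, is to request robots.txt and the home page directly and look at the HTTP status codes. This is a minimal present-day sketch; the hostname is a placeholder, and the User-Agent string is just an example:

import urllib.request
import urllib.error

# Placeholder hostname; replace with the site being tested.
BASE = "http://www.example.com"

def fetch_status(path):
    """Request a URL roughly the way a crawler would and return the HTTP status code."""
    req = urllib.request.Request(
        BASE + path,
        headers={"User-Agent": "robots-txt-test/1.0"},  # example UA, not a real spider
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.getcode()
    except urllib.error.HTTPError as e:
        return e.code  # e.g. 403 when the server refuses the request

for path in ["/robots.txt", "/"]:
    print(path, "->", fetch_status(path))

If "/" comes back 403 while other sites on the same server return 200, the problem is with how that site answers a request for its root URL rather than with robots.txt itself.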



5:56 pm on Feb 5, 2004 (gmt 0)

10+ Year Member

Maybe your directory doesn't have a default page (like index.html)?
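If that is the cause, a request for the site root has nothing to serve: IIS looks for one of its configured default documents and, with directory browsing disabled, returns 403 when none is found. A quick sketch of that check, assuming a typical web root path and a typical default-document list (both are assumptions; the actual values are set in the site's IIS configuration):

import os

# Hypothetical web root; adjust to the site's home directory in IIS.
WEB_ROOT = r"C:\Inetpub\wwwroot\mysite"

# Common default-document names; the real list is whatever IIS is configured with.
DEFAULT_DOCS = ["default.htm", "default.asp", "index.htm", "index.html"]

found = [name for name in DEFAULT_DOCS if os.path.isfile(os.path.join(WEB_ROOT, name))]
if found:
    print("Default document present:", found[0])
else:
    print("No default document found; with directory browsing off, IIS returns 403 for '/'.")

Adding a default page (or adding your existing home page's filename to the default-document list) lets IIS serve it without enabling directory browsing.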
