I have started working with robots.txt to try to understand what is being spidered, what I need to filter, and so on.
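For reference, this is roughly the kind of minimal robots.txt I've been experimenting with (the disallowed path is just a placeholder, not my actual layout):

```
# Applies to all crawlers
User-agent: *
# Keep crawlers out of a private area (placeholder path)
Disallow: /private/
# Everything else is allowed by default
```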
While testing (using the spider simulator on Search Engine World), I noticed that my site refuses the spider: even without a robots.txt in place, I get a 403 Forbidden error, and the message says I have to allow directory browsing. I have other sites that don't have directory browsing enabled, and they get spidered fine. I compared the folder permissions and they are the same; even the IIS properties are the same. Just to test, I enabled directory browsing and the spider got through, but obviously I don't want to leave it like that. Any suggestions as to what the issue might be?