Forum Library, Charter, Moderators: goodroi

Sitemaps, Meta Data, and robots.txt Forum

Testing robots.txt
Why won't my site get spidered?

 7:51 pm on Feb 4, 2004 (gmt 0)

I have started working with robots.txt to try to understand what is being spidered and what I need to filter, etc.

In testing I noticed (using the spider simulator on Search Engine World) that my site refuses the spider, even when I don't have a robots.txt: I get a 403 denied error, and it says that I have to allow directory browsing. I have other sites that don't have directory browsing allowed, and they get spidered. I looked at the folder permissions and they are the same; even the IIS properties are the same. Just to test it, I checked "allow directory browsing" and it worked, but obviously I don't want to leave it like that. Any suggestions as to what might be the issue?




 5:56 pm on Feb 5, 2004 (gmt 0)

Maybe your directory doesn't have a default page (like index.html)?
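A quick way to check is to fetch the URL the way a spider would and look at the HTTP status code. Here is a minimal sketch in Python (the URL and user-agent string below are placeholders, not anything from this thread): if IIS finds no default document in the directory and directory browsing is off, the request typically comes back 403 instead of 200.

```python
import urllib.request
import urllib.error

def simulate_spider(url, user_agent="Mozilla/5.0 (compatible; TestSpider/1.0)"):
    """Request a URL the way a simple spider would and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status  # 200 means a default page was served
    except urllib.error.HTTPError as e:
        return e.code  # e.g. 403 when no default document exists and browsing is off

# Usage (hypothetical URL):
#   simulate_spider("http://example.com/")
```

If this returns 403 for the site root but 200 for a direct page URL like `/somepage.html`, that points at the missing-default-document explanation rather than permissions.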


WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved