Forum Moderators: goodroi
Example:
User-agent: *
Disallow: /b
I found out that by not putting a slash after the b, I was restricting every URL whose path starts with /b, not just the /b directory itself! Argh!
Now that I have corrected it, will it still be possible to get Google et al. to include the once-restricted directories? How long should that take? I am now linking heavily to the pages in question, and they are included in my sitemap. Please tell me what else I can do. Thanks!
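The prefix matching described above is easy to confirm with Python's standard-library robots.txt parser. A minimal sketch (the example.com URLs are hypothetical, used only for illustration):

```python
from urllib.robotparser import RobotFileParser

# "Disallow: /b" blocks every path beginning with /b, not just a /b directory.
broad = RobotFileParser()
broad.parse(["User-agent: *", "Disallow: /b"])
print(broad.can_fetch("*", "http://example.com/blog/post.html"))   # False (blocked)
print(broad.can_fetch("*", "http://example.com/b/file.html"))      # False (blocked)

# With a trailing slash, the rule matches only paths under the /b/ directory.
narrow = RobotFileParser()
narrow.parse(["User-agent: *", "Disallow: /b/"])
print(narrow.can_fetch("*", "http://example.com/blog/post.html"))  # True (allowed)
print(narrow.can_fetch("*", "http://example.com/b/file.html"))     # False (blocked)
```

Running a check like this against your own robots.txt before deploying it would have caught the problem early.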
See also this thread [webmasterworld.com] about why it is sometimes not a good idea to have a trailing slash for disallowing directories.
I'd suggest making your directory names more meaningful and distinct from your other pages/resources, so your disallow rules refer only to the things you actually want to disallow.
Just make the changes to your robots.txt and naming conventions, and the pages will get indexed again eventually.