All pages under "/store/scripts/" have been put in the supplemental results and have no cache.
Could that wildcard at the end confuse Google (there are problems with Yahoo too, though not MSN) into not knowing what "emailFriend.asp*" or "contactUs.asp?emailSubject*" means, and thus into not indexing anything under "scripts/"?
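For reference, the rules in question presumably look something like this (a sketch reconstructed from the patterns quoted above, not the poster's exact file):

User-agent: *
Disallow: /store/scripts/emailFriend.asp*
Disallow: /store/scripts/contactUs.asp?emailSubject*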
The syntax to disallow a directory is simply:
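User-agent: *
Disallow: /store/scripts/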
That would block every compliant bot from crawling anything under /store/scripts/.
Will either of those wildcard lines be ignored outright, or will they just cause those two pages to be skipped?
Disallow: /store/scripts/emailFriend.asp*

is no different than

Disallow: /store/scripts/emailFriend.asp
since any URL beginning with "/store/scripts/emailFriend.asp" will be disallowed anyway.
Only Googlebot (and a few select others) supports a wildcard in the Disallow line, so wildcard rules should be directed only at those specific robots. You should never use a wildcard in the Disallow field under User-agent: *.
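If you do want a wildcard rule, a safer layout (a sketch; the *.asp pattern is illustrative, not from the original post) is to give Googlebot its own record and keep the catch-all record wildcard-free:

User-agent: Googlebot
Disallow: /store/scripts/*.asp

User-agent: *
Disallow: /store/scripts/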
The original robots.txt, by contrast, would cause an error for all bots except Googlebot, and even for Googlebot the trailing wildcards would be pointless, since Disallow rules are prefix matches: any characters after the end of the pattern are included in a match anyway.
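For instance (the query string here is illustrative, not from the thread):

Disallow: /store/scripts/emailFriend.asp

already blocks /store/scripts/emailFriend.asp?product=123, or any other URL that starts with that prefix, so a trailing * adds nothing.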