WebmasterWorld forum thread: wildcards in page names

jimsthoughts
I've got a new client who, in his robots.txt, has these 2 lines with wildcards at the end:

Disallow: /store/scripts/emailFriend.asp*
Disallow: /store/scripts/contactUs.asp?emailSubject*

All pages under "store/scripts/" have been put in the supplemental results, and have no cache.
Could that wildcard at the end confuse Google (problems with Yahoo too, but not MSN) into not knowing what "emailFriend.asp*" or "contactUs.asp?emailSubject*" means, and thus just not indexing anything under "scripts/"?
* doesn't work as a wildcard in standard robots.txt.

The syntax to disallow a directory is simply:

User-agent: *
Disallow: /store/scripts/

which would block everybody from indexing anything in scripts.
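To see why the plain directory rule works, here is a minimal sketch of the classic robots.txt matching behaviour: the Disallow value is treated as a literal path prefix, not a pattern. (Illustrative only; `is_disallowed` is a hypothetical helper, and real parsers also handle multiple records, Allow lines, and percent-encoding.)

```python
def is_disallowed(path: str, rule: str) -> bool:
    """Classic robots.txt semantics: the rule is a literal path prefix."""
    return path.startswith(rule)

rule = "/store/scripts/"
print(is_disallowed("/store/scripts/emailFriend.asp", rule))    # True
print(is_disallowed("/store/scripts/contactUs.asp?x=1", rule))  # True
print(is_disallowed("/store/index.asp", rule))                  # False
```

Because matching is by prefix, one rule covers every page under the directory with no wildcard needed.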
Disallow: /store/scripts/emailFriend.asp*
Disallow: /store/scripts/contactUs.asp?emailSubject*

Each of these is either going to be ignored, or will lead to just those two pages being ignored.

Or could it wipe out the whole directory under that page? Sanenet
It COULD... but it shouldn't. The * should be ignored according to the specs. Reid
A wildcard at the end of a line is pointless.

Disallow: /store/scripts/emailFriend.asp*

is no different from

Disallow: /store/scripts/emailFriend.asp

since any URL beginning with "/store/scripts/emailFriend.asp" will be disallowed anyway.

Only googlebot (and a few select others) support a wildcard in the Disallow line, so wildcards should only appear in records directed at those specific robots. You should never use a wildcard in the Disallow field of a User-agent: * record.

This robots.txt would cause an error for all bots except googlebot, and for googlebot it would be pointless, since any characters after the end of the matched string are included in a match anyway.
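The two behaviours described above can be sketched side by side: a standard parser takes the rule literally (so a trailing "*" makes it match nothing, since no URL path contains a literal asterisk), while a googlebot-style parser expands "*" to "any characters", which makes the trailing wildcard redundant. This is a simplified sketch under stated assumptions (the helper names are hypothetical, and real googlebot matching also supports "$" end anchors).

```python
import re

def literal_prefix_match(path: str, rule: str) -> bool:
    # Standard robots.txt: "*" is just an ordinary character in the prefix.
    return path.startswith(rule)

def googlebot_style_match(path: str, rule: str) -> bool:
    # Googlebot-style: "*" matches any run of characters (simplified).
    pattern = ".*".join(re.escape(part) for part in rule.split("*"))
    return re.match(pattern, path) is not None

path = "/store/scripts/emailFriend.asp?to=bob"

# Literal matching: the "?" in the URL never equals the "*" in the rule.
print(literal_prefix_match(path, "/store/scripts/emailFriend.asp*"))   # False
# Wildcard matching: the rule matches...
print(googlebot_style_match(path, "/store/scripts/emailFriend.asp*"))  # True
# ...but so does the same rule with the "*" removed, so it adds nothing.
print(googlebot_style_match(path, "/store/scripts/emailFriend.asp"))   # True
```

The last two lines are the point of the post: for googlebot the trailing "*" changes nothing, and for everyone else it can only break the rule.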