Disallow: /*? - is it ok?
nex99 msg:3761098 7:51 am on Oct 8, 2008 (gmt 0)
I want to block all affiliate links, so I created my robots.txt as...
Is it OK for Google and other search engines? If not, please guide me to the correct one.
jeffposaka msg:3761360 3:53 pm on Oct 8, 2008 (gmt 0)
This will block all URLs with a ? in them. Is that what you want to do?
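[Editor's note: for anyone wondering how the wildcard is interpreted, Google documents * as matching any sequence of characters, so Disallow: /*? acts like a prefix pattern with a wildcard in it. Below is a rough Python sketch of that matching, not how any particular crawler is implemented; the paths are made-up examples, and the $ end-of-URL anchor Google also supports is left out for brevity.]

import re

def disallow_to_regex(pattern):
    # Escape regex metacharacters, then turn the escaped * back into ".*".
    # Robots rules are implicitly anchored at the start of the path and
    # open-ended at the end (unless the pattern ends in $, not handled here).
    return re.compile(re.escape(pattern).replace(r"\*", ".*"))

rule = disallow_to_regex("/*?")

# Hypothetical paths, for illustration only.
for path in ["/index.html", "/out.php?id=42", "/shop/item?aff=123"]:
    print(path, "->", "blocked" if rule.match(path) else "crawlable")

# /index.html        -> crawlable   (no ? anywhere in the URL)
# /out.php?id=42     -> blocked
# /shop/item?aff=123 -> blocked

[In short: any URL whose path plus query string contains a ? is disallowed; everything else stays crawlable.]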
I have found this to be helpful:
[google.com...]
nex99 msg:3762027 1:14 pm on Oct 9, 2008 (gmt 0)
Yes, and thanks for the link.
jimbeetle msg:3762117 3:13 pm on Oct 9, 2008 (gmt 0)
Just be sure you know exactly what you want to do here. Robots.txt is used to block spiders from crawling URLs on your site. They will still follow outgoing links unless the pages on your site on which the links appear are themselves blocked. In other words, Disallow: /*? keeps spiders away from query-string URLs on your own host, but it doesn't stop them following affiliate links that point straight at another domain.
Receptional Andy msg:3762120 3:16 pm on Oct 9, 2008 (gmt 0)
I've had no issues with that syntax. Note that the safest approach is to only disallow for those bots known to understand the wildcard, which I believe are googlebot, msnbot and slurp of the majors.
Unfortunately, the robots standard has developed into not much of a standard at all. I believe the record below is safe for those three:
User-agent: googlebot
User-agent: slurp
User-agent: msnbot
Disallow: /*?
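[Editor's note on why scoping the rule to named bots is "safe" (my illustration, not from the thread): a crawler obeys the record whose User-agent line matches it, and a bot that matches no record, when no User-agent: * record is present, is left unrestricted. A rough Python sketch of that grouping, with a made-up bot name for the unlisted case:]

RECORD = """
User-agent: googlebot
User-agent: slurp
User-agent: msnbot
Disallow: /*?
"""

def rules_for(agent, robots_txt):
    # Collect (user-agent names, disallow patterns) records, then return
    # the patterns from the first record whose names match the agent.
    records, names, disallows = [], [], []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if disallows:              # a Disallow line closed the previous record
                records.append((names, disallows))
                names, disallows = [], []
            names.append(value.lower())
        elif field == "disallow":
            disallows.append(value)
    if names:
        records.append((names, disallows))

    agent = agent.lower()
    for names, disallows in records:
        if "*" in names or any(name in agent for name in names):
            return disallows
    return []                          # no record applies: nothing is disallowed

print(rules_for("Googlebot/2.1", RECORD))     # ['/*?']
print(rules_for("SomeOtherBot/1.0", RECORD))  # []

[So a bot that has never heard of wildcards never sees the /*? rule at all, which is the point of listing only googlebot, slurp and msnbot.]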
g1smd msg:3770362 1:22 pm on Oct 21, 2008 (gmt 0)
Don't forget that you need at least one blank line after the last record too.
User-agent: googlebot
User-agent: slurp
User-agent: msnbot
Disallow: /*?