
Forum Moderators: goodroi


Disallow: /*? - is it OK?

     
7:51 am on Oct 8, 2008 (gmt 0)

10+ Year Member



I want to block all affiliate links, so I created my robots.txt as...

User-agent: *
Disallow: /*?

Is it OK for Google and other search engines? If not, please guide me to the correct one.

3:53 pm on Oct 8, 2008 (gmt 0)

5+ Year Member



This will block all URLs that contain a ? in them. Is that what you want to do?

I have found this to be helpful:

[google.com...]
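To see what `Disallow: /*?` actually matches, here is a minimal sketch of Google-style wildcard matching. Note that Python's standard-library `urllib.robotparser` does not implement wildcards, so this hand-rolled translation of a rule into a regex is purely illustrative, not an official parser:

```python
import re

def rule_to_regex(rule: str) -> re.Pattern:
    """Translate a Google-style robots.txt rule into a regex (sketch).

    '*' matches any sequence of characters; a trailing '$' anchors
    the end of the URL path; everything else matches literally.
    """
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    body = ".*".join(re.escape(part) for part in rule.split("*"))
    return re.compile("^" + body + ("$" if anchored else ""))

disallow = rule_to_regex("/*?")

for path in ["/page?aff=123", "/shop/item?id=7", "/about.html"]:
    blocked = disallow.match(path) is not None
    print(path, "-> blocked" if blocked else "-> allowed")
```

Running this shows `/page?aff=123` and `/shop/item?id=7` blocked but `/about.html` allowed, i.e. the rule catches every URL whose path-plus-query contains a ?.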

1:14 pm on Oct 9, 2008 (gmt 0)

10+ Year Member



Yes, and thanks for the link.
3:13 pm on Oct 9, 2008 (gmt 0)

WebmasterWorld Senior Member jimbeetle is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Just be sure you know exactly what you want to do here. Robots.txt blocks spiders from crawling URLs on your site. Spiders will still follow outgoing links unless the pages on your site where those links appear are themselves blocked.
3:16 pm on Oct 9, 2008 (gmt 0)



I've had no issues with that syntax. Note that the safest approach is to disallow only those bots known to understand the wildcard, which I believe are googlebot, msnbot, and slurp among the majors.

Unfortunately, the robots standard has developed into not much of a standard at all. I believe the below should work:


User-agent: googlebot
User-agent: slurp
User-agent: msnbot
Disallow: /*?
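The record above relies on consecutive User-agent lines sharing the rule lines that follow them. A simplified sketch of that grouping behaviour (it ignores Allow lines, comments, and other directives, so it is not a complete robots.txt parser):

```python
def parse_records(robots_txt: str) -> dict:
    """Group robots.txt lines into {user-agent: [disallow rules]}.

    Consecutive User-agent lines form one record and share the
    Disallow lines that follow, so one rule can cover several bots.
    Simplified sketch: Allow lines and comments are not handled.
    """
    rules = {}
    current_agents = []
    collecting_agents = True
    for line in robots_txt.splitlines():
        line = line.strip()
        if not line:
            # A blank line ends the current record.
            current_agents = []
            collecting_agents = True
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if not collecting_agents:
                # A User-agent after rules starts a new record.
                current_agents = []
                collecting_agents = True
            current_agents.append(value)
            rules.setdefault(value, [])
        elif field == "disallow":
            collecting_agents = False
            for agent in current_agents:
                rules[agent].append(value)
    return rules

robots = """User-agent: googlebot
User-agent: slurp
User-agent: msnbot
Disallow: /*?
"""
print(parse_records(robots))
# each of the three agents ends up with the same ['/*?'] rule list
```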
1:22 pm on Oct 21, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Don't forget that you need at least one blank line after the last record too.

User-agent: googlebot
User-agent: slurp
User-agent: msnbot
Disallow: /*?

 
