Disallowing specific query strings

12:48 pm on June 8, 2004 (gmt 0)

New User · 10+ Year Member · joined: June 8, 2004 · posts: 2 · votes: 0

I'm rather new to the world of robots.txt, and I'm wondering if someone can help me out. I run a wiki powered by UseMod, and we've been having issues with people spamming the sandbox to increase their Google ranking. Can I disallow certain query strings on a Perl script while still allowing everything else?

For example, my wiki script lives at /cgi-bin/wiki.pl. The URL for the sandbox page is /cgi-bin/wiki.pl?Sandbox, the home page is /cgi-bin/wiki.pl?Home, and so on.

I still want Google to index all my other wiki.pl? pages, but I'd like it to skip indexing the sandbox (/cgi-bin/wiki.pl?Sandbox). If I do something like this, will it work?

User-agent: googlebot
Disallow: /cgi-bin/wiki.pl?Sandbox

12:56 pm on June 8, 2004 (gmt 0)

New User · 10+ Year Member · joined: March 11, 2004 · posts: 22 · votes: 0

I don't know if this works with robots.txt, but how about a different solution?

Since you use a Perl script, why not add a conditional <META NAME="ROBOTS" CONTENT="NOINDEX"> to your output when the query string contains "Sandbox"?
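
A minimal CGI sketch of that idea (the structure and variable names here are hypothetical, for illustration only; UseMod's actual rendering code is organized differently):

#!/usr/bin/perl
# Hypothetical sketch, not UseMod's real code: emit a noindex
# meta tag whenever the query string names the Sandbox page.
my $page = $ENV{'QUERY_STRING'} || '';

print "Content-type: text/html\n\n";
print "<HTML><HEAD><TITLE>Wiki</TITLE>\n";
# Only Sandbox URLs get the robots meta tag; every other page
# stays indexable.
print qq{<META NAME="ROBOTS" CONTENT="NOINDEX">\n} if $page =~ /^Sandbox/;
print "</HEAD><BODY>\n";
# ... normal page rendering would continue here ...
print "</BODY></HTML>\n";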

3:05 pm on June 8, 2004 (gmt 0)

New User · 10+ Year Member · joined: June 8, 2004 · posts: 2 · votes: 0

The .pl I use is from usemod.com, so if I ever want to upgrade, I'd have to rewrite that part of the code...

I was hoping there was an easier way to do it, perhaps through robots.txt.
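
For what it's worth, the original robots.txt standard treats a Disallow value as a plain prefix match against everything after the host name, query string included. So, assuming the crawler follows that prefix-matching rule, something like your original attempt should keep it off the Sandbox URLs while leaving the rest of wiki.pl crawlable:

User-agent: Googlebot
Disallow: /cgi-bin/wiki.pl?Sandbox

# Blocked by the prefix rule: /cgi-bin/wiki.pl?Sandbox (and anything starting with it)
# Still crawlable: /cgi-bin/wiki.pl?Home and every other wiki.pl page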