
Disallowing specific query strings


duke33

12:48 pm on Jun 8, 2004 (gmt 0)

I'm rather new to the world of robots.txt, and I'm wondering if someone can help me out. I run a wiki powered by UseMod, and we've been having issues with people spamming the sandbox to boost their Google rating. I'm wondering if I can disallow certain query strings on a Perl script while still allowing everything else.

For example, my wiki script runs at /cgi-bin/wiki.pl; to get to the sandbox page, the URL would be /cgi-bin/wiki.pl?Sandbox, to get to the home page it would be /cgi-bin/wiki.pl?Home, and so on.

I still want Google to index all my other wiki.pl? pages, but I'd like it to disallow the indexing of the sandbox (/cgi-bin/wiki.pl?Sandbox). If I do something like this, will it work?

User-agent: googlebot
Disallow: /cgi-bin/wiki.pl?Sandbox
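
One way I could sanity-check a rule like that offline would be WWW::RobotRules, which ships with libwww-perl (example.com below is just a stand-in for my host):

#!/usr/bin/perl
# Sanity-check the proposed rule with WWW::RobotRules (libwww-perl).
# The module matches Disallow values as prefixes of the path plus
# query string, which is how crawlers generally apply robots.txt.
use strict;
use warnings;
use WWW::RobotRules;

my $robots_txt = <<'EOT';
User-agent: googlebot
Disallow: /cgi-bin/wiki.pl?Sandbox
EOT

my $rules = WWW::RobotRules->new('Googlebot');
$rules->parse('http://example.com/robots.txt', $robots_txt);

for my $url ('http://example.com/cgi-bin/wiki.pl?Sandbox',
             'http://example.com/cgi-bin/wiki.pl?Home') {
    print "$url => ", ($rules->allowed($url) ? 'allowed' : 'blocked'), "\n";
}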

tafkar

12:56 pm on Jun 8, 2004 (gmt 0)

I don't know if this works with robots.txt, but how about a different solution?

Since you use a Perl script, why not add a conditional <META NAME="ROBOTS" CONTENT="NOINDEX"> to your output when the query string contains "Sandbox"?
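
Something along these lines, early in the <head> output; an untested sketch that assumes the page name arrives at the front of QUERY_STRING, as in your /cgi-bin/wiki.pl?Sandbox example:

# Untested sketch: emit a NOINDEX meta tag when the requested
# page is the sandbox. Assumes the page name leads the CGI
# query string, e.g. /cgi-bin/wiki.pl?Sandbox.
my $query = $ENV{QUERY_STRING} || '';
if ($query =~ /^Sandbox/) {
    print qq{<META NAME="ROBOTS" CONTENT="NOINDEX">\n};
}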

duke33

3:05 pm on Jun 8, 2004 (gmt 0)

The .pl I use is from usemod.com, so if I ever want to upgrade, I'd have to rewrite that part of the code...

I was hoping there was an easier way to do that, perhaps through robots.txt.
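
If code turns out to be the only route, maybe a thin wrapper CGI around the vendor script would survive upgrades, since wiki.pl itself stays untouched. A rough, untested sketch (GET requests only; the script path and the <head> injection point are guesses):

#!/usr/bin/perl
# Rough sketch: run the unmodified vendor wiki.pl, capture its
# output, and splice a NOINDEX tag into <head> for sandbox hits.
# GET only; QUERY_STRING is inherited by the child through %ENV.
use strict;
use warnings;

my $out = `./wiki.pl`;    # hypothetical path to the vendor script
if (($ENV{QUERY_STRING} || '') =~ /^Sandbox/) {
    $out =~ s{<head>}{<head>\n<META NAME="ROBOTS" CONTENT="NOINDEX">}i;
}
print $out;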
