Welcome to WebmasterWorld
Forum Moderators: phranque
Maybe this is a case for cloaking specialists?
Then when it's time for people to vote, check their IP address against the IP database, and if a row matches, do not allow them to vote. If you would like more help doing it this way, which imho is the best way, post back or sticky me.
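A minimal sketch of that check, assuming you keep a list (or database table) of known spider IPs; the addresses and names below are hypothetical placeholders, not a real bot list:

```python
# Hypothetical set of known spider IPs; in practice you would load this
# from the IP database mentioned above and refresh it periodically.
KNOWN_SPIDER_IPS = {
    "192.0.2.10",   # placeholder address for an example bot
    "192.0.2.11",
}

def may_vote(visitor_ip: str) -> bool:
    """Allow a vote only if the visitor's IP is not a known spider."""
    return visitor_ip not in KNOWN_SPIDER_IPS

# Usage: gate the vote handler on this check.
print(may_vote("192.0.2.10"))   # known spider -> False
print(may_vote("203.0.113.7"))  # ordinary visitor -> True
```

The same lookup works against a SQL table (`SELECT 1 FROM spider_ips WHERE ip = ?`); the set here just keeps the example self-contained.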
The IP addresses seem like the best approach right now; on the other hand, it would be something I would need to check on a regular basis. Also, I'm not quite sure which spiders are the culprits - I caught Googlebot in the act, but I'm not sure who or how many the others might be.
You can use pattern matching in robots.txt as follows:
Patterns must begin with / because robots.txt patterns always match absolute URLs.
* matches zero or more of any character.
$ at the end of a pattern matches the end of the URL; elsewhere $ matches itself.
* at the end of a pattern is redundant, because robots.txt patterns always match any URL which begins with the pattern.
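Putting those rules together, a robots.txt sketch along these lines could keep spiders away from the vote URLs; the /poll/vote path and the action=vote query parameter are placeholders for whatever your actual vote script uses:

```
User-agent: *
# Block any URL starting with /poll/vote (trailing * would be redundant)
Disallow: /poll/vote
# Block any URL whose query string ends in action=vote ($ anchors the end)
Disallow: /*?action=vote$
```

Note that * and $ are extensions supported by the major crawlers (Googlebot among them) rather than part of the original robots.txt standard, so a bot that ignores them would still need the IP check as a backstop.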
Krapulator, that _would_ be perfect, but I have no idea how to do it. Any concrete advice?