
SE spiders are voting on my site :)

How to stop them?

     
4:47 pm on Jul 30, 2003 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 8, 2003
posts:161
votes: 0


I have some items on my website that visitors can vote upon, and I have noticed that the search engine bots "cast" their votes just as happily as human visitors - simply by following the links. How can I make sure I only get human votes?

My best bet so far is to implement the voting in JavaScript, which most search engines currently can't read. But this might change in the future, and I would also exclude all visitors who have JavaScript disabled from voting. Any better suggestions?
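
For reference, a minimal sketch of the JavaScript idea, assuming vote links of the form /mypage.php?vote25&para=12 (the link text and exact URL are only illustrative). The link is written out by script, so a spider that doesn't execute JavaScript never sees it, but neither does a visitor who has it switched off:

<script type="text/javascript">
// The vote link only exists for clients that execute JavaScript;
// most spiders don't, so they never see or follow it.
document.write('<a href="/mypage.php?vote25&amp;para=12">Vote for this item<\/a>');
</script>
<noscript>
<!-- Visitors with JavaScript disabled get a notice instead of a vote link. -->
Voting requires JavaScript.
</noscript>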

Thanks!

5:02 pm on July 30, 2003 (gmt 0)

Junior Member

10+ Year Member

joined:July 3, 2002
posts:82
votes: 0


I would use form buttons instead of hyperlinks.
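
A rough sketch of that approach, assuming the vote is handled by mypage.php; the field names below are only illustrative. Well-behaved spiders request pages with GET and don't submit forms, so a POST form keeps them out of the voting:

<form method="post" action="/mypage.php">
<!-- Hidden fields carry the same data the current text link passes in its query string. -->
<input type="hidden" name="para" value="12">
<input type="hidden" name="vote" value="25">
<input type="submit" value="Vote for this item">
</form>
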
1:09 pm on Aug 1, 2003 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 8, 2003
posts:161
votes: 0


Good suggestion; however, for this particular purpose text links look much better, and afaik they don't go well with a form...
1:51 pm on Aug 1, 2003 (gmt 0)

Junior Member

10+ Year Member

joined:May 13, 2003
posts:53
votes: 0


Why don't you block the spiders from following the link in robots.txt?

HostLead

3:48 pm on Aug 1, 2003 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 8, 2003
posts:161
votes: 0


Well, as far as I know I can only block the spider from following the links on the page altogether (the robots "nofollow" meta tag) or from accessing the files in a certain directory (robots.txt). In this case, however, the votes are text links that look like /mypage.php?vote25&para=12 ...
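
For reference, the page-wide option mentioned above is the robots meta tag, placed in the page's <head>. It tells compliant spiders not to follow any link on the page, which is indeed all-or-nothing:

<meta name="robots" content="index,nofollow">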

Maybe this is a case for cloaking specialists?

12:29 am on Aug 6, 2003 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 8, 2003
posts:161
votes: 0


Some more input on this issue would be greatly appreciated. Thanks!
4:00 am on Aug 6, 2003 (gmt 0)

Full Member

10+ Year Member

joined:Apr 27, 2003
posts:243
votes: 0


Maybe you could handle it manually by IP address. Make a small database of all the search engine IPs that are bothering you and put them in there.

Then, when it's time for someone to vote, check their IP address against that database, and if a row matches, don't count the vote. If you would like more help doing it this way, which imho is the best way, post back or sticky me.
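
A minimal PHP sketch of that check, assuming the spider addresses are kept as a simple list of IP prefixes (a database lookup would work the same way). The prefixes below are made-up documentation addresses, not real spider IPs:

<?php
// Example prefixes only; in practice, fill this from your logs or a database
// table of spider IPs you have actually caught voting.
$bot_ip_prefixes = array('192.0.2.', '198.51.100.');

$visitor_ip = $_SERVER['REMOTE_ADDR'];
$is_bot = false;

foreach ($bot_ip_prefixes as $prefix) {
    // Treat the visitor as a spider if its IP starts with a known prefix.
    if (strpos($visitor_ip, $prefix) === 0) {
        $is_bot = true;
        break;
    }
}

if (!$is_bot) {
    // ...record the vote as usual...
}
?>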

9:07 am on Aug 6, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Oct 4, 2002
posts:666
votes: 0


Use a form button with an image as the button:

<INPUT TYPE="IMAGE" SRC="images/mybutton.gif">

You can then make an image which looks exactly like the current text link.
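
Wrapped in a form, that might look roughly like the sketch below (field names are illustrative, and the GIF can simply be drawn to look like the existing text link, as described above). An image-type input submits the form just like a normal submit button, and spiders don't submit forms:

<form method="post" action="/mypage.php">
<input type="hidden" name="para" value="12">
<input type="hidden" name="vote" value="25">
<input type="image" src="images/mybutton.gif" alt="Vote for this item">
</form>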

10:00 am on Aug 6, 2003 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 8, 2003
posts:161
votes: 0


Hm... as I said, for this particular purpose text links look much better... I guess it's not possible using a form with a text link?

The IP addresses seem like the best approach right now; on the other hand, it would be something I would need to check on a regular basis. Also I'm not quite sure which spiders are the culprits - I caught Googlebot in the act, but I'm not sure who or how many the others might be.

12:30 pm on Aug 7, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Oct 4, 2002
posts:666
votes: 0


You could make it look exactly the same as the link text; there would be no visual difference at all.
12:34 pm on Aug 7, 2003 (gmt 0)

Preferred Member

10+ Year Member

joined:Apr 6, 2003
posts:630
votes: 0


You could maybe do some trickery with robots.txt to block the pages in question. Have a look at [webmasterworld.com...]

You can use pattern matching in robots.txt as follows (see the example after this list):

Patterns must begin with / because robots.txt patterns always match absolute URLs.
* matches zero or more of any character.
$ at the end of a pattern matches the end of the URL; elsewhere $ matches itself.
* at the end of a pattern is redundant, because robots.txt patterns always match any URL which begins with the pattern
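
Applying that to the vote links mentioned earlier (/mypage.php?vote25&para=12), an entry along these lines should keep spiders off the vote URLs without touching the pages themselves; the exact pattern depends on how the real URLs are built, so treat this as a sketch:

User-agent: Googlebot
# Blocks any URL whose path and query string start with "/mypage.php?vote",
# e.g. /mypage.php?vote25&para=12, but not /mypage.php?para=12 itself.
Disallow: /mypage.php?vote
# If "vote" is not always the first parameter, a wildcard pattern such as
# Disallow: /mypage.php?*vote
# would be needed instead (supported by Googlebot, but not by every spider).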

10:03 pm on Aug 7, 2003 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 8, 2003
posts:161
votes: 0


Mmm, I'd rather not try any trickery with the robots.txt file when it comes to excluding files only because of different parameter strings. Last time I did this, hundreds of my pages accidentally flew out of the index.

Krapulator, that _would_ be perfect, but I have no idea how to do it. Any concrete advice?

3:07 pm on Aug 8, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Oct 4, 2002
posts:666
votes: 0


Sticky me the URLs that the current text links point to when you vote, and I will send you a concrete example.