
trapping spiders - Google ban result?

making a spider trap


sanuk

12:27 pm on Jun 13, 2003 (gmt 0)

10+ Year Member



hi,

If mod_rewrite is new to me, Perl certainly is not!
With some example code I found on these forums, I want to set up my own spider trap for the ones that do not obey robots.txt.

All of the examples I found suggest putting the link that triggers the spider trap behind a 1x1 transparent GIF image.

Now my only worry is that Google will consider this a hidden link and/or hidden text.

Won't doing this get my page banned or penalized at Google and other major search engines?
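Just to illustrate the setup I mean (the /trap/ path here is only a placeholder):

```
# robots.txt -- well-behaved spiders should never request the trap URL
User-agent: *
Disallow: /trap/
```

and then, on a normal allowed page, the hidden link wrapped around the image, something like `<a href="/trap/"><img src="trans.gif" width="1" height="1" alt=""></a>`. Any robot that requests /trap/ anyway has ignored robots.txt.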

Best Regards,
Sanuk

jimbeetle

1:58 pm on Jun 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



1x1 transparent gifs have many legitimate uses -- just think of the untold number of pages with them in affiliate tracking codes -- and a spider trap certainly is legitimate. There's no problem with using them.

trillianjedi

3:21 pm on Jun 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What does it matter if you're building the page to trap the non-behaving robots?

Google *does* obey robots.txt, so will not visit the page in the first place.

TJ

jimbeetle

3:41 pm on Jun 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



TJ,

The 1x1 is on an allowed page with a link leading to the disallowed page.

Jim

trillianjedi

4:58 pm on Jun 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Jim,

Of course, thanks.

TJ

sanuk

6:14 pm on Jun 13, 2003 (gmt 0)

10+ Year Member



Hi,

Thank you all for responding.
I will try to build my spider trap and add the invisible link on my index page.

But as a precaution, in the beginning, whenever I catch an intruder, I will write the result to a plain file and change my .htaccess file manually.

Once I am sure it works without bugs, I will then (maybe) have the result translated automatically into a new deny line in the .htaccess file.
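The logic I have in mind for the first, manual stage is roughly this (sketched in Python for brevity -- my real version will be Perl, and the file names are placeholders):

```python
# Minimal sketch of the manual first stage: append each trapped IP to a
# plain log file, and build the Apache "Deny from" line that would later
# be copied into .htaccess by hand. File names are placeholders.

def log_intruder(remote_addr, log_path="trap.log"):
    """Record a trapped visitor's IP address in a plain text file."""
    with open(log_path, "a") as log:
        log.write(remote_addr + "\n")

def deny_line(remote_addr):
    """Return the .htaccess line that would block this address."""
    return "Deny from " + remote_addr
```

So `deny_line("10.0.0.1")` gives `Deny from 10.0.0.1`, which I can paste into .htaccess once I trust the trap.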

Thanks all for the responses.
Regards,
Sanuk

carfac

5:34 pm on Jun 15, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sanuk:

I use spider traps a couple of different ways, this being one of them. No problems with Google.

dave

kewlbeezer

10:27 pm on Jun 16, 2003 (gmt 0)

10+ Year Member



Is it okay to just link to the banned page? For example, in my robots.txt I ban /guestbook/; then on every page of my site (at the very top, the very first link a robot would see), I have a link like:

<a href="/guestbook/"></a>

Would that catch every bad bot? It seems to do the trick so far.
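For clarity, the robots.txt side of it is just:

```
User-agent: *
Disallow: /guestbook/
```

so a well-behaved spider never requests /guestbook/ at all; anything that follows the empty link has ignored robots.txt.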