Forum Moderators: coopster


Anti-spam class

Based on projecthoneypot.org

         

henry0

1:08 pm on Feb 13, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Today PhpClasses has a new class dedicated to antispam; it uses the antispam work from projecthoneypot.org.

What is your feeling about that org?
Is their work worth following?

This is not the class, just the org's URL:
[projecthoneypot.org...]

I guess that if you are interested, I might be allowed to post a link to the class?

dreamcatcher

7:56 am on Feb 18, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi Henry,

If the class is on the PHP Classes Repository, a link is fine. I hadn't heard of Project Honey Pot, and after a brief look at their site I can't decide what it's supposed to be or do.

dc

henry0

12:58 pm on Feb 18, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks, DC
It's all explained here [projecthoneypot.org]

It looks ambitious but, I believe, it has the capability to become a real player.

However, I am not able to take the evaluation any further on my own.

I am no longer dealing with clients, so I can't install it somewhere and check it out; right now I am only working with my partner on shipping a large project within a few weeks.
I might create a throwaway site to test a few things, but not for a couple of months.

whoisgregg

5:38 pm on Feb 18, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Quite a few webmasters run their own honeypots: pages that are blocked by robots.txt and linked to only by hidden links (which humans would never see or follow). They track the robots that hit those pages, flag them as scrapers/email harvesters, and then ban them from the site.

For example, WebmasterWorld runs honeypots, according to its robots.txt [webmasterworld.com]:

# Honey pots are - and have been - running. If your access has been blocked for bot running - please sticky an admin for a reinclusion request.
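The trap described above can be sketched in a few lines. This is purely illustrative (the trap path, the hidden-link markup, and the in-memory ban set are all made-up assumptions, not anything from a specific site):

```python
# Honeypot trap sketch: a page disallowed in robots.txt and reachable
# only through a hidden link. Any visitor that requests it has ignored
# robots.txt, so we flag and ban it. Paths and storage are assumptions.

TRAP_PATH = "/email-list.html"  # hypothetical trap URL

# robots.txt entry telling well-behaved crawlers to stay away:
ROBOTS_TXT = "User-agent: *\nDisallow: " + TRAP_PATH + "\n"

# Hidden link that only a bot parsing raw HTML would follow:
HIDDEN_LINK = '<a href="' + TRAP_PATH + '" style="display:none">list</a>'

banned_ips = set()

def record_trap_hit(ip, user_agent):
    """Flag a visitor that reached the trap page."""
    banned_ips.add(ip)

def is_banned(ip):
    """Check an incoming request's IP against the ban list."""
    return ip in banned_ips
```

In a real setup the ban list would live in a database or in firewall rules rather than in process memory, and the check would run before serving every request.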

The gist of Project Honey Pot (as I understand it) is to standardize that tracking method and let all the different webmasters share the IP addresses/user agents of bad robots.
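As I understand it, the sharing side is exposed through their http:BL DNS blacklist: you query a DNS name built from your access key and the visitor's reversed IP, and a 127.x.y.z answer encodes what the network knows about that address. A rough Python sketch of the name building and response decoding (the access key below is a placeholder, and the octet meanings follow my reading of their documented format, so treat this as an assumption to verify against their docs):

```python
def httpbl_query_name(access_key, visitor_ip):
    """Build the DNS name for an http:BL lookup.
    access_key is the free Project Honey Pot API key; the value used
    in any example is a placeholder, not a real key."""
    reversed_ip = ".".join(reversed(visitor_ip.split(".")))
    return access_key + "." + reversed_ip + ".dnsbl.httpbl.org"

def parse_httpbl_response(response_ip):
    """Decode a 127.x.y.z answer: x = days since last activity,
    y = threat score, z = visitor-type bitmask (assumed format)."""
    octets = [int(o) for o in response_ip.split(".")]
    if octets[0] != 127:
        return None  # not a listing / lookup error
    days, threat, kind = octets[1], octets[2], octets[3]
    types = []
    if kind & 1:
        types.append("suspicious")
    if kind & 2:
        types.append("harvester")
    if kind & 4:
        types.append("comment spammer")
    return {"days": days, "threat": threat, "types": types}
```

The actual lookup would just be a normal DNS A-record query (e.g. `socket.gethostbyname`) on the name that `httpbl_query_name` returns; an NXDOMAIN answer means the IP is not listed.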