This 66 message thread spans 3 pages.
Best way to stop mailto: robot spidered spam?
We have published our email addresses on our web sites for almost 10 years using simple mailto: links or forms. In the past year, the amount of spam has become intolerable. We wish to reduce it to manageable levels by changing all pages with mailto: links to something spambots can't spider. What would be great is a simple search and replace! Yep, I'm lazy too.
What is the best way?
1. Just unlink all email addresses, so those who REALLY want to email you have to copy and paste the address into their email program rather than just clicking. This seems a sensible solution at first thought, and is very easy to implement.
2. Use JS (not sure exactly how to do that). Disadvantages: some non-JS-enabled setups won't work with it, frustrating such users and making them think your page is broken!
3. Direct all hyperlinked email addresses to our mail form, or even a new form system where the email address appears in the "send to" field when they arrive?
Which of these do you suggest, or are there any other quick and dirty solutions?
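For option 2, here is a minimal sketch of the usual JS approach, assuming a helper named `buildMailto` (illustrative, not from any library): the address is assembled at runtime from its parts, so the full "user@domain" string never appears in the raw HTML a harvester downloads.

```javascript
// Assemble the mailto: URL from separate pieces so the complete
// address never appears verbatim in the page source.
function buildMailto(user, domain) {
  return "mailto:" + user + "@" + domain;
}

// In a browser, write the link into the page. Non-JS visitors see
// nothing here, which is exactly the drawback noted above.
if (typeof document !== "undefined") {
  var addr = buildMailto("info", "example.com");
  document.write('<a href="' + addr + '">' + addr.slice(7) + "</a>");
}
```

The address parts could also be further disguised (e.g. character codes), but even this simple split defeats harvesters that only scan static HTML for `mailto:` patterns.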
kapow (msg #35):
Someone suggested using a list of rubbish email addresses to 'spoil' the spam list. I don't think this will work, because those spam bots detect and delete bounce-backs (non-working email addresses).
pmkpmk (msg #37):
As for "poisoning" spammer databases with bogus email addresses: yes, sophisticated spammer tools actually DO check the addresses and delete bounce-backs.
Sounds like a huge endorsement for MailWasher, which bounces messages back to these sophisticated spammers. I'd be interested to know whether mail-poisoning honeypots have any effect at all on corrupting these guys' data. It certainly sounds like a good idea.
Well, the tool I mentioned, Sugarplum, creates new web pages on the fly with bogus addresses, and each page contains multiple links to further on-the-fly generated pages.
So even if poisoning the database does NOT work, it will at least slow the bot down, if not trap it altogether.
The downside is that it consumes your bandwidth as well.
For more information: [devin.com...]
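The honeypot idea described above can be sketched as follows. This is an illustrative reimplementation, not Sugarplum's actual code; the function names are made up, and the generated domains use the reserved `.invalid` TLD, which is guaranteed never to resolve.

```javascript
// Return a random lowercase word of the given length.
function randomWord(len) {
  const letters = "abcdefghijklmnopqrstuvwxyz";
  let w = "";
  for (let i = 0; i < len; i++) {
    w += letters[Math.floor(Math.random() * letters.length)];
  }
  return w;
}

// A syntactically plausible but guaranteed-bogus address
// (".invalid" is reserved and will never be deliverable).
function bogusAddress() {
  return randomWord(5) + "@" + randomWord(7) + ".invalid";
}

// Build a trap page: n bogus mailto: links, plus one link to a
// further generated page so the bot keeps crawling deeper.
function honeypotPage(n) {
  const lines = [];
  for (let i = 0; i < n; i++) {
    lines.push('<a href="mailto:' + bogusAddress() + '">contact</a>');
  }
  lines.push('<a href="/trap/' + randomWord(8) + '.html">more</a>');
  return lines.join("\n");
}
```

Note the trade-off raised later in the thread: a fixed pattern like a single reserved TLD is easy for a harvester to filter out, which is presumably why real tools mix in more varied (and sometimes real spammer) addresses.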
>> creates ON THE FLY new webpages with bogus addresses <<
Hmm. If there are millions of machines each serving hundreds of these pages a day, are they not going to, at least occasionally, generate a "bogus" email address that actually does belong to someone else and is still a valid address? If the program generates addresses that are never going to be valid, then the robot could be easily programmed to spot the pattern and ignore all such addresses. How does the program generate addresses that it is sure are not valid? If it works to some sort of list, then as soon as the spammers have access to the same list, the technique becomes useless. Generating "bogus" email addresses sounds like a triumph of marketing hype over actually thinking through whether those addresses are really bogus or not.
Let me know.
>> are they not going to, at least occasionally, generate a "bogus" email address that actually does belong to someone else and is still a valid address? <<
If you look at sample pages from something like what wPoison generates, you can see that the likelihood of this is very small. And even if a spam robot did harvest a valid address alongside all of the bogus ones, the damage has already been done: they'll have tons of unusable addresses from that source, which will have effectively poisoned their database.
My question still stands, are these fake address generators effective against e-mail harvesters?
Maybe it was lost somewhere in this thread: one of the ideas behind Sugarplum is to "spice" the bogus addresses with REAL addresses of known and convicted spammers.
So whenever real spam comes in, you can copy the address out of it (make sure it's the real one) and paste it into Sugarplum's database.
Nothing is 100% effective against spam and address harvesters. Luring the bots into a "honeypot" or "tarpit" is only ONE of many defensive measures; I like to refer to it as the second line of defense. The first line is to keep the bots away from your pages in the first place, via the web server (.htaccess banning) or by camouflaging your mail addresses. The second line, as mentioned, is luring the bots into a tarpit. The third line is blocking incoming spam, e.g. via blocklists in your mail server, and the fourth line is keyword filtering in your mail client.
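The first line of defense (.htaccess banning) might look something like this fragment for Apache with mod_setenvif. The user-agent strings shown are examples of known harvester signatures, not an authoritative or complete list.

```apache
# Illustrative .htaccess fragment: deny requests from known
# email-harvester user-agents (example strings, extend as needed).
SetEnvIfNoCase User-Agent "EmailCollector" bad_bot
SetEnvIfNoCase User-Agent "EmailSiphon"    bad_bot
SetEnvIfNoCase User-Agent "ExtractorPro"   bad_bot

<Limit GET POST>
  Order Allow,Deny
  Allow from all
  Deny from env=bad_bot
</Limit>
```

This only stops bots honest enough to send a distinctive user-agent; harvesters that spoof a browser string sail straight past it, which is why the later lines of defense still matter.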
I can only say that it has worked for me: before I started, I got 20-30 spam mails a day. Now, after implementing all four lines of defense, it's down to 1-2.