I'm not sure I have the correct procedure and am hoping someone can advise?
Thanks in advance
# If the referer contains a given word AND the requested page matches, deny
RewriteCond %{HTTP_REFERER} AnyTerm
RewriteCond %{REQUEST_URI} RequestedPage.html
RewriteRule .* - [F]
I do the following which seems to keep them out:
1. Reject GET and accept only POST to the form
2. Require the HTTP_REFERER to be my domain
3. Reject all posts containing any URLs or HTML
Then they start to adapt to all those things and you have to get creative.
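The first three checks above can be sketched like this in javascript. This is only an illustration of the logic, not the poster's actual code; "example.com" and the URL/HTML patterns are placeholder assumptions.

```javascript
// Sketch of checks 1-3: method must be POST, the referer must be our
// own domain, and the message body must not contain URLs or HTML.
// "example.com" is a placeholder, not the poster's real domain.
function isAcceptableSubmission(method, referer, message) {
  if (method !== 'POST') return false;                              // 1. reject GET
  if (!referer || !referer.includes('example.com')) return false;   // 2. referer must be our domain
  if (/https?:\/\/|<[a-z][\s\S]*>/i.test(message)) return false;    // 3. no URLs or HTML in the body
  return true;
}
```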
4. Require a cookie
They started taking the cookie!
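A minimal sketch of what check 4 looks like on the wire, in javascript for illustration only (the header names are standard HTTP, but the "formtoken" cookie name and value here are made-up assumptions): the server sets a cookie when it serves the form page, then requires the same cookie to come back with the submission.

```javascript
// Build the Set-Cookie header sent with the form page.
// "formtoken" is an illustrative name, not from the thread.
function buildSetCookieHeader(name, value) {
  return name + '=' + encodeURIComponent(value) + '; Path=/';
}

// Parse the Cookie header the browser sends back with the submission,
// e.g. "formtoken=abc; other=1", into an object for checking.
function parseCookieHeader(header) {
  const cookies = {};
  for (const part of (header || '').split(';')) {
    const [name, ...rest] = part.trim().split('=');
    if (name) cookies[name] = decodeURIComponent(rest.join('='));
  }
  return cookies;
}
```

As the thread notes, bots eventually adapted and started echoing the cookie back, which is why the later tricks were needed.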
Finally, I got smart:
5. Stop giving error messages for failed form submits, just display "Thank You!" and pretend to accept things that were tossed in the trash automatically.
Now they can't tell whether their workarounds succeed, because every submission appears to be accepted.
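Trick 5 can be sketched as follows; the function names and the storage mechanism are illustrative assumptions, but the key point matches the thread: the visible response is identical whether the submission was kept or binned.

```javascript
// Sketch of trick 5: spam is silently discarded, real submissions are
// kept, and both get exactly the same "Thank You!" response, so a bot
// can never tell whether its attempt actually worked.
function handleSubmission(isSpam, store) {
  if (!isSpam) {
    store.push('kept');      // only real submissions are saved
  }
  // spam is dropped with no error message at all
  return 'Thank You!';       // identical response either way
}
```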
6. Last but not least, make the form for humans only using javascript:
6.a. Obfuscate the entire form in javascript and make it appear in the page only when decoded and inserted with document.write(), so they never find the form in the first place using automated form-detection methods.
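One possible shape for trick 6.a, sketched under the assumption that base64 is the encoding (the thread doesn't say which scheme was used, and the form markup here is a made-up minimal example):

```javascript
// Sketch of trick 6a: the form's HTML never appears literally in the
// page source; it is stored encoded and only written into the document
// by script, so a crawler that doesn't run javascript never sees a
// <form> at all. Base64 is one possible encoding, assumed here.
const encodedForm =
  'PGZvcm0gbWV0aG9kPSJQT1NUIiBhY3Rpb249Ii9zdWJtaXQiPjwvZm9ybT4=';

function decodeForm(encoded) {
  // atob() in browsers; Buffer in Node when testing the sketch
  return typeof atob === 'function'
    ? atob(encoded)
    : Buffer.from(encoded, 'base64').toString('utf8');
}

// In the live page this would run as:
//   document.write(decodeForm(encodedForm));
```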
6.b. Javascript typing detection. Record a mathematical checksum made from each key typed in and include this checksum-type value with the form submission to prove the typing was done at the keyboard.
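The thread doesn't give the actual formula for 6.b, so the one below is an assumption for illustration; any function of the keystrokes works, as long as a bot that never "types" can't supply it.

```javascript
// Sketch of trick 6b: accumulate a checksum from each key typed in the
// form and submit it in a hidden field; the server recomputes it from
// the submitted text and rejects mismatches. The mixing formula here
// is illustrative, not the poster's.
function typingChecksum(text) {
  let sum = 0;
  for (let i = 0; i < text.length; i++) {
    // mix in each character code, position-sensitively
    sum = (sum * 31 + text.charCodeAt(i)) % 1000000007;
  }
  return sum;
}

// On each keypress the page would update a hidden field, e.g.:
//   hiddenField.value = typingChecksum(textarea.value);
```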
Then there's always this fave:
7. Include an extra field hidden using CSS, not a type="hidden" which can be detected automatically. If the bot fills in the hidden field you know it's not someone at the console typing.
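A sketch of the honeypot in trick 7; the field name, the CSS class, and the off-screen hiding rule are illustrative assumptions (any CSS that hides the input from humans would do):

```javascript
// Sketch of trick 7, the CSS honeypot: a normal-looking text input is
// hidden from humans by a stylesheet rule (not type="hidden", which
// bots detect and skip). A human never fills it in; a form-filling
// bot usually does.
//
// In the page (illustrative markup):
//   <style>.hp { position: absolute; left: -9999px; }</style>
//   <input class="hp" type="text" name="website" value="">
//
// On the server, a filled-in honeypot marks the submission as a bot:
function isHoneypotTripped(fields) {
  return fields.website !== undefined && fields.website !== '';
}
```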
So far the above tricks have punted 50K spam submission attempts on one site alone YTD, as opposed to the 9.8K valid submissions that got past all the anti-spam crap.
Got a few other tricks, but you're about to have fun!
Actually I don't believe it's a "they"; rather, it appears to be a single person and/or site.
There are half a dozen successive requests for the page from non-North-American IPs (which eat 403s on my sites), followed by as many as six successive submissions from NA IPs, which appear to be open proxies.
You'd have a field day collecting the data ;)
I have some pages that are keyword attractors (especially to non-NA searches) as sort of entry pages into my sites. The attraction was not by design, it just worked out that way.
This particular page is an entry to a few articles from a book published in 1904.
My sites are simple by design to include very old browsers, such as early versions of WEB-TV (discussed here previously).
Nor do my sites utilize Java or scripts, so javascript is not an option for me.
I've never been able to get a handle on cookies, and although I remain interested, I would need to begin with something really basic.
Any links to tutorials?
Nor are extra fields on the form an option, as many of these widget folks fail to complete the existing fields with information that provides valid references to what they are seeking.
Many thanks for the second "post" submission as well.
I may use it, in the event this present pest circumvents the 403's.
Don
That's because those older, obsolete and obtuse devices tend not to support cookies, HTTPS, and javascript, which even a Wii game machine with its version of Opera handles quite nicely.
As far as the extra fields go, widget folks would never see them if they're using a modern browser as I hide them in CSS.
Perhaps your audience is lagging behind on the technology curve, but I can't find many logical reasons to support things so statistically insignificant that they don't even warrant a slice in my browser pie chart.
That's just me, your mileage may vary.
Re: cookies, what server side language do you use?
Re: cookies, what server side language do you use?
If I told you English, would that be an acceptable answer ;)
This host offers Perl, PHP and perhaps many more. These are tasks I simply don't explore.
BTW, on another site and with another host (not being bothered by these pests) I use a PHP submission form, which was implemented through control panel.