
Forum Moderators: phranque


Banning IPs in htaccess: is it a good idea?

I make my own list of spammer IPs, is this a good idea?

8:29 pm on Feb 5, 2012 (gmt 0)

Preferred Member

10+ Year Member

joined:Aug 11, 2004
posts: 582
votes: 0


Hello all

There is no dedicated section on hacking and spamming, so I will ask here.

My site runs on a custom CMS, and there are many ways for visitors to add content without logging in.

Obviously, I was getting spammed like crazy, so I implemented a check that looks for any HTML in the comments and rejects them. What's more, I record the IP of anyone who tries to add HTML to a comment.

Therefore, I end up with my own custom list of spammers.
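A minimal sketch of that kind of check in Python (the function names and the regex are hypothetical; the post doesn't show the actual CMS code):

```python
import re

# Crude "contains an HTML tag" test: an angle bracket, optional slash,
# then a letter starting a tag name.
TAG_RE = re.compile(r"<\s*/?\s*[a-zA-Z][^>]*>")

def looks_like_html(comment: str) -> bool:
    """Return True if the comment appears to contain HTML markup."""
    return bool(TAG_RE.search(comment))

def handle_comment(comment: str, ip: str, spam_log: list) -> bool:
    """Accept a plain-text comment; otherwise record the submitter's IP.

    The recorded IPs could later be exported into .htaccess Deny rules,
    as described in the post.
    """
    if looks_like_html(comment):
        spam_log.append(ip)
        return False
    return True
```

Note the regex deliberately ignores stray angle brackets like "5 < 6", since it requires a letter after the bracket.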

I then block all these IPs in the htaccess file.
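The blocking step might look like this in Apache 2.2-era .htaccess syntax (the addresses are placeholders from the documentation ranges, not real spammers):

```apacheconf
# Allow everyone, then deny the collected spammer IPs.
# With Order Allow,Deny, the Deny directives are evaluated last and win.
Order Allow,Deny
Allow from all
Deny from 203.0.113.45
Deny from 198.51.100.7
```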

I was wondering, is any other webmaster doing this? I think I remember reading here that some people do.

But is this a good idea? I was wondering, is it possible that some hijacked computers are spamming my site, without the knowledge of the person that has the computer. By blocking their IP, am I blocking an innocent person, who might be a potential visitor to my site?
8:36 pm on Feb 5, 2012 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member brotherhood_of_lan is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 30, 2002
posts:4975
votes: 41


It sounds like good practice. I would also watch for people posting raw links, though; those don't use HTML markup but can still be considered useful for SEO.

>is it possible that some hijacked computers are spamming my site

It is entirely possible. It is worth inspecting the log files for the IPs you're catching. See whether they come in on a referer, download assets such as JS and images from your pages, how long they viewed the page, etc. Hijacked computers won't wait around to appear human.

Because your CMS is custom, I would go as far as to assume that either the spammers are human user agents or someone has coded up something specifically for your site.
9:35 pm on Feb 5, 2012 (gmt 0)

Preferred Member

10+ Year Member

joined:Aug 11, 2004
posts: 582
votes: 0


Yes, for sure it is custom code for spamming my site.

How about the official list of spammers? I read somewhere these exist too. Should I be using those lists instead?
11:08 pm on Feb 5, 2012 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15452
votes: 739


I then block all these IPs in the htaccess file.

I was wondering, is any other webmaster doing this?


Are there people who don't do this? I had Deny from... directives in my htaccess for years before I got up the nerve to do fancier things like redirects or user-agent matching.

But don't waste your time blocking exact IPs. You can generally slam the door on a whole range without losing any worthwhile human visitors.
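Range blocking can be done with a truncated address or CIDR notation, for example (placeholder addresses; Apache 2.2 "Deny from" syntax):

```apacheconf
# A truncated address blocks the whole matching prefix (203.0.113.0-255):
Deny from 203.0.113.
# CIDR notation does the same kind of range block:
Deny from 198.51.100.0/24
```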

Official list of spammers? Wow, that's something people would pay a lot for. There are lists of major server farms, that kind of thing, but a comprehensive list of all the bad guys ever? Uh-uh.
4:26 am on Feb 6, 2012 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:4171
votes: 262


How about the official list of spammers?

OP may be referring to services like spamhaus that keep track of reported spammers. They are very helpful for forums.
7:38 am on Feb 6, 2012 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15452
votes: 739


And very unhelpful for some site owners :) Seen any of the recent posts from helenp? Her site host apparently takes spamhaus as divine revelation. ("Would you please consider not blocking the entire continent of North America?" "Stop bugging me, lady, it's listed in spamhaus.")
3:50 pm on Feb 6, 2012 (gmt 0)

Preferred Member

10+ Year Member

joined:Aug 11, 2004
posts: 582
votes: 0


Ok, thanks a lot for the answers. I will keep building my own list of IPs and blocking it then.

Maybe we should have a place to share these IPs. The only issue would be: how could you trust such a list to be accurate?

I also seem to understand you use redirects and user-agent matching to stop spam. I imagine user-agent matching tries to determine if the user-agent is accurate or just advertised?

I don't understand how redirects would stop spam, unless I just redirect the list of spam IPs into some sort of trap or something?
4:55 pm on Feb 6, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member rocknbil is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 28, 2004
posts:7999
votes: 0


A list is fairly worthless. It may be accurate today, but once those providers clean up the server, it's outdated.

I used to ban IPs. It turns into a high-maintenance task, requiring constant attention (and subsequent frustration.) Now I only do it if the requesting IPs are not relevant. E.g., if a company only sells to North America, then they don't need the grief of some of the trouble spots in other parts of the world (see how well I avoided naming names? Hey, it's Monday, I'll take that self-gratification. :-) ) In this case, I ban a whole class of IPs with a one-liner nuke.

What I found works far better is the logic behind spamming.

1. Spammers are paid on total delivery. It doesn't matter to who. If they can send out 1 million emails that don't bounce, or post to one million vulnerable blogs without getting the links removed by the time the check arrives, they get paid.
2. If a form can be abused, it will be.
3. If it's too difficult to abuse a form, they **WILL** move on to greener pastures.

Summary: make your forms too difficult to abuse, without making it more difficult for your users (e.g., captcha and other challenge/response fields are an absolute last resort.) It doesn't matter how else you do it, but this works, everyone is happy: your users, your clients, you.

there are many ways for visitors to add content, without login.


This may have been an attractive way to get initial content, but I suggest you now change this method. The ability to prevent anonymous posting is your front line of defense.
9:24 pm on Feb 6, 2012 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member brotherhood_of_lan is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 30, 2002
posts:4975
votes: 41


Agreed with rocknbil.

Some arbitrary methods for sorting out the majority of bots:

- Send a cookie with an image on the form page. Users who submit the form without the cookie don't get posted or get flagged. This would require any bot to download the image first, and would require the bot owner to be aware of this requirement.
- and/or use some arbitrary encoding of a timestamp and put it into a hidden field in the form. Upon form submission, decode the field and see if the timestamp is relatively fresh... say, 4 hours. Flag anything that doesn't fit into the timeframe. This'll require bots to constantly refetch the form page before submitting... or the bot owner will have to break down your arbitrary encryption of timestamps.

Either way, give the same "your message was sent" response to these flagged submissions, to give the impression that it was successful.
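The hidden-field timestamp idea above can be sketched in Python like this (a hypothetical sketch: the secret key, signing scheme, and helper names are assumptions, not from the post; the 4-hour window is from the post):

```python
import hmac
import hashlib
import time

SECRET = b"site-specific-secret"   # assumption: kept server-side, never sent to clients
MAX_AGE = 4 * 3600                 # freshness window suggested in the post: 4 hours

def make_token(now=None):
    """Encode a timestamp plus an HMAC signature for a hidden form field.

    Signing the timestamp means a bot can't simply forge a fresh value;
    it has to refetch the form page to get a valid token.
    """
    ts = str(int(now if now is not None else time.time()))
    sig = hmac.new(SECRET, ts.encode(), hashlib.sha256).hexdigest()
    return f"{ts}:{sig}"

def token_is_fresh(token, now=None):
    """Return True only if the signature verifies and the timestamp is recent."""
    try:
        ts, sig = token.split(":")
    except ValueError:
        return False
    expected = hmac.new(SECRET, ts.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    age = (now if now is not None else time.time()) - int(ts)
    return 0 <= age <= MAX_AGE
```

On submission, a stale or tampered token would be flagged but still answered with the usual "your message was sent" response, as suggested above.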

Captchas can stop a lot of bots, but definitely not all. There are human and automated services that'll solve them... save man hours of people's lives and don't use them :o)
11:15 am on Feb 14, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 30, 2007
posts:1394
votes: 0


I don't understand how redirects would stop spam, unless I just redirect the list of spam IPs into some sort of trap or something?

Yes, something like that, and it backfires once the trap is identified. If you want to avoid mass form spam, don't expose the forms to spiders. They can't spam it if they can't find it. It's that simple.
11:45 am on Feb 14, 2012 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15452
votes: 739


I don't understand how redirects would stop spam, unless I just redirect the list of spam IPs into some sort of trap or something?

Some robots get confused if you redirect them to 127.0.0.1. It doesn't stop them altogether, but it may slow them down, and if they're clogging up your bandwidth, that by itself can be a big help.
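A hypothetical mod_rewrite fragment for that kind of redirect (placeholder IP; assumes mod_rewrite is available in .htaccess):

```apacheconf
# Redirect requests from a known-bad IP back to the client's own machine
# instead of serving the page.
RewriteEngine On
RewriteCond %{REMOTE_ADDR} ^203\.0\.113\.45$
RewriteRule .* http://127.0.0.1/ [R=301,L]
```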