Website Technology Issues Forum

Banning traffic worldwide
wheel

Msg#: 3722219 posted 5:26 pm on Aug 13, 2008 (gmt 0)

I've just had my third website defaced this year, and am getting tired of it. Throw in all the scrapers, and I really don't have any use for traffic from outside the westernized countries.

Is anyone blocking entire swaths of the internet, like Asia, Africa, much of Europe, and so on? If so, how does one go about doing so - is there a decent list of IP addresses we can just dump into Apache's config?

 

encyclo

Msg#: 3722219 posted 5:30 pm on Aug 13, 2008 (gmt 0)

There are some details in this earlier thread on the same subject:

[webmasterworld.com...]

wilderness

Msg#: 3722219 posted 6:41 pm on Aug 13, 2008 (gmt 0)

Is anyone blocking entire swaths of the internet, like Asia, Africa, much of Europe, and so on? If so, how does one go about doing so - is there a decent list of IP addresses we can just dump into Apache's config?

It's really NOT that simple!
1) The list of IPs would need conversion to Rewrite Rules.
2) A simple syntax error may result in a 500 error taking down your site(s).

I would suggest Rewrite Rules for some targeted Class A's that are primarily outside your market focus, as in the sketch below.

Then, after you're assured that these are functioning, proceed slowly to more specific sub-class (Class B, C and D) rewrites.

Make sure to test your website(s) for functionality after implementing each rewrite, or each series of rewrites.
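
A minimal sketch of that kind of rule, using the hypothetical range 222.0.0.0/8 (substitute Class A's pulled from your own logs):

RewriteEngine On
# Return 403 Forbidden to any visitor whose address begins with 222.
RewriteCond %{REMOTE_ADDR} ^222\.
RewriteRule .* - [F]

As noted above, test it on a staging copy first; one bad line here can 500 the entire site.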

wheel

Msg#: 3722219 posted 7:35 pm on Aug 13, 2008 (gmt 0)

I'll have to have a look. I may not do this through Apache (though I may). Instead I may do this just by firewalling the entire server - a sketch of that approach is below. Prolly cuts down on spam too :).
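
A minimal iptables sketch of the firewall approach, assuming a Linux server with root access (the 222.0.0.0/8 range is again hypothetical):

# Drop all packets from 222.0.0.0/8 before they reach Apache.
# This also covers mail and other services, hence the spam benefit.
iptables -A INPUT -s 222.0.0.0/8 -j DROP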

The above thread link from encyclo brings up an interesting point. I do sometimes get calls from folks in other areas who found my services through my site. For example, someone calls from Australia to buy my services for their parents, who live where I am (not Australia). So I need to make sure I don't just block everything.

Still - in reading all these associated threads, it seems like no one is popping up and saying "I did this and it worked/didn't work". I guess I have to be the guinea pig?

wilderness

Msg#: 3722219 posted 7:39 pm on Aug 13, 2008 (gmt 0)

Still - in reading all these associated threads, it seems like no one is popping up and saying "I did this and it worked/didn't work". I guess I have to be the guinea pig?

No two websites or servers are identical. Same as restaurants, each offers a different cuisine.
As a result, each webmaster must determine individually what is beneficial or detrimental to their own site(s).
AND that eliminates any copy-and-paste, one-size-fits-all solution.

wilderness

Msg#: 3722219 posted 11:36 pm on Aug 13, 2008 (gmt 0)

This may help as well.

[webmasterworld.com...]

Many thanks to incrediBill for digging out the thread.

Don

g1smd

Msg#: 3722219 posted 11:48 pm on Aug 13, 2008 (gmt 0)

I have used the Deny Asia list of IP blocks on a site that covers a specialist service for, at most, two counties of one country. There is no reason why anyone outside that country would ever need to look at the site, except for legitimate search engine bots. After multiple Asian bots and scrapers were found on the site, both the AlexK script and the Deny Asia list were added, and most of the problems went away overnight.
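
For anyone who hasn't seen it, the Deny Asia list is essentially a long series of allow/deny directives for httpd.conf or .htaccess. A sketch of the general form (the two ranges here are illustrative, not taken from the actual list):

Order Allow,Deny
Allow from all
# Under Order Allow,Deny, a matching Deny overrides the Allow, so these ranges get a 403
Deny from 58.0.0.0/8
Deny from 121.0.0.0/8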

venti

Msg#: 3722219 posted 3:29 am on Aug 14, 2008 (gmt 0)

I like the idea of doing it at the server's firewall level, if you can find a way to do it. We use the free MaxMind IP-to-location binary database, which has a 99.3% country identification level and custom APIs that can handle tens of thousands of requests per second. The C++ API in particular is blazing (we use it and a Java API to access the data, and both are very quick). It's updated every month, which is nice. I have no affiliation with these guys, but I must say, after trying numerous paid and free services, that they are the best we have used... and it's free.
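
For Apache users, MaxMind also provides a mod_geoip module that reads the same database, so you can block by country code rather than maintaining raw ranges. A sketch, assuming the module is installed and the database path is correct (the country codes are illustrative, not a recommendation):

GeoIPEnable On
GeoIPDBFile /usr/local/share/GeoIP/GeoIP.dat
# mod_geoip exports GEOIP_COUNTRY_CODE; flag the countries you don't serve
SetEnvIf GEOIP_COUNTRY_CODE (CN|KR|RU) BlockCountry
Order Allow,Deny
Allow from all
Deny from env=BlockCountry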

Marcia

Msg#: 3722219 posted 3:38 am on Aug 14, 2008 (gmt 0)

How about the Slurp that comes out of Japan?

wilderness

Msg#: 3722219 posted 3:54 am on Aug 14, 2008 (gmt 0)

How about the Slurp that comes out of Japan?

Marcia,
I've had the majority of that Class A denied for more than seven years, so when the new Slurp crawling from those IPs appeared, no changes were necessary.

It has not affected the page listings of my sites.

Don

wilderness

Msg#: 3722219 posted 4:14 am on Aug 14, 2008 (gmt 0)

We use the free MaxMind ip to location binary database

Their website offers this under GeoLite, if anybody is interested.

An active implementation of those ranges would still require conversion to rewrite rules, plus comparison against other countries' ranges to verify nothing is missing (possibly creating extra work, and ranges more fine-grained than necessary).

Although it may seem that I'm attempting to discourage folks from this implementation, my intention is rather to provide as much caution and detail as possible. (I've had strict limitations on non-North American visitors in place for more than seven years.)

There's also a website (though I have NOT checked its accuracy):

block a country

These ranges require conversion to rewrites as well - though see the note below.
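
One shortcut worth noting: if you use allow/deny rather than mod_rewrite, Apache's Deny from accepts CIDR notation directly, so a published range can be pasted in without regex conversion (the range here is illustrative):

# No regex conversion needed when using allow/deny
Deny from 41.0.0.0/8

For mod_rewrite, the same range has to become a REMOTE_ADDR pattern like the one sketched earlier in the thread.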

rocknbil

Msg#: 3722219 posted 6:52 pm on Aug 14, 2008 (gmt 0)

I've just had my third website defaced this year, and am getting tired of it.

Have you considered plugging the holes they're using, as opposed to banning IPs?

The reasoning is this: any hacker who intends to gain illegal access to a site is going to do it through a compromised IP. There are probably millions of computers in your country that have been infected with malware or viruses and are compromised. So go ahead and ban most of the planet, and discover the problem hasn't gone away. I see this all the time, with attempts from educational institutions and major ISP IPs in the U.S.

Banning troublesome countries that bring no business value is a good idea, don't get me wrong. I just don't think it will solve your problem.

wheel

Msg#: 3722219 posted 1:43 am on Aug 15, 2008 (gmt 0)

Yes, I'm aware of proxies, but these folks don't seem to use proxies. And I bet they don't use proxies when they're looking for sites to exploit either.

And of course we're plugging the holes. But it's an unpleasant diversion I don't really need when we're as busy as we are right now.

So ultimately, banning many of these countries would get rid of the problem - they don't see me, they don't hack me.

It also fixes a lot of the scraper problem incrediBill brought to light here recently. I'm not pleased with how much of my bandwidth goes to scrapers - it's substantial.
