Forum Moderators: phranque

Banning traffic worldwide

5:26 pm on Aug 13, 2008 (gmt 0)

WebmasterWorld Senior Member wheel is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I've just had my third website defaced this year, and am getting tired of it. Throw in all the scrapers, and I really don't have any use for traffic from outside the westernized countries.

Is anyone blocking entire swaths of the internet, like Asia, Africa, much of Europe, and so on? If so, how does one go about doing so - is there a decent list of IP addresses we can just dump into Apache's config?

5:30 pm on Aug 13, 2008 (gmt 0)

WebmasterWorld Senior Member encyclo is a WebmasterWorld Top Contributor of All Time 10+ Year Member



There are some details in this earlier thread on the same subject:

[webmasterworld.com...]

6:41 pm on Aug 13, 2008 (gmt 0)

WebmasterWorld Senior Member wilderness is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Is anyone blocking entire swaths of the internet, like Asia, Africa, much of Europe, and so on? If so, how does one go about doing so - is there a decent list of IP addresses we can just dump into Apache's config?

It's really NOT that simple!
1) The list of IPs would need conversion to RewriteRules.
2) A single syntax error may result in a 500 error taking down your site(s).

I would suggest RewriteRules for some targeted Class A ranges that are primarily outside your market focus.

Then, after you're assured that these are functioning, proceed slowly to more specific sub-class (B, C and D) rewrites.

Make sure to test your website(s) for functionality after implementing each rewrite, or each series of rewrites.
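
A minimal sketch of that kind of rule, assuming Apache with mod_rewrite enabled (the 58.x Class A is purely illustrative, not a recommendation):

# Return 403 Forbidden to one example Class A (58.0.0.0/8) - illustrative only
RewriteEngine On
RewriteCond %{REMOTE_ADDR} ^58\.
RewriteRule .* - [F]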

7:35 pm on Aug 13, 2008 (gmt 0)

WebmasterWorld Senior Member wheel is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I'll have to have a look. I may not do this through Apache (though I may). Instead, I may do this just by firewalling the entire server. Prolly cuts down on spam too :).
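
A minimal firewall-level sketch, assuming a Linux server with iptables (the ranges are illustrative only):

# Drop two example Class A ranges before they ever reach Apache
iptables -A INPUT -s 58.0.0.0/8 -j DROP
iptables -A INPUT -s 60.0.0.0/8 -j DROP
# List the INPUT chain to confirm the rules took effect
iptables -L INPUT -n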

The above thread link from encyclo brings up an interesting point. I do sometimes get calls through my site from folks in other areas looking for my services - like someone calling from Australia to buy my services for their parents, who live where I am (not Australia). So I need to make sure I don't just block everything.

Still - in reading all these associated threads, it seems like no one is popping up and saying "I did this and it worked/didn't work". I guess I have to be the guinea pig?

7:39 pm on Aug 13, 2008 (gmt 0)

WebmasterWorld Senior Member wilderness is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Still - in reading all these associated threads, it seems like no one is popping up and saying "I did this and it worked/didn't work". I guess I have to be the guinea pig?

No two websites or servers are identical - same as restaurants, each offers a different cuisine.
As a result, each webmaster must determine individually what is beneficial or detrimental to their own site(s).
AND that eliminates a copy-and-paste, one-size-fits-all solution.

11:36 pm on Aug 13, 2008 (gmt 0)

WebmasterWorld Senior Member wilderness is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



This may help as well.

[webmasterworld.com...]

Many thanks to incrediBill for digging out the thread.

Don

11:48 pm on Aug 13, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



I have used the Deny Asia list of IP blocks on a site that covers a specialist service for, at most, two counties of one country. There is no reason why anyone outside that country would ever need to look at the site, except for legitimate search engine bots. After multiple Asian bots and scrapers were found on the site, both the AlexK script and the Deny Asia list were added, and most of the problems went away overnight.
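
For anyone who hasn't seen such a list, its entries are essentially plain Apache allow/deny directives along these lines (a sketch only, assuming Apache 2.x - the ranges shown are illustrative, not taken from the actual list):

# Allow everyone, then deny the listed ranges - illustrative CIDR blocks
Order Allow,Deny
Allow from all
Deny from 58.0.0.0/8
Deny from 60.0.0.0/8
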
3:29 am on Aug 14, 2008 (gmt 0)

5+ Year Member



I like the idea of doing it at the server's firewall level, if you can find a way to do it. We use the free MaxMind IP-to-location binary database, which has a 99.3% country identification rate, and custom APIs which can handle tens of thousands of requests per second. The C++ API in particular is blazing (we use it and a Java API to access the data, and both are very quick). It's updated every month, which is nice. I have no affiliation with these guys, but I must say, after trying numerous paid and free services, that they are the best we have used... and it's free.
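
For reference, a minimal sketch of a single country lookup using MaxMind's legacy GeoIP C API - this assumes libGeoIP is installed and the free GeoLite country database (GeoIP.dat) has been downloaded:

/* Build with: gcc lookup.c -lGeoIP -o lookup */
#include <stdio.h>
#include <GeoIP.h>

int main(void)
{
    /* Load the database into memory for fast repeated lookups */
    GeoIP *gi = GeoIP_open("GeoIP.dat", GEOIP_MEMORY_CACHE);
    if (gi == NULL) {
        fprintf(stderr, "could not open GeoIP.dat\n");
        return 1;
    }

    /* 192.0.2.1 is a documentation address, used here as a placeholder */
    const char *cc = GeoIP_country_code_by_addr(gi, "192.0.2.1");
    printf("country: %s\n", cc ? cc : "unknown");

    GeoIP_delete(gi);
    return 0;
}
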
3:38 am on Aug 14, 2008 (gmt 0)

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



How about the Slurp that comes out of Japan?
3:54 am on Aug 14, 2008 (gmt 0)

WebmasterWorld Senior Member wilderness is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



How about the Slurp that comes out of Japan?

Marcia,
I've had the majority of that Class A denied for more than seven years, thus when the new Slurp IP crawling appeared, no changes were necessary.

It has not affected the page listings of my sites.

Don

4:14 am on Aug 14, 2008 (gmt 0)

WebmasterWorld Senior Member wilderness is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



We use the free MaxMind IP-to-location binary database

Their website offers this under GeoLite, if anybody is interested.

An active implementation of the ranges would still require conversion to rewrites, plus comparison against other countries' ranges to verify nothing is missing (possibly creating extra work, and ranges more narrowly focused than necessary).

Although it may seem that I'm attempting to discourage folks from this implementation, my intention is rather to provide as much caution and detail as possible. (I've had strict limitations on non-North American visitors in place for more than seven years.)

There's also a website (note: I have NOT checked its accuracy):

block a country

These ranges require conversion to rewrites as well.
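
As a sketch of what that conversion looks like in practice, several Class A ranges can be collapsed into one rule set (the octets here are illustrative only, not taken from that site):

# Block several example Class A ranges with a single forbidden rule
RewriteEngine On
RewriteCond %{REMOTE_ADDR} ^(58|59|60|61)\. [OR]
RewriteCond %{REMOTE_ADDR} ^222\.
RewriteRule .* - [F]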

6:52 pm on Aug 14, 2008 (gmt 0)

WebmasterWorld Senior Member rocknbil is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I've just had my third website defaced this year, and am getting tired of it.

Have you considered plugging up the holes they're using, as opposed to banning IPs?

The reasoning is this: any hacker who intends to gain illegal access to a site is going to do it through a compromised IP. There are probably millions of computers in your own country that have been infected with malware or viruses and are compromised. So go ahead and ban most of the planet, and you'll discover the problem hasn't gone away. I see this all the time, with attempts coming from educational institutions and major ISP IPs in the U.S.

Banning troublesome countries that bring no business value is a good idea, don't get me wrong. I just don't think it will solve your problem.

1:43 am on Aug 15, 2008 (gmt 0)

WebmasterWorld Senior Member wheel is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Yes, I'm aware of proxies. But these folks don't seem to use proxies. And I bet they don't use proxies when they're looking for sites to exploit, either.

And of course we're plugging the holes. But it's an unpleasant diversion I don't really need when we're as busy as we are right now.

So ultimately, banning many of these countries would get rid of the problem - they don't see me, they don't hack me.

It also fixes a lot of the scraper problem incrediBill brought to light here recently. I'm not pleased with how much of my bandwidth goes to scrapers - it's substantial.

 
