
Forum Moderators: phranque


Ban 75% of the Planet

     

pageoneresults

4:16 pm on Jun 23, 2008 (gmt 0)

WebmasterWorld Senior Member pageoneresults is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Let's say that I'm a local business here in California, and that 98% of my product is shipped within the United States and ordered by consumers residing in the United States. What benefit is there for me in allowing anyone outside a certain IP range to access the site? I don't want visitors from the other side of the planet, and I really don't want them if they're outside our serviced areas. We don't ship product there, never have, and probably never will. So, I'm going to block those countries from accessing regionally specific sites.

Am I making a rash decision? Talk me out of it. We're already starting development on the 403 process right now. In fact, we have the logic in place and my programmer warned me to test this slowly. :)
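The kind of per-request check being described here can be sketched in a few lines of Python using the stdlib `ipaddress` module. Note the ranges below are placeholder documentation networks, not real country allocations; an actual deployment would load per-country CIDR lists from a GeoIP data source:

```python
import ipaddress

# Hypothetical blocked ranges. Real lists would come from a GeoIP
# provider's per-country CIDR data, not be hand-maintained like this.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),  # stand-in for a blocked country
    ipaddress.ip_network("198.51.100.0/24"),
]

def should_403(remote_addr: str) -> bool:
    """Return True if the visitor's IP falls inside any blocked range."""
    ip = ipaddress.ip_address(remote_addr)
    return any(ip in net for net in BLOCKED_NETWORKS)
```

On a match, the server would answer with a 403 Forbidden instead of serving the page.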

In my thirteen (13) years of doing this, I never, ever thought I'd be starting a topic like this!

Our new Website Whitelist. Actually an IP Whitelist, that sounds better and was available!

Everyone is out there focusing on their Blacklist, why not look at this from the Whitelist perspective? Am I missing something other than a few visitors here and there?

jake66

6:57 am on Aug 8, 2008 (gmt 0)

5+ Year Member



I've been trying to figure out how to do this myself.
pageoneresults, have you had any success? If so, how did you achieve it? Are you using something like GeoIP?

wilderness

4:06 pm on Aug 8, 2008 (gmt 0)

WebmasterWorld Senior Member wilderness is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



If so, how did you achieve it? Are you using something like GeoIP?

There's an old thread, either in this forum or another, in which incredBill asked me to pop in and provide a basic explanation.

I was unable to locate the thread!

Here are some other results:
[webmasterworld.com...]
[webmasterworld.com...]

An example of problems that may arise
[webmasterworld.com...]

jake66

8:16 pm on Aug 8, 2008 (gmt 0)

5+ Year Member



Thanks for the links!

A lot of those, though, deal with blocking countries. I am more interested in what pageoneresults has suggested: whitelisting. Obviously, the database for this would be a lot smaller than the alternative.

The last topic, however, is a bit spooky. Gonna have to keep an eye on that one.

wilderness

9:42 pm on Aug 8, 2008 (gmt 0)

WebmasterWorld Senior Member wilderness is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



whitelisting search [google.com]

pageoneresults

10:24 pm on Aug 8, 2008 (gmt 0)

WebmasterWorld Senior Member pageoneresults is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



p1r, I think you deserve the distinction of being nominated as the biggest $h!T disturber on Planet Earth.

Who me? No...

Counterpoint: What if any of those IPs you're blocking would have given you some PR6 links to your site if they had been able to access it?

Oh well. Based on the sheer volume of abuse, if they felt the site was that valuable, maybe they will figure out a way to get on our Whitelist. :)

Unless there's some compelling financial advantage, it just makes good business sense to block access from outside a reasonably targeted area.

I think so too. And, I'm still convinced this IS THE WAY moving forward in most regionally specific instances.

pageoneresults, have you had any success? If so, how did you achieve it? Are you using something like GeoIP?

Heh! I've got a full plate right now and it is a bit down the priority list. I do know there are others working on something similar and they need to get it to market quickly while things are heating up. So do we! My goal is to have this all figured out by 2009-01-01 at which time I'll make an official announcement.

And when I said Ban 75% of the Planet, I don't think I'm too far off on my calculations!

A lot of those though, deal with blocking countries.

Which is a good starting point. They're a given: just block their entire country. It's mostly abuse anyway, and many of them are in Netcraft's "Most Phishiest Countries" report, so who cares? We run a U.S. based network appealing to U.S. based customers. I don't care about the link from whatever country wants to give it to me. Give it to someone else who cares. ;)

I am more interested in what pageoneresults has suggested: whitelisting. Obviously, the database for this would be a lot smaller than the alternative.

It appears to be somewhat of a monumental task to develop the Whitelist, or even to approach it from that perspective. I'm still trying to figure out how to work that out; more brainstorming is required to fully weigh the pros and cons of Blacklisting vs. Whitelisting. I'm getting on the bandwagon now and not waiting until it is too late. The Internet is getting way too big, and it's time to start developing some Trusted Networks that are "outside the box," as they say.

Most of my clients' products are sold and resold through various providers in other countries. If someone were looking for those, they could easily purchase them locally, which many will do. It's those orders coming from abroad that we don't fill anymore. Or, if we do, there's all sorts of paperwork involved and an expense to go with it. For my clients, it just isn't worth the additional time and expense to honor requests from outside the United States (or bordering countries).

Marcia

10:36 pm on Aug 8, 2008 (gmt 0)

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



You are 110% right.

And how much cost-wise (like bandwidth) would be saved by excluding scrapers and bad bots geographically? I was just looking in WT at a ton of 404s from long-gone pages in a directory that's excluded in robots.txt.

g1smd

11:56 pm on Aug 13, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



I have seen an approach with a whitelist that allows complete access, a blacklist that allows no access, and then everything else gets an interstitial CAPTCHA for every page.

With occasional analysis of the log file produced by the CAPTCHA system to identify IP blocks and move them into either the blacklist or whitelist, the system seems fairly robust.
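The three-way routing described above can be sketched as a simple classifier. This is just a minimal illustration, not anyone's production system; the networks are placeholder documentation ranges, and in practice the two lists would grow out of reviewing the CAPTCHA log:

```python
import ipaddress

# Hypothetical lists; in practice these grow from the CAPTCHA log review.
WHITELIST = [ipaddress.ip_network("192.0.2.0/24")]
BLACKLIST = [ipaddress.ip_network("203.0.113.0/24")]

def route(remote_addr: str) -> str:
    """Classify a visitor: 'allow', 'deny', or 'captcha' for the unknown middle."""
    ip = ipaddress.ip_address(remote_addr)
    if any(ip in net for net in WHITELIST):
        return "allow"     # complete access, no interstitial
    if any(ip in net for net in BLACKLIST):
        return "deny"      # no access at all
    return "captcha"       # interstitial CAPTCHA; log the IP for later review
```

Everything that lands in the `"captcha"` bucket is exactly the traffic whose log entries get analyzed later and promoted into one list or the other.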

This 37 message thread spans 2 pages.
