
Limit number of DB queries?

See this done in PHP, but with MS-Access/ASP?

     

JayCee

7:38 pm on Mar 22, 2005 (gmt 0)

10+ Year Member



Also posted this over on the PHP forum.

I'm a web developer/webmaster and do some limited database/ASP (VBScript) stuff and some SEO.
One of my clients has a new competitor. Their store locator stops you after 5 zip code queries ("..only 4 queries per week or 8 queries per month allowed.."). They are obviously using PHP.

My client wants me to do the same for our MS-Access/ASP (Dreamweaver MX 2004) store locator database, which currently has no limit on the number of ZIP, city name, or phone area code searches.

My client is convinced that this new competitor, which has a new clone of my client's product, got useful info on the client's retail distribution strategy from our database.

We are on a Windows server (obviously) on a virtual shared hosting account, and we're not doing anything else server-side at this point (just the search and results pages).

Can we also create such a query limitation without switching to MySQL and/or away from a Windows server?

Any ideas much appreciated!

--thanks!

txbakers

11:07 pm on Mar 22, 2005 (gmt 0)

WebmasterWorld Senior Member txbakers is a WebmasterWorld Top Contributor of All Time 10+ Year Member



They are obviously using PHP.

What makes you say that? Anything you can do in PHP you can do in any other language as well.

Just because you are on a Windows server doesn't mean you can't do that; you just haven't done it yet.

There are lots of ways to do it. One way would be to write/increment a cookie for each attempt. After 5, you just turn it off. That's an easy way.
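Something along these lines in ASP might do it (just a rough sketch - the cookie name, the message and the limit of 5 are placeholders):

<%
' Rough sketch: count searches in a cookie and stop after five.
Dim searchCount
searchCount = Request.Cookies("SearchCount")
If Not IsNumeric(searchCount) Then searchCount = 0
searchCount = CInt(searchCount) + 1

If searchCount > 5 Then
    Response.Write "Sorry, you have reached this week's search limit."
    Response.End
End If

' Store the updated count and keep the cookie around for a week.
Response.Cookies("SearchCount") = searchCount
Response.Cookies("SearchCount").Expires = Date() + 7
%>

Of course, anyone who clears or blocks cookies sails right past it.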

A more complicated way would be to write something to a DB every time it's run for a certain date. When that hits five, you shut it off.
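Again, only a sketch - it assumes an Access table called SearchLog with VisitorID (text) and SearchDate (date/time) fields, and the searchlog.mdb path is made up too. It's keyed here off the visitor's IP address:

<%
' Count how many searches this visitor has run today, then log this one.
Dim conn, rs, visitorId
visitorId = Request.ServerVariables("REMOTE_ADDR")   ' or a cookie value

Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & _
          Server.MapPath("searchlog.mdb")

Set rs = conn.Execute("SELECT COUNT(*) FROM SearchLog WHERE VisitorID='" & _
         visitorId & "' AND SearchDate >= Date()")

If rs(0) >= 5 Then
    conn.Close
    Response.Write "Daily search limit reached - please come back tomorrow."
    Response.End
End If

conn.Execute "INSERT INTO SearchLog (VisitorID, SearchDate) VALUES ('" & _
             visitorId & "', Now())"
conn.Close
%>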

You might not be able to do it in straight vanilla Dreamweaver, but it certainly is possible.

JayCee

11:46 pm on Mar 22, 2005 (gmt 0)

10+ Year Member



Thanks txbakers,
"Obviously PHP" because "PHP" is in the post string that shows in the browser address box.

I didn't want to prejudice people's ideas by restricting my question, but I'm pretty sure that a cookie would be too easy to intercept/bypass.

Not expecting "perfect" security, since that would probably take a password (which would discourage sales), but hoping for a method that would at least discourage database copying - if not totally stop it.

Maybe a way to block IPs after a certain number of queries and for a limited blocking time would be best?

But how to script that?

lovethecoast

12:15 am on Mar 23, 2005 (gmt 0)

10+ Year Member



Their store locator stops you after 5 zip code queries ("..only 4 queries per week or 8 queries per month allowed.."). They are obviously using PHP.

Because they're doing a certain function, they "obviously" are using PHP? Anything, and I do mean anything, that can be done in PHP can be done in ASP or .NET or a slew of other technologies.

To answer your question, there are multiple ways you can do this. If it's a B2B site, then I'd employ a sign-in script to get hold of the data. If it's B2C, the only real thing you can do is use cookies. You can also keep a DB log of IP addresses and limit access from a specific IP to x number of times a week (useful in your case because most businesses have static IP addresses these days).
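Roughly, the weekly IP check against an Access log could look something like this (the IPLog table, its fields, the iplog.mdb name and the limit are all invented for illustration):

<%
' Count this address's searches over the last seven days before running one.
Dim conn, rs, ip
ip = Request.ServerVariables("REMOTE_ADDR")

Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & _
          Server.MapPath("iplog.mdb")

Set rs = conn.Execute("SELECT COUNT(*) FROM IPLog WHERE IPAddress='" & ip & _
         "' AND HitDate > DateAdd('d', -7, Now())")

If rs(0) >= 5 Then
    conn.Close
    Response.Write "This address has reached its weekly search limit."
    Response.End
End If

conn.Execute "INSERT INTO IPLog (IPAddress, HitDate) VALUES ('" & ip & "', Now())"
conn.Close
%>

Keep in mind that visitors behind a shared proxy or dial-up pool can share an address, so set the limit with that in mind.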

Short of those two things, there's not really much else you can do without developing an ActiveX or Java solution that runs on the client computer and uses some form of uniqueness to limit that particular machine from accessing more than X times a week, month, etc.

Dreamquick

12:58 am on Mar 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



First off, it's slightly unclear how much data we're talking about. If it's only a smaller volume, you should be thinking about tactics to stop a person from extracting your data; if it's a larger volume, you'd want to focus on stopping machines from extracting it.

Honestly, if someone really wants to query your database, they'll find a way. Personally, if it were me and there were a real need to secure the data, I'd take a balanced approach to it:

1. Permanent cookies to "limit" day-to-day usage for normal users.
2. Session variables to limit how many queries can be used in a single session*
3. Server-side trigger to stop one IP from rapidly downloading all the data.
4. Server-side tracking / auditing so you can watch for any patterns.

- Tony

* The session method is only really useful if you can force a user to interact with an intermediate page before they hit the "search" page - stopping an attacker from just requesting the search page directly and forcing them to create a more complex site scraper.

However, if they can just go to the "search" page directly and reset their "per session" counter (for example by requesting a new session every time), then this method falls apart.
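Very roughly, the per-session counter might look like this in ASP (the flag name, redirect target and limit are invented; the idea is that the home page sets the flag before linking to the search page):

<%
' Bounce visitors who didn't come through the home page first.
If Session("cameFromHome") <> "yes" Then
    Response.Redirect "default.asp"
End If

' An empty session variable counts as zero, so this starts at 1.
Session("queryCount") = Session("queryCount") + 1

If Session("queryCount") > 5 Then
    Response.Write "You have reached the search limit for this visit."
    Response.End
End If
%>

As noted above, a visitor who drops their session cookie gets a fresh counter, so treat this as a speed bump rather than a wall.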

JayCee

5:50 pm on Mar 23, 2005 (gmt 0)

10+ Year Member



Thanks Lovethecoast and Tony,

Yes, it's B2C, and the customers go to the store locator ASP search page from the home page (but of course nothing stops them from going directly, once they know the page name).

There are about 10K records, so far.

Sounds like I need to find an ASP script to log IP addresses into a separate database and count them, etc.

Any idea where I could look for such a beast? Maybe it's time to contract a programmer. This is a bit over my head and not something I want to become an expert in at this point.

plumsauce

10:24 pm on Mar 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



nothing stops them from going directly, once they know the page name

Unless there is a way to keep changing the page name - and there is such a beast available for IIS. If you need to know the name of the product, ask in a private message.

 
