So, does anyone have any experience with this package, good or bad? Or is there another package I should be looking at instead? Thanks for the help.
Okay, so I guess I'm not going with BotBuster. There are still a couple of things that puzzle me: one, why there's so little discussion (here or on the rest of the net) about the only product I could find that fights bots, and two, why there is no legitimate bot-fighting software available off the shelf from any other source. There certainly seems to be a market for this kind of software, so why has no one come up with it? I have money in hand and no one to pay.
Maybe if you go back to your own script, you could fill that niche.
Someone needs to; I don't have the expertise to even think about such an item, let alone start working on it ... and with kwells and others stomping all over the place, botstoppers (and even bot identifiers) would be a useful addition to the arsenal.
It's simply impossible to create an across-the-board implementation.
Different websites have different goals.
As a result, allowed bot traffic must be determined by what is beneficial or detrimental to your own site(s).
Were there such a product, constant monitoring and communication between the software operator and their customers (thus requiring regular fees and a contract covering time spent) would be required for continued functionality.
The entire process (like the rest of the internet) is constantly being revised.
Who would locate all the colos that initiate crawls of no benefit to webmasters? I've had two new ones in the past two days.
It's simply impossible to create an across-the-board implementation
That's not entirely true.
You create all sorts of rules and let the individual webmaster pick and choose which rules to apply to their situation.
If it's an all-or-nothing situation, where you either use all the rules or don't use it at all, then it's unlikely to be used.
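To make the pick-and-choose idea concrete, here is a minimal sketch, assuming a Python script sitting in front of page requests; the rule names, defaults, and request format are made up for illustration:

```python
# Hypothetical rule names and per-site defaults -- purely illustrative.
RULES = {
    "block_blank_user_agent": True,    # deny requests with no User-Agent at all
    "block_known_datacenters": True,   # deny hits from data-center / colo IP ranges
    "rate_limit_per_ip": False,        # off for a site whose human traffic is bursty
}

def allowed(request, rules=RULES):
    """Apply only the rules this particular webmaster has switched on."""
    if rules["block_blank_user_agent"] and not request.get("user_agent"):
        return False
    # ...each remaining enabled rule would be checked the same way...
    return True

print(allowed({"user_agent": ""}))             # False -- blank-UA rule fires
print(allowed({"user_agent": "Mozilla/5.0"}))  # True -- nothing enabled objects
```

Each site keeps its own copy of the toggles, so one product can still fit sites with different goals.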
Who would locate all the colos that initiate crawls of no benefit to webmasters?
You just block all data centers in the first place.
Plus, you don't have to look for them because crawls can be detected and stopped in real-time without knowing it's a bot in advance.
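If it helps to picture it, the check against known data-center ranges is only a few lines; a sketch in Python, with documentation-only example networks standing in for whatever colo ranges you actually maintain:

```python
import ipaddress

# Example ranges only -- substitute the data-center/colo blocks you actually track.
DATACENTER_RANGES = [
    ipaddress.ip_network("192.0.2.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def from_datacenter(ip_string):
    """Return True if the client IP falls inside a tracked data-center range."""
    ip = ipaddress.ip_address(ip_string)
    return any(ip in network for network in DATACENTER_RANGES)

print(from_datacenter("192.0.2.44"))   # True  -- block or challenge it
print(from_datacenter("203.0.113.7"))  # False -- let it through
```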
Detecting a single page access by a stealth bot is sometimes hard to do unless you block based on the source of the crawl or there are tells in the user agent or HTTP headers.
However, if that bot accesses multiple pages, automated detection and deflection become somewhat trivial with the right script.
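A rough sketch of that kind of script, assuming you can hook every page request, with an arbitrary threshold of pages-per-window that you would tune for your own site:

```python
import time
from collections import defaultdict, deque

# Hypothetical threshold: more than 20 pages in 10 seconds looks automated.
MAX_PAGES, WINDOW_SECONDS = 20, 10

history = defaultdict(deque)  # client IP -> timestamps of recent page requests

def looks_automated(ip, now=None):
    """Record this request and report whether the IP is crawling too fast."""
    now = now or time.time()
    hits = history[ip]
    hits.append(now)
    while hits and now - hits[0] > WINDOW_SECONDS:
        hits.popleft()            # drop requests that fell out of the window
    return len(hits) > MAX_PAGES  # too many pages too fast -> treat as a bot

# e.g., call looks_automated(client_ip) from the page handler and serve
# a 403 or a challenge page whenever it returns True
```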
Also, blocking all bots that plainly identify themselves is pretty trivial with a whitelisting approach, which can already be accomplished using robots.txt and .htaccess in combination.
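For what it's worth, that combination might look something like the following; the bot names are only examples, and you'd list whichever crawlers you actually want on your whitelist. Compliant crawlers obey the robots.txt; the .htaccess rules catch self-identified bots that ignore it:

```
# robots.txt -- whitelist approach: allow named crawlers, disallow everyone else
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /

# .htaccess (mod_rewrite) -- refuse anything that calls itself a bot
# but isn't on the whitelist
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (bot|crawler|spider) [NC]
RewriteCond %{HTTP_USER_AGENT} !(googlebot) [NC]
RewriteRule .* - [F]
```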
Constant monitoring and communication between the software operator and their customers (thus requiring regular fees and a contract covering time spent) would be required for continued functionality.
Surely a regularly updated online database could be linked to the product, allowing users to keep in touch and to select new bots by category according to need.
Antivirus products typically use a subscription model that provides updates for a year.
Failing that (or in addition), a users' forum could be deputized.
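To illustrate the subscription/database idea, the client side could be little more than a periodic fetch of the vendor's current list; the URL and file format below are entirely hypothetical:

```python
import json
import urllib.request

# Hypothetical endpoint -- stands in for whatever the vendor's update service would be.
UPDATE_URL = "https://example.com/botlist/latest.json"

def fetch_blocklist(url=UPDATE_URL):
    """Pull the latest bot signatures / IP ranges and cache them locally."""
    with urllib.request.urlopen(url, timeout=30) as response:
        blocklist = json.load(response)
    with open("blocklist.json", "w") as cache:
        json.dump(blocklist, cache)
    return blocklist

# Run from cron (daily or weekly) so the local rules stay current between releases.
```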
It's simply impossible to create an across-the-board implementation
That's not entirely true. You create all sorts of rules and let the individual webmaster pick and choose which rules to apply to their situation.
If it's an all-or-nothing situation, where you either use all the rules or don't use it at all, then it's unlikely to be used.
It appears to me that you have a business plan in place?
Perhaps you could sell it to the two inquiring parties in this thread ;)