grandma_genie - 4:42 am on Jun 10, 2012 (gmt 0)
I think mighty mean marie had the right idea. I've taken a heavy hand on IP ranges. Would it make more sense to just allow the ranges I want and block the rest?
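Just to show what I mean, a whitelist in .htaccess could look something like this. These ranges are made-up placeholders (documentation ranges), not real USA/Canada allocations, and this is Apache 2.2 syntax; 2.4 uses Require ip instead:

```apache
# Deny everyone by default, then allow only the ranges I trust.
Order Deny,Allow
Deny from all
# Placeholder CIDR blocks - substitute the ranges you actually want
Allow from 192.0.2.0/24
Allow from 198.51.100.0/24
```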
Lucy, that visitor just clicked on the New Products link, which brings up page after page of new products all the way to page 45, so they were acting like a person. It was about a page a minute. Is that too fast for a person?
What I would like to do is only allow USA/Canada IPs and get a script to hinder scrapers. That should at least help slow the stinkers down. Is there a script available for novices that just prevents visitors from grabbing too much too fast?
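The kind of thing I'm picturing is a sketch like this (my own rough idea, not a ready-made product): keep a list of recent request times per IP, and flag anyone who grabs more than so many pages inside a time window. The names and the 20-pages-per-minute threshold are just assumptions for illustration.

```python
import time
from collections import defaultdict, deque

WINDOW = 60    # seconds in the sliding window
MAX_HITS = 20  # pages per window before we call it scraping

# ip -> timestamps of that IP's recent requests
hits = defaultdict(deque)

def too_fast(ip, now=None):
    """Record one request from `ip` and return True if it has
    exceeded MAX_HITS requests in the last WINDOW seconds."""
    now = time.time() if now is None else now
    q = hits[ip]
    q.append(now)
    # Drop timestamps that have aged out of the window
    while q and now - q[0] > WINDOW:
        q.popleft()
    return len(q) > MAX_HITS
```

You'd call too_fast() once per page request and, when it returns True, serve a 403 or a "slow down" page instead of the content. A real site would want this hooked into the server rather than a standalone script, but the counting logic would be the same.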
Don, thanks for the list. That will make my list in htaccess a lot shorter, which is the goal.
dstiles, I don't want to block you and I really don't like to have to block at all. I must say I am seeing fewer of the code injection attacks in the server logs. Most of the stuff I am seeing lately is just scraping. So far the only way I can see to identify a scraper is the speed at which they scrape: too fast for a casual visitor.
I guess I could make it so anyone who visits my site has to register to get in. Has anyone tried that?