Forum Moderators: open
For a bot that isn't honorable (at least from my side of the fence) a solitary page is hammering :)
10 per second?
If I recall correctly, a well-respected poster in these forums mentioned a figure of one, or possibly two, requests per second per page as acceptable. (Observance of robots.txt aside.)
In a post entitled Beware the lovely Lachesis [webmasterworld.com], I was able to illustrate some atrocious behavior.
Additional reading: modified "bad-bot" script blocks site downloads [webmasterworld.com].
Also, there was a post a couple of weeks back (which I can't find at the moment) discussing a 'throttle' of sorts, which I believe would actually slow the bot down.
Maybe someone who knows where that one is will post it?
One thing is certain to me: as webmasters, we need to address the poorly behaved, misconfigured, or just plain stupid [webmasterworld.com] bots that roam the Internet with impunity.
Pendanticist.
I think this is the post you are thinking of: Blocking badly behaved runaway WebCrawlers [webmasterworld.com]
It's a PHP script that looks pretty useful, but I'll need to find some time to rewrite it in Perl before I can try it on my sites, unless someone else does it first.
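The PHP script itself isn't reproduced in this thread, so as a rough illustration only, here is a minimal Python sketch of the sliding-window throttle idea being discussed: track recent request timestamps per client IP and refuse requests once a client exceeds a limit (say, the one-to-two requests per second mentioned above). All names here are my own; this is not the referenced script.

```python
import time
from collections import defaultdict, deque


class RequestThrottle:
    """Flag clients that exceed max_requests within window_seconds.

    A hypothetical sketch of a per-IP sliding-window rate limiter,
    not the PHP/Perl script discussed in the thread.
    """

    def __init__(self, max_requests=2, window_seconds=1.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)  # ip -> recent request timestamps

    def allow(self, ip, now=None):
        """Return True if this request is within the limit, else False."""
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # Drop timestamps that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # over the limit: deny (or delay) this request
        q.append(now)
        return True
```

A real deployment would hook something like this into the web server or a CGI wrapper, and might respond with a delay or a 403 rather than a hard refusal; the "throttle" post mentioned above apparently slowed the bot down rather than blocking it outright.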
tkarade,
No, they access one page of your site per second. While they wait to access you again, they go request another 10 million pages from other sites. So it only takes them 300 seconds to do the whole web. (I'm kidding about the 300 seconds, but you get the idea.) :)
Jim