Forum Moderators: open


How many requests per minute is ok or normal from an ip?


born2run

10:02 am on Jul 31, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi, so in Cloudflare I'm setting up rate filters that will trigger an alarm based on the number of requests from an IP in a minute.

I wanted to know how many requests per minute is a reasonable threshold, considering search bots can make several per second themselves. Thanks!

keyplyr

11:36 am on Jul 31, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Better to use seconds if you can. However, you really need to consider how many files are on your average page. Modern multi-threaded browsers may send as many as 6 requests in parallel, several batches at a time, which are then fulfilled depending on the server config, router or load balancer. So that's humans.

Now consider beneficial bots, the ones you want crawling your files. The reputable bots usually won't request any more frequently than 2 or 3 per second, and not consistently; the next second may bring only 1 request, and so on. Many "good" bots space their requests out much further than that, and some support Crawl-delay in robots.txt.

So that leaves the so-called "bad" bots. These usually don't request any more frequently than any other bot, precisely to avoid the kind of detection you're asking about. However, occasionally you may see a malicious agent scraping as fast as it can. Those you'd likely be blocking anyway.
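To make the numbers above concrete, here's a minimal sketch of the kind of per-IP counter a rate filter uses: flag an IP that exceeds a per-second threshold over a sliding window. The class and parameter names are illustrative, not Cloudflare's API.

```python
import time
from collections import defaultdict, deque

class RateWatcher:
    """Sliding-window request counter per IP (illustrative sketch)."""

    def __init__(self, limit=3, window=1.0):
        self.limit = limit          # max requests allowed per window
        self.window = window        # window length in seconds
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # drop timestamps that have fallen out of the window
        while q and now - q[0] > self.window:
            q.popleft()
        q.append(now)
        return len(q) <= self.limit
```

With `limit=3, window=1.0`, a reputable bot making 2-3 requests per second passes, while a burst of 4+ in the same second gets flagged.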

[fix typo]

[edited by: keyplyr at 7:37 am (utc) on Aug 1, 2017]

born2run

12:55 pm on Jul 31, 2017 (gmt 0)




Agreed, keyplyr. So, per your advice, I'm not setting any rate limits for now.

However, can I set a rate limit on exactly one URL, for example http://www.example.com/example ?

Please advise.

lucy24

4:10 pm on Jul 31, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Page requests or all requests?

keyplyr

6:54 pm on Jul 31, 2017 (gmt 0)




However can I set a rate limit on exactly one URL for example: http://www.example.com/example ?
Please advise.
• Block if there are redundant requests for the same page more than 3 times within a time frame. Some bots request files very fast, beyond what a browser does.
Source: Blocking Methods [webmasterworld.com]
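That rule is per-URL rather than per-IP: count repeat hits on the same (IP, URL) pair within a time frame. A minimal sketch, with illustrative names and thresholds:

```python
import time
from collections import defaultdict, deque

def make_redundancy_check(max_repeats=3, window=60.0):
    """Flag when the same IP requests the same URL more than
    `max_repeats` times within `window` seconds (illustrative sketch)."""
    seen = defaultdict(deque)  # (ip, url) -> recent request times

    def is_redundant(ip, url, now=None):
        now = time.monotonic() if now is None else now
        q = seen[(ip, url)]
        while q and now - q[0] > window:
            q.popleft()
        q.append(now)
        return len(q) > max_repeats  # the 4th repeat within the window trips it

    return is_redundant
```

Because the key is the (IP, URL) pair, a crawler fetching many different pages is unaffected; only hammering one URL trips the check.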

born2run

12:06 am on Aug 1, 2017 (gmt 0)




So what number of requests does this translate to per minute? This is not a regular file; it's the URL used to log in to the site to post articles (an admin URL).

keyplyr

12:22 am on Aug 1, 2017 (gmt 0)




This is not a file it's a url to login to the site
That information would have been helpful in your opening post.

I wouldn't use a "rate limit" for login pages at all. It's not a good match; you want to treat these with respect.

Members may forget their login credentials, but after 3 failed attempts (i.e. on the 4th try) I'd initiate a password reset procedure. (I do this.)

I wouldn't bother blocking the drive-bys. They're blocked from getting through anyway.
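The approach described above can be sketched as a per-user failure counter: 3 failed tries are tolerated, the 4th triggers a password reset, and a successful login clears the count. All names here are illustrative.

```python
from collections import defaultdict

class LoginGuard:
    """After 3 failed login attempts, the 4th triggers a password
    reset instead of another try (illustrative sketch)."""

    MAX_FAILURES = 3

    def __init__(self):
        self.failures = defaultdict(int)  # username -> consecutive failures

    def record_attempt(self, username, success):
        if success:
            self.failures.pop(username, None)  # clear on successful login
            return "ok"
        self.failures[username] += 1
        if self.failures[username] > self.MAX_FAILURES:
            return "password_reset"  # 4th failed try: force a reset
        return "retry"
```

Unlike a rate limit, this keys on the account rather than the IP, so a forgetful member gets a graceful reset while a credential-stuffing bot never gets a 4th guess.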

born2run

1:00 pm on Aug 1, 2017 (gmt 0)




OK, great, keyplyr, your advice is always valuable. I won't set any rate limit.