Automatically block IPs doing more than 1 request per second

Created on 11 June 2025

Problem/Motivation

Great initiative, I very much appreciate it, and it will be a very useful tool to keep bots in check.

Some bots will pound the server intensively for a few minutes, take an hour off, and then return and repeat all over.

So another simple and, importantly, automatic method to block bad bots could be to log visitor IPs for a short period such as 2-3 minutes, tally up the total requests per IP, and block the IPs which made more than, for example, 1 request per 2 seconds.

A human user shouldn't request multiple page loads per second, so a reasonable expectation could be, for example, no more than 20 requests per minute.

Steps to reproduce

Proposed resolution

I suggest adding a more generalized "Max. requests over time period" rule, because I have a website where five or six facet selections by a human user are to be expected. Intense pounding by a bot is problematic mostly due to the rapid requests, not the number of facets.

These parameters could be configurable, so the website admin can set them as they see fit, for example:

  • Logging interval: 1, 2, 3, 4 or 5 minutes (default 2 min.?)
  • Max. requests per minute: 20, 30, 40, 50 or 60 (default 30?)

An argument for using a short logging interval of just a few minutes is that some bots will pound the server intensively for a few minutes, take an hour off, and then return and repeat.

For this reason, logging intervals of just a few minutes might not only save database or memory space, but also work best. A rough code sketch of such a rule follows below.
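To make the idea concrete, here is a minimal sketch of how such a "max requests per time window" rule could be enforced per client IP in a custom Drupal module, using core's flood service to count and expire hits. Everything here is illustrative and hedged: the module name my_bot_limiter, the config keys window and threshold, and the event priority are assumptions for the sketch, not the actual implementation of Facet Bot Blocker or Crawler Rate Limit.

<?php

namespace Drupal\my_bot_limiter\EventSubscriber;

use Drupal\Core\Config\ConfigFactoryInterface;
use Drupal\Core\Flood\FloodInterface;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Event\RequestEvent;
use Symfony\Component\HttpKernel\KernelEvents;

/**
 * Illustrative sketch: blocks IPs that exceed a configurable request rate.
 */
class RateLimitSubscriber implements EventSubscriberInterface {

  public function __construct(
    protected FloodInterface $flood,
    protected ConfigFactoryInterface $configFactory,
  ) {}

  public static function getSubscribedEvents(): array {
    // Run early in the request, before expensive page building.
    return [KernelEvents::REQUEST => ['onRequest', 250]];
  }

  public function onRequest(RequestEvent $event): void {
    if (!$event->isMainRequest()) {
      return;
    }

    // Hypothetical config: logging interval in seconds (e.g. 120 = 2 minutes)
    // and the request ceiling for that interval (e.g. 60 = 30 requests/minute).
    $config = $this->configFactory->get('my_bot_limiter.settings');
    $window = $config->get('window') ?? 120;
    $threshold = $config->get('threshold') ?? 60;

    $ip = $event->getRequest()->getClientIp();

    if (!$this->flood->isAllowed('my_bot_limiter.request', $threshold, $window, $ip)) {
      // Over the limit: answer with 429 and skip normal page delivery.
      $response = new Response('Too many requests.', 429, ['Retry-After' => (string) $window]);
      $event->setResponse($response);
      return;
    }

    // Record this hit; flood entries expire automatically after $window seconds.
    $this->flood->register('my_bot_limiter.request', $window, $ip);
  }

}

In a real module the subscriber would also need a services.yml entry wiring in the flood and config.factory services, plus a settings form for the two values; handling the block even earlier, for example at the web server, would keep the overhead lower still.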

Remaining tasks

User interface changes

API changes

Data model changes

✨ Feature request
Status

Active

Component

Code

Created by

🇩🇰 Denmark ressa Copenhagen


Comments & Activities

  • Issue created by @ressa
  • 🇺🇸 United States bburg Washington D.C.

    Not sure if Crawler Rate Limit → is on your radar, but it seems like that can meet this feature request easily enough. I'm using the module myself on a number of sites, and my main wish is that it had some sort of reporting on how much traffic it actually blocks. While I use it, I can't tell whether it's effective or not. In fact, that was the inspiration to create a dashboard for the Facet Bot Blocker module.

  • 🇩🇰 Denmark ressa Copenhagen

    Thanks for the tip, and I can see that I bookmarked the project back in March, when I got hit by the first wave of bots, but forgot about it ...

    But thanks for reminding me, because that does look a lot like what I request in this Issue Summary. I agree that it would be nice to have some very basic logging of its effectiveness, to see if it works, misses the bots, blocks too much, etc.

    I found the issue where you suggest some logging. Perhaps, if you have the time and it doesn't take a big effort, you could make an MR in that issue adding some basic logging? I would be ready to test any MRs, and if nothing else, those interested could then patch the module to get logging.

    Since the feature I request is covered by Crawler Rate Limit →, I'll close this issue.

  • 🇩🇰 Denmark ressa Copenhagen

    Also, I see now that a lot of great examples for reviewing the logs have been added to the Crawler Rate Limit README, so that could probably work well for me: https://git.drupalcode.org/project/crawler_rate_limit#logging-of-rate-li...

    The grep examples could be combined with the script here to send an alert in case a bot really runs amok: https://www.drupal.org/docs/administering-a-drupal-site/security-in-drup... →
