We want to block crawlers that are too fast or that don't follow the instructions in robots.txt. We do this by keeping a list of recent visitors: for every IP address, we remember the timestamps of its recent requests. If an IP makes more than 20 requests in 20s, we block it for an exponentially increasing number of seconds, starting at 60s and doubling each time this happens.
The exact number of requests and the length of the time window (in seconds) can be changed in the config file:
our $speed_bump_requests = 20;
our $speed_bump_window = 20;
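The check described above can be sketched as follows. This is an illustrative sketch, not the actual implementation: the data structures (`%visits`, `%blocked`) and the function name `record_request` are assumptions, and only the two config variables come from the text.

```perl
use strict;
use warnings;

our $speed_bump_requests = 20;   # max requests allowed per window
our $speed_bump_window   = 20;   # window length in seconds

my %visits;    # hypothetical: IP address => arrayref of request timestamps
my %blocked;   # hypothetical: IP address => { until => epoch, seconds => block length }

# Returns 1 if the request is allowed, 0 if the IP is (now) blocked.
sub record_request {
    my ($ip, $now) = @_;

    # Refuse the request while a block is still active.
    if (my $b = $blocked{$ip}) {
        return 0 if $now < $b->{until};
    }

    # Keep only the timestamps that fall inside the sliding window.
    my $cutoff = $now - $speed_bump_window;
    my @recent = grep { $_ > $cutoff } @{ $visits{$ip} // [] }, $now;
    $visits{$ip} = \@recent;

    # Too many requests in the window: block, doubling the previous
    # block length, starting at 60 seconds for a first offence.
    if (@recent > $speed_bump_requests) {
        my $seconds = $blocked{$ip} ? $blocked{$ip}{seconds} * 2 : 60;
        $blocked{$ip} = { until => $now + $seconds, seconds => $seconds };
        return 0;
    }
    return 1;
}
```

Note that the entry in `%blocked` is kept after the block expires, so a repeat offender gets a doubled block length even if the previous block has already run out.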