I hate AI answers. So many words for so little meaning.
I've been working with Fail2Ban for years, and it's great for stopping brute-force attacks against SSH, FTP, Plesk, WordPress, etc., but I'm not sure it can be helpful in this case.
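For context, the classic Fail2Ban use case looks like this, a minimal sketch of a jail (the sshd filter ships with Fail2Ban; the bantime/maxretry values are illustrative, not recommendations):

```ini
[sshd]
enabled  = true
maxretry = 5
findtime = 10m
bantime  = 1h
```

This works because a brute-forcer produces many failures from one IP in a short window. The bots described below break exactly that assumption.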
Some details about the "bad guys" a website admin has to combat today: they can't be identified as easily as in the past.
They use hundreds of different IPs.
They use common user agents like "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/134.0.0.0 Safari/537.36", no scraper or crawler id.
Their IPs have different common user agents.
They switch their IPs every 10 minutes.
They have extensive amounts of bandwidth.
They know their target well and never produce any 404 error.
They load the whole page including images, JS and CSS, not just the HTML like other crawlers.
They behave like common users.
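Since none of these bots trips a per-IP threshold, about the only signal left is aggregation across IPs. As a rough illustration (not something Fail2Ban does out of the box), one could group access-log entries and flag user agents whose requests arrive from many distinct IPs within one time window. The log format, function name and thresholds here are all assumptions, and since these bots use common browser UAs, real thresholds would have to be tuned far above what legitimate Chrome traffic produces:

```python
import re
from collections import defaultdict
from datetime import datetime

# Assumes Apache/nginx "combined" log format; adjust the regex for your logs.
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<req>[^"]*)"'
    r' \d+ \d+ "[^"]*" "(?P<ua>[^"]*)"'
)

def flag_rotating_clients(lines, window_minutes=10, min_ips=5):
    """Flag user agents whose requests come from many distinct IPs
    within one short window -- a rough signal for IP-rotating bots.
    The thresholds are guesses; tune them for your own traffic."""
    buckets = defaultdict(set)  # (user agent, window index) -> set of IPs
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        ts = datetime.strptime(m.group("ts"), "%d/%b/%Y:%H:%M:%S %z")
        window = int(ts.timestamp() // (window_minutes * 60))
        buckets[(m.group("ua"), window)].add(m.group("ip"))
    return {ua for (ua, _), ips in buckets.items() if len(ips) >= min_ips}
```

Even this only narrows things down: a bot that rotates UAs as well as IPs slips through, which is exactly why these crawlers are so hard to block mechanically.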
So, Thomas has a hard job to do, and at the moment the forum is running fast.