Sorry if this is a dumb question, as I am not that tech savvy, but it seems like the problem is that these bots create dozens of sessions at once. My site was brought to a crawl before I blocked them via the .htaccess file. But the issue wasn't that a bad bot was crawling the site; it was that there were 150 sessions from that one bot. Is there anything that can restrict each unique individual user/bot to one session max? Wouldn't that solve the problem? Even if the bot was bad, if we could auto-limit it to 1 session max, and say 1 query every 2 seconds, it would essentially eliminate the server load. No human user would need to make a query more often than every second or two.
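From what I've read, Apache's mod_evasive module can do something like this at the server level, blocking an IP that makes too many requests in a short window, though I'm not sure it's the right tool. A rough sketch, assuming the host allows loading the module (these directives go in the server or vhost config rather than .htaccess, and the numbers are just examples, not recommendations):

    <IfModule mod_evasive24.c>
        # Block an IP that requests the same page more than 2 times per second
        DOSPageCount        2
        DOSPageInterval     1
        # Block an IP that makes more than 50 requests site-wide per second
        DOSSiteCount        50
        DOSSiteInterval     1
        # How long (in seconds) a blocked IP stays blocked
        DOSBlockingPeriod   60
    </IfModule>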
Again, sorry if this is a stupid question.
Statistics: Posted by invenio — Thu Jun 20, 2024 8:36 pm