Rate Limiting Crawlers and Bots

I'm currently using a tool called Booter - Bots & Crawlers Manager to control the number of requests crawlers send to my website. These crawlers overload my site with excessive requests, causing it to crash daily. However, I've noticed that this tool also throttles requests from regular visitors, which is bad for my business. Is there a way to rate-limit only crawlers and bots, while letting normal users browse the site without any restrictions?
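Conceptually, what I'm after is something like the sketch below. It assumes a Python/Flask stack (my actual setup may differ) and identifies bots by User-Agent substrings, applying a sliding-window limit to them only while regular visitors pass straight through. The signature list and the limits are just placeholders:

```python
import time
from collections import defaultdict, deque

from flask import Flask, abort, request

app = Flask(__name__)

# User-Agent substrings that identify common crawlers (illustrative list;
# extend it with whatever bots show up in your access logs).
BOT_SIGNATURES = ("googlebot", "bingbot", "ahrefsbot", "semrushbot", "crawler", "spider")

# Allow each bot at most MAX_REQUESTS per WINDOW_SECONDS (placeholder values).
MAX_REQUESTS = 10
WINDOW_SECONDS = 60

# Sliding window of recent request timestamps, keyed by client IP.
_hits: dict[str, deque] = defaultdict(deque)


def _is_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)


@app.before_request
def throttle_bots():
    ua = request.headers.get("User-Agent", "")
    if not _is_bot(ua):
        return  # regular visitors are never throttled

    key = request.remote_addr or "unknown"
    now = time.monotonic()
    window = _hits[key]

    # Drop timestamps that have fallen out of the window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()

    if len(window) >= MAX_REQUESTS:
        abort(429)  # Too Many Requests

    window.append(now)


@app.route("/")
def index():
    return "Hello, human (or well-behaved bot)!"
```

I realize User-Agent strings can be spoofed, so a real solution might also need reverse-DNS verification of the big search engines or server-level rules, but this captures the behavior I want: limits for bots, none for people.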