Hi @polina92,
If there are no records of Bingbot being blocked or tracked in your Wordfence Live Traffic, but you are seeing that reported elsewhere, you will need to consult the providers of that plugin to see whether anything is incorrect. We recommend against blocking Bingbot. You can check whether these IPs belong to genuine bots, as these services publicly state, or at least provide a lookup for, the IPs they use. Flat-out blocking a crawler in robots.txt can cause issues with SEO rankings, amongst other things, but of course this is at your discretion.
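As an aside, the standard way to check whether an IP really belongs to Bingbot is forward-confirmed reverse DNS: reverse-resolve the IP, check that the name falls under search.msn.com, then confirm that name resolves back to the same IP. Here is a minimal Python sketch of that check (the function names are my own, and the live lookups of course require network access):

```python
import socket

BING_SUFFIX = ".search.msn.com"

def hostname_is_bing(hostname):
    """Pure check: is this reverse-DNS name under Bing's crawler domain?"""
    return hostname == BING_SUFFIX.lstrip(".") or hostname.endswith(BING_SUFFIX)

def is_genuine_bingbot(ip):
    """Forward-confirmed reverse DNS: reverse-resolve the IP, check the
    domain, then confirm the name resolves back to the same IP."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]             # reverse lookup
    except socket.herror:
        return False
    if not hostname_is_bing(hostname):
        return False
    try:
        return ip in socket.gethostbyname_ex(hostname)[2]  # forward confirmation
    except socket.gaierror:
        return False
```

A request whose user agent claims to be Bingbot but whose IP fails this check is an impostor, and blocking it carries no SEO risk. Google documents the same verification method for Googlebot.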
How Google and other crawlers are treated can be set in the Rate Limiting section of Wordfence > All Options. Whilst Bingbot and DuckDuckGo are not specifically allowlisted by default, they can be controlled through the crawler options on this page if you wish.
I generally set my Rate Limiting Rules to these values to start with:
Rate Limiting Screenshot
- If anyone’s requests exceed – 240 per minute
- If a crawler’s page views exceed – 120 per minute
- If a crawler’s pages not found (404s) exceed – 60 per minute
- If a human’s page views exceed – 120 per minute
- If a human’s pages not found (404s) exceed – 60 per minute
- How long is an IP address blocked when it breaks a rule – 30 minutes
I also always set the rules to Throttle instead of Block. Throttling is generally better than blocking because any good search engine understands what happened if it is mistakenly throttled, and your site isn't penalized because of it. Make sure to set your Rate Limiting Rules realistically, and set the duration an IP is blocked to 30 minutes or so.
Just as a note: running two security plugins concurrently can cause performance issues when overlapping features scan for the same vulnerabilities or check the same files at the same time. It would be advisable to choose your preferred security plugin, or disable the overlapping features in one where possible, and stick with that going forward.
Thanks,
Peter.