403 errors associated with SEO spiders
When performing SEO site audits, I am getting 403 errors because the SpamFireWall Anti-Flood and Anti-Crawler features are enabled.
Is there a way to whitelist spiders such as Screaming Frog, SEO PowerSuite, Ahrefs, etc.?
This is annoying as it generates false positives in these auditing tools.
Another feature that would be beneficial is the ability to allow or block individual spiders. Currently the plugin states: “Plugin shows SpamFireWall stop page for any bot, except allowed bots (Google, Yahoo and etc).” We should be able to choose which bots are permitted, and the fact that you mention Yahoo but not Bing is a bit concerning…
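To illustrate, here is a minimal sketch of the allowlist behaviour I have in mind. Everything in it is hypothetical: CleanTalk does not expose this exact check, and the user-agent substrings are just examples.

```php
<?php
// Hypothetical sketch only: the plugin does not expose this check,
// and the user-agent substrings below are illustrative examples.

// Auditing tools and crawlers that SpamFireWall should let through.
$seo_spider_allowlist = array(
    'Screaming Frog SEO Spider', // Screaming Frog's default UA
    'AhrefsBot',                 // Ahrefs crawler
    'bingbot',                   // Bing, notably absent from the plugin's list
);

// True when the current request's User-Agent matches an allowlisted
// spider, so the firewall could skip its 403 stop page.
function is_allowlisted_spider( array $allowlist ) {
    $ua = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '';
    foreach ( $allowlist as $needle ) {
        if ( stripos( $ua, $needle ) !== false ) {
            return true;
        }
    }
    return false;
}
```

Matching on User-Agent alone is spoofable, of course, so a per-bot IP allowlist would be even better, but this is the general idea.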