• The option: “404 Detection

    404 detection watches for a user who is hitting a large number of non-existent pages and therefore generating a large number of 404 errors. It assumes that a user who hits many 404 errors in a short period of time is scanning for something (presumably a vulnerability) and locks them out accordingly. This has the added benefit of helping you find hidden problems that cause 404 errors on overlooked parts of your site. All errors are logged on the “View Logs” page. You can set thresholds for this feature below.”
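    In essence, the feature described above is a sliding-window rate limit on 404 responses. A minimal Python sketch of that kind of lockout logic, purely illustrative (the threshold names and defaults are assumptions, not the plugin’s actual settings or code):

        import time
        from collections import defaultdict, deque

        # Illustrative thresholds -- the plugin exposes similar settings,
        # but these names and defaults are assumptions, not its real config.
        ERROR_THRESHOLD = 20      # 404s allowed per window before lockout
        CHECK_PERIOD = 5 * 60     # sliding window, in seconds
        LOCKOUT_PERIOD = 15 * 60  # how long an offending IP stays locked out

        _hits = defaultdict(deque)  # ip -> timestamps of recent 404s
        _locked_until = {}          # ip -> unix time the lockout expires

        def record_404(ip, now=None):
            """Record a 404 for this IP; return True if it is now locked out."""
            now = time.time() if now is None else now
            if _locked_until.get(ip, 0) > now:
                return True
            window = _hits[ip]
            window.append(now)
            # Drop timestamps that have aged out of the check period.
            while window and window[0] < now - CHECK_PERIOD:
                window.popleft()
            if len(window) >= ERROR_THRESHOLD:
                _locked_until[ip] = now + LOCKOUT_PERIOD
                window.clear()
                return True
            return False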

    Is there a risk that Googlebot or other spiders might get locked out by this? We often have large numbers of 404s because discussion forum threads get deleted, and so on; Google Search Console will sometimes alert us to hundreds of them. I would hate to have Google locked out of accessing my pages because of this option.
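    For context, Google documents a way to tell genuine Googlebot traffic from impostors: a reverse DNS lookup on the requesting IP should resolve to a host under googlebot.com or google.com, and a forward lookup on that host should return the same IP. A sketch of that check (the function name is my own; this is not something the plugin is known to do):

        import socket

        GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

        def is_verified_googlebot(ip):
            """Verify a claimed Googlebot IP via reverse DNS plus a
            confirming forward lookup, per Google's documented method."""
            try:
                host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS
                if not host.endswith(GOOGLE_SUFFIXES):
                    return False
                _, _, forward_ips = socket.gethostbyname_ex(host)  # forward DNS
                return ip in forward_ips  # must round-trip to the same IP
            except (socket.herror, socket.gaierror):
                return False

    A whitelist built on a check like this would let a lockout feature exempt verified crawlers while still blocking scanners that merely spoof the Googlebot user agent.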
