• If “Deny 404 exploit probing” is active, Stop Spammers shows the captcha page but returns a 200 OK status code. It should still return a 404 Not Found.

    The main reason this matters: if links to old exploit locations on your site are still out there from a past exploit, Google will keep listing those non-existent pages because Stop Spammers answers them with a 200 OK. If it returned a 404 instead, those URLs would eventually be delisted as they should be, and URLs that were never indexed in the first place wouldn’t get added to Google under a Stop Spammers page title.

    I’m having this problem with a site that had been exploited before I took it over; Google is listing a bunch of Stop Spammers pages at the old exploit URLs.

    Deactivating “Deny 404 exploit probing” does return the proper 404 header, but it would be nice to keep the Deny feature while still sending a 404 response to search engines (a rough sketch of what I mean is at the end of this post).

    It also seems that this option could end up blocking perfectly legitimate bots, so some protection for crawlers like Google, Bing, etc. would be a good idea.

    Otherwise the option isn’t really usable if you don’t want to block legitimate search engines or end up with bad listings in them.

    https://www.ads-software.com/plugins/stop-spammer-registrations-plugin/
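
    To show the kind of thing I mean, here is a rough sketch of sending the 404 status first and then rendering a challenge page anyway. The hook choice and the placeholder markup are just my guesses, not the plugin’s actual code:

        add_action( 'template_redirect', 'ssp_deny_probe_with_404', 1 );

        function ssp_deny_probe_with_404() {
            // Only act on requests WordPress has already resolved to a 404.
            if ( ! is_404() ) {
                return;
            }

            // Send the correct status so search engines drop (or never index) the URL...
            status_header( 404 );
            nocache_headers();

            // ...but still show a challenge page to a human visitor.
            // Placeholder output; the real plugin would render its captcha form here.
            echo '<p>This page does not exist. If you are a real visitor, complete the challenge to continue.</p>';
            exit;
        }

    As far as I understand, Google goes by the status code rather than the page body, so serving the captcha content with a 404 shouldn’t cause a bad listing.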

  • Very good point.

    It will require a little spaghetti code to save the original 404 state and then retrieve it on the next page in order to set it again (roughly along the lines of the sketch at the end of this reply).

    Search engines should not be a problem: they should never issue a form submit, and they will not be denied access unless they try to leave a comment or log in. Just to be safe, Google is whitelisted. I just noticed that I did not include the Bing and Yahoo whitelists, so I will add those as well, if only for peace of mind.

    The next release will have a way of preserving the 404 status.

    Thanks,

    Keith
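
    Very roughly, the idea is to stash the 404 flag when the probe is first intercepted and read it back on the next request. The function names, transient key, and hook below are only illustrative, not what will actually ship:

        // Call this at the point where the 404 probe is first intercepted:
        // remember, briefly and per visitor IP, that the original request was a 404.
        function ssp_remember_404_state() {
            if ( is_404() ) {
                $key = 'ssp_was_404_' . md5( $_SERVER['REMOTE_ADDR'] );
                set_transient( $key, 1, 5 * MINUTE_IN_SECONDS );
            }
        }

        // On the next request (the challenge/deny screen), re-issue the 404 status.
        add_action( 'template_redirect', 'ssp_restore_404_state' );
        function ssp_restore_404_state() {
            $key = 'ssp_was_404_' . md5( $_SERVER['REMOTE_ADDR'] );
            if ( get_transient( $key ) ) {
                status_header( 404 );
                delete_transient( $key );
            }
        }

    A short-lived transient keyed to the visitor keeps it self-cleaning; a cookie or a query flag would do the same job.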

  • The topic ‘Problem with Deny 404 exploit probing Header Response Code’ is closed to new replies.