[Plugin: W3 Total Cache] Risk of Caching 404's
Hi… I run a VPS with a fair number of small, low-traffic, mostly static WP sites. Everything runs smoothly until the occasional directory-scanning bot comes along and issues hundreds of requests for nonexistent files (you know the drill).
WP, of course, happily tries to respond to each request, and the server rapidly runs out of resources. I have mitigated this on the CentOS side with firewalls and other tricks, but I still see these “attacks” causing issues.
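For context, the kind of firewall-side mitigation I mean looks roughly like this (a fail2ban sketch, assuming a standard combined-format Apache access log; the filter/jail names, paths, and thresholds are illustrative, not from any particular guide):

```
# /etc/fail2ban/filter.d/wp-404-scan.conf  (illustrative name)
[Definition]
# Match any request that received a 404 response
failregex = ^<HOST> -.*"(GET|POST|HEAD) .*" 404
ignoreregex =

# /etc/fail2ban/jail.local  (illustrative values)
[wp-404-scan]
enabled  = true
filter   = wp-404-scan
logpath  = /var/log/httpd/*access_log
maxretry = 20
findtime = 60
bantime  = 3600
```

This bans an IP that racks up 20 404s inside a minute for an hour, which blunts the scan but still lets WP burn CPU on the first 20 misses – hence my interest in the W3TC options below.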
W3TC offers two options for dealing with 404s – caching them, with the downside of returning a 200 status in enhanced disk page cache mode – and “letting the OS handle static request 404s.”
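As I understand it, the second option amounts to something like this (a minimal .htaccess sketch, assuming Apache with mod_rewrite; the extension list is illustrative, not the exact rule W3TC writes):

```
# If the request looks like a static file and it doesn't exist on disk,
# return a plain 404 from Apache instead of handing the request to WordPress
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule \.(jpg|jpeg|png|gif|css|js|ico|svg|woff2?)$ - [L,R=404]
```

The appeal is obvious: the 404 is served before PHP/WordPress is ever invoked, so a scanning bot hammering image/script paths costs almost nothing.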
There are many posts here saying these are bad ideas – but with no real explanation of why. I gather the main risk is that Google/Bing will see a 200 for some period of time if a page/image is removed – but won’t that settle out if proper sitemaps are regularly submitted?
Question – what is the actual risk of these options, and does anyone use them successfully?
Thanks,
ljj