Why would you not want BOTS to cause caching
Just a question about the option under the Admin options titled “Rejected User Agents”, which says:

Strings in the HTTP ’User Agent’ header that prevent WP-Cache from caching bots’, spiders’, and crawlers’ requests. Note that supercached files are still sent to these agents if they already exist.

It contains these default strings:
bot
ia_archive
slurp
crawl

Can I ask why you wouldn’t want these BOTS to cause cached files to be created? Surely if a new post is published and a BOT is the first “user” to visit the page, that visit should generate the cached file for everyone who follows. This matters especially if you post to Twitter, as you get a Twitter Rush of 50+ BOTS hitting your site as soon as the Tweet appears (see https://blog.strictly-software.com/2011/11/twitter-rush-caused-by-tweet-bots.html).
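As I understand it, the option behaves something like the following; a minimal sketch (not WP Super Cache’s actual code), assuming a case-insensitive substring match against the User-Agent header, with is_rejected_agent as a hypothetical helper name:

```php
<?php
// Sketch only: a request skips cache generation if any configured string
// appears, case-insensitively, anywhere in its HTTP User-Agent header.
$rejected_agents = array( 'bot', 'ia_archive', 'slurp', 'crawl' );

function is_rejected_agent( $user_agent, array $rejected_agents ) {
    foreach ( $rejected_agents as $needle ) {
        if ( stripos( $user_agent, $needle ) !== false ) {
            return true; // e.g. "Googlebot/2.1" matches "bot"
        }
    }
    return false;
}

// Googlebot would not trigger cache generation, though per the option's
// note it would still be served an already existing supercached file.
var_dump( is_rejected_agent( 'Mozilla/5.0 (compatible; Googlebot/2.1)', $rejected_agents ) ); // bool(true)
```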
I was going to add a feature to my own Twitter plugin, Strictly TweetBot, that would make a call to any new post/page before posting to Twitter, so that when the Rush comes the bots get served cached pages and don’t all hit the server at the same time, driving up CPU usage etc. A rough sketch of the idea is below.
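Something like this is what I had in mind; a minimal sketch using standard WordPress hooks and the WordPress HTTP API, where strictly_prewarm_cache is a hypothetical function name, not actual plugin code:

```php
<?php
// Pre-warm the cache for a newly published post before the tweet goes out.
add_action( 'publish_post', 'strictly_prewarm_cache', 10, 1 );

function strictly_prewarm_cache( $post_id ) {
    $url = get_permalink( $post_id );
    if ( $url ) {
        // Fire one non-blocking GET so this request generates the cached
        // file; the Twitter Rush then hits the static copy instead of PHP.
        wp_remote_get( $url, array( 'blocking' => false, 'timeout' => 5 ) );
    }
}
```

Note that WordPress’s own loopback request sends a “WordPress/…” User-Agent, which contains none of the rejected strings above, so it would be allowed to generate the cache file.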
Therefore I am wondering why you wouldn’t want BOTS to cause a cached page to be generated, so that the page is already there for real users?
Thanks