• strictly-software

    (@strictly-software)


    Just a question about the option in the Admin options under the title “Rejected User Agents”, which says:

    Strings in the HTTP ’User Agent’ header that prevent WP-Cache from caching bots’, spiders’, and crawlers’ requests. Note that super cached files are still sent to these agents if they already exist.

    It contains these default strings:

    bot
    ia_archive
    slurp
    crawl

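    As I understand it, the check is just a case-insensitive substring match on the User-Agent header, roughly like this Python sketch (an illustration of the behaviour described above, not the plugin’s actual PHP code):

    # Illustrative sketch only -- not the plugin's actual PHP.
    # A request from a rejected agent is still answered (and an existing
    # supercached file is still served), but it never *creates* a new cache entry.

    REJECTED_UA_STRINGS = ["bot", "ia_archive", "slurp", "crawl"]
    CACHE: dict[str, str] = {}  # path -> cached HTML (stands in for supercache files on disk)

    def is_rejected_agent(user_agent: str) -> bool:
        """Case-insensitive substring match against the rejected list."""
        ua = user_agent.lower()
        return any(needle in ua for needle in REJECTED_UA_STRINGS)

    def handle_request(path: str, user_agent: str, render_page) -> str:
        if path in CACHE:
            return CACHE[path]        # already cached: served to bots and humans alike
        html = render_page(path)      # cache miss: build the page dynamically
        if not is_rejected_agent(user_agent):
            CACHE[path] = html        # only non-rejected agents create a new cache file
        return html
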
    Can I ask why you wouldn’t want these BOTS to cause cached files to be created? Surely if a new post is generated and a BOT is the first “user” to visit the page, its request could create the cached file for everyone who follows. This matters especially if you post to Twitter, as you get a Twitter Rush of about 50+ BOTS hitting your site as soon as the Tweet appears (see https://blog.strictly-software.com/2011/11/twitter-rush-caused-by-tweet-bots.html).

    I was going to add a feature to my own Twitter plugin, Strictly TweetBot, that would make a call to any new post/page before posting to Twitter, so that when the Rush comes the bots get served cached pages and don’t all hit the server at the same time, driving up CPU usage etc.
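
    Roughly this idea, sketched in Python for illustration (the real hook would be PHP inside WordPress, and the example URL is made up):

    # Rough sketch of the "warm the cache before tweeting" idea.
    import urllib.request

    def prime_cache(post_url: str, timeout: float = 10.0) -> bool:
        """Fetch the new post once so a cached copy exists before the Twitter Rush arrives."""
        request = urllib.request.Request(
            post_url,
            # An ordinary browser-like User-Agent so the request is NOT matched by the
            # "Rejected User Agents" list and therefore does create a cache file.
            headers={"User-Agent": "Mozilla/5.0 (cache warm-up)"},
        )
        try:
            with urllib.request.urlopen(request, timeout=timeout) as response:
                response.read()
                return response.getcode() == 200
        except OSError:
            return False

    # e.g. call prime_cache("https://example.com/?p=123") just before posting the Tweet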

    Therefore I am wondering why you wouldn’t want BOTS to cause a cached page to be generated, so that the page is already cached for real users when they arrive?

    Thanks

    https://www.ads-software.com/extend/plugins/wp-super-cache/

  • I am wondering this also.

  • I’ve wondered that too, but based on comments elsewhere (e.g. “keeping cached files on disk is ‘expensive’”), my guess is that people have limited disk space and so don’t want posts cached that only bots ever visit.

    I always erase everything out of the user-agent box, and preload all posts, so everything is cached for everyone.
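
    If you can’t or don’t want to use the preload option, a manual stand-in is simply to fetch every post URL once, e.g. from the sitemap. A rough Python sketch; the sitemap URL is an assumption about your site, and the plugin’s built-in preload does this for you anyway:

    # Manual stand-in for preloading: fetch every URL in the sitemap once
    # so a cached copy exists before any visitor (or bot) arrives.
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical URL, adjust for your site
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    def preload_from_sitemap(sitemap_url: str = SITEMAP_URL) -> None:
        """Fetch every URL listed in the sitemap once so each page gets cached."""
        with urllib.request.urlopen(sitemap_url, timeout=10) as response:
            root = ET.fromstring(response.read())
        for loc in root.findall(".//sm:loc", NS):
            url = (loc.text or "").strip()
            if not url:
                continue
            try:
                urllib.request.urlopen(url, timeout=10).read()  # warms the cache for this page
                print("primed", url)
            except OSError as error:
                print("failed", url, error)

    if __name__ == "__main__":
        preload_from_sitemap()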

  • The topic ‘Why would you not want BOTS to cause caching’ is closed to new replies.