• Several of my sites use the KB Robots plugin, which creates a virtual robots.txt for each individual WordPress (MU) site. Virtual meaning there is no physical robots.txt file.

    In W3TC 2.1.7, BoldGrid added code that unconditionally creates a robots.txt file in the root of the WordPress (MU) site, thereby DISABLING every virtual robots.txt response created by KB Robots, preventing crawling of the cache folder (which Autoptimize* uses to store optimized page .css and .js files), and creating sheer havoc in Google Search Console for my sites.
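For context, a "virtual" robots.txt only works while no physical file exists: WordPress core runs the `robots_txt` filter to build the response, and a real file in the web root is served directly by the web server, so the filter never fires. A minimal sketch of how a plugin like KB Robots could hook that filter (the function name and the Disallow directive below are illustrative, not KB Robots' actual code):

```php
<?php
// Hypothetical sketch: serving a "virtual" robots.txt via the
// 'robots_txt' filter. Once W3TC writes a physical robots.txt,
// the web server serves that file and this filter never runs.
function kb_robots_virtual_rules( $output, $is_public ) {
    if ( $is_public ) {
        // Append per-site directives to WordPress's default output.
        $output .= "User-agent: *\n";
        $output .= "Disallow: /example-private/\n"; // illustrative directive
    }
    return $output;
}

// In a plugin file, this would be registered as:
// add_filter( 'robots_txt', 'kb_robots_virtual_rules', 10, 2 );
```

The second filter argument (`$is_public`) lets the callback skip adding directives when the site discourages search engines.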

    Generic_Environment.php:-

    /**
     * Write robots.txt directives to prevent crawl of cache directory.
     *
     * @since 2.1.7
     *
     * @param Config                      $config Configuration.
     * @param Util_Environment_Exceptions $exs    Exceptions.
     *
     * @throws Util_WpFile_FilesystemOperationException with S/FTP form if it can't get the required filesystem credentials.
     */
    private function robots_rules_add( $config, $exs ) {
        Util_Rule::add_rules(
            $exs,
            Util_Rule::get_robots_rules_path(),
            $this->robots_rules_generate(),
            W3TC_MARKER_BEGIN_ROBOTS,
            W3TC_MARKER_END_ROBOTS,
            array()
        );
    }
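For reference, `Util_Rule::add_rules()` writes the directives returned by `robots_rules_generate()` into robots.txt between the two marker constants. The exact marker text and Disallow path below are an assumption based on the docblock ("prevent crawl of cache directory"), not a verbatim dump, but the resulting file would look roughly like:

```
# BEGIN W3TC ROBOTS
User-agent: *
Disallow: /wp-content/cache/
# END W3TC ROBOTS
```

Because this is a physical file in the site root, it takes precedence over any robots.txt response generated dynamically by WordPress.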

    W3TC MUST REMOVE THIS CODE IMMEDIATELY!

    … and not make such discourteously compulsory setting changes to sites in the future.

    * I choose to use Autoptimize rather than W3TC’s Minify, which fails when minifying subpages.

    • This topic was modified 3 years, 2 months ago by iCounsellorUK. Reason: Added info on "Autoptimize"
Viewing 2 replies - 1 through 2 (of 2 total)
  • Plugin Contributor Marko Vasiljevic

    (@vmarko)

    Hello @icounselloruk

    First of all, thank you for the review, and I am sorry about the issue you are experiencing.
    This issue has been reported numerous times, and we have already opened a GitHub issue for it.
    The temporary solution is either to revert to the previous version of W3TC or to change the permissions on the robots.txt file.
    We are working on a fix for this, and I hope that you will reconsider your review.
    Thanks!
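One reading of the permissions workaround above is to pre-create robots.txt with the directives you actually want and make it read-only, so the plugin's filesystem write fails. A sketch (the site-root path and directives are placeholders, and you then maintain the file yourself):

```shell
# Assumption: the WordPress (MU) install lives at /var/www/site.
cd /var/www/site

# Write the directives you want served.
printf 'User-agent: *\nDisallow: /wp-content/cache/\n' > robots.txt

# Make the file read-only so PHP-level rewrites of it fail;
# the web server can still read and serve it.
chmod 444 robots.txt
```

Note this trades one problem for another: a physical robots.txt, even one you control, still overrides any virtual robots.txt generated by a plugin, so reverting W3TC remains the cleaner option for KB Robots users.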

    Plugin Contributor Marko Vasiljevic

    (@vmarko)

    Hello @icounselloruk

    We have released a patch in version 2.1.8. We do apologize for any inconvenience. Please update and do let us know if there are any issues. We will be happy to assist you. Thank you.

  • The topic ‘Compulsory robots.txt creation’ is closed to new replies.