Compulsory robots.txt creation
Several of my sites use the KB Robots plugin, which creates a virtual robots.txt for each individual WordPress (MU) site. "Virtual" meaning there is no physical robots.txt file on disk.
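For context, this is roughly how a virtual robots.txt works: WordPress builds the response in do_robots() and passes it through the core robots_txt filter, but only when no physical robots.txt exists in the site root (a physical file is served by the web server before WordPress ever loads). A minimal sketch, assuming a plugin hooks that filter the way KB Robots presumably does; the option name below is hypothetical:

// Minimal sketch of a per-site virtual robots.txt using WordPress's
// core 'robots_txt' filter. Nothing is written to disk; a physical
// robots.txt in the site root bypasses this entirely.
add_filter(
	'robots_txt',
	function ( $output, $is_public ) {
		if ( ! $is_public ) {
			// Site discourages indexing; keep the default output.
			return $output;
		}
		// Hypothetical option holding this site's custom rules.
		$rules = get_option( 'my_virtual_robots_rules', '' );
		return '' !== $rules ? $rules : $output;
	},
	10,
	2
);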
In W3TC 2.1.7, BoldGrid added code that compulsorily creates a physical robots.txt file in the root of each WordPress (MU) site. This DISABLES every virtual robots.txt response created by KB Robots, blocks crawling of the cache folder (which Autoptimize* uses to store its page-optimized .css and .js), and has created sheer havoc in Google Search Console on my sites.
Generic_Environment.php:-
/**
 * Write robots.txt directives to prevent crawl of cache directory.
 *
 * @since 2.1.7
 *
 * @param Config                      $config Configuration.
 * @param Util_Environment_Exceptions $exs    Exceptions.
 *
 * @throws Util_WpFile_FilesystemOperationException with S/FTP form if it can't get the required filesystem credentials.
 */
private function robots_rules_add( $config, $exs ) {
	Util_Rule::add_rules(
		$exs,
		Util_Rule::get_robots_rules_path(),
		$this->robots_rules_generate(),
		W3TC_MARKER_BEGIN_ROBOTS,
		W3TC_MARKER_END_ROBOTS,
		array()
	);
}
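For anyone checking whether their sites were affected: the result is a physical robots.txt in the site root containing a marker-delimited block. The exact text comes from robots_rules_generate() and the W3TC_MARKER_* constants, so the block below is only my illustration of what it looks like, assuming the default wp-content/cache location:

# BEGIN W3TC ROBOTS
User-agent: *
Disallow: /wp-content/cache/
# END W3TC ROBOTS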
W3TC MUST REMOVE THIS CODE IMMEDIATELY!
… and should not make such discourteous, compulsory changes to site settings in future.
* I choose to use Autoptimize rather than W3TC's Minify, which fails when minifying subpages.