WP's default robots.txt bad for Google?
-
So WordPress's default robots.txt file looks like this:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

Seems logical enough. However, we have been investigating a site, and Google is complaining about not being able to see the JavaScript files that live under the wp-includes directory.
In June 2014, Google rolled out the Panda update. Since then, they apparently very much care about being able to see all of the JavaScript and CSS files a site includes.
Why would Google care about JavaScript/CSS?
Because JavaScript/CSS can be used to alter what a search engine sees versus what the user really sees (hiding heavily optimized content for search engines, for example), and that has long been a no-no in Google's eyes.

Any theme/plugin that makes use of wp_register_script (for example wp_register_script('myjs', $srcfile, array('jquery'));) relies on pulling the WP-provided jquery.js file from the wp-includes folder.
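To make that concrete, here is a minimal sketch of how a theme ends up depending on files under wp-includes. The handle name and script path are hypothetical, but the jQuery dependency mechanism is standard WordPress: declaring 'jquery' as a dependency makes WordPress also load its bundled copy from /wp-includes/js/jquery/jquery.js, which the default robots.txt blocks from crawlers.

<?php
// Hypothetical example: register and enqueue a theme script that depends on core jQuery.
function myjs_enqueue_assets() {
    wp_register_script(
        'myjs',                                          // handle (hypothetical)
        get_stylesheet_directory_uri() . '/js/myjs.js',  // the theme's own script (hypothetical path)
        array( 'jquery' )                                // depend on the jQuery bundled in wp-includes
    );
    wp_enqueue_script( 'myjs' );
}
add_action( 'wp_enqueue_scripts', 'myjs_enqueue_assets' );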
So if I'm reading the chatter from Google searches on this correctly, WordPress's default Disallow of /wp-includes/ is terrible!
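For discussion's sake, one possible workaround (my own sketch, not an official WordPress recommendation) would be to keep the existing rules but explicitly re-allow the script and style subdirectories. Google applies the most specific (longest) matching rule, so the Allow lines should win for files under those folders:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Allow: /wp-includes/js/
Allow: /wp-includes/css/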
I’m hoping for some discussion here from folks that may know a bit more than I’ve researched so far.