• I just got an email from Google saying:

    Googlebot cannot access CSS and JS files

    on my site.

    Seems the robots.txt file is disallowing wp-admin/ and wp-includes/
    which it has always done.

    BUT there’s CSS and/or JS in there that Google wants to be able to see.

    How to fix? Allow all?

    Has the Googlebot lost its mind?

    THANKS!

Viewing 15 replies - 31 through 45 (of 94 total)
  • I don’t use Wordfence on my sites and I am getting the errors.

    It’s beginning to look like this is a bit of an overzealous crawler bug. Over a few dozen sites that I touch, most are only blocking /wp-admin/, and are getting the warning. There’s no legitimate reason for it to crawl there.

    I think we’ll see this cleared up over the next few days, but as long as you’re not blocking user-facing resources, you should be okay.

    +1, one more thing to worry about.

    So should we simply ignore this?

    I solved the same issue on my blog by editing the robots.txt file.
    Check: Fix: Googlebot cannot access CSS and JS files

    Has anyone looked at this?

    https://yoast.com/robots.txt

    Yoast has almost the same robots.txt as the default:

    User-Agent: *
    Disallow: /out/
    Disallow: /wp-admin/

    I, too, received this notification for dozens of my sites.

    Most of the sites were completely up to date and only /wp-admin/ was being blocked in the robots.txt, which is something WordPress appears to do by default and isn’t related to any plugin.

    When I used the Fetch and Render tool in GWT and saw what Google had rendered, Googlebot was being blocked from accessing some scripts under /wp-admin/, with admin-ajax being the main culprit.
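    For anyone who wants to reproduce that check offline, Python’s standard-library robots.txt parser gives a rough approximation (a sketch only: the stdlib parser applies the first matching rule and ignores Google’s wildcard extensions, but it is exact for a simple file like this one):

```python
from urllib import robotparser

# The robots.txt most of these sites were serving: only /wp-admin/ blocked.
parser = robotparser.RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /wp-admin/",
])

# Everything under /wp-admin/ is off limits, admin-ajax.php included:
print(parser.can_fetch("Googlebot", "/wp-admin/admin-ajax.php"))  # False
# Theme assets outside /wp-admin/ are still crawlable:
print(parser.can_fetch("Googlebot", "/wp-content/themes/x/style.css"))  # True
```

    That matches what Fetch and Render reports: with only `Disallow: /wp-admin/`, admin-ajax.php is unreachable.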

    As a temporary fix, I just removed the /wp-admin/ disallow entirely using the filter below:

    add_filter( 'robots_txt', 'om_remove_wp_admin_block', 10, 2 );
    // Serve an empty robots.txt on public sites; keep WordPress's
    // "Disallow: /" output when search engines are discouraged.
    function om_remove_wp_admin_block( $output, $public ) {
        if ( ! $public ) {
            return $output; // private site: leave the blanket disallow alone
        }
        return ''; // public site: no disallow rules at all
    }

    I then reran the Fetch and Render in GWT and my site passed without any issues.

    According to Yoast, removing the wp-admin block shouldn’t be a problem either: https://yoast.com/wordpress-robots-txt-example/#updates

    Don’t ignore it, it’s not a bug. I clearly stated why it’s happening, and the solution.

    It’s happening now because Google just started notifying people about the access issues today.

    If you add the following lines to the very bottom of almost any robots.txt file, it will fix these issues:

    User-Agent: Googlebot
    Allow: /*.js*
    Allow: /*.css*

    I have posted step-by-step instructions here on how to tell Google Webmaster Tools that you’ve updated your robots.txt file, and on how to verify the changes work, as they are rather lengthy.

    You will need access to your actual robots.txt file to do this; it is not WordPress-specific and will work on almost any site.

    However, if you have a plugin that manages robots.txt for you, you may need to update this code in the plugin’s interface, or contact the plugin developer for additional support.
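    It may help to spell out why those three lines work. Per Google’s documented behavior, a crawler obeys only the single most specific user-agent group that matches it, so once a `User-Agent: Googlebot` group exists, Googlebot ignores the `User-Agent: *` group entirely. A hypothetical sketch of that group-selection logic (the `parse_groups`/`is_allowed` names and the simplified matching are my own, not a full robots.txt implementation):

```python
import re

def parse_groups(robots_txt):
    """Split a robots.txt into {user_agent: [(verdict, pattern), ...]}."""
    groups, agents, seen_rule = {}, [], True
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments and whitespace
        if ":" not in line:
            continue
        field, value = (part.strip() for part in line.split(":", 1))
        field = field.lower()
        if field == "user-agent":
            if seen_rule:                      # a rule line ended the last group
                agents, seen_rule = [], False
            agents.append(value.lower())
            groups.setdefault(value.lower(), [])
        elif field in ("allow", "disallow"):
            seen_rule = True
            if value:                          # an empty Disallow allows everything
                for agent in agents:
                    groups[agent].append((field, value))
    return groups

def _matches(pattern, path):
    # '*' matches any run of characters; everything else is literal.
    regex = "".join(".*" if c == "*" else re.escape(c) for c in pattern)
    return re.match(regex, path) is not None

def is_allowed(robots_txt, user_agent, path):
    """Pick the crawler's own group if present, else '*'; within the group,
    the longest matching pattern wins and Allow wins ties (Google's rule)."""
    groups = parse_groups(robots_txt)
    rules = groups.get(user_agent.lower(), groups.get("*", []))
    hits = [(len(p), v == "allow", v) for v, p in rules if _matches(p, path)]
    return (not hits) or max(hits)[2] == "allow"

robots = """\
User-Agent: *
Disallow: /wp-*

User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*
"""
# Googlebot follows only its own group, which has no Disallow lines:
print(is_allowed(robots, "Googlebot", "/wp-admin/admin-ajax.php"))  # True
# Everyone else still gets the blanket /wp-* block:
print(is_allowed(robots, "SomeBot", "/wp-admin/admin-ajax.php"))    # False
```

    Note the side effect: giving Googlebot its own Allow-only group lifts every restriction for Googlebot, not just CSS/JS.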

    mlepisto, what about if the .js file is inside wp-admin and you are disallowing that…

    Leave your robots.txt blank unless there is a specific directory you need to block, such as an affiliate directory. There is no need to block WordPress directories such as wp-admin or wp-includes any more. This is what is causing the issue.

    I disagree with this. You definitely want to block directories you don’t want crawled regularly or, more importantly, indexed. While robots.txt isn’t necessarily respected by every crawler, blocking can reduce duplicate content (say, thumbnails of images crawled by Googlebot-Image), and if those thumbnails are generated by code on the fly (TimThumb, for example), crawling them increases server load.

    So, at a very minimum, a properly managed robots.txt file will help you reduce server load and make sure duplicate or irrelevant content isn’t indexed.

    mlepisto, that looks like a solid solution as well, so I suppose if you feel safer or better disallowing wp-admin and wp-includes, you can add those lines to allow the JS/CSS.

    So what do you have in your robots.txt exactly, so others can see how you feel it’s best structured?

    User-Agent: *
    Disallow: /wp-*
    
    User-Agent: Googlebot
    Allow: /*.js*
    Allow: /*.css*

    Looks like a good robots.txt to me that will solve this problem. That’s the way I’d personally go; I was just stating that, according to Yoast (who are very well known for WordPress SEO), there is no longer any reason to block wp- directories.

    bsaverino, those lines will allow ANY JS or CSS file, so it shouldn’t matter.

    mlepisto, what about if the .js file is inside wp-admin and you are disallowing that…

    This will still work even if the Allow lines come after the Disallow lines, i.e. at the very bottom of the robots.txt file. The Allow overrides the Disallow for any URL that matches /*.js*, in your case.

    There could be some unintended consequences in rare cases: if you have a file at /old/something.js, for example, it would be allowed even though /old/ may be disallowed.

    My example is a “fits 99% of DIY webmasters” type of example, so keep that in mind.
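    That /old/ edge case can be checked against Google’s documented precedence rule within a group: the longest matching path pattern wins, and Allow wins a tie. A toy illustration under that assumption (`winner` is a made-up helper that treats ‘*’ as a glob):

```python
import re

def winner(rules, path):
    """Return the verdict of the longest matching pattern for this path;
    on a length tie, 'allow' wins (Google's documented tie-break)."""
    def matches(pattern):
        regex = "".join(".*" if c == "*" else re.escape(c) for c in pattern)
        return re.match(regex, path) is not None
    hits = [(len(p), v == "allow", v) for v, p in rules if matches(p)]
    return max(hits)[2] if hits else "allow"

rules = [("disallow", "/old/"), ("allow", "/*.js*")]
print(winner(rules, "/old/something.js"))  # "allow": /*.js* (6 chars) beats /old/ (5)
print(winner(rules, "/old/page.html"))     # "disallow": only /old/ matches
```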

    I disagree with this. You definitely want to block directories you don’t want crawled regularly or, more importantly, indexed.

    Apparently WordPress no-indexes pages under the wp-admin directory by default, so they won’t show up in search results even if the directory isn’t blocked in the robots.txt.

    From Yoast:

    WordPress sends a robots meta X-Robots-Tag HTTP header on the admin pages that prevents search engines from showing these pages in the search results, a much cleaner solution.

    In other words, I think we’re good to leave the robots.txt file blank.

    Exactly, but if you’re super cautious you could always disallow as usual (wp-, or wp-admin and wp-includes) and follow the method mlepisto posted.

  • The topic ‘Googlebot cannot access CSS and JS files’ is closed to new replies.