• Resolved

    dmitresku (@dmitresku)


    Hi. I received an error notification about my robots.txt.
    How can I exclude this block?

    # BEGIN W3TC ROBOTS
    User-agent: *
    Disallow: /wp-content/cache/
    # END W3TC ROBOTS

    At the moment I have manually edited the robots.txt file, but whenever the cache is cleared in the plugin, this entry reappears.

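    A minimal sketch of one way to strip the block, assuming the site serves WordPress's virtual robots.txt (core's robots_txt filter only runs when no physical robots.txt file exists, so this will not help if W3TC rewrites a real file on disk). The file placement (wp-content/mu-plugins/) and the regex are illustrative, not part of W3TC:

    <?php
    // mu-plugin sketch: remove the W3TC-marked section from the
    // virtual robots.txt before WordPress outputs it.
    add_filter( 'robots_txt', function ( $output, $public ) {
        // Delete everything between (and including) the marker comments.
        $pattern = '/# BEGIN W3TC ROBOTS.*?# END W3TC ROBOTS\s*/s';
        return preg_replace( $pattern, '', $output );
    }, 20, 2 );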

  • If you’ve verified that your robots.txt no longer has the blocking lines in it, it’s also worth noting that it can take Google Search Console a day or two to fully see the difference and load the previously blocked assets. So try again tomorrow, and if that doesn’t work, the next day. That’s how it’s gone for me. That’s also assuming you don’t have other actual issues. When you inspect the page in Google Search Console, there’s a section where you can view the page’s errors and see whether it’s still showing the CSS and JS files as blocked. Once that clears up, validation should work.

    Specifically, in Google Search Console under Mobile Usability, click the error, then click the magnifying glass that appears when you hover over a URL. Click View Crawled Page, then More Info. Open Page Resources and scroll through; you can see there whether anything is still being blocked by Google’s cached copy of robots.txt. A quick check from your own side is sketched below.
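    A small sketch to confirm the live file is clean before waiting on Google (example.com is a placeholder; substitute your own domain):

    <?php
    // Fetch the live robots.txt and report whether the W3TC
    // disallow rule for the cache directory is still present.
    $robots = file_get_contents( 'https://example.com/robots.txt' );

    if ( false === $robots ) {
        echo "Could not fetch robots.txt\n";
    } elseif ( false !== strpos( $robots, 'Disallow: /wp-content/cache/' ) ) {
        echo "Still disallowing /wp-content/cache/ -- the block is back\n";
    } else {
        echo "No cache block found; any remaining GSC errors are likely\n";
        echo "Google's cached copy -- recheck in a day or two\n";
    }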

  • The topic ‘Robots.txt error’ is closed to new replies.