• Hi!

    I’ve been trying to sort this issue out over the past few days, but I can’t find a solution.

    I’ve discovered that my website has 25 pages with crawl problems, and when I use the URL Inspection tool, it says that crawling was blocked by robots.txt and that the pages can’t be indexed.

    I’ve tried running the old GSC robots.txt tester on my URL property (not the domain property), and it says there aren’t any problems with my robots.txt.

    Furthermore, when I look at my robots.txt file directly via /robots.txt, it seems to be fine.
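
    For reference, the file is more or less the standard WordPress default, something like this (quoting from memory, so it may not match exactly):

        User-agent: *
        Disallow: /wp-admin/
        Allow: /wp-admin/admin-ajax.php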

    I’ve also tried uploading a new sitemap, but this hasn’t done the trick either.

    I’ve tried running this test: https://search.google.com/test/mobile-friendly, and it sometimes says that my site is mobile friendly and sometimes says it’s not. Sometimes it shows 20+ pages with indexing problems and sometimes it only shows 10. The test also reports anywhere from 0 to 10+ JavaScript console errors, depending on when you run it.

    So to sum it up:
    I’ve suddenly started seeing crawl errors in GSC saying that my pages can’t be crawled because robots.txt is blocking the crawl. I can’t find anything that should cause this error, and I’m really running out of ideas.

    I really hope you can help me!


  • Hi Amanda, I recommend removing the wp-admin Disallow and changing the Allow to * as well. That may get rid of the issue. Google can’t crawl anything behind a login anyway, so blocking the wp-admin folder isn’t doing anything for you.

    You can remove the Disallow line completely, then resubmit to Google Search Console and see if that resolves it.
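
    To be concrete, if your file currently matches the usual WordPress default, the edited version could be as minimal as this (just a sketch, keep any other rules you actually need, such as a Sitemap line):

        User-agent: *
        Allow: /

    That leaves everything open to crawlers, and you can always tighten it again later once Search Console is happy.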

    Thread Starter amandahovman (@amandahovman)

    Okay, I’ve done that now.

    When will I be able to see if it has worked? It still shows the same errors in Google Search Console, but maybe it takes some time before Google processes the new robots.txt file?

    Try resubmitting your sitemap; you should then be able to request another review. I know it can sometimes take a day or two, but it shouldn’t be longer than that.
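
    In the meantime, if you want to see for yourself what a crawler gets from the live file without waiting on GSC, a quick Python check along these lines works (example.com is a placeholder, so swap in your own domain and one of the URLs GSC is flagging; Python’s parser isn’t identical to Google’s matching, but it’s a decent sanity check):

        from urllib.robotparser import RobotFileParser

        # Fetch and parse the live robots.txt straight from the site
        rp = RobotFileParser()
        rp.set_url("https://example.com/robots.txt")
        rp.read()

        # True means Googlebot is allowed to fetch that page
        print(rp.can_fetch("Googlebot", "https://example.com/"))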

    Thread Starter amandahovman (@amandahovman)

    Now I’ve waited almost 48 hours after changing the robots.txt and resubmitting it to GSC.
    I’ve also uploaded a new sitemap.

    None of it has worked.

    This is the link to my Google Search Console report, in case it helps: https://search.google.com/search-console/index/drilldown?resource_id=sc-domain%3Aamandahovman.dk&item_key=CAMYISAE&hl=da&sharing_key=oEw-uZ9WVS_Kd1yVJTX35w

  • The topic ‘Google search console crawl errors’ is closed to new replies.