Google Search Console crawl errors
Hi!
I’ve been trying to sort this issue out over the past few days, but I can’t find a way to fix it.
I’ve discovered that my website has 25 pages with crawl problems, and when I use the URL Inspection tool, it says that crawling was blocked by robots.txt and that the pages can’t be indexed.
I’ve tried running the old GSC robots.txt Tester on my URL-prefix property (not the domain property), and it says there aren’t any problems with my robots.txt.
Furthermore, when I view my robots.txt file directly at /robots.txt, it looks fine.
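If it helps, this is a quick sketch of how one can check programmatically whether the live robots.txt actually blocks Googlebot for a given page, using Python’s standard urllib.robotparser. The example.com URLs are just placeholders standing in for my own domain and one of the affected pages:

# Quick check: does the live robots.txt block Googlebot for a given page?
from urllib.robotparser import RobotFileParser

robots_url = "https://example.com/robots.txt"          # placeholder for my domain
page_url = "https://example.com/some-affected-page/"   # placeholder for an affected page

rp = RobotFileParser()
rp.set_url(robots_url)
rp.read()  # fetches and parses the live robots.txt

# True means robots.txt allows Googlebot to crawl the page
print("Googlebot allowed:", rp.can_fetch("Googlebot", page_url))

Running this against the affected URLs also tells me Googlebot should be allowed, which is why I’m so confused by the GSC report.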
I’ve also tried submitting a new sitemap, but that hasn’t done the trick either.
I’ve also tried running this test: https://search.google.com/test/mobile-friendly. It sometimes says that my site is mobile friendly and sometimes says it’s not. Sometimes it shows 20+ pages with indexing problems and sometimes only 10. The test also reports anywhere from 0 to 10+ JavaScript console errors, depending on when it’s run.
So to sum it up:
I’ve suddenly started seeing crawl errors in GSC saying that my pages can’t be crawled because robots.txt is blocking the crawl. I can’t find anything that should cause this error, and I’m really running out of ideas. I really hope you can help me!
The page I need help with: [log in to see the link]