Robots.txt causing Crawl problems on Google Webmasters
I’m using the Yoast SEO plugin to generate my sitemap and robots.txt, and somewhere one of them is wrong, because Google Webmasters is unable to crawl the site. My sitemap is https://www.diversecomputing.com/sitemap_index.xml, and the robots.txt contains only basic rules. When I tested it in Webmasters it reported “allowed,” so I’m unsure what is causing the error “Crawl postponed because robots.txt was inaccessible.” Any ideas?
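That error usually means Googlebot could not fetch robots.txt at all (a 5xx response, a timeout, or a connection failure), not that the rules blocked crawling. As a rough way to check, here is a minimal sketch that fetches the file with a Googlebot-style User-Agent and prints the HTTP status; the robots.txt URL is assumed from the domain in the sitemap URL above, and the User-Agent string is just illustrative.

```python
import urllib.request
import urllib.error

# Assumed location of robots.txt, derived from the sitemap domain above.
ROBOTS_URL = "https://www.diversecomputing.com/robots.txt"

# Illustrative Googlebot-like User-Agent; some servers/firewalls treat it
# differently from a browser, which can explain "inaccessible" errors.
HEADERS = {"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"}

req = urllib.request.Request(ROBOTS_URL, headers=HEADERS)
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        # Anything other than 200 (or a clean 404) can make Google postpone crawling.
        print("Status:", resp.status, resp.reason)
        print(resp.read().decode("utf-8", errors="replace"))
except urllib.error.HTTPError as e:
    print("HTTP error:", e.code, e.reason)
except urllib.error.URLError as e:
    print("Connection problem:", e.reason)
```

If this returns a 5xx status, times out, or fails only with the Googlebot User-Agent, the problem is likely server-side (hosting, firewall, or security plugin) rather than the Yoast-generated rules themselves.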