Google Search Console Errors
I am having difficulties getting Google to crawl my site, fetch sitemaps, etc.
My url is https://tellest.com
My sitemap is https://tellest.com/sitemap_index.xml
My robots.txt is at https://tellest.com/robots.txt
When I tried to research this topic, I saw a lot of posts from a year ago or older, and I'm trying to find out what the current resolutions are. When I submit a sitemap, Search Console gives me the following:
Sitemap submitted successfully
Google will periodically process it and look for changes. You will be notified if anything goes wrong with it in the future.

It then lists the status as “Couldn’t fetch” and the type as unknown, and discovers no URLs.
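In case it helps diagnose this, here's a quick check I ran to see whether the sitemap is even reachable with a crawler-style user agent. This is just a rough Python sketch: the user-agent string is Google's documented one, but real Googlebot requests also come from Google's own IP ranges, so this only approximates what the crawler sees.

```python
import urllib.request

# Googlebot's documented user-agent string. NOTE: real Googlebot also
# crawls from Google-owned IP ranges, so a server or firewall that
# checks IPs may treat this request differently than a genuine crawl.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def check(url):
    """Fetch a URL with Googlebot's UA; report status and content type."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(url, "->", resp.status, resp.headers.get("Content-Type"))
    except Exception as exc:
        print(url, "-> failed:", exc)

check("https://tellest.com/sitemap_index.xml")
```

If the sitemap returns anything other than HTTP 200 with an XML content type here, that would presumably explain the “Couldn’t fetch” status.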
For the robots.txt file, I try to use the testing tool, and when I test it against my verified property, it gives me the message below:
robots.txt fetch failed
You have a robots.txt file that we are currently unable to fetch. In such cases we stop crawling your site until we get hold of a robots.txt or fall back to the last known good robots.txt file.

When I try to have a page indexed, I get an “Indexing request rejected” error, and when I look at the live test, it tells me the “URL will be indexed only if certain conditions are met.” Of course, those certain conditions are incredibly vague and don’t seem to indicate what I need to fix. Discovery says “not checked in live tests.”
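Since that error says Google stops crawling entirely when it can't fetch robots.txt, I also compared how the server responds to a browser-style user agent versus a crawler-style one, to rule out a firewall or security plugin filtering bots. Same caveat as above: this is only a sketch, and the two user-agent strings are just a generic browser string paired with Googlebot's documented one.

```python
import urllib.error
import urllib.request

URL = "https://tellest.com/robots.txt"

# Two user agents for comparison. If the browser-style request gets
# HTTP 200 while the crawler-style one gets 403 or a 5xx, something
# on the server is filtering bot traffic.
AGENTS = {
    "browser":   "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)"),
}

for name, ua in AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(f"{name}: HTTP {resp.status}")
    except urllib.error.HTTPError as exc:
        print(f"{name}: HTTP {exc.code}")
    except Exception as exc:
        print(f"{name}: failed ({exc})")
```

As I understand it, a 5xx response or a timeout on robots.txt makes Google pause crawling the whole site, while a plain 404 is just treated as having no robots.txt at all, so the status code here matters a lot.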
Any advice anyone can give would be greatly appreciated.
Thanks,
Mike