• Resolved tellestmichael

    (@tellestmichael)


    I am having difficulties getting Google to crawl my site, fetch sitemaps, etc.

    My url is https://tellest.com

    My sitemap is https://tellest.com/sitemap_index.xml

    My robots are at https://tellest.com/robots.txt

    When I tried to research this topic, I saw a lot of messages from a year ago or older, and I’m trying to see what the resolutions are nowadays. When I try to fetch a sitemap, it gives me the following:

    Sitemap submitted successfully
    Google will periodically process it and look for changes. You will be notified if anything goes wrong with it in the future.

    It then lists the status as “Couldn’t fetch” and the type as unknown, and discovers no URLs.

    For the robots.txt file, I try to use the testing tool, and when I search with my verified property, it gives me the message below:

    robots.txt fetch failed
    You have a robots.txt file that we are currently unable to fetch. In such cases we stop crawling your site until we get hold of a robots.txt, or fall back to the last known good robots.txt file.

    When I try to have a page indexed, I get an “Indexing request rejected” message, and the live test tells me the “URL will be indexed only if certain conditions are met”. Of course, those certain conditions are incredibly vague and don’t seem to indicate what I need. The discovery section says “not checked in live tests”.

    Any advice anyone can give would be greatly appreciated.

    Thanks,

    Mike

Viewing 4 replies - 1 through 4 (of 4 total)
  • Thread Starter tellestmichael

    (@tellestmichael)

    Now I’m also seeing a 500 internal server error for some reason. Maybe that has something to do with it. Direct links won’t load, but copying and pasting the URL works, for some reason.

    Hi @tellestmichael,

    We checked the homepage’s meta tags, the sitemaps, and the robots.txt file, and we didn’t find anything that would block Google from crawling your site.

    Since pages from your site appear in a Google site search, we can confirm that Google has indexed at least those pages on your site. We would need an example URL for a page where this is happening to see if we could gather more information.

    If you’re seeing “Couldn’t fetch” in Search Console along with 500 errors, this may be a technical problem with the site configuration or the web server. You might be able to troubleshoot further by checking your server’s PHP error logs or by contacting your hosting provider.
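As a quick sanity check along those lines, you can request the crawler-critical files directly and look at the HTTP status line yourself (a sketch; substitute your own domain, and note that the Googlebot user-agent string below is just the standard published one):

```shell
# Fetch only the response headers and show the status line.
# A 500 here would explain "Couldn't fetch" in Search Console.
curl -sI https://tellest.com/robots.txt | head -n 1
curl -sI https://tellest.com/sitemap_index.xml | head -n 1

# Repeat with Googlebot's user agent, since redirect or rewrite
# rules sometimes only match certain user agents.
curl -sI -A "Googlebot/2.1 (+http://www.google.com/bot.html)" \
  https://tellest.com/robots.txt | head -n 1
```

If the status line is anything other than `200 OK` (for example a `500` or an unexpected `301`/`302`), that points at a server or rewrite configuration issue rather than anything on Google's side.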

    Hope this helps!

    Thread Starter tellestmichael

    (@tellestmichael)

    Hi there! So, I’m not sure if this will help other people down the line, but an associate of mine helped me identify the problem. Just to clarify, this was found before you wrote to me, @priscillamc, but I ended up so busy at my day job that I couldn’t get back here in time to remark on it.

    My .htaccess file was the culprit, as for some reason we had a redirect rule in the first few lines. Removing it allowed Google Search Console to do everything it needed on my site. Crawls and indexing are now working, so if you ever run into something like this, make sure you at least pass a cursory glance over your .htaccess file!
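For anyone hitting this later, here is a hypothetical illustration of the kind of rule that can cause it (the exact rule on my site may have differed, and `example.com/landing-page` is a made-up target). A blanket rewrite near the top of `.htaccess` catches every request, including `/robots.txt` and the sitemap, so Googlebot never receives the files it asks for:

```apache
# Problematic: this redirects EVERY request, including
# /robots.txt and /sitemap_index.xml, before anything else runs.
RewriteEngine On
RewriteRule ^(.*)$ https://example.com/landing-page [R=302,L]
```

If you genuinely need a redirect like that, one option is to exclude the crawler-critical files first:

```apache
# Safer: skip robots.txt and sitemap files before redirecting.
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteCond %{REQUEST_URI} !^/sitemap.*\.xml$
RewriteRule ^(.*)$ https://example.com/landing-page [R=302,L]
```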

    Cheers,

    Mike

    Plugin Support devnihil

    (@devnihil)

    @tellestmichael Thanks for confirming that you were able to resolve the issue, and for providing additional information on what corrected it. We are going ahead and marking this issue as resolved but please let us know if you require any further assistance.

  • The topic ‘Google Search Console Errors’ is closed to new replies.