• Hi everyone,

For some reason, over the last couple of days I have received notifications and errors from Google Search Console and the Ahrefs Site Audit saying that my robots.txt is returning status 403 (Forbidden).

This should not be the case, as my robots.txt is easily accessible and contains completely standard rules.

Can someone please advise me on why this is happening? Is it just a crawling error, or is something wrong on my site?

    My site is: CuriousMatrix.com

    And here is the robots.txt file.


Viewing 4 replies - 1 through 4 (of 4 total)
  • Looks fine to me. It was possibly a short interruption at your host. A look at your hosting log files should show you what was going on and when; your host's support could also help.
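    One thing worth ruling out before digging through logs: some hosts and firewalls return 403 only for crawler user agents, so the file looks fine in your browser while Googlebot and AhrefsBot get blocked. Below is a minimal sketch of that check. The server here is a local stand-in that mimics user-agent filtering, not your actual site or Bluehost's configuration; the `robots_status` helper is a name I made up for illustration.

    ```python
    import threading
    import urllib.request
    import urllib.error
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class UAFilterHandler(BaseHTTPRequestHandler):
        """Hypothetical server that blocks 'bot' user agents on /robots.txt."""

        def do_GET(self):
            ua = self.headers.get("User-Agent", "")
            if self.path == "/robots.txt" and "bot" in ua.lower():
                self.send_response(403)
                self.end_headers()
                self.wfile.write(b"Forbidden")
            else:
                self.send_response(200)
                self.end_headers()
                self.wfile.write(b"User-agent: *\nDisallow:\n")

        def log_message(self, *args):  # silence per-request logging
            pass

    def robots_status(base_url, user_agent):
        """Return the HTTP status of /robots.txt when fetched with user_agent."""
        req = urllib.request.Request(base_url + "/robots.txt",
                                     headers={"User-Agent": user_agent})
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.status
        except urllib.error.HTTPError as e:
            return e.code

    if __name__ == "__main__":
        server = HTTPServer(("127.0.0.1", 0), UAFilterHandler)
        threading.Thread(target=server.serve_forever, daemon=True).start()
        base = f"http://127.0.0.1:{server.server_address[1]}"
        print(robots_status(base, "Mozilla/5.0"))    # browser-like UA -> 200
        print(robots_status(base, "AhrefsBot/7.0"))  # crawler UA -> 403
        server.shutdown()
    ```

    Pointing `robots_status` at your real site with a browser-like and a crawler-like user agent would show whether the 403 depends on who is asking.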

    Thread Starter dpernar

    (@dpernar)

    Thank you very much. Do you know where I could find this log file on Bluehost?

    Also, do you think this could be the issue because I just changed my name servers to Ezoic after applying to their ad network? Perhaps that caused a temporary issue?

    Bluehost has a great FAQ: https://www.bluehost.com/help/article/error-logs – but that’s all I know about it, since I don’t host there.

    And yes, such changes might have led to something like this for a short time.

    Thread Starter dpernar

    (@dpernar)

    Thanks. I’ll see what happens in the next few days.

    Thank you.

  • The topic ‘Robots.txt 403 Forbidden’ is closed to new replies.