• Resolved jayankoodal

    (@jayankoodal)


    During a speed check of my site in PageSpeed Insights, I saw a comment that my robots.txt is not valid: "Lighthouse was unable to download a robots.txt file"… Why is this error happening? The robots.txt is valid in my Search Console data…

  • Plugin Support Mushrit Shabnam

    (@611shabnam)

    Hi @jayankoodal

    I can confirm your robots.txt file is valid. I also see that the robots.txt file is valid from PageSpeed Insights and from the Chrome Lighthouse tool. Please check the screenshot from Lighthouse and this screenshot from PageSpeed Insights.

    Can you please try clearing your browser cache and cookies before restarting your browser? If you’re unfamiliar with clearing cache, we have a guide to walk you through the process. Please try a different modern browser or another device if the issue remains after clearing the browser cache.
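
    If it helps to double-check outside of Lighthouse, here is a minimal sketch using Python's standard library that confirms the robots.txt file both downloads and parses. The example.com URL is only a placeholder; substitute your own site's address.

        # Minimal sketch: verify a robots.txt file is downloadable and parseable.
        # The URL below is a placeholder; substitute your own site's address.
        import urllib.request
        import urllib.robotparser

        url = "https://example.com/robots.txt"

        # Lighthouse reports "unable to download" when this request fails,
        # times out, or returns a non-200 status.
        with urllib.request.urlopen(url, timeout=10) as response:
            print("HTTP status:", response.status)                        # expect 200
            print("Content-Type:", response.headers.get("Content-Type"))  # usually text/plain

        # Confirm the file parses and answers crawl queries as expected.
        parser = urllib.robotparser.RobotFileParser(url)
        parser.read()
        print("Googlebot may fetch /:", parser.can_fetch("Googlebot", "/"))

    If the request succeeds and the rules print as expected, the file itself is fine and the Lighthouse result is more likely a transient fetch failure on its side.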

    Thread Starter jayankoodal

    (@jayankoodal)

    Thank you, Shabnam. Your kindness is appreciated. I’ve thoroughly checked my robots.txt file using various tools and confirmed its validity. However, Lighthouse is consistently reporting an issue that seems like a false positive or a potential bug. I’ve cleared browser cache and cookies, used updated browsers, and ensured my database is clean and regularly purged. It’s perplexing that Lighthouse is sometimes showing inconsistent or incorrect results.
