Robots.txt error
After the last update I get this warning when testing the site's speed with Google PageSpeed Insights:
robots.txt is not valid
Lighthouse was unable to download a robots.txt file
I checked the file and it seems ok. Not sure what’s going on here.
I use Polylang.
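In case it helps, this is roughly how I've been sanity-checking whether the file is actually downloadable over a plain HTTP GET, which is all Lighthouse needs to do. It's just a rough sketch; `https://example.com` is a placeholder for my real site URL (which is behind the login link below):

```python
# Rough sanity check: fetch robots.txt with a plain HTTP GET and report
# what comes back.
# NOTE: "https://example.com" is a placeholder, not the real site URL.
import urllib.error
import urllib.request

SITE = "https://example.com"  # placeholder for the actual site

try:
    with urllib.request.urlopen(f"{SITE}/robots.txt", timeout=10) as resp:
        print("HTTP status :", resp.status)                       # 200 means the file downloaded fine
        print("Content-Type:", resp.headers.get("Content-Type"))  # usually text/plain
        print("--- first 500 bytes ---")
        print(resp.read(500).decode("utf-8", errors="replace"))
except urllib.error.HTTPError as e:
    print("Server answered with an error status:", e.code)        # e.g. a 5xx here would explain the warning
except urllib.error.URLError as e:
    print("Could not connect at all:", e.reason)
```

When I run something like this against my site it prints a 200 and the expected rules, which is why the Lighthouse warning confuses me.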
The page I need help with: [log in to see the link]
The topic ‘Robots.txt error’ is closed to new replies.