I migrated my website hosting from GoDaddy cPanel hosting (WordPress website) to Hostinger's WordPress hosting two days ago.
The transition was not easy. I had to disable WP Rocket and start using LiteSpeed Cache. There were a few errors in the beginning, but I figured my way out somehow. The problem began two hours ago when I checked PageSpeed Insights. It showed no errors in the Desktop report, but the Mobile report said the robots.txt file has errors: it showed the complete HTML document of my website instead of the robots.txt file. Then I tried to reach my robots.txt file by going to mydomain/robots.txt, but it kept redirecting me to my homepage.
I checked the file manager and the robots.txt file was present. I deleted it and tried to create it with Yoast SEO. As soon as I clicked "Create robots.txt file", it took me to my source code. I hit back and it had created a file, which I saved.
But the file was still inaccessible.
Finally, I deleted the robots.txt file. But now, going to mydomain/robots.txt shows my file, but with XML code at the beginning.
It's some sort of ghost file that Google Search Console can read and report errors on.
I have tried disabling LiteSpeed Cache, QUIC.cloud CDN, and Yoast SEO. I've tried clearing the cache from the CDN dashboard as well as from the LiteSpeed Cache plugin.
Still, it’s showing a ghost file with wrong code. I’ve searched file manager for other robots.txt files but couldn’t find any.
Mydomain/robots.txt reads:
<?xml encoding="UTF-8"><?xml encoding="UTF-8"><?xml encoding="UTF-8"><p># START YOAST BLOCK
# ---------------------------
User-agent: *
Disallow:
Sitemap: https://mydomain/sitemap_index.xml
# ---------------------------
# END YOAST BLOCK</p>
I'm not a developer, but I can implement code changes if suggested. Please help.
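For whoever looks into this: from what I've read, when there is no physical robots.txt, WordPress serves a virtual one through do_robots(), and plugins like Yoast add their block through the core robots_txt filter, so that is probably the "ghost file" I'm seeing. If a code fix is needed, I'm guessing it would look roughly like the sketch below in the theme's functions.php (the rules are copied from my Yoast block above), though I realize whatever is injecting the XML code might still run after it.

<?php
// Sketch only, not a confirmed fix: force the content of WordPress's virtual
// robots.txt through the core robots_txt filter. The rules are placeholders
// taken from the Yoast block shown above.
add_filter( 'robots_txt', function ( $output ) {
	$output  = "User-agent: *\n";
	$output .= "Disallow:\n";
	$output .= "Sitemap: https://mydomain/sitemap_index.xml\n";
	return $output;
}, 99 );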

I would like to stop Google from indexing news pages other than page one (for example, news-views/page/26), but I can't see a way of overriding the robots.txt output. Is it possible?
I’d like to add something like this:
Disallow: /news-views/page/*$
Allow: /news-views/page/1$
or possibly replace the robots.txt file completely, as it is very basic and doesn't need to be controlled by The SEO Framework plugin.
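If the plugin has no setting for this, I assume something like the sketch below in functions.php would work, using WordPress core's robots_txt filter at a late priority so my rules are appended after the plugin's output (the patterns are the ones above):

<?php
// Sketch assuming The SEO Framework still serves robots.txt through WordPress's
// virtual file: append extra rules after whatever the plugin outputs.
add_filter( 'robots_txt', function ( $output ) {
	$output .= "\nUser-agent: *\n";
	$output .= "Disallow: /news-views/page/*\$\n";
	$output .= "Allow: /news-views/page/1\$\n";
	return $output;
}, 100 );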

I'm encountering an issue where the AIOSEO plugin is unable to create the robots.txt file. The plugin displays the following message:
“It looks like you are missing the proper rewrite rules for the robots.txt file. It appears that your server is running on nginx, so the fix will most likely require adding the correct rewrite rules to our nginx configuration. Check our documentation for more information.”
I’ve reviewed the documentation here:
https://aioseo.com/docs/nginx-rewrite-rules-for-robots-txt/
My WordPress site is hosted on an Azure App Service, where we don’t have the ability to modify the NGINX configuration as recommended. Is there an alternative workaround that would allow AIOSEO to manage or generate the robots.txt
without needing to change NGINX rewrite rules on Azure App Service?
Any guidance would be appreciated. Thank you!
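The only workaround I can think of (a sketch, not an AIOSEO feature) is to write a physical robots.txt into the web root so nginx serves it directly and no rewrite rule is needed, along these lines; the rules are placeholders, and I understand AIOSEO would then stop managing the file itself:

<?php
// Workaround sketch: keep a physical robots.txt in the WordPress root (assumes
// WordPress is installed at the web root) so nginx can serve it without the
// rewrite rule. Replace the placeholder rules with the ones you actually want.
add_action( 'init', function () {
	$rules  = "User-agent: *\n";
	$rules .= "Disallow: /wp-admin/\n";
	$rules .= "Allow: /wp-admin/admin-ajax.php\n";
	$rules .= 'Sitemap: ' . home_url( '/sitemap.xml' ) . "\n";

	$file = ABSPATH . 'robots.txt';
	if ( ! file_exists( $file ) || file_get_contents( $file ) !== $rules ) {
		file_put_contents( $file, $rules );
	}
} );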

I have enjoyed this plugin with no problems. But recently I added an AI scraper robots.txt plugin, and after some troubleshooting I realized that if I activate your plugin's "Add sitemap URL to the virtual robots.txt file" function, it blocks the AI scraper blocker plugin from working.
So I unchecked it and everything seems fine.
But now my robots.txt links to wp-sitemap.xml instead of sitemap.xml. And the wp-sitemap.xml appears to be generated by your plugin, but is blank. Is there anything I can do to fix this without having to stop using the AI scraper blocker?
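In case it's useful, the only idea I have (a sketch, not a documented plugin option) is to re-add the sitemap line myself through WordPress core's robots_txt filter, leaving the blocker plugin's rules alone; the /sitemap.xml path is an assumption and should match whatever the SEO plugin actually serves:

<?php
// Sketch: append the intended sitemap URL to the virtual robots.txt without
// touching any other plugin's rules.
add_filter( 'robots_txt', function ( $output ) {
	$output .= "\nSitemap: " . home_url( '/sitemap.xml' ) . "\n";
	return $output;
}, 100 );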

I would like to report a bug related to the sitemap URLs declared in the robots.txt file when using The SEO Framework in combination with Polylang.
On my test site, the sitemap section in the robots.txt file looks like this:
Sitemap: https://app.local/en/sitemap.xml
Sitemap: https://app.local/en/fr/sitemap.xml
The first URL is correct, but the second one includes two language codes (/en/fr/), while it should only show /fr/. The proper URL should be:
Sitemap: https://app.local/fr/sitemap.xml
Could you please investigate this issue?
Thank you!

I am trying, with the free version of Yoast SEO, to access Tools and then the file editor in order to correct the robots.txt of my website.
The problem is that this option does not appear for me. I only see the options shown in the attachment: import and export, bulk editor, and optimize SEO data.
How can I access it?
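One possible cause, if I understand correctly (a guess, not a confirmed diagnosis), is that Yoast SEO hides the file editor when file editing is disabled in WordPress, so it may be worth checking wp-config.php for a line like this:

<?php
// If this constant is set to true, WordPress disables its built-in file editors
// and Yoast's Tools > File editor will not appear. Removing it or setting it to
// false should restore the option, at the cost of re-enabling the theme and
// plugin editors.
define( 'DISALLOW_FILE_EDIT', true );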

It is undeniably an achievement that this was delivered to the market today. Thanks to everyone who worked on the creation of this incredibly useful product.