Crawl Optimization Settings and robots.txt
-
Hello, how are you all?
I noticed something very weird.
I adjusted the Crawl Optimization settings a couple of days ago, but now I see that the robots.txt file (in the Yoast plugin interface) did not change afterwards.
Today, I opened a staging website on my computer to check something, and I needed to generate the robots.txt file through the Yoast plugin again.
When I generated it, I got a different robots.txt file compared to the original website.
It’s very important to note that this is the same website; I just created a copy on my computer to test something. The plugins and the Yoast settings were the same as on the original, but when I regenerated the robots.txt file, it came out different.
I noticed that the robots.txt files are not the same.
Screenshot:
https://i.ibb.co/Dg6bQ64/image.png
OG = Original website
Copy = Staging website
——-
If you notice, the copy has:
Disallow: /?s=
Disallow: /page/*/?s=
Disallow: /search/

This is because I set “search” to noindex in the “Crawl Optimization” settings of the Yoast plugin.
But as I said before, to see this I needed to regenerate the robots.txt file again.
On the original website (on the right in my screenshot), it’s still the default robots.txt.
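To make the mismatch concrete, here is a minimal sketch of the difference I’m describing, reconstructed from the screenshot. The exact file contents are my assumption of what a default robots.txt looks like versus the regenerated one with the search directives; only the three Disallow lines are taken directly from my staging copy.

```shell
# "OG" file: the default robots.txt still served by the original website
# (contents assumed; the point is that it lacks the search directives).
cat > og_robots.txt <<'EOF'
User-agent: *
Disallow:
EOF

# "Copy" file: the robots.txt after regenerating it on the staging site,
# including the three lines added by the Crawl Optimization setting.
cat > copy_robots.txt <<'EOF'
User-agent: *
Disallow:
Disallow: /?s=
Disallow: /page/*/?s=
Disallow: /search/
EOF

# diff exits non-zero when the files differ, which makes the drift easy to spot.
diff og_robots.txt copy_robots.txt || echo "robots.txt files differ"
```

Running the same kind of comparison against the live URLs (e.g. with curl) shows the original site never picked up the new directives until the file is regenerated by hand.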
I hope you can tell me whether this is a bug, or maybe I’m wrong and you can enlighten me.
Regards,
Nadav
The page I need help with: [log in to see the link]