• Resolved Aronov Edward

    (@ea95)


    Hello, I’m looking for a solution to an issue where the subdomain’s robots.txt file is synchronized with the main domain’s. Whenever I make changes to the file, it gets updated on both the subdomain and the main domain. I want to be able to update the file for the main domain and the subdomain separately, so they are not synchronized.

    Is this a known issue, and is there a solution available?

    The page I need help with: [log in to see the link]

Viewing 8 replies - 1 through 8 (of 8 total)
  • Plugin Support Maybellyne

    (@maybellyne)

    Hello @ea95

    Thanks for reaching out. Can you let me know how your subdomain is set up? Is it on a WordPress multisite network, or is it a separate, single WordPress installation from the main domain?

    Thread Starter Aronov Edward

    (@ea95)

    Hi, thank you for the quick reply.

    Setup – A single WordPress installation from the main domain

    Plugin Support Maybellyne

    (@maybellyne)

    Hello @ea95

    Thanks for the clarification. Am I correct that these are the subdomain and the main domain? I see the same contents in the static robots.txt file for both the subdomain and the main domain, but this is not the same as the virtual robots.txt (subdomain vs. main).

    • Am I correct that the contents of the virtual robots.txt are what you want?
    • Since you mentioned the subdomain and the main domain are separate WordPress installations, are you able to access their robots.txt files separately?
    • If yes, please share a screenshot of what you have for each in WordPress > Yoast SEO > Tools > File Editor.

    You can use any image-sharing service like https://pasteboard.co/, https://snag.gy/, https://imgur.com/, https://snipboard.io/, or even upload the screenshot to your own website. Once you upload it to an image-sharing service, please share the link to the image here.

    Plugin Support Jose Varghese

    (@josevarghese)

    This thread was marked resolved due to a lack of recent activity. The original poster can change the status to Not Resolved to re-open the issue or open a new topic.

    If you are not the original poster but have a similar issue, please open a new topic.

    Thread Starter Aronov Edward

    (@ea95)

    Just to recap, we have WordPress installed only on the main domain, so there is only one robots.txt file on the server. The Yoast plugin, managed through the Yoast admin panel, affects all subdomains simultaneously. We should also mention that the WPML plugin handles the translation work on the subdomains.

    Thread Starter Aronov Edward

    (@ea95)

    Hi, @maybellyne

    Is there an update on this issue?

    Thread Starter Aronov Edward

    (@ea95)

    Hi, is there an update? @maybellyne

    @josevarghese


    Plugin Support amboutwe

    (@amboutwe)

    Yoast SEO uses the WordPress-generated robots.txt file by default but allows you to create a static robots.txt file. I see that your sites have the correct generated WP robots.txt files (example links below). Therefore, your shared robots.txt must be a static file on the server.

    To use static robots.txt files on each domain, I found a resolved topic on WPML’s support forum containing a workaround. Changes to the separate robots.txt files should be made using SFTP or a file manager and not through Yoast SEO’s file editor. As WPML handles translations on your site, you may reach out to them for advice on translating these files within their plugin.
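The workaround above can be sketched in shell. This is a minimal illustration only, assuming each domain maps to its own document root on the server; the paths, domain names, and file contents below are placeholders, not details from this thread, and on a live server you would make these changes over SFTP or a hosting file manager rather than in a local shell.

```shell
# Simulate the two document roots (replace with the real paths on your server,
# e.g. /var/www/example.com and /var/www/de.example.com -- these are assumptions).
MAIN_ROOT=$(mktemp -d)
SUB_ROOT=$(mktemp -d)

# A separate static robots.txt per domain means each host serves its own rules.
cat > "$MAIN_ROOT/robots.txt" <<'EOF'
User-agent: *
Disallow: /wp-admin/
Sitemap: https://example.com/sitemap_index.xml
EOF

cat > "$SUB_ROOT/robots.txt" <<'EOF'
User-agent: *
Disallow: /wp-admin/
Sitemap: https://de.example.com/sitemap_index.xml
EOF

# The two files are now independent; editing one no longer affects the other.
diff "$MAIN_ROOT/robots.txt" "$SUB_ROOT/robots.txt" >/dev/null || echo "files differ"
# prints "files differ"
```

As noted above, edits to these static files should be made with SFTP or a file manager, not through Yoast SEO’s file editor.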

    To switch to the WordPress-generated robots.txt files, please use SFTP or a file manager and rename the static robots.txt file to robots.old. If caching is in use, the changes may take a while to appear, so clear your site and browser cache before checking. You can safely delete the robots.old file once the generated robots.txt files are displayed.
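The rename step can be sketched as follows. This is a local simulation only, assuming the static file sits in the site’s document root; the path is illustrative, and on a live server the rename would be done over SFTP or a hosting file manager.

```shell
# Stand-in for the site's document root (an assumption; replace with the real path).
DOCROOT=$(mktemp -d)
echo "User-agent: *" > "$DOCROOT/robots.txt"

# Renaming (not deleting) keeps a backup you can restore if needed.
mv "$DOCROOT/robots.txt" "$DOCROOT/robots.old"

# With no physical robots.txt present, the web server no longer serves a static
# file, so WordPress can answer /robots.txt with its generated (virtual) one.
[ ! -f "$DOCROOT/robots.txt" ] && [ -f "$DOCROOT/robots.old" ] && echo "static file disabled"
# prints "static file disabled"
```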

  • The topic ‘An issue with Robots.txt through subdomains’ is closed to new replies.