  • Plugin Author Sybre Waaijer (@cybr)

    Hello!

    /robots.txt is only recognized at the root of a host. So, example.com/robots.txt and something.example.com/robots.txt are valid locations, but example.com/something/robots.txt is not.

    So, a multisite network can serve multiple (virtual) robots.txt outputs, but only when the network is set up with subdomains, not subdirectories.
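    For illustration, the virtual robots.txt served for a subdomain site might look something like this (a sketch only: the first three lines are WordPress's defaults, and /sitemap.xml as the sitemap location is an assumption that depends on your settings):

        User-agent: *
        Disallow: /wp-admin/
        Allow: /wp-admin/admin-ajax.php

        Sitemap: https://something.example.com/sitemap.xml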

    If you want Google to know about every sitemap on your network, I recommend using Google Search Console and submitting them here: https://www.google.com/webmasters/tools/sitemap-list.

    You can submit them to Bing via Bing Webmaster Tools here: https://www.bing.com/webmasters/sitemaps.

    There were discussions about including multisite sitemaps in robots.txt, as you described, in The SEO Framework itself (I wanted it entirely automated). However, because of the complex customization requests that would inevitably follow such a feature (translation support, NGINX conflicts, exclusions, etc.), I decided against including it. See https://github.com/sybrew/the-seo-framework/issues/147.

    Still, this is achievable using a filter; a rough sketch follows below. If you wish to use such a filter, I could create one for you. Let me know!
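    To give you an idea of what such a filter could do (this is not a built-in The SEO Framework feature): WordPress core passes its virtual robots.txt output through the robots_txt filter, so a small network-activated plugin could append every site's sitemap to it. The /sitemap.xml path and the 500-site cap below are assumptions you would adjust:

        <?php
        /**
         * Sketch: list every network site's sitemap in the virtual robots.txt.
         * Assumes each site serves its sitemap at /sitemap.xml; adjust as needed.
         */
        add_filter( 'robots_txt', function ( $output, $public ) {
            // Skip when the site discourages search engines, or outside multisite.
            if ( ! $public || ! is_multisite() ) {
                return $output;
            }

            // get_sites() (WP 4.6+) returns WP_Site objects; capped at 500 here.
            foreach ( get_sites( array( 'number' => 500 ) ) as $site ) {
                $home    = get_home_url( (int) $site->blog_id );
                $output .= "\nSitemap: " . trailingslashit( $home ) . 'sitemap.xml';
            }

            return $output;
        }, 11, 2 );

    Dropped into wp-content/mu-plugins/, this runs network-wide. On a subdomain network, where every site serves its own robots.txt, you could additionally check is_main_site() so only the root site's robots.txt lists the whole network.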

  • The topic ‘robots.txt for multisite’ is closed to new replies.