• Ivan

    (@whocaresjustcurious)


    Hello!

    I have two websites https://domain1.com and https://domain2.com hosted in the same directory.

    I need to conditionally serve a different robots.txt file based on which domain has been accessed. What code should I place in .htaccess to do that? Let’s say I’ve created a robots.txt file for the second domain called robots_xx.txt.

    Thank you.
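    A minimal .htaccess sketch of one way this could work, assuming both domains are served by Apache with mod_rewrite enabled and share the same document root, and that robots_xx.txt sits in that root alongside robots.txt (this is an illustrative sketch, not a confirmed solution from the thread):

        # When the request arrives for domain2.com, answer /robots.txt
        # with robots_xx.txt instead. This is an internal rewrite, so
        # crawlers still request the normal /robots.txt URL.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?domain2\.com$ [NC]
        RewriteRule ^robots\.txt$ robots_xx.txt [L]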

  • I have two websites https://domain1.com and https://domain2.com hosted in the same directory.

    I would think that if they are two different domains, then each domain must point to a separate root directory for each site. (Site two is in a sub-directory located in public_html maybe?)

    I need to conditionally serve a different robots.txt file based on which domain has been accessed…
    …I’ve created a robots.txt file for the second domain called robots_xx.txt

    I don’t think that’s going to work. If they are different domains, then placing one robots.txt file in the root of each domain might be the solution. One probably shouldn’t affect the other in this case.

    • This reply was modified 7 years, 11 months ago by Clayton James.
    Thread Starter Ivan

    (@whocaresjustcurious)

    Thank you! I appreciate your answer. I will add a little background to clarify the issue.

    I had to place the robots.txt file for domain1.com in public_html because, for some reason, I hadn’t been able to verify the file in Google Search Console.

    I placed a separate robots.txt file for domain2.com in public_html/domain2. But the robots.txt file for domain1 is still being served.
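    For the layout described above, a hedged variant of the same rewrite idea: assuming both hosts share public_html as the document root, the .htaccess file lives in public_html, and domain2.com’s own robots.txt lives in public_html/domain2, a rule like this could point domain2’s robots.txt requests at that copy:

        # Serve the copy in the domain2 sub-directory when the
        # request comes in for domain2.com
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?domain2\.com$ [NC]
        RewriteRule ^robots\.txt$ domain2/robots.txt [L]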

  • Would you feel comfortable sharing links to both domains? Maybe we can spot something that will give a clue to what’s happening with the robots.txt files.

  • The topic ‘Serve a different robots.txt for every site in the same directory’ is closed to new replies.