robots.txt is blocking search engines on subdomain blogs whatever I do
-
I have a multisite WordPress installation with sub-domains, and today I got a message from Google that their crawlers are being blocked by robots.txt. No matter what I do, the robots.txt served on the sub-domains stays the same, while the main domain’s robots.txt is fine and does respond to the fixes I’ve tried.
Here’s what the sub-domains’ robots.txt says:

User-agent: *
Disallow: /

Here is what I tried:
– made a custom robots.txt file in the root directory (roughly like the example after this list); this has no effect on the sub-domains, only on the main domain
– rebuilt the network from scratch with both Softaculous and the www.ads-software.com installer, old and new versions, without plugins/themes, on a fresh new database; same thing
– even deleted the WordPress function in functions.php that generates the virtual robots.txt file
– tried various plugins that manipulate the robots.txt file; none of them have any effect on sub-domains

Whatever I do affects the root domain, but on the subdomain blogs it has no effect whatsoever.
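For reference, the hand-made file in the root is nothing fancy, just the usual fully permissive one, roughly like this:

User-agent: *
Disallow:

The main domain serves that correctly; the sub-domains keep answering with the blocking version.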
How is it possible that WordPress is still serving a blocking robots.txt file even though I deleted its function for doing that, and I have a correct hand-made file in the root directory? Is there any other function that handles the virtual robots.txt?
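For what it’s worth, as far as I can tell the virtual file is built by do_robots() in wp-includes/functions.php, and its output goes through the robots_txt filter right before it is printed. So I’d expect a filter along these lines (the function name below is just mine, for illustration) to have the last word on the file’s content:

<?php
// Rough sketch: override the virtual robots.txt via the core 'robots_txt' filter.
// do_robots() outputs "Disallow: /" when the site's blog_public option is 0,
// then runs the result through this filter before echoing it.
add_filter( 'robots_txt', 'my_open_robots_txt', 99, 2 );
function my_open_robots_txt( $output, $public ) {
	// Ignore whatever core built and serve a fully open ruleset instead.
	return "User-agent: *\nDisallow:\n";
}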
Please, does anyone have any other suggestion that I haven’t tried? I don’t know what else to do… This used to work without problems, and now, all of a sudden, nothing I do has any effect.
Thank you
- The topic ‘robots.txt is blocking search engines on subdomain blogs whatever I do’ is closed to new replies.