I have had a big problem for some time: a bad plugin added a virtual robots.txt to my site:
User-agent: *
Crawl-delay: 10
and I cannot get rid of it.
I placed a physical robots.txt, but it had no effect, and Google still does not index my site.
I just tried your plugin (and another one before it), and nothing changed. I saved your
User-agent: *
Disallow:
but my served robots.txt still shows
User-agent: *
Crawl-delay: 10
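
For anyone answering: my understanding from the WordPress docs is that the virtual robots.txt is built by do_robots() and passed through the robots_txt filter, so I am guessing the bad plugin appends the Crawl-delay there. This is only a sketch of a workaround, placed in my theme's functions.php; the priority 99 is my assumption so it runs after the plugin's own filter:

// Strip any Crawl-delay line another plugin appended to the virtual robots.txt.
add_filter( 'robots_txt', function ( $output, $public ) {
    return preg_replace( '/^Crawl-delay:.*$\n?/mi', '', $output );
}, 99, 2 );

If the physical file never shows up at all, I assume something else (a cache, a CDN, or the server rewrite rules) is still routing /robots.txt to WordPress, but I do not know how to check that.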
What more can I do?