• Hi everyone,

    I successfully created the sitemap for my page with this plugin, but I am having some trouble with my robots.txt file. In my plugin panel it says:

    The virtual robots.txt generated by WordPress is used. A real robots.txt file must NOT exist in the blog directory!

    So I didn’t create another robots.txt in my root. But when I go to my Webmaster Tools account, Google says that Googlebot was blocked from my URL and couldn’t crawl my content. My virtual robots.txt blocks Google completely.

    Where can I change this virtual robots.txt, or how can I disable it so that I can upload a regular one? Hope someone has some advice.

    Thanks!
    dk

  • Hi,
    I do have the same problem. Please let me know if you find a solution.
    I’m just beginning my search :-p

    Me too, I’m getting a 401/407 authentication error in Google Webmaster Tools.

    Just found this; it’s not WordPress-specific, but it may be related:
    https://forums.digitalpoint.com/showthread.php?p=8719602

    This is still a problem for me.
    After 4 days Google still hasn’t indexed my site, all 5 pages.
    The 401/407 error persists.
    Can I disable the virtual robots.txt file as it seems to be messing the whole thing up?

    Thread Starter do77

    (@do77)

    Hey guys,

    a little late, but someone might still not have figured it out. Create a file named robots.txt and upload it to your root directory. Google automatically prefers this one, and your problem should be fixed. At least it’s working for me.
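
    For example, a minimal permissive robots.txt could look like this (just a sketch; the empty Disallow line means nothing is blocked):

    User-agent: *
    Disallow: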

    Arne

    (@arnee)

    If the contents of your WordPress-generated virtual robots.txt block Google, check your WordPress privacy settings. There is an option in the admin panel to tell search engines NOT to crawl your site. You probably have this option checked.
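
    For reference, WordPress builds the virtual file around that setting in core’s do_robots() function, roughly like this (a simplified sketch, not the verbatim core source; check wp-includes/functions.php in your version):

    <?php
    // Simplified sketch of how WordPress generates the virtual robots.txt.
    function sketch_do_robots() {
        header( 'Content-Type: text/plain; charset=utf-8' );

        $output = "User-agent: *\n";
        $public = get_option( 'blog_public' ); // the privacy setting

        if ( '0' == $public ) {
            // "Block search engines" is checked: disallow the whole site.
            $output .= "Disallow: /\n";
        } else {
            // Site is public: an empty Disallow allows everything.
            $output .= "Disallow:\n";
        }

        // Plugins can append lines to the virtual file via this filter.
        echo apply_filters( 'robots_txt', $output, $public );
    }

    So a plugin (or your theme’s functions.php) can also change the virtual file through the same robots_txt filter, for example (placeholder URL):

    add_filter( 'robots_txt', function ( $output, $public ) {
        // Append a Sitemap entry to the generated robots.txt.
        return $output . "Sitemap: https://www.example.com/sitemap.xml.gz\n";
    }, 10, 2 );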

    Thread Starter do77

    (@do77)

    Thanks, arnee!
    What the heck, I’ve never checked the privacy settings, and I had seriously blocked all search engines!

    I’m still confused . . . if this plugin is enabled, must I completely delete the robots.txt from the root of my domain if WordPress is installed in the root of my domain? And what is the effect of having a robots.txt in the blog root while also having the XML Sitemaps plugin installed?

    Tks,

    Phil.

    THANKS ARNEE!
    I spent all day trying to figure out why my new live site was now blocking Google. Turns out I *did* switch on the block-search-engines feature in WordPress while preparing the new site (since it was a duplicate of the site that was already online).
    Thanks for the reminder!

    I’ve unchecked the privacy setting. I’m still getting blocked. I’m using WebFaction for hosting. Any ideas?

    Tried the above.

    The privacy setting on my site is set to allow all, and my robots.txt says:

    User-agent: *
    Allow: /

    But Google fetch says access is denied by robots.txt.

    @philipanderson: No problem, you can add the following to your custom robots.txt file:

    Sitemap: https://url/to/sitemap.xml.gz
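
    In the full file, that Sitemap entry is a top-level line that sits outside the User-agent group, for example (placeholder URL):

    User-agent: *
    Disallow:

    Sitemap: https://www.example.com/sitemap.xml.gz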

    @mcbrowne: Can you check if there is a static robots.txt file in your blog root?

    Hi,
    I have used this plugin successfully before on another blog. I have just installed it on a new blog today here and am having some issues.

    I have built the sitemap successfully, and I see the two sitemap files in the root of my website directory. However, when I submit the sitemap URL to the Google Webmaster tool, I get the following error after waiting 5-10 minutes:

    “URL restricted by robots.txt

    We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit.”

    I do not have a robots.txt file in the root of this site. However, when I now look at https://www.passiveincometrial.com/robots.txt it produces:

    “User-agent: *
    Disallow:
    Sitemap: https://passiveincometrial.com/sitemap.xml.gz”

    which, I assume, is what is blocking the Googlebot from looking at the sitemap.

    I’m sure this is user error on my part somehow, but if someone could point out my mistakes in setting this up, I would greatly appreciate it.
    Thanks
    Tim

    And interestingly, when I look at:

    https://www.passiveincometrial.com/robots.txt

    I see:

    User-agent: *
    Allow: /

    but Google is still saying the robots.txt file is restricting its access.

    Any help would be greatly appreciated. Thanks

  • The topic ‘[Plugin: Google XML Sitemaps] How to change virtual.txt’ is closed to new replies.