• I am trying to have one of my pages NOT be indexed. I used the Ultimate Noindex Nofollow Tool plugin, but that is not working – the pages I specify are getting indexed anyway. Someone suggested I use a robots.txt file instead.
    Since I am not sure how to do that, I looked through the forum and found the KB Robots.txt plugin.
    I’m wondering: if I turn OFF (uncheck) the option in the Google XML Sitemaps plugin that places the sitemap URL in a virtual robots.txt file, can I then use KB Robots.txt to generate a robots.txt file that disallows the pages I don’t want indexed?
    Not sure why the Ultimate plugin is not working, but I am looking for a workaround.

    thanks

Viewing 2 replies - 1 through 2 (of 2 total)
  • Moderator James Huff

    (@macmanx)

    Deactivate and uninstall the Ultimate Noindex Nofollow Tool and KB Robots.txt plugins, as they won’t do what you need them to do. Next, uncheck the option in the Google XML Sitemaps plugin that places the sitemap URL in a virtual robots.txt file.

    Now, use a plain text editor to create a file named robots.txt with the following contents (edit as necessary, of course; /post-you-want-excluded/ is simply the permalink without https://www.yourdomain.com):

    # robots.txt for https://www.yourdomain.com/
    
    User-agent: *
    Disallow: /post-you-want-excluded/
    
    Sitemap: https://www.yourdomain.com/sitemap.xml.gz

    Now, upload this file to the same directory as the wp-login.php file (the root of your WordPress install), so it is served at https://www.yourdomain.com/robots.txt.
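    Once the file is uploaded, you can sanity-check that the Disallow rule actually matches your page before waiting on crawlers. Here is a small sketch using Python’s standard-library urllib.robotparser; the domain and /post-you-want-excluded/ path are placeholders from the example above, so substitute your own:

    ```python
    from urllib.robotparser import RobotFileParser

    # The same rules as in the robots.txt example above
    # (placeholder path – replace with your real permalink path).
    rules = """\
    User-agent: *
    Disallow: /post-you-want-excluded/
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # The excluded permalink should be blocked for all user agents...
    print(rp.can_fetch("*", "https://www.yourdomain.com/post-you-want-excluded/"))  # False

    # ...while any other page remains crawlable.
    print(rp.can_fetch("*", "https://www.yourdomain.com/some-other-post/"))  # True
    ```

    Note that Disallow only tells well-behaved crawlers not to fetch the page; a blocked URL can still show up in search results if other sites link to it, so this is a crawl block rather than a strict noindex.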

    Thread Starter adi339

    (@adi339)

    Thanks for this. Are you saying that none of the plugins work? Or do they just not work alongside the XML Sitemaps plugin?

  • The topic ‘robots.txt, XML sitemap and noindex’ is closed to new replies.