• I am trying to stop Google (and other search engines) from accessing a particular directory,

    and I don’t want the pages in that directory turning up in Google’s SERPs.

    So I have put the pages in a directory and I am going to do the following.

    The directory that I want to restrict access to is /quotes

    1. Add

    User-agent: *
    Disallow: /quotes/

    to the robots.txt file.

    So now I have hopefully stopped Google from displaying results from within this directory.

    But will this also stop this page showing as well or do I need to do more?

    https://www.capabilityevents.co.uk/quotes/
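The rule in step 1 can be sanity-checked with Python’s built-in robots.txt parser. This is just a sketch using the rules and URLs from the question above, to confirm the `Disallow` line covers both the directory URL and everything beneath it:

```python
from urllib.robotparser import RobotFileParser

# The exact rules proposed in step 1
rules = """\
User-agent: *
Disallow: /quotes/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The /quotes/ directory and everything under it is blocked for all crawlers
print(rp.can_fetch("*", "https://www.capabilityevents.co.uk/quotes/"))        # False
print(rp.can_fetch("*", "https://www.capabilityevents.co.uk/quotes/page-1"))  # False

# Other paths on the site remain crawlable
print(rp.can_fetch("*", "https://www.capabilityevents.co.uk/"))               # True
```

Note that `Disallow` only asks crawlers not to fetch those URLs; it does not by itself guarantee the URLs never appear in results, which is what the replies below get at.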

Viewing 5 replies - 1 through 5 (of 5 total)
  • That should stop the whole folder from being indexed.

    Thread Starter benleeke

    (@benleeke)

    But will this also stop this page showing as well or do I need to do more?

    https://www.capabilityevents.co.uk/quotes/

    Do you not want the page to show for anyone?

    Thread Starter benleeke

    (@benleeke)

    I’ve solved the problem by installing the Robots Meta plugin, which gives me a small panel under the Update/Publish button. This allows me to specify noindex, nofollow, which seems to do the trick.

    We use WordPress to produce quotes for clients. Even though these are password protected, Google still lists the pages and a snippet from each quote.

    So now with the Robots Meta plugin, when we do a quote we just click the noindex, nofollow radio button, and I hope Google won’t list the quote pages in its SERPs.
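For reference, plugins like this typically work by adding a robots meta tag such as `<meta name="robots" content="noindex, nofollow">` to the page head. A quick way to verify a page actually carries the directive is to parse its HTML; the snippet below is a sketch using Python’s standard-library parser, with a hypothetical page head shaped like what such a plugin would emit:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", ""))

# Hypothetical head output, as a robots-meta plugin might generate it
html = '<head><meta name="robots" content="noindex, nofollow"></head>'

p = RobotsMetaParser()
p.feed(html)
print(p.directives)  # ['noindex, nofollow']
```

One caveat worth knowing: search engines can only obey a noindex meta tag on pages they are allowed to crawl, so if a URL is also blocked in robots.txt, the crawler may never see the tag.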

    Do you think this will work?

    Thanks for the help.

    ahh…sounds like a winner

  • The topic ‘How do I stop search engines accessing a directory’ is closed to new replies.