• I want to stop search engines from crawling a page I’ll be setting up in the next few days, and as far as I’m aware this means I need to upload a robots.txt file.

    The thing is, I’m not actually sure how this is done. Is it simply a case of pasting the following code into the CSS style sheet:

    User-agent: *
    Disallow: /theurlthatidontwantcrawled

Viewing 3 replies - 1 through 3 (of 3 total)
  • No, you don’t paste it into your CSS file. Create a new text file called robots.txt in the root of your site, i.e. example.com/robots.txt, and paste the code from above into that file. You’ll need to do this with an FTP client.

    Here’s an example file for your reference: https://www.google.com/robots.txt
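    If you want to double-check what those two lines actually do before uploading them, here’s a small sketch using Python’s standard urllib.robotparser, which interprets robots.txt the same way well-behaved crawlers do. The path is the illustrative one from the question above.

    ```python
    # Check how crawlers interpret the robots.txt rules from the question,
    # using Python's built-in urllib.robotparser.
    from urllib.robotparser import RobotFileParser

    rules = """\
    User-agent: *
    Disallow: /theurlthatidontwantcrawled
    """

    parser = RobotFileParser()
    parser.parse(rules.splitlines())

    # The disallowed path is blocked for every user agent...
    print(parser.can_fetch("*", "/theurlthatidontwantcrawled"))  # False
    # ...while the rest of the site stays crawlable.
    print(parser.can_fetch("*", "/some-other-page"))  # True
    ```

    Note that robots.txt is only a request, not access control — polite crawlers honour it, but it won’t hide the page from anyone who visits the URL directly.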

    Thread Starter peps2004 (@peps2004)

    Hi Blogjunkie,

    Thanks for the reply. How would I go about doing this with an FTP client? Sorry if this seems like a dumb question, but we update everything online rather than in a separate program.

    Hmm, sorry, it’s too difficult to explain how to use an FTP client here. Try searching YouTube for a FileZilla tutorial. FileZilla is a free FTP client for Windows and Mac. Good luck!
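    If you’d rather script the upload than use a GUI client, here’s a minimal sketch with Python’s built-in ftplib. The host and credentials are placeholders — substitute the FTP details your hosting provider gives you, and note that many hosts now require FTPS or SFTP instead of plain FTP.

    ```python
    # Sketch: upload robots.txt to the site root over FTP.
    # Host, username, and password below are placeholders.
    from ftplib import FTP

    def upload_robots_txt(host, user, password, local_path="robots.txt"):
        """Upload a local robots.txt file to the FTP login directory."""
        with FTP(host) as ftp:
            ftp.login(user, password)
            with open(local_path, "rb") as f:
                # STOR places the file in the current directory,
                # which should be the web root (e.g. public_html).
                ftp.storbinary("STOR robots.txt", f)

    # Example call (placeholder credentials):
    # upload_robots_txt("ftp.example.com", "username", "password")
    ```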

  • The topic ‘Adding Robots.txt’ is closed to new replies.