Adding Robots.txt
I want to stop search engines from crawling a page I’ll be setting up in the next few days, and as far as I’m aware this means I need to upload a robots.txt file.
The thing is, I’m not actually sure how this is done. Is it simply a case of pasting the following code into the CSS stylesheet:
User-agent: *
Disallow: /theurlthatidontwantcrawled
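A minimal sketch of what the complete file could look like, assuming it is saved as a plain text file and uploaded to the root of the site so crawlers can fetch it at https://example.com/robots.txt (example.com here is only a placeholder for the actual domain, and the path reuses the one from the question):
# robots.txt, placed at the site root rather than in a stylesheet
User-agent: *
Disallow: /theurlthatidontwantcrawled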
The topic ‘Adding Robots.txt’ is closed to new replies.