Google Indexing WordPress files and folders. Even after placing robots.txt
Hello. I am sorry if I am bringing this topic up again, but I did research before posting. Google is indexing the WordPress folders and their contents, and is showing weird "fatal error" PHP links in its search results. I learned about robots.txt and placed one in my site's /public_html folder, but it still hasn't helped. Here are the details.
First of all, I'd be REALLY glad if someone could verify the content of the robots.txt file that I've written. Please note that my WordPress folders are not in /public_html but in /public_html/blog/.
Content of the robots.txt file:
-----------------------------------
User-agent: *
Disallow: /blog/wp-admin/
Disallow: /blog/wp-includes/
Disallow: /blog/wp-content/
Allow: /
-----------------------------------
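For what it's worth, I also sanity-checked the rule matching locally with Python's built-in urllib.robotparser. This only checks how the rules match paths, not what Google actually fetched from my server:

```python
from urllib.robotparser import RobotFileParser

# The exact rules from the robots.txt above.
rules = """\
User-agent: *
Disallow: /blog/wp-admin/
Disallow: /blog/wp-includes/
Disallow: /blog/wp-content/
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Paths under the disallowed WordPress directories should be blocked,
# while ordinary blog pages stay crawlable.
print(rp.can_fetch("*", "/blog/wp-admin/admin.php"))    # False
print(rp.can_fetch("*", "/blog/wp-includes/load.php"))  # False
print(rp.can_fetch("*", "/blog/some-post/"))            # True
```

So as far as I can tell, the rules themselves do block the WordPress system directories.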
I also read somewhere that the robots.txt file should be served as text/plain rather than text/html. To ensure that, I added an .htaccess file to my public_html folder with the following content:

-----------------------------------
AddType text/plain .txt
php_value auto_append_file none
php_value auto_prepend_file none
-----------------------------------

On my Google Webmasters account, it shows the following after downloading my robots.txt file:
Allowed by line 6: Allow: /
Detected as a directory; specific files may have different restrictions

Is there a problem, or is the robots.txt file okay? Please help me. Google is still showing junk in its search results. And one more thing: on the Google Webmasters dashboard, everything shows "no data available".