Googlebot can't access my site
I have two domains submitted to Google Webmaster Tools, the non-www and the www, and I have set the preferred domain to the non-www one. I've submitted sitemaps to both and they seem to be working fine. However, the www one is saying that Googlebot can't access my site. The error is:
https://www.digitalmarketingpost.com/: Googlebot can't access your site (Feb 8, 2014)

DNS – green tick, OK
Server connectivity – green tick, OK
Robots.txt – not OK, saying crawl postponed because it cannot access robots.txt

URL errors:
20 Not found – these are all 404 errors, and that's OK because they are deleted posts.

The non-www domain is working fine with no errors.
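Since DNS and server connectivity both pass but robots.txt fails, one thing I can do is request robots.txt from each hostname myself and look at the HTTP status, which is roughly what Googlebot sees. A minimal Python 3 sketch (the hostnames are my real domains; the rest is just a generic diagnostic):

import urllib.request

# Fetch robots.txt from both hostnames and print the HTTP status --
# roughly what Googlebot gets when it tries to read the file.
for host in ("digitalmarketingpost.com", "www.digitalmarketingpost.com"):
    url = "https://" + host + "/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(url, resp.status)
    except Exception as exc:
        print(url, "failed:", exc)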
I am using WordPress with the Yoast SEO plugin. I've changed my permalinks to use https://digitalmarketingpost.com.
The sitemap that the Yoast plugin generates is sitemap_index.xml.
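With the preferred domain set to non-www, I'd expect a request to the www host to end up on the bare domain after a 301. A rough Python 3 check (nothing site-specific beyond my own URLs; urlopen follows redirects, so the final URL shows where the www host really lands):

import urllib.request

# HEAD the www host; the final URL after redirects should be the
# non-www domain if the 301 is in place.
req = urllib.request.Request("https://www.digitalmarketingpost.com/",
                             method="HEAD")
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.geturl())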
So the problem is that the www version of my site in Google Webmaster Tools is not working. I set up a robots.txt file, and all that is in it is the following:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Disallow: /wp-content/cache/
Sitemap: https://digitalmarketingpost.com/sitemap_index.xml

I did a Fetch as Google for the robots.txt on the www domain and it succeeded without any errors.
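To double-check that the rules themselves don't block Googlebot, here's a small Python 3 sketch using the standard-library robots.txt parser (the sample post URL is made up, just to illustrate; wp-admin should come back blocked):

import urllib.robotparser

# Parse the live robots.txt and ask whether Googlebot may crawl a
# normal post versus the disallowed wp-admin path.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://digitalmarketingpost.com/robots.txt")
rp.read()
print(rp.can_fetch("Googlebot", "https://digitalmarketingpost.com/some-post/"))  # expect True
print(rp.can_fetch("Googlebot", "https://digitalmarketingpost.com/wp-admin/"))   # expect False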
I did have https://www.digitalmarketingpost.com set up on another hosting server in January, but a good few weeks ago I deleted the folder and the SQL database from that server. Google must have indexed that back in January, though, and is now totally confused… Maybe that is part of the problem? I have no idea; I've been looking at it for days now and can't sort it.
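One way to test that theory: check whether the www hostname still resolves to the old hosting server. A quick Python 3 sketch comparing the IPs the two hostnames resolve to (purely a diagnostic, assuming both should point at the same current server):

import socket

# Compare the IPs for both hostnames. If the www host still points at
# the old hosting server (where the site was deleted), that would
# explain Googlebot failing on the www version only.
for host in ("digitalmarketingpost.com", "www.digitalmarketingpost.com"):
    ips = sorted({info[4][0] for info in socket.getaddrinfo(host, 443)})
    print(host, ips)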