GoogleBot Rendering – but clear robots.txt
Hi there guys
I seem to be having an issue with Googlebot rendering my site correctly (juxtalegal.com). I ran a mobile usability test and, although the site is optimised for responsiveness and works perfectly on a phone, I kept getting inconsistent results: first a pass with errors, then a fail, and sometimes a report that it was unable to fetch the URL.
I’ve since delved deeper and tried Fetch and Render in Webmaster Tools.
Having tested Fetch and Render for both Desktop and Smartphone, it appears users see the site fine, but Googlebot is not rendering the page correctly. I’m receiving a ‘Googlebot couldn’t get all resources for this page.’ error with a large list of URLs, all marked ‘Temporarily unreachable’. I’ve also intermittently received a general ‘Temporarily unreachable’ error when attempting Fetch and Render for Smartphone, though this alternates with ‘Partial’. I’ve checked my robots.txt and I don’t have anything blocking Googlebot at all.
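For what it’s worth, here’s roughly how I double-checked that the robots.txt rules don’t block Googlebot (a minimal sketch using Python’s standard `urllib.robotparser`; the rule lines below stand in for my actual file, which just has an empty Disallow):

```python
# Sketch: confirm robots.txt rules allow Googlebot.
# The rules below are an assumption standing in for my real robots.txt.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Parse the rules directly rather than fetching, so the check is reproducible.
rp.parse([
    "User-agent: *",
    "Disallow:",  # empty Disallow = nothing blocked
])

# can_fetch() returns True when the given agent may crawl the URL.
print(rp.can_fetch("Googlebot", "https://juxtalegal.com/"))  # → True
```

It prints True for every path I tried, which is why I’m fairly confident robots.txt isn’t the culprit.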
Digging further, I noticed my site is currently loading 20 external CSS files and 70 external JS files. Could that possibly be the reason?
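In case it helps, this is roughly how I counted the external assets (a quick sketch with Python’s standard `html.parser`; the one-line HTML string here is just a placeholder for the real page source):

```python
# Sketch: count external CSS (<link rel="stylesheet" href=...>) and
# external JS (<script src=...>) references in a page's HTML.
from html.parser import HTMLParser

class AssetCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.css = 0
        self.js = 0

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # A stylesheet link with an href counts as one external CSS file.
        if tag == "link" and a.get("rel") == "stylesheet" and a.get("href"):
            self.css += 1
        # A script tag with a src attribute counts as one external JS file.
        elif tag == "script" and a.get("src"):
            self.js += 1

# Placeholder HTML; I fed in the real page source to get the 20/70 figures.
html = '<link rel="stylesheet" href="a.css"><script src="b.js"></script>'
counter = AssetCounter()
counter.feed(html)
print(counter.css, counter.js)  # → 1 1
```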
It all seems very odd. The site was working perfectly just two weeks ago.
Any assistance would be much appreciated.
Thanks, Tom.