These errors appear because you use the Stats module on your site.
You can rest assured, though: such errors have no negative side effects, and won’t impact your site or its ranking.
Google’s Blocked Resources Report allows you to spot pages on your site that cannot be indexed by Google. It’s a good way for site owners to make sure there are no pages that Google can’t crawl because of an error or an unintentional block.
In this particular case, the resource mentioned in the report is not a page of your site: it’s the tracking code used by Jetpack to record visits on your site.
We’ve recently added a robots.txt file to the domain hosting the tracking code, asking all bots not to index anything on stats.wp.com. We do so to prevent well-behaved bots and search engines from ever hitting the stats pixel and thus triggering a view in your stats.
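For reference, here is a minimal sketch of what such a robots.txt file might contain. The exact contents served on stats.wp.com may differ; the blanket disallow below is an assumption for illustration.

```
# Illustrative robots.txt for a stats-only domain (not the actual file)
# A blanket Disallow asks compliant crawlers to stay away entirely,
# so they never request the tracking pixel in the first place.
User-agent: *
Disallow: /
```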
Until recently, we maintained a list of these bots and search engine crawlers and excluded them from the stats; adding the robots.txt file allows us to be more precise and improves the accuracy of your stats.
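To illustrate why a hand-maintained list is less precise, here is a rough Python sketch of that kind of user-agent filtering. The function names, the list contents, and the matching logic are all hypothetical and are not Jetpack’s actual code.

```python
# Hypothetical sketch of list-based bot filtering (not Jetpack's actual code).
# Any crawler whose User-Agent is missing from the list still gets counted as
# a visitor, which is why a robots.txt rule that stops compliant bots at the
# door is both simpler and more accurate.

KNOWN_BOT_SUBSTRINGS = [  # hypothetical and necessarily incomplete
    "Googlebot",
    "bingbot",
    "Baiduspider",
]

def is_known_bot(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known crawler substring."""
    ua = user_agent.lower()
    return any(bot.lower() in ua for bot in KNOWN_BOT_SUBSTRINGS)

def should_count_view(user_agent: str) -> bool:
    """Count a pixel hit only when it doesn't look like a known bot."""
    return not is_known_bot(user_agent)
```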
To conclude, you can continue to use Jetpack Stats safely and ignore these errors. Since they’re not related to pages hosted on your site, they won’t affect your site’s ranking in Google or other search engines.
The only effect of this change is a positive one: from now on, Jetpack Stats will block more bots, and consequently give you more accurate stats by excluding more false visitors from your daily views.
These errors should eventually disappear: a few days after our initial set of changes, we made further adjustments. Our new measures remain in place, but we’ve added an extra rule so that no errors are displayed in Google Search Console.