We ran a Semrush Site Audit yesterday and it found 180 pages blocked by an X-Robots-Tag: noindex HTTP header. Double-checking in Google Search Console confirmed the pages were not indexed ("Excluded by 'noindex' tag"), and both Request Indexing and Test Live URL failed there as well.
Is there a way to resolve this with SEOPress so that all pages are indexed, except for the ones that have the Advanced meta robots setting "Do not display this page in search engine results / XML – HTML sitemaps (noindex)" checked?
Thank you,
Adam
I have uploaded the Yoast sitemap to Google Search Console:
The HTTP version was included with no problem.
The HTTPS version is not uploading ("Could not read sitemap").
When I inspect the URL, Google says:
Is indexing allowed? No: the “noindex” tag was detected in the HTTP header “X-Robots-Tag”.
Any help?
Hey, I am having the same issue as this person with Google Search Console and my sitemap.
I don't seem to have the 'noindex' meta tag anywhere that I could find. As far as I can tell, the robots files and meta tags are all correct too, but maybe I am wrong.
How would we fix this issue? Should we go to Google and ask them, if you don't see any problems on your end?
Indexing allowed? No: ‘noindex’ detected in ‘X-Robots-Tag’ http header
The thing is, I registered this site in Google Search Console without any issue and it worked just fine for a few days. Then I was reading around (I'm very new to all this) and found that I needed to add my sitemap to Search Console. I went into WordPress and got the sitemap from Yoast SEO. While I was there, I saw a prompt from the Yoast SEO plugin saying something along the lines of it being able to perform some optimizations for my site, so I clicked it and it did whatever it did. I submitted my sitemap to Search Console and it was accepted, but the next time I checked Search Console, it was showing no valid pages because of the noindex thing. I don't know if that optimization is what's causing this issue.
Any help is greatly appreciated,
Thanks!
Google Search Console is returning the following message for all my pages:
Indexing allowed? No: ‘noindex’ detected in ‘X-Robots-Tag’ http header
An HTTP / HTTPS header check returns the following:
X-Robots-Tag => noindex, nofollow
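For anyone wanting to reproduce this check themselves, the header lookup can be done from the command line. The snippet below replays the headers reported above offline so it runs anywhere; the commented `curl` line is the live version (substitute your own page URL):

```shell
# Offline replay of the reported response headers, filtered for X-Robots-Tag.
# Live check against your own site would be:
#   curl -sI "https://www.example.com/some-page/" | grep -i '^x-robots-tag'
printf 'HTTP/1.1 200 OK\nX-Robots-Tag: noindex, nofollow\n\n' \
  | grep -i '^x-robots-tag'
# prints: X-Robots-Tag: noindex, nofollow
```

If that line shows up for a page you want indexed, something server-side (a plugin, the server config, or a CDN) is adding the header before Google ever sees the HTML.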
The Yoast SEO plugin is installed, and I've checked the obvious settings there.
Below is my robots.txt:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /uploads/wpo-plugins-tables-list.json
Below is my .htaccess file
# BEGIN WordPress
# The directives (lines) between "BEGIN WordPress" and "END WordPress" are
# dynamically generated, and should only be modified via WordPress filters.
# Any changes to the directives between these markers will be overwritten.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
Please help!
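Not a root-cause fix, but if you can't find which plugin setting is emitting the header, it can be stripped at the server level with mod_headers (assuming Apache, which your .htaccess suggests). This sketch would go in the same .htaccess, outside the WordPress block, and simply masks the header rather than fixing whatever sets it:

```apache
# Hypothetical workaround: unset the noindex header on every response.
# Requires mod_headers; "always" also covers headers set by PHP.
<IfModule mod_headers.c>
    Header always unset X-Robots-Tag
</IfModule>
```

Note this removes the header for all pages, including any you deliberately noindexed, so it's best used only to confirm the header is the culprit.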
I deactivated the Yoast plugin and now it's OK in Google Search Console…
Can you give me a tip?
It is now highly important for Google to be able to index RSS feeds, given the way it discovers podcasts by indexing their feeds. With my RSS feeds not being indexed by Google, my podcasts will not show up in the new Google Podcasts app.
Is there a way to prevent Yoast from setting my RSS feeds as noindex with the X-Robots-Tag header?
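I'm not aware of a built-in Yoast setting for this, but one generic workaround is to remove the header yourself on feed requests, hooked late enough to run after the plugin has added it. A sketch for the active theme's functions.php or a small mu-plugin (the hook choice and priority are my assumptions, not a documented Yoast API):

```php
// Hypothetical workaround: strip X-Robots-Tag from feed responses.
// Runs at the lowest possible priority so it fires after plugin hooks.
add_action( 'template_redirect', function () {
    if ( is_feed() && ! headers_sent() ) {
        header_remove( 'X-Robots-Tag' );
    }
}, PHP_INT_MAX );
```

This only helps if the header is sent through PHP before output starts; test the feed URL's headers afterwards to confirm the header is actually gone.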
I don't want a page on my website indexed in search engines, so I want to stop crawlers from crawling and indexing that page using an X-Robots-Tag HTTP header.
Please tell me how to use the X-Robots-Tag for a WordPress website.
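One common approach on Apache is to send the header from .htaccess via mod_headers. The PDF pattern below is just an example match; adjust it to the files or paths you actually want kept out of the index:

```apache
# Sketch: send a noindex header for every PDF on the site.
# Requires mod_headers; change the FilesMatch pattern to suit.
<IfModule mod_headers.c>
    <FilesMatch "\.pdf$">
        Header set X-Robots-Tag "noindex, nofollow"
    </FilesMatch>
</IfModule>
```

For a normal WordPress page or post, it's usually simpler to use your SEO plugin's per-page "noindex" setting, which outputs a robots meta tag; the X-Robots-Tag header is mainly useful for non-HTML files where a meta tag can't be added.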