Viewing 7 replies - 1 through 7 (of 7 total)
  • @thnk4 I believe the sitemap normally contains all the URLs.

    If you don’t want certain URLs indexed or crawled, specifying them in your robots.txt file should prevent them from being crawled.

    I think there’s nothing to worry about.
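    For reference, a minimal robots.txt sketch that blocks one directory from crawling (the path is a placeholder, not taken from this thread):

```
User-agent: *
Disallow: /private/
```

    Any URL under the disallowed path will be skipped by well-behaved crawlers, which is exactly why such URLs should not also appear in the sitemap.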

    Thread Starter thnk4

    (@thnk4)

    Dear Ukafia,

    Google Search Console gives me an error even if I let the whole website be crawled. The error is:

    “Sitemap contains urls which are blocked by robots.txt.”

    Any suggestions to fix this?

    Thanks in advance,

    T4

    Apologies for the late reply. Yes, there is.

    Yoast has a feature and even a guide for that. Follow these steps:

    1. Log in to your WordPress website. Once logged in, you will be in your ‘Dashboard’. In the menu on the left-hand side, click on ‘SEO’.

    2. The ‘SEO’ settings will expand, providing additional options. If you don’t see the ‘XML Sitemaps’ setting in your ‘SEO’ menu, please enable your advanced settings first.

    3. Click on ‘XML Sitemaps’.

    4. Click on ‘Excluded Posts’.

    5. Under ‘Posts to exclude’, input the post/page IDs separated by commas, e.g. 1,90,100.

    I hope this helps. Kindly mark this topic as resolved if it does. Cheers

    • This reply was modified 6 years, 11 months ago by Sunday Ukafia.

    To find your post ID, go to Posts → All Posts, hover over the post whose ID you need, then right-click and copy the URL, or just look at the URL in the bottom left-hand corner of your screen.

    Locate post=ID&… the number you see after post= is your post ID.

    That’s the number between = and &.

    You can locate your page ID using the same approach.
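    The extraction described above can be sketched in a few lines of Python (the edit URL below is a made-up example, not from this thread):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical edit-screen URL copied from the 'All Posts' list.
# The number after 'post=' (between '=' and '&') is the post ID.
edit_url = "https://example.com/wp-admin/post.php?post=90&action=edit"

query = parse_qs(urlparse(edit_url).query)
post_id = int(query["post"][0])
print(post_id)  # -> 90, the ID to enter in Yoast's 'Excluded Posts' field
```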

    Thread Starter thnk4

    (@thnk4)

    Dear Ukafia,

    It looks like a good solution, but unfortunately the links that Google Search Console reports as not found do not exist on my website, nor in the database according to phpMyAdmin.
    Thus I cannot find the IDs to exclude.

    Any suggestions?

    Here is a pic of what the Google Console says:
    https://ibb.co/cpKyaG

    Thanks again,

    T4

    Plugin Support Md Mazedul Islam Khan

    (@mazedulislamkhan)

    We suggest you try the steps given in this knowledge base guide to check whether it resolves the issue.

    @thnk4 there are no blocked URLs in your robots.txt file.

    Just uninstall the plugin, delete it, then reinstall and activate it again.

    (Remember to clear cache if you have a caching plugin installed.)

    Delete the sitemap in Search Console and add it afresh.

  • The topic ‘Sitemap contains urls which are blocked by robots.txt.’ is closed to new replies.