@findsteps
We inspected your site’s robots.txt file and noticed that some additional rules have been added and that sitemap links are included in it.
The robots.txt standard supports adding a link to your XML sitemap(s) in the file. This helps search engines discover the location and contents of your site.
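For reference, that directive is a single line like the one below (the URL here is just an illustrative example, not your actual sitemap):

Sitemap: https://example.com/sitemap_index.xml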
But we’ve always felt that this is redundant: since you’ve already added your sitemap to Google Search Console in order to access analytics and performance data, you don’t need the reference in your robots.txt file. We suggest you remove those sitemap links.
We usually recommend keeping the file clean, like this:
User-Agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
This allows Googlebot to crawl all the links on your site and better understand how the pages are viewed and displayed. This guide explains more: https://yoast.com/ultimate-guide-robots-txt/. We have additional recommendations on robots.txt here: https://yoast.com/wordpress-robots-txt-example/.
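As a quick sanity check, you can confirm what these rules allow and block with Python's standard-library robots.txt parser. This is a minimal sketch: the example.com URLs are placeholders, and note that Python's parser applies the first matching rule (so the Allow line is listed first here), while Google itself matches the most specific rule regardless of order.

```python
import urllib.robotparser

# The recommended rules. Allow is listed before Disallow because
# Python's parser is first-match; Google instead uses longest-match,
# so the order does not matter in the real file.
robots_txt = """\
User-Agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# admin-ajax.php stays crawlable; the rest of /wp-admin/ is blocked;
# everything else on the site remains open to crawlers.
print(rp.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(rp.can_fetch("*", "https://example.com/wp-admin/options.php"))     # False
print(rp.can_fetch("*", "https://example.com/any-post/"))                # True
```

This is only a local check of the rules themselves; it does not tell you what Googlebot has actually crawled.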
Please note that the above suggestion is meant to optimize your robots.txt for better crawlability of your website.
We understand that your main concern is why Google is unable to discover URLs from the sitemap. We looked at a recent post published 21 hours ago, https://bohatala.com/trans-saharan-gas-pipeline/, and we can see that Yoast is outputting the right data on it and that it is already indexed on Google’s search results page.
The sitemap index is listing your pages correctly: https://bohatala.com/sitemap_index.xml
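If you want to verify this yourself, a sitemap index is plain XML that lists each child sitemap in a `<loc>` element. Here is a minimal sketch using Python's standard library; the XML below is a made-up sample in the same shape Yoast generates, not the contents of your actual file.

```python
import xml.etree.ElementTree as ET

# A made-up sitemap index in the shape Yoast produces; your real
# file lives at https://bohatala.com/sitemap_index.xml.
sitemap_index = """\
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/post-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://example.com/page-sitemap.xml</loc></sitemap>
</sitemapindex>"""

# The sitemaps.org namespace must be given explicitly to findall().
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_index)

# Collect every child sitemap URL listed in the index.
child_sitemaps = [loc.text for loc in root.findall("sm:sitemap/sm:loc", ns)]
print(child_sitemaps)
```

Each child sitemap listed this way then contains the individual page URLs; if the index and its children parse cleanly and return a 200 status, the sitemap side is working.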
There is nothing in the Yoast plugin that is causing the issue; it is working as expected. We think there is an issue with Google and its ability to crawl the sitemap. We suggest contacting Google for more information as to why they are not able to crawl pages that have the right data on them and are returning a 200 OK status. You may contact Google by making a post on their forums: https://productforums.google.com/forum/#!forum/webmasters.