Hi, you can add custom robots.txt rules under Settings > Reading. This is independent of any XML Sitemap settings.
Let me explain some fundamental things to consider:
1. Excluding a page or post from the sitemap will not prevent search engines from finding and indexing that page or post. It will also not add any rules to the robots.txt file.
2. Manually adding an exclusion rule to the robots.txt will prevent search engines from re-visiting a certain page or path on your site, but it will not remove that page or directory from their index.
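For reference, a manual robots.txt exclusion rule looks like this (the /private/ path here is just an illustration; substitute the path you actually want to block):

```
# Block all crawlers from fetching anything under /private/
User-agent: *
Disallow: /private/
```

Note again: this only stops compliant crawlers from fetching those URLs; anything already indexed stays in the index.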
If you really want to get and keep (!) a page or post out of the index and search results, you should do two things:
1. Go to your Search Console / Webmaster Tools account for each major search engine (if you do not have such an account, create one) and request removal of the specific post/page.
2. Add a robots noindex meta tag to the page itself. This can be done with most (if not all) SEO plugins available in the plugin repository.
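The meta tag in question goes in the page's head section and looks like this (SEO plugins add it for you, so you normally won't write it by hand):

```html
<!-- Tells search engines not to index this page -->
<meta name="robots" content="noindex">
```

Important: for this tag to work, the page must remain crawlable, i.e. it must NOT be blocked in robots.txt, otherwise search engines never see the tag.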
I’m considering adding this feature to this plugin as well, but it’s outside its core function, so I cannot guarantee that will ever happen.