When you open [domain]/sitemap.xml in a browser, does it list the page you are trying to hide/noindex? If it does, then turning it off is (obviously) not working, and SSS should be able to resolve that. In my experience with some 3rd-party tools, even after turning off SEO (for example, disabling categories and tags for a blog), they still report duplication and canonical errors, which I just ignore. Some pages are also flagged with AMP errors by the crawlers of those 3rd-party bots, which I ignore as well.
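If you want to check the sitemap without eyeballing it, a short script can list every `<loc>` entry and confirm whether the hidden page is still there. This is a minimal sketch using Python's standard library; the sitemap content and the `https://example.com/hidden-page/` URL are hypothetical placeholders for your own domain and page.

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap.xml content; in practice you would fetch
# https://yourdomain.com/sitemap.xml and pass its text here.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/hidden-page/</loc></url>
</urlset>"""

# Sitemaps use a namespace, so findall needs a namespace map.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]

# If the page you noindexed still appears here, the setting isn't working.
print("https://example.com/hidden-page/" in urls)  # True
```

If this prints True for your hidden page, the platform is still publishing it in the sitemap regardless of your SEO settings.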
Google's robots.txt testing tool (https://www.google.com/webmasters/tools/robots-testing-tool?hl=en) will show how Google actually reads your robots.txt, and I would go by that rather than by what SEMRush says.
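You can also sanity-check robots.txt rules locally with Python's built-in `urllib.robotparser`, which applies the same Disallow matching a well-behaved crawler would. The robots.txt content and URLs below are hypothetical examples, not taken from any real site in this thread.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice you would fetch
# https://yourdomain.com/robots.txt and pass its lines here.
robots_txt = """\
User-agent: *
Disallow: /hidden-page/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch(useragent, url) returns False when a Disallow rule matches.
print(parser.can_fetch("*", "https://example.com/hidden-page/"))  # False
print(parser.can_fetch("*", "https://example.com/blog/"))         # True
```

This won't catch everything Google does (noindex, for instance, lives in the page markup, not robots.txt), but it quickly tells you whether a path is actually blocked by your rules.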