The cause of the problem is that Google is not seeing the newly added X-Robots-Tag header, because it's not re-crawling the page.
Removing the block from robots.txt and letting Google fetch the pages (and see their headers) will remove the pages from the results.
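As a minimal sketch of the fix: the pages you want dropped need to answer with an `X-Robots-Tag: noindex` response header once crawling is allowed again. The `/private/` prefix below is a hypothetical example, not from the question:

```python
# Sketch: decide the X-Robots-Tag header per request path, assuming
# pages under a hypothetical /private/ prefix should be deindexed.

def x_robots_tag(path):
    """Return the X-Robots-Tag header value for a path, or None."""
    if path.startswith("/private/"):
        # Tell Google to drop the page the next time it re-crawls it.
        return "noindex"
    # Normal pages: send no header, indexing stays allowed.
    return None

# Important: the header only works if crawling is permitted, so
# robots.txt must NOT disallow these paths while you wait for
# Googlebot to re-crawl them and see the header.
print(x_robots_tag("/private/report.html"))  # -> noindex
print(x_robots_tag("/index.html"))           # -> None
```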
Sitemaps don't have much to do with indexing or rankings. You could remove URLs from your sitemap or even remove your sitemap entirely without hurting your SEO.
The changefreq and priority fields are simply ignored. Googlebot comes back to crawl URLs based on how popular they are and how often Google has seen them change before. For more information, see The Sitemap Paradox.
robots.txt
User-agent: *
Disallow: /
This will block all well-behaved search bots from crawling the entire site. Note that it blocks crawling, not indexing: URLs Google already knows about (or finds links to elsewhere) can still appear in results.
For more info see: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=40360
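You can check how that robots.txt is interpreted locally with Python's standard urllib.robotparser, which applies the same `User-agent: *` / `Disallow: /` rules (example.com here is just a placeholder host):

```python
from urllib.robotparser import RobotFileParser

# Feed the two-line robots.txt above directly into the parser.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# Every path is blocked for every bot, including Googlebot.
print(rp.can_fetch("Googlebot", "http://example.com/"))       # False
print(rp.can_fetch("*", "http://example.com/any/page.html"))  # False
```

This confirms the rule blocks fetching, which is exactly why a `noindex` header behind such a rule never gets seen.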