Forum Moderators: Robert Charlton & goodroi
1. If the sitemap helps Google index URLs it wasn't finding with its regular crawl, would you rather know that some factor is making those URLs difficult to crawl? Or is the workload so intense that this kind of analysis just won't happen in practical terms, so you'd rather have the sitemap "insurance" that Google knows every URL that exists?
In other words, a sitemap might hide crawling problems you would be better off not disguising. But in some situations it may be more practical just to use the sitemap as a workaround.
2. There's also the recent discussion here that a Google sitemap is an open map for scrapers. Having that file openly available on your server might increase your chances of being scraped. It's certain that many of the bots requesting that .xml file are not Googlebot.
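One way to limit that exposure is to serve the sitemap only to requests that really come from Googlebot. Google's documented method is a reverse DNS lookup on the requesting IP followed by a forward-confirming lookup, rather than trusting the User-Agent string. Here's a minimal sketch of that check in Python; the function name `is_verified_googlebot` is my own, and in practice you'd call something like this from your server-side handler before returning the sitemap (and return a 404 otherwise):

```python
import socket

def is_verified_googlebot(ip):
    """Reverse-then-forward DNS check for Googlebot.

    1. Reverse-resolve the IP to a hostname.
    2. Require the hostname to end in googlebot.com or google.com.
    3. Forward-resolve that hostname and confirm the original IP
       is among its addresses (prevents spoofed PTR records).
    """
    try:
        host = socket.gethostbyaddr(ip)[0]
    except (socket.herror, socket.gaierror):
        return False  # no reverse record at all
    if not (host.endswith(".googlebot.com") or host.endswith(".google.com")):
        return False  # claims aside, the PTR isn't a Google hostname
    try:
        forward_ips = {info[4][0] for info in socket.getaddrinfo(host, None)}
    except socket.gaierror:
        return False
    return ip in forward_ips
```

Note this adds a DNS lookup per request, so you'd normally cache the verdict per IP. It also means legitimate non-Google crawlers (Bing, etc.) would be locked out unless you add their verification schemes too, which may or may not be what you want.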