I would like to get others' opinions or experience regarding the following simple methods. Please note: my only important .txt files are robots.txt and the sitemaps (this post is of course asking how to get urllist.txt and sitemap.xml REMOVED from the SERPs, i.e. noindex...but follow). I don't have .htaccess access on our server, so I can't use the X-Robots-Tag solution. Plus I'm not that advanced. But I have read about this solution:
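For reference, the X-Robots-Tag approach I mentioned (the one I can't use without .htaccess) usually looks something like the common Apache recipe below — just a sketch, and the exact filenames would depend on your setup:

```apache
# Hypothetical .htaccess snippet (requires mod_headers):
# serve a noindex header for the sitemap files so they are
# removed from the index but their URLs can still be fetched
<FilesMatch "(sitemap\.xml|urllist\.txt)$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```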
or one of the following:
Since I don't have the experience yet, I need others' honest help and would like to benefit from their experience with this question:
Does "disallow" prevent the sitemap file itself from being INDEXED, or does it prevent the file from being accessed altogether (i.e. its enclosed links from being crawled)? //I don't pretend to be the first person to ask this, and I understand the circular logic everyone hates: if the file can't be crawled, how can the search engine know what's disallowed, and so on//
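To make the question concrete, this is the kind of robots.txt directive I mean (assuming the files sit at the site root — adjust the paths for your own layout):

```txt
# Hypothetical robots.txt — does this stop indexing of the
# files, or only stop crawlers from fetching them at all?
User-agent: *
Disallow: /sitemap.xml
Disallow: /urllist.txt
```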
So I'm looking for the simplest, smartest solution. Thanks to anyone who takes the time to help me.