I just hope I posted this question in the right category.
If you are doing international SEO for one site (e.g. www.website.com) serving multiple markets, and the local sites use a subfolder structure, how would you set up and maintain the robots.txt and sitemap.xml files for each of these local sites?
Here's how I would approach it. First, be sure to verify each subfolder as a separate "website" in Webmaster Tools.
You only need a single robots.txt, served from the domain root. You can also economize on your Disallow rules by using pattern-matching wildcards - in particular the "*" character. If the same URL pattern appears in more than one subfolder, a single wildcard rule can cover all of the prohibited files at once.
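For example, assuming a hypothetical /private/ section that exists under each country subfolder (the paths here are just placeholders), one rule covers them all:

    User-agent: *
    Disallow: /*/private/

Note that the "*" wildcard is an extension honored by the major search engines, not part of the original robots.txt standard, so very old or obscure crawlers may ignore it.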
For the XML sitemap, you could do the same thing (just use one sitemap at the domain root), but for ease of maintenance I would probably use a sitemap index file at the root that lists only the locations of the individual sitemaps - one for each of the country-specific subfolders. Then list the location of that master sitemap.xml file in the robots.txt file.
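A minimal sketch of that setup, assuming hypothetical /us/ and /de/ subfolders - the master sitemap.xml at the root acts as a sitemap index:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.website.com/us/sitemap.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://www.website.com/de/sitemap.xml</loc>
      </sitemap>
    </sitemapindex>

Then a single line in robots.txt points crawlers at it:

    Sitemap: http://www.website.com/sitemap.xml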
[edited by: tedster at 4:23 pm (utc) on Nov 15, 2011]