The best place to start is by checking [sitemaps.org...]
Site maps (or "sitemaps" if you prefer) are primarily meant to help search engines find pages they might not otherwise find. Search engines will spider down through your site, but if your site contains a lot of sub-directories, e.g. www.example.com/directory1/directory2/example.htm, a site map will expedite the spidering of those deeper pages.
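For reference, a minimal sitemap in the sitemaps.org XML format looks roughly like this (the example.com URLs are just placeholders, and optional tags like lastmod and priority are left out):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/directory1/directory2/example.htm</loc>
  </url>
</urlset>
```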
If your site has good navigation, I suggest only placing key pages in the site map such as those in the root directory and the first level down, but that is merely my opinion. If, however, there are pages you do not want spidered, I suggest a robots.txt file to disallow those pages. That being said, do not put a URL in the site map that you subsequently disallow in the robots.txt unless the disallow is for a specific spider.
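To illustrate the robots.txt point above, here is a hedged sketch (the paths and the bot name are hypothetical): a blanket disallow means those URLs should not appear in your sitemap at all, while a disallow aimed at one specific spider can coexist with the URL being listed for everyone else.

```
# Hypothetical: keep /private/ out for all spiders.
# Per the advice above, URLs under /private/ should NOT be in the sitemap.
User-agent: *
Disallow: /private/

# Spider-specific disallow ("ExampleBot" is a made-up name): URLs blocked
# only for this one bot may still be listed in the sitemap for the others.
User-agent: ExampleBot
Disallow: /archive/
```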
And there are sites that will generate site maps for you, some of which let you set parameters. Either way, once the map is generated, you can edit it by hand as needed.
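If you'd rather roll your own than use one of those generator sites, a basic sitemap is simple enough to build with a short script. Here's a sketch in Python using the standard library; the URL list is a hypothetical stand-in for however you enumerate your key pages.

```python
# Minimal sitemap generator sketch. The URL list below is hypothetical;
# in practice you'd pull it from your CMS, filesystem, or a crawl.
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemaps.org-format <urlset> document as a string."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    pages = [
        "http://www.example.com/",
        "http://www.example.com/directory1/directory2/example.htm",
    ]
    print(build_sitemap(pages))
```

Save the output as sitemap.xml in your root directory, and you can still hand-edit it afterwards, exactly as you would with a generated one.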