Msg#: 3527383 posted 7:31 am on Dec 14, 2007 (gmt 0)
If a site has lots of deep content, and especially if that content changes rapidly, with new urls appearing and old ones disappearing at a fast pace, then an automated XML sitemap that pings Google every time it's updated can be a big help toward more complete indexing.
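For anyone who wants to automate that, here is a minimal sketch in Python. The example.com urls, the output path, and the url list (which you'd really pull from your own database) are placeholder assumptions; the ping endpoint is Google's standard sitemap ping url, and the XML follows the sitemaps.org 0.9 schema.

# Minimal sketch: rebuild sitemap.xml from the current list of urls,
# then ping Google so it knows to re-read the file.
# example.com, the file path, and the url list are placeholders.
import urllib.parse
import urllib.request
from datetime import date
from xml.sax.saxutils import escape

SITEMAP_URL = "http://www.example.com/sitemap.xml"      # where the file is served from
PING_ENDPOINT = "http://www.google.com/ping?sitemap="   # Google's sitemap ping endpoint

def write_sitemap(urls, path="sitemap.xml"):
    # Write a sitemaps.org 0.9 file listing the current urls.
    today = date.today().isoformat()
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc><lastmod>{today}</lastmod></url>"
        for u in urls
    )
    xml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
           f'{entries}\n'
           '</urlset>\n')
    with open(path, "w", encoding="utf-8") as f:
        f.write(xml)

def ping_google(sitemap_url=SITEMAP_URL):
    # A 200 response only means the ping was received, not that
    # every url will be crawled or indexed.
    url = PING_ENDPOINT + urllib.parse.quote(sitemap_url, safe="")
    with urllib.request.urlopen(url) as resp:
        return resp.status

if __name__ == "__main__":
    current_urls = ["http://www.example.com/",
                    "http://www.example.com/articles/new-page.html"]
    write_sitemap(current_urls)
    ping_google()

Hook that into whatever script adds or removes urls, and Google hears about changes right away instead of whenever googlebot next happens to recrawl the old pages.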
For a smaller site, or even for a bigger site that does not change often, my preference is not to use a Google sitemap -- but rather to see if the existing site structure is presenting indexing problems to googlebot. I want to know about and fix that kind of trouble, so I can be confident that the url structure and navigational links are well designed. If a url gets indexed only because of an XML sitemap, then it can be harder to spot this kind of issue.
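To make that check concrete: one way to look for this kind of trouble, rather than papering over it with a sitemap, is to compare the urls you know about against what googlebot actually requests in your access logs. A rough sketch, assuming an Apache-style combined log; the log path and the url list are hypothetical stand-ins for your own:

# Which of my urls has googlebot actually fetched?
# Assumes an Apache combined log; path and url list are placeholders.
import re

LOG_PATH = "/var/log/apache2/access.log"
KNOWN_URLS = {"/", "/about.html", "/articles/deep-page.html"}

request_re = re.compile(r'"GET (\S+) HTTP')
googlebot_hits = set()

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" in line:          # crude user-agent check, good enough here
            m = request_re.search(line)
            if m:
                googlebot_hits.add(m.group(1))

for url in sorted(KNOWN_URLS - googlebot_hits):
    print("googlebot has not fetched:", url)

Any url that keeps showing up in that output is a candidate for a navigation or internal-linking problem that's worth fixing directly.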
Msg#: 3527383 posted 5:03 am on Dec 19, 2007 (gmt 0)
I'm sorry I misunderstood your question.
I basically agree with Ted.
The sitemaps you're referring to have been great for my websites. I added one to a site that wasn't getting crawled properly, and within a short time it was ranking in the top three for all my keywords. It's number one for the most popular search term, and members are now joining faster than I can keep up with them.
On another of my websites I don't use sitemaps at all, yet it still ranks number one for its most popular search terms.
Msg#: 3527383 posted 5:22 am on Dec 27, 2007 (gmt 0)
I agree with the other members: a Google sitemap is very useful. I update my XML sitemap weekly, since the site has a few pages that get updated on a regular basis. It also lets Google, or any other search engine that reads sitemaps, know about changes to your site.
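If the regeneration is scripted, a weekly schedule like that is one crontab line away. The script path and the timing here are made up for illustration:

# rebuild the sitemap and ping Google every Monday at 06:00
0 6 * * 1 /usr/bin/python3 /var/www/scripts/regenerate_sitemap.py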