The folks over at Vertical Leap ran an experiment. They wanted to know whether search engine bots actually use their XML sitemap to find pages.
We recently undertook a little SEO experiment of adding a new page to the Vertical Leap website, with no links from any navigation anywhere, with the only reference to the page placed in the website's XML sitemap. We were trying to find out if the Vertical Leap domain strength, coupled with the XML sitemap reference, would actually be enough to get the page indexed.
We didn't necessarily expect the page to be indexed, but we thought it was an interesting exercise to see if the XML sitemap could do this.
After all, the XML sitemap is designed to tell the search engines about all the pages that you want them to index – so if you add a page here, you are requesting that this page is indexed irrespective of where it sits in the site structure.
A month on from the start of this experiment, and nothing – the page hasn't got any visibility in any of the search engines.
Vertical Leap does not mention whether they submitted this sitemap to Google or Bing via their webmaster tools. But they do allude to referencing the sitemap in their robots.txt file, which is the method prescribed by the spec [sitemaps.org]. Their account also omits details that would help interpret the result: what the page looked like, whether it had outbound links, their normal crawl rate and indexing latency, whether bots had requested the robots.txt or sitemap.xml recently, etc.
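For reference, the sitemaps.org protocol expects the sitemap to be announced with a Sitemap directive in robots.txt, and the sitemap itself to list each page in a urlset. A minimal sketch of both files might look like the following; the example.com URLs are placeholders, not the actual Vertical Leap files:

  # robots.txt -- the Sitemap directive tells crawlers where to fetch the sitemap
  Sitemap: http://www.example.com/sitemap.xml

  <?xml version="1.0" encoding="UTF-8"?>
  <!-- sitemap.xml: one <url> entry per page you want the engines to know about;
       <loc> is the only required child element -->
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/orphaned-test-page.html</loc>
    </url>
  </urlset>

Note that an entry like this is a hint, not a command: the protocol leaves it to the search engines to decide whether a listed URL actually gets crawled and indexed.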
Though I feel they may be jumping to premature conclusions about the efficacy of the sitemap protocol, the experiment is interesting and worth repeating, albeit more scientifically: with a larger sample of pages, more deliberate metrics, and a control group.
I'd love to hear from other webmasters who have experienced similar or contrasting results.