Forum Moderators: Robert Charlton & goodroi

Final word on Google Sitemaps for old sites

Helpful, not helpful, or dangerous?

         

pdivi

3:18 pm on Jul 25, 2007 (gmt 0)

10+ Year Member



Background: I have a couple of old sites that are frequently visited by the googlebot. The link structure on the sites is designed so a spider can make its way through to every page, and there is a link to an html sitemap in the site footer on every page. Most of the pages on the sites are indexed on G, though about 80% are supplemental.

Question: The Google XML sitemap builder I use gobbles bandwidth, so I've laid off updating my Google sitemaps for the well-indexed sites. Will not updating sitemaps harm me?
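For what it's worth, a sitemap file doesn't have to come from a crawler at all - if you already have the list of urls (say, from your CMS or the html sitemap), a few lines of script can write the XML directly, with no bandwidth hit. A minimal sketch using Python's standard library (the urls and dates below are made up for illustration, not from any real site):

```python
# Sketch: build a sitemaps.org-format sitemap.xml from an already-known
# list of urls, so no pages need to be re-fetched. The example urls and
# lastmod dates are hypothetical placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """pages: iterable of (url, lastmod 'YYYY-MM-DD') tuples."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap([
    ("http://www.example.com/", "2007-07-25"),
    ("http://www.example.com/page1.html", "2007-06-01"),
])
print(xml_out)
```

You'd write the result to sitemap.xml in your web root and resubmit it; the point is just that updating a sitemap only costs bandwidth if the tool rebuilds its url list by crawling.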

I've actually seen some claims of rankings tanking after sitemap updates, so I'm also wondering if not updating might actually help me.

tedster

5:26 pm on Jul 26, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'd say hoping for the "final word" here is hoping for too much. Only a Google crawl team member can say for sure - and even then, any information would change over time and still not be final.

But I would suggest either keeping your xml sitemap updated or removing it. The service is supposed to help Google find urls that may not be crawled otherwise. But that's about crawling, with no promises made about indexing and ranking -- very different functions done on Google's back end after the crawler has retrieved the urls.

Given this, I don't really see how an out-of-date sitemap could help with ranking or supplemental issues, unless it keeps spurious urls away from Google's knowledge. Since your link structure is providing a path to every url, that seems unlikely.