Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Forcing a URL update with Google Sitemaps?

Should old 301'ed URLs be included to force an update in Google's index?


erichazann

3:55 pm on Jul 17, 2007 (gmt 0)

10+ Year Member



We've recently changed the structure of some URLs, from

www.example.com/index.php?tag=Blah

to

www.example.com/tag/Blah

This is using WordPress. WP accepts the old URL as well as the new, but we only want to promote the use of one.

We will be 301'ing the old URLs to the new ones so that the old ones are dropped from Google's main index.
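For anyone setting this up, here is one way the 301s could be written. This is only a sketch assuming Apache with mod_rewrite enabled; the `tag` query parameter and `/tag/Blah` path come from the example above, and your WordPress install may need different rules:

```apache
# Sketch: 301-redirect the old query-string URLs to the new permalinks.
# Assumes Apache with mod_rewrite enabled; adjust for your setup.
RewriteEngine On
# Match requests like /index.php?tag=Blah and capture the tag value
RewriteCond %{QUERY_STRING} ^tag=([^&]+)$
# Send a permanent (301) redirect to /tag/<value>, dropping the query string
RewriteRule ^index\.php$ /tag/%1? [R=301,L]
```

The trailing `?` in the substitution strips the old query string so it doesn't get appended to the new URL.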

The old URLs are in Google's index. We are preparing a Google Sitemap to submit. My question is: should we include the old URLs in the Google Sitemap to force Google to see the 301 redirects? If so, should we leave the new URLs out of the Sitemap (at least until we are sure the old ones have been dropped)? It seems to me that including both would look like duplicate content.
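For reference, a minimal sitemap entry for one of the new URLs would look something like this (following the sitemaps.org protocol; the `<lastmod>` date is illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the new, canonical form of each URL -->
  <url>
    <loc>http://www.example.com/tag/Blah</loc>
    <lastmod>2007-07-17</lastmod>
  </url>
</urlset>
```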

Thx for any help!

tedster

7:22 am on Jul 18, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The sitemap doesn't "force" anything. If Google already knows about a URL, it will re-crawl it on some schedule or other. I would suggest you clean the sitemap of the old URLs once the 301s are in place.

erichazann

2:34 pm on Jul 18, 2007 (gmt 0)

10+ Year Member



I thought that pinging the Google servers would have it come crawl the Sitemap. My concern was that with the 301s in place and the old URLs not in the sitemap, Google would crawl the sitemap and find the new URLs, but never see the 301s until it naturally recrawled the old URLs. So it won't find the sitemap until it comes back on its own? If that is the case, then yes, I would not include the old URLs in the sitemap. I was thinking I could make it see the 301s sooner by putting the old URLs in the sitemap.
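For what it's worth, the "ping" is just a request to a Google URL that names your sitemap; it prompts Google to fetch the sitemap itself, not to immediately recrawl every URL listed in it. A minimal sketch of building that request, assuming the ping endpoint Google documented at the time and a hypothetical sitemap location:

```python
from urllib.parse import urlencode

def build_sitemap_ping_url(sitemap_url):
    """Build the Google sitemap ping URL (endpoint as documented circa 2007)."""
    query = urlencode({"sitemap": sitemap_url})
    return "http://www.google.com/ping?" + query

# Example with a hypothetical sitemap location; fetch the resulting URL
# (e.g. with urllib.request) to actually notify Google.
ping = build_sitemap_ping_url("http://www.example.com/sitemap.xml")
print(ping)
```

The sitemap URL is percent-encoded so it survives as a single query-string value.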

tedster

4:07 pm on Jul 18, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I can't say I'm 100% correct - I don't have any special hotline to googlebot's crawl logic. Your plan might save a day or two.

But I think the critical factor is not the crawl so much as the calculations Google does on its back end afterward. The whole question may be moot - either approach may end up giving you the same result. Just be sure not to leave the old URLs in the sitemap for too long if you do go down that route.