Forum Moderators: Robert Charlton & goodroi
My site shows 15,300 results in Google, but I only have about 2,000 pages! :-o
So I was wondering: would it help to keep the old Sitemaps together with the new one, so that Googlebot spiders all the old URLs and, by some miracle, figures out they have been redirected?
Or should I submit only the new Sitemaps and let it work out what happened to the old links on its own?
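For context, the old URLs are 301-redirected server-side. Since the site runs classic ASP, the redirect on each old page looks roughly like this (example.com and the page names are placeholders, not my real URLs):

```asp
<%
' old-page.asp — tell Google the page has moved permanently
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/new-page.asp"
Response.End
%>
```

As I understand it, Googlebot only discovers these 301s when it actually requests the old URLs, which is why I'm unsure whether keeping them in a Sitemap speeds that up.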
Any danger of a duplicate content penalty? (Probably not, since I already have the non-www and www versions indexed, plus the non-default.asp and default.asp versions, and who knows what other combinations are in those 15,300 results ;-) )
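In case it matters, what I mean by "new Sitemaps only" is a file listing just the live, canonical URLs — one version of each page, with the www host I want indexed. A minimal sketch (again with placeholder URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- canonical www host only; no default.asp duplicates of "/" -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-05-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/products.asp</loc>
    <lastmod>2006-05-01</lastmod>
  </url>
</urlset>
```

The question is whether the old, now-redirected URLs belong in there too, or whether listing them alongside the new ones just adds to the duplicate mess.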