Preparing a major site relaunch for a site that has been around for about 6-7 years. The site ranks extremely well for relevant keywords, both generic and widget-specific. For years, the main argument against a redesign and relaunch was to not "disturb the beast", so to speak, and risk impacting the SERPs. However, due to the aging design and the difficulty of managing content, it was decided that the site needed a face lift and overhaul. Another goal was to increase conversions through better content.
Anyway, here is a non-exhaustive list of the steps I have taken to prevent (to whatever extent possible) a drop in the SERPs. Feedback on additional steps (or steps that should *not* be taken) is appreciated.
* Redesigned link structure with keyword-rich URLs, organized in a logical manner
* .htaccess-based 301 redirects from _all_ old internal URLs to their new counterparts
* XHTML 1.0 Transitional validated pages (I know it's basically HTML 4.01, but IE doesn't handle XHTML served as application/xhtml+xml well..)
* CSS validated style sheets included as files
* Compliant robots.txt file (nothing changed here..)
* Sitemap; linked from the footer of every page
* Custom 404 page with sitemap links
* XSLT compliant Google sitemap.xml file (for submission to G!)
* Meta tags/titles: the old site had unique titles/meta tags on some product pages, while most other pages used globals. Old internal pages that rank well retain their current unique meta tags; other internal pages are given new, optimized ones
* rel="nofollow" on 'Print View' links etc. that could otherwise trigger duplicate-content filters..?
* Extensively rewritten and expanded site content with unique and grammatically correct copy.
* Before relaunch, slowly rolled out some of this new content so as not to dump too much new content at one time, since the relaunch itself has to be site-wide.
* Carefully matched old to new URLs with 301 redirects, and slowly added corresponding pages to the current site for otherwise-new URLs (to avoid triggering sandbox/tarpit behavior from SEs by dumping too many new links at once)
* Keep track of the major SEs' IP ranges so their crawlers don't get caught in the "bad bot" spider trap
* New internal site search feature (not sure if this would have any impact, positive or negative?)
* Division of the 'Links' section into subcategories, but maintaining current outgoing links.
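To make the 301 and custom 404 steps above concrete, here is a minimal .htaccess sketch. The paths and domain are made-up examples, not the real site's URLs:

```apache
# Permanent (301) redirects from old internal URLs to their new
# keyword-rich counterparts -- one line per mapping (hypothetical paths).
RedirectPermanent /oldpage.php http://www.example.com/blue-widgets/
RedirectPermanent /catalog/item42.html http://www.example.com/red-widgets/large/

# Serve the custom 404 page (the one carrying sitemap links) for
# any old URL that slipped through the redirect mapping.
ErrorDocument 404 /notfound.html
```

RedirectPermanent comes from mod_alias; for pattern-based rewrites of whole URL families, mod_rewrite's `RewriteRule ... [R=301,L]` would do the same job.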
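For the Google sitemap.xml step, a small generator sketch in Python. It assumes the sitemaps.org 0.9 schema and a hypothetical list of (URL, lastmod) pairs; adjust to whatever schema version G! expects:

```python
import xml.etree.ElementTree as ET

# sitemaps.org 0.9 namespace -- an assumption; older Google sitemaps
# used a google.com/schemas namespace instead.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal sitemap.xml string from (loc, lastmod) pairs."""
    ET.register_namespace("", NS)  # emit NS as the default xmlns
    urlset = ET.Element("{%s}urlset" % NS)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url, "{%s}loc" % NS).text = loc
        ET.SubElement(url, "{%s}lastmod" % NS).text = lastmod
    return ET.tostring(urlset, encoding="unicode")
```

Feed it the same URL list the 301 map targets, write the result to /sitemap.xml, and submit that to Google.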
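On keeping the major SEs out of the bad-bot trap: instead of hand-maintained IP ranges, an alternative is the reverse-plus-forward DNS check. A Python sketch; the hostname suffixes listed are assumptions to adjust per engine:

```python
import socket

# Hostname suffixes assumed for the major engines' crawlers -- verify
# and extend these for whichever SEs you actually whitelist.
SE_SUFFIXES = (".googlebot.com", ".search.msn.com", ".crawl.yahoo.net")

def hostname_is_search_engine(hostname):
    """Pure check: does a reverse-DNS hostname belong to a major SE?"""
    return hostname.endswith(SE_SUFFIXES)

def is_search_engine_bot(ip):
    """Reverse-then-forward DNS check (makes network calls)."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]
    except socket.herror:
        return False
    if not hostname_is_search_engine(hostname):
        return False
    # Forward-confirm the hostname resolves back to the same IP,
    # so a spoofed PTR record can't impersonate a crawler.
    return ip in socket.gethostbyname_ex(hostname)[2]
```

Whitelisted requests skip the trap logic; everything else that hits the trap URL still gets banned as usual.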
Hopefully this helps others in the same situation..