Msg#: 4005306 posted 9:51 pm on Oct 11, 2009 (gmt 0)
I have a client with one website that has ranked very well in the search engines for several years now. They serve a specialized market, but over the years they have diversified into related fields and built at least six other websites for those markets. Some of them rank okay, but not nearly as well as the original.

Recently they decided to merge all of the websites into one "superstore" where customers can shop at any of the sites and check out with a single shopping cart. Each site still has its own domain, but they all share a session, similar to the setup at gap.com.

Then they ran into an issue where the "one" site with great rankings suddenly dropped from the engines. It turned out the code that allows customers to share a session also prevented bots from crawling the site. They patched it with a whitelist program for bots, and now that the bots are back the site is making a comeback. But by only allowing whitelisted bots to access the site, they are obviously limiting their exposure: the big engines are covered, but they may be missing others that could help them too.

The programmer came up with a solution that shares the session without excluding any bots, but it requires converting all of the domains to subdomains. If they make the switch, I've recommended using 301 redirects from the top-level domains to the subdomains and focusing heavily on branding of the websites so that customers are not confused.

However, I'm not sure switching is the right move. I know they have good content and the means to keep ranking well, but I don't know how badly their "one" strong domain may or may not suffer from switching from a top-level domain to a subdomain.

Sorry to be so lengthy, but I wanted to cover as much as I could, and I've never run into this situation before. Thanks for any advice you can provide.
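For anyone following along, here is a minimal sketch (in Python, just for illustration) of the two mechanisms described above: a user-agent whitelist check like the patch they're running now, and a 301 redirect map from the old top-level domains to the new subdomains. The domain names and the whitelist entries are hypothetical, not the client's actual setup.

```python
# Hedged sketch only; domains and whitelist entries are made up.

# The patch described above: only whitelisted crawlers get through.
# Any crawler NOT on this list is shut out, which is the exposure problem.
BOT_WHITELIST = {"googlebot", "bingbot", "slurp"}  # hypothetical list

def bot_allowed(user_agent: str) -> bool:
    """Return True if the user agent contains a whitelisted bot token."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_WHITELIST)

# The recommended switch: permanent (301) redirects from the old
# top-level domains to the new subdomains, so the engines know the
# move is permanent and can transfer the old domains' rankings.
OLD_TO_SUBDOMAIN = {
    "widgets-example.com": "widgets.superstore-example.com",
    "gadgets-example.com": "gadgets.superstore-example.com",
}

def redirect_for(host: str, path: str):
    """Return (301, new URL) for a request to an old domain,
    or None if the host isn't one being redirected."""
    host = host.lower()
    if host.startswith("www."):
        host = host[4:]
    target = OLD_TO_SUBDOMAIN.get(host)
    if target is None:
        return None
    return (301, f"https://{target}{path}")

print(bot_allowed("Mozilla/5.0 (compatible; Googlebot/2.1)"))
# -> True
print(redirect_for("www.widgets-example.com", "/cart"))
# -> (301, 'https://widgets.superstore-example.com/cart')
```

The key point the sketch tries to show: the whitelist silently drops everything it hasn't heard of, while the 301 mapping preserves the deep path so old URLs land on the equivalent new page rather than the homepage.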