Forum Moderators: Robert Charlton & goodroi


Moving old site to a brand new (never existed) domain

         

1script

5:19 pm on Mar 27, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi all,

I've read quite a few posts here (including very recent ones) that deal with moving an existing site to a new domain. In most cases, "new" means "new to the current owner", and the whole process is undertaken in hopes of leveraging the former rankings of the "new" domain.

I have a situation here that looks like the opposite of that: I have a site on an established domain. The domain name is rather long and, most importantly, no longer reflects what the site is about - after years of development, one of its formerly small categories became the major theme of the site.

I have registered a brand-new, never-before-registered name that is a tad shorter and better describes what the site's about. I expect it to be an advantage in future promotion efforts.

Anyway, my line of thinking was to simply redirect anything and everything from the old domain to the new one via 301, using a properly formatted .htaccess or maybe even at the registrar level (if I can pass URI requests - I have to check on that first). The project also involves a (modest) change to the template and adding new sections (directories), but that can all be done later, after the move is complete and everything has settled.

But, after reading other posts here [webmasterworld.com] I feel that I respect tedster's opinion too much to ignore the fact that he put the type of move I envisioned just one step higher than the bottom of the list:


These are the kinds of details that can make a difference:

1. URLs and content remain stable for many weeks
2. URLs remain the same, but some new content is added to the pages
3. URLs remain the same, but all new content is created for those pages
4. Existing URLs remain the same, but new URLs and text content are created
5. Existing URLs remain the same, but new URLs and content with outbound links are added
6. Both content and URLs change, but the domain still remains the same
7. After a waiting period of several weeks, existing content and URLs are redirected to a different but established domain.
8. Domain name is immediately redirected to a different domain, and all legacy content is also moved
9. Domain name is immediately redirected to a different domain, and all legacy content is just gone


Of course, I'm not trying to leverage rankings of the "new" domain, since it doesn't have any. I would be very concerned, though, about damaging any rankings of the existing domain.

So, do you guys think tedster's list would be sorted differently in this kind of situation?

P.S. I should also mention this: while I was in the middle of planning the move, making the new template etc., the old domain's rankings plummeted really hard. No changes related to the move have been made yet - just an unhappy coincidence. Should I delay the move until I get the old domain's problem sorted out?

tedster

7:19 pm on Mar 27, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



As you point out, I wrote up that list about keeping or using the ranking power of a "new-but-established" domain. Yes, in my opinion the process is different if you are simply trying to move an existing website to a brand new domain, rather than leverage the existing power of a newly purchased domain.

The main suggestion I have is not to change anything else at the same time that you are changing the domain. Just move the existing pages to use the same file paths on the new domain name, and 301 redirect from the old domain on the individual URL level. That should be a very simple redirect rule.
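For illustration, a minimal version of such a rule in .htaccess (a sketch, assuming Apache with mod_alias; new-domain.example is a placeholder):

```apache
# .htaccess on the old domain - mod_alias maps every request
# to the same path on the new domain, preserving query strings
Redirect permanent / http://new-domain.example/
```

Because mod_alias `Redirect` does prefix matching, the remainder of each requested path is appended automatically, giving the individual-URL-level 301s described above with a single line.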

Then wait a few weeks until you see that the new URLs are well indexed before changing the templates. Of course if you are using absolute addresses for any internal links (including css and scripts) then those addresses need to change, but nothing else.

During that waiting period, notify as many other sites that link to you as possible about the change of domain name. Also, I've heard that filing a change of address within webmaster tools can help - I've had no direct experience with that.

Many people here report only a few weeks drop in traffic (or even less) when following these steps. The key, IMO, is taking things one step at a time. Let Google digest just the change of domain name before changing anything else.

TheMadScientist

7:40 pm on Mar 27, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think, in addition to what tedster suggests, it has also been suggested by MC (I think - if not him, some other G employee) to move the site one directory at a time to minimize the impact, rather than redirecting everything at once.

1script

9:22 pm on Mar 27, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



@TheMadScientist:

Thanks for your reply, TMS. Moving in small chunks sounds a lot like extracting a tooth by pieces :) It would be a logistical nightmare for a database-driven site that uses lots of URL rewriting.

I am, however, even more concerned about both sites being live at the same time. A part of my plan was to use WMT -> Site Configuration -> Change of Address but if both sites are live at the same time, I won't be able to use that, I guess.

Does anyone have a success story like that to share? Did the WMT address change request help?

TheMadScientist

9:56 pm on Mar 27, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I am, however, even more concerned about both sites being live at the same time.

I actually work with databases quite a bit, so, I would plan on it, and here's what I would probably do in your situation.

1.) Make sure the DB you are currently using allows you to connect from the new site.

2.) Create the new site and have the whole thing working and 'live', so yes, both versions of the site would be accessible to visitors, but they would run off a single database. (Edited: I should probably more accurately say: all information would be available to visitors on newsite.com, but there would be a 'constantly decreasing' amount of information available on oldsite.com)

3.) NoIndex the entire new site or disallow it in robots.txt, so to search engines the sites and information are 'split'.

4.) Redirect one directory at a time from the old site to the new site and remove the corresponding noindex or robots.txt directive.

5.) When the move is complete transfer the DB to the new site DB and set your 'change of address' in WMT to make it 'officialer' (if I can say officialer, that is).

This keeps you from having to update two different DBs in unison or from having to 'split' the information available to visitors on the new site... When a visitor visits a 'new site redirect' link they will see the information they are looking for on the new site and have all the information available to them on the new site, so the oldsite.com would have a 'dwindling' amount of information, but visitors would be able to find all resources on the new site.

If you keep the architecture the same when users hit a 'redirected' directory, they will see the same links and resources on the new site as they do on the old site, and will be able to fully surf and view the new site...
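For illustration, steps 3 and 4 might look like this in the old site's .htaccess plus the new site's robots.txt (a sketch with placeholder directory and domain names):

```apache
# .htaccess on oldsite.com - directories already 'pulled' get redirected
# path-for-path to the new domain; everything else still serves normally
RewriteEngine On
RewriteRule ^widgets/(.*)$ http://newsite.example/widgets/$1 [R=301,L]
RewriteRule ^gadgets/(.*)$ http://newsite.example/gadgets/$1 [R=301,L]
```

```
# robots.txt on newsite.example - initially lists every directory;
# delete a Disallow line at the same time its redirect goes live
User-agent: *
Disallow: /widgets/
Disallow: /gadgets/
Disallow: /doohickeys/
```

Each 'pull the trigger' step is then just adding one RewriteRule on the old site and deleting the matching Disallow line on the new one.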

You might even set a cookie based notice for first time visitors to newsite.com who have a referrer of oldsite.com with a notice like:

'Hi, Welcome to the new home of 'widget world' we are currently in the process of migrating from the site you were visiting to our new home, Right Here! During this transition you will only be able to find limited information on oldsite.com, but don't worry, all our information is still available to you right here on newsite.com... Please feel free to bookmark newsite.com right now by clicking "this link".

Thanks for visiting and enjoy newsite.com!'

I would also be all over the search bots, and as soon as they spidered a directory and the SERPs started to reflect the change (I don't know if I would even wait until the change is 100% complete in this situation, so it might only be a couple of days to a week I would actually wait between directories, depending on spidering frequency, etc.) I would then redirect the next directory... Basically, I would probably go with: spidered, old SERPs are disappearing, 'pull the trigger' on the next redirect, and keep them as close together as possible...

1script

10:25 pm on Mar 27, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thank you for your detailed answer, TMS. So, having them both live at the same time is part of the plan. What amount of time would you say you'd allow between when the first directory is redirected and when the "change of address" request is filed at the end of the process? The directories hold different numbers of URLs (both URLs and directories are virtual in this case because everything is in one DB). I'm assuming the aim is to have an approximate "move time per URL" and space directory moves accordingly, right?

I'm guessing the time lag between directories is for Googlebot to crawl them on the new site, so you split your "crawl share" between the two sites? I guess this is where you're losing me: what's the value of having Googlebot crawl those parts of the old site that don't serve 301s yet?

On a large site, once Googlebot has visited a page, it may not come back for a few months, so I'd think you want to "cram" in as many of those 301 redirects as possible, as soon as possible, so it knows to fetch each page from the new site. Otherwise it may only learn about a redirect on its next visit, in a few months, all the while the old page is hanging around the index. Should this be of any concern?

TheMadScientist

10:45 pm on Mar 27, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



what's the value of having Googlebot crawl those parts of the old site that don't serve 301s yet?

They stay (about) right where they are in the SERPs... There's no change to those, and the timing of events from start to finish, for me, would be totally Bot Dependent. As soon as a majority of pages, or even 'key pages' within a directory are crawled and the SERPs start to reflect the change I would move the next directory. What this does is help keep 'moving shock traffic drop' to a minimum.

It's been reported here that search engines (Bing especially) can at times take a while to process redirects through the system and have them reflected in the SERPs, so rather than moving all the pages on a single day and maybe waiting (up to) 3 weeks to see the new locations replace the old ones in the SERPs, you can maintain a 'more consistent' level of overall traffic to the sites.

I would personally not worry too much about infrequently spidered pages, because they are probably not the 'substance' and 'traffic drivers' either, so if there were a few 'danglers' not spidered I wouldn't worry about it too much, because until they are spidered there will probably not be a significant change in the SERPs related to those pages either.

One of the reasons this method was suggested (to the best of my memory) was so you don't make the change on the entire site today, have all the old URLs removed from the SERPs and then have to wait for the new URLs to replace them over the next 2 days to 3 weeks...

Changes in the SERPs are not 'instant', so it might be that your redirected pages get spidered today, start to drop lower in the SERPs tomorrow, and in three or four days (or more) are replaced by the new locations. During this time, terms with pages 'on the way out' and a new set 'on the way in' won't send the normal amount of traffic, so by staggering the move you keep the overall number of URLs in the SERPs higher and avoid risking 'a day without traffic', or something to that effect.

I haven't done this 'site to site' specifically, but I have done it 'intra-site' when changing a URL structure, and although there was a % drop in traffic overall, it was nowhere near what it could have been (IMO) if I had had 3 to 10 days of very limited URLs in the SERPs...

Basically, when redirecting large portions of a site I 'rotate through' the directories at (generally) about 1 per week, so I might take a traffic hit on one directory the first week, then the second week I'll add another, but the first is starting to recover and so on...

Another thing you might want to do (I just thought of this, and I think I've heard it reported that it's possible) is designate specific directories as sites in WMT so you can manage them individually, which should make it possible to mark them as 'moved to' there as you apply the redirects.

Another thing staggering does, IMO, is ensure that if anything goes wrong, you don't have both sites entirely dropped from the SERPs at the same time... When you see the new URLs beginning to replace the old ones, in my experience, you can fairly reasonably determine the move is going well and move on to the next directory.

Anyway, that's probably how I would handle it, YMMV, and I don't know what would happen if you redirected everything at once from experience, because it's not anything I've ever tried... Any large URL changes I've applied have been on a directory by directory basis for years, and it's been quite a while since I've had to apply large scale redirects to any site, so I can't say for sure exactly how things are currently being handled.

The general, underlying reasoning for this type of move has nothing to do with whether the move will be effective or where the new URLs will eventually rank after the move is complete... It's all about minimizing the traffic drop during the move, and by spacing the move out and letting SEs find it in sections the traffic drop is, in my experience, spaced also, so you might take a % hit on the overall traffic to the sites, but it will be spread over time rather than a 'plummet and hope to recover soon' situation.

It's more about traffic management than anything, and the real timing I would use in making decisions on when to redirect each subsequent directory would be traffic and 'new URLs replacing the old' based... If I see the new URLs are replacing the old in 2 days, and traffic is constant I would make the moves closer together, but if I see traffic is dropping or the new URLs are not replacing the old quickly I might spread the timing out a bit...

Again, it's not about whether the redirection or move will be effective over time nearly as much as it is about managing the overall level of traffic to the two sites during the 'Search Engine adjustment and re-ranking period' for the new URLs (website).

ADDED:
Some of the timing I would use would probably depend on the specific situation and how many directories I had to move... If there were 50 I might move one a day for 50 days, and 'roll through' the transition rather than waiting as long between, so I would know for the next 60 days I'm probably not going to have my usual traffic to the sites, but the drop would be more likely to be a 'rolling % drop' than an all at once 'disappearance'.

For me it would really depend on the specific situation more than anything else, so it could be on a really large site I would simply make the decision to move one directory a day and get through the move quickly at the beginning, then I could adjust the timing as I saw the changes take effect and either bump it up to two directories a day or one every-other-day or something if things weren't going as smoothly as I would like.

Just remember: once you make a move like this, if you make it site-wide there's no 'undoing' it, but if you move in segments you have some time to watch and make decisions - which is why I would probably go from site to site the same way as I have intra-site. Ultimately, though, the decision is up to you and what you are most comfortable with.

aakk9999

2:17 am on Mar 28, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



We did it in "one hit" about 12 months ago. The site was about 3000 pages. We first made sure both domains had the same WHOIS, added the new domain to the same GWT account, and then did a 1:1 swap for each URL.

As tedster said, everything else remained unchanged - apart from the domain part of the URL, the rest of each URL was the same, and there was a page-to-page 301 from the old site to the new.

At that time the "Change of address" in GWT did not exist, so we did not do this.

From the SERPs point of view, there was a minor drop that lasted about a week. There was also a slight period of confusion: Google indexed the new domain and showed the new domain's URLs in the SERPs for about a week, then replaced them with the old ones for a week, but in the same ranking spots (almost as if asking: are you sure you're doing this?), then after about another week switched back to the new domain URLs, which then stabilised.

So our experience was good. I think that the same WHOIS and the same GWT helped.

TheMadScientist

2:31 am on Mar 28, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@aakk9999 thanks for the actual experience info...

On a size note, since I haven't addressed page count specifically: I wouldn't hesitate to move a site of the size you're talking about (possibly even up to 5000 pages) all at once either (unless it was not well spidered or something), but 3000 pages is actually smaller than most of the directories I was talking about moving within sites, and when you get to bigger sites (20,000+ pages) it can take quite a bit longer for a bot to visit them all, depending on PR, 'crawl budget', etc.

I think it's good you point out page count, though, because it certainly makes a difference in how fast the bots get to all of them, and it's not something I'd specifically mentioned at all.

Your 'one week or so to take effect' experience was right in line with the experience I had moving directories of similar size (3500 to 5000 or so pages) within a domain too. The bigger ones took a bit longer in my experience, up to about 10 days.

[edited by: TheMadScientist at 2:52 am (utc) on Mar 28, 2010]

1script

2:49 am on Mar 28, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



@aakk9999:

Thanks for sharing!

We first made sure both domains had the same WHOIS


My guess is that was un-protected WHOIS, right?

@TheMadScientist:

Dude, what an awesome response! I'm going to have to put off reading it all until tomorrow morning 'cause my brain is already fried. But yeah, size matters. I have to worry about approx. 1000 or so (formerly) ranked pages, but the totals differ between my directories, with the largest being 51,000 posts (approx. 70,000 pages, 'cause posts are paginated).

I don't suppose anyone has experience moving a site in the middle of its worst-ever Google traffic crisis? Like I alluded to before, while I was taking my sweet time planning the move, new design etc., 90% of my Google traffic suddenly disappeared on March 15th. What would you do - wait, or bite the bullet and move?

Thanks for sharing, everyone!

Cheers.

TheMadScientist

3:23 am on Mar 28, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



NP 1script...

Actually, I've been thinking about this one for a bit today because I've been waiting on some info to do some real work and I usually work on directories that are a bit bigger than 'average' so here are a couple more thoughts I'm having:

First, I added a bit to my previous post, but will reiterate: aakk9999's 'move to take full effect' time was right in line with my moving directories of a similar size within a site, so it seems to be about the same either within a site or from one site to another, and it should give you an idea of how to plan accordingly based on the number of pages per directory.

Second, something I was thinking about before I read your most recent post was noindexing (rather than using the robots.txt disallow as an option), submitting an XML Sitemap and letting the new site get spidered before making the move, so G would already have the pages spidered but not in the index, which might help the effects take hold sooner...

I'm not sure and haven't thought through all the logistics of it, but it's definitely something I would consider, because then rather than 'spider, find redirect, spider, dissect, categorize, weight, rank' you would be looking at 'spider, find redirect, find removal of noindex tag, add to index' or something to that effect.

Anyway, my thought was the overall 'change processing time' might be cut down if the new URL with the info is already 'known but not indexed' from what it would be if the redirect led to the initial discovery. Like I said, just a thought right now and not anything I've tried.
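For illustration, the noindex variant of this idea is just the standard robots meta tag in the head of each page on the new site, removed once the corresponding redirect goes live (tag shown as a sketch; 'follow' keeps link discovery going while the page itself stays out of the index):

```html
<!-- On newsite.com pages, while the old-site URLs are still the indexed ones -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt disallow, this still lets the bot fetch and read the pages, which is what makes the 'known but not indexed' state possible.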

##### # #####

In response to your most recent post, personally I'm not sure...

On an intra-site move, I would probably just make the change (maybe even the whole site at once) while traffic is down, because why not? It's the same domain; I can put a redirect in and be done with it, then try to find the issue, and I'm not really risking any traffic by doing it, which is the main reason I would spread it out otherwise - so why not?

On a new domain, I'm not sure... I might actually try to at least find the issue first. So let's pretend it's your posts being available on multiple individual URLs or something silly, and removing those from the index would regain your rankings: I might not fix the issue before the move, but rather have it corrected on the new domain from the start and then put the redirects in place.

The reason being that domain history plays into rankings, so if I could find the ranking issue with the current domain and correct it on the new one - so that not only was it a 'new domain' but also a 'correction' of the issue related to the old one - I think it might be better from a historical perspective, and it might be the direction I would go personally.

If the domain name was staying the same, then the history would be about the same regardless of the location of the content and 'there was an issue but the issue was corrected' would still apply... Basically, it is what it is, regardless of the URLs.

Moving to a new domain, though, I don't know if I would want to start it off with the same issue present. You have the chance to start the domain off 'clean' by either finding and correcting the issue before moving, or by finding the issue and correcting it on the new site before it gets spidered. Either way, the history starts off 'good' rather than 'with an issue' present, so I think I might try to at least figure out what's causing the issue before moving. That way, I could have it corrected when the move is made and start the domain off 'error free'...

1script

3:30 pm on Mar 28, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



@TheMadScientist:
Thanks again for your response. I have to admit I don't have your level of confidence about putting a noindex meta tag on a page that you actually do want indexed at some point in the future. In my experience (which probably stems from a non-optimal internal linking structure on my site), Googlebot does not come back to every single page often enough to re-read it and realize that it can now be indexed. Given that it might only come back to a particular page after a few months, noindexing might actually hinder the speed of the transfer, and I would not want to stretch the move out that long. No matter how well you plan it, it is still a painful and dangerous process, and I would be looking to wrap it up in, say, a month or two.

I will go and look more carefully at Googlebot's visitation patterns. It comes every day for sure, but the number of visits per day can vary wildly. If I can deduce the approximate time between visits to the same average page, it might help me understand what kind of staggered move duration we are talking about. I have a feeling I'm not going to like how long it is, but I need to see the data first.

##### # #####

Regarding moving during a Google traffic crisis: you make great points. I wish I could tell you it was duplicate URLs for sure, but I can't. I had many sites (10+) axed the same day, and every time it looks like I'm closing in on a potential problem, I find a site that was axed yet doesn't have that particular one.

However, you were spot on: the very site I'm planning to move did in fact have a multi-URL problem. Via a dumb programming mistake, the pagination code started to insert 'page 2', 'page 3' etc. into the URL, as opposed to the title only as intended. And the program that served the page accepted it either way. And so, a good number of posts (especially popular ones with many replies, which were paginated) became available on multiple URLs, sometimes 5 or 6 for the same page.

It was a "Eureka!" moment for me when I found it - I thought I'd found the reason for my troubles - until I realized that none of the other axed sites have that problem. Nevertheless, Googlebot is still coming for those bad URLs (now served 301s to the good ones), and I would love to fix that during the move.
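For illustration, that kind of cleanup 301 might look something like this in .htaccess - a sketch only, since the real URL pattern isn't shown; the `-page-N` segment here is a hypothetical stand-in for the accidental variants:

```apache
# Hypothetical pattern: the pagination bug produced /post-title-page-2/3/
# serving the same content as the canonical /post-title/3/;
# collapse the variants onto the canonical URL with a 301
RewriteEngine On
RewriteRule ^([a-z0-9-]+)-page-[0-9]+/([0-9]+)/?$ /$1/$2/ [R=301,L]
```

Serving these cleanups from the same ruleset as the domain move means the bad URLs never carry over to the new domain.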

So, yes, a bunch of variables to consider here. Thank you very much for the food for thought!

aakk9999

10:18 pm on Mar 28, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@1script

Yes, it was non-protected WHOIS.

Forgot to say - we did two more things:

1) The same day we did site-wide redirect, we also deleted sitemap.xml from the old domain.

2) Not sure how much difference it made, but we hosted both domains (old and new) on the same IP during the swap. Basically, we pointed the new domain at the same webspace. All requests would pass through the redirect module, and we had the redirect module detect whether the request was for the old domain or the new domain; if it was for the old domain, we served a redirect, and if it was for the new domain, we let it through to fetch the page.

Once we got new domain properly indexed and settled, then we moved hosting (about 6 months after the domain swap).
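For illustration, that host check could be sketched in .htaccess with mod_rewrite (placeholder domain names, assuming Apache):

```apache
# Both domains point at this same docroot; only requests arriving
# for the old host get a 301 to the same path on the new host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.example$ [NC]
RewriteRule ^(.*)$ http://new-domain.example/$1 [R=301,L]
# Requests for new-domain.example fall through and are served normally
```

The RewriteCond is what keeps the rule from looping, since requests for the new domain never match it.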

It would be worth bearing in mind that if you do everything in one hit, Google will find a number of pages on the new domain (by crawling the new domain) without yet finding the redirects on the old domain (which it finds when it crawls the old domain), hence for a while some pages would be duplicated between the domains. This sorts itself out once Google crawls the old domain's pages and finds the redirects.

I would hope that "Change of address" option in GWT would help you as you are signaling Google your intention up front (rather than leaving it confused as to what is happening when flooded with 70,000 pages that are exact duplicates of pages it already has in its index).

TheMadScientist

10:23 pm on Mar 28, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I would hope that "Change of address" option in GWT would help you as you are signaling Google your intention up front (rather than leaving it confused as to what is happening when flooded with 70,000 pages that are exact duplicates of pages it already has in its index).

That's what the NoIndex or robots.txt block prevents... I haven't had 1script's issue with NoIndex at all... I track bots on sites based on pages requested, and I can say that, to date, they usually spider my NoIndex pages with as much frequency and regularity as they do other pages, but they have not requested my robots.txt-disallowed pages, even though those are usually listed as URL-only in the results. I can also say that any time I've removed either of the blocks, the pages have returned to the index soon after... Note: usually, when removing a robots.txt block for a section of a site or 'directory', I will 'submit' the main page of the previously blocked section just to make sure they know something has changed...

1script

10:53 pm on Mar 28, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



@TheMadScientist:
Note: Usually, in removing a robots.txt block for a section of a site or 'directory' I will 'submit' the main page of the previously blocked section just to make sure they know something has changed...
Submit as in the old-fashioned "Submit your URL"? I didn't think that Google page was still alive...

TheMadScientist

10:59 pm on Mar 28, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Yep, it's still there and I occasionally still use it... ;)

I simply search for 'add url to Google' and it's always the #1 result, so the actual location may have changed at some point, but there is still a live URL submission page for Google (and Bing).