Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
Switching hosts - Is there a problem keeping the old site as a backup?
coachm
msg:4548405 - 7:55 pm on Feb 24, 2013 (gmt 0)

For a number of reasons, I've had to change hosts once in a while, but I tend to maintain accounts even when I move a domain away from a host.

So, basically, I'll keep two identical sites on different hosts, so if there's a problem with the one that's pointed to via nameserver I can easily switch over.

But it occurs to me that this might be a bad idea in terms of duplicate content, even though theoretically the old site isn't available via its domain name. Typically I use shared servers, without dedicated IP's, but is it possible Google will access both sites in spidering and penalize?

If so, what's the fix? Somehow secure the site I'm not presently using from google?

 

Andy Langton
msg:4548423 - 9:01 pm on Feb 24, 2013 (gmt 0)

Google uses the DNS just like everybody else, so other than the short period where there would be two separate DNS records for the same site (i.e. at the time you change the records to point to your new host), there is no duplication.

It's possible, of course, to manually request a domain name via an IP that does not appear in the DNS, but that would be pretty rare and is certainly not an activity that search engines undertake.

Of course, even if they did so, it would still be the same content under the same domain name so there would still be no possibility of issues.

The only possible problem is if you serve the site on the old host under something other than your domain name (e.g. via an IP) but then, if you have a proper canonicalisation procedure in place, this would redirect to the canonical host anyway.

I think this is one you shouldn't need to worry about. That said, checking for duplication is a good idea generally, and if you found an old host cropping up for some reason, add it to your canonicalisation procedures.
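For a static site on Apache, the canonicalisation procedure Andy describes is typically a mod_rewrite rule in .htaccess that bounces any non-canonical Host header (including a raw IP) to the canonical domain. A sketch, with example.com standing in for your own domain:

```apache
# .htaccess - redirect any request whose Host header is not the
# canonical domain (including requests made by raw IP) to the
# canonical host, preserving the requested path
RewriteEngine On
RewriteCond %{HTTP_HOST} !^example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

With this in place, a crawler that somehow reached the old host by IP would receive a 301 to the canonical domain rather than a duplicate copy of the content.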

netmeg
msg:4548425 - 9:05 pm on Feb 24, 2013 (gmt 0)

I generally password protect the duplicate (or dev or backup) copy. Just to be sure. Can be easily removed if I need to.
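On Apache this kind of password protection is a few lines of Basic auth in the backup copy's .htaccess. A sketch - the path and realm name are placeholders:

```apache
# .htaccess on the backup/dev copy - require a password so crawlers
# and stray visitors can't reach the duplicate content.
# The AuthUserFile path is a placeholder; keep it outside the web root.
AuthType Basic
AuthName "Backup copy - authorised users only"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

Create the credentials file with `htpasswd -c /home/example/.htpasswd yourname`, and simply delete these lines when you need to switch the backup live.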

Robert Charlton
msg:4548426 - 9:05 pm on Feb 24, 2013 (gmt 0)

When switching hosts and not changing domain names, I always keep the original site up until DNS has fully propagated. Google and users should see one site or the other, but not both.

If your .htaccess is set up properly, Google won't index the site by IP number, so that should not be a problem. I've kept an original site up for a month on the old host before taking it down with no apparent difficulties. Probably not much point these days keeping it up longer than a week.

The above is a simplified description for a static site, and assumes you've got access to the A-records of your domain. I never host my DNS with my web host, as that can create propagation problems. Some web hosts make it difficult to leave them. On a site with dynamic content, you also need to pay attention to keeping the data on both sites in sync.

creeking
msg:4548430 - 9:13 pm on Feb 24, 2013 (gmt 0)

Typically I use shared servers

You could get a reseller account to host your websites. This would allow you to suspend your own website, which would prevent both spidering and viewing.

Keep the duplicate sites, and suspend the one you are not using.

Sgt_Kickaxe
msg:4548447 - 10:08 pm on Feb 24, 2013 (gmt 0)

I do as netmeg suggests too: keep the files and hosting ready for a DNS switch if needed, but password-protect the files until then, and disable the database for good measure.

If your site updates regularly, however, you need to account for those updates on the backup server too.

deadsea
msg:4548472 - 12:05 am on Feb 25, 2013 (gmt 0)

I have worked on a very large website (top 500 with millions of pages) that keeps a hot spare of the website around at a different data center with a different IP address.

I would recommend configuring your spare to only respond to a request for the correct host name. You can do this by configuring a virtual host that is not the default virtual host. Have the default virtual host redirect to the host name. That way, the only way that this site is going to get accessed is when DNS is pointing to it. It won't get crawled and indexed by Googlebot.
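In Apache terms, that setup is two virtual hosts: a default catch-all that only redirects, and a named vhost that answers solely when the request asks for the real host name. A sketch, with placeholder names and paths:

```apache
# httpd.conf on the hot spare (sketch; domain and paths are placeholders)

# Default vhost: Apache hands it any request by IP or unknown host
# name (the first vhost listed is the default). It serves no content,
# only a redirect to the canonical domain.
<VirtualHost *:80>
    ServerName catchall.invalid
    Redirect 301 / http://www.example.com/
</VirtualHost>

# Named vhost: matches only when the Host header is the real domain,
# i.e. once DNS points at this server.
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/example
</VirtualHost>
```

Until DNS points at the spare, Googlebot can only ever reach it by IP, hit the default vhost, and be redirected away, so nothing duplicate gets crawled or indexed.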

seoskunk
msg:4548480 - 12:34 am on Feb 25, 2013 (gmt 0)

It's a great idea to keep the old site as a backup - just put a robots.txt noindex file on the server so the IP isn't indexed

lucy24
msg:4548517 - 3:06 am on Feb 25, 2013 (gmt 0)

robots.txt noindex file

?



All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved