Forum Moderators: coopster

Transferring large website to new host

         

mr_nabo

1:18 pm on Jan 18, 2010 (gmt 0)

10+ Year Member



Hi,

I've got a site that is over 2gig in size (I think it's around 3-4gig) and I want to change webhosts. So, normally I would use this script:

<?php
// Set the path to the download
$dl_url = 'http://www.domain.com/_bscript/date_name.zip';

// Check if everything's OK - create a link to download the zip file
if (system('zip -r /home/user/domains/domain.com/html/_bscript/date_name.zip /home/user/domains/domain.com/html')) {
    echo '<p>All zipped up!</p>';
    echo '<a href="' . $dl_url . '">Download</a>';
} else {
    echo '<p>Damn! Didn\'t work.</p>';
}
?>

However, the limit on zip files appears to be 2 gig, meaning I can't use that method for this site. What's the best way for me to create an archive to download and re-upload to my new webhost?

They have said I only need to create the archive for them to download and they'll do the rest...

Thanks

andrewsmd

2:46 pm on Jan 18, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Not necessarily a solution, but could you break it up into two separate files?

tangor

3:56 pm on Jan 18, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The limit might be available disk space. Creating a zip file requires approximately three (3) times the size of the content (content, work file, output file). Zip itself should support 4 GB. You might look at the split archive function (specify the archive part file size) to see if that makes a difference. Also look at the different compression methods to speed up the process.
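For reference, Info-ZIP's zip (version 3.0 and later) can create split archives directly with the -s option. A sketch only - the paths below are invented, and your host's zip binary may be older:

```shell
# Hypothetical paths; requires Info-ZIP zip 3.0+.
# Create a split archive in 100 MB pieces: site.zip, site.z01, site.z02, ...
zip -s 100m -r /home/user/backup/site.zip /home/user/domains/domain.com/html

# On the new host: merge the pieces back into one archive, then extract it.
zip -s 0 /home/user/backup/site.zip --out /home/user/backup/site-single.zip
unzip /home/user/backup/site-single.zip
```

Note the output archive is written outside the html tree here, so zip never tries to include it in itself.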

Psychopsia

8:40 pm on Jan 18, 2010 (gmt 0)

10+ Year Member



Hi! Do you have SSH access to your hosts?

Two years ago I moved my host and used SCP.
[en.wikipedia.org...]

Hope this helps.
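For anyone finding this later, the basic shape of an SCP transfer looks like this - the hostnames and paths are placeholders, not anything from this thread:

```shell
# Placeholder hosts and paths - substitute your own accounts.
# Run from the old host, pushing the whole html tree to the new host:
scp -r /home/user/domains/domain.com/html user@new-host.example.com:/home/user/domains/domain.com/

# Or copy directly between the two remote hosts from your own machine:
scp -r user@old-host.example.com:/home/user/domains/domain.com/html \
    user@new-host.example.com:/home/user/domains/domain.com/
```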

StoutFiles

8:51 pm on Jan 18, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Why can't you just move the files to a local machine with a big copy/paste and then move them onto your new machine with a big copy/paste? A little more time consuming but you'd probably be done already.

mr_nabo

1:30 am on Jan 19, 2010 (gmt 0)

10+ Year Member



Hi guys,

I'm really not familiar with SSH nor much else with server migration. Is there a bit of PHP code that might be of use in creating multiple-part zips that can be uncompressed on the server in one go with another PHP script?

I'll look into SSH and Secure copy in the meantime.

Thanks for your help so far

Psychopsia

2:06 am on Jan 19, 2010 (gmt 0)

10+ Year Member



With SCP I moved a complete 3 GB site between hosts in about 10 minutes; it's the best option if possible.

While writing the previous line, I thought of another idea using PHP...

Make a script that reads the local files and directories, then sends them to the new host via FTP. Call the set_time_limit() function so the script doesn't die in the process.

You could even create an exclusion file list, if needed.

lammert

7:43 pm on Jan 19, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Something is going wrong here.

I assume you only have HTML files, because that is the directory you want to back up, and you never mentioned a database. 2 GB of HTML files is not a standard site. I personally don't think you have 2 GB of source files.

What I think is happening is that you are trying to add your ZIP file to your own ZIP file, which causes the ZIP process to explode.

Look at your system call:

zip -r /home/user/domains/domain.com/html/_bscript/date_name.zip /home/user/domains/domain.com/html

You are creating /home/user/domains/domain.com/html/_bscript/date_name.zip, and you are filling that zip file with the contents of the /home/user/domains/domain.com/html subdirectory.

Zip is now working like a dog chasing its tail: it tries to store the file date_name.zip inside the archive date_name.zip. The archive gets bigger with every call to zip, and the process only stops once you reach your disk or RAM limit (probably the 2 GB you mention).
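The usual fix is to write the archive somewhere outside the tree being archived. A small local demonstration of the principle - all paths here are throwaway, and it uses tar rather than zip so it runs anywhere:

```shell
# Demo with invented paths: write the archive OUTSIDE the tree it captures,
# so the archive never tries to swallow itself.
mkdir -p /tmp/ziploop-demo/html/_bscript
echo '<html></html>' > /tmp/ziploop-demo/html/index.html

# The archive lands in /tmp/ziploop-demo, one level above html/ - no self-inclusion.
tar cf /tmp/ziploop-demo/site.tar -C /tmp/ziploop-demo html

# The listing contains the site files but not the archive itself.
tar tf /tmp/ziploop-demo/site.tar
```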

mr_nabo

8:29 pm on Jan 19, 2010 (gmt 0)

10+ Year Member



Hi Lammert,

True, that zip command is archiving its own output. However, the files that make the zip so big are the video and audio files hosted on the site.

I assume making the zip in a directory below /html would help, but the problem still remains that there are over 2 gig of video and audio files to compress into that zip file...

I've been looking at an scp -r call via SSH to clone the contents across to my new webhost, but it would be great to know how to make multi-part zip files on the server.

lammert

9:13 pm on Jan 19, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Video and audio files explain a lot. Compressing these files gives hardly any benefit, because they are already compressed, so you could use tar as an archive tool instead. As far as I know, tar has no intrinsic 2 GB limit. The output of tar can be fed to the tool split, which will split it into several files.

You could try something like

system('tar cf /home/user/domains/domain.com/all.tar /home/user/domains/domain.com/html');
system('split -b100M /home/user/domains/domain.com/all.tar /home/user/domains/domain.com/html/parts');

The first command will create a .tar archive.

The second command will read the tar file and write it out to your html directory as files named partsaa, partsab, etc. Each of these parts files will be 100 megabytes (due to the -b100M option).

You can send these files to the other server. There the following commands are needed:

cat parts* > all.tar
tar xvf all.tar

And all your files will be transferred to the new host.
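Lammert's pipeline can be rehearsed end to end on a local machine before touching the live site. Here is a self-contained dry run with invented paths and a small stand-in file in place of a real video:

```shell
# Local dry run of the tar -> split -> cat -> tar pipeline (throwaway paths).
mkdir -p /tmp/migrate/src /tmp/migrate/dest
head -c 1048576 /dev/zero > /tmp/migrate/src/video.bin   # 1 MB stand-in file

# "Old host": archive the tree, then split the archive into 300 KB pieces
# named partsaa, partsab, ...
tar cf /tmp/migrate/all.tar -C /tmp/migrate src
split -b 300k /tmp/migrate/all.tar /tmp/migrate/parts

# "New host": glue the pieces back together and unpack.
cat /tmp/migrate/parts* > /tmp/migrate/rebuilt.tar
tar xf /tmp/migrate/rebuilt.tar -C /tmp/migrate/dest

# Verify the file survived the trip intact.
cmp /tmp/migrate/src/video.bin /tmp/migrate/dest/src/video.bin && echo 'roundtrip OK'
```

The same sequence scales to multi-gigabyte sites; only the -b size and the transfer step between the two cat/split halves change.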