I've got a site that is over 2gig in size (I think it's around 3-4gig) and I want to change webhosts. So, normally I would use this script:
<?php
// Set path to download
$dl_url = 'http://www.domain.com/_bscript/date_name.zip';
// Build the archive; system() places the command's exit code in $status
system('zip -r /home/user/domains/domain.com/html/_bscript/date_name.zip /home/user/domains/domain.com/html', $status);
// Exit code 0 means zip succeeded - create a link to download the zip file
if ($status === 0)
{
echo '<p>All zipped up!</p>';
echo '<a href="' . $dl_url . '">Download</a>';
} else {
echo '<p>Damn! Didn\'t work.</p>';
}
?>
However, the limit on zip files appears to be 2gig, meaning I can't use that method for this site. What's the best way for me to create an archive to download and re-upload to my new webhost?
They have said I only need to create the archive for them to download and they'll do the rest...
Thanks
Two years ago I moved my host and used SCP.
[en.wikipedia.org...]
Hope this helps.
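A minimal example of what that looks like, run from a shell on the new server, assuming you have SSH access to the old host (the usernames and hostnames here are placeholders, not real values from this thread):
scp -r olduser@old-host.example.com:/home/user/domains/domain.com/html .
The -r flag copies the directory recursively, and everything travels over the encrypted SSH connection, so there is no archive step at all.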
I'm really not familiar with SSH, or with server migration in general. Is there a bit of PHP code that could create a multi-part zip, which another PHP script could then reassemble and uncompress on the new server in one go?
I'll look into SSH and Secure copy in the meantime.
Thanks for your help so far
While writing the previous line, another idea using PHP occurred to me...
Make a script that reads the local files and directories, then sends them to the new host via FTP. Call set_time_limit() so the script doesn't die partway through the transfer.
You could even keep an exclusion list of files to skip, if needed. A rough sketch of the idea is below.
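Something like this is what I have in mind - just a rough sketch, assuming PHP 5.3+ with the FTP extension enabled; the hostname, login and remote path are made-up placeholders, not values from this thread:
<?php
// Rough sketch only - host, credentials and remote path are placeholders.
set_time_limit(0); // don't let PHP kill the script mid-transfer

$local_root  = '/home/user/domains/domain.com/html';
$remote_root = '/public_html';
$skip        = array('_bscript/date_name.zip'); // exclusion list, relative to $local_root

$conn = ftp_connect('ftp.newhost.example.com');
ftp_login($conn, 'newuser', 'secret');
ftp_pasv($conn, true); // passive mode plays nicer with firewalls

// Walk the local tree, directories first, and mirror it to the new host
$iter = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($local_root, FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::SELF_FIRST
);
foreach ($iter as $path => $info) {
    $rel = substr($path, strlen($local_root) + 1);
    if (in_array($rel, $skip)) continue;        // honour the exclusion list
    if ($info->isDir()) {
        @ftp_mkdir($conn, "$remote_root/$rel"); // ignore "already exists" errors
    } else {
        ftp_put($conn, "$remote_root/$rel", $path, FTP_BINARY);
    }
}
ftp_close($conn);
echo 'Transfer finished.';
?>
No compression step at all that way, although FTP sends everything in the clear, so it is less secure than the SCP route mentioned above.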
I assume you only have HTML files, since that is the directory you want to back up and you never mentioned a database. 2GB of HTML files is not a typical site; I personally doubt you have 2GB of source files.
What I think is happening is that you are adding your ZIP file to your own ZIP file, which causes the ZIP process to explode.
Look at your system call:
zip -r /home/user/domains/domain.com/html/_bscript/date_name.zip /home/user/domains/domain.com/html
You are creating /home/user/domains/domain.com/html/_bscript/date_name.zip, and you are filling that zipfile with the contents of the /home/user/domains/domain.com/html subdirectory.
Zip is now working like a dog chasing its tail: it tries to store the file date_name.zip inside the archive date_name.zip. The archive gets bigger with every run of zip, and the process stops once you hit a disk-space or file-size limit (probably the 2GB you mention).
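The simple fix is to write the archive somewhere outside the tree you are zipping - one level up, for example:
zip -r /home/user/domains/domain.com/date_name.zip /home/user/domains/domain.com/html
or, keeping your original path, tell zip to exclude its own output with the -x option (standard Info-ZIP syntax, but worth verifying on your host):
zip -r /home/user/domains/domain.com/html/_bscript/date_name.zip /home/user/domains/domain.com/html -x '*date_name.zip'
Either way the archive no longer swallows itself.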
True, that command does end up trying to zip its own output. However, the files that are making the zip file so big are the videos and audio files hosted on the site.
I assume creating the zip in a directory outside /html would help, but the problem still remains that there are over 2gig of video and audio files to compress into that zip file...
I've been looking at an scp -r call via SSH to clone the contents across to my new webhost, but it would be great to know how to make multi-part zip files on the server.
You could try something like
system('tar cf /home/user/domains/domain.com/all.tar /home/user/domains/domain.com/html');
system('split -b100M /home/user/domains/domain.com/all.tar /home/user/domains/domain.com/html/parts');
The first command will create a .tar archive.
The second command will read the tar file and write it into your html directory as files named partsaa, partsab, and so on. Each part will be 100 megabytes (due to the -b100M option), except possibly the last one, which may be smaller.
You can send these files to the other server. There the following commands are needed:
cat parts* > all.tar
tar xvf all.tar
And all your files will be transferred to the new host.
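If you would rather stick with zip itself, it may also be worth checking whether your host has Info-ZIP 3.0 or newer, which can write split archives directly with the -s option (availability varies by host, so treat this as something to verify, not a given):
zip -r -s 100m /home/user/domains/domain.com/date_name.zip /home/user/domains/domain.com/html
That produces date_name.z01, date_name.z02, and so on, plus a final date_name.zip. On the new server the parts can be glued back into a single archive and extracted:
zip -s 0 date_name.zip --out date_name_full.zip
unzip date_name_full.zip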