Forum Moderators: coopster
I am the admin of a remote Linux server and I have shell access. I have to upload a 300 MB MySQL database.
Can anyone suggest the best way to do it? phpMyAdmin crashes quite often doing it, and I don't want to split the 300,000 records into chunks of 10,000.
Even after exporting and zipping, the file comes out to about 100 MB.
Please help,
Thanks
try this
[ozerov.de...]
It is a script that splits the text file into manageable chunks. It is brilliant; worked like a charm.
I did not have root access.
I used DBTools DBManager (http://www.dbtools.com.br) to restore the DB, but it's not entirely bug-free and might cause trouble with such a large DB.
Perhaps MySQL Query Browser/administrator can be used?
I managed to upload the file, but unfortunately there is a simple error in the CREATE TABLE command. Now I have to edit the huge .sql file.
I tried opening the file over shell access with pico and nano, but since the file is almost 300 MB, nano says it is out of memory.
Can anyone help me with how I can edit the first 20 lines of a file using shell access?
Thanks,
Jaunty
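One way to do this without a full-screen editor: `head` and `sed` stream the file instead of loading it into memory the way nano/pico do, so the 300 MB size doesn't matter. A minimal sketch (the file name and contents here are stand-ins for the real dump):

```shell
# "dump.sql" is a stand-in for the real 300 MB dump file.
printf 'CREATE TABLE bad\nline2\nline3\n' > dump.sql

# Inspect just the first 20 lines; head reads only the start of the file:
head -n 20 dump.sql

# Fix text within the first 20 lines in place; GNU sed streams the
# file rather than holding it all in memory:
sed -i '1,20s/CREATE TABLE bad/CREATE TABLE good/' dump.sql
```

The `-i` (in-place) flag is GNU sed; on other systems write to a new file and move it over the original.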
basically I have not been able to run this command:
CREATE TABLE `lyrics` (
`id` int( 10 ) unsigned NOT NULL AUTO_INCREMENT ,
`content` text,
`hits` int( 10 ) NOT NULL default '10',
PRIMARY KEY ( `id` )
) ENGINE = MYISAM DEFAULT CHARSET = latin1 AUTO_INCREMENT=260687;
If I remove the `CHARSET = latin1 AUTO_INCREMENT=260687` part,
the thing should run....
Any idea why my MySQL server, version 4.0.22, is not taking this command, and what can be done to make it process it?
Thanks
jaunty
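If I'm right that the `DEFAULT CHARSET` table option is what 4.0.x rejects (per-table character sets only arrived in MySQL 4.1), one option is to strip just that clause out of the dump with sed rather than re-exporting. A sketch, using a one-line stand-in for the dump:

```shell
# Stand-in for the line of the dump that MySQL 4.0 rejects:
printf ') ENGINE = MYISAM DEFAULT CHARSET = latin1 AUTO_INCREMENT=260687;\n' > lyrics.sql

# Remove only the DEFAULT CHARSET clause in place; AUTO_INCREMENT is
# a long-standing table option and should be fine on 4.0:
sed -i 's/DEFAULT CHARSET = latin1 //' lyrics.sql
cat lyrics.sql
```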
Hex editors seem to either load the entire file or use the filesystem to retrieve just what they need; I don't know which package would work, but I did have success with this method on a massive file in Windows with low RAM some time ago. (I know it's a different OS, but if it can be done on Windows, I'd imagine Linux will be up to it.)
Yes it will, but I have already uploaded the huge file, and the only reason it's not importing is that one line we are trying to remove.
If I upload the file again without the CREATE TABLE command, I am pretty sure the SQL will import.
So the question is: how do I remove a few starting lines from a file that has 300,000 lines, in Linux, with root shell access?
Thanks,
Jaunty
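Stripping the first N lines can be done with `tail`, which streams from a given line onward and never loads the whole file, so the 300,000-line size is no problem. A sketch, with `seq` generating a stand-in file:

```shell
# Stand-in for the 300,000-line dump:
seq 1 100 > full.sql

# Stream everything from line 21 onward into a new file,
# dropping the first 20 lines (the broken CREATE TABLE):
tail -n +21 full.sql > trimmed.sql

# Equivalent with sed: sed '1,20d' full.sql > trimmed.sql
```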
all you have to do is find where the database is stored, e.g.
/usr/lib/mysql/mydatabase
shut down MySQL
tar and gzip that directory
restart MySQL
FTP the file to the new server
untar and unzip it into the /usr/lib/mysql directory.
This works great as long as things like permissions and users are the same.
I use it to quickly duplicate databases on backup servers. I use a script file to tar, zip and scp to the backup servers.
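A sketch of that routine with demo paths (on many installs the data directory is actually /var/lib/mysql rather than /usr/lib/mysql, and mysqld must be stopped on both servers before touching it):

```shell
# Demo data directory standing in for the real MySQL data dir:
mkdir -p demo_datadir/mydatabase
echo 'dummy' > demo_datadir/mydatabase/lyrics.MYD   # stand-in table file

# Stop mysqld first on a real box, then tar and gzip the DB directory:
tar czf mydatabase.tar.gz -C demo_datadir mydatabase

# ...scp/ftp the archive across, then on the target server
# (mysqld stopped there too) unpack into its data directory:
mkdir -p demo_target
tar xzf mydatabase.tar.gz -C demo_target
```

This only works safely for MyISAM-style per-directory table files with matching server versions and permissions, as noted above.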
Thanks everyone for helping me. The BigDump program works very nicely. I have upgraded MySQL to version 4.1, so that ENGINE command now works and the DB has been uploaded.
In short, I FTPed the big SQL (gz) to the server and then used the BigDump program, which is really very cool. It splits the file into the number of records you specify and then keeps uploading the data.
Just make sure this thing works locally before you upload, so you don't find out afterwards that the CREATE TABLE (or something else) is not working properly.
Thanks everyone,
Jaunty
As Frank_Rizzo mentioned, you can tar and gzip the whole set of folders on the original server, move it over, and untar and unzip there.
The only extra caveat is that your versions should be exactly the same, and you should make sure MySQL is stopped on the original server first. It does some funky things if it isn't.
Alternatively, use a dump from the original DB to create the new one. Once you get the monster uploaded, just use vi if you need to make changes; not sure why you would need a hex editor.
The first thing you need to do before doing any porting is make sure versions and configuration are the same; any differences and your odds of errors go through the roof. If you can't make them the same, then you should do your porting bit by bit. I can open a 300 MB file with my text editor; not that it is much fun, but it can be done. Split out your table creates and DB creates, and keep your data separate.
Strip down your CREATE TABLE statements so that your DB will use defaults; that should be fine as long as you aren't doing too many specific things.
You can split the file yourself; it doesn't take much.
Also, from the command line you can do a lot more than with phpMyAdmin or whatever other crappy tool. I can tell you from experience, there is no tool that works better than just the good old command line.
Want to pull in a monster file? Go to your command line and do a little of this:
mysql -u username -p databasename < /path/to/file.sql
works like a dream ;)
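The same shape works on a still-gzipped dump: stream it straight into the client without unpacking to disk first. A sketch, with `wc` standing in for the mysql client (username/databasename in the real command are placeholders):

```shell
# Make a tiny gzipped stand-in for the 100 MB dump:
printf 'SELECT 1;\n' > file.sql
gzip -f file.sql                        # produces file.sql.gz

# Stream the decompressed SQL through a pipe; in real use the
# right-hand side would be: mysql -u username -p databasename
gunzip -c file.sql.gz | wc -l
```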