Forum Moderators: coopster


MySQL compression?

Compressing large pieces of data in MySQL


erikcw

9:30 pm on May 23, 2006 (gmt 0)

10+ Year Member



Hi all,

I'm working on an application that deals with large chunks of plain text data (about 5 MB, sometimes larger), stored in a TEXT field in MySQL.

I'm trying to speed up the user experience. The data is presented to the user, who then performs various operations on it through a wizard-type interface (about 10 steps; at each step the data is downloaded to the browser and then POSTed back to the server).

I'm using gzip compression on the pages to cut download time. I'm also experimenting with other ways to improve the user experience and reduce the time PHP spends actually processing the data.

Is there a way to compress the data in MySQL? My thinking is that this would speed up the transfer between the MySQL server (localhost) and PHP.

Are there any other things I can do to speed this thing up?

Thanks!
Erik

eelixduppy

10:23 pm on May 23, 2006 (gmt 0)



Maybe this [dev.mysql.com] can help you a little. Good luck!
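(The link above isn't preserved here, but MySQL's built-in COMPRESS() and UNCOMPRESS() string functions are the usual answer. A minimal sketch, assuming a hypothetical `documents` table; note that COMPRESS() returns binary zlib data with a length prefix, so the column must be a BLOB type rather than TEXT:)

```sql
-- Column `body` must be BLOB/MEDIUMBLOB: COMPRESS() output is binary.
INSERT INTO documents (id, body)
VALUES (1, COMPRESS('large text payload here'));

-- Decompress on the way back out.
SELECT UNCOMPRESS(body) AS body_text
FROM documents
WHERE id = 1;
```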

millyre

12:40 pm on May 24, 2006 (gmt 0)

10+ Year Member



I would think that if you compress the data at the PHP end with gzcompress(), your queries and inserts into the DB would be smaller and faster, and there would be less traffic between the web server and the DB (even if both reside on the same host). However, if you use the internal MySQL compression functions, I would guess that you will save space but won't see any significant speed gains.
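(A minimal sketch of that first approach, assuming a BLOB column to hold the binary output; the variable names and the 5 MB stand-in payload are made up for illustration:)

```php
<?php
// Compress before writing to the DB. The target column should be
// BLOB/MEDIUMBLOB, not TEXT, since gzcompress() returns binary zlib data.
$text   = str_repeat("wizard step data ", 5000);  // stand-in for the real payload
$packed = gzcompress($text, 6);                   // level 6: decent speed/size trade-off

// ... INSERT $packed into the BLOB column, SELECT it back later ...

// Decompress after reading it back; the round trip is lossless.
$restored = gzuncompress($packed);

printf("original: %d bytes, compressed: %d bytes\n",
       strlen($text), strlen($packed));
```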

You could also store the data directly on the server in files, and only store the file pointer/name in the DB. The functions gzread() and gzwrite() can handle the compressed data in PHP.
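(A sketch of the file-based variant; the temp-file naming is just for the example, and in the real application you'd store $path in the DB instead of deleting the file:)

```php
<?php
// Write the payload as a gzip file on disk; store only the path in MySQL.
$text = str_repeat("wizard step data ", 5000);    // stand-in payload
$path = tempnam(sys_get_temp_dir(), 'doc');       // hypothetical naming scheme

$fp = gzopen($path, 'wb6');   // 'wb6' = write binary at compression level 6
gzwrite($fp, $text);
gzclose($fp);

// Later: read it back; the gz stream decompresses transparently.
$fp = gzopen($path, 'rb');
$restored = '';
while (!gzeof($fp)) {
    $restored .= gzread($fp, 8192);
}
gzclose($fp);

unlink($path);  // cleanup for this example only
```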

There are probably pros and cons with both methods. I've never needed to store single rows that large, so I can't advise you on that.