Forum Moderators: coopster
I'm working on an application that deals with large chunks of plain text data. (about 5MB, sometimes larger, stored in a TEXT type field in MySQL).
I'm trying to speed up the user experience. The data is presented to the user, who then performs various operations on the data using a wizard type interface. (about 10 steps in the wizard, data is downloaded to the browser and then POSTed back to the server each time).
I'm using gzip compression on the pages to speed up download time. I'm also trying various other things to improve the user experience and reduce the time PHP spends processing the data.
Is there a way to compress the data in MySQL? My thinking is that this would speed up the transfer between the MySQL server (localhost) and PHP.
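What I had in mind, roughly, is compressing in PHP before the INSERT and decompressing after the SELECT, so the bytes crossing the MySQL connection stay compressed. This is just a sketch of the round trip with no database involved; the column would have to be a BLOB type rather than TEXT, since compressed data is binary:

```php
<?php
// Sketch: compress in PHP before storing, decompress after fetching.
// Sample text stands in for the ~5MB field; no DB calls here.
$text = str_repeat("Some large chunk of plain text data. ", 5000);

$compressed = gzcompress($text, 6);   // zlib format, compression level 6
// ... INSERT $compressed into a BLOB/MEDIUMBLOB column here ...

// ... later, SELECT the raw BLOB back into $compressed, then:
$restored = gzuncompress($compressed);
```

MySQL also has its own COMPRESS()/UNCOMPRESS() SQL functions, but if you SELECT UNCOMPRESS(col) the decompression happens server-side, so the data would travel uncompressed anyway.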
Are there any other things I can do to speed this thing up?
Thanks!
Erik
You could also store the data directly on the server in files, and keep only the file pointer/name in the DB. You can use the functions gzread() and gzwrite() to handle the data in PHP.
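A rough sketch of that approach (the file path here is just a temp file for illustration; the real app would store the filename in the DB row instead of the text itself):

```php
<?php
// Sketch: keep the big text as a gzip file on disk, DB holds only the name.
$text = str_repeat("Some large chunk of plain text data. ", 5000);
$path = tempnam(sys_get_temp_dir(), 'doc');

// Write it compressed ('w9' = write mode, maximum compression).
$gz = gzopen($path, 'w9');
gzwrite($gz, $text);
gzclose($gz);
// ... store $path (or just the filename) in the DB here ...

// Read it back; gzread() decompresses transparently.
$gz = gzopen($path, 'r');
$restored = '';
while (!gzeof($gz)) {
    $restored .= gzread($gz, 8192);
}
gzclose($gz);
unlink($path);
```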
There are probably pros and cons with both methods. I've never had the need to store single rows that large, so I can't give you advice on that.