Forum Moderators: coopster
I am trying to insert/update large chunks of data in MySQL. I just ran a test UPDATE query (a 40,000-word document into a LONGTEXT field) and got this error:
Got a packet bigger than 'max_allowed_packet'
What is the best way to handle this? Do I need to modify my MySQL configuration to allow larger packets? Or should I check for packets over a certain size with PHP and, if too large, break them up and append the data to the field?
Also, is MySQL going to be able to handle a large number of users doing UPDATE queries at once, or am I going to run into locking issues? Would it be better to have each UPDATE be a new select?
Thanks for your help!
Erik
[mysqld]
max_allowed_packet=16M
Before MySQL 4.0, use this syntax instead:
[mysqld]
set-variable = max_allowed_packet=16M
Change 16M to the size you want. MySQL 4.0 allows a maximum of 1GB (max_allowed_packet=1G).
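If you want to check (or, on MySQL 4.0.3 and later, change) the value without restarting the server, you can do it from the mysql client. Note that a SET GLOBAL only lasts until mysqld restarts, so put the setting in my.cnf as well:

```sql
-- See what the server is currently allowing
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it to 16MB for new connections (MySQL 4.0.3+, needs the SUPER privilege;
-- this resets on restart, so also add it to my.cnf)
SET GLOBAL max_allowed_packet = 16777216;
```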
I'd also consider using transactions when you do updates.
If you've never checked out transactions, they're basically a way to group writes so they either all happen or none do. While a transaction holds locks on the rows it's writing, other writes queue up behind it, so writes happen one at a time. You can set it up to lock the reads as well, and you commit the changes or roll them back as a separate step. This effectively creates a queue for write queries.
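As a rough sketch, assuming your tables use InnoDB (MyISAM tables ignore transactions), and with the table and column names here being just placeholders:

```sql
-- Start the transaction; rows touched below are locked until COMMIT/ROLLBACK
BEGIN;

UPDATE documents SET body = '...big document text...' WHERE id = 42;

-- If everything succeeded, make the change permanent:
COMMIT;
-- Or, to throw the change away instead:
-- ROLLBACK;
```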
It may make a good precautionary step if you're worried about the large writes.
I was also thinking about your large documents, and was curious why you're writing them to a text field. I would be more inclined to store the document name, an ID, and a path/to/file in a table, then upload the actual documents to a directory on your server. Then link to them from whatever application you're working with via a query.
Just a thought, especially if you don't really need the data in a table, or aren't going to be doing text searches on the actual documents.
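A minimal sketch of that metadata-only approach (again, the table and column names are just placeholders):

```sql
-- Keep only metadata in MySQL; the file itself lives on disk
CREATE TABLE documents (
  id   INT UNSIGNED NOT NULL AUTO_INCREMENT,
  name VARCHAR(255) NOT NULL,
  path VARCHAR(255) NOT NULL,  -- e.g. a path under your uploads directory
  PRIMARY KEY (id)
);
```

Your PHP application then SELECTs the path by ID and serves or links to the file directly, so no large packets ever hit MySQL.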