
Trying to speed up a web app


erikcw

6:11 pm on Oct 17, 2006 (gmt 0)

10+ Year Member



Hi all,

I'm working on a PHP/MySQL application that is suffering from slowdowns under load.

It's a "wizard" style app where users massage data through a series of steps - applying various filters and modification tools at each step of the wizard.

The data can range from 50 words up to 5 MB on the high end.

At each step, that data is SELECTed from the db, and echoed to a textarea. When the user goes on to the next step, the database row is updated with the new changes. The data actually goes into 2 fields in the row, 'data' [the working copy], and 'backup' [a copy without the changes from the last step - just in case they want to undo the last process].
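To make the scheme concrete, here is a minimal sketch of that per-step update and one-level undo, using SQLite in Python as a stand-in for the PHP/MySQL code (the table and column names are illustrative):

```python
import sqlite3

# In-memory stand-in for the MySQL table described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE wizard (id INTEGER PRIMARY KEY, data TEXT, backup TEXT)")
conn.execute("INSERT INTO wizard (id, data, backup) VALUES (1, 'original text', 'original text')")

def apply_step(row_id, transform):
    """Load the working copy, apply one wizard step, and save it,
    keeping the previous version in 'backup' for a one-level undo."""
    (data,) = conn.execute("SELECT data FROM wizard WHERE id = ?", (row_id,)).fetchone()
    new_data = transform(data)
    # The old working copy becomes the backup; both full copies travel over the wire.
    conn.execute("UPDATE wizard SET backup = data, data = ? WHERE id = ?", (new_data, row_id))
    conn.commit()

def undo(row_id):
    """Restore the working copy from the backup column."""
    conn.execute("UPDATE wizard SET data = backup WHERE id = ?", (row_id,))
    conn.commit()

apply_step(1, str.upper)
after_step = conn.execute("SELECT data, backup FROM wizard WHERE id = 1").fetchone()
undo(1)
after_undo = conn.execute("SELECT data FROM wizard WHERE id = 1").fetchone()[0]
print(after_step, after_undo)
```

The cost shows up in the UPDATE: with a 5 MB document, every step rewrites two 5 MB columns.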

As you can imagine, transferring up to 5 MB of data at each step of the wizard puts a load on the database, and the sheer volume of data moving back and forth makes the app feel slow to the user.

What are some things I can do to make the application more efficient?

I've considered:
-storing the data and/or backup in temporary files on disk instead of in mysql.
-developing a different scheme for handling the "undo" function. Any ideas?
-somehow keeping the data client-side.

Any idea which of the above will work best, or maybe a direction I haven't thought of?

Thanks!

henry0

8:09 pm on Oct 17, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Could you envision forcing a kind of "pagination" when the content is first loaded into the DB (for editing only, not for output)? Since the pagination will not be visible to the end user, it really does not matter how it is set up.

For example, every whatever-number-of-characters followed by a period could become a row. On the other hand, it will generate more rows; since it's text-based, you might want to benchmark it for speed. First evaluate the content; then, if its size reaches the chunking threshold, your script splits it.

Then, for editing purposes, display a "segment locator" that shows a text excerpt and loads just that chunk into the textarea.
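The chunking step described above could look like this (a rough Python sketch; the size threshold and the period-based split rule are illustrative):

```python
def chunk_text(text, max_len=1000):
    """Split text into rows of roughly max_len characters, breaking at the
    first period after the limit so sentences stay intact (the tail goes
    into the last chunk when no further period is found)."""
    chunks = []
    start = 0
    while start < len(text):
        end = start + max_len
        if end >= len(text):
            chunks.append(text[start:])
            break
        dot = text.find(".", end)
        if dot == -1:
            chunks.append(text[start:])
            break
        chunks.append(text[start:dot + 1])
        start = dot + 1
    return chunks

doc = ("First sentence. " * 100).strip()
rows = chunk_text(doc, max_len=200)
print(len(rows), max(len(r) for r in rows))
```

Each chunk would become one row, so an edit only reads and rewrites the rows it touches instead of the whole document.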

filippotoso

9:04 pm on Oct 17, 2006 (gmt 0)

10+ Year Member



I've considered:
-storing the data and/or backup in temporary files on disk instead of in mysql.
-developing a different scheme for handling the "undo" function. Any ideas?
-somehow keeping the data client-side.

Using a database abstraction layer like ADOdb, you can get a degree of optimization through query caching (to reduce the load on MySQL).

For the undo feature, you could apply a diff algorithm to the content and extract (and save) only the differences between versions (perhaps saving a full snapshot only every X undo steps).
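A minimal sketch of that idea using Python's difflib (SequenceMatcher is just one possible diff algorithm; the helper names are illustrative):

```python
import difflib

def make_patch(old, new):
    """Record only the operations needed to turn `old` into `new`."""
    sm = difflib.SequenceMatcher(a=old, b=new)
    # Keep just the changed spans; 'equal' runs are implicit.
    return [(tag, i1, i2, new[j1:j2])
            for tag, i1, i2, j1, j2 in sm.get_opcodes() if tag != "equal"]

def apply_patch(old, patch):
    """Rebuild the new version from the old text plus the stored patch."""
    out, pos = [], 0
    for tag, i1, i2, replacement in patch:
        out.append(old[pos:i1])   # copy the unchanged run before this edit
        out.append(replacement)   # inserted/replacement text ('' for deletes)
        pos = i2
    out.append(old[pos:])
    return "".join(out)

old = "The quick brown fox jumps over the lazy dog."
new = "The quick red fox leaps over the lazy dog."
patch = make_patch(old, new)
rebuilt = apply_patch(old, patch)
print(patch)
```

The stored patch is a handful of bytes, instead of a second full copy of a 5 MB document; a patch in the other direction (new to old) gives you the undo.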

For client-side handling you can use Ajax, which may reduce the MySQL load, but it will probably increase the HTTP traffic between the client and the server. Even in this case you would need a way to reduce the data exchanged for the undo feature.

If the application needs to handle this much data, maybe you should choose to develop a stand-alone application that uses XML-RPC or SOAP for data exchange.

Sincerely,
Filippo Toso

P.S.
If you haven't done it yet, you should enable gzip output buffering compression to reduce the amount of data exchanged between the client and the server.
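(In PHP this is typically done with ob_start('ob_gzhandler') or zlib.output_compression in php.ini.) The payoff on large, repetitive text is easy to see in a quick sketch, with Python's gzip module standing in for the server's compressor:

```python
import gzip

# Large, repetitive text such as the wizard's working copy compresses well.
payload = ("Some sentence the user is editing. " * 20000).encode("utf-8")
compressed = gzip.compress(payload)
print(len(payload), len(compressed))
```

A 5 MB textarea round-trip can shrink by an order of magnitude or more on the wire, at the cost of a little CPU on each end.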

eelixduppy

11:35 am on Oct 18, 2006 (gmt 0)



Following up on filippotoso's response, here's the documentation for MySQL Optimization [dev.mysql.com].

By the way, welcome to WebmasterWorld, filippotoso! :)