Forum Moderators: coopster


Caching MySQL results to page through later

         

dingman

8:24 pm on Jun 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm working on a system that displays search result sets that can number in the thousands to a user. Obviously, I don't want to stick the user with a page long enough to contain even several hundred results, nor do I want to load all of them into memory if I can avoid it, so I make liberal use of 'LIMIT X,Y'. Unfortunately, this means that for each page of the result set I display, the search gets run again with nothing but the limit clause changed, which hardly seems efficient. Does anyone have a good approach for running the search once and then paging through the results? I don't like any of the ideas I've had so far. (I'm much more accustomed to tuning my queries to keep the result set small, but that's not going to work here.)

TIA
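The pattern described above, re-running the search with only the limit window changed for each page, can be sketched as follows. This is a Python/sqlite3 stand-in for the thread's PHP/MySQL context, and the table and column names are made up for illustration:

```python
import sqlite3

# Hypothetical schema and data; the thread's real tables are unknown.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO records (title) VALUES (?)",
                 [("widget %03d" % i,) for i in range(25)])

def fetch_page(conn, term, page, per_page=10):
    # Re-runs the whole search for every page, changing only the window --
    # exactly the inefficiency described above.
    offset = page * per_page
    cur = conn.execute(
        "SELECT id, title FROM records WHERE title LIKE ? "
        "ORDER BY id LIMIT ? OFFSET ?",  # MySQL spells this 'LIMIT offset, count'
        ("%" + term + "%", per_page, offset))
    return cur.fetchall()

page0 = fetch_page(conn, "widget", 0)  # first page of results
page2 = fetch_page(conn, "widget", 2)  # last, partial page
```

Note that `LIMIT X,Y` in MySQL means "skip X rows, return Y", so the server still has to find and skip the X preceding rows on every request.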

jatar_k

8:31 pm on Jun 21, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



What about generating them into flat files with a set number of results in each?

Not a fully formed idea at the moment but seems doable.

how often does the content change? Could they be generated nightly and included in each page?

not sure, still thinking about it.

dingman

1:06 am on Jun 22, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



They are pages of search results, so I can't really generate them beforehand without knowing what searches people will run. I can, and perhaps even will, generate the pages for individual records ahead of time, but what I'm having trouble with is the list of which records are relevant.

jamie

3:34 pm on Jun 22, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



hi dingman,

this might be a bit out of my league as i'm no expert,

but i've just started implementing phpCache for sections of my site. it stores the query results in an array which is cached in the /tmp/ folder.

couldn't you then simply access this array to print off so many results, then get the key of the last result on the page, pass it as a variable in the link to the next page, and start displaying the results on the next page from the next key in the cached array?
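jamie's idea, caching the full result set once and passing the index of the next page's first entry along in the link, might look like this. This is a Python sketch (the original context is PHP), and all names in it are hypothetical:

```python
# Stand-in for the query result array cached in /tmp by something like phpCache.
cached_results = ["result %02d" % i for i in range(23)]

def render_page(cache, start, per_page=10):
    # Show per_page entries beginning at 'start'; return the rows plus the
    # index to embed in the "next page" link (None once we run out).
    rows = cache[start:start + per_page]
    next_start = start + per_page if start + per_page < len(cache) else None
    return rows, next_start

rows1, nxt = render_page(cached_results, 0)     # page 1, link carries nxt
rows2, nxt2 = render_page(cached_results, nxt)  # page 2
rows3, nxt3 = render_page(cached_results, nxt2) # last, partial page
```

The query runs once to build the cache; every later page is just a slice of the cached array.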

daisho

2:43 pm on Jun 23, 2003 (gmt 0)

10+ Year Member



jatar_k's suggestion is not saying you should precache queries. He is saying that if a search is done for "widgets" you check to see if you have a cached result set for that keyword. If you do, use that and skip the db query. It's quick...

If it doesn't exist then generate the file.

As a side note, I'm not sure how much RAM you have or how many different queries, but MySQL 4 has query caching: MySQL will cache the result of an SQL statement on the server side. It may help and be simpler to implement.

daisho.
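For reference, the MySQL 4.x server-side query cache daisho mentions is controlled by a couple of server variables in my.cnf (it was removed entirely in MySQL 8.0); the 16M size here is just an example value:

```ini
# my.cnf fragment (MySQL 4.x and later, up to 5.7)
[mysqld]
query_cache_type = 1        # cache eligible SELECT results
query_cache_size = 16M      # memory reserved for cached result sets
```

Any write to a table invalidates all cached results that use it, so this helps most with read-heavy workloads.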

dingman

5:54 pm on Jun 23, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Ok, that makes a bit more sense to me. Thanks daisho.

As for MySQL 4.0, I'm sharing a server with another much larger app, and the guys who run it want to stick with 3.23, because they haven't tested the other app with 4.0. phpCache looks like it'd be as complicated to set up as writing my own cache system into the app, but you guys have been helpful. I was thinking more along the lines of session-based caching rather than query-based caching, and I like the latter idea better. Thanks.

daisho

7:07 pm on Jun 23, 2003 (gmt 0)

10+ Year Member



The problem with session based caching is that restoring large sessions can take a long time.

phpCache may or may not be the thing that you are looking for. I was thinking more like something as follows (And I think this is the way jatar was going also):

1. query for widgets
2. check if file named widgets.cache exists
3. if not, create it and write 1 result per row, separated by, say, a pipe "|", to the file.

Then read the file starting at line 1 and going to line 10 (for the first 10 results).

Next call for "widgets", check to see if widgets.cache exists. If it does, simply open the file and then jump (using fseek) to the part of the file where you want to start. This way you are caching the entire query at once. You could use phpCache in addition to this, but phpCache is made more for caching the final result of a page generation rather than caching the base data for a series of pages.

My example suggested using a pipe-delimited text file. A much better solution is a random access file with a predefined record structure, but that is a little more complicated to set up.

daisho.
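daisho's scheme, using the fixed-width record variant from his closing note so the fseek arithmetic stays trivial, can be sketched like this. Python stands in for PHP here, and the record width, file name, and data are all hypothetical:

```python
import os
import tempfile

RECORD = 32  # fixed record width in bytes, space-padded (hypothetical choice)

def write_cache(path, rows):
    # Step 3 above: write the whole result set once, one result per record.
    with open(path, "wb") as f:
        for row in rows:
            f.write(row.encode("ascii").ljust(RECORD))

def read_page(path, start, count):
    # Jump straight to the requested page without re-running the query.
    with open(path, "rb") as f:
        f.seek(start * RECORD)        # the fseek() from the description
        data = f.read(count * RECORD)
    return [data[i:i + RECORD].decode("ascii").rstrip()
            for i in range(0, len(data), RECORD)]

path = os.path.join(tempfile.mkdtemp(), "widgets.cache")
if not os.path.exists(path):          # step 2: only query on a cache miss
    write_cache(path, ["widget %02d" % i for i in range(25)])

page1 = read_page(path, 0, 10)   # results 1-10
page3 = read_page(path, 20, 10)  # last, partial page
```

With variable-length pipe-delimited rows you would instead have to read and count lines up to the offset; fixed-width records are what make the direct seek possible.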

jatar_k

8:41 pm on Jun 23, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



that's pretty much what I was thinking daisho, thankfully you explained it so well.

dingman

10:43 pm on Jun 24, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks, everyone. I've now got something up that fits the bill quite well, and while it isn't exactly what anyone described, it's definitely inspired by this thread, and it's definitely much faster than what I had before.