I'm trying to figure out how to implement my current project: derive estimates for items in one database table from the data in another, and store the estimated value back in one of the tables. What I have now works, but it feels hacked together, isn't optimized at all, and is pretty darn slow. Here's my current setup:
I have 2 tables in my database: one for previously sold products (approx. 600,000 rows) and one for items not yet for sale (approx. 50,000 rows and growing).
What I would ideally like to do is present the site user with an estimated value/sale price for each item not currently for sale, based on matching its parameters against the previously sold items table.
I currently have this working on the fly, but since it has to query the database on every page load/item visited, page load times suffer (though I'm sure my query could be optimized quite a bit...)
So, I was playing with the idea of looping through the data weekly or bi-monthly, creating an estimate for each item, and storing the estimated value in a column alongside a corresponding "last modified estimate" date to query against.
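As a sketch of that batch approach (all table and column names here are my guesses, not your actual schema): add an estimate column plus a timestamp to the unsold-items table, then run a periodic job that fills them in from the sold-items table with a correlated subquery.

```sql
-- Hypothetical schema additions (names are assumptions):
ALTER TABLE unsold_items
    ADD COLUMN estimated_value DECIMAL(10,2) NULL,
    ADD COLUMN estimate_updated_at DATETIME NULL;

-- Periodic batch job: average the sale prices of sold items that match
-- each unsold item's parameters (here: category and condition).
UPDATE unsold_items u
SET u.estimated_value = (
        SELECT AVG(s.sale_price)
        FROM sold_items s
        WHERE s.category       = u.category
          AND s.item_condition = u.item_condition
    ),
    u.estimate_updated_at = NOW();
```

You could also add `WHERE u.estimate_updated_at IS NULL OR u.estimate_updated_at < DATE_SUB(NOW(), INTERVAL 1 WEEK)` to the UPDATE so each run only refreshes stale estimates instead of reprocessing all 50,000 rows.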
My problem is that I'm really not sure how to set something like this up at this data size so that it's as optimized as possible.
Example of how I am currently querying:
- search.php stores my class variables and search parameters, initiates the query, and stores the "$count"
- calculate_item_value.php takes the query results and averages them in PHP to produce an estimated value
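I'd guess the on-the-fly version boils down to something like the query below running on every page view (the column names and the date filter are assumptions on my part, based on your description):

```sql
-- Per-page-view lookup against ~600,000 sold rows;
-- the ? placeholders are the search parameters bound by search.php.
SELECT COUNT(*)        AS matched_sales,
       AVG(sale_price) AS estimated_value
FROM sold_items
WHERE category       = ?
  AND item_condition = ?
  AND sale_date > DATE_SUB(NOW(), INTERVAL 1 YEAR);
```

If that shape is roughly right, running EXPLAIN on it would tell you whether it's doing a full table scan, which would account for most of the page-load cost.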
How do really large sites handle creating estimates or similar derived values on their sites and databases? My approach just seems very, VERY slow and unoptimized. I've been thinking I might be able to use one query with a broader set of criteria and then do the per-parameter calculations in PHP, but I haven't tried that yet; that's probably next on my list.
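Whichever route you take (on the fly, batch, or one broad query), a composite index covering the columns you filter on should help the 600,000-row table considerably. Assuming the same hypothetical filter columns as above:

```sql
-- Composite index matching the WHERE clause: equality columns first,
-- range column (the date) last, so the index can serve all three filters.
CREATE INDEX idx_sold_match
    ON sold_items (category, item_condition, sale_date);
```

With an index like this in place, checking the query with EXPLAIN before and after is a quick way to confirm it's actually being used.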