I have a MySQL database that is used for tracking visitor and sales activity across a network of sites. It works great for about the first 300,000 visitors, and then slows progressively as it scales.
The problem lies in the following:
If there are 4 items on the page that the customer could potentially buy, I currently insert a new row for each item (4 rows) into the "sales" table, so that I know the sale has been presented.
If they click on one of those items, I can update that item's record accordingly as they look at it in closer detail, or buy it. It also lets me track the percentage of visitors that click through to each item, and other fun data.
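For clarity, here's a minimal sketch of what I'm doing now. Column names other than `itemid` are simplified stand-ins (the real table has more fields):

```sql
-- One row per item presented to a visitor (4 items on a page = 4 inserts)
CREATE TABLE sales (
  id        INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  visitorid INT UNSIGNED NOT NULL,
  itemid    INT UNSIGNED NOT NULL,   -- product id from the product table
  clicked   TINYINT(1) NOT NULL DEFAULT 0,
  bought    TINYINT(1) NOT NULL DEFAULT 0,
  created   DATETIME NOT NULL
);

-- Presenting a page with items 10, 11, 12, 13 to visitor 42:
INSERT INTO sales (visitorid, itemid, created)
VALUES (42, 10, NOW()), (42, 11, NOW()), (42, 12, NOW()), (42, 13, NOW());

-- When that visitor clicks item 11:
UPDATE sales
SET clicked = 1
WHERE visitorid = 42 AND itemid = 11;
```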
The problem is that once the table holds around 900k rows, querying it becomes extremely slow.
I have considered entering all 4 sales attempts as a comma-delimited field in a single row, but I'm not sure this is a good idea. I do know I have a problem, though: given the example above with 4 items on one page, the only thing that differs between the 4 rows I insert is the "itemid", which is the id of the product from the product table.
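For the record, the comma-delimited alternative I'm considering would look something like this (again a sketch, with made-up column names apart from the item ids):

```sql
-- One row per page view; itemids holds every item presented
CREATE TABLE sales_alt (
  id        INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  visitorid INT UNSIGNED NOT NULL,
  itemids   VARCHAR(255) NOT NULL,   -- e.g. '10,11,12,13'
  created   DATETIME NOT NULL
);

-- The catch: finding presentations of a single item now means
-- string matching inside the column, which can't use an index:
SELECT *
FROM sales_alt
WHERE FIND_IN_SET('11', itemids) > 0;
```

That last query is part of why I suspect this isn't a good idea, but I'd welcome correction.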
Any suggestions, greatly appreciated.