


MySQL: multiple rows, or a comma-delimited field?

     

NoLimits

5:10 pm on Nov 6, 2009 (gmt 0)




I have a MySQL database that is used for tracking visitor and sales activity across a network of sites. It works great for about the first 300,000 visitors, then gets progressively slower as it grows.

The problem lies in the following:

If there are 4 items on the page that the customer could potentially buy, I currently insert a new row for each item (4 rows) into the "sales" table, so that I know each item has been presented.

If they click on one of those items, I can update the record for that item accordingly as they look at it in closer detail, or buy it. Additionally, it allows me to track the % of visitors that click through to each item and other fun data.

The problem is that once the table has around 900k rows in it, querying it gets extremely slow.

I have considered storing all 4 sales attempts as a comma-delimited field in a single row, but I'm not sure that's a good idea. I know I have a problem either way, because in the example above with 4 items on one page, the only thing that differs between the 4 rows I insert is the "itemid", which is the id of the product from the product table.
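To make the layout concrete, it's roughly along these lines (simplified; apart from itemid, the column names here are placeholders rather than my real schema):

    CREATE TABLE sales (
        saleid    INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        visitorid INT UNSIGNED NOT NULL,   -- which visitor saw the page
        itemid    INT UNSIGNED NOT NULL,   -- id of the product from the product table
        shown_at  DATETIME NOT NULL,       -- when the item was presented
        clicked   TINYINT(1) NOT NULL DEFAULT 0,
        purchased TINYINT(1) NOT NULL DEFAULT 0
    );

    -- a page with 4 items means 4 near-identical INSERTs, differing only in itemid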

Any suggestions, greatly appreciated.

LifeinAsia

5:40 pm on Nov 6, 2009 (gmt 0)




I would not go the comma-delimited route. Most likely it will only make the issue worse.

Is your table properly indexed? Can you give an example of the query that's taking a long time to run?
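As a quick check, something along these lines (assuming the table is called "sales" and you filter on itemid - adjust the names to your actual schema):

    SHOW INDEX FROM sales;      -- lists the indexes that currently exist on the table

    EXPLAIN SELECT COUNT(*)
    FROM sales
    WHERE itemid = 123;         -- "type: ALL" in the output means a full table scan

If EXPLAIN reports a full table scan on a 900k-row table, that alone would account for the slowdown.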

LifeinAsia

5:42 pm on Nov 6, 2009 (gmt 0)




I just read through your posting again. I would use one table for all the other "fun" you record (1 row per page), then another table for the items on the page (1 row per item), with a key between the two tables to link the data.
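Something along these lines, purely as a sketch (table and column names are only illustrative):

    -- one row per page view
    CREATE TABLE page_views (
        viewid    INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        visitorid INT UNSIGNED NOT NULL,
        viewed_at DATETIME NOT NULL
    );

    -- one row per item presented on that page view
    CREATE TABLE view_items (
        viewid  INT UNSIGNED NOT NULL,     -- links back to page_views
        itemid  INT UNSIGNED NOT NULL,     -- id from your product table
        clicked TINYINT(1) NOT NULL DEFAULT 0,
        PRIMARY KEY (viewid, itemid),
        INDEX (itemid)
    );

That way the per-page data is stored once instead of being repeated in every item row.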

rocknbil

6:21 pm on Nov 6, 2009 (gmt 0)




Agreed, be sure the database is properly indexed **and** that the programming is efficient in its access of the database. 300K records is not really that large. Flat text databases, on the other hand, will eventually kill your server with files that large and will be extremely slow.
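For example (guessing at column names here - index whatever you actually filter and join on):

    ALTER TABLE sales ADD INDEX idx_itemid (itemid);        -- speeds up lookups and joins on itemid
    ALTER TABLE sales ADD INDEX idx_visitorid (visitorid);  -- likewise for per-visitor reporting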

previous thread, same issue, but with PHP [webmasterworld.com]

NoLimits

7:38 pm on Nov 7, 2009 (gmt 0)




I think you guys nailed my issue: indexes.

I have no (zero) indexes in my database. I've been reading up on them and familiarizing myself with them and I can't believe I've never been down that road before.

I am self-taught in anything that I know about code or databases, so there tend to be large holes in my knowledge.
It was especially satisfying to see that indexes give tremendous gains on multi-table joins like the ones this table is involved in.
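For anyone who finds this thread later, the kind of query I mean looks roughly like this (the products table and the clicked column are placeholders for my real schema):

    EXPLAIN SELECT COUNT(*) AS times_presented, SUM(s.clicked) AS click_throughs
    FROM sales s
    JOIN products p ON p.id = s.itemid
    WHERE s.itemid = 123;

    -- without an index on sales.itemid this has to scan every sales row to find item 123;
    -- with the index, MySQL jumps straight to the matching rows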

Thank you once again webmasterworld community for coming to my aid.