Home / Forums Index / Code, Content, and Presentation / Databases

Databases Forum

MySQL: multiple rows, or comma-delimited field?

 5:10 pm on Nov 6, 2009 (gmt 0)

I have a MySQL database that is used for tracking visitor and sales activity across a network of sites. It works great for about the first 300,000 visitors, then slows progressively as it grows.

The problem lies in the following:

If there are 4 items on the page that the customer could potentially buy, I currently insert a new row for each item (4 rows) into the "sales" table, so that I know the sale has been presented.

If they click on one of those items, I can update the record for that item accordingly as they look at it in closer detail, or buy it. Additionally, it allows me to track the % of visitors that click through to each item and other fun data.

The problem with this is that once the table has 900k rows or so in it, querying gets extremely slow.

I have considered entering all 4 sales attempts as a comma-delimited field in a single row, but I'm not sure this is a good idea. I know I have a problem, though, because in the example above with 4 items on one page, the only thing that differs between the 4 rows I insert is the "itemid" - the id of the product from the product table.
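For illustration, the repeated-insert pattern described above might look something like this (the table and column names here are my guesses, not the poster's actual schema):

```sql
-- One row per item presented on the page; only itemid varies
-- between the four inserts for the same visitor and page.
INSERT INTO sales (visitorid, pageid, itemid, shown_at) VALUES (123, 45, 1001, NOW());
INSERT INTO sales (visitorid, pageid, itemid, shown_at) VALUES (123, 45, 1002, NOW());
INSERT INTO sales (visitorid, pageid, itemid, shown_at) VALUES (123, 45, 1003, NOW());
INSERT INTO sales (visitorid, pageid, itemid, shown_at) VALUES (123, 45, 1004, NOW());
```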

Any suggestions, greatly appreciated.



 5:40 pm on Nov 6, 2009 (gmt 0)

I would not go the comma-delimited route. Most likely it will only make the issue worse.

Is your table properly indexed? Can you give an example of the query that's taking a long time to run?
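For example (column names hypothetical, assuming queries filter on a visitor id), you can add an index and then check the query plan with EXPLAIN:

```sql
-- An index on the filtered column lets MySQL avoid a full table scan.
ALTER TABLE sales ADD INDEX idx_visitorid (visitorid);

-- EXPLAIN shows whether the index is used (the "key" column)
-- or whether the query still does a full scan ("type: ALL").
EXPLAIN SELECT * FROM sales WHERE visitorid = 123;
```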


 5:42 pm on Nov 6, 2009 (gmt 0)

I just read through your posting again. I would use one table for all the other "fun" you record (1 row per page), then another table for the items on the page (1 row per item), with a key between the two tables to link the data.
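A rough sketch of that two-table layout (all names illustrative only, not a definitive schema):

```sql
-- One row per page view, holding the per-visit data.
CREATE TABLE page_views (
    viewid    INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    visitorid INT UNSIGNED NOT NULL,
    viewed_at DATETIME NOT NULL,
    INDEX idx_visitorid (visitorid)
);

-- One row per item presented on that page, linked back by viewid.
CREATE TABLE view_items (
    viewid  INT UNSIGNED NOT NULL,
    itemid  INT UNSIGNED NOT NULL,
    clicked TINYINT(1) NOT NULL DEFAULT 0,
    bought  TINYINT(1) NOT NULL DEFAULT 0,
    INDEX idx_viewid (viewid),
    INDEX idx_itemid (itemid)
);

-- Join the two tables to get per-item detail for a visit.
SELECT v.visitorid, i.itemid, i.clicked, i.bought
FROM page_views v
JOIN view_items i ON i.viewid = v.viewid
WHERE v.visitorid = 123;
```

The key column (viewid) keeps the per-page data in one place instead of duplicating it across four rows, and the indexes keep the join fast.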


 6:21 pm on Nov 6, 2009 (gmt 0)

Agreed, be sure the database is properly indexed **and** that the programming is efficient in its access of the databases. 300K records is not really that large. Flat text databases, on the other hand, will eventually kill your server with files that large, and will be extremely slow.

previous thread, same issue, but with PHP [webmasterworld.com]


 7:38 pm on Nov 7, 2009 (gmt 0)

I think you guys nailed my issue with indexes.

I have no (zero) indexes in my database. I've been reading up on them and familiarizing myself with them, and I can't believe I've never been down that road before.

I am self-taught in everything I know about code and databases, so there tend to be large holes in my knowledge.
It was especially satisfying to see the tremendous gains that indexes bring to multiple-table joins like the ones this table uses.

Thank you once again, WebmasterWorld community, for coming to my aid.
