Forum Moderators: coopster
I developed a page counter and initially designed my database with one table per counter. Each table records the traffic of one counter (daily visits per IP address).
This tool is proving to be quite popular. I already have several hundred registrations, and thus several hundred tables. I am now considering merging all of the tables into a single table.
Would a single table (with thousands of entries) be too large for read and write queries from several dozen simultaneous users? Is indexing by counter id enough to make this solution viable?
Thank you for your assistance.
Roberto
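To make the question concrete, a unified schema might look something like the sketch below. The table and column names are hypothetical, and Python's built-in sqlite3 stands in for MySQL here; the idea is one row per (counter, IP, day), with a UNIQUE constraint enforcing the one-visit-per-IP-per-day rule and an index on counter_id for per-counter reads.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # SQLite stands in for MySQL in this sketch

# One shared table instead of one table per counter.
conn.execute("""
    CREATE TABLE hits (
        counter_id INTEGER NOT NULL,
        ip         TEXT    NOT NULL,
        visit_day  TEXT    NOT NULL,            -- e.g. '2024-05-01'
        UNIQUE (counter_id, ip, visit_day)      -- one visit per IP per day
    )
""")
# Index so each counter's reads don't scan everyone else's rows.
conn.execute("CREATE INDEX idx_hits_counter ON hits (counter_id)")

# Record a visit; INSERT OR IGNORE mirrors MySQL's INSERT IGNORE, so a
# repeat visit from the same IP on the same day is silently dropped.
conn.execute("INSERT OR IGNORE INTO hits VALUES (?, ?, ?)",
             (42, "203.0.113.7", "2024-05-01"))
conn.execute("INSERT OR IGNORE INTO hits VALUES (?, ?, ?)",
             (42, "203.0.113.7", "2024-05-01"))  # duplicate: ignored

total = conn.execute("SELECT COUNT(*) FROM hits WHERE counter_id = ?",
                     (42,)).fetchone()[0]
print(total)  # 1
```

With thousands (or even millions) of rows, the counter_id index keeps per-counter lookups cheap; raw table size is rarely the problem at this scale.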
I would go to one/two tables the most.
Best regards
Michal CIbor
Would a single table (with thousands of entries) be too large for read and write queries from several dozen simultaneous users?
You may run into locking issues, as MySQL is not exactly the best database for many concurrent writers (reading is fine, as it does not require serious locking).
I'd capture some stats first so that you know how many concurrent reads/writes you actually do. Writers are the real issue; it is possible that it may actually make sense to keep 10-100 separate tables, which is in effect manual partitioning of the data, spreading writers/readers the way one spreads butter over bread.
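The manual-partitioning idea above could be sketched as follows. The partition count and routing rule are assumptions (a simple modulo on counter id); the point is that each counter always lands in the same table, so writers for different counters mostly contend on different table locks.

```python
import sqlite3

N_PARTITIONS = 10  # assumed partition count; tune to the measured write load

conn = sqlite3.connect(":memory:")  # SQLite standing in for MySQL
for p in range(N_PARTITIONS):
    conn.execute(f"""
        CREATE TABLE hits_{p} (
            counter_id INTEGER NOT NULL,
            ip         TEXT    NOT NULL,
            visit_day  TEXT    NOT NULL,
            UNIQUE (counter_id, ip, visit_day)
        )
    """)

def partition_for(counter_id):
    # Modulo routing: a given counter is always in the same partition table.
    return f"hits_{counter_id % N_PARTITIONS}"

def record_visit(counter_id, ip, day):
    table = partition_for(counter_id)
    conn.execute(f"INSERT OR IGNORE INTO {table} VALUES (?, ?, ?)",
                 (counter_id, ip, day))

record_visit(7,  "198.51.100.2", "2024-05-01")
record_visit(17, "198.51.100.3", "2024-05-01")  # 17 % 10 == 7: same partition
row_count = conn.execute("SELECT COUNT(*) FROM hits_7").fetchone()[0]
print(row_count)  # 2
```

Reads for a counter only ever touch its own partition, so reporting queries stay as cheap as in the one-table-per-counter design while the table count stays fixed.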
If performance degrades as the number of users increases, you may wish to consider using two tables: one for inserts and one for selects. You could then set up a cron job to merge the tables (using INSERT..SELECT) during a window when activity is minimal. This would mean you could no longer offer real-time tracking, but you could run the merge job several times daily. Another benefit would be the ability to speed up your tracking reports by having multiple indexes on the 'select' table without incurring a performance overhead on the 'inserts'.
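The two-table scheme above might be sketched like this (again with sqlite3 standing in for MySQL, and hypothetical table names): live traffic writes to an index-free insert table, and a periodic job moves the accumulated rows into a heavily indexed reporting table with INSERT..SELECT.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # SQLite standing in for MySQL

# 'hits_in' takes live writes and carries no secondary indexes;
# 'hits_report' carries the extra indexes used only for reporting.
conn.execute("CREATE TABLE hits_in     (counter_id INTEGER, ip TEXT, visit_day TEXT)")
conn.execute("CREATE TABLE hits_report (counter_id INTEGER, ip TEXT, visit_day TEXT)")
conn.execute("CREATE INDEX idx_report_counter ON hits_report (counter_id)")
conn.execute("CREATE INDEX idx_report_day     ON hits_report (visit_day)")

# Live traffic only ever inserts into hits_in.
conn.executemany("INSERT INTO hits_in VALUES (?, ?, ?)",
                 [(1, "192.0.2.1", "2024-05-01"),
                  (2, "192.0.2.2", "2024-05-01")])

def merge_job():
    # The cron job: copy accumulated rows into the reporting table, then
    # empty the insert table. In MySQL this would be INSERT..SELECT plus
    # DELETE/TRUNCATE, run inside a transaction or with the table locked.
    with conn:  # one transaction, so readers never see a half-done merge
        conn.execute("INSERT INTO hits_report SELECT * FROM hits_in")
        conn.execute("DELETE FROM hits_in")

merge_job()
merged  = conn.execute("SELECT COUNT(*) FROM hits_report").fetchone()[0]
pending = conn.execute("SELECT COUNT(*) FROM hits_in").fetchone()[0]
print(merged, pending)  # 2 0
```

Reports then run against hits_report with as many indexes as you like, while inserts stay cheap because hits_in has none to maintain.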