Has anyone ever tried this or would be able to shed any light on the subject and point me in the right direction? Any help greatly appreciated.
Thanks
Ally
Try this thread:
[webmasterworld.com...] msg2
Those steps talk about writing it out to a file; you would just need to swap that step for the db inserts.
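Something along these lines is roughly the shape of it, reading the csv one line at a time and doing an insert instead of a file write. This is only a sketch, assuming mysqli; the table and column names (people, ref, name, email), the file name and the connection details are all made up, so swap in your own.

<?php
// minimal sketch: stream the csv and insert each row instead of writing it back out to a file
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$fh = fopen('import.csv', 'r');
if ($fh === false) {
    die('could not open csv');
}

// hypothetical table/column names - adjust to your schema
$stmt = $db->prepare('INSERT INTO people (ref, name, email) VALUES (?, ?, ?)');

while (($row = fgetcsv($fh)) !== false) {
    // $row is one csv line already split into its fields
    $stmt->bind_param('sss', $row[0], $row[1], $row[2]);
    $stmt->execute();
}

fclose($fh);
?>

Because fgetcsv only holds one line in memory at a time, the size of the file itself is less of a worry than the number of queries.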
It looks like the best way for me will be to split the file via a PHP function as you suggested and then act accordingly on each individual item.
Would there be any problem with the file size that you know of? My client says there will be somewhere around 2 million entries, making the CSV rather large. Are there any implications for the loading time of the page that does the processing? I imagine it will take a while to loop through all of those records.
I have set all the php.ini variables that need to be set so that it doesn't time out after a few minutes.
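For reference, this is roughly what I have at the top of the import script; the exact values are just what I guessed at, not gospel.

<?php
// settings at the top of the import script - the values here are guesses, adjust as needed
set_time_limit(0);                // no execution time limit for this script
ini_set('memory_limit', '256M');  // some headroom, though fgetcsv only holds one line at a time
ignore_user_abort(true);          // keep going even if the browser gives up on the page
?>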
Thanks
Ally
If the client is using Oracle, was there not an option to use Oracle for the site as well?
I imagine this would take a lot of work to change at this point, but it would make life very simple.
Is this only changes? And it's 2 million, man. You might have to split that up before running it.
2 million lines means a minimum of 4 million queries for the update (one query to look each record up and one to update it).
You had better make sure you do it in off hours; it may take a lot of time, even up to a couple of hours.
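If it helps, one way to cut the query count right down is to batch the rows into multi-row statements. This is only a sketch: it assumes the table has a unique key on ref so that INSERT ... ON DUPLICATE KEY UPDATE can do the "check then update" in one statement, and all the names are placeholders.

<?php
// sketch of chunking the run so it isn't millions of separate single-row queries
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$fh = fopen('import.csv', 'r');

$values = array();
while (($row = fgetcsv($fh)) !== false) {
    $values[] = sprintf("('%s','%s')",
        $db->real_escape_string($row[0]),
        $db->real_escape_string($row[1]));

    if (count($values) == 1000) {      // 1000 rows per statement - tune to taste
        $db->query('INSERT INTO people (ref, name) VALUES ' . implode(',', $values)
                 . ' ON DUPLICATE KEY UPDATE name = VALUES(name)');
        $values = array();
        usleep(100000);                // short pause between chunks so the server can breathe
    }
}
if ($values) {
    // flush whatever is left over after the loop
    $db->query('INSERT INTO people (ref, name) VALUES ' . implode(',', $values)
             . ' ON DUPLICATE KEY UPDATE name = VALUES(name)');
}
fclose($fh);
?>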
When the campaign starts running it will only be updates of the person's data where necessary, so it could be as little as 10,000 records that need updating, and this is only once a month (admittedly it could be over 1.5 million records that need updating).
My worry now is that this will be too much for MySQL to handle. I think their plan is to add more entries into the database at a later date to cope with other countries involved, taking it to over 11 million entries, although that is further down the line and I am not worried about it at the moment.
Just as long as I can handle the 2 million entries and do the updates successfully in the way I have mentioned.
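In my head the monthly run looks something like this, only touching the rows listed in the changes file. The column names (points, status, ref) and the csv layout are purely illustrative.

<?php
// sketch of the monthly run - only the rows in the changes file get touched
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$stmt = $db->prepare('UPDATE people SET points = ?, status = ? WHERE ref = ?');

$fh = fopen('monthly_changes.csv', 'r');
while (($row = fgetcsv($fh)) !== false) {
    // assuming the changes csv is laid out as: ref, points, status
    $stmt->bind_param('iss', $row[1], $row[2], $row[0]);
    $stmt->execute();
}
fclose($fh);
?>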
What do you think? Will it be able to run smoothly with that many entries?
Thanks
Ally
Hopefully it will not be too many updates every time; even 100K is no big deal, though it would still be a lot of queries.
I know people who tell me they run that many records all the time, so I don't think it will be too much of a problem, though if you need to search a lot you may see some slowness.
11 million, hmm, I'm not sure whether you might start experiencing some problems there. You will also need to look at the machines this is running on at some point and see if they have the horsepower to handle the volume, especially if you are doing updates through the site.
You could use a non-transactional install of Oracle and there would be no worries with these volumes. As I said, though, that may not be an option. It is definitely something you should consider for the future, I think.
The thing is, it is really hard to say whether you will have issues or not; there are just too many variables involved:
What kind of traffic are we talking about?
How many concurrent users?
What are the server specs?
How many servers are there?
What jobs are the servers doing?
Are you just reading from the db, or are there large numbers of writes as well (excluding the monthly update)?
Those are just a few I can think of off the top of my head; there are a ton more.
What I will do is just go for it with MySQL. Once the large amount of data is loaded up there will only be one query of the table per visit (at login), then I will pass their variables across the site as needed. Also, even though there may be a mid-level amount of traffic, there will not be any writes to the database, only reads by the user; the only writes are done in the administrator section.
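Roughly what I mean by one query per visit is something like this; the column names and session keys are just examples, not the real schema.

<?php
// single lookup at login, then the user's details live in the session for the rest of the visit
session_start();

$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$stmt = $db->prepare('SELECT id, name, points FROM people WHERE login_ref = ?');
$stmt->bind_param('s', $_POST['login_ref']);
$stmt->execute();
$stmt->bind_result($id, $name, $points);

if ($stmt->fetch()) {
    // everything after this is read from the session, no further db reads per page
    $_SESSION['user'] = array('id' => $id, 'name' => $name, 'points' => $points);
}
?>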
Thanks again.
Ally