
Forum Moderators: coopster & jatar k


exporting 100000+ records to XML with PHP, need efficient XML writer

need to export user records for bulk mail via external company

12:58 pm on Apr 3, 2008 (gmt 0)

Preferred Member

10+ Year Member

joined:Dec 8, 2004
posts: 500
votes: 0

My problem is that I need to export a pretty hefty number of user records from a MySQL database to a file which can be easily imported by a third party into their application.

My client does not want to bulk-email its customers itself, since that would pose a lot of practical problems and would require more programming on my side and whatnot (that's what I figured). Not to mention the fact that the domain could spend time on some spam blacklist, something they really want to avoid.
As far as I can tell, there are three ways this problem can be tackled:

1: send the data in question in real time via forms to the company
2: send the data in real time via cURL or something similar
3: export the data to a file and have the company import it into their application

Since the first two options would require additional work, communication, and adjusting of scripts between the two parties, they are by far less flexible. My client does not want to be dependent on the other party and wants a versatile solution, so that they are not bound to them and can easily switch if need be. So I figured option 3 would be the best way. Yet option 3 also poses a lot of practical problems, especially when it comes to the size of the generated files.

So far I have exported (or tried to export) to:

- Excel, problem: maximum of 65,536 exported records
- Excel 2007, problem: memory allocation fails (needs over 512 MB for 100,000 records; class used: [codeplex.com...]). While this is an awesome class for versatility, the memory footprint is absolutely hilarious and can only be considered a joke, taking up around 1 MB per 57 rows of data...
- XML, problem: memory is also an issue here

Currently I am using the script at <snip>, which supposedly goes easy on the memory, but it also needs more than 512 MB.

It pretty much doesn't matter which XML writer class I use; all of them exhaust my memory.

How do other people tackle this issue? Any pointers on what I can do?
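Not from the thread, but the usual cure for exhausted memory here is to stream: write each record to the output file as it is fetched from an unbuffered cursor, instead of building the whole XML document (or a giant array of rows) in memory first. A minimal sketch of that idea in Python (the thread is about PHP, but the technique is the same; PHP's XMLWriter with openURI() works this way too). The field names and the generator standing in for a database cursor are made up for illustration:

```python
import xml.sax.saxutils as su

def export_records(rows, path):
    """Stream records to an XML file one at a time, so memory use
    stays flat no matter how many rows there are."""
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n<records>\n')
        for row in rows:
            # escape values so '&', '<', '>' can't break the markup
            fields = "".join(
                "<%s>%s</%s>" % (k, su.escape(str(v)), k)
                for k, v in row.items()
            )
            f.write("<record>%s</record>\n" % fields)
        f.write("</records>\n")

# rows would normally come from an unbuffered MySQL cursor;
# here a generator stands in for it, so nothing is held in memory.
rows = ({"id": i, "email": "user%d@example.com" % i} for i in range(100000))
export_records(rows, "users.xml")
```

The key point is that each record is serialized and forgotten immediately; the only thing that grows is the file on disk.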

[edited by: dreamcatcher at 1:30 pm (utc) on April 3, 2008]
[edit reason] no personal blog urls. thanks. [/edit]

4:07 pm on Apr 3, 2008 (gmt 0)

Junior Member

5+ Year Member

joined:Mar 15, 2008
votes: 0

I would set up a pseudo log file which directs my import script to transfer files in chunks. So, you might set it to do 10,000 records at a time. The log file keeps track of which record number you are up to, so in the future additional records can be seamlessly uploaded from the point where you left off.
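A minimal sketch of that checkpoint idea in Python (the file names, chunk size, and in-memory record list here are all hypothetical; in practice the batch would come from a `LIMIT offset, count` query):

```python
import os

CHUNK = 10000
STATE_FILE = "export_state.txt"  # the "log file" tracking progress

def load_offset():
    """Resume from where the previous run left off (0 on first run)."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return int(f.read().strip() or 0)
    return 0

def save_offset(offset):
    with open(STATE_FILE, "w") as f:
        f.write(str(offset))

def export_chunk(records):
    """Export the next CHUNK records and advance the checkpoint.
    Returns the number of records written (0 when done)."""
    start = load_offset()
    batch = records[start:start + CHUNK]
    if batch:
        with open("export_%d.csv" % start, "w") as f:
            for r in batch:
                f.write("%s\n" % r)
        save_offset(start + len(batch))
    return len(batch)
```

Each run picks up where the last one stopped, so new records added later get exported on the next pass without touching what was already sent.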
4:45 pm on Apr 3, 2008 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 10+ Year Member

joined:Aug 1, 2003
votes: 0

Judging by your description of the problem, if it were me I wouldn't use either PHP or XML, for efficiency's sake.

I'd just use a mysql command-line export command (or several) to create delimited text files (for example, tab-delimited text or CSV).
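For example, something along these lines (the database, table, and column names are hypothetical; both variants need a running MySQL server, and the second needs the FILE privilege):

```shell
# -B (batch mode) prints tab-separated rows; -N suppresses the header line.
mysql -B -N -u exportuser -p mydb \
  -e "SELECT id, email, name FROM users" > users.tsv

# Or have the server write a CSV directly:
# SELECT id, email, name FROM users
#   INTO OUTFILE '/tmp/users.csv'
#   FIELDS TERMINATED BY ',' ENCLOSED BY '"'
#   LINES TERMINATED BY '\n';
```

Either way the server streams rows straight to disk, so PHP's memory limit never enters the picture.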

