Home / Forums Index / Code, Content, and Presentation / PHP Server Side Scripting
Forum Library, Charter, Moderators: coopster & jatar k

PHP Server Side Scripting Forum

exporting 100000+ records to XML with PHP, need efficient XML writer
need to export user records for bulk mail via external company

10+ Year Member

Msg#: 3618029 posted 12:58 pm on Apr 3, 2008 (gmt 0)

my problem is that I need to export a pretty hefty number of user records from a MySQL database to a file that a third party can easily import into their application.

My client does not want to bulk email its customers itself, since that would pose a lot of practical problems and would require more programming on my side (that's what I figured). Not to mention that the domain could end up on a spam blacklist, something they really want to avoid.
As far as I can tell, there are three ways this problem can be tackled:

1: send the data in question in real time via forms to the company
2: send the data in real time via cURL or similar
3: export the data to a file and have the company import it into their application

since the first two options would require additional work, communication, and script adjustments between the two parties, they are far less flexible. My client does not want to be dependent on the other party and wants a versatile solution, so that they can easily switch if need be. So I figured option 3 would be the best way. Yet option 3 also poses a lot of practical problems, especially when it comes to the size of the generated files.

So far I have exported (or tried to export) to:

- Excel, problem: maximum of 65536 exported records
- Excel 2007, problem: memory allocation fails (needs over 512 MB for 100000 records; class used: [codeplex.com...]). While this is an awesome class for versatility, its memory footprint is absurd, taking up around 1 MB per 57 rows of data...
- XML, problem: memory is also an issue here

currently I am using the script at <snip>, which supposedly goes easy on memory, but it also needs more than 512 MB

it pretty much doesn't matter which XML writer class I use; all of them exhaust my memory

how do other people tackle this issue? Any pointers to what I can do?
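(For context: the memory blow-up typically comes from classes that build the whole document in RAM before saving. PHP's bundled XMLWriter extension can stream straight to a file instead, so memory use stays flat regardless of row count. A minimal sketch; the column names and the dummy row array below are placeholders for an unbuffered MySQL result set, e.g. one fetched with MYSQLI_USE_RESULT:)

```php
<?php
// Sketch: stream records straight to disk with XMLWriter instead of
// building the whole document in memory. $rows stands in for an
// unbuffered MySQL result; the column names are made up for the example.
$rows = [
    ['name' => 'Alice', 'email' => 'alice@example.com'],
    ['name' => 'Bob',   'email' => 'bob@example.com'],
];

$writer = new XMLWriter();
$writer->openUri('users.xml');        // write directly to the file, not to RAM
$writer->startDocument('1.0', 'UTF-8');
$writer->startElement('users');

$i = 0;
foreach ($rows as $row) {             // with a real DB: while ($row = $result->fetch_assoc())
    $writer->startElement('user');
    $writer->writeElement('name',  $row['name']);
    $writer->writeElement('email', $row['email']);
    $writer->endElement();
    if (++$i % 1000 === 0) {
        $writer->flush();             // push the internal buffer to disk periodically
    }
}

$writer->endElement();
$writer->endDocument();
$writer->flush();
```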

[edited by: dreamcatcher at 1:30 pm (utc) on April 3, 2008]
[edit reason] no personal blog urls. thanks. [/edit]



5+ Year Member

Msg#: 3618029 posted 4:07 pm on Apr 3, 2008 (gmt 0)

I would set up a pseudo log file which directs my export script to transfer the records in chunks. So, you might set it to do 10000 records at a time. The log file keeps track of which record number you are up to, so in the future additional records can be seamlessly uploaded from the point where you left off.
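That idea can be sketched roughly like this; the file names, batch size, and dummy rows are hypothetical stand-ins for the real query:

```php
<?php
// Sketch of the chunked export: a small state file remembers the id of
// the last exported record, and each run exports the next batch.
$stateFile = 'export_position.txt';
$batchSize = 10000;

$lastId = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

// With a real database this would be a query along the lines of:
//   SELECT id, name, email FROM users
//   WHERE id > $lastId ORDER BY id LIMIT $batchSize
// A dummy result set keeps the example self-contained.
$rows = [
    ['id' => $lastId + 1, 'name' => 'Alice', 'email' => 'alice@example.com'],
    ['id' => $lastId + 2, 'name' => 'Bob',   'email' => 'bob@example.com'],
];

$out = fopen(sprintf('users_from_%d.csv', $lastId), 'w');
foreach ($rows as $row) {
    fputcsv($out, $row);              // one CSV line per record
    $lastId = $row['id'];
}
fclose($out);

// Remember where this run stopped so the next one carries on seamlessly.
file_put_contents($stateFile, (string) $lastId);
```

Keying the batches on the last exported id, rather than LIMIT with a growing OFFSET, also keeps later batches from getting slower as the offset deepens.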


WebmasterWorld Senior Member 10+ Year Member

Msg#: 3618029 posted 4:45 pm on Apr 3, 2008 (gmt 0)

Judging by your description of the problem, if it were me I wouldn't use PHP or XML at all, for efficiency's sake.

I'd just use a MySQL command-line export (or several) to create delimited text files (for example, tab-delimited text or CSV).
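For illustration, that could look something like the following; the user, database, table, and column names are all made up, and it is only a sketch of the approach:

```
# Dump straight to a tab-delimited file without PHP in the loop
mysql -u exportuser -p mydb --batch --skip-column-names \
  -e "SELECT name, email FROM users" > users.tsv
```

The --batch flag makes the client emit tab-separated output, which most import tools can read directly; memory use on the PHP side is zero because PHP is never involved.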

