Read/Write files + traffic

         

ag_47

10:41 pm on May 20, 2008 (gmt 0)

10+ Year Member



I have some XML files used for quick searching. Each entry has an id, which is used to retrieve the complete information from the database if the user clicks a search result link. I have a couple of concerns regarding this.
I am using PHP's xml_parser functions to read and display the data in the XML files.
I'm using PHP's filesystem functions (fopen(), etc.) for inserting/deleting entries.

1. Is there a better way to insert/delete data in XML files?
2. Will this be an issue as the number of users increases? Data is inserted/deleted per user request, so if several users make different edits simultaneously, will the files get messed up? Is there a way to queue these requests?

Thanks.

eelixduppy

11:22 pm on May 20, 2008 (gmt 0)



You should take a look at SimpleXML [php.net] with PHP 5; it's the best way to handle XML files. As for multiple edits at the same time: lock the file so that concurrent edits cannot be made, in which case the second writer gets an error. You can handle that accordingly by creating a queue, but this could be more complicated.
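A minimal sketch of the locking idea, assuming a search index file posts.xml holding <post id="..."><title>...</title></post> entries (the file layout and function name are made up for illustration):

<?php
function addPost($file, $id, $title) {
    $fp = fopen($file, 'c+'); // read/write, create if missing (PHP 5.2.6+)
    if (!$fp) {
        return false;
    }
    // Exclusive, non-blocking lock: if another request is mid-write,
    // flock() fails immediately instead of waiting.
    if (!flock($fp, LOCK_EX | LOCK_NB)) {
        fclose($fp);
        return false; // caller can retry or queue the request
    }
    $contents = stream_get_contents($fp);
    $xml = ($contents === '')
        ? new SimpleXMLElement('<posts/>')
        : new SimpleXMLElement($contents);
    $post = $xml->addChild('post');
    $post->addAttribute('id', $id);
    // htmlspecialchars() works around addChild() not escaping '&'
    $post->addChild('title', htmlspecialchars($title));
    ftruncate($fp, 0); // rewrite the whole file while we hold the lock
    rewind($fp);
    fwrite($fp, $xml->asXML());
    fflush($fp);
    flock($fp, LOCK_UN);
    fclose($fp);
    return true;
}

A false return is exactly the point where a queue (or a simple retry loop) would come in.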

ag_47

2:35 am on May 21, 2008 (gmt 0)

10+ Year Member



Yes, thank you. I was worried I'd have to do that :P
I'm using XML for quick searches; only a post's title and id are saved there. If the user chooses to view a whole post, they click the title and I use the id to quickly retrieve all the info from the DB.

Should I just forget the XML method and search directly in the database? Would that slow the server down too much? It seems that although reading an XML file is fast (duh), the whole updating process is unreliable.
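I imagine searching the DB directly would be something like this (just a sketch; the posts table, its columns, and the connection details are made up, and MySQL's FULLTEXT currently needs a MyISAM table):

<?php
// Hypothetical index: ALTER TABLE posts ADD FULLTEXT (title);
$db = new mysqli('localhost', 'user', 'pass', 'forum');
$stmt = $db->prepare('SELECT id, title FROM posts WHERE MATCH(title) AGAINST (?)');
$term = $_GET['q'];
$stmt->bind_param('s', $term);
$stmt->execute();
$stmt->bind_result($id, $title);
while ($stmt->fetch()) {
    // Each hit carries the id needed to pull the full post from the DB.
    echo $id . ': ' . htmlspecialchars($title) . "\n";
}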

I had another idea: instead of updating the XML files whenever a user adds a new post, maybe I should write a routine that rebuilds the file at some interval of time, along the lines of the sketch below. New posts would be stored in the database and would 'become visible' in searches a little later..?
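Something like this, run from cron every few minutes (a sketch only; the credentials, table, and file names are placeholders):

<?php
$db = new mysqli('localhost', 'user', 'pass', 'forum');
if ($db->connect_errno) {
    die('DB connection failed');
}
// Regenerate the whole search index from the database.
$xml = new SimpleXMLElement('<posts/>');
$result = $db->query('SELECT id, title FROM posts');
while ($row = $result->fetch_assoc()) {
    $post = $xml->addChild('post');
    $post->addAttribute('id', $row['id']);
    $post->addChild('title', htmlspecialchars($row['title']));
}
// Write to a temp file, then rename: rename() is atomic on the same
// filesystem, so a search never reads a half-written index.
file_put_contents('posts.xml.tmp', $xml->asXML());
rename('posts.xml.tmp', 'posts.xml');

That would also sidestep the locking problem, since readers only ever see a complete file.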

Thanks.