Forum Moderators: coopster
I have an RSS feed setup for my forum. It uses php to access mysql and parse the posts into xml. The problem is that it doesn't cache, and I'm worried about server load. I'd like to periodically write the output of the php script to a static file (news.xml) which would be picked up by the end user.
I am considering two methods:
1) Include the script in another script that writes its output to a static file, and use a cron job to automate it.
2) Run curl -o news.xml [....rss.php...] from cron (or something like that).
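Option 2 can be a single crontab entry. A sketch, assuming a 15-minute refresh and made-up paths/URL (adjust all three to your setup); writing to a temp file and renaming it means readers never fetch a half-written news.xml:

```shell
# Hypothetical crontab entry: refresh the static feed every 15 minutes.
# -sS silences the progress bar but still reports errors; && means the
# rename only happens if curl succeeded.
*/15 * * * * curl -sS -o /path/to/rss/news.tmp "http://www.example.com/forum/rss.php?f=1" && mv /path/to/rss/news.tmp /path/to/rss/news.xml
```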
What is the best way to accomplish this task?
Thanks!
Erik
I ended up doing the following:
function Grabber($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); # return the response as a string instead of printing it
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_TIMEOUT, 60);       # give up after 60 seconds
    $content = curl_exec($ch);
    curl_close($ch);
    return $content;
}

//$PATH = $_ENV['DOCUMENT_ROOT']; # get server path
//$save_path = $PATH.'members';   # define target path
$save_path = '/home/lybp/public_html/rss'; # define target path
$data = Grabber("http://www.domain.com/forum/rss.php?&f=1");

$temp = 'news.tmp'; # define temporary target name
$dest = 'news.xml'; # define final destination target name

$fp = fopen($save_path.'/'.$temp, "w"); # open temp file for writing
fputs($fp, $data);                      # write all of $data to our opened file
fclose($fp);                            # close the file
rename($save_path.'/'.$temp, $save_path.'/'.$dest); # swap the finished file into place
Enjoy!
Erik