Forum Moderators: coopster
I bet some of you use PHP from the command line out of habit. This is a small script that will load once and then accept input from the command line (using PHP 4.3).
<?php
set_time_limit(0); // run indefinitely
define('STDIN', fopen('php://stdin', 'r'));
define('STDOUT', fopen('php://stdout', 'w'));

$selected_options = array();
$options = array(
    'curl' => 'Protocol Handling',
    'wget' => 'Protocol Handling',
    'GET'  => 'Protocol Handling (via Perl)',
    'tidy' => 'SGML Validation (via protocol wrapper)',
);
// preload some dictionary data here

function display_options(&$options)
{
    fwrite(STDOUT, "Enter a choice of option below\n");
    foreach ($options as $option => $value) {
        fwrite(STDOUT, "\t$option\t$value\n");
    }
    fwrite(STDOUT, "> ");
}

for (;;) {
    display_options($options);
    while ($input = trim(fgets(STDIN, 1000))) {
        if ($input == 'exit') {
            exit(0);
        } elseif ($input == 'options') {
            display_options($options);
        } elseif (isset($options[$input])) {
            // code to "do stuff" re options
        } else {
            fwrite(STDOUT, "> ");
        }
    }
}
?>
The script is part of a shell that would fetch web pages. Some other options would also be available, like counting the words on a page.
This would involve loading a large dictionary "once" when the script starts, which could be a large chunk of data. The script could then accept a command to "fetch and parse a page", and use the dictionary to compare to the words on the page.
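To give a rough idea of the "load once, then compare" part — this is only a sketch, and the dictionary file name, its one-word-per-line format, and the word-splitting regex are all my assumptions:

```php
<?php
// Build a dictionary lookup table once at startup; flipping the word
// list into array keys gives cheap isset() lookups instead of
// scanning with in_array() for every word on the page.
function load_dictionary($path)
{
    // assumes one word per line; array_map('trim') strips the newlines
    return array_flip(array_map('trim', file($path)));
}

// Count how many words on a fetched page appear in the dictionary.
function count_known_words($html, $dictionary)
{
    $text  = strip_tags($html);
    $words = preg_split('/\W+/', strtolower($text), -1, PREG_SPLIT_NO_EMPTY);
    $known = 0;
    foreach ($words as $word) {
        if (isset($dictionary[$word])) {
            $known++;
        }
    }
    return $known;
}
?>
```

The shell would call `load_dictionary('dictionary.txt')` once before the main loop, then each "fetch and parse" command would just run `count_known_words()` against the page it grabbed.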
Consider if I wanted to put this script on a webserver. I have ten client PCs (i.e. my PC) that'd contact this script with commands, and the script would perform them.
Is there a simple way that I can "preload" the dictionary data, so that it would not be loaded every time one of the client PC's hits the script?
i.e. it is permanently running, and "listens" for requests to it, and would not have to load all the "preloaded data" for each unique connection.
Hope I explained that without making too much of a dog's ear of it! Cheers for any pointers or advice. I had a look at the "socket" and "stream" functions but I'm still at a bit of a loss.
i.e. it is permanently running, and "listens" for requests to it, and would not have to load all the "preloaded data" for each unique connection.
Well, other than building an intricate server/client system using sockets, which would really be kinda overkill, you could use shared memory.
What you would do at that point is run that script of yours and load the dictionary into shared memory; then the scripts that the web clients hit will read the data back out of shared memory. One of the better ways to do this (I found) was to tie your "master" script to Apache.
So your normal httpd startup script will load the PHP script, which creates a SHM segment and fills it with your data, and the shutdown script will delete the shared memory segment.
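Very roughly, using PHP's SysV shared memory functions (this needs the sysvshm extension compiled in; the key, segment size, and variable id below are arbitrary values I've picked for illustration):

```php
<?php
// --- startup script (run once, e.g. from your httpd start script) ---
$key = 0xD1C7;                        // arbitrary agreed-upon SHM key
$shm = shm_attach($key, 1024 * 1024); // 1 MB segment; size is a guess

// load your real dictionary here; a tiny stand-in for the sketch:
$dictionary = array('hello' => 1, 'world' => 1);
shm_put_var($shm, 1, $dictionary);    // 1 = arbitrary variable id
shm_detach($shm);

// --- per-request script (what each web client hit runs) ---
$shm = shm_attach($key, 1024 * 1024); // attaches to the existing segment
$dictionary = shm_get_var($shm, 1);   // no reload, just read it back
shm_detach($shm);

// --- shutdown script ---
// $shm = shm_attach($key); shm_remove($shm);  // frees the segment
?>
```

The per-request scripts only pay for deserializing the variable, not for rebuilding the dictionary from disk, and the segment lives as long as Apache does.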
HTH,
MM
I think it's overkill (IMHO, of course ;), since you would need to involve threads (otherwise you could only answer one web client at a time), plus sockets on both the listening side (the server portion, to dish out the dictionary) and the connecting side (the client end, requesting the dictionary). I'm sure there are plenty of PEAR classes you could use for this, but it sounds much harder than it needs to be. :)
Good luck!
MM