Forum Moderators: coopster


Getting urls in parallel

What's the best way?

         

zootreeves

7:24 pm on Sep 9, 2004 (gmt 0)

10+ Year Member



Hi,

I am trying to write a meta-search script, but at the moment the URLs are fetched in series, and it is very slow. How can I fetch URLs in parallel/simultaneously using PHP?

Can you please post an example of how it could be done.

Thank you
Ben Reeves

lazydog

7:47 pm on Sep 9, 2004 (gmt 0)

10+ Year Member



Hi!

Use the curl functions. Look at the curl_multi_* functions in the manual.
[php.net...]

These functions are not fully documented yet in the PHP manual, but they work similarly to libcurl's C multi interface. You'll get a good idea here -

[curl.haxx.se...]
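To make this concrete, here is a minimal sketch of the curl_multi_* approach, assuming PHP with the cURL extension enabled. The function name `fetch_parallel`, the timeout value, and the option choices are my own for illustration, not from the manual.

```php
<?php
// Fetch several URLs at once using the curl_multi_* functions.
// Returns an array of response bodies, keyed the same way as $urls.
function fetch_parallel(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture body instead of printing it
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't let one slow engine block everything
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers together until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            if (curl_multi_select($mh) === -1) {
                usleep(1000); // avoid busy-looping if select() reports no fds
            }
        }
    } while ($running && $status === CURLM_OK);

    // Collect each body and clean up.
    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

The key idea is that `curl_multi_exec` advances all transfers a little at a time, while `curl_multi_select` sleeps until one of the sockets has data, so the total wall-clock time is roughly that of the slowest URL rather than the sum of all of them.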

Saurabh.

zootreeves

8:09 pm on Sep 11, 2004 (gmt 0)

10+ Year Member



Hi,

Thanks very much for your help, but could you please post a simple example?

Birdman

9:03 pm on Sep 11, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hello,

Lazydog has already given you the info you need. If you read the curl_multi_* documentation as suggested, there is an example posted there.

[php.net...]

Regards,
Birdman

PS: These functions are PHP 5 only. Perl may be a better option for you.