Anyone know of a utility that will search a set of directories and extract all http urls found?
minnapple
1:07 am on Jan 30, 2001 (gmt 0)
Brett,
What type of directories are you referring to?
Minnapple
Brett_Tabke
2:40 am on Jan 30, 2001 (gmt 0)
eek. Hard drive - files.
bartek
3:15 am on Jan 30, 2001 (gmt 0)
"The program allows to analyze files on your local computer and to extract URL, news and email addresses. Also you can apply the filter and export obtained data..."
This one looks like a program out of a spammer toolbox, but it might do what you need: [esalesbiz.com...]
Brett_Tabke
3:49 am on Jan 30, 2001 (gmt 0)
Bartek - nice to see you still hanging around here (thanks).
Looks like a nice util. Could I see something in freeware? In the Perl aisle?
Air
5:44 am on Jan 30, 2001 (gmt 0)
BT, if you have this module installed it should do it.
Brett_Tabke
Thanks. I came up with a solution based on just about all of the above. I'd forgotten that I had a DOS version of grep that came with a Borland compiler. A little fine-tuning of the output is still needed, but I just let it walk through all the files, filtering out the lines containing "http", and then massage the result with Perl.
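For anyone landing on this thread later: the approach described above (walk a directory tree, pull out the lines with URLs in them, then clean up the matches) can be sketched in Python instead of grep + Perl. This is a minimal illustration, not the exact tool anyone in the thread used; the regex and function name are my own.

```python
import os
import re

# Rough pattern for http/https URLs; stops at whitespace, quotes, and angle
# brackets. Good enough for a quick scan, not a full RFC 3986 parser.
URL_RE = re.compile(r'https?://[^\s"\'<>]+')

def extract_urls(root):
    """Walk every file under root and return a sorted list of unique URLs."""
    found = set()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as fh:
                    for line in fh:
                        found.update(URL_RE.findall(line))
            except OSError:
                continue  # skip unreadable files rather than abort the scan
    return sorted(found)
```

Calling `extract_urls("C:/some/dir")` prints one pass over the whole tree, so there is no need for a separate grep step followed by a massage script.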