So, is there any way I can get a web browser (probably Firefox would be easiest) to collect the data from a URL periodically and write it to a file?
If not, is there a way to get something like links/lynx to do this on the command line in combination with cron?
I'm sure this must be possible in a number of ways, but I can't seem to do it at the moment :(
The site in question saves login details between sessions, so it shouldn't be necessary to negotiate a login each time. Simply present the appropriate cookies, grab the page content, and save it.
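For reference, something along these lines is roughly what I have in mind (an untested sketch; the URL, cookie-file path, and output directory are all placeholders, and it assumes curl is installed and the cookies are in Netscape format):

```shell
#!/bin/sh
# Sketch: fetch a page using saved cookies and write it to a timestamped file.
# All paths and the URL below are placeholders -- adjust for the real site.
URL="https://example.com/members/page"   # hypothetical target page
JAR="$HOME/.site-cookies.txt"            # Netscape-format cookie file
OUT="$HOME/grabs/page-$(date +%Y%m%d-%H%M).html"

mkdir -p "$HOME/grabs"
# --cookie reads cookies from the file; --output writes the body to a file
curl --silent --cookie "$JAR" --output "$OUT" "$URL"
```

Then a crontab entry could run the script on a schedule, e.g. every 30 minutes:

```
*/30 * * * * /home/user/grab-page.sh
```

Keeping the `date` call inside the script avoids having to escape the `%` characters, which cron treats specially in a crontab line.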
Can anyone make any suggestions please?