Thanks. I know urlwatch sends to stdout (message appears in terminal if no pipe) and sendemail uses stdin if there is no -m argument. :(
Your reference to stdin/out spurred me to search for those terms in conjunction with sendemail, and I found the suggestion to use $1 for the message body (i.e. -m $1). This resulted in an empty body, though. I tried $0 and it emailed me "bash" (this was from the Mint terminal). In case the first line was blank I also tried $2 and $3, but to no avail: the body was always blank.
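The empty body makes sense once you see where piped data actually goes: it arrives on the receiving command's stdin, not in its positional parameters ($0 is just the program name, which is why "bash" turned up). A minimal plain-shell demonstration (the sendemail line at the end is only a sketch of one possible fix, using command substitution):

```shell
#!/bin/sh
# Piped data goes to stdin, NOT to $1/$2/...; $0 is the program name.
printf 'piped text\n' | sh -c 'echo "args: [$1] [$2]"; echo "stdin: $(cat)"' sh
# prints: args: [] []
#         stdin: piped text

# One way to turn piped output into a -m argument is command
# substitution (untested sketch, other sendemail flags omitted):
#   sendemail -m "$(urlwatch)" ...
```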
I've tried running it as a cron job without -m (as well as with), but got two emails each time, regardless of whether urlwatch detected an error. The first email was blank; the second (to postmaster) contained the text I had been seeing when running it in the terminal...
=====
Reading message body from STDIN because the '-m' option was not used.
If you are manually typing in a message:
  - First line must be received within 60 seconds.
  - End manual input with a CTRL-D on its own line.
Feb 15 19:51:02 mail sendemail: Message input complete.
Feb 15 19:51:02 mail sendemail: Email was sent successfully!
=====
I suspect the error may be due to the missing Ctrl-D, which I cannot send this way. As I said, I think the answer is a tailored script, or possibly an intermediate temp file. :(
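For what it's worth, the temp-file idea could look something like the sketch below: capture the output, and only mail when there is actually something to report. The mail_if_output name is made up, and the echo stands in for the real sendemail call (shown commented out, flags illustrative):

```shell
#!/bin/sh
# Hypothetical wrapper: run a command, keep its output in a temp file,
# and only act when that output is non-empty.
mail_if_output() {
    tmp=$(mktemp) || return 1
    "$@" > "$tmp"                  # run the watched command, e.g. urlwatch -e
    if [ -s "$tmp" ]; then         # -s: true if the file is non-empty
        # Real version would be something like (flags illustrative):
        #   sendemail -f me@host -t me@host -u "urlwatch report" -m "$(cat "$tmp")"
        echo "would send: $(cat "$tmp")"
    fi
    rm -f "$tmp"
}
```

With this, mail_if_output urlwatch -e would stay silent on a clean run and only produce (or send) output when urlwatch reports something.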
Finally got this working. As I secretly suspected, it was me! :(
I was piping urlwatch into sendemail, which worked in the terminal (though without the expected email body text). Following a couple of online examples I used the same pipe (urlwatch into sendemail) in cron, which gave me the same result without telling me WHAT had failed, and emailed me even when there was no error.
Looking deeper into it, and searching for cron examples rather than urlwatch or sendemail ones, I came across a useful site that (mostly by inference) guided me to setting up cron correctly. The essence was: there is no need to explicitly email through cron; by default it emails a job's output, which here only appears on error. The cron line I am now using is...
1,11,21,31,41,51 * * * * user urlwatch -e
This checks a list of sites every 10 minutes. If there is no error I get no email (a minor problem should the local mail server go down). If there is an error or I've added a new URL I get an appropriate email listing all the erratic or new URLs.
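To make the mechanism explicit: cron mails a job's stdout/stderr to the crontab's owner, and the destination can be overridden with the MAILTO variable. An illustrative user crontab (the address is a placeholder):

```
# Edit with: crontab -e
MAILTO=me@example.com          # optional; defaults to the crontab's owner
# Every 10 minutes; output (errors / new URLs) is mailed automatically.
1,11,21,31,41,51 * * * * urlwatch -e
```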
The useful page was at... www[.]thegeekstuff[.]com/2009/06/15-practical-crontab-examples/
I now have to see if it will also check mail servers.