I find that it's tough to find sites I would consider reading regularly in the first place...let alone a large enough collection so that I could have my finger on the pulse of the Internet...or whatever it is these gurus do...
I recommend having your own news application custom-built rather than going with a service (such as the one in UK). It's cheaper in the long run, and you have total control.
I don't think many SEOs are onto this concept yet, but as with anything effective, they will catch on in due time.
I'm familiar with a lot of the tools and resource sites, but I'm still wondering how others manage this. For example, do you just download other people's OPML files, or is it a grueling process of accumulating thousands of sources one by one? It seems to me that just having a ton of sources really isn't good. You'd want to qualify those sources somehow, wouldn't you?
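For anyone going the OPML route: an OPML file is just XML, so pulling the feed URLs out of someone else's subscription list is a few lines of code. Here's a minimal sketch using Python's standard library; the sample document and URL are made up for illustration:

```python
import xml.etree.ElementTree as ET

def feeds_from_opml(opml_text):
    """Return the xmlUrl of every feed entry in an OPML document."""
    root = ET.fromstring(opml_text)
    # Feed subscriptions are <outline> elements carrying an xmlUrl
    # attribute; folder/grouping outlines have no xmlUrl and are skipped.
    return [o.attrib["xmlUrl"]
            for o in root.iter("outline")
            if "xmlUrl" in o.attrib]

# Hypothetical sample subscription list:
sample = """<?xml version="1.0"?>
<opml version="1.1">
  <body>
    <outline text="News">
      <outline text="Example Feed" type="rss"
               xmlUrl="http://example.com/rss.xml"/>
    </outline>
  </body>
</opml>"""

print(feeds_from_opml(sample))  # ['http://example.com/rss.xml']
```

That gets you a raw list fast; qualifying which of those sources are actually worth reading is the part no script will do for you.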
You can use software to scan usenet groups and convert those into RSS feeds. Scraping sites for content isn't that difficult either; all you really need to do is spider a list of sites on a regular basis.
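On the spidering side, most sites that publish a feed advertise it with a `<link rel="alternate">` tag in the page head, so your spider doesn't even need to scrape content to find the feed. A rough sketch with Python's standard library (parsing a hard-coded sample page here rather than fetching over the network, and the URL is invented):

```python
from html.parser import HTMLParser

class FeedLinkFinder(HTMLParser):
    """Collects RSS/Atom autodiscovery links from a page's HTML."""
    FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # Autodiscovery convention: <link rel="alternate"
        # type="application/rss+xml" href="...feed url...">
        if (tag == "link"
                and a.get("rel", "").lower() == "alternate"
                and a.get("type") in self.FEED_TYPES
                and "href" in a):
            self.feeds.append(a["href"])

# Hypothetical page you'd normally download with your spider:
page = """<html><head>
<link rel="alternate" type="application/rss+xml"
      title="Site feed" href="http://example.com/index.rss">
</head><body>...</body></html>"""

finder = FeedLinkFinder()
finder.feed(page)
print(finder.feeds)  # ['http://example.com/index.rss']
```

Run that over each homepage in your list on a schedule and you have the feed URLs without parsing the articles themselves.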
[edited by: Woz at 6:56 am (utc) on May 12, 2004]
[edit reason] Tidying up [/edit]
I guess my conundrum is gathering lists of sites to scan. I'm probably like a lot of you; if it's not mentioned on WebmasterWorld then it probably isn't newsworthy, right? ;) I've been alerted to more major news events via Foo than any other source in recent years...I guess I'll have to keep plugging away on my list then. There don't seem to be many shortcuts to building a quality resource...but there rarely are...