News aggregators may be the best new tools to appear on the Web since the browser, but as the programs and the underlying RSS standard grow more popular, some question whether the Internet will be able to handle the traffic.
As RSS feeds proliferate (almost every major American newspaper site and blog has one) and as more and more Internet users adopt readers, websites and bloggers are starting to feel the weight of thousands of hourly pings.
That's one of the pluses of using an online aggregator like the one on MyYahoo. Yahoo's server fetches each feed at a frequency matched to that feed's update history and then caches the result, so it doesn't hit the publisher's web server once per impression. The end user gets a single page of HTML with the RSS content already woven in.
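A minimal sketch of that "frequency that agrees with the update history" idea: poll each feed roughly in proportion to how often it has actually changed. The function name and the 15-minute/24-hour bounds are illustrative assumptions, not anything MyYahoo has documented.

```python
def next_poll_interval(update_times, min_s=900, max_s=86400):
    """Estimate how often to poll a feed from its recent update history.

    update_times: timestamps (in seconds) of the feed's last few updates.
    Polls at roughly half the average gap between updates, clamped
    between 15 minutes and 24 hours (both bounds are illustrative).
    """
    if len(update_times) < 2:
        return min_s  # no history yet; start with frequent polls
    gaps = [b - a for a, b in zip(update_times, update_times[1:])]
    avg_gap = sum(gaps) / len(gaps)
    return max(min_s, min(max_s, avg_gap / 2))
```

A feed that updates hourly gets polled every half hour; a feed that updates monthly gets polled once a day, which is the bulk of the savings.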
If desktop aggregators were written to implement some form of caching, that too would help reduce the amount of traffic.
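The standard way for a desktop reader to do that is HTTP conditional GET: remember the `ETag` and `Last-Modified` headers from the previous poll and send them back, so the server can answer `304 Not Modified` with no body when the feed hasn't changed. A sketch, assuming the feed URL and helper names (nothing here is from a particular reader's codebase):

```python
import urllib.error
import urllib.request


def build_conditional_request(url, etag=None, last_modified=None):
    """Build a request carrying the cache validators from the last poll."""
    req = urllib.request.Request(url)
    if etag:
        req.add_header("If-None-Match", etag)
    if last_modified:
        req.add_header("If-Modified-Since", last_modified)
    return req


def fetch_feed(url, etag=None, last_modified=None):
    """Return (body, etag, last_modified); body is None if unchanged."""
    req = build_conditional_request(url, etag, last_modified)
    try:
        with urllib.request.urlopen(req) as resp:
            return (resp.read(),
                    resp.headers.get("ETag"),
                    resp.headers.get("Last-Modified"))
    except urllib.error.HTTPError as e:
        if e.code == 304:  # Not Modified: reuse the cached copy
            return None, etag, last_modified
        raise
```

A 304 response costs the publisher a few hundred bytes instead of the whole feed, so even hourly polling from thousands of readers becomes tolerable.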
Most current RSS readers are freeware programs developed by a single person. So far there just aren't any professional, commercial, widely popular readers out there. As soon as the *popular* demand exists, standards-compliant readers will follow.
Despite that, Winer had to track down a rogue reader that was checking his website a hundred times a minute.
Block it. Send back a page saying "the program you are using is c**p. Use one of these instead." The market will soon adapt!
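Blocking doesn't have to be manual, either. A publisher could keep a per-client hit count and serve the "your program is c**p" page automatically once a reader exceeds a sane polling rate. A toy sketch (class name and thresholds are made up for illustration):

```python
import time
from collections import defaultdict, deque


class PollRateLimiter:
    """Track polls per client (keyed by IP or User-Agent) and flag abusers.

    A client that polls more than `limit` times in `window` seconds gets
    a rejection page instead of the feed. The defaults are illustrative;
    a rogue reader hitting a site a hundred times a minute would trip
    any reasonable setting.
    """

    def __init__(self, limit=4, window=3600):
        self.limit = limit
        self.window = window
        self.hits = defaultdict(deque)

    def allow(self, client, now=None):
        now = time.time() if now is None else now
        q = self.hits[client]
        while q and now - q[0] > self.window:
            q.popleft()  # discard hits that fell out of the window
        q.append(now)
        return len(q) <= self.limit
```

Well-behaved readers never notice; the rogue one gets the complaint page on its fifth hit of the hour.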