I have a large web-site (35k pages of raw scientific data). I was wondering whether creating a very large RSS feed and submitting it to RSS aggregators would help Google crawl it better.
Would be glad to hear what Bill thinks about this.
In RSS 0.91, various elements are restricted to 500 or 100 characters, and there can be no more than 15 <item>s in a 0.91 <channel>. RSS 0.92 and later impose no string-length or XML-level limits, although processors may impose their own, and generators may cap the number of <item>s in a channel or the length of strings.
In other words, there used to be a limit of 15 items, but the current specification imposes none.
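For illustration, here is a minimal sketch of a feed built to the old 0.91 limits; the site name and URLs are hypothetical placeholders:

    <?xml version="1.0"?>
    <rss version="0.91">
      <channel>
        <!-- channel title was capped at 100 characters in 0.91 -->
        <title>Example Scientific Data Site</title>
        <link>http://www.example.com/</link>
        <!-- channel description was capped at 500 characters -->
        <description>Recently updated data pages</description>
        <language>en-us</language>
        <item>
          <title>Dataset 1 updated</title>
          <link>http://www.example.com/data/1.html</link>
        </item>
        <!-- ...no more than 15 <item>s total in a 0.91 channel... -->
      </channel>
    </rss>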
35K pages? That would be overkill for general usability of a normal feed. If you were trying to get other sites to copy all your data and re-use it on their sites, then it might be a way to syndicate, but I fear it would be too large for most aggregators or users to handle.
An RSS file is primarily used to tell subscribers when there is new content on your site. You would generally set the number of <item>s in your feed to reasonably accommodate the number of pages that change on a regular basis over a given period.
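For example, in RSS 2.0 each <item> can carry a <pubDate>, which is how subscribers learn when a page changed; this entry is a hypothetical sketch:

    <item>
      <title>Dataset 204 revised</title>
      <link>http://www.example.com/data/204.html</link>
      <!-- pubDate uses the RFC 822 date format -->
      <pubDate>Wed, 25 Mar 2009 09:00:00 GMT</pubDate>
    </item>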
If you want the search engines to know about all of your pages, a different type of XML file is the better fit: a sitemap.
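A minimal sitemap sketch following the sitemaps.org protocol (hypothetical URLs); a single sitemap file can hold up to 50,000 URLs, so all 35k pages would fit in one file:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/data/page1.html</loc>
        <!-- lastmod tells the engines when the page last changed -->
        <lastmod>2009-03-01</lastmod>
      </url>
      <!-- ...one <url> entry per page... -->
    </urlset>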