Forum Moderators: Robert Charlton & goodroi
The other content blocks on the page -- where you have your own unique content -- can bring in search traffic. So I would suggest you offer RSS feeds only to serve your visitors and not be thinking of Google when you add a feed.
Obviously from your rank in here and number of posts you are an expert at this, so no disrespect, but I couldn't disagree more. The fresh content on the page that RSS delivers is immensely powerful when dealing with G-bot.
What makes you think it's "fresh"? The original issuer of the feed already has that text on their website, and so do other sites with this feed.
Can you say "duplicate content"? That's why your feed page needs more than just the feed - it needs lots of original content too.
So how about some details? Let us have something to sink our teeth into. "Immensely powerful" in what way? Is syndicated content bringing you search traffic on its own keywords? You mentioned googlebot - has adding feed content to your site improved crawling frequency or depth?
Is syndicated content bringing you search traffic on its own keywords?
If the page is 90% feed I would find this hard to believe...
It is my experience that feeds benefit the user more than the search engine. I have feeds on one of my sites. They do get good traffic in the logs, BUT from the users, not the SEs. Feeds can be a great plus for your visitors, but SEs? Hmmmm... the duplicate content thing kinda shoots that down.
I get brassed off when I click on a Google search result only to find a page full of RSS code. I get doubly annoyed when the content of the page has moved on from what Google has spidered and indexed, the text of the snippet being nowhere on the page at all. In that case I would rather that such pages never appear in the regular SERPs.
I agree with this 100%, and it has been my experience as well.
However, I created those pages for our visitors and not solely for the spiders. My hope was to offer informative articles and information to my visitors on a daily basis, and the side benefit of fresh content from a bot's point of view is just gravy.
On the other hand, outgoing RSS feeds that you create yourself, which highlight your products, news, information, etc. about your site, are very, very powerful in terms of both visitors and the spiders. Our outgoing XML feed is the second most viewed page on our site daily; only our home page comes above it. To me this means two things: the visitors are subscribing to our feed and using it to stay up with our products, sales, etc., and it is spreading our site all over the web as it gets spidered more and more every day. I know for a fact that it has created many thousands of links back to not just our home page but deeper-level product pages.
Of course this takes a fair amount of work, as it has to be updated at least weekly for this to happen - old content removed and new content added.
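For anyone curious what generating such an outgoing feed involves: here is a minimal sketch in Python's standard library (rather than whatever the poster actually uses). The store name, URLs, and product items are invented for illustration.

```python
# Minimal sketch: building an outgoing RSS 2.0 feed with Python's
# standard library. Store name, URLs, and items are invented examples.
import xml.etree.ElementTree as ET

def build_feed(site_title, site_link, items):
    """items: list of (title, link, description) tuples."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = site_title
    ET.SubElement(channel, "link").text = site_link
    ET.SubElement(channel, "description").text = "Latest products and news"
    for title, link, description in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = link
        ET.SubElement(item, "description").text = description
    # Serialize to a string you can write out as feed.xml
    return ET.tostring(rss, encoding="unicode")

feed_xml = build_feed(
    "Example Store",
    "https://www.example.com/",
    [("Widget on sale", "https://www.example.com/widget", "20% off this week")],
)
```

The weekly update the poster describes would then just be regenerating this file with the current item list.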
Another thing I think helps the visitors, at least, is the fact that we offer them an adware- and virus-free RSS feed reader right on the site for download if they wish. I know it is adware- and virus-free because I wrote it... I am hoping that for the visitors it will be sort of like a one-stop shop: get the feed and the reader all in the same place.
Again, the incoming RSS seems to help the visitors more than the spiders, whereas the outgoing seems to help with both visitors and spiders.
My 2 cents
MThiessen
What makes you think it's "fresh"? The original issuer of the feed already has that text on their website, and so do other sites with this feed.
Can you say "duplicate content"? That's why your feed page needs more than just the feed - it needs lots of original content too.
This is not duplicate content. Do you think the NYTimes gets punished for the millions of RSS feeds it is supporting, or vice versa - do I get punished for placing NYTimes RSS on a page? By fresh content we mean something relevant that wasn't on the page the last time G-bot swept it.
No way No how this is duplicate content!
do I get punished for placing NYtimes RSS on a page?
Ok, the data generated from this feed is IDENTICAL to the data on the NYTimes page, correct? They are their stories; they wrote them. So we KNOW it is a duplication.
By fresh content we mean something relevant that wasn't on the page the last time G-bot swept it.
And by DUPLICATE I mean it might be "fresh" on your site, but the data itself is by no means "fresh" - it has already been published before and is part of the NYTimes site.
They don't get nailed for it because they are the original author.
You take a hit because they published that data first; you are the duplicator, not them.
Make sense? RSS feeds shouldn't give you a dup content penalty if there is more on the page than just the feed. If a good percentage is original, then you might escape the dup content filter.
If this DOESN'T make sense, think about this: what would stop someone from loading down a bunch of sites with RSS feeds, slapping AdSense on them, and getting rich? No effort to write original content, no skill, just zip zip zip... Get my point?
There is no way, no how, ever, you can use the same data as someone else and not get hit with this filter.
[edited by: MThiessen at 10:35 pm (utc) on Dec. 4, 2006]
Yahoo News -Page Rank 9 - Not a single original Yahoo authored article on the page.
Most of the most popular sites in the world are filling themselves to some extent with other people's content.
The dup content filter is designed to control abuse by way of putting up thousands of pages that are exactly the same, not little bits of content that are also on other pages throughout the internet.
An RSS feed of relevant headlines is an incredible SEO tool that helps take a small amount of manual labor out of keeping "fresh" content (not original, fresh) on a page and keeping G-bot happy. We can always bring up extreme examples of things, and of course they will be "BAD". If you sat down and ate 5 gallons of Rocky Road you would die, but a little ice cream makes LogicMaze happy!
Back to the original poster's question - YES Google will recognize the content on your site and if you don't eat 5 gallons of RSS on your home page it will benefit both viewer and bot!
RSS feeds shouldn't give you a dup content penalty if there is more on the page than just the feed. If a good percentage is original, then you might escape the dup content filter.
There is no way, no how, ever, you can use the same data as someone else and not get hit with this filter.
Don't those 2 statements contradict each other?
If you create pages that offer something of unique value to a visitor, then you're going in the right direction. If you are trying to piggyback your traffic onto someone else's content creation, then you are at odds with Google's purposes. It may even work for you for a while, but it's not much of a business plan.
I basically have some questions for anyone who has had experience with RSS, and they bring up some issues pertinent to this ongoing discussion (albeit unfriendly at times).
What about managing incoming RSS feeds for a certain topic? If you have a site, for example, that is centered on the topic of the New York Stock Exchange - couldn't you take incoming RSS from various newspapers and news services, establish some back-end means (be it PHP/MySQL or whatever) of cherry-picking through the feeds for those articles pertinent to the NYSE, and only display those articles?
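The cherry-picking step itself is straightforward; here is one way it might be sketched in Python (rather than PHP), assuming RSS 2.0 sources. In practice you would fetch each feed's XML over HTTP; the tiny inline sample here stands in for a fetched feed, and its titles and links are invented.

```python
# Sketch: filtering incoming RSS 2.0 items for a topic keyword.
# SAMPLE_FEED is a made-up stand-in for XML fetched from a news source.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<rss version="2.0"><channel>
<title>Example Wire</title>
<item><title>NYSE closes higher</title>
<link>https://news.example.com/1</link>
<description>Stocks on the New York Stock Exchange rallied.</description></item>
<item><title>Local weather report</title>
<link>https://news.example.com/2</link>
<description>Rain expected tomorrow.</description></item>
</channel></rss>"""

def pick_items(feed_xml, keywords):
    """Return (title, link) pairs for items whose title or description
    mentions any of the given keywords (case-insensitive)."""
    matches = []
    for item in ET.fromstring(feed_xml).iter("item"):
        title = item.findtext("title", "")
        desc = item.findtext("description", "")
        text = (title + " " + desc).lower()
        if any(kw.lower() in text for kw in keywords):
            matches.append((title, item.findtext("link", "")))
    return matches

nyse_items = pick_items(SAMPLE_FEED, ["NYSE", "New York Stock Exchange"])
```

Only the matched items would then be rendered on the topic page, ideally alongside original commentary for the duplicate-content reasons discussed above.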
In addition, is there any reason you can't store those articles in your own database provided you include all source citation?
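On the storage side, the archive itself could be as simple as one table keyed on the article URL, with a column carrying the source attribution. A sketch with SQLite (table and column names invented; whether storing full third-party article text is permissible is a licensing question, not a technical one):

```python
# Sketch: archiving cherry-picked feed items with source attribution.
# Schema and sample row are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE articles (
        id INTEGER PRIMARY KEY,
        title TEXT NOT NULL,
        url TEXT NOT NULL UNIQUE,   -- canonical link back to the source
        source TEXT NOT NULL,       -- publisher name for the citation
        fetched_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")
# UNIQUE on url plus INSERT OR IGNORE keeps repeat fetches from
# duplicating rows when the same item appears in the feed again.
conn.execute(
    "INSERT OR IGNORE INTO articles (title, url, source) VALUES (?, ?, ?)",
    ("NYSE closes higher", "https://news.example.com/1", "Example Wire"),
)
conn.commit()
row = conn.execute("SELECT title, source FROM articles").fetchone()
```
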
What I'm getting at is this: can you use RSS to build a website that serves, more or less, as a customized news service in itself - an archive of news articles from many reputable sources, all relevant to a given topic? Wouldn't this be ranked very highly in Google?
This is a very new problem for me and I'm admittedly less-than-experienced with RSS feeds, so any comments would be greatly appreciated.