Syndicating PR from a Panda-penalized site

     
2:52 pm on June 8, 2014 (gmt 0)

New User

5+ Year Member

joined:Aug 21, 2013
posts:10
votes: 0


Hi all,

As some of you know, Search Engine Land recently reported that Panda is going after press release sites such as Business Wire.

One of my clients syndicates news about most stock-exchange-listed companies from Business Wire, in the same way that thousands of other sites do. This of course doesn't feel too good, so I'm thinking about what should be done next.

Besides the obvious option of removing all of the articles, is there anything else that can be done?

The client is looking for a solution that will let him keep the content (users do read it) while staying on the safe side with regard to duplicate content from a site that was hit.

Does anyone have suggestions or ideas on how this can be approached?
4:11 pm on June 8, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:13012
votes: 222


In my opinion this is not so much an SEO issue as a business model issue. You can get away with some duplicate content as long as you also have a significantly higher percentage of unique added value, OR if you offer up your duplicate content in a better way (better user experience) than the original source.

If you can't do either of those, then you probably need to look at the business model again.
5:22 am on June 9, 2014 (gmt 0)

New User

5+ Year Member

joined:Aug 21, 2013
posts:10
votes: 0


To be honest, I don't agree. I'm not talking about the business model, as that part is quite clear: the client is looking for a way to keep publishing the content. The question is, do you think that publishing content syndicated from a site that was already officially hit by Panda might be a real problem?

And if yes, is there anything technical we can do besides removing it completely?

Thanks.
5:55 am on June 9, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 28, 2005
posts:3072
votes: 27


I'm with netmeg. If you can't do that, then what differentiates you from the thousands of other sites out there with the same or similar content? The originality of the content is not necessarily the greatest issue in itself.
6:53 am on June 9, 2014 (gmt 0)

Full Member

10+ Year Member

joined:Oct 11, 2003
posts:255
votes: 0


What really is the strategy of hosting a website carrying thousands of recycled press releases?

If it is an altruistic endeavor, host it and pay for it as a public service and do not sweat the search engines rightly ignoring it. You will be rewarded in heaven.

If you wish to make money off business information, go out and cover business events, conventions and expos, tour the factories, etc., and produce original articles illustrated with original images, videos, interviews, and so on.

The days of every Tom, Dick and Harry throwing up a website and pasting material they did not create and expecting to make a buck are (hopefully) over.

I am sick and tired of the Wikipedias, wikiHows, eHows, wikitravels and websites of the same breed stealing and rewriting original material and ranking so high in the SERPs. It sets a bad example, which the OP or his client cannot be faulted for trying to emulate.

There is nothing mysterious about why traditional media is dying. Having all the equipment and degreed staff to cover anything from a celebrity cocktail party to a major disaster is for naught when a blogger can scrape the story essentials, rank #1 for it, then point to the source as an afterthought.

A classic example is the Drudge Report and hundreds of fellow bedmates.

[edited by: mromero at 7:26 am (utc) on Jun 9, 2014]

7:15 am on June 9, 2014 (gmt 0)

New User

5+ Year Member

joined:Aug 21, 2013
posts:10
votes: 0


Actually, you reminded me that I missed an important point.

This client doesn't get ANY organic traffic to the Business Wire news, and he actually doesn't even care whether those pages are indexed or not.
In fact, the client himself asked whether adding a NoIndex to all of these articles would solve our dilemma.
But because we are talking about thousands of articles per month, I'm not so happy with the NoIndex solution in this case. Besides, NoIndex only means the pages won't be in Google's index; it doesn't mean Google won't see that this client syndicates content from a penalized source.
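
(For clarity, by NoIndex I mean something along the lines of a robots meta tag on each article page, e.g.

    <meta name="robots" content="noindex, follow">

or the equivalent X-Robots-Tag: noindex HTTP header sent for that folder, so the pages stay visible to users but are kept out of Google's index.)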

Hope this helps
7:27 am on June 9, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:June 19, 2008
posts:1337
votes: 121


Mike, if the pages are in a separate folder, put that folder into the robots.txt file and disallow it for Google. Google will still index the URL itself, without any further information, and it will still show it in the SERPs in the case that the page is highly relevant to the query.
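
For example, assuming the articles live under a folder like /pressreleases/ (just a placeholder, use the actual path), the robots.txt entry would look something like:

    User-agent: *
    Disallow: /pressreleases/

Keep in mind this blocks crawling, not indexing, as described above.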
10:54 am on June 9, 2014 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
posts:2630
votes: 191


"it will still show it in the SERPs in the case that the page is highly relevant to the query"

Only if there are links to that page from which Google can conclude the page is relevant. If the page is disallowed by robots.txt, Google does not know what is on the page, so it cannot tell whether the page is relevant unless it can conclude this from the links pointing to it.
12:22 pm on June 9, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:13012
votes: 222


If the content is NOINDEXed (no matter how much there is), it probably won't matter.
5:09 am on June 10, 2014 (gmt 0)

New User

5+ Year Member

joined:Aug 21, 2013
posts:10
votes: 0


All the articles are in a separate folder, so do others also think that NoIndex is a safe solution in this case?

What about the fact that thousands of pages (and growing) are going to be under NoIndex? I'm thinking long term.
6:12 am on June 10, 2014 (gmt 0)

Administrator from US 

WebmasterWorld Administrator not2easy is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 27, 2006
posts:4504
votes: 347


You might well be better off noindexing the content that is the same as what everyone else has. If having those pages indexed causes or aggravates the current concerns, noindexing can't make it worse - assuming that the rest of your content is unique and useful.

According to the Google Webmaster Guidelines, fewer, higher-quality pages are preferable to folders of duplicate information pages that are available in many places. Your visitors will still have the content; you just aren't asking for it to be counted toward your site's valuable content or trying to use it to bring in traffic. Just my opinion.
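
If editing thousands of article templates is the worry, the noindex doesn't have to be added page by page; it can usually be applied at the server level for the whole folder. A rough sketch for Apache with mod_headers (the folder name is just a placeholder; other servers and most CMSs have equivalents):

    # .htaccess placed inside the press-release folder
    <IfModule mod_headers.c>
        Header set X-Robots-Tag "noindex, follow"
    </IfModule>

That way every new article dropped into that folder picks up the noindex automatically, and nothing changes for your visitors.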