
Does Google Penalize Public Domain Content?

     
10:39 pm on Sep 2, 2014 (gmt 0)

New User

5+ Year Member

joined:Sept 2, 2014
posts: 2
votes: 0


I have a couple of articles on my website that are from a popular blogger whose content is in the public domain.

These articles fit perfectly into my niche and offer a perspective that I could not offer myself.

Will Google penalize me for having this content on my website? I ask because my website got hit hard by Penguin 2.0 and still hasn't recovered.
11:40 pm on Sept 2, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 11, 2007
posts:774
votes: 3


Penguin = Low Quality links... Such content would not be the cause of a Penguin filter.
12:38 am on Sept 3, 2014 (gmt 0)

Full Member

10+ Year Member

joined:June 4, 2008
posts: 202
votes: 0


I'm curious... why republish the content? Why not write your own article about this blogger and/or his/her views and link to the relevant content at the original source within that article? I just personally think that would send a better signal to both search engines and users. Otherwise, it might appear that you are trying to rank for someone else's work (even if it is in the public domain as you say).

You could always put noindex tags on that content. Or, maybe a canonical tag pointing to the original? (I have no experience with syndicated content so hopefully someone else can chime in on that one).
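For reference, those two options would look roughly like this in the page's <head> (the canonical URL below is just a placeholder, not anything from your site):

  <!-- option 1: keep the republished page out of Google's index entirely -->
  <meta name="robots" content="noindex, follow">

  <!-- option 2 (instead of the noindex): leave the page indexed but point Google at the original source -->
  <link rel="canonical" href="http://example.com/original-post">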

And as ZydoSEO said, those articles wouldn't be a Penguin issue. Panda is the algorithm that deals with duplicate content. So if you are using a lot of content that's found elsewhere, you could also be suffering from Panda.
3:59 pm on Sept 3, 2014 (gmt 0)

New User

5+ Year Member

joined:Sept 2, 2014
posts: 2
votes: 0


Thanks for both of your replies.

My site did get hit by Panda, although not half as hard as Penguin.

I should mention that I only have 2 articles that are duplicate content out of around 150 completely unique articles. Both of those articles contain links back to the original source, are slightly modified, and end with my own personal commentary, so they're not entirely regurgitated.

I am linking to one of those articles from my most viewed article though, so maybe that magnifies the effect.

I kept them up because my website visitors seem to like them (low bounce rates and long duration of visit). They did actually rank alongside the original author for the keyword a few years back.

The author in question actively encourages his visitors to share his content, so if there is a penalty, it'll probably be magnified over time as more republished versions appear on the web.

Either way, if there's any chance of them negatively impacting SEO I'll either take them down or re-write them.

For the time being, I guess I could hide the articles from Google using robots.txt?
4:26 pm on Sept 3, 2014 (gmt 0)

Administrator from US 

WebmasterWorld Administrator (not2easy) - Top Contributor of All Time, 10+ Year Member, Top Contributor of the Month

joined:Dec 27, 2006
posts:4464
votes: 332


Hiding with robots.txt is not a good idea, because Google will still index the pages but with a description along the lines of "robots.txt on this site is blocking an accurate description of this page." If you want these pages out of the index, add a noindex tag. If you want those pages to appear in the results without a weird description, don't block them in robots.txt.
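To make the difference concrete (the path below is made up, just for illustration), a robots.txt rule like this only stops Googlebot from crawling the page; it does not stop the URL from being indexed, and it also means Googlebot can never see a noindex tag you put on that page:

  # robots.txt - blocks crawling only; the URL can still appear in results
  # with a placeholder description, and any noindex tag on the page goes unseen
  User-agent: *
  Disallow: /syndicated-article/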
8:43 pm on Sept 3, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 11, 2007
posts:774
votes: 3


It's very unlikely that 2 duplicative articles out of 150 articles would trigger a Panda filter.

Panda not only targets sites with a lot of pages containing duplicate content, but it also goes after sites with lots of pages containing thin or no content. Perhaps you're looking in the wrong place.

Do you have other pages on your site that have thin content? Or pages like product listing pages that list tons of products (possibly paginated) with short descriptions that might appear on other pages (such as when the same product teaser appears on multiple product category/sub-category listing pages)?

And if you subsequently got hammered by Penguin then you need to be looking at your backlink profile.

I agree w/ not2easy that robots.txt is not your solution. It rarely is. A <meta name="robots" content="noindex"> tag seems more appropriate if your goal is to get those pages deindexed.
 
