diberry - 4:43 pm on Jul 12, 2012 (gmt 0)
I think I know why I was hit by Penguin: something just confirmed a theory I've been bouncing around in my head, thinking "Nah, it can't be that." But I think it is. From this thread:
Tallon is talking about competitors that have gotten hit by Penguin, and says:
What they all have in common: Maybe less than 20% of their entire site content could be considered truly fresh or truly unique. It's mainly a regurgitation of what's already on the web.
Now, I'm not that bad, LOL. When I'm writing about a popular subject, I always try to add something unique to it. But looking at some of my pages, I can see Google thinking they're not unique enough. And if too many of my pages were "not unique enough," it would follow that I'd get hit by Penguin (spam) rather than Panda (thin content).
I've been wondering for a while whether the point of Panda and Penguin had more to do with unique content than with thin or even spammy content. I'm still not sure about eHow, though. Does it rank because, even though it copies others, way more sites copy it? Maybe.
Since Penguin, I've deleted a lot of these types of pages, but I'm not sure where the uniqueness threshold is set. Rather than delete more, I think I'll have to start creating more 100% unique content, and that's difficult in this niche because so much has already been written on it. Basically, the direction I've been taking since last year was right: looking at user metrics and trying to improve pages according to visitor response. There's just a lot of work yet to be done, and figuring out WHY visitors dislike a page is tricky; I don't always get it right.