austtr - 5:40 am on Aug 28, 2013 (gmt 0)
I was just looking at page code for a few of the top ranked sites in my niche for possible signs of things they might be doing better than me... any lessons I might take from them.
Was that ever a reality check. The world's consistently top-ranking site in this niche is overflowing with some of the most systematic, deliberate, built-in spam practices I have ever seen. Keyword-stuffed title tags and descriptions, duplicated alt and title attributes stuffed with keywords... keyword-rich (to put it mildly) page content... all the hallmarks that we are constantly told will trigger an OOP (over-optimization penalty).
It's like this site is holding up its middle finger to the world and proclaiming it is untouchable. If the algos are as good as we keep hearing, there is no way this site could avoid serious scrutiny for spam practices.
I can hear the replies about to be written saying "ah yes, Google knows all about those techniques and they don't count for anything.... the site must be ranking due to other factors."
Not the point..... the question is how sites displaying such disdain for the guidelines that we all get judged by can go unscathed. How can such obvious intent to game the SERPs avoid an OOP? And this is not a new site; it's been around for many years.
Which raises a few other questions.
How can sites that deliberately choose to spam, no matter how dated or silly the techniques, dominate SERPs in this day and age of super algos?
Are OOPs applied selectively? Is there a whitelist of "authority" sites to whom the guidelines don't apply?
Is there still such a thing as an OOP?