|Is OOP a myth? Or just not applied to authority sites?|
| 5:40 am on Aug 28, 2013 (gmt 0)|
I was just looking at page code for a few of the top ranked sites in my niche for possible signs of things they might be doing better than me... any lessons I might take from them.
Was that ever a reality check. The world's consistently top ranking site in this niche is overflowing with some of the most systematic, deliberate, built-in spam practices that I personally have ever seen. Keyword stuffed title tags and descriptions, duplicated alt and title attributes stuffed with keywords... keyword rich (to put it mildly) page content... all the hallmarks that we are constantly told will trigger an OOP.
It's like this site is holding up its middle finger to the world and proclaiming it is untouchable. If the algos are as good as we keep hearing, there is no way this site could avoid serious scrutiny for spam practices.
I can hear the replies about to be written saying "ah yes, Google knows all about those techniques and they don't count for anything.... the site must be ranking due to other factors."
Not the point... the question is how sites displaying such disdain for the guidelines that we all get judged by can go unscathed. How can such obvious intent to game the SERPs avoid an OOP? And this is not a new site; it's been around for many years.
Which raises a few other questions.
How can sites that deliberately choose to spam, no matter how dated or silly the techniques, dominate SERPs in this day and age of super algos?
Are OOPs applied selectively? Is there a whitelist of "authority" sites to whom the guidelines don't apply?
Is there still such a thing as OOP?
| 11:59 am on Aug 28, 2013 (gmt 0)|
I'm facing a similar problem with a competitor - the main content of their homepage is basically just a *big* list of model names and numbers of the widgets they supply, set in a 10px font (effectively keyword stuffing). Majestic reports a big spike (almost overnight) in their backlinks a few months ago, from approx. 100 referring domains to thousands, all from thin-content blogs, directories and even link exchange sites!
To make matters worse, their site uses a tables-based design, has *no* title or heading tags, has image-based navigation, takes 25s (yes, that's twenty-five seconds) to load its 2.5MB homepage, makes over 200 HTTP requests, has a PageSpeed score of 52 (not that that really means anything) and has over 450 validation errors.
And yet they consistently rank higher than any of the more deserving websites in our niche.
(ps they're not a "brand" or one of the big boys, just an independent ecom store)
| 12:37 pm on Aug 28, 2013 (gmt 0)|
Maybe OOP was misunderstood or oversimplified to begin with? From SEL [searchengineland.com]:
|“Well, what about all the people who are sort of optimizing really hard and doing a lot of SEO?”... And the idea is basically to try and level the playing ground a little bit. So all those people who have sort of been doing, for lack of a better word, “over optimization” or “overly” doing their SEO, compared to the people who are just making great content and trying to make a fantastic site, we want to sort of make that playing field a little bit more level.|
Is it possible the Over-SEO is simply ignored and a site stands or falls based on what's left?
Or do you think the OOP has been superseded?
Personally I think it's a better user experience to ignore the SEO and focus on the content. There's content ranking without the H1/Title Tag SEO recipe too you know. That has to be taken into consideration, big picture etc.
| 1:43 pm on Aug 28, 2013 (gmt 0)|
An old (April 23rd, 2009) YouTube video of Matt Cutts addressing the over-optimization penalty:
Just to add one thing to martinibuster's post and quote above, it seems, looking in hindsight, that Matt Cutts was referring to the then-upcoming Penguin algorithm (as the interview was in March of 2012, and Penguin rolled out in April of that year).
One part of that interview that seems particularly interesting to me is where Matt Cutts states:
|"...and then we also start to look at the people who sort of abuse it, whether they throw too many keywords on the page, or whether they exchange way too many links, or whatever they are doing to sort of go beyond what a normal person would expect in a particular area." |
So, since it seems highly likely that he was referring to Penguin, maybe there is more to Penguin than just bad backlinks? Maybe Penguin considers on-page factors as well?
| 3:23 pm on Aug 28, 2013 (gmt 0)|
The old -950 OOP has been a rare beast since Caffeine, when a more nuanced approach became possible.
The way I imagine the current scoring environment is this:
1) The main algo gives positive scores based on "normal" criteria.
2) The Panda module gives a negative score based on textual analysis
3) The Penguin module gives a negative score based on off-page (though possibly on-site) SEO.
The things that get picked up in 2+3 are unavoidably contributing to the positive score in the main algo. In order to negate this advantage, the relative importance of the negative scoring is turned way up. No need for OOP.
In order to account for why some sites appear unaffected, you have to do some funky things with the maths. Here are a couple of schemes that might work:
A,B,C = the "dial" on each score
x,y,z = unmodified score for main algo, Panda and Penguin
1) Ax + By + Cz = Doc Score
2) Same, but more extreme:
IF (x/y < n) THEN B = 0
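The two schemes above can be sketched in a few lines of Python. To be clear, this is only the thread's conceptual model — the function name, dial values, and threshold `n` are all invented for illustration, not anything Google actually does:

```python
def doc_score(x, y, z, A=1.0, B=1.0, C=1.0, n=2.0, extreme=False):
    """Conceptual document score per the post above (purely illustrative).

    x = main-algo score (positive), y = Panda score (negative),
    z = Penguin score (negative); A, B, C are the "dials" on each.
    """
    # Scheme 2 ("same but more extreme"): IF (x/y < n) THEN B = 0,
    # i.e. the Panda dial is switched off entirely past a threshold.
    if extreme and y != 0 and x / y < n:
        B = 0.0
    # Scheme 1: plain weighted sum, Ax + By + Cz = Doc Score
    return A * x + B * y + C * z
```

With all dials at 1.0, `doc_score(10, -3, -2)` gives 5.0; with `extreme=True` the Panda term is zeroed and the score jumps to 8.0 — which is one way a spammy page could end up unaffected by its negative signals.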
The above is over-simplified conceptual stuff, not definite analysis. Frankly, apart from its use as a conceptual tool, it's total rubbish. Enjoy.
| 8:00 pm on Aug 28, 2013 (gmt 0)|
Actually, I quite agree with your theory on this!
| 9:13 pm on Aug 28, 2013 (gmt 0)|
Wanted to chime in with a few quick thoughts.
In many cases where it appears there is overt spamming and we feel like the bigger companies are getting away with it, the question is whether those things are actually what is causing the rankings. If it is determined that their rankings are important for the search results, or that the keyword stuffing, spam links, etc. contribute negligibly to their rankings, then devaluing the benefits of those actions will not produce visible consequences.
Moral of the story... make sure you have enough diversity and naturally occurring links, rankings, etc., and other things may be overlooked. For off-site stuff, we have seen some of our large brand clients hammered with spam links from other companies trying to take them down. Since they already had rankings before this occurred, the spam links had no impact.