Forum Moderators: Robert Charlton & goodroi


Does a Search Engine Over-Optimization Penalty Exist?

I'm sure you're wondering whether an Over-Optimization Penalty (OOP) exists or not.

         

freaky

11:55 am on Nov 26, 2005 (gmt 0)

10+ Year Member



Over-Optimization penalty because of too much anchor text.

<edit> Suppose you change the navigation on your site so that all the home page links say "keyword keyword home" instead of "home". Will you incur an over optimization penalty?

My site appears to be suffering from this penalty. What are the views of the SEO experts here at WW? </edit>

[edited by: lawman at 3:00 pm (utc) on Dec. 6, 2005]
[edit reason] Edited To Conform To TOS 10 [webmasterworld.com] [/edit]
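The scenario in the opening post, every internal link switching from "home" to "keyword keyword home", is something you can sanity-check yourself. Here's a rough Python sketch that measures how uniform a page's anchor text is; the parser and the "share" measure are my own invention for illustration, not anything a search engine has documented:

```python
# Toy check for repetitive internal anchor text, as described in the post.
# Any threshold you'd compare the result against is a guess, not a known limit.
from collections import Counter
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    """Collect the visible text of every <a> element."""
    def __init__(self):
        super().__init__()
        self._in_a = False
        self._buf = []
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_a = True
            self._buf = []

    def handle_data(self, data):
        if self._in_a:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_a:
            self._in_a = False
            text = "".join(self._buf).strip().lower()
            if text:
                self.anchors.append(text)

def anchor_text_share(html):
    """Fraction of links carried by the single most common anchor text."""
    p = AnchorCollector()
    p.feed(html)
    if not p.anchors:
        return 0.0
    most_common = Counter(p.anchors).most_common(1)[0][1]
    return most_common / len(p.anchors)

nav = '<a href="/">widget home</a><a href="/a">widget home</a><a href="/b">about</a>'
print(anchor_text_share(nav))  # 2 of 3 links share one anchor text, ~0.67
```

A share near 1.0 means nearly every link on the page repeats the same anchor text, which is the pattern the post suspects of tripping something.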

2by4

12:19 am on Dec 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



"Then there are people who know for sure, but they have their own agendas and make posts to the contrary to keep the masses dazed and confused."

I'm glad you mentioned this point jane d, this part of the seo agenda is something a lot of more casual posters and visitors might not be aware of.

During the last updates I watched those people with interest. They are pretty obvious once you realize this other agenda exists, since there can be no rational foundation for the types of claims and red herrings they raise. I like to give people the benefit of the doubt, but it's probably not a good idea in general to do that when it comes to shady activities like seo.

Objectively they are easy to spot though, since they say things that clearly do not accord with observed reality, but which do fit various agendas, from both an SEO's and a search engine's perspective.

caveman

12:32 am on Dec 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've used the sandbox term before and the OOP term before, especially the latter...for the sake of shorthand communication. People wanna call it OOP, that's fine.

Lame, but fine. ;-) Oh, lighten up. Just kidding.

Either way, it's not semantics and that's why I referenced that old debate here.

Over Optimization Penalty.

steveb might take issue with the notion of "over optimization" since "optimization" is a GOOD thing. The phrase doesn't make sense (too much of a good thing?). Optimization implies not going too far. But hey, at least we all know what we mean when we refer to OOP, generally.

My concern is that people understand the difference between the terms "Penalty" and "Filter."

There is a world of difference between a penalty and a filter. They work differently, they act differently, they produce different effects (or can, anyway), and the difference between the two should not be overlooked or underappreciated by anyone. Too many backlinks with the same anchor text may trip a filter. But we think of that as OOP!

What is being referred to as OOP, IMO, is mainly a set of filters. That is quite different from being penalized. The causes are generally different and so are the remedies.

What we refer to as OOP: It's more like the Over Aggressive Filter...or, OAF, for short. ;-)
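The penalty/filter distinction caveman draws can be sketched as a toy model. To be clear, everything here (the field names, the thresholds, the durations) is invented purely for illustration, not Google's actual mechanics: a filter suppresses a page only while the trigger condition is present, while a penalty persists for some period even after the cause is fixed.

```python
# Toy model of the filter-vs-penalty distinction from the post above.
# All numbers and names are invented for illustration.
from dataclasses import dataclass

@dataclass
class Page:
    score: float              # base relevance score
    anchor_repetition: float  # 0.0-1.0, share of identical backlink anchors
    penalty_until: int = 0    # toy "day" until which a penalty applies

FILTER_THRESHOLD = 0.8   # hypothetical trip point
PENALTY_DURATION = 90    # toy days

def effective_score(page, today):
    # Penalty: persists until it expires, even if the cause was fixed.
    if today < page.penalty_until:
        return 0.0
    # Filter: applies only while the trigger condition is present.
    if page.anchor_repetition > FILTER_THRESHOLD:
        return 0.0
    return page.score

p = Page(score=1.0, anchor_repetition=0.9)
print(effective_score(p, today=0))   # filtered: 0.0
p.anchor_repetition = 0.5            # "de-optimize" -> the filter lifts at once
print(effective_score(p, today=0))   # 1.0
p.penalty_until = PENALTY_DURATION   # a penalty, by contrast, sticks around
print(effective_score(p, today=0))   # 0.0 despite the fix
```

The remedies differ accordingly: removing the trigger clears a filter immediately in this model, while a penalty only clears when its clock runs out.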

2by4

12:47 am on Dec 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



"But hey, at least we all know what we mean when we refer to OOP, generally."

It would be nice if this were the case, I have to admit, since my reaction whenever I see one of these types of threads is to say to myself: look, I know exactly what the poster is referring to, it's totally clear.

However, something I saw on the site I tested did in fact look and act EXACTLY like a penalty, not a filter, even down to the amount of time it took for the page to recover once I de-optimized it. Like I first said, I don't know if this particular penalty is still in effect (this was last year), so I won't claim it is. But on other sites I reviewed this year, I saw that either a penalty or a filter was definitely in place; I can't say which, because you have to track the affected item over time to determine that. In my case, we're talking about PR dropping to zero, the page out of the SERPs, all that: a penalty, not a filter. But this could have changed, I won't say this for sure.

The analogy given is totally misleading, however. When steve says you approach 100%, or then go back to 93%, that's not at all accurate, and is, to put it mildly, very misleading... well, it's completely wrong, but I'll drop it for the sake of good manners. The more accurate picture is that 100% is the goal, but if you go beyond it, the house of cards simply collapses. Like the games we've all played at some point: there are lots of versions, but they all come down to the same thing: when you go too far, you lose.

Of course even hitting 100% is very risky, since it means that you are right at the limit if you're talking only about onpage/offpage optimizations. The smallest turn of the dials could dump the page, we saw that a lot in jagger, the dials were spinning fast, still are. So I'm more comfortable with maybe 90%, it gives me some leeway, I don't have to make as many moves to keep ranking, and updates aren't really all that stressful for me anymore. They are for clients of course, since they tend to push much closer to 100%, and the glass spills over, the marbles fall, etc.

Obviously this part of the algo could move from being a filter to being a penalty, like with the sandbox stuff: if you trip it, your site isn't filtered, it's penalized; if you haven't tripped it, you don't believe that can happen. But this particular one is really easy to test, try it yourselves and see: get really carried away. I gave some examples of how to trigger it, and I'm sure there are others.

By the way, some might want to say, but things like invisible text don't trip it, or whatever, but that's not the point, the point is finding out what does trip it, then avoiding it. My experience was that it was very easy to trip if you got slightly clever about what you did, clever in a bad sense that is.

"optimization is a good thing"

this is the problem, when you define something so that it makes your point work for you, you aren't proving anything, you're just defining something. Optimization isn't good or bad, it's just something you do to try to get the page ranking better. Go too far, do too much, and you may be sorry. SEO isn't about good or bad, it's just about getting your page ranking or not. Thus the term, seo. You optimize for search engines - with the hope that this optimization will trip a positive filter. Sometimes it does, sometimes it trips a bad one.

We focus so much on negative filters that we forget that your page hits the top 10 because you tripped all the positive ones. But we tend not to talk about that part of it; instead we just say, oh, my page is ranking now. But your page is being rewarded for doing things the algo expects to see. When your page does things the algos don't like, you get punished. Call it a filter, call it a penalty; the only difference is how much time passes before you can get rewarded again.

... just get trustrank, it's all I can say, it's worth the effort.... it's gotten to the point that I can pretty easily see who has it and who doesn't based on their comments and experiences in updates. When you have it the mountain is a lot less high, and when you fall you don't fall nearly as far.

Jane_Doe

1:20 am on Dec 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Objectively they are easy to spot though,

Like the ones where it would fit to say "thou doth protest too much"?

2by4

2:03 am on Dec 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



yes, exactly those. I'm naive, I'll admit it. It's a weakness of mine. I used to actually believe that all posters really were simply posting what they thought or believed. Obviously with this much money on the line it's pretty silly to think that.

Or 'thou doth guide attention away from things we don't want people to understand...'

caveman

2:24 am on Dec 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Just as an FYI, I wasn't trying to define the term "optimization" ...

"optimization": The procedure or procedures used to make a system or design as effective or functional as possible, especially the mathematical techniques involved.

Seems like a good thing to me. ;-)

"Under optimized" makes sense. "Over optimized" doesn't make sense. It's not possible for something to be over optimized. It's like saying "over perfect." (That's all I was trying to say there. Not a biggie.)

Frankly I don't mind the term except that, just like sandbox, it has a tendency to be misleading to those newer to the world of SEO.

My main concern is just that most of the issues associated with OOP are not penalties. Which, if you've tripped a filter that might be thought of as related to OOP, is good news. Good news, because tripping a filter is not that hard to rectify... if you can sort out which one you tripped.

However, there's also the issue that many of these filters are, I think, co-dependent...hence the notion of red flags. ;-)

[edited by: caveman at 2:38 am (utc) on Dec. 15, 2005]

2by4

2:37 am on Dec 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Oh, sorry caveman, I realized after edit time ended that that could be read as saying you were defining it. That's not correct; I mean steve is defining it that way.

As usual with steve, there is a kernel of truth in what he's saying, which is I guess why I keep reading him. But there's a layer of confusion around that kernel.

However, when you talk about optimizing a page for search engines, I'd have to disagree; to me it's not like that at all. It's more like a teeter-totter: you can move your center of gravity as close to the middle as you want and it won't tip to the other side, but the second your center of gravity crosses the line, the teeter-totter tips over. Filters versus penalties is simply a question of how long it stays tipped over if you try going back a few steps: maybe it gets locked down for a year, maybe it balances again.

Perfection is not a word I would use to describe optimizing a web page, I'd have to disagree with you completely on that, I see it as understanding more or less how algos see pages and sites, then constructing a site that will more or less satisfy that condition, while not triggering future conditions. There's nothing perfect or imperfect about that process, it's just doing some tricks when the time is right, and doing other tricks when the time is wrong. And seeing if the time is right or wrong, that's probably by far and away the biggest trick. I keep a term like perfection for art and similar things, where it actually works in the way you describe, getting closer to perfection or further from it.

It's useful to think of it as a continuum from very good to very bad, not as approaching perfection or moving away from perfection. My suspicion, since you put it this way, is that steveb is seeing only half of the picture.

Of course, red flags, penalties, filters, talk about angels dancing on pinheads... can't see much difference from where I stand, a page ranks or it doesn't, a site ranks for new pages and content or it doesn't.

I guess it's not quite a continuum, but a scale starting with the page being totally worthless (zero value, but not negative) and getting closer and closer to number 1; but as soon as you go too far, you're in a new area, and it's all bad. There are levels of bad too: very bad equals a penalty, a little bad means a filter. I really like jagger, by the way, did I mention that?

[edited by: 2by4 at 2:47 am (utc) on Dec. 15, 2005]

caveman

2:45 am on Dec 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hey, no worries, and I don't want to speak for steveb since more often than not he's smarter than me anyway. It's just that I understand those who have issues with the idea of a penalty for "over optimizing." In all likelihood, it's tripping filters for "going too far."

BTW, anyone read: Building the Perfect Page [webmasterworld.com]?

:o

It's a great series, and also gets at what Brett has been preaching, well, forever. Essentially: Get the basics right and keep working hard at it. Combine that with BT's 26 Steps and you shouldn't have to worry much about the OAF. ;-)

(Basics: It's THE big secret of SEO. Pushing the envelope is kinda dangerous, and so often, not even necessary.)

2by4

2:51 am on Dec 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



"Pushing the envelope is kinda dangerous, and so often, not even necessary"

Yes, exactly, that's what I'm finding. As I pull back from the high 90s to the low 90s (keeping with the <100% model), my sites rank better. The sites that were in the 60s and were brought up to the 80s rank better, and for terms they didn't rank for before; I take it as a reward for being good.

The sites that were over 100 and started tripping filters all dropped. That's good enough for me, I'm happy working in this model, I don't need to see exceptions that prove the rule, there are lots.

I don't think it's coincidental that as I've pulled back on optimizations of various types, my sites are ranking better and better. Plus, when I bring up parts that weren't optimized well at all, they also rise, so it's pretty obvious to me that it's just a balance: keep it conservative if you want it to last. Florida and jagger, that's all I needed to finally learn this lesson. It's enough; I'm advising all clients to stop playing games and start getting serious, if that's something they are capable of doing. Long term success takes a lot of work, just look at brett.

It's especially funny that brett told everyone how to do it, but people still get impatient, and then they get hooked on the rush of how well tricks can work, until the next big update, when tricks are much more transparent. I'd recommend that anyone whose sites dipped massively in jagger and then recovered take a very close look at their long term strategies, then take a look at how to create quality sites and pages, and make sure they're not skipping any steps.

steveb

3:58 am on Dec 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Jeez 2x4 why do you deliberately try to confuse people? There is no "greater than 100". It's impossible, like "we gave 110% today". You seem to deliberately either refuse to understand a simple concept, or enjoy trying to confuse people. I don't know which, but you should try to read what caveman said. There is no Spinal Tap "11".

A 100 would be to simply have Google liking your page as much as possible, and ranking it as highly as they would ever rank it. That's optimized. Optimization is not some total of a pile of tactical trickery.

You seem to live in a world where everything is filters and penalties and other stuff that many other webmasters don't face, because they are just optimizing web pages, not gaming search engines.

BeeDeeDubbleU

8:12 am on Dec 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Jeez 2x4 why do you deliberately try to confuse people? There is no "greater than 100". It's impossible, like "we gave 110% today".

Perhaps I am missing something, but I can't find the quote "greater than 100" on this page. steveb, you appear to be off on a tangent. You may be the one who is confusing people by "quoting" imaginary quotes.

You seem to live in a world where everything is filters and penalties and other stuff that many other webmasters don't face because they are just optimizing web pages, not gaming search engines.

Sorry? Can you run that past me again? Now I am really confused ;)

steveb

9:10 am on Dec 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"but I can't find the quote 'greater than 100'"

"The sites that were over 100"

Um, yeah you do sound confused. Maybe the thread is overoptimized.

Patrick Taylor

9:32 am on Dec 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



... optimizing web pages, not gaming search engines

I think it's important to understand the difference between the two, otherwise this thread just stays a tennis match.

Pico_Train

10:18 am on Dec 15, 2005 (gmt 0)

10+ Year Member



Well said, two different ball games, excuse the pun.

white-hat, whatever that really means

and

black-hat, whatever that really means.

joco

4:31 pm on Dec 28, 2005 (gmt 0)

10+ Year Member



I think I still want to talk about this topic but not whether seo or oop is good or evil or possible or whatever. I think I have an actual example that might clarify the real issue at stake.

I have been asked to answer whether this lady's site has been "over-keyworded" to explain why a seemingly ok page is not working as well as its brothers.

This product page has its "widgets x" at the front of the title, h1, keywords, desc, and copy, and all over the two navigation ul's . . . but it's sucking bad.

The way this page differs from the other high-ranking pages on the site is that it has fewer non-"widgets x" items in the side navigation; in other words, fewer non-matching items in the page-specific, server-generated second navigation. That is, the other pages repeat their keyphrases a zillion times because zillions of like products are listed in their navigation.

But on THIS page, all the items match up and it looks like spam. But it's not spam. No hidden text or anything, no repeats in the "meta keywords." It's just a list of products that happens to include repeated keywords. And the page is indexed, no problem there.

It seems to me that the problem is either that SEs don't like the repetition or the high density . . . or that they're actually tracking user click-throughs and don't see high interest in this page for "widgets x." I have no reason to believe that users don't like the page or the listing; it's just that I can't think of anything else.

So, OOP or bad architecture or whatever you want to call it?
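For the "over-keyworded" question, a quick density count can at least quantify the repetition being described. Here's a rough Python sketch; the function name and the sample text are invented, and there is no known density cutoff to compare against:

```python
# Quick keyword-density check for a page like the one described above.
# The phrase, sample text, and any "too high" judgment are illustrative only.
import re

def keyword_density(text, phrase):
    """Words belonging to occurrences of `phrase`, as a fraction of all words."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    if not words:
        return 0.0
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits * n / len(words)

page_text = "widgets x blue widgets x red widgets x widgets x catalog"
print(round(keyword_density(page_text, "widgets x"), 2))  # 0.73
```

A page whose navigation lists are dominated by one keyphrase, as in joco's example, will score far higher here than its sibling pages with more varied product listings, which is exactly the asymmetry the post is asking about.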
