Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 75 message thread spans 3 pages.
Does a Search Engine Over-Optimization Penalty Exist?
I'm sure many of you have wondered whether an over-optimization penalty (OOP) exists.

 11:55 am on Nov 26, 2005 (gmt 0)

An over-optimization penalty because of too much keyword anchor text.

Suppose you change the navigation on your site so that all the home page links say "keyword keyword home" instead of "home". Will you incur an over-optimization penalty?

My site appears to be suffering from this penalty. What are the views of the SEO experts here at WW?




 2:19 pm on Nov 26, 2005 (gmt 0)

I believe that the OOP is now applied mainly to new sites. I use some of the techniques you mention on a couple of my older sites and they are still ranking quite high.


 2:26 pm on Nov 26, 2005 (gmt 0)

I don't believe in it. All my sites are over-optimized. Or are they just optimized?


 3:26 pm on Nov 26, 2005 (gmt 0)

I believe an OOP is in play, and here is why.

Having worked on many sites, this is how I see it. If you are good at SEO, a new site you build will obviously be built properly, using your SEO skills from the outset. With those skills and good backlinks you could rank highly for your keywords almost overnight on a new site. Google, however, is not going to let you, hence the sandbox: your new site could be held back for, say, nine months and then slowly rise to the top.

So a new site is held back. (Some disagree with the sandbox theory, but in my experience, certainly in the big-money keyword sectors, a new site is sandboxed.)

Next you might think: let's buy an old site that is already ranking in Google for your money keywords and apply some SEO to it, building new, fully optimised pages. This is where the OOP comes into play. The page does not rise to the top; it gets held back by an OOP.

The same applies, in my opinion, if you do SEO on one of your existing pages: it can be pushed back as a result. I've seen it many times, but mainly in the money keyword sectors.

To prove the point further, I took one page on a site I was working on at the time. The site was about widgets and the page was "green widgets", listed at about position 32 in Google for "green widgets" out of roughly 1.7 million results. I did loads of SEO on it: a "Green Widgets" title, near-perfect on-page keyword density, perfect links, and so on. SEO-wise the page was close to perfect, in my opinion. Within two or three days it tanked in Google to about position 250. Having done no further work on that page, nine months later it is rising again and has improved on its earlier position; it is now at about 14 or 15 for its search term.

So, to conclude, I would say that an OOP effectively puts the page back in the sandbox. In my opinion this whole issue has more to do with pushing AdWords sales than anything else; otherwise, rather than buying AdWords, you would simply SEO your page until it ranked number one for the keyword. As it stands, in the money keywords a new page is going to take a minimum of nine months, and more likely two years, before it ranks well, so you are likely to buy AdWords for that keyword while you wait.

So the OOP has to exist, in the same way the sandbox does, to stop webmasters buying old sites and doing SEO on them to try to avoid the sandbox.

That's my take on the issue, anyway.


 3:55 pm on Nov 26, 2005 (gmt 0)

What does it mean when you search for "word" in quotes and one of your pages comes up on top for the correct keyword, but searching for the same word without the quotes turns up nothing at all?

Is there any post-9/22 keyword OOP tied to this?


 8:01 am on Dec 6, 2005 (gmt 0)

Any more comments please?


 8:29 am on Dec 6, 2005 (gmt 0)

One big factor in this, I believe, is change.

I have regular problems with keyword density (KWD), not because of keyword stuffing but because of what I need to put on some of my pages in order to present them properly for the visitor.

You can see older pages rank well with a high KWD, yet a newer page, or one that has been altered, gets filtered out for the word or phrase.

I am convinced Google treats changed pages differently from original ones.


 9:41 am on Dec 6, 2005 (gmt 0)

This was the point I was making in message two of this thread. We all know the things that made our sites rank in the past. But so does Google!

EXAMPLE 1: I have a very high ranking site that uses keywords in menu links and it is quite heavily optimised in a white hat sort of way. It has been ever present for more than four years and during this period it has gained lots of inbound links, including some from authority sites in its niche. It also has lots of information and useful content. Google probably looks at it and thinks that on balance this site is not spam.

EXAMPLE 2: I have a site that I launched just over a year ago. This is a site that offers free service that could be very useful to many people. I used similar SEO techniques on this site and it carries Adsense. It has been firmly sandboxed since day one.

* It has a www.keyword1-keyword2.com type domain.
* It uses keywords in file names.
* It uses keywords in <H1>,<H2>,<H3>,etc.
* It uses keywords in <title>.
* It uses keywords in meta description, etc.
* It uses keywords in navigation links.
* It uses keywords in text content with several of these hyperlinked and bolded.
* It has inbound links that use the keywords.

Can you see a pattern developing? ;)

My point is that it must be the easiest thing in the world for Google to detect this pattern too. They probably conclude that there is a high chance the new site in question is spam and drop it into the quarantine box. After some time, and probably when some other factor comes into play (authority IBLs?), the site can be released. If I ran a search engine this is probably the way I would play it, so yes, there is probably an OOP involved in the sandbox filtering process.
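As a purely illustrative sketch of how trivially such a pattern could be detected, consider counting how many distinct page signals repeat the same phrase. Every field name and the threshold below are invented for the example; this is speculation in the spirit of the post, not a published Google algorithm.

```python
# Hypothetical "same phrase everywhere" detector. Field names, the signal
# list, and the threshold are all made up for illustration.

def _norm(text) -> str:
    """Lower-case and treat hyphens as spaces so 'blue-widgets' matches."""
    return str(text).lower().replace("-", " ")

def optimization_signal_count(page: dict, phrase: str) -> int:
    """Count the page elements that contain the target phrase."""
    signals = ("domain", "filename", "title", "headings",
               "meta_description", "nav_links", "inbound_anchors")
    needle = _norm(phrase)
    return sum(1 for s in signals if needle in _norm(page.get(s, "")))

def looks_over_optimized(page: dict, phrase: str, threshold: int = 6) -> bool:
    """Flag a page when nearly every signal carries the same phrase."""
    return optimization_signal_count(page, phrase) >= threshold

page = {
    "domain": "www.blue-widgets.com",
    "filename": "blue-widgets-info.htm",
    "title": "Blue Widgets",
    "headings": "Blue widgets for sale",
    "meta_description": "Blue widgets, information and sales",
    "nav_links": "Blue widgets home",
    "inbound_anchors": "blue widgets",
}
```

A site hitting six or seven of seven signals with one phrase is exactly the pattern the bullet list above describes, which is why it would be so cheap for an engine to score.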


 9:54 am on Dec 6, 2005 (gmt 0)

Perhaps I should have said that I have also added new pages to the older site and most of these have gained rank very quickly.

[speculation]This would point to the filter being a site filter as opposed to a page filter.[/speculation]


 1:05 am on Dec 7, 2005 (gmt 0)

BeeDee, same experience here. New pages on an old, established site seem to be more trusted by Google than new pages on new sites. I find myself now adding more to the old site (my first) than to new ones, simply because the new ones have a harder time with Google. I think that's kind of lousy, but that's the way it is.


 1:22 am on Dec 7, 2005 (gmt 0)

I have noticed that old sites can do almost whatever they like, including heavy text-link purchases (even totally unrelated), and still do well.

It's not my call to judge whether that is good or bad, but new sites have lots of value in my eyes too. They often offer a good alternative when operated correctly, and Google should be able to recognise their efforts and increasing popularity without applying a blind out-of-the-box penalty based on what appear to be very dubious assumptions.

Also, you all seem to end up doing what most people do: adding pages to old sites. That feeds one of the biggest issues on the internet right now, namely balkanization. My 2 cents.


 9:22 am on Dec 7, 2005 (gmt 0)

Personally I have never had an OOP except for heavy inbound linking, that is, thousands of links pointing to my site using the very same anchor text.
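The "thousands of links with the very same anchor text" pattern described here can be sketched as a concentration measure: what fraction of inbound links share the single most common anchor. The 0.8 cutoff below is an invented illustration, not a known limit of any search engine.

```python
# Hypothetical anchor-text concentration check. The cutoff is made up.
from collections import Counter

def anchor_concentration(anchors: list[str]) -> float:
    """Fraction of inbound links using the most common anchor text."""
    if not anchors:
        return 0.0
    counts = Counter(a.strip().lower() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

def looks_like_anchor_spam(anchors: list[str], cutoff: float = 0.8) -> bool:
    """Flag a link profile dominated by one repeated anchor."""
    return anchor_concentration(anchors) >= cutoff

# A profile of 1,000 inbound links, 900 with identical anchor text.
profile = ["cheap widgets"] * 900 + ["Example Site"] * 60 + ["click here"] * 40
```

A natural link profile tends to mix brand names, bare URLs, and "click here"-style anchors, so a near-1.0 concentration is the kind of signal the poster suspects tripped the penalty.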


 12:12 pm on Dec 7, 2005 (gmt 0)

Hi, I am wondering if text hyperlinks are seen as link spam on a new site. After all, a list of them would look like: keyword, keyword, keyword, etc.

It's a perfectly natural way to do things, but then again Google seems so picky.

If that is the case, what would be the best way to do text hyperlinks: use an image, or something else?



 2:29 pm on Dec 11, 2005 (gmt 0)

Personally, I'm convinced that there are OOPs for page elements, filenames, etc., on Google and Yahoo. I'm pretty sure I've hit them in the past.


 4:12 pm on Dec 11, 2005 (gmt 0)

As I said in message eight, it would be very easy for the SE's to do this so we can conclude that they probably do.


 4:37 pm on Dec 11, 2005 (gmt 0)

I totally agree with BeeDeeDubbleU. It's like the "Old Vine Theory" seen with Google: the older the better. I have seen many old sites with over-optimization, and some even have pages with duplicate content, but they are not penalized.


 8:08 pm on Dec 11, 2005 (gmt 0)

I've triggered that before; I did it as an experiment. It took over a year or so for the page I tried to over-optimize to recover. I made it easy on Googlebot, though: I used on-page CSS so the bot didn't need to read the CSS library files. Using about three separate over-optimization techniques I dropped the page from the 20-50 range to below 200, where it stayed for a long time, even after I de-optimized the page. This was before the recent wave of updates, however, so I don't know if any of this still applies, but it definitely did back when I ran the test.

I no longer try things like that. My working theory is that the limits will grow tighter in the coming year: what worked before will become increasingly likely to trigger penalties in areas that previously gave you some room to mess around. Currently I'm exploring the wild idea that high-quality content will generate high-quality inbounds, and I'm dumping more and more SEO-related stuff every few months. Very clean pages are still working very well, though, as long as the site has not triggered any other potential warning flags; read the patent application to find out what those might be.


 8:17 pm on Dec 11, 2005 (gmt 0)

Perhaps a rethink is required. The site I used in my example is now about 13 months old and it still receives no Google traffic. If I had launched it without optimisation it would probably have come out of the sandbox much sooner, and I could then have chipped away at it to push it up the hill.

Perhaps that method would have been quicker?

Patrick Taylor

 8:40 pm on Dec 11, 2005 (gmt 0)

* It has a www.keyword1-keyword2.com type domain.
* It uses keywords in file names.
* It uses keywords in <H1>,<H2>,<H3>,etc.
* It uses keywords in <title>.
* It uses keywords in meta description, etc.
* It uses keywords in navigation links.
* It uses keywords in text content with several of these hyperlinked and bolded.
* It has inbound links that use the keywords.

Why is this over-optimisation or spam-like? It's as clean as it gets. It's nailing one's colours to the mast fair and square, and trying to disguise pages by "detuning the SEO" or some such phrase wouldn't make sense.


 8:45 pm on Dec 11, 2005 (gmt 0)

"Why is this over-optimisation or spam-like? It's as clean as it gets. It's nailing one's colours to the mast fair and square, and trying to disguise pages by "detuning the SEO" or some such phrase wouldn't make sense."

I completely agree with you. If this is the case, then how on earth do you go about designing a site? Dump all the text on a page and leave it alone? I think it is more a case that a new site is held back no matter what you do, unless you get backlinks from a few PR8s, of course.



 9:12 pm on Dec 11, 2005 (gmt 0)

"Why is this over-optimisation or spam-like?"

It isn't but that won't stop this goofy concept from getting mentioned every month or so.


 8:12 am on Dec 12, 2005 (gmt 0)

Why is this over-optimisation or spam-like?

This is over-optimisation because, if we are honest with ourselves, this combination of circumstances (particularly the hyphenated domain name) usually only arises when people are deliberately targeting keywords. Genuine companies just don't call themselves we-have-the-cheapest-viagra-in-the-usa.com.

Google's quality guidelines actually tell us, "Don't load pages with irrelevant words." It's hard to retain relevance using the techniques I described above.

Why is this over-optimisation or spam-like?

It isn't but that won't stop this goofy concept from getting mentioned every month or so.

I can assure you that there is nothing goofy about it when you are suffering from it. It's easy to just say, "It isn't". Perhaps you would be kind enough to enlighten us why you are so sure of yourself on this?

Patrick Taylor

 2:21 pm on Dec 12, 2005 (gmt 0)

BeeDeeDubbleU, you are talking about something different with "Don't load pages with irrelevant words." I'm talking about relevant keywords used to reinforce the topic of a page as clearly and consistently as possible. I very much doubt that Google would consider the example list you gave as "over-optimisation" (whatever that is). It wouldn't make sense - it's just plain honesty. If what you are saying is true, the solution would mean disguising the true topic of a page with less relevant keywords... that's why I don't think it makes sense.


 5:14 pm on Dec 12, 2005 (gmt 0)

I respectfully disagree Patrick. I am not talking about a site that uses the KWs naturally.

Let's say that a (new) site is about tartan widgets. If the site is called www.tartan-widgets.com and the KWs "tartan widgets" appear everywhere possible I would say that this is a site that has been designed to be found specifically for tartan widget searches.

Example ...

Domain name: www.tartan-widgets.com
Sub pages like: www.tartan-widgets.com/tartan-widgets-aboutus.htm and www.tartan-widgets.com/tartan-widgets-contact.htm
H1: Tartan widgets available here
H2: Tartan widgets for sale
H3: Tartan widgets info
<Title>Tartan Widgets</Title>
Meta Description: Tartan Widget site with information and sales of Widgets in Tartan
Meta Keywords: Tartan widgets, tartan, widgets, widgets in tartan
Hyperlinks: Tartan widget contact details (etc).
Inbound links: Tartan Widget Site (etc).

I would contend that the chances of the above occurring naturally are very slim. No?

BeeDeeDubbleU, you are talking about something different with "Don't load pages with irrelevant words."

The example above is surely that of a page being loaded with irrelevant keywords. If the subject of a page is established with an H1, "Tartan widgets available here", then the H2 should probably just be, "Sales" and the H3, "Info".

If I was a search engine and I was looking for signs of spam or pages that are trying to game the system then the above would be a very easy place to start. No?


 7:18 pm on Dec 12, 2005 (gmt 0)

I have tested this OOP on a few sites I built recently. One site I highly optimized, and I added a few links to it from other similar sites.

Right out of the box it ranked well for its keywords for about a month, and then it was sandboxed.

The second site was basically plain text, without H1 tags or any optimization whatsoever. It did not rank well right away like the first site, and it is also still sandboxed.

The only difference between the two is that I actually made money for a month off the one I optimized.


 7:38 pm on Dec 12, 2005 (gmt 0)

The OOP and the sandbox are two different things. Even sites that have not been optimised are likely to be sandboxed.

I launched three websites at roughly the same time just over a year ago. All three were sandboxed. Two were non-commercial and not heavily optimised; they are now out and ranking well. The other is commercial and heavily optimised, and it has now been in the SB for almost 13 months. Conclusion? Over-optimisation may cause your site to be sandboxed for longer.


 8:03 pm on Dec 12, 2005 (gmt 0)

I am tending to think that it is best to optimize, as that will go down well with MSN; you may as well get some hits from them.

With Google you are sandboxed whatever you do SEO-wise, so you may as well make some money via MSN while you wait?

Patrick Taylor

 10:47 pm on Dec 12, 2005 (gmt 0)

BeeDeeDubbleU - as you point out, it's a matter of degree. Keyword stuffing, yes. My point is that to use a keyword or keyphrase in all those places you listed is (in itself) not spamming but clear communication and logical construction.

Maybe in some cases it does trip some kind of filter (or something) but it wouldn't make sense to "penalise" a page or site purely for that.


 2:47 am on Dec 13, 2005 (gmt 0)

IMHO there are only sites that are optimized. There can be no over-optimized or under-optimized, just optimized or not.
There are no set rules. You can overstuff and trigger filters, or understuff and not do well either. You can have too much or too little exact-match inbound link text. You can have problems with keyword proximity in your content, and that is only the beginning of the what-ifs.
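The "overstuff" side of this can be made concrete with a simple keyword-density calculation: exact keyword matches as a share of total words. The 5% ceiling implied in the comment below is a made-up figure for the example; no search engine publishes such a threshold.

```python
# Illustrative keyword-density check. The notion of a hard density limit
# is the poster's speculation; the numbers here are invented.
import re

def keyword_density(text: str, keyword: str) -> float:
    """Share of the page's words consumed by exact keyword-phrase matches."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return hits * n / len(words)

# Half of this 8-word snippet is the phrase "green widgets" (density 0.5),
# far beyond any plausible natural-writing level (e.g. a 0.05 ceiling).
sample = "green widgets on sale: green widgets shipped fast"
```

The point of the post stands either way: even if such a measure exists, nobody outside the engine knows where the line sits, so "optimized" can only mean "ranking".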

If you hit the nail on the head and have your site listed in the number 1 slot, then in Big G's opinion your page is optimized for the keyword, but Y! and MSN may not think so.

As discussed many times in the past, there can be no OOP.

Again IMHO,

Back to watching


 4:51 am on Dec 13, 2005 (gmt 0)

Well, it all depends on the point of view.

If, due to white-hat optimization, the site drops in the SERPs, to the webmaster it looks like a penalty.

For Google it may just be an attempt to calculate the value of the site to end users, who care little about blue_widget_best_discount_deal in the URL because most never look at the URL and never type it.

However, since most people here are webmasters, why not call it a penalty for short? If it looks like a duck, sounds like a duck, and swims like a duck, it's probably a duck :)


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved