Over Optimization Penalty (OOP)

Please Explain...

         

experienced

6:44 am on Dec 13, 2003 (gmt 0)

10+ Year Member



Hi,

Could anyone explain the exact definition of OOP? At which stage does G count a page as over-optimized?

Based on my limited experience...

Title - optimize for 2-3 phrases.

Description - must be a good description of the page, product, or service, whatever it is.

Keywords - target your general keywords; it doesn't have much effect, but it sometimes helps.

Alt tags - must be relevant, not just anything. I think 2 phrases are OK.

H tags - build your page with proper use of these beautiful heading tags, like <H2> and <H3>. Each tag should appear on the page only once.

Content - this is the key factor for any industry working the SEs. Put as much content on the page as you can, but it must not be duplicated, and it must be relevant.

Rich content - don't stuff your keywords and key phrases into the content just to work them in between your text :)

Keyword density - use 3 to 7 percent.

Incoming links - try to get lots of incoming links from relevant sites.

Outgoing links - when making a link from your site, please be careful about whether it points to a relevant site or not. (Let me clarify what I mean by relevant: the site you are referring to from your pages must also follow all the SE rules and be in the same industry, or at least not entirely outside your industry.)

Submission – do as much of it as you can, manually.

Now, any experts here, I would like you to analyze these points. If this makes a page over-optimized, then what kind of page does G accept? And if it is not over-optimized, then why did my sites get dropped from Google?

Thanks in Advance

Exp...
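
As an aside on the keyword-density point above: the 3-to-7 figure is normally read as a percentage of the page's words. Here is a minimal sketch of that calculation; the function name and sample figures are hypothetical, not anything Google has published:

```python
def keyword_density(text: str, phrase: str) -> float:
    """Percent of the words in `text` accounted for by occurrences of `phrase`."""
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    # slide a window over the page words and count matches of the phrase
    hits = sum(words[i:i + n] == target for i in range(len(words) - n + 1))
    return 100.0 * hits * n / len(words) if words else 0.0

# e.g. a 100-word page mentioning a 2-word phrase 3 times has 6% density
```

By this measure, the "3 to 7" rule would mean a target phrase accounts for roughly 3-7% of the page's words.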

I_am_back

9:34 am on Dec 13, 2003 (gmt 0)



That all sounds pretty fair. Just these points:

Keyword density - use 3 to 7 percent.

I wouldn't even look at it! Just write to aptly describe what the page is about.

H tags - build your page with proper use of these beautiful heading tags, like <H2> and <H3>. Each tag should appear on the page only once.

I can see no reason not to use H2, or deeper headings, more than once.

Rich content - don't stuff your keywords and key phrases into the content just to work them in between your text :)

I don't get this one.

Submission – do as much of it as you can, manually.

Possibly true for all except Google. I have heard (no idea if it's true) that Google would rather find a site via a link from another site.

The whole SEO industry has been built on fear and lies IMO. Just write for humans and follow standard HTML practices.

amazed

9:43 am on Dec 13, 2003 (gmt 0)

10+ Year Member



I am not sure there is an over optimization penalty as such.

I noticed quite some time pre-Florida that keyword combinations can be used too often, and I actually regained a page by reducing them. It's not as simple as keyword density, I think, but is connected with how near the words are to each other in the text.

What seems to happen now is that Google has identified key search phrases used worldwide in the context of Google News and probably Froogle (I can't really check that from here), and uses this identification in the SERPs too.

You can see this when you type in a single search word of some news (or, probably in the context of Froogle, commercial) importance. You will get Google News items at the top of the page, and the SERPs quote authoritative sources, i.e. government agencies, .edu sites, newspapers, etc. Combine that search word with a second word that makes sense in the context, and Google News will have disappeared; you will get the normal SERPs, where everybody has a chance to get his or her stuff in.

To me it seems that Google is moving in the direction of a mix: an authoritative-sources/news/Froogle directory on single key phrases, and an automated web search on multiple keywords.

valeyard

9:46 am on Dec 13, 2003 (gmt 0)

10+ Year Member



OOP is simply a hypothesis to explain the recent changes to SERPS. Some people are convinced it exists - others aren't. They can all provide observations to support their point of view.

There's no solid proof as to whether or not OOP exists, let alone the exact details.

Dolemite

10:03 am on Dec 13, 2003 (gmt 0)

10+ Year Member



The problem is, there's no linear pattern or consistent correlation between optimization level and SERP position (or lack thereof).

Everything varies by keyword and then varies again based on factors we haven't yet pinned down.

percentages

10:06 am on Dec 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>There's no solid proof as to whether or not OOP exists, let alone the exact details.

Well said... Personally I think the hypothesis that the world was flat had more merit: go too far and you will fall off the edge. Both sound similar ;)

I_am_back

10:11 am on Dec 13, 2003 (gmt 0)



hypothesis that the World was flat

You mean the world *is* flat :o)

Bobby

1:08 pm on Dec 13, 2003 (gmt 0)

10+ Year Member



I noticed quite some time pre-Florida that keyword combinations can be used too often, and I actually regained a page by reducing them

I think you're on to something, amazed.
Could you elaborate on the circumstances and the remedy employed?

I have focused primarily on 3-word key phrases; they seem to really get singled out.

amazed

1:25 pm on Dec 13, 2003 (gmt 0)

10+ Year Member



In my case it was two combined keywords, non-commercial, with nobody really searching for them much, so it must have been a general filter.

As I said, it was well before Florida, so it should be nothing new.

experienced

5:46 am on Dec 15, 2003 (gmt 0)

10+ Year Member



Hi,

I_am_back, some replies for you:

"I can see no reason not to use H2, or deeper headings, more than once."

I think if we use H tags again and again, like <H1><H2><H2><H3><H3><H3>, that must make the page over-optimized, and it will get the penalty. <H1><H2><H3> is OK.

About rich content, I should clarify. Rich content means working your keywords into your content like this: "text text text text text Keyword 1 text text text text Keyword 2, Keyword 3, Keyword 4, text text Keyword 5". This also makes your page over-optimized.

Submission – do as much of it as you can, manually.

"Possibly true for all except Google. I have heard (no idea if it's true) that Google would rather find a site via a link from another site."

This is possibly true, but if you don't submit your site to the different engines, sites, and directories, how will G get your link?

I am following all the rules of every engine, especially G, but I'm still not getting my sites back. :(

martinibuster

6:10 am on Dec 15, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I am following all the rules of every engine, especially G, but I'm still not getting my sites back.

Time to rethink your rules.

Folks, this is all speculation. To me it doesn't make sense, and this is my reason why:

To say that Google is going after websites that bear the traits of over-optimization is to say that, in the quest to return relevant results, Google is penalizing sites on criteria other than relevance.

I think Google is more interested in finding relevant websites. Part of doing that is going beyond dependence on website designers to create websites that are "search engine friendly" because, let's face it, most websites on the net are not search engine friendly.

I'm not saying that Google has abandoned the old criteria, only that there's been a shift (which is stating the obvious, I know).

What this means is that there are a large number of relevant websites not showing up in the results because of their so-called search engine "unfriendly" design.

I'm not saying this is what Google is doing; however, if Google took a step toward determining what a website is about, and what a particular page is relevant for, apart from traditional factors, that would be a good first step and would appear to some as an OOP.

Of all the updates I recall, there has always been some kind of fallout, with people claiming that inbound anchor text has been dampened, and so on. The OOP is just the latest of many speculations.

Furthermore, looking at the SERPs, I can say with some confidence that there are still many sites out there employing the old formulas that are doing quite well, so from my observation the OOP is a red herring.

Shoplifter

7:53 am on Dec 15, 2003 (gmt 0)



The OOP is a myth. It is a keyword filter, and simple spam is getting through on any non-filtered term.

Take a look at the results for "paris hilton video".

Bobby

8:50 am on Dec 15, 2003 (gmt 0)

10+ Year Member



martinibuster and Shoplifter,

I agree that there is no OOP and that it is simply a KW filter. The KW filter appears to weigh KW density quite heavily, possibly along with KWs in the title tag. So in a sense there IS an OOP for certain KWs and phrases.

The KW filter seems to be based on a dictionary of sorts, which many have suggested is a commercial filter. It may be generated by looking at the most common queries, much the way Overture provides suggestions in its keyword suggestion tool.

I believe the combination of:

  • marked keywords (from the dictionary)
  • over-optimization of these keywords
  • kw density (of marked keywords) with respect to other words in query

together trigger the filter.

The threshold point is apparently somewhere between 60% and 75%.

Therefore, if 3 words are marked for death by Gexecution (don't bother looking it up in the dictionary), like big blue widgets, a search for "new big blue widgets" (3 marked words out of 4, 75%) gets the axe, while a search for "brand new big blue widgets" (3 out of 5, 60%) makes it through.

While this is only my theory, it holds up for all of my sites that have been hit by the OOP/filter, and across numerous searches.

I'd love to get some feedback from others as to whether or not it holds its ground.

Oh yeah, who remembers the song "Don't Fear the Gexecutioner"?
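
The percentage arithmetic in Bobby's theory can be sketched in a few lines of Python. The marked-word dictionary, the threshold value, and the function name are all hypothetical here, illustrating the thread's speculation rather than any documented Google mechanism:

```python
# Sketch of the speculated filter: a query trips it when the fraction of
# its words found in a "marked" dictionary crosses a threshold somewhere
# in the 60-75% band. Dictionary and threshold are made up for illustration.
MARKED = {"big", "blue", "widgets"}  # hypothetical marked keywords
THRESHOLD = 0.70                     # somewhere between 60% and 75%

def trips_filter(query: str) -> bool:
    words = query.lower().split()
    marked_fraction = sum(w in MARKED for w in words) / len(words)
    return marked_fraction > THRESHOLD

# "new big blue widgets"       -> 3/4 = 75% marked -> filtered
# "brand new big blue widgets" -> 3/5 = 60% marked -> passes
```

With the threshold placed anywhere between 60% and 75%, this reproduces Bobby's two examples: the 4-word query is axed, the 5-word query gets through.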

shaadi

10:10 am on Dec 15, 2003 (gmt 0)

10+ Year Member



The OOP is a myth

I kinda agree with this.