Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 85 message thread spans 3 pages: < < 85 ( 1 [2] 3 > >     
Fear of Over-Optimisation Penalty ...
... well fear not.
le_gber

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 29032 posted 12:53 pm on Apr 14, 2005 (gmt 0)

Hi,

First of all, this is not a rant or any similar 'my competitor does this' kind of post; it's fact-based, so can we try to keep it that way.

There have been a lot of posts written about the fear of an over-optimisation penalty, but I want to put everybody's mind at rest - THERE IS NO OVER-OPTIMISATION PENALTY.

Common fears of over-optimisation often appear when you want to do something with your site that you wouldn't necessarily be able to justify against one or another of the guidelines given by the W3C: use of keywords in comments, use of multiple wordy and linked H1s, multiple keywords in img alt attributes, etc.

Let's try to weed out some of your fears.

The site used in this example ranks no. 1 on G for a lot of very competitive keywords (20-30+ million results). It's also in the top 10 on all three major players in the SE market (organic listings), so maybe we can learn from it. The site 'tested' always returned the same page for each keyword tried, so the examples below apply to that page.


  • comments with keywords in them - there's no proof of comments being read by search engines, and comments should be used to make your site clearer - so if you want to use them, why not
  • multiple H1s - although not structurally correct (according to the W3C), they don't seem to have a negative effect on a site's ranking
  • wordy H1s (20 words or so) - although long and painful for screen readers (especially if you have many), they don't seem to have a negative effect on a site's ranking
  • linked H1s - again, no negative effect can be seen
  • text - a lot of it (1500+ words on the homepage) - this seems to match what we know about search engines: they love content
  • PR - very good for a site on this topic (7), so again, work on your backlinks
  • meta keywords and description - G displayed the meta description in the SERPs for all the searches I did, and so did MSN - so work on your meta description
  • text manipulation using CSS - this doesn't seem to have a negative impact on the site
  • too many links per page - again, no proof of that having a negative impact - on the site 'tested' there are 250+ links sharing an average of 6 words each (yes, that's almost a whole page of links)


    As always, discussing what other people do is very difficult. It can sound like a rant or a complaint. This will not help.
    It can also sound like finger pointing - 'he' doesn't follow the W3C guidelines, I do, and I am nowhere near the no. 1 spot. This will not help either.

    I am not condemning this company; they do offer the services that they rank well for. They are also prepared to take some 'risks' that I am not (if there is no over-optimisation penalty, are these really risky methods?).

    So what are my options? I can take it upon myself to try to find alternative keywords, or I can be prepared to take those 'risks' as well. I choose the former.

    Anyway, I hope that those of you who were thinking of optimising your site but feared the over-optimisation penalty will have found the peace of mind to do so, thanks to this post. I wish you good luck here.

    Leo


    [disclaimer]these are facts noticed on a single site that is doing well - if you try to do the same on yours and get banned, don't come and blame me ;)[/disclaimer]

    I also want to add that it's not worth stickying me to get the site URL - I won't give it to you :)


    MHes

    WebmasterWorld Senior Member 10+ Year Member



     
    Msg#: 29032 posted 9:38 am on Apr 16, 2005 (gmt 0)

    >a) Spammers have a tendency to over optimise.

    This is often a result of generated pages. A formula for generating pages will take a phrase and put it into the title, H1, bold text, navigation, etc.

    I think it is these pages Google wants to catch; thus the OOP is aimed at auto-generated pages where the phrase is repeated in key areas. A handwritten page can have the phrase heavily optimised but not necessarily be penalised. I would suggest that the template is a trigger component of a page that gets caught by the OOP. If the phrase appears in a number of places and the site has a regular template, then this could flag up the OOP. There are plenty of pages that rank well and are keyword stuffed, but I wonder if those pages simply don't fit a 'generated page' profile.
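    To make that concrete, here is a toy sketch (my own illustration, nothing to do with Google's actual algorithm) of how repeating one phrase across a template's key areas could be scored:

```python
# Toy illustration: count how many "key areas" of a page repeat the
# same phrase - the kind of uniform repetition a generated template
# produces. The area names and the scoring are hypothetical.
def oop_trigger_score(page: dict, phrase: str) -> int:
    """page maps an area name (title, h1, bold, nav, ...) to its text."""
    phrase = phrase.lower()
    return sum(1 for text in page.values() if phrase in text.lower())

page = {
    "title": "Enamel Dobbers - Buy Enamel Dobbers Online",
    "h1": "Enamel Dobbers",
    "bold": "cheap enamel dobbers",
    "nav": "home | enamel dobbers | contact",
}
print(oop_trigger_score(page, "enamel dobbers"))  # 4 - every key area repeats the phrase
```

    A handwritten page would usually score lower here simply because a human varies the wording from area to area.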

    bbcarter

    5+ Year Member



     
    Msg#: 29032 posted 9:54 am on Apr 16, 2005 (gmt 0)

    The sites I'm thinking of have AdSense and internal links and little or nothing else.

    If it is computer generated, it uses patterns- perhaps in such a small site you could introduce randomness in terms of which internal links you placed on a given page...

    but still, I would think these computer generated pages would follow a formula, and the fact is that many SEO minds like ours follow our own formulas... the alternative is random variety, which at some point places you in the indistinguishable band of mediocrity which won't ever rank highly.

    can an algorithm tell the difference between a computer following a formula created by a human, and a human following their own formula? I doubt it, especially if the computer formula introduces perhaps 2% of 'random' error.
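    As a toy sketch of that 2% idea (purely hypothetical code, with made-up synonym data), the generator just has to vary its output occasionally so pages don't share an exact fingerprint:

```python
import random

# Hypothetical page generator with "random error": each word has a small
# chance of being swapped for a synonym, so generated pages stop looking
# identical. The synonym table and rate are made up for illustration.
SYNONYMS = {"cheap": ["affordable", "budget"], "buy": ["order", "purchase"]}

def jitter(text: str, rate: float = 0.02, rng=random) -> str:
    words = []
    for word in text.split():
        alternatives = SYNONYMS.get(word.lower())
        if alternatives and rng.random() < rate:
            word = rng.choice(alternatives)  # inject the 'random error'
        words.append(word)
    return " ".join(words)

print(jitter("buy cheap widgets", rate=0.0))  # rate 0: "buy cheap widgets", unchanged
```

    With a rate of 0.02 only about one word in fifty changes, which is exactly why an algorithm would struggle to separate this from a human following their own habits.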

    So the only place for Google to go is to make the formula so complicated that no one could figure it out. At that point, I suspect at least three things could happen-

    1. There become numerous ways to rank well
    2. It may become impossible to beat all criteria if some criteria are contradictory
    3. Your search results could suck

    The crazy thing is, if you punish for OO, you're screwing the white hat SEOers who just want people to find their pages. And I'm talking even obscure things - you should be able to find, for example, the unique copyright text that's on every page of my site a lot higher than #21. You shouldn't have to say your company name 15 times on the same page to get G to realize it's your company's website. But if that's what it takes, then why would G penalize people who do that?

    Maybe G has painted themselves into a corner. Or else they never expected any website to have more than, say, 10% of its pages ranking well?

    Who knows. :-)

    Leosghost



     
    Msg#: 29032 posted 10:13 am on Apr 16, 2005 (gmt 0)

    MHes, re your message #27 - wish it were so :) In fact, according to the latest research, over 25% of surfers still don't know that the right-hand side of the SERPs is paid ads; the rest probably think it's something that's done to make the page look nice.
    "G" knows this - hence AdSense and AdWords work.

    BeeDeeDubbleU

    WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



     
    Msg#: 29032 posted 10:21 am on Apr 16, 2005 (gmt 0)

    LG, have you actually seen this research and can it be trusted? I think it may be underestimating the knowledge that people now have about the Internet. But then even if it is 25% this would mean that 75% are aware that they are looking at ads as opposed to something to make the page look nice.

    BeeDeeDubbleU

    WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



     
    Msg#: 29032 posted 10:22 am on Apr 16, 2005 (gmt 0)

    Sorry. I just realised that we are moving off topic :)

    BeeDeeDubbleU

    WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



     
    Msg#: 29032 posted 11:41 am on Apr 16, 2005 (gmt 0)

    I just noticed something on one of my sites that (I think) adds weight to the OOP argument.

    I have a site that I created in November last year. As yet it gets virtually no Google traffic. It has a domain name keyword1-keyword2.co.uk and it has been highly optimised specifically for the phrase keyword1 keyword2. It is nowhere to be found in G on a search for this phrase. However this morning I noticed that it appears in position 10 in Google UK on a search for keyword2 (a common word that yields 7.6M results in Google UK).

    Wouldn't this suggest that my site is suffering from an OOP as opposed to the sandbox?

    Wizard

    5+ Year Member



     
    Msg#: 29032 posted 12:13 pm on Apr 16, 2005 (gmt 0)

    I have a site that I created in November last year. As yet it gets virtually no Google traffic. It has a domain name keyword1-keyword2.co.uk and it has been highly optimised specifically for the phrase keyword1 keyword2. It is nowhere to be found in G on a search for this phrase.

    I have noticed a very similar thing. For example, I have a site where URLs look like mysite.com/global_keyword1/global_keyword2/local_keyphrase.html,
    where the global keywords define the topic of the whole site, and local_keyphrase is the targeted phrase of a particular page. Deep pages typically have a 12% density of the targeted phrase. And they rank well for everything but their targeted phrases :))

    On another site of mine, I happen to have 50% density, made mostly with alt attributes on images (it's an image gallery with the category name given in each alt), and sometimes the keyphrase is also in the URL, sometimes not. In the first case, ranking is below #100 (nothing strange to me, as 50% density is clearly too much), but in the second case - if the keyphrase doesn't repeat in the URL - they are #1.

    What about spammy snippets - I think Google could catch these sites easily: if any keyword repeats more than once in the title and three times in the meta, give a penalty. I'm surprised they don't do it right now.
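    That rule could be sketched like this (the exact thresholds are my reading of the suggestion above, not a known Google rule):

```python
from collections import Counter

# Hypothetical "spammy snippet" check: flag a page if any single word
# appears more than once in the title AND more than three times in the
# meta keywords. Thresholds are illustrative.
def spammy_snippet(title: str, meta_keywords: str) -> bool:
    title_counts = Counter(title.lower().split())
    meta_counts = Counter(meta_keywords.lower().split())
    return any(title_counts[w] > 1 and meta_counts[w] > 3 for w in title_counts)

print(spammy_snippet(
    "widgets widgets for sale",
    "widgets blue widgets red widgets cheap widgets best widgets",
))  # True - 'widgets' appears twice in the title and five times in the meta
```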

    bbcarter

    5+ Year Member



     
    Msg#: 29032 posted 8:57 pm on Apr 16, 2005 (gmt 0)

    BDW- re: % of surfers who realize ads are ads

    you can find a similar stat on a certain itfacts blog...

    "Only 18% of Web users can distinguish between search result and advertisement"

    unfortunately, the source linked to from that blog no longer displays that info.

    ah, here we go- it's actually in a WebmasterWorld post
    [webmasterworld.com...]

    jd01

    WebmasterWorld Senior Member 5+ Year Member



     
    Msg#: 29032 posted 10:44 pm on Apr 16, 2005 (gmt 0)

    Has anyone ever toyed with the idea that maybe Google is applying a non-penalized version of the algo to the AdWords results (on the right)?

    What I mean is what if they apply some form of the algo, so as the paid listings increase in relevance the middle of the page results drop?

    For instance, if three of the sites that are paying for ads would be included in the top 10 if they had no penalty, why not start the middle-of-the-page results at #4 and drop the top results?

    This could give the appearance of an OOP as highly relevant sites are dropped, because the paid results are also relevant and it would also give the appearance of an ever changing algo, because the paid results are always changing.

    Don't know, just wondering, but it sure would make it tough to optimize for, because you would never know which ads are going to be displayed - so do you need to be #1 to be #1, or do you need to be #7 to be #1?

    Justin

    BeeDeeDubbleU

    WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



     
    Msg#: 29032 posted 1:02 am on Apr 17, 2005 (gmt 0)

    "Only 18% of Web users can distinguish between search result and advertisement"

    Let's get back on topic :)

    maccas

    10+ Year Member



     
    Msg#: 29032 posted 11:23 am on Apr 17, 2005 (gmt 0)

    I have just noticed that one of my sites has been hit by some sort of over-optimisation penalty or keyword penalty. It only affects certain pages. One of my biggest hits is for a page that has ranked number 1, 2 or 3 for about 3 years for a non-commercial term that pulls 23,100,000 results.

    As of today, or the last few days, it now ranks nowhere for "keyword keyword2 keyword3" or any combination of those words; the only way I can get it into the results is if I add another term that is on my page. It still has its PR and is one of the few sites for this search that actually has outside links to it.

    maccas

    10+ Year Member



     
    Msg#: 29032 posted 12:03 pm on Apr 17, 2005 (gmt 0)

    Yep, definitely some sort of keyword (phrase) penalty - that page ranks number 2 if I omit one of the words.

    BeeDeeDubbleU

    WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



     
    Msg#: 29032 posted 1:30 pm on Apr 17, 2005 (gmt 0)

    Anyone else seeing this?

    BillyS

    WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



     
    Msg#: 29032 posted 1:46 pm on Apr 17, 2005 (gmt 0)

    I can believe this is true. I don't know about you, but it seems to me that pages are ranking well for phrases if they really don't even have that phrase in it. For example, you seem to get more related topics than the actual topic itself.

    I did some testing around this on Friday. I asked some pretty simple questions of Yahoo, Ask and Google. It seemed to me that Google did not get to the optimum answer until around result number 10. I could usually find it among the first 5 in Yahoo.

    The other thing I've noticed lately is that results #1 and #2 sometimes have very little to do with the search term. None of these searches in Google had the exact phrase in the title.

    RS_200_gto

    10+ Year Member



     
    Msg#: 29032 posted 2:35 pm on Apr 17, 2005 (gmt 0)

    We have the same problem as maccas, but we also found out that Google indexes us by widget.com.aboutus/ instead of by the front page.

    maccas

    10+ Year Member



     
    Msg#: 29032 posted 3:07 pm on Apr 17, 2005 (gmt 0)

    it seems to me that pages are ranking well for phrases if they really don't even have that phrase in it

    Absolutely! I have come across some real shockers over the last week. And like yourself I am finding Yahoo results more relevant than Google.

    MHes

    WebmasterWorld Senior Member 10+ Year Member



     
    Msg#: 29032 posted 5:04 pm on Apr 17, 2005 (gmt 0)

    I have a site in the travel sector which for years has appeared around position 15 for many keywords. Internal pages rank well. 10 days ago I ran an experiment with an internal page and split the target phrase on the page (not the title) so nowhere did it appear as the phrase.... no difference. I then did the same on the home page (with some apprehension!) and the same... no difference. Go figure?!

    maccas

    10+ Year Member



     
    Msg#: 29032 posted 5:29 pm on Apr 17, 2005 (gmt 0)

    I have just reshuffled the keywords around a bit on the page and in the title, and reduced the keyword density. If it is still down the gurgler in a week or so, I will try changing my footer, which also has "keyword keyword1 keyword2" as a href to that page. If it is still nowhere to be found after that, then I guess it's a permanent penalty.

    [edited by: maccas at 5:31 pm (utc) on April 17, 2005]

    Spine

    10+ Year Member



     
    Msg#: 29032 posted 5:30 pm on Apr 17, 2005 (gmt 0)

    I've seen evidence of that too. I can make some fairly drastic changes to a page and a few days later it appears to be cached and listed with the changes, but ranks the exact same in the SERPs.

    I've officially switched to Yahoo over the past week for my own searching. Even when I thought Google was a little weird lately, I'd still stick with them for my search needs - no more.

    Yahoo seems to have fixed their spammy 'inktomi' results up, right now it's fairly clean of computer generated spam, relevant and fresh, and the SERPs seem to have a more interesting blend of sites. It's good, kinda like Google 2-3 years ago :D

    BeeDeeDubbleU

    WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



     
    Msg#: 29032 posted 8:16 pm on Apr 17, 2005 (gmt 0)

    I then did the same on the home page (with some apprehension!) and the same... no difference. Go figure?!

    It may be that any OOP, like the sandbox, is only being applied to new sites?

    bbcarter

    5+ Year Member



     
    Msg#: 29032 posted 9:34 pm on Apr 17, 2005 (gmt 0)

    Well I don't know about you guys, but I've never liked using weird phrases that don't contain stop words...

    So if this is how it's going to be, we're going to start using the normal phrasing that's only implied by keyphrase helpers like the Overture suggestion tool. Don't you think that's where the SEs will have to go in the future anyway?

    Or do you suppose a better response would be just to vary your usage more throughout the html? Then I wonder if you're risking losing ground because of mediocre lack of pattern... I'm lost trying to figure this one out.

    B

    BeeDeeDubbleU

    WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



     
    Msg#: 29032 posted 6:26 am on Apr 18, 2005 (gmt 0)

    I'm lost trying to figure this one out.

    You are not alone :(

    le_gber

    WebmasterWorld Senior Member 10+ Year Member



     
    Msg#: 29032 posted 7:58 am on Apr 18, 2005 (gmt 0)

    Wizard, Msg #24
    When you say kwd densities of 7% and 12%, are you talking 'enamel dobbers' only, or 'enamel dobbers' + possible variations (dobber, enamelling ...)?
    Also, speaking of keyword density, how about keyword (and variation) occurrences - a 7% density on a 500-word page would see more 'occurrences' of your keyword than a 12% density on a 100-word page.
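    A quick sketch of the arithmetic (just density × word count):

```python
# The same density means very different absolute keyword counts
# depending on page length - density alone hides the occurrence count.
def occurrences(density_pct: float, total_words: int) -> int:
    return round(total_words * density_pct / 100)

print(occurrences(7, 500))   # 35 occurrences on the 500-word page
print(occurrences(12, 100))  # 12 occurrences on the 100-word page, despite the higher density
```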

    MHes, Msg #27
    I think the average user is learning to take more time examining the snippets and titles

    off topic - I'm with LG on this one; I am not sure that this is true. I think that the advanced user does filter by checking the snippet, but the average user only looks at the title (if he even does that).

    BeeDeeDubbleU
    OK, let's look at it from another angle because to me it's as easy as ABC.
    a) Spammers have a tendency to over optimise.
    b) Google wants to get rid of spam.
    c) Google has the technology to detect over optimisation.
    Can someone give me one good reason why they would not use it?

    Spammers spam, end of story - otherwise they wouldn't be called spammers. Over-optimisation (which has yet to be defined) is only optimisation pushed to the limits of what one thinks is legitimate. Of course everyone has different limits in his/her mind, so a penalty applied to a site might be a spam penalty if you used hidden text or white-on-white text, but again, I don't think that there is an OOP. For me there are only 4 types of site:

  • non optimised
  • badly optimised
  • well optimised
  • spam optimised

    And again, the spam-optimised sites are not all weeded out, so why would G penalise someone who tried to make his site gain better rankings by using legitimate techniques, when they know that spam still 'infests' their results?

    Wouldn't this suggest that my site is suffering from an OOP as opposed to the sandbox?

    Not necessarily - it may still be the sandbox, as I think (sorry, I haven't read a great deal about it) that while in it you can still rank for some keywords, just not the ones that you targeted.

    Leo

    BeeDeeDubbleU

    WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



     
    Msg#: 29032 posted 9:56 am on Apr 19, 2005 (gmt 0)

    Over-optimisation (which has yet to be defined) is only optimisation pushed to the limits of what one thinks is legitimate.

    I wouldn't dispute this. My point was that my evidence suggests that there is an OOP at work within the algo that may or may not be part of the sandbox. I won't try to claim that I am convinced about this because I don't have enough experience at this to do so.

    And again, the spam-optimised sites are not all weeded out, so why would G penalise someone who tried to make his site gain better rankings by using legitimate techniques, when they know that spam still 'infests' their results?

    I think the issue is the definition of "legitimate techniques". If a site has the domain name enamel-dobbers.com and these two words appear in all the strategic places then is it not logical to assume that the site has been created expressly to be found for those keywords? It may follow that G has decreed that such sites, while perhaps legitimate, are more likely to be spam and as a result they should gain no ranking points.

    webace

    5+ Year Member



     
    Msg#: 29032 posted 10:08 am on Apr 19, 2005 (gmt 0)

    I know a site that ranks top in thousands of 2-keyword SERPs using cfm and hidden anchor text. No penalty or sandbox.
    Check #1 on keywords: <snip>
    and click on Cached, and then on 'Click here for the cached text only'.

    [edited by: Brett_Tabke at 2:43 pm (utc) on April 19, 2005]
    [edit reason] please no specific kws... [/edit]

    Catfish

    5+ Year Member



     
    Msg#: 29032 posted 8:57 pm on Apr 19, 2005 (gmt 0)

    Y'all would be better off forgetting about keyword density; write compelling content that gets your audience to take their target action, and get links from authoritative sites that are relevant to your own.

    kevsh

    5+ Year Member



     
    Msg#: 29032 posted 6:00 pm on Apr 24, 2005 (gmt 0)

    As others have suggested in this thread, is it possible that things like the number and quality of inbound links are essentially causing Google to say, "Okay, they may be over-optimizing here, but it is obvious from the value and number of links they have that this site is the most relevant for [keyword/phrase]. So in this case - where there may be some doubt, and we certainly don't want to hurt a site without being sure it is doing wrong - let's not penalize it."

    larryhatch

    WebmasterWorld Senior Member 10+ Year Member



     
    Msg#: 29032 posted 10:44 pm on Apr 24, 2005 (gmt 0)

    Maybe we should forget search engines exist for a moment and just write naturally for the benefit of the readers. Keyword density should take care of itself then. -Larry

    Vadim

    10+ Year Member



     
    Msg#: 29032 posted 1:35 am on Apr 25, 2005 (gmt 0)

    Maybe there is no OOP, but there is an attempt to show the true value of the content:

    1.If you have good content but *not* optimized, you are ranking high.

    2.If you have good content *and* optimized, you also are ranking high but *not higher* than in the case 1.

    3.If you have bad content and optimized (spam) you are ranking low.

    The problem seems to be that the implementation is not perfect yet.

    Vadim.

    BeeDeeDubbleU

    WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



     
    Msg#: 29032 posted 6:18 am on Apr 25, 2005 (gmt 0)

    2.If you have good content *and* optimized, you also are ranking high but *not higher* than in the case 1.

    My experience of the above is that I am not ranking at all, even where the sites in question have excellent, original content. This is not just my own assessment. I get emails from people congratulating me on the content of these sites.

    Vadim

    10+ Year Member



     
    Msg#: 29032 posted 3:57 am on Apr 26, 2005 (gmt 0)

    My experience of the above is that I am not ranking at all, even where the sites in question have excellent, original content.

    Well, as I wrote, their algorithm probably is not perfect. I believe that it is difficult to calculate the correct weighting factor.

    In other words, I believe that the danger of over-optimizing is real, but the low rank is not a penalty. It is an error.

    Otherwise it is difficult to explain why your good content has low rank.

    Vadim.
