
Google SEO News and Discussion Forum

Fear of Over-Optimisation Penalty ...
... well fear not.
le_gber




msg:749353
 12:53 pm on Apr 14, 2005 (gmt 0)

Hi,

First of all, this is not a rant or a 'my competitor does this' kind of post; it's based on facts, so can we try to keep it that way.

There have been a lot of posts written about the fear of an over-optimisation penalty, but I want to put everybody's mind at rest - THERE IS NO OVER-OPTIMISATION PENALTY.

Common fears of over-optimisation often appear when you want to do something with your site that you wouldn't necessarily be able to justify against one or another of the guidelines given by the W3C: use of keywords in comments, use of multiple wordy and linked h1's, multiple keywords in img alt attributes, etc.

Let's try to weed out some of your fears.

The site used in this example ranks no1 on G for a lot of very competitive keywords (20/30+ million). It's also in the top 10 on all three major players in the SE market (organic listings), so maybe we can learn from them. The site 'tested' always returned the same page for the keywords tried, so the examples below apply to this page.


  • comments with keywords in them - there's no proof of comments being read by search engines, and comments should be used to make your site clearer - so if you want to use them, why not
  • multiple H1's - although not structurally correct (according to the W3C), it doesn't seem to have a negative effect on a site's ranking
  • wordy H1's (20 words or so) - although long and painful for screen readers (especially if you have many), they don't seem to have a negative effect on a site's ranking
  • linked H1's - again, no negative effect can be seen
  • text - a lot of it (1500+ words on the homepage) - this seems to match what we know about search engines: they love content
  • PR - very good for a site on this topic (7), so again work on your backlinks
  • meta keywords and description - G displayed the meta description in the SERP for all the searches I did, and so did MSN - so work on your meta description
  • text manipulation using CSS - this doesn't seem to have a negative impact on the site
  • too many links per page - again, no proof of this having a negative impact - on the site 'tested' there are 250+ links sharing an average of 6 words each (yes, that's almost a whole page of links) - a rough way to pull the same numbers from your own pages is sketched just after this list
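
For anyone who wants to check the same raw numbers on their own pages (h1 count, words per h1, total word count, links, img alt texts), here is a minimal sketch using only Python's standard library. The URL is a placeholder and the counts are rough - nothing here claims these numbers predict rankings, it just measures the things the list above talks about.

    # Rough on-page audit of the factors listed above: h1 count and length,
    # total word count, link count, img alt texts. Standard library only.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class OnPageAudit(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_h1 = False
            self.h1_texts = []        # text of each <h1>
            self.links = 0            # total <a href> count
            self.alts = []            # img alt attribute values
            self.words = 0            # rough word count (includes script/style text)

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "h1":
                self.in_h1 = True
                self.h1_texts.append("")
            elif tag == "a" and "href" in attrs:
                self.links += 1
            elif tag == "img" and attrs.get("alt"):
                self.alts.append(attrs["alt"])

        def handle_endtag(self, tag):
            if tag == "h1":
                self.in_h1 = False

        def handle_data(self, data):
            self.words += len(data.split())
            if self.in_h1:
                self.h1_texts[-1] += data

    # 'http://example.com/' is a placeholder - point it at your own page.
    html = urlopen("http://example.com/").read().decode("utf-8", "ignore")
    audit = OnPageAudit()
    audit.feed(html)

    print("h1 count:", len(audit.h1_texts))
    print("words per h1:", [len(t.split()) for t in audit.h1_texts])
    print("total words (rough):", audit.words)
    print("links on page:", audit.links)
    print("img alt texts:", len(audit.alts))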


    As always, discussing what other people do is very difficult. It can sound like a rant or a complaint. This will not help.
    It can also sound like finger pointing - 'he' doesn't follow the W3C guidelines, I do, and I am nowhere near the no1 spot. This will not help either.

    I am not condemning this company; they do offer the services that they rank well for. They are also prepared to take some 'risks' that I am not (though if there is no over-optimisation penalty - are these really risky methods?).

    So what are my options? I can take it upon myself to try to find alternative keywords, or I can be prepared to take those 'risks' as well. I choose the former.

    Anyway, I hope that those of you who were thinking of optimising your site but feared the over-optimisation penalty have found the peace of mind to do so, thanks to this post. I wish you good luck here.

    Leo


    [disclaimer]these are facts noticed on a single site that is doing well; if you try the same on yours and get banned - don't come and blame me ;)[/disclaimer]

    I also want to add that it's not worth stickying me to get the site URL - I won't give it to you :)


    Vadim




    msg:749413
     3:57 am on Apr 26, 2005 (gmt 0)

    My experience of the above is that I am not ranking at all, even where the sites in question have excellent, original content.

    Well, as I wrote, their algorithm probably is not perfect. I believe that it is difficult to calculate the correct weighting factor.

    In other words, I believe that the danger of over-optimizing is real, but the low rank is not a penalty. It is an error.

    Otherwise it is difficult to explain why your good content has a low rank.

    Vadim.

    helloponty




    msg:749414
     4:26 am on Apr 26, 2005 (gmt 0)

    well this explanation is quite confusing!

    le_gber




    msg:749415
     7:04 am on Apr 26, 2005 (gmt 0)

    Well, as I wrote, their algorithm probably is not perfect. I believe that it is difficult to calculate the correct weighting factor.

    In other words, I believe that the danger of over-optimizing is real, but the low rank is not a penalty. It is an error.

    Otherwise it is difficult to explain why your good content has a low rank.

    well this explanation is quite confusing!

    I think what Vadim meant is,

    As I wrote, their algo is not perfect

    Leo

    BeeDeeDubbleU




    msg:749416
     7:41 am on Apr 26, 2005 (gmt 0)

    In other words, I believe that the danger of over-optimizing is real, but the low rank is not a penalty. It is an error.

    Vadim, now I am confused by that statement. Are you saying that you believe that there is an OOP or not?

    As I said in my original post in this thread, I don't think that the OOP is a page penalty as such. The only "penalty" may be that when the Google filter detects what it thinks is OO on a word or phrase, it just ignores that phrase, meaning that it won't feature in any searches.

    I don't know if this will read OK but it's early and I am not fully awake :)
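
    Purely to illustrate the distinction being drawn here (speculation on top of speculation), the toy sketch below shows the difference between a 'page penalty', which would drag every score for the page down, and a 'phrase filter', which would only stop one flagged phrase from counting. The scores and the flagged phrase are made up.

        # Toy model only - not how Google works, just the difference between
        # "penalise the whole page" and "ignore one over-optimised phrase".
        page_scores = {"blue widgets": 8.0, "widget repair": 5.0, "contact": 1.0}
        flagged_phrases = {"blue widgets"}   # hypothetical "over-optimised" phrase

        def page_penalty(scores, factor=0.1):
            # the whole page suffers, for every query
            return {p: s * factor for p, s in scores.items()}

        def phrase_filter(scores, flagged):
            # only the flagged phrase stops counting; the rest is untouched
            return {p: (0.0 if p in flagged else s) for p, s in scores.items()}

        print(page_penalty(page_scores))                     # every score collapses
        print(phrase_filter(page_scores, flagged_phrases))   # only "blue widgets" disappears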

    Vadim




    msg:749417
     2:24 am on Apr 27, 2005 (gmt 0)

    Sorry for my bad English. I am trying to say that:

    1. This is not a penalty.

    2. This is an error of Google's algorithm.

    3. The algorithm tries not to punish, but rather to eliminate the effect of the optimization in order to estimate the real value of the content. In other words, it tries to make the rank independent of the optimization.

    4. Getting a rank that is independent of the optimization is a very complex task, so errors in the rank estimation are rather frequent. Some of these errors look like a penalty and some look like unjustified promotion of spammers or badly designed sites.

    It seems to mean that, as in real life, being better than average is risky, but: "no risk, no glory".

    In addition, we may hope that Google will improve this algorithm eventually.

    Vadim.

    BeeDeeDubbleU




    msg:749418
     7:44 am on Apr 27, 2005 (gmt 0)

    The algorithm tries not to punish, but rather to eliminate the effect of the optimization in order to estimate the real value of the content. In other words, it tries to make the rank independent of the optimization.

    This is the point I was making too but it does get somewhat confusing. Let's say that the Google algo sees KWs in domain, Meta, Page Title, Content, Anchor Text, et al. It may then decide, "Sorry mate, you ain't getting any ranking points for that!" I suppose the fact that it does not award points in these cases could be seen as a penalty?

    (All pure speculation by the way.)
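
    Again pure speculation, but the "no ranking points for that" idea can be written down as diminishing returns: however many places the keyword shows up (domain, title, meta, headings, anchor text), the credit stops growing past a cap. The weights and the cap below are invented purely for illustration.

        # Speculative toy: a hard cap on keyword-placement credit, so stacking
        # the keyword everywhere stops adding points past a threshold.
        PLACEMENT_WEIGHTS = {"domain": 2.0, "title": 2.0, "meta": 0.5,
                             "h1": 1.5, "body": 1.0, "anchor": 1.5}
        CAP = 4.0   # invented ceiling on total on-page keyword credit

        def keyword_score(placements):
            raw = sum(PLACEMENT_WEIGHTS.get(p, 0.0) for p in placements)
            return min(raw, CAP)

        print(keyword_score(["title", "body"]))                                    # 3.0 - counts in full
        print(keyword_score(["domain", "title", "meta", "h1", "body", "anchor"]))  # capped at 4.0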

    Vadim




    msg:749419
     3:25 am on Apr 28, 2005 (gmt 0)

    I believe the goal of the algo is to make the rank independent of the optimization. In other words, all sites with the same quality of content should have the same rank regardless of the optimization.

    Therefore, it is not a penalty. The algo just respects the interests of the searchers, who don't care about the optimization and are interested in the content only.

    The algo does not cancel but rather diminishes the rank for over-optimized sites.

    If it by mistake decreases the rank too much, it looks like a penalty.

    If it by mistake decreases the rank too little, it may look as if over-optimization sometimes works, even for spammers.

    The algo also increases the rank for poorly optimized sites, i.e. for the sites whose optimization is below average.

    If it by mistake increases it too much, it looks as if poorly designed sites have an advantage.

    All this may be related to "TrustRank", which was filed with the USPTO about a month ago but has probably been working in some form for at least several months. See
    [webmasterworld.com ]

    Personally, I believe that implementing "TrustRank" was not a good idea, because even the authors understand that at the present state of the art it needs human intervention. They cannot correct all the errors manually, however.

    Vadim.
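
    (For anyone who hasn't read the paper: the core TrustRank idea is a biased PageRank - a small, human-reviewed seed set starts with all the trust, and trust then flows out along links with a damping factor, decaying with distance from the seeds. The toy graph and numbers below are invented, and this sketches only the published idea, not whatever Google may actually be running.)

        # Minimal TrustRank-style propagation over a toy link graph.
        # Trust starts on a hand-picked seed set and flows along outlinks.
        graph = {            # page -> pages it links to (invented example)
            "seed_dir": ["good_site", "ok_site"],
            "good_site": ["ok_site"],
            "ok_site": ["spam_site"],
            "spam_site": ["spam_site"],
        }
        seeds = {"seed_dir"}                 # human-reviewed trusted pages
        alpha = 0.85                         # damping factor, as in PageRank
        trust = {p: (1.0 if p in seeds else 0.0) for p in graph}

        for _ in range(20):                  # a fixed number of iterations is enough for a toy
            new = {p: ((1 - alpha) if p in seeds else 0.0) for p in graph}
            for page, links in graph.items():
                if links:
                    share = alpha * trust[page] / len(links)
                    for target in links:
                        new[target] += share
            trust = new

        print(trust)   # trust decays with distance from the seed set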

    BeeDeeDubbleU




    msg:749420
     8:47 am on Apr 28, 2005 (gmt 0)

    Perhaps they couldn't correct all of the errors manually but manual intervention is the answer. It will happen, it is inevitable, and the sooner the better. I cannot understand why one of the major search engines does not get into this now, before the others. Their results could be improved beyond measure quite quickly. Think what a selling point that would be.

    le_gber




    msg:749421
     9:37 am on Apr 28, 2005 (gmt 0)

    manual intervention is the answer. It will happen, it is inevitable, and the sooner the better.

    I think that GG said that for sites where it was an urgent matter - adult sites potentially targeting children - they would do it.

    For any 'business-related' matter I don't see the need.

    I cannot understand why one of the major search engines does not get into this now, before the others. Their results could be improved beyond measure quite quickly. Think what a selling point that would be.

    How would you put that into place?

  • think about the sheer volume of pages they have in their DB and the time it would take
  • think about the volume of letters and complaints they would receive from unsatisfied webmasters - anyone heard of DMOZ? how many posts are there about a site taking ages to be listed? - imagine having to manually check a whole website for its content and then give it a value
  • as usual when there is human interaction, there is the checker's own perception of what's good and bad - what's good for one might be worthless to others
  • what rules would you tell them to follow to say that one site is better than another? why?

    Honestly I don't think that having 'humans' check for the whole spectrum of website 'worthiness' is viable.

    Leo

    Leosghost




    msg:749422
     9:48 am on Apr 28, 2005 (gmt 0)

    Honestly I don't think that having 'humans' check for the whole spectrum of website 'worthiness' is viable.

    And would seriously affect the health conditions of many members of these fora ...;)

    BeeDeeDubbleU




    msg:749423
     11:10 am on Apr 28, 2005 (gmt 0)

    Honestly I don't think that having 'humans' check for the whole spectrum of website 'worthiness' is viable.

    What? This has nothing to do with worthiness, it's about spam and the whole spamming process is based on pushing your site to the top of the results. It's an absolute dawdle to find and nuke spammers. If you are a SE they offer themselves to you as a sacrifice.

    what rules would you tell them to follow to say what site is better than the other? why?

    It would not be their job to say what site is better. That would still be down to the algo. All they would be doing is eliminating spam and offending sites as defined quite clearly and unambiguously by the search engine's rules for inclusion.

    Google continues to maintain that their algo can take care of the spam. Well the steadily deteriorating results in most commercial categories would suggest otherwise. If they won't use manual editing then all I am suggesting is that one of the other majors should. I believe that their results would improve beyond recognition quite quickly, making them the SE of choice.

    I happen to believe that there is some sort of OOP in place and that it may be affecting some of my perfectly innocent, informational, non-commercial sites that are within Google's guidelines. When I see them being penalised in favour of spammers' sites I get upset. Am I not entitled to be?

    le_gber




    msg:749424
     11:47 am on Apr 28, 2005 (gmt 0)

    What? This has nothing to do with worthiness, it's about spam and the whole spamming process is based on pushing your site to the top of the results

    Sorry my mistake, let's say it's about spam, how would you determine that a site is spamming?

    Is it doorway pages, dodgy redirects, hidden text? All of the above? Let's say I use the keyword 'enameling widget' 3 times in an H1 linked to my widget enamel page - all using CSS to 'disguise' the H1 and the link - would that be spam?

    defined quite clearly and unambiguously by the search engine's rules for inclusion.

    For the first three techniques above, it's clearly stated in G's guidelines, but for the other possible 'spammy' techniques, not so much. It would take too much time to write down in detail what is allowed and what is not, and anyway, who is Google to tell us what we are allowed to do with our web pages? (not that it's what you are doing, G! - I LOVE YOU G - I didn't mean to offend/upset you, G :))

    I happen to believe that there is some sort of OOP in place and that it may be affecting some of my perfectly innocent, informational, non-commercial sites that are within Google's guidelines.

    Have you tried de-optimising them? Not that I believe in an OOP.

    When I see them being penalised in favour of spammers' sites I get upset. Am I not entitled to be?

    Of course you are, but it's often because angry webmasters have had their sites passed over by other sites (using shady techniques or not) that the wildest 'I don't like G' posts are born; that's why I wanted to keep this thread about facts rather than feelings.

    Let me add this: I was like you a year or so ago, filing spam reports with G, Y! and MSN, because I found sites using spammy techniques ranking above mine. I think it's the natural way for many people to think: "If I use only relevant and quality information on my site and can't get the good ranking that I feel I deserve, but others using shady techniques can, why shouldn't I report them?" But my way of thinking has changed - mainly thanks to this board - and I now concentrate on finding other ways of getting the better of them, i.e. link strategy, on-page optimisation, etc...

    Leo

    BeeDeeDubbleU




    msg:749425
     12:38 pm on Apr 28, 2005 (gmt 0)

    Sorry my mistake, let's say it's about spam, how would you determine that a site is spamming?

    I wouldn't. I would be quite happy to leave this to the SEs. They own them, so they can define spam any way they like.

    Let me add this: I was like you a year or so ago, filing spam reports with G, Y! and MSN,

    Leo, I must be missing something. Where did that come from; who said I was doing this? I would appreciate it if you could refrain from making statements like this that could lead people to believe that I actually said this. I didn't say that; what I said was that the SEs should deal with their own spam problems manually. There is no point in reporting these things to Google. They don't do anything about it.

    le_gber




    msg:749426
     12:55 pm on Apr 28, 2005 (gmt 0)

    how would you determine that a site is spamming?

    I wouldn't. I would be quite happy to leave this to the SEs. They own them, so they can define spam any way they like.

    then we're back to - who is G (or any other SE for that matter) to tell us what we are allowed to do with our web pages?

    I would appreciate it if you could refrain from making statements like this that could lead people to believe that I actually said this

    Sorry, I thought that was what you meant by

    It's an absolute dawdle to find and nuke spammers

    Anyway, there's no shame in admitting you did it - I know I did.

    Leo

    BeeDeeDubbleU




    msg:749427
     1:08 pm on Apr 28, 2005 (gmt 0)

    It's an absolute dawdle to find and nuke spammers

    I don't want to get into this any further but when I said the above it was in the context of the SEs doing this themselves. But then I am sure that you already know that this is the only way that it can be done. How could I ever claim that it was a dawdle for anyone outside of the SEs to do this?

    le_gber




    msg:749428
     1:26 pm on Apr 28, 2005 (gmt 0)

    dawdle

    With my limited English (I'm French) I assumed that this word was a synonym for fun; I tried googling it and it seems it's more like time-consuming.

    sorry about that.

    Leo

    Leosghost




    msg:749429
     1:32 pm on Apr 28, 2005 (gmt 0)

    he meant "doddle" ..meaning easy ..he's Scots, if my memory serves ..depending on where he's from in that fine country ...

    "dawdle" is how "doddle" is pronounced ..

    perhaps "phonetic typing" ..heh heh ;)

    BeeDeeDubbleU




    msg:749430
     2:42 pm on Apr 28, 2005 (gmt 0)

    Lo siento! (Sorry! Well, I know more Spanish than French.)

    I have never been able to make up my mind how to spell it. Doddle or dawdle? But this thread made me do the research, and even though it is often spelled "dawdle" in Scotland, I will stand corrected because the correct spelling seems to be "doddle".

    Le_Gber, "doddle" is colloquial language in the UK for something that's very easy to do, like falling off a log.

    Wizard




    msg:749431
     7:18 pm on Apr 28, 2005 (gmt 0)

    who is G (or any other SE for that matter) to tell us what we are allowed to do with our web pages?

    I see the point of this question, but on the other hand, who are we to tell G what it should penalize and what it should not? If they decided to ban all commercial sites from the regular index and show only non-commercial results + AdWords - that is their right to decide.

    Google's interest is to provide good results, as without this they are out of business. But they can make any guidelines they wish, if they think that penalizing and banning some sites will improve the results.

    And the results from G aren't as bad as some people say. They are horrible in commercial topics, and great in others. When I search for a popular open source operating system's official website, or the online documentation of a popular open source database, I use the "I'm feeling lucky" button because I already know these searches give a perfect match at #1.

    If I were looking for cheap flights to city #*$! or hotel reservations in city xxx, I'd probably have no chance of finding anything worth clicking. Or maybe I could find more interesting stuff among the AdWords.

    But Google is more for searching for information than for products and services, isn't it? Because if we assume otherwise, there will always be fight and chaos in the SERPs, unless the algo eliminates all potential spam (and a few good sites along the way).

    I believe the spam-eliminating algos may be improved, but right now Google doesn't seem to use algorithms to ban sites with white-on-white or CSS-hidden keywords and links, so I don't understand the reason for penalizing OO. Unless they use a black-box algo and this is the result.
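
    As a very rough illustration of the kind of check being discussed, the sketch below flags inline styles where the declared text colour matches the declared background (white on white) or where the element is hidden outright. Real detection would need full CSS parsing and rendering, so treat this as a naive heuristic, nothing more.

        # Naive hidden-text heuristic: flag inline styles where the text colour
        # equals the background colour, or where the element is hidden. Inline
        # styles only - external stylesheets and rendering tricks are out of scope.
        import re

        STYLE_RE = re.compile(r'style="([^"]*)"', re.I)

        def hidden_text_suspects(html):
            suspects = []
            for style in STYLE_RE.findall(html):
                decls = dict(
                    (k.strip().lower(), v.strip().lower())
                    for k, _, v in (d.partition(":") for d in style.split(";") if ":" in d)
                )
                color = decls.get("color")
                background = decls.get("background-color") or decls.get("background")
                if color and background and color == background:
                    suspects.append(style)
                if decls.get("display") == "none" or decls.get("visibility") == "hidden":
                    suspects.append(style)
            return suspects

        sample = '<p style="color:#fff; background-color:#fff">widgets widgets widgets</p>'
        print(hidden_text_suspects(sample))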

    bbcarter




    msg:749432
     10:15 pm on Apr 28, 2005 (gmt 0)

    manual editing:

    Someone said something recently about seeing indications that Yahoo was rewarding sites that handpick/review links, like a directory.

    I've seen a number of pages with good SERPs and PR that were simply loads of links to other sites.

    It makes sense to reward this if you can't handpick among 8 billion pages - leverage the handpicking that's already going on.

    Bard




    msg:749433
     7:56 pm on Apr 29, 2005 (gmt 0)

    Hello all... I'm new to this forum but not to SEO, and certainly not to Google. I have been reading this forum for a long time but never really took the time to post. I thought I could add a piece of logic to this thread. I have held, and still hold, a lot of top 10 positions without insane linking campaigns.

    If I sell widgets then "widgets" will obviously be a keyword that repeats itself over and over again in all aspects of my site. The necessity of repeating keywords is a fact of any web site, based simply on its construction, regardless of SEO. Some web sites list product after product on one page, but they are all the same "widget". Some web sites have hundreds of description landing pages for each "widget". And some web sites use both methods. The underlying fact I'm trying to point out here is that repeating keywords is, and always will be, a necessity of design simply for "Ease of Use". Slapping a filter on a web site for repeating its product names hundreds of times on a page would be like slapping the face of god for creation.

    #2 That being said, I would look toward hidden page elements for your answers to OOP. Remember: do not place text over an image BG without changing the table BG colour to a different colour than the text :P

    <title>, <meta name="description">, <h1><h1><h1><h1>, <!-- comments -->, title="spam" attributes, etc...

    #3 Googlebot is an enigma LOL

    BeeDeeDubbleU




    msg:749434
     8:47 pm on Apr 29, 2005 (gmt 0)

    Slapping a filter on a web site for repeating its product names hundreds of times on a page would be like slapping the face of god for creation.

    I appreciate what you are saying, but if a site was developed without using SEO techniques, the incidence of the KWs in all the critical places would probably be much lower. If we are honest with ourselves, that is a fact.

    Google's guidelines state, "Think about the words users would type to find your pages, and make sure that your site actually includes those words within it." That's a bit different from saying repeat them as often as possible and include them in your domain name, page titles, description, etc. etc.

    incrediBILL




    msg:749435
     9:00 pm on Apr 29, 2005 (gmt 0)

    Ok, I know I'll attract a few flames for this post (all of my posts do) but I have to sheepishly admit one of my pages has the primary keyword appearing about 250 times.

    It wasn't done on purpose, no dirty tricks intended; it wasn't stuffed, it wasn't deliberately over-optimized. It's just legitimate use of anchor text that evolved as the site evolved over 7 years, and it's never been penalized. Most of my direct competitors' sites have evolved in exactly the same way, and they don't appear to be penalized either.

    I was worried about it looking like keyword stuffing and am thinking of at least splitting it in half just in case, but since it's never been an issue in the past, why change now?
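
    For what it's worth, counting this sort of thing on your own pages is trivial; the sketch below just counts occurrences of a phrase in a page's markup (the URL and phrase are placeholders). A raw count like this measures, it doesn't judge - 250 occurrences can be perfectly natural anchor text, as described above.

        # Count how many times a phrase appears in a page - a blunt measure,
        # but enough to see numbers like "250 occurrences" for yourself.
        import re
        from urllib.request import urlopen

        def phrase_count(url, phrase):
            html = urlopen(url).read().decode("utf-8", "ignore")
            text = re.sub(r"<[^>]+>", " ", html)          # crude tag stripping
            return len(re.findall(re.escape(phrase), text, re.I))

        # placeholders - substitute your own page and keyword
        print(phrase_count("http://example.com/", "widget"))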

    BeeDeeDubbleU




    msg:749436
     11:09 am on May 5, 2005 (gmt 0)

    Here is another thread that lends credence to the OOP theory.
    [webmasterworld.com...]

    ncgimaker




    msg:749437
     11:58 am on May 5, 2005 (gmt 0)

    The underlying fact I'm trying to point out here is that repeating keywords is, and always will be, a necessity of design simply for "Ease of Use". Slapping a filter on a web site for repeating its product names hundreds of times on a page would be like slapping the face of god for creation.

    Yet that appears to be what they've done. Perhaps not per page, but per site. We are "keyword1 keyword2"; obviously those two words are used all over the site too, and we were nowhere for phrases containing those keywords either.

    Then we complained, and then we reappeared for those keywords. We think they simply added those words to an exception list, because keyword2 is the group name for a set of other keywords: keyword2a, keyword2b, keyword2c... Obviously those keywords are all over the site, and we don't rank for anything with those words.

    I can literally put a 15-word product description from our catalogue into Google, and there are 3 results and we are not among those 3. Yet the same query in Yahoo brings up the best two pages from our site for that product as 1 & 2, with 4 other results that are simply scraper sites hanging off us (not 302s).

    The only result set worse than Google's is Teoma's, which does not appear to have any pages from our site at all. Kanoodle even correctly picks 2 good pages on that term as 1 & 2.
