First of all, this is not a rant or a 'my competitor does this' kind of post. It's fact-based, so can we try to keep it that way?
There have been a lot of posts written about the fear of an over-optimisation penalty, but I want to put everybody's mind at rest - THERE IS NO OVER-OPTIMISATION PENALTY.
The common fear of over-optimisation often appears when you want to do something with your site that you wouldn't necessarily be able to justify if you were following one or another of the guidelines given by the W3C: use of keywords in comments, use of multiple wordy and linked h1s, multiple keywords in img alt attributes, etc.
Let's try to weed out some of your fears.
The site used in this example ranks no. 1 on G for a lot of very competitive keywords (20/30+ million results). It's also in the top 10 on all three major players in the SE market (organic listings), so maybe we can learn from them. The site 'tested' always returned the same page for each keyword tried, so the examples below apply to this page.
As always, discussing what other people do is very difficult. It can sound like a rant or a complaint. This will not help.
It can also sound like finger pointing - 'he' doesn't follow the W3C guidelines, I do, and I am nowhere near the no. 1 spot. This will not help either.
I am not condemning this company; they do offer the services that they rank well for. They are also prepared to take some 'risks' that I am not (if there is no over-optimisation penalty, are these really risky methods?).
So what are my options? I can take it upon myself to try to find alternative keywords, or I can be prepared to take those 'risks' as well. I choose the former.
Anyway, I hope that those of you who were thinking of optimising your site but feared the over-optimisation penalty will have found the peace of mind to do so, thanks to this post. I wish you good luck here.
Leo
[disclaimer]these are facts noticed on a unique site that is doing well; if you try to do the same on yours and get banned, don't come and blame me ;)[/disclaimer]
I also want to add that it's not worth sticky-mailing me to get the site URL - I won't give it to you :)
This is often a result of generated pages. A formula for generating pages will take a phrase and put it into the title, H1, bold, navigation, etc.
I think it is these pages Google wants to catch, so the OOP is aimed at auto-generated pages where the phrase is repeated in key areas. A handwritten page can have the phrase heavily optimised but not necessarily be penalised. I would suggest that the template is a trigger component of a page that gets caught by the OOP. If the phrase appears in a number of places and the site has a regular template, then this could flag up the OOP. There are plenty of pages that rank well and are keyword stuffed, but I wonder if these pages do not fit a 'generated page' profile.
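Just to make the 'phrase repeated in key areas' idea concrete, here is a rough Python sketch - purely illustrative, since the tags checked and the idea of counting areas are my own guesses, not anything known about Google's filter - of what such a generated-page signal might look like:

```python
# Count how many of the classic on-page areas (title, h1, bold, anchors,
# img alt) contain the same phrase. The areas and the interpretation of the
# result are assumptions for illustration only.
from html.parser import HTMLParser

class KeyAreaCollector(HTMLParser):
    """Collects text from the on-page areas the post mentions."""
    WATCHED = {"title", "h1", "b", "strong", "a"}

    def __init__(self):
        super().__init__()
        self.areas = {tag: [] for tag in self.WATCHED}
        self.areas["alt"] = []
        self._stack = []

    def handle_starttag(self, tag, attrs):
        if tag in self.WATCHED:
            self._stack.append(tag)
        for name, value in attrs:
            if name == "alt" and value:
                self.areas["alt"].append(value.lower())

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        if self._stack:
            self.areas[self._stack[-1]].append(data.lower())

def repetition_signal(html: str, phrase: str) -> int:
    """Return how many distinct key areas contain the phrase."""
    collector = KeyAreaCollector()
    collector.feed(html)
    phrase = phrase.lower()
    return sum(1 for texts in collector.areas.values()
               if any(phrase in t for t in texts))

page = """<html><head><title>blue widgets</title></head><body>
<h1>blue widgets</h1><b>blue widgets</b>
<a href="/">blue widgets</a><img src="x.jpg" alt="blue widgets">
</body></html>"""
print(repetition_signal(page, "blue widgets"))  # 5 of 6 areas -> looks templated
```

A handwritten page would usually light up only one or two of these areas; a templated, generated page tends to light up nearly all of them, which is exactly the pattern being speculated about here.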
If it is computer generated, it uses patterns - perhaps in such a small site you could introduce randomness in terms of which internal links you place on a given page...
But still, I would think these computer-generated pages would follow a formula, and the fact is that many SEO minds like ours follow our own formulas... The alternative is random variety, which at some point places you in the indistinguishable band of mediocrity which won't ever rank highly.
Can an algorithm tell the difference between a computer following a formula created by a human, and a human following their own formula? I doubt it, especially if the computer formula introduces perhaps 2% of 'random' error.
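For what it's worth, the 'couple of percent of random error' suggestion would be trivial to add to a page generator. A minimal sketch, using a hypothetical vary_links helper and the ~2% figure from the post - not a recommendation, just what the idea looks like in code:

```python
# Swap a small fraction of a generated page's internal links (here ~2%)
# for randomly chosen alternatives, so every page is not identical.
import random

def vary_links(standard_links, alternative_links, noise=0.02, seed=None):
    """Return the standard link block with roughly `noise` of entries swapped."""
    rng = random.Random(seed)
    result = list(standard_links)
    for i in range(len(result)):
        if rng.random() < noise and alternative_links:
            result[i] = rng.choice(alternative_links)
    return result

template_links = [f"/widgets/page-{n}.html" for n in range(1, 51)]
extras = ["/about.html", "/contact.html", "/blog/latest.html"]
print(vary_links(template_links, extras, noise=0.02, seed=42))
```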
So the only place for Google to go is to make the formula so complicated that no one could figure it out. At that point, I suspect at least two things could happen:
1. There could be numerous ways to rank well
2. It may become impossible to beat all criteria if some criteria are contradictory
3. Your search results could suck
The crazy thing is, if you punish for OO, you're screwing the white hat SEOers who just want people to find their page. And I'm talking even obscure things - you should be able to find, e.g., the unique copyright text that's on every page of my site a lot higher than #21. You shouldn't have to say your company name 15 times on the same page to get G to realize it's your company's website. But if that's what it takes, then why would G penalize people who do that?
Maybe G has painted themselves into a corner. Or else they never expect any website to have more than, say, 10% of its pages ranking well?
Who knows. :-)
I have a site that I created in November last year. As yet it gets virtually no Google traffic. It has the domain name keyword1-keyword2.co.uk and it has been highly optimised specifically for the phrase keyword1 keyword2. It is nowhere to be found in G on a search for this phrase. However, this morning I noticed that it appears in position 10 in Google UK on a search for keyword2 (a common word that yields 7.6M results in Google UK).
Wouldn't this suggest that my site is suffering from an OOP as opposed to the sandbox?
I have a site that I created in November last year. As yet it gets virtually no Google traffic. It has the domain name keyword1-keyword2.co.uk and it has been highly optimised specifically for the phrase keyword1 keyword2. It is nowhere to be found in G on a search for this phrase.
I have noticed a very similar thing. For example, I have a site where URLs look like mysite.com/global_keyword1/global_keyword2/local_keyphrase.html
where the global keywords are keywords defining the topic of the whole site, and local_keyphrase is the targeted phrase of a particular page. Deep pages typically have a 12% density of the targeted phrase. And they rank well for everything but their targeted phrases :))
On another site of mine, I happen to have 50% density, made up mostly of alt attributes on images (it's an image gallery with the category name given in each alt), and sometimes the keyphrase is also in the URL, sometimes not. In the first case, ranking is below #100 (nothing strange to me, as 50% density is clearly too much), but in the second case - if the keyphrase doesn't repeat in the URL - they are #1.
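Since the density figures quoted in this thread (12%, 50%) depend entirely on how you count, here is one plausible way to measure phrase density that includes img alt text, as described above. The formula (phrase occurrences times phrase length over total words) is my own assumption, not how any engine measures it:

```python
# Compute a rough keyword-phrase density over visible text plus alt text.
import re
from html.parser import HTMLParser

class TextAndAlt(HTMLParser):
    """Collects page text and img alt values."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        self.chunks.append(data)
    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "alt" and value:
                self.chunks.append(value)

def phrase_density(html, phrase):
    parser = TextAndAlt()
    parser.feed(html)
    words = re.findall(r"[a-z0-9']+", " ".join(parser.chunks).lower())
    phrase_words = phrase.lower().split()
    hits = sum(1 for i in range(len(words) - len(phrase_words) + 1)
               if words[i:i + len(phrase_words)] == phrase_words)
    # count every word of every occurrence of the phrase against the total
    return 100.0 * hits * len(phrase_words) / max(len(words), 1)

gallery = '<p>holiday photos</p>' + '<img src="a.jpg" alt="holiday photos">' * 4
print(round(phrase_density(gallery, "holiday photos"), 1))  # 100.0 here
```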
What about spammy snippets? I think Google could get these sites easily - if any keyword repeats more than once in the title and three times in the meta description, give a penalty. I'm surprised they don't do it right now.
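Reading that suggested rule literally (a keyword repeating more than once in the title and more than three times in the meta description), the check would only be a few lines. Sketch only - the thresholds come from the post above, and whether any engine applies anything like this is unknown:

```python
# Flag a snippet when some word breaks both the title and the meta limits.
import re
from collections import Counter

def looks_stuffed(title, meta_description,
                  max_title_repeats=1, max_meta_repeats=3):
    def word_counts(text):
        return Counter(re.findall(r"[a-z0-9']+", text.lower()))
    title_counts = word_counts(title)
    meta_counts = word_counts(meta_description)
    # a word has to break both limits, reading the suggested rule literally
    overused_in_title = {w for w, c in title_counts.items() if c > max_title_repeats}
    overused_in_meta = {w for w, c in meta_counts.items() if c > max_meta_repeats}
    return bool(overused_in_title & overused_in_meta)

print(looks_stuffed(
    "Cheap widgets - cheap widgets online",
    "Cheap widgets, cheap widget deals, buy cheap widgets, cheap prices"))
# True: "cheap" appears twice in the title and four times in the meta
```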
You can find a similar stat on a certain itfacts blog...
"Only 18% of Web users can distinguish between search result and advertisement"
Unfortunately, the source linked to from that blog no longer displays that info.
Ah, here we go - it's actually in a WebmasterWorld post:
[webmasterworld.com...]
What I mean is, what if they apply some form of the algo so that, as the paid listings increase in relevance, the middle-of-the-page results drop?
For instance, if three of the sites that are paying for ads would be included in the top 10 if they had no penalty, why not start the middle-of-the-page results at #4 and drop the top results?
This could give the appearance of an OOP as highly relevant sites are dropped, because the paid results are also relevant, and it would also give the appearance of an ever-changing algo, because the paid results are always changing.
I don't know, just wondering, but it sure would make it tough to optimize for, because you would never know what ads are going to be displayed - so do you need to be #1 to be #1, or do you need to be #7 to be #1?
Justin
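Just to make that speculation concrete: if N of the paying advertisers would already have been in the organic top 10, start the organic block N positions down. Entirely hypothetical - this is a guess at a mechanism, not anything documented about how paid and organic listings interact:

```python
# Shift the organic results down by the number of advertisers that would
# otherwise have occupied the top of the organic window.
def shifted_organic(organic_results, advertiser_sites, window=10):
    overlap = sum(1 for site in organic_results[:window] if site in advertiser_sites)
    return organic_results[overlap:]

organic = [f"site{n}.example" for n in range(1, 11)]
advertisers = {"site1.example", "site4.example", "site9.example"}
print(shifted_organic(organic, advertisers)[:3])
# ['site4.example', 'site5.example', 'site6.example'] -- the old #1-#3 drop out
```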
As of today, or within the last few days, it now ranks nowhere for "keyword keyword2 keyword3" or any combination of those words; the only way I can get it into the results is if I add another term that is on my page. It still has its PR and is one of the few sites for this search that actually has outside links to it.
I did some testing around this on Friday. I asked some pretty simple questions of Yahoo, Ask and Google. It seemed to me that Google did not get to the optimum answer until around result number 10. I could usually find it among the first 5 in Yahoo.
The other thing I've noticed lately is that results #1 and #2 sometimes have very little to do with the search term. None of these searches in Google had the exact phrase in the title.
[edited by: maccas at 5:31 pm (utc) on April 17, 2005]
I've officially switched to Yahoo over the past week for my own searching. Even when I thought Google was a little weird lately, I'd still stick with them for my search needs - no more.
Yahoo seems to have fixed up their spammy 'Inktomi' results; right now it's fairly clean of computer-generated spam, relevant and fresh, and the SERPs seem to have a more interesting blend of sites. It's good, kinda like Google 2-3 years ago :D
So if this is how it's going to be, we're going to start using the normal phrasing that's only implied by keyphrase helpers like the Overture suggestion tool. Don't you think that's where the SEs will have to go in the future anyway?
Or do you suppose a better response would be just to vary your usage more throughout the HTML? Then I wonder if you're risking losing ground because of a mediocre lack of pattern... I'm lost trying to figure this one out.
B
MHes, Msg #27
I think the average user is learning to take more time examining the snippets and titles
BeeDeeDubbleU
OK, let's look at it from another angle because to me it's as easy as ABC.
a) Spammers have a tendency to over optimise.
b) Google wants to get rid of spam.
c) Google has the technology to detect over optimisation.
Can someone give me one good reason why they would not use it?
Spammers spam - end of story. Otherwise they wouldn't be called spammers. Over-optimisation (which has yet to be defined) is only optimisation pushed to the limits of what one thinks is legitimate. Of course everyone has different limits in mind, so a penalty applied to a site might be a spam penalty if you used hidden text or white-on-white text, but again I don't think that there is an OOP. For me there are only 4 types of site:
And again, the spam-optimised sites are not all weeded out, so why would G penalise someone who tried to make his site gain better ranking by using legitimate techniques, when they know that spam still 'infests' their results?
Wouldn't this suggest that my site is suffering from an OOP as opposed to the sandbox?
Leo
Over-optimisation (which has yet to be defined) is only optimisation pushed to the limits of what one thinks is legitimate.
I wouldn't dispute this. My point was that my evidence suggests that there is an OOP at work within the algo that may or may not be part of the sandbox. I won't try to claim that I am convinced about this because I don't have enough experience at this to do so.
And again, the spam-optimised sites are not all weeded out, so why would G penalise someone who tried to make his site gain better ranking by using legitimate techniques, when they know that spam still 'infests' their results?
I think the issue is the definition of "legitimate techniques". If a site has the domain name enamel-dobbers.com and these two words appear in all the strategic places, then is it not logical to assume that the site has been created expressly to be found for those keywords? It may follow that G has decreed that such sites, while perhaps legitimate, are more likely to be spam and as a result should gain no ranking points.
[edited by: Brett_Tabke at 2:43 pm (utc) on April 19, 2005]
[edit reason] please no specific kws... [/edit]
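The enamel-dobbers.com argument boils down to a check like this: do the tokens of the domain name show up in all the strategic on-page places? Illustrative only - the list of 'strategic places' and the idea that a full match marks a page as built to rank are assumptions drawn from the post above:

```python
# Measure what fraction of the "strategic places" contain every keyword
# taken from the hyphenated domain name.
import re

def domain_keyword_alignment(domain, title, h1, url_path, anchor_texts):
    tokens = [t for t in re.split(r"[.-]", domain.lower())
              if t and t not in {"com", "co", "uk", "net", "org", "www"}]
    places = [title, h1, url_path, " ".join(anchor_texts)]
    hits = sum(1 for place in places
               if all(tok in place.lower() for tok in tokens))
    return hits / len(places)

score = domain_keyword_alignment(
    "enamel-dobbers.com",
    title="Enamel Dobbers | Quality Enamel Dobbers",
    h1="Enamel Dobbers",
    url_path="/enamel-dobbers/index.html",
    anchor_texts=["enamel dobbers home", "more enamel dobbers"])
print(score)  # 1.0 -> every strategic place repeats the domain keywords
```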
1. If you have good content but it is *not* optimized, you are ranking high.
2. If you have good content *and* it is optimized, you are also ranking high, but *not higher* than in case 1.
3. If you have bad content and it is optimized (spam), you are ranking low.
The problem seems to be that the implementation is not perfect yet.
Vadim.
2. If you have good content *and* it is optimized, you are also ranking high, but *not higher* than in case 1.
My experience of the above is that I am not ranking at all, even where the sites in question have excellent, original content. This is not just my own assessment. I get emails from people congratulating me on the content of these sites.