Forum Moderators: Robert Charlton & goodroi
Suppose you change the navigation on your site so that all the home page links say "keyword keyword home" instead of "home". Will you incur an over-optimization penalty?
My site appears to be suffering from this penalty. What are the views of the SEO experts here at WW?
Having worked on many sites, this is how I see it. If you are good at SEO, obviously a new site you build is going to be built properly, using your SEO skills from the outset. With those skills and good backlinks you would be able to rank high for your keywords overnight on a new site. Google, on the other hand, is not going to let you; hence the sandbox: your new site could be held back, say, nine months, then slowly rise to the top.
So a new site is held back. (Some disagree with the whole sandbox debate, but in my experience, certainly in the big-money keyword sectors, a new site is sandboxed.)
Next, you might think: let's buy an old site that's already ranking in Google for your money keywords and put some SEO in, building fully optimised new pages, etc. This is where the over-optimisation penalty (OOP) comes into play. The page does not rise to the top and gets held back due to the OOP.
Same, IMO, if you do SEO on one of your existing pages: it can be pushed back as a result. I've seen it many times, but mainly only in the money keyword sectors.
To prove the point further, I took one page on one of the sites I was working on at the time. The site was about widgets and the page was "green widgets". The page was listed at about 32 in Google for "green widgets", out of about 1.7 million results. I did loads of SEO on it: a "Green Widgets" title, perfect on-page density, perfect links, etc. SEO-wise the page was near perfect, IMO. Within about two to three days the page tanked in Google to about position 250! Having done no further work on that page, it is now, nine months later, rising again and improving on its earlier position; I think it's now at about 14/15 for its search term.
So, to conclude, I would say an OOP means in effect putting the page back in the sandbox. This whole issue is more to do with pushing AdWords sales, IMO, than anything else; otherwise, rather than buying AdWords, you would SEO your page so that it ranked number one for the keyword. This way, in the money keywords, a new page is going to take a minimum of nine months, and more like two years, before it ranks well, so you are likely to buy AdWords for that keyword while you are waiting!
So the OOP has to exist, in the same way that the sandbox does, to stop webmasters buying old sites and doing SEO on them to try to avoid the sandbox.
That's my take on this issue anyway.
I have regular problems with keyword density (KWD), not because of keyword stuffing but because of what I need to put on some of my pages in order to do them properly for the visitor.
You can see older pages rank well with high KWD, yet a newer one, or one that has been altered, gets filtered out for the word or phrase.
I am convinced Google treats changes differently from an original page.
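For what it's worth, the raw KWD figure people quote is usually just a phrase count over total words. Here is a minimal sketch of that calculation (illustrative only; nobody outside Google knows how, or whether, they actually weight density):

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Naive keyword density: occurrences of the phrase, scaled by the
    phrase's word count, as a percentage of total words on the page."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count every position where the whole phrase matches word-for-word.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words)

page = "Green widgets for sale. Our green widgets are the best widgets."
print(round(keyword_density(page, "green widgets"), 1))  # → 36.4
```

A figure that high would be hard to reach with naturally written copy, which is exactly the situation the poster above describes: pages written "in a proper manner for the visitor" sit well below it.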
EXAMPLE 1: I have a very high-ranking site that uses keywords in menu links and is quite heavily optimised, in a white-hat sort of way. It has been ever-present for more than four years, and during this period it has gained lots of inbound links, including some from authority sites in its niche. It also has lots of information and useful content. Google probably looks at it and concludes that, on balance, this site is not spam.
EXAMPLE 2: I have a site that I launched just over a year ago. It offers a free service that could be very useful to many people. I used similar SEO techniques on this site, and it carries AdSense. It has been firmly sandboxed since day one.
* It has a www.keyword1-keyword2.com type domain.
* It uses keywords in file names.
* It uses keywords in <H1>,<H2>,<H3>,etc.
* It uses keywords in <title>.
* It uses keywords in meta description, etc.
* It uses keywords in navigation links.
* It uses keywords in text content with several of these hyperlinked and bolded.
* It has inbound links that use the keywords.
Can you see a pattern developing? ;)
My point is that it must be the easiest thing in the world for Google to detect this pattern too. They probably conclude that there is a high chance that the new site in question is spam and drop it into the quarantine box. After some time, and probably when some other factor comes into play (authority IBLs?), the site can be released. If I had a search engine this is probably the way I would play it, so yes, there is probably an OOP involved in the sandbox filtering process.
It's not my call to judge whether that is good or bad, but new sites have lots of value in my eyes too; they often offer a good alternative when operated correctly. Google should be able to recognise the effort and increasing popularity without adding a blind out-of-the-box penalty based on very dubious assumptions (as it appears to).
Also, you guys seem to end up doing what most people do: adding pages to old sites. That brings up one of the biggest issues on the internet right now, namely balkanization. My 2 cents.
It's a perfectly natural way to do things, but there again Google seems so picky.
If that is the case, what would be the best way to do text hyperlinks? Use an image or something else?
cheers
I no longer try things like that. My working theory is that the limits will grow tighter in the coming year: what worked before will become increasingly likely to trigger penalties, in areas that previously gave you some room to mess around. Currently I'm exploring the wild idea that high-quality content will generate high-quality inbounds. I'm also dumping more and more SEO-related stuff every few months. However, very clean stuff is still working very well, as long as the site has not triggered any other potential warning flags; read the patent application to find out what those might be.
It may be that this method would have been quicker?
* It has a www.keyword1-keyword2.com type domain.
* It uses keywords in file names.
* It uses keywords in <H1>,<H2>,<H3>,etc.
* It uses keywords in <title>.
* It uses keywords in meta description, etc.
* It uses keywords in navigation links.
* It uses keywords in text content with several of these hyperlinked and bolded.
* It has inbound links that use the keywords.
Why is this over-optimisation or spam-like? It's as clean as it gets. It's nailing one's colours to the mast fair and square, and trying to disguise pages by "detuning the SEO" or some such phrase wouldn't make sense.
Completely agree with you. If this is the case, then how on earth do you go about designing a site? Dump all the text on a page and leave it alone? I think it is more a case that a new site is held back no matter what you do, unless you get backlinks from a few PR8s, of course.
cheers
Why is this over-optimisation or spam-like?
This is over optimisation because if we are honest with ourselves this combination of circumstances (particularly the hyphenated domain name) usually only happens when people are deliberately targeting KWs. Genuine companies just don't call themselves we-have-the-cheapest-viagra-in-the-usa.com.
Google's quality guidelines actually tell us, "Don't load pages with irrelevant words." It's hard to retain relevance using the techniques I described above.
Why is this over-optimisation or spam-like?
It isn't, but that won't stop this goofy concept from getting mentioned every month or so.
I can assure you that there is nothing goofy about it when you are suffering from it. It's easy to just say, "It isn't". Perhaps you would be kind enough to enlighten us why you are so sure of yourself on this?
Let's say that a (new) site is about tartan widgets. If the site is called www.tartan-widgets.com and the KWs "tartan widgets" appear everywhere possible I would say that this is a site that has been designed to be found specifically for tartan widget searches.
Example ...
Domain name: www.tartan-widgets.com
Sub pages like: www.tartan-widgets.com/tartan-widgets-aboutus.htm and www.tartan-widgets.com/tartan-widgets-contact.htm
H1: Tartan widgets available here
H2: Tartan widgets for sale
H3: Tartan widgets info
<Title>Tartan Widgets</Title>
Meta Description: Tartan Widget site with information and sales of Widgets in Tartan
Meta Keywords: Tartan widgets, tartan, widgets, widgets in tartan
Hyperlinks: Tartan widget contact details (etc).
Inbound links: Tartan Widget Site (etc).
I would contend that the chances of the above occurring naturally are very slim. No?
BeeDeeDubbleU, you are talking about something different with "Don't load pages with irrelevant words."
The example above is surely that of a page being loaded with irrelevant keywords. If the subject of a page is established with an H1, "Tartan widgets available here", then the H2 should probably just be, "Sales" and the H3, "Info".
If I was a search engine and I was looking for signs of spam or pages that are trying to game the system then the above would be a very easy place to start. No?
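A toy version of that "easy place to start" would be to count how many on-page locations repeat the same phrase. Everything below (the field names, the hyphen normalisation, the whole idea that Google scores anything like this) is invented purely for illustration:

```python
# Hypothetical over-optimisation signal counter: one point per page
# location that repeats the target phrase. The field names and scoring
# are made up; this only shows how mechanically detectable the
# pattern discussed in this thread would be.
def over_optimisation_signals(page: dict, phrase: str) -> int:
    locations = ("domain", "filename", "title", "headings",
                 "meta_description", "nav_links", "inbound_anchors")
    # Normalise spaces to hyphens so "tartan widgets" matches
    # "tartan-widgets.com" as well as body text.
    needle = phrase.lower().replace(" ", "-")
    score = 0
    for loc in locations:
        haystack = page.get(loc, "").lower().replace(" ", "-")
        if needle in haystack:
            score += 1  # same phrase in yet another location
    return score

tartan = {
    "domain": "www.tartan-widgets.com",
    "filename": "tartan-widgets-aboutus.htm",
    "title": "Tartan Widgets",
    "headings": "Tartan widgets available here",
    "meta_description": "Tartan Widget site with information and sales",
    "nav_links": "Tartan widget contact details",
    "inbound_anchors": "Tartan Widget Site",
}
print(over_optimisation_signals(tartan, "tartan widgets"))  # → 4
```

A page hitting the exact phrase in every location would score 7, while naturally written pages would rarely score more than 2 or 3, which is BeeDeeDubbleU's point about how slim the chances are of the full pattern occurring by accident.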
Right out of the box, it ranked well for its keywords for about a month and then was sandboxed.
The second site was basically plain text without H1 tags or any optimization whatsoever. It did not rank well right away like the first site and is still sandboxed also.
The only difference between the two is that I actually made money off the one that I optimized for a month.
I launched three websites at roughly the same time, just over a year ago. All three were sandboxed. Two were non-commercial and not heavily optimised; they are now out and ranking well. The other is commercial and heavily optimised. It has now been in the SB for almost 13 months. Conclusion? Over-optimisation may cause your site to be sandboxed for longer periods.
Maybe in some cases it does trip some kind of filter (or something) but it wouldn't make sense to "penalise" a page or site purely for that.
If you hit the nail on the head and have your site listed in the number 1 slot, then in big G's opinion your page is optimized for the keyword, but Yahoo and MSN may not think so.
As discussed many times in the past, there can be no OOP.
Again IMHO,
Back to watching
WW_Watcher
If a site drops in the SERPs due to white-hat optimization, to webmasters it looks like a penalty.
For Google it may just be an attempt to calculate the value of the site to end users, who care little about blue_widget_best_discount_deal in the URL, because most never look at the URL address and never type it.
However, since most people here are webmasters, why not call it a penalty for short? If it looks like a duck, sounds like a duck, and swims like a duck, it's probably a duck :)
Vadim.