Google News Archive Forum

    
What is over optimisation?
Any specifics? Some observations
surfgatinho
msg:129024 - 9:42 am on Dec 10, 2004 (gmt 0)

A few of my sites survived Florida without too much damage, others never recovered.
Anyway, I have seen many of my sites slowly slip down the SERPs over the past year. I started looking at who was doing well in my niche. There is an awful lot of content-free junk, so I concentrated on the sites that are similar in size and approach to mine.

What I noticed is a lot do not use the exact keyword in the title, rather a combination with or without pluralisation. Also I noticed a lack of <h1> and <h2> tags on many. Is it the case that repeating the page title in the <h1> or <h2> tag is considered over optimising? Seems a bit ridiculous if it is; I thought that was the purpose of <h1> tags, i.e. a title.
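To make the pattern concrete, here is a minimal sketch (a plain regex check in Python, purely illustrative and not anything Google has documented) that flags a page whose <title> is repeated verbatim in an <h1> or <h2>:

    import re

    def title_repeated_in_headings(html):
        # Pull the <title> text and normalise whitespace and case.
        title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
        if not title:
            return False
        title_text = " ".join(title.group(1).split()).lower()
        # Compare against every <h1>/<h2> on the page.
        headings = re.findall(r"<h[12][^>]*>(.*?)</h[12]>", html, re.I | re.S)
        return any(" ".join(h.split()).lower() == title_text for h in headings)

    page = "<html><head><title>London Widgets</title></head><body><h1>London Widgets</h1></body></html>"
    print(title_repeated_in_headings(page))  # True: the title text is repeated exactly in the <h1>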

Anyway, that's what I noticed. I tried taking out my <h2> tags and my site dropped a page, so maybe that's not my problem.

The page I am tracking has higher PR than most of my competitors and more relevant incoming links (I live in the area and most sites are national).

Another thing I was wondering about was page names and directory depth - I'm noticing a lot of spammy page names with underscores doing well. These often match the page title.
As for directory structure - I see many root-level pages listed rather than pages in directories, particularly deeper directory structures.

Any comments/suggestions?

 

rfgdxm1
msg:129025 - 3:29 pm on Dec 10, 2004 (gmt 0)

In the way the term tends to be used around here, I doubt that an over optimization penalty exists. I can think of a number of amateur sites doing very well in the SERPs with the keyword in the URL, title, H1 tag, and a huge on-page keyword density. These webmasters clearly didn't know about SEO, and this happened by accident. Since the page was about widgets, "widgets" can be found all over the page source code. This sort of thing happening on amateur sites is a good reason NOT to have an over optimization penalty. There would be too much collateral damage to innocent sites.

Much more likely, the Google algo just doesn't give any extra benefit to a page beyond a certain amount of apparent on-page optimization. After that point, it comes down to off-page and off-site factors, such as PR and inbound anchor text from other sites. The problem commercial sites tend to face is legitimately getting relevant inbound links. Amateur sites tend to link rather freely with other related sites. Commercial sites tend not to want to link to the competition.
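To illustrate the diminishing-returns idea, here is a toy scoring sketch (the cap and weights are invented purely for illustration, not Google's actual factors): on-page optimisation only helps up to a ceiling, after which off-page factors decide the order.

    ONPAGE_CAP = 10.0  # hypothetical ceiling on the benefit of on-page optimisation

    def toy_score(onpage_signal, pagerank, relevant_inbound_links):
        # Extra keyword stuffing past the cap adds nothing.
        capped_onpage = min(onpage_signal, ONPAGE_CAP)
        # Beyond that point, off-page factors (PR, relevant inbound links) decide it.
        offpage = pagerank * 2.0 + relevant_inbound_links * 0.5
        return capped_onpage + offpage

    print(toy_score(onpage_signal=8,  pagerank=4, relevant_inbound_links=20))   # 26.0
    print(toy_score(onpage_signal=50, pagerank=4, relevant_inbound_links=20))   # 28.0 - the cap was hit; the stuffing was wasted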

Marketing Guy
msg:129026 - 3:43 pm on Dec 10, 2004 (gmt 0)

I'm noticing a lot of spammy page names with underscores doing well.

No huge significance to underscores - just part of an overall SEO process of slapping the keyword in everywhere possible.

The filename of a page certainly could be one factor in the ranking algo, but not a hugely significant one. Although that said, a secondary benefit of keyword filenames is that if anyone links to you using your URL, rather than link text, then you have the keywords there.

Underscores mean the link text is "your_keyword.htm" instead of "yourkeyword.htm".
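As a small illustration of that point (the tokenisation below is an assumption for the sketch, not a documented Google behaviour), an underscored filename keeps the keywords separable even when someone links with the bare URL:

    def filename_tokens(filename):
        # Drop the extension, then split on underscores (and hyphens) to recover words.
        stem = filename.rsplit(".", 1)[0]
        return [t for t in stem.replace("-", "_").split("_") if t]

    print(filename_tokens("your_keyword.htm"))   # ['your', 'keyword'] - the words are separable
    print(filename_tokens("yourkeyword.htm"))    # ['yourkeyword'] - one opaque token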

Personally, I used underscored keyword filenames long before I found out about SEO - it was just an easier way to recognise the files when I was messing with my site! ;)

Scott

surfgatinho
msg:129027 - 4:50 pm on Dec 10, 2004 (gmt 0)

I doubt that an over optimization penalty exists

I'm not so sure - my reasoning is that when I look in the SERPs I often find a page from a site that is obviously not the page which is optimised for that KW. E.g. I look for 'London widgets' and find a page listed that is 'UK widgets map' - I know full well the site has a page about 'London widgets' with higher PR, more KW-related anchor text, etc.

Marketing Guy - my point about KWs in the page name was that they seem to get higher importance in the ranking than I'd have thought (or maybe not!)

caveman
msg:129028 - 5:18 pm on Dec 10, 2004 (gmt 0)

I view 'over optimization' as a catch-all phrase that refers to 'going too far'. To make life simple (always my preference) we include off-page factors in our view of 'over optimization.' (Some might view internal and external off-page factors differently.)

The point is well taken that there are sites of all shapes and sizes that do fine with apparent 'over optimization', especially sites with gross repetition of kws. Our view is that sites with one or more of these traits typically do well by virtue of having other on-page and off-page factors that offset their 'over optimization.'

Is there an OOP? Put it this way. You *can* get a page killed in the SERPs for something as simple as too much kw density. We've tested it repeatedly. And you can get it *unkilled* - or at least you could in the past - by fixing the kw density. In its simplest form, that's what we think of when the 'over optimization' question comes up.
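For anyone who wants to measure it, a minimal keyword-density sketch (using the rough occurrences-per-total-words definition; the 20% flag below is just an example threshold, not a known cutoff):

    import re

    def keyword_density(text, keyword):
        # Crude word split; density = keyword occurrences / total words.
        words = re.findall(r"[a-z0-9']+", text.lower())
        if not words:
            return 0.0
        hits = sum(1 for w in words if w == keyword.lower())
        return hits / len(words)

    body = "widgets widgets cheap widgets buy widgets from our widgets shop"
    d = keyword_density(body, "widgets")
    print(f"{d:.0%}")                                         # 50%
    print("over the example 20% flag" if d > 0.20 else "ok")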

What I can also say is that we've identified, via testing, a rather lengthy list of techniques that, taken too far, or made too repetitive, will sink a page. Indeed, some will sink a *site.* And this, IMHO, is now very related to what people call the Sandbox.

For our purposes, it helps greatly to view the entire mess as algo(s) + filters. You pass muster, or you don't. Unfortunately now, especially for new sites, one increasingly well-known way to pass muster is to take several elements of OO to the extreme. However, I'm sure that G is watching these examples, and fine-tuning their 'anti-new-sites' algo tweaks as we speak.

ChrisCBA
msg:129029 - 5:54 pm on Dec 10, 2004 (gmt 0)

Depending on how much Google has incorporated LSI (latent semantic indexing) into their algo, this can also act as an OOP. Usually with LSI the more frequent a word, the less “weight” it carries. So the more a KW is repeated, the “less impact” it makes.

Now granted, if G used this formula for every ranking they would get into a world of trouble, but if they used it selectively, whereby certain factors may kick in LSI to different degrees, it might be an explanation as to why some sites seemingly do well with keyword-stuffed pages and others rank #1 for a page practically vacant of that KW.
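A rough sketch of that diminishing-returns idea (this is not real LSI, which decomposes a term-document matrix; it is just a log-damped count used to illustrate "less weight per repetition"):

    import math

    def damped_term_weight(term_count):
        # Each additional repetition contributes less than the one before it.
        return math.log(1 + term_count)

    for count in (1, 5, 20, 100):
        print(count, round(damped_term_weight(count), 2))
    # 1 0.69 / 5 1.79 / 20 3.04 / 100 4.62 - a hundred repetitions buy nowhere near 100x the weight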

geekay
msg:129030 - 9:21 pm on Dec 10, 2004 (gmt 0)

Is this an example of over optimisation, or is it just a triviality:

A main part of an informative site is a huge list in alphabetical order. For the convenience of visitors it is split up into three HTML files (A to L, etc). Now an SEO problem is my need to incorporate a few different, constant keywords in a table header, plus the different spellings of a major keyword. I hope Google would, in multiword searches, combine some selected keywords with the contents of any of my lists. The intention is not to get all three pages listed separately in the same SERP.

Can I try to solve the above problem by using different Meta Title Elements for each of these three files? Maybe even use synonyms for keywords there? Or is this something my site is likely to get penalized for by Google?

I could also have small variations in the text of the h1 or h2 headlines of three pages, as seen in the browser window, but that may be going too far, because normally such headlines of a split list would be identical.

is300
msg:129031 - 9:44 pm on Dec 10, 2004 (gmt 0)

I can definitely say that if you implement 2 or more ROS (run-of-site) links pointing to a page with a keyword density of 20% or higher, you can plan on the page dropping back about 40+ places in the SERPs.

I think Google wants to see slow link development and densities in the 10-15% range. Too many links and too much density are most likely triggering flags right now.
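Written out as a sketch (the thresholds are this poster's observation, not documented values):

    def looks_overdone(ros_link_count, keyword_density):
        # 2+ run-of-site links plus 20%+ on-page density is the combination described above.
        return ros_link_count >= 2 and keyword_density >= 0.20

    print(looks_overdone(ros_link_count=3, keyword_density=0.25))  # True - likely to trip the filter described
    print(looks_overdone(ros_link_count=1, keyword_density=0.12))  # False - inside the 10-15% range mentioned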

The funny thing is, if you have other keywords that are ranking well, those don't seem to be affected by the OOPs you might receive for ROS links on specific phrases. With that said, it seems you can over-optimize for one word, get a penalty, but any other words you are optimizing for might not be affected.

onebaldguy
msg:129032 - 10:06 pm on Dec 10, 2004 (gmt 0)

It has been speculated that OOP can be triggered by too high a percentage of identical anchor text in your IBLs (maybe this is simply a way to devalue purchased links, but I think this can be considered a type of OOP).

As to whether or not it exists, I don't know if anyone has given proof. But I am a believer in it. I lean more towards the idea of going too far with your main keyword for a page (too much identical anchor text, words in title, h1 tags, high KW density, etc - all for a specific keyword).
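A rough sketch of that inbound-anchor-text check (the 60% threshold is purely illustrative; nobody knows Google's actual cutoff, if one exists):

    from collections import Counter

    def dominant_anchor_share(anchors):
        # Find the most common anchor phrase and its share of all inbound links.
        counts = Counter(a.strip().lower() for a in anchors)
        phrase, top = counts.most_common(1)[0]
        return phrase, top / len(anchors)

    inbound = ["london widgets", "london widgets", "London Widgets", "widget shop", "www.example.com"]
    phrase, share = dominant_anchor_share(inbound)
    print(phrase, f"{share:.0%}")                                      # london widgets 60%
    print("suspiciously uniform" if share >= 0.60 else "looks natural")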

Just putting the keywords in title, h1 tags, etc. is standard for web design, so it is very unlikely that including them in each of those areas would ever have a negative effect.

Do things in moderation and try to mimic what is natural on the web.
