Can you over-optimize for Google?

and not just keyword stuffing, but other techniques too


Buckley

5:37 pm on Apr 7, 2003 (gmt 0)

10+ Year Member



Hi,

Can you over-optimize pages for Google? Is there a problem with having more than one link on a page with your keyword in the anchor text?

Can you have too many h1, h2, or h3 tags?

Too many keywords in bold, italics, etc.?

Anything else?

Thanks
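The things Buckley asks about can at least be counted with a quick page audit. Here is a minimal sketch in Python using only the standard library; the watched-tag list and the sample page are invented for illustration, and no actual Google threshold is implied:

```python
from html.parser import HTMLParser
from collections import Counter

class TagAuditor(HTMLParser):
    """Count heading and emphasis tags to spot heavy-handed markup."""
    WATCHED = {"h1", "h2", "h3", "b", "strong", "i", "em"}

    def __init__(self):
        super().__init__()
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag in self.WATCHED:
            self.counts[tag] += 1

def audit(html):
    """Return a count of watched tags found in an HTML snippet."""
    auditor = TagAuditor()
    auditor.feed(html)
    return dict(auditor.counts)

page = "<h1>Widgets</h1><h1>Widgets Again</h1><p><b>widgets</b> and <b>widgets</b></p>"
print(audit(page))  # → {'h1': 2, 'b': 2}
```

What counts as "too many" is exactly the open question in this thread; a script like this only surfaces the numbers.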

glengara

5:48 pm on Apr 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



IMO you certainly can.
With so much depending on outside factors these days, I prefer a page that doesn't shout out OPTIMIZED.
Anything else? Stuffed Alt/Title tags.

jeremy goodrich

5:49 pm on Apr 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My rule of thumb is: if it works for the human visitor, odds are it will work for the Googlebot.

Of course, if you are up to something spammy, say semi-auto-generating thousands of pages (not that I would do something like that...), I would follow the same rule: anything 'way out of whack' with the rest of the pages in their index might set off an 'algorithmic filter' geared towards catching 'spammers', which you don't want. :)

hth

dmorison

5:56 pm on Apr 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Why are you optimizing for Google?

Optimize for human beings.

Follow all the best practices for website design.

That, after all, is what Google is looking for.

The rest - including great SERP and PR - will follow.

mifi601

7:14 pm on Apr 7, 2003 (gmt 0)

10+ Year Member



Is there something specific in the algo that punishes for KW density?

Let's say your first KW is at 40% and the next one only at 10%; will you be punished over a certain ratio of KW1 to KW2?

And I have not even optimized it for Google; it just happened that way...
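For reference, a "KW density analyzer" of the kind discussed here typically just divides a keyword's occurrence count by the total word count. A minimal sketch (single-word keywords only; the sample text is invented):

```python
import re
from collections import Counter

def keyword_density(text, keywords):
    """Return each keyword's share of total words, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    counts = Counter(words)
    total = len(words) or 1  # avoid division by zero on empty text
    return {kw: 100.0 * counts[kw.lower()] / total for kw in keywords}

page_text = "widgets widgets widgets widgets gadgets buy widgets here widgets gadgets"
print(keyword_density(page_text, ["widgets", "gadgets"]))
# → {'widgets': 60.0, 'gadgets': 20.0}
```

Note that, as tedster suggests below in the thread, the raw percentage is probably not what Google actually looks at.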

tedster

7:56 pm on Apr 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have seen very high keyword density do just fine on Google -- but not always. I'm now thinking that Google looks at something more than a simple density percentage, something more along the lines of word pattern analysis.

I think they use statistical analysis to spot "natural" and "manipulative" patterns in lots of areas. Overly optimized patterns get zeroed, or penalized, or possibly flagged for a hand check depending on what they're looking at. At least that's my current operating assumption.
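Purely as an illustration of that speculation (not anything Google is known to do): a statistical filter of this kind could flag pages whose keyword density is an extreme outlier relative to a sample of pages. The z-score threshold and the sample numbers below are arbitrary:

```python
import statistics

def density_outliers(page_densities, threshold=1.5):
    """Flag pages whose keyword density sits far above the sample mean
    (simple z-score test; the threshold is an arbitrary illustration)."""
    values = list(page_densities.values())
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # all pages identical; nothing stands out
    return [page for page, d in page_densities.items()
            if (d - mean) / stdev > threshold]

sample = {"page_a": 2.1, "page_b": 3.4, "page_c": 1.8, "page_d": 2.7, "page_e": 41.0}
print(density_outliers(sample))  # → ['page_e']
```

The point of the sketch is only that a page wildly unlike the rest of the sample is easy to detect statistically, which fits tedster's "word pattern analysis" guess.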

mifi601

8:08 pm on Apr 7, 2003 (gmt 0)

10+ Year Member



Tedster,

the KW density analyzer shows my top competitor (#1 ranked; I am #2) with about 4 words in his frame page and NOTHING else, and his density is 40%.

The question is where (at how many words) word pattern analysis starts. I could make a lot of 1-word (or 4-word) pages...

michael

tedster

8:21 pm on Apr 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Here's how I would approach it. If a 4-word page is what I need to communicate well with the visitor, then I would do that. Almost all my best practices became best practices because they first happened naturally and then I noticed good results, so I continue in that fashion. Or I notice bad results, so I change it.

I'm looking for "organic" optimization, pages that are built foremost for people, but are equally communicative to search engines. Google is on a mission to reward natural communication and not to be manipulated. So even if I find a tricky way to boost myself right now, in two months I may be right down the drain every place I used the trick. And to my mind, forcing pages to come in at a certain percentage of density is a kind of trick.

For instance, density measures will be widely different if the company name contains a keyword. I'm sure Google has noticed that and taken steps to "wash out" that effect to some degree.

albert

9:35 pm on Apr 7, 2003 (gmt 0)

10+ Year Member



As tedster said: >>Almost all my best practices became best practices because they first happened naturally and then I noticed good results<<

If you act user-oriented and don't focus on SEO - as I did before I found this forum ;-) - it can happen, for example, that you get some "unforced / not asked for" inbound links because you became some kind of authority on a certain subject. Links from really important sites.

And you'll be found, and get traffic, and analyse your logs, study your competitors' sites, understand what users are after, and adapt your sites...

What I'm trying to express is: SEO is not only studying rules (or guesses about rules). Mostly it's building sites users like. And "best practice" is a very good term: always to be updated. SEO may help, surely.

But don't over optimize. That's the wrong play.

mifi601

10:27 pm on Apr 7, 2003 (gmt 0)

10+ Year Member



I agree with you guys wholeheartedly. I had no idea what I was doing and landed at #4 for my main KW. I do not see anything wrong, though, in improving my ranking, finding out why someone ranks higher, and trying to copy that.

I assume that IS part of the reason we are all here.

The main reason my competitor is #1 is that he beat me to the KW domain name by 4 years. My PR is higher, my content is better (no, really :)) and I have more inbound links. When I got the domain I was not looking for or thinking about KWs or anything like that (BTW, I think I am also getting more advertising money), BUT STILL I WANT TO BE #1 :)

Thanks for the great advice in this thread as well as the whole forum - it has been truly awesome to find this community!

HenryUK

10:28 pm on Apr 7, 2003 (gmt 0)

10+ Year Member



Optimize for human beings.

Follow all the best practice for website design.

That after all is what Google is looking for.

The rest - including great SERP and PR - will follow.

Nice to think so, but not always true. There is more than one way of optimising for users - more than one way of doing things well - some of which do not result in being indexed.

E.g. database-driven sites with their own specialist search forms. Some don't suit having a fully browsable index system - that's why people design specialist search engines!

However if you do this without also putting in a simple indexable hierarchy structure alongside your form, you miss out on LOTS of referrals.

There are lots of other examples too. Just think of all the questions that come up here along the lines of "what does Google think of this kind of link?", "how will Google view this kind of technology?" etc.

Imagine for a moment that you are a website designer with no interest in search engines, but just trying to follow your clients' needs (they do exist!). You will try to put together the best-looking and most usable site that you can, without worrying about the SE take on different technologies etc.

Your choices (as we know here) can have a significant effect on your SERPS.

A decision that is easily defensible from a user POV may inadvertently fall foul of search engine algos.

It was the desire to unblock such problems and avoid new ones that led me to this site, and I have been grateful for all the help and advice I have found, which I have used with some success.

Optimise for the user is a great message, and a rule worth holding to, but it's certainly not the end of the story.

tedster

10:40 pm on Apr 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



not the end of the story

No argument with that. As I mentioned, I create pages for the user, but then I notice what gets results at Google and other SEs. I proceed from there, always going through two feedback loops: 1) how do the visitors respond/convert 2) how do the pages rank/pull search engine traffic.

I start with the user, stay with the user, but definitely pay attention to the bots. I've got no quarrel with people who have the time and resources to play with many disposable domains for different purposes. None of my clients are in that camp. Instead, they are all looking to build a long-term business on a branded domain. If they lose the spiders because of something I did that was over the top, I've taken a big chunk out of their business model. I can't do that!

About the precise title of this thread -- I hoped my answer was implied, but let me state it right up front. Yes, any "technique" can be overdone, to the point of appearing obviously over-optimized to an algo and causing the opposite result.

Lots of what some people call search engine "spam" today evolved through that exact pattern: too much of a (once) good thing.

rfgdxm1

10:59 pm on Apr 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>The main reason my competitor is #1 is, he beat me to the KW domain name by 4 years. My PR is higher, my content is better (no really :)) and I have more inbound links. When I got the domain I was not looking or thinking about KW or anything the like (BTW I think I am also getting more advertising money) BUT STILL I WANT TO BE #1 :)

Yep. Keyword in domain name is a big plus with Google.

mifi601

11:51 pm on Apr 7, 2003 (gmt 0)

10+ Year Member



Now what does that have to do with user-friendliness? Not much. It's just the algo, and knowing about it will in the end benefit the user.

Assuming that a better webmaster and a better site = better content...

Coming back to the original question: which are the areas where one can optimize too much?