Exactly... it's all about how your copy reads and how you present it. You could have a keyword density of 50% if the page doesn't look or read like spam, so that when you get top rankings your competition doesn't turn you in.
Now, if the term London appeared 5 times in one paragraph <p>, then you might be crossing the line. Structure is the key.
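For reference, the density figures people quote are usually just keyword occurrences divided by total words on the page, times 100. A minimal Python sketch of that arithmetic (the function name, tokenizer regex, and sample page are mine for illustration; nobody outside Google knows its actual tokenization):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of the page's words that are the keyword:
    occurrences / total words * 100."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

page = "London hotels. Cheap London hotels near central London stations."
print(round(keyword_density(page, "London"), 1))  # 3 of 9 words -> 33.3
```

By that arithmetic, "London" five times in a fifteen-word paragraph is already 33% for that block, which is exactly the kind of local spike being warned about here.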
Exactly! I know of a particular hallway page (a bulleted list, page one) that was probably hitting 30% keyword density. Funny thing: a writer for US News and World Report didn't see it as a hallway page at all, but as a great "quick guide" for his readers, and published it on their website.
There's also a difference between the percentage of occurrences when the words are counted individually and when they're counted together as two- or three-word phrases. And it's not just the density, but where the words are on the page and how they're used. It should all look and read naturally.
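To make that distinction concrete, a phrase can be counted as a single unit with a sliding window, which gives a different percentage than counting its words one at a time. A purely illustrative sketch (the windowed definition of phrase density is one common convention, not any engine's published method):

```python
def phrase_density(text: str, phrase: str) -> float:
    """Density of a multi-word phrase: exact-phrase matches over the
    number of same-length word windows on the page."""
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    windows = [words[i:i + n] for i in range(len(words) - n + 1)]
    if not windows:
        return 0.0
    hits = sum(1 for w in windows if w == target)
    return 100.0 * hits / len(windows)

page = "cheap london hotels and cheap london flights from london"
print(round(phrase_density(page, "cheap london"), 1))  # 2 of 8 windows -> 25.0
```

On that sample, "london" counted individually comes out at 33% while the phrase "cheap london" comes out at 25% - two quite different numbers from the same page, which is why it matters which one a filter looks at.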
Mind you - after a certain point - it won't help either.
Pages like that should not get penalized - even though they have "hidden text" and the word "test" repeated dozens of times. Pages like this are what we could miss out on if Google starts tightening the screws.
Notice that a smart search engine like Google shouldn't (and doesn't) need to rely on those types of filters - it should simply ignore the extra words after a certain point. Banning pages like that is the lazy way out.
[edited by: ciml at 10:18 am (utc) on Aug. 14, 2002]
[edit reason] URL Sig. Snipped [/edit]
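"Ignoring the words after a certain point" is easy to model with a saturating weight, where each extra repetition is worth less than the last. A sketch using the tf / (tf + k) curve familiar from BM25-style scoring (whether Google used anything of the sort is pure speculation on my part, and k is an arbitrary constant):

```python
def saturated_weight(tf: int, k: float = 1.5) -> float:
    """Diminishing-returns weighting: the score climbs quickly for the
    first few repetitions, then flattens out toward 1.0 no matter how
    many more times the word is stuffed in."""
    return tf / (tf + k)

for tf in (1, 2, 5, 20, 100):
    print(tf, round(saturated_weight(tf), 3))
# 1 0.4, 2 0.571, 5 0.769, 20 0.93, 100 0.985
```

Going from 1 to 5 repeats moves the weight a lot; going from 20 to 100 barely moves it. Under a curve like that, stuffing past a handful of occurrences buys essentially nothing, so there's no need to ban the page at all.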
I hope Google sticks to that philosophy then. Sometimes I can't help but use the same keyword again and again, sometimes on its own and sometimes as part of a phrase. It's like using the word webmaster on WebmasterWorld... or www2 during a Google update :) If they started banning pages purely because of "keyword proximity", then IMO the world would be a poorer place.
I hope everyone doesn't start bumping a few extra keywords on their pages now - or it will become a problem ;)
As I see it, the problem exists only if Google thinks it does. If someone just throws a load of extra words onto a page, then that isn't enough to get good rankings for anything remotely competitive.
I see this as similar to the heavy cross-linking penalty that I believe used to (and still does in some circumstances?) give PR0. Those spam farms weren't helping their listings much (there are equally and more effective ways to use PR without doing that), but Google had to act when the by-product (old-fashioned similar-content spam) was clogging up the results.