3-8% is generally considered safe. I have seen high-ranking sites with much higher densities, 25% or more, that rank well until their competitor turns them in. hth dave
>>>>25% or more, that rank well until their competitor turns them in.
Exactly...it is all about how your copy reads and how you present it. You can have a keyword density of 50% if the page doesn't look or read like spam, so when you get top rankings your competition doesn't turn you in.
soapystar, I believe the way you have that information structured will also have an impact. Let's say you were building a directory. If those directory listings were separated by <li> or some other structural element, then I don't think there is a problem. If it were otherwise, all of us promoting directories would be in a world of hurt!
Now, if the term London appeared 5 times in one paragraph <p>, then you might be crossing the line. Structure is the key.
ok...there's food for thought!
>You can have a keyword density of 50% if the page doesn't look like or read like spam
Exactly! I know of a particular hallway page (a bulleted list, pageone) that was probably hitting 30% kwd. Funny thing, a writer for US News and World Report didn't see it as a hallway page at all, but as a great "quick guide" for his readers and published it on their website.
You can get away with pretty high density, though I'm not comfortable with it, but sometimes there's no choice. It's possible that after a certain point it's disregarded. I've got a few pages that are up to 20%, which is too much, but it's appropriate for the size, structure, and content of the page. I've also seen some sites with much higher density that get by, depending on how much text is actually on the page. But that doesn't mean they're ranking well because of the density.
There's also a difference in the percentage of occurrences of the words used individually versus together in two- or three-word phrases. And it's not only the density, but where the words are on the page and how they're used. It should look and read naturally.
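To make the single-word vs. phrase distinction above concrete, here is a minimal sketch of how density can be computed for each. This is an illustration only: search engines don't publish an exact formula, and there are several conventions for counting phrase occurrences (this one counts each phrase hit as all of its words, so the figure reflects how much of the copy the phrase occupies).

```python
import re

def keyword_density(text, phrase):
    """Percentage of the words in `text` accounted for by `phrase`.

    A multi-word phrase contributes len(phrase) words per occurrence,
    so "london hotels" appearing twice in 9 words scores higher than
    "london" alone appearing twice.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words)

copy = "London hotels. Cheap London hotels and London hotel deals."
print(round(keyword_density(copy, "london"), 1))         # 33.3 (3 of 9 words)
print(round(keyword_density(copy, "london hotels"), 1))  # 44.4 (2 phrase hits)
```

Note that the same nine-word snippet yields very different figures depending on whether you measure the word or the phrase, which is part of why quoted density percentages in threads like this are hard to compare directly.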
So are you guys saying that a certain level of density will bring a human being and then the decision that density constitutes spam is actually made by a human?
What we are saying is that it is unlikely that a high keyword density will trigger any alarms within the search engines. The main thing to worry about is if your competitors get suspicious or jealous, and turn your site in.
In the past I had raised the density on some of my pages to 20% and above. The effect was negative - the more I raised the keyword density, the worse the pages ranked from update to update. Then I posted my problem here on the Forum and some of you guys advised me to stay within 8 to 12%. Perfect advice! My pages not only rank better now, they also read more naturally.
|Personally, I don't think keyword density will kill any page for spam. There are too many ways for this to be done by accident. It may be put into the equation along with other factors, but I wouldn't worry about it being too high in and of itself.|
Mind you - after a certain point - it won't help either.
should not get penalized - even though they have "hidden text" and the word test repeated dozens of times. Pages like this are what we could miss out on if google starts tightening the screws.
Notice - a smart search engine, like google, shouldn't/doesn't need to rely on those types of filters - it should just ignore the words after a certain point. Banning pages like that is the lazy way out.
howdie guys and girls.
I am really new to this. How can I tell if my site will receive a bad ranking? I made a template up with a certain description and keywords in the meta tags.
Now I used this template for every page with a slight variation in the keywords, like adding one or two per page, and changing the title.
Will this affect the site being listed, or will it be considered spam content?
[edited by: ciml at 10:18 am (utc) on Aug. 14, 2002]
[edit reason] URL Sig. Snipped [/edit]
|brotherhood of LAN|
>>>Notice - a smart search engine, like google, shouldn't/doesn't need to rely on those types of filters - it should just ignore the words after a certain point. Banning pages like that is the lazy way out.
I hope google sticks to that philosophy then. Sometimes I can't help but use the same keyword/word again and again, sometimes on its own and sometimes as part of a phrase. It's like using the word webmaster in webmasterworld....or www2 during a google update :) If they started banning pages purely because of "keyword proximity", then IMO the world would be a poorer place.
I hope everyone doesn't start bumping a few extra keywords on their pages now - or it will become a problem ;)
> I hope everyone doesn't start bumping a few extra keywords on their pages now - or it will become a problem
As I see it, the problem exists only if Google thinks it does. If someone just throws a load of extra words onto a page, then that isn't enough to make it get good rankings under anything remotely competitive.
I see this as being similar to the heavy cross-linking penalty that I believe used to (and still does in some circumstances?) give PR0. Those spam farms weren't helping their listings much (there are equally and more effective ways to use PR without doing that), but Google had to act when the by-product (old-fashioned similar-content spam) was clogging up the results.