
keyword density

when does it become spam?


soapystar

2:43 pm on Aug 13, 2002 (gmt 0)

Here's a question I just can't find an answer to: at what point is it likely Google would consider a high keyword density spam? For instance, if I have a page of addresses and each one contains the word "london", is there a danger of being banned? I want to avoid any problems, but as things stand one word has a density in excess of 20%. I hear a lot of ballpark figures, but can anyone say what is safe for sure, and at the other extreme, what is dodgy for sure?
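(For context: "keyword density" in threads like this usually means the number of times a term appears divided by the page's total word count, times 100. A minimal Python sketch of that calculation; the sample text is invented for illustration:)

    import re

    def keyword_density(text, keyword):
        # Percentage of the page's words that are exactly `keyword`, case-insensitive.
        words = re.findall(r"[a-z0-9']+", text.lower())
        if not words:
            return 0.0
        hits = sum(1 for w in words if w == keyword.lower())
        return 100.0 * hits / len(words)

    page = "Flats in London. Hotels in London. London restaurants near you."
    print(round(keyword_density(page, "london"), 1))  # 3 of 10 words -> 30.0

By that measure, a directory page that names "london" in every address line can clear 20% without any intent to spam.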

davaddavad

5:15 pm on Aug 13, 2002 (gmt 0)

3-8% is generally considered safe. I have seen high-ranking sites with much higher density, 25% or more, that rank well until a competitor turns them in. HTH, Dave

agerhart

5:21 pm on Aug 13, 2002 (gmt 0)

>>>> 25% or more, that rank well until a competitor turns them in.

Exactly... it is all about how your copy reads and how you present it. You can have a keyword density of 50% if the page doesn't look or read like spam, so that when you get top rankings your competition doesn't turn you in.

pageoneresults

5:23 pm on Aug 13, 2002 (gmt 0)

soapystar, I believe the way you have that information structured will also have an impact. Let's say you were building a directory. If those directory listings were separated by <li> or some other structural element, then I don't think there is a problem. If it were a problem, then all of us promoting directories would be in a world of hurt!

Now, if the term London appeared 5 times in one paragraph <p>, then you might be crossing the line. Structure is the key.
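(A rough sketch of that per-block idea in Python, using the standard library's html.parser: counting a term inside each <li> or <p> separates "once per listing" from "five times in one paragraph". Whether any engine actually weighs structure this way is speculation, and the markup below is invented for illustration.)

    from html.parser import HTMLParser

    class BlockCounter(HTMLParser):
        # Collect the text of each <p> and <li> so a term can be counted per block.
        def __init__(self):
            super().__init__()
            self.blocks = []
            self._current = None
        def handle_starttag(self, tag, attrs):
            if tag in ("p", "li"):
                self._current = []
        def handle_endtag(self, tag):
            if tag in ("p", "li") and self._current is not None:
                self.blocks.append(" ".join(self._current))
                self._current = None
        def handle_data(self, data):
            if self._current is not None and data.strip():
                self._current.append(data.strip())

    parser = BlockCounter()
    parser.feed("<ul><li>Hotels in London</li><li>Flats in London</li></ul>"
                "<p>London London London hotels in London, London deals</p>")
    for block in parser.blocks:
        words = [w.strip(",.").lower() for w in block.split()]
        print(words.count("london"), "occurrence(s) in:", block)

Each <li> scores one occurrence; the stuffed <p> scores five, even though the page-wide density could be similar either way.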

soapystar

5:37 pm on Aug 13, 2002 (gmt 0)

OK... there's food for thought!

rcjordan

5:47 pm on Aug 13, 2002 (gmt 0)

>You can have a keyword density of 50% if the page doesn't look or read like spam

Exactly! I know of a particular hallway page (a bulleted list, pageone) that was probably hitting 30% keyword density. Funny thing: a writer for US News and World Report didn't see it as a hallway page at all, but as a great "quick guide" for his readers, and published it on their website.

Marcia

5:57 pm on Aug 13, 2002 (gmt 0)

You can get away with pretty high density, though I'm not comfortable with it, but sometimes there's no choice. It's possible that after a certain point it's disregarded. I've got a few pages that are up to 20%, which is too much, but it's appropriate for the size, structure and content of the page. And I have also seen some sites with much higher density that get by, depending partly on how much text is actually on the page. But that doesn't mean they're ranking well because of the density.

There's also a difference between the percentage of occurrences of the words used individually and used together in two- or three-word phrases. And it's not only the density, but where the words are on the page and how they're used. It should look and read naturally.
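(To make that phrase-versus-word distinction concrete, here is a quick Python sketch that counts a two-word phrase separately from its component words via a sliding window; the sample text is invented.)

    def occurrences(text, term):
        # Slide a window of len(term) words across the page and count matches.
        words = [w.strip(",.:").lower() for w in text.split()]
        term_words = term.lower().split()
        n = len(term_words)
        return sum(words[i:i + n] == term_words for i in range(len(words) - n + 1))

    page = "cheap london hotels and london flights: compare london hotels today"
    total = len(page.split())
    for term in ("london", "hotels", "london hotels"):
        print(term, "->", occurrences(page, term), "of", total, "words")

Here "london" alone scores 3 of 10 words, but the phrase "london hotels" scores only 2, so the single-word and phrase densities tell different stories about the same page.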

taxpod

7:15 pm on Aug 13, 2002 (gmt 0)

So are you guys saying that a certain level of density will bring in a human reviewer, and that the decision that the density constitutes spam is actually made by a human?

agerhart

7:23 pm on Aug 13, 2002 (gmt 0)

What we are saying is that it is unlikely that a high keyword density will trigger any alarms within the search engines. The main thing to worry about is whether your competitors get suspicious or jealous and turn your site in.

john5

7:38 pm on Aug 13, 2002 (gmt 0)

In the past I had raised the density on some of my pages to 20% and above. The effect was negative: the more I raised the keyword density, the worse the pages ranked from update to update. Then I posted my problem here on the forum and some of you advised me to stay within 8 to 12%. Perfect advice! My pages not only rank better now, they also read more naturally.

Chris_R

12:20 am on Aug 14, 2002 (gmt 0)

Personally, I don't think keyword density will kill any page for spam. There are too many ways for this to happen by accident. It may be put into the equation along with other factors, but I wouldn't worry about it being too high in and of itself.

Mind you - after a certain point - it won't help either.

Pages like:

http://mindprod.com/htmlcolours.html

should not get penalized, even though they have "hidden text" and the word "test" repeated dozens of times. Pages like this are what we could miss out on if Google starts tightening the screws.

Notice: a smart search engine like Google shouldn't (and doesn't) need to rely on those types of filters, as it should simply ignore the repeated words after a certain point. Banning pages like that is the lazy way out.
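("Ignore the words after a certain point" has a standard implementation in information retrieval: term-frequency saturation, as in BM25-style scoring. This is a generic IR technique, not anything Google has confirmed using; a minimal Python sketch:)

    def saturated_tf(count, k=1.5):
        # BM25-style damping: each extra repetition contributes less than the last.
        return count / (count + k)

    for count in (1, 2, 5, 20, 100):
        print(count, "->", round(saturated_tf(count), 3))

The score climbs quickly for the first few mentions and then flattens out, so the hundredth repetition of a keyword buys almost nothing, and no ban is required.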

holdenvsford

6:51 am on Aug 14, 2002 (gmt 0)

Howdy guys and girls.
I am really new to this. How can I tell if my site will receive a bad ranking? I made up a template with a certain description and keywords in the meta tags, and used this template for every page with a slight variation in the keywords, like adding one or two per page, and changing the title.
Will this affect the site being listed, or will it count as spam?
Kind regards,
Shannon

[edited by: ciml at 10:18 am (utc) on Aug. 14, 2002]
[edit reason] URL Sig. Snipped [/edit]

brotherhood of LAN

7:45 am on Aug 14, 2002 (gmt 0)

>>> Notice: a smart search engine like Google shouldn't (and doesn't) need to rely on those types of filters, as it should simply ignore the repeated words after a certain point. Banning pages like that is the lazy way out.

I hope Google sticks to that philosophy then. Sometimes I can't help but use the same keyword again and again, sometimes on its own and sometimes as part of a phrase. It's like using the word webmaster on WebmasterWorld... or www2 during a Google update :) If they started banning pages purely because of "keyword proximity", then IMO the world would be a poorer place.

/extra notice
I hope everyone doesn't start bumping a few extra keywords on their pages now - or it will become a problem ;)

ciml

10:37 am on Aug 14, 2002 (gmt 0)

> I hope everyone doesn't start bumping a few extra keywords on their pages now - or it will become a problem

As I see it, the problem exists only if Google thinks it does. If someone just throws a load of extra words onto a page, then that isn't enough to get good rankings under anything remotely competitive.

I see this as being similar to the heavy cross-linking penalty that I believe used to (and still does, in some circumstances?) give PR0. Those spam farms weren't helping their listings much (there are equally or more effective ways to use PR without doing that), but Google had to act when the by-product (old-fashioned similar-content spam) was clogging up the results.

 
