My findings are that usually 14-17% is ideal, although I see some sites doing very well with as much as 40-50% (which seems insane that Google wouldn't write such a site off as spam).
There are ways of writing copy that can be up to around 30% without fearing the chop...
Nick_W - I'm sure any thoughts you have on this would be very welcome reading for lots of people.
When you say 'ways of writing' - what do you mean exactly? Are you referring to the number of words in between key phrases etc?
I may be wrong about how most others feel ... but after 2 years of testing to find the ideal density, I'm not telling what percentage I use. ;)
I suspect that the acceptable density (acceptable to Google, that is) depends on the number of words on the page. Sites above mine vary from 8.8% to 33.4%. The pages with higher kw density have fewer words on the page, and are basically just entryway pages. On that kind of page you can probably get away with higher density.
My own preference though is to have a good amount of real content on the page with links to more real content. Unfortunately this strategy doesn't seem to be working very well. Perhaps something else I'm doing is inadvertently getting in the way. I recently cut my kw density to see if this made any difference. This was pretty hard, since my site is content-rich. I guess I'll just have to wait until the end of the month to see if it's worked.
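To make the math in this thread concrete, here is a minimal sketch of how keyword density is typically computed: the words belonging to occurrences of the phrase, divided by the total visible words on the page. This is just one plausible definition (the tokenizing regex and the non-overlapping match rule are my assumptions, not any particular checker's).

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percent of the page's words accounted for by the phrase.

    A simple illustrative definition; real density checkers differ
    in what text they count and how they tokenize.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    # Count non-overlapping occurrences of the phrase in sequence.
    hits, i = 0, 0
    while i <= len(words) - len(phrase_words):
        if words[i:i + len(phrase_words)] == phrase_words:
            hits += 1
            i += len(phrase_words)
        else:
            i += 1
    return 100.0 * hits * len(phrase_words) / len(words)

# e.g. a 7-word snippet containing "big widgets" twice:
page = "Big Widgets are the best big widgets"
print(round(keyword_density(page, "big widgets"), 1))  # 4 of 7 words -> 57.1
```

Note how the same number of keyword occurrences yields a much lower percentage on a longer page, which is exactly why short entryway pages can show densities that would look alarming on a content-rich page.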
I do SEO for several clients and have lots of 'known' ways of improving SERP positions, almost all of which have been extensively documented by many people here at WebmasterWorld.
I thought the whole point of this forum was for webmasters to share knowledge, as it empowers both the giver and the receiver alike.
I also assume (by reading a lot of the posts here) that most people are in favour of and support the open source movement, the entire philosophy of which is to share so-called 'trade secrets'.
Perhaps I am misunderstanding you, but I see no reason for hoarding techniques that would be of benefit to other webmasters. It ultimately benefits everyone if information is shared.
I make a point of being as transparent as possible with clients and try to educate even non-programmers as to what exactly we are doing for them. Does this do me any harm by revealing my secrets? No, it means they have more confidence in my abilities and come back for more :-)
Many niches on the web are dominated by sites with KWD in the mid to high 30% range, while others might see the top 30 results filled with pages where the exact phrase does not even appear on the page.
Taking advice from other webmasters is fine in certain circumstances, but there is no short cut out there for KWD. You have to do your own testing specifically for the keywords you are dealing with and keep trying until you get your site to where you want it.
(Small tip: do a search for the term you are targeting and check the keyword density of the top three sites. Any correlation?)
If someone here tells you that 30% is the ideal and then you go and change your site accordingly, you may find your site buried after the next update. Make small keyword adjustments and see if it makes a difference (up or down) in the SERPS.
If your site is called Big Widgets and you have 50 incoming links that say Big Widgets, you may very well rank high for Big Widgets even if the KWD is very low.
For secondary keywords, you may have a great navigation system where every page on your site carries a link like this (link text included):
Widgy Widgets (my favorite type of widget. Have you ever tasted one? They are incredible!)
So now you have another 50 internal links going to Blue Widgets, Red Widgets and Widgy Widgets.
Keyword Density = Old School (still some value)
Page Title = Gen X Wave (probably more important than KWD)
Link Text Votes = The New Wave (best value IMHO)
Anyone have any clue on why different density checkers vary in their results?
It may be that one of the analyzers is taking into account word counts in alt tags, meta keywords and description tags. That would obviously inflate the density. Brett's tool allows you to customize the way you want the report configured.
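That difference is easy to reproduce: if one tool counts only visible body text and another also counts the meta keywords content, both the numerator and denominator change. A hypothetical sketch (the snippets and numbers are illustrative, not taken from any particular checker):

```python
import re

def density(text: str, keyword: str) -> float:
    """Percent of words equal to the keyword (single-word version)."""
    words = re.findall(r"[a-z]+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words) if words else 0.0

body = "Widgets for sale. Our widgets are the finest widgets around."
meta = "widgets cheap widgets blue widgets red widgets"  # meta keywords content

body_only = density(body, "widgets")                    # 3 of 10 words = 30.0%
body_plus_meta = density(body + " " + meta, "widgets")  # 7 of 17 words = 41.2%
print(body_only, round(body_plus_meta, 1))
```

So two checkers pointed at the same URL can honestly report 30% and 41% for the same keyword, which is worth keeping in mind before comparing your numbers against someone else's.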
While meta tags aren't one of the top priorities, I generally run the report all three ways, and for some pages I have to *not* leave in the stop-words restriction. That's helpful when a page is doing well with Google and you have to decide how much tampering to do if those same pages need a nudge up at other search engines. On the other hand, if pages are doing well across the board, it's just another piece of data that helps indicate who's looking for what. Nothing definitive, but it helps to have the whole picture.
Also, for secondary words and phrases within a site, it can't hurt to run a density check on both the homepage and the interior page it links to for a given phrase.
It's also helpful to see how many times the words are used in the exact phrase or individually, and in what order, relative to the amount of visible text on a given page. Actually, more of a concern than raw density is how the phrasing is done and where it's located on the page.