Does keyword density really matter that much anymore? I don't think so - I work on the rule that as long as it reads OK and the words I want to rank for are in the body, especially near the top, that seems fine.
We ran into the same discussion last week at our company. After some research, here is what we decided to do.
We decided to stick with keyword frequency (repetition): 2-3x for short pages and 4-6x for long pages, exact match for the targeted keyword.
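If anyone wants to sanity-check their own pages against that rule of thumb, a few lines of Python will do it. This is just an illustration, not part of any engine's algorithm, and the 200-word cutoff for a "short" page is an arbitrary number I picked for the example:

import re

def exact_match_count(text, phrase):
    """Count exact (case-insensitive) occurrences of a phrase in the body text."""
    pattern = re.compile(r'\b' + re.escape(phrase.lower()) + r'\b')
    return len(pattern.findall(text.lower()))

def within_guideline(text, phrase, short_page_words=200):
    """Check the 2-3x (short pages) / 4-6x (long pages) rule of thumb."""
    total_words = len(text.split())
    count = exact_match_count(text, phrase)
    low, high = (2, 3) if total_words <= short_page_words else (4, 6)
    return low <= count <= high, count

body = "Blue widgets are popular. Our blue widgets ship fast. Buy blue widgets today."
ok, n = within_guideline(body, "blue widgets")
print(n, ok)  # 3 occurrences -> within the 2-3x range for a short page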
Here are a few pointers from a leading SEO community: (Not sure if it is against this forum's rules to mention the exact name, hence I am leaving those details out.)
# Number of Keyword Repetitions - It's impossible to pinpoint the exact, optimal number of times to employ a keyword term/phrase on the page, but this simple rule has served us well for a long time: "2-3X on short pages, 4-6X on longer ones, and never more than makes sense in the context of the copy." The added benefit of another instance of a term is so minuscule that it seems unwise to ever be aggressive with this metric.
# Keyword Density - A complete myth as an algorithmic component, keyword density nonetheless pervades even very sharp SEO minds. While it's true that more usage of a keyword term/phrase can potentially improve targeting/ranking, there's no doubt that keyword density has never been the formula by which this relevance was measured.
# Not surprisingly, a persistent myth in SEO revolves around the concept that keyword density - a mathematical formula that divides the number of instances of a given keyword by the number of words on a page - is used by the search engines for relevancy & ranking calculations and should therefore be a focus of SEO efforts. Despite being proven untrue time and again, this farce has legs, and indeed, many SEO tools feed on the concept that keyword density is an important metric. It's not. Ignore it and use keywords intelligently and with usability in mind. The value from an extra 10 instances of your keyword on the page is far less than earning one good editorial link from a source that doesn't think you're a search spammer.
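Just so we're all talking about the same number: the "keyword density" being debunked in those pointers is nothing more than exact-match instances divided by total words. A throwaway Python snippet of my own to show the arithmetic, not anything the engines use:

def keyword_density(text, phrase):
    """Naive keyword density: exact-match occurrences / total words, as a percentage."""
    words = text.lower().split()
    total = len(words)
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(total - n + 1) if words[i:i + n] == phrase_words)
    return 100.0 * hits / total if total else 0.0

print(round(keyword_density("google google search by google", "google"), 1))  # 60.0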
Keyword density is very much alive and well. The problem is that too many people oversimplify the definition of keyword density, treating it as a simple matter of counting the number of times a keyword appears on a page.
When I write a page of text I first write it as I would speak it. Then I subject it to hours of analysis and editing until I get it to fall into my sweet spot. I will not reveal how or what it is all about, but I have 2 values I aim for -- 3% and 75%. They are 2 different metrics that complement each other, and they are extremely difficult to achieve in balance. Most times I can get the 3% right where I want it but the other metric typically falls between 70%-74%. When that balance is achieved (3-75) magic happens.
I will say this much: the critical density of your keywords is diluted by overuse of "stop words" such as "the, and, you, your, they, their...". Yes, search engines do ignore them, but they cannot compensate for the percentage of weight that those words have gobbled up. If you remove those excessive repetitions, your targeted keyword weight increases naturally.
Analyze any top placing sites and you will see this as a very common denominator. My methods go deeper than that, but a guy has to keep a little bit of secret sauce to himself :p
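To show the arithmetic of what I mean by dilution (a toy example only - the stop-word list and sample text are made up for illustration, and this has nothing to do with my actual numbers or process):

STOP_WORDS = {"the", "and", "you", "your", "they", "their", "a", "of", "to", "is"}

def density(words, keyword):
    """Share of the word list taken up by the keyword, as a percentage."""
    return 100.0 * words.count(keyword) / len(words) if words else 0.0

text = "the widgets that you and your team need are the widgets we sell to you"
words = text.lower().split()
filtered = [w for w in words if w not in STOP_WORDS]

print(round(density(words, "widgets"), 1))     # 13.3 - weight with stop words counted
print(round(density(filtered, "widgets"), 1))  # 25.0 - same two mentions, higher weight once stop words are trimmed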
In my view, keyword density is essentially a tool that webmasters can use as a guide. With a metric like that you can be sure that your text hasn't gone over the top or accidentally gone out the bottom.
But I'm convinced that an actual density metric is not a part of the algorithm used at any major search engine. Pages can rank quite well at almost any level of keyword density, and you've got to serve your audience first and foremost. Otherwise any level of traffic doesn't do you much real good.
The content writer's challenge today is that many semantic measures are in place - far beyond ordinary word stemming or synonym use. For example, co-occurring phrases are important today, whereas in the past they were something to keep minimal.
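For writers who haven't played with this, here's a toy Python sketch of what counting co-occurring terms can look like. The window size and the crude tokenizing are arbitrary choices for the example; real systems are far more sophisticated:

from collections import Counter

def cooccurrences(text, target, window=5):
    """Count terms that appear within `window` words of the target term (toy example)."""
    words = [w.strip(".,").lower() for w in text.split()]
    counts = Counter()
    for i, w in enumerate(words):
        if w == target:
            for neighbor in words[max(0, i - window):i] + words[i + 1:i + 1 + window]:
                if neighbor != target:
                    counts[neighbor] += 1
    return counts

sample = "Ford announced a new car today. The car maker said the Ford model ships next year."
print(cooccurrences(sample, "ford").most_common(3))  # "car" shows up as a frequent neighbor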
Here are a couple of discussions from the Google SEO forum that can get your ideas flowing:
But tedster, what may have been relevant yesterday will change tomorrow. Think of it like this: Google's algorithm is a perpetual work-in-progress, and as webmasters, developers, content writers or what-have-you, we are playing a game of tag with them. Google can always be bested, because you don't have to beat Google; you only have to beat the first-place result that their algorithm is in luv with. Google is only as good (or as bad) as what we challenge them with. It's fun, it's challenging, it's mind games.
I'll go so far as to say the game of tag plays out like this: when someone comes along with a page that raises the high-water mark and the engineers can't figure out what you've done to best them, they will manually suppress your page until their engineers can reverse engineer your efforts, then say "tag, you're it" and let your site out of jail. No, I cannot prove it, but my keen observations over a few years say it's not too far off the mark.
That statement should stir up a hornet's nest of controversy, but hopefully also a healthy debate.
I wouldn't put much stock in keyword density anymore. High-quality content naturally has a high KWD for target keywords, simply because it is on that topic. There is no magic number. If you try to force a higher percentage, you may be doing more harm than good.
Sometimes it is very difficult to reach even 2% keyword density in your content, so my advice is to include your keywords without worrying about hitting 3 to 4%. Just include your keywords, try to make your site more user-friendly, and use all the other on-page optimization techniques.
If keyword density were a ranking factor, and it were possible to have guideline numbers like those above, a very large number of Google rankings would be inexplicable, since they are nowhere near the numbers given as "guides".
- Google's keyword density for "google" is 10%
- Amazon's keyword density for "shopping" is 0.15%
- eBay's keyword density for "auction" is 0.05%
Yes, these are rather extreme examples, but personally, I'm not convinced of the usefulness of using a mathematical metric that displays a poor correlation with rankings.
Worse still, how are you even measuring keyword density? Are you taking into account plurals and spelling variations? Different parts of the same word stem? Or are you using a simple tool that counts words or phrases and divides by the total? Are you taking into account word relationships (e.g. "Ford" is related to "car")? Co-occurrence?
If the way you measure keyword density doesn't take this into account, then you need to start asking serious questions - because Google is certainly using them for textual evaluation.
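To illustrate how much the answer depends on what you decide to count, here's a rough sketch. The variant list is hand-picked for the example; a real tool would use proper stemming and related-term data rather than a hard-coded set:

def count_terms(text, terms):
    """Count how many words in the text match any of the given terms (naive tokenizing)."""
    words = [w.strip(".,?!").lower() for w in text.split()]
    return sum(words.count(t) for t in terms), len(words)

text = "Our car reviews cover new cars, used cars and the Ford range of vehicles."

naive_hits, total = count_terms(text, {"car"})
broad_hits, _ = count_terms(text, {"car", "cars", "vehicles", "ford"})  # hand-picked variants

print(f"exact 'car': {100.0 * naive_hits / total:.1f}%")                # 7.1%
print(f"with variants/related terms: {100.0 * broad_hits / total:.1f}%")  # 35.7%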
Before posting a keyword density percentage, please make sure you've read the links in Ted's post above :)