Forum Moderators: open
Here's my beef. How is it possible for 6 of the top 10 competitors in my category to have ZERO mentions of the keyword on their page or in the meta tags and get such high rankings? If Google is committed to providing users with the most relevant search results, how is it possible that keyword density has become that irrelevant?
Keyword usage and density have always been part of the accepted optimization formula, but now it seems that optimization is taboo and some other mysterious factors determine relevancy.
After obtaining the best keyword density figures from Brett, I went and had a hunt around to see if I could find a correlation.
All I have found is the opposite. For example, one site was listed #1 for a highly competitive keyword (keyword density for that term: 21%). The site in question had approximately 10-15 pages (don't want to be too exact ;) they might be here), all of which were link pages, i.e. links to recipe sites. The content isn't great, and the site has not moved in the whole of the Florida update.
Even worse is the fact that all the results for this term are pretty poor: non-relevant news, etc.
In short, has Google now given a lot less emphasis to keyword density?
I'm sure you all know about this tool to check keyword density
[searchengineworld.com...]
It is interesting to note that all of the top sites, even with the 0 keyword density, are corporate sites. I am operating an affiliate site. If I was a conspiracy theorist I would be suspicious of some sort of affiliate penalty. But that would be restraint of trade. And since, according to Google, relevancy is the primary issue, then an affiliate site with good content (like mine, IMHO) should get just as fair a review as a corporate site.
But does it.......? Only The Shadow knows.
All you can do is go with what once was about right on the assumption that at some point, or to some engine, it will be important again.
Personally, I just make it read well - that seems about right for most engines. It's easy to over-egg it and end up with something that just looks spammy.
TJ
If I produce a Flash guide for, say, building a PC, and it's deemed by other websites to be the best guide available, so it gets linked with all sorts of descriptions, then a good SE should be able to list that resource.
Text on a page is meaningless without a frame of reference. That is what Larry Page and Sergey Brin saw all those years ago, and why off-page factors are more important.
And that is also why Google is obviously trying to stop its frames of reference being skewed by link manipulation (Dominic, Esmerelda, Florida). Still some way to go (unquestionably there are currently plenty of anomalies), but I think the intention of Google is clear.
Too often webmasters assume "content" means "verbiage" obeying some arbitrary rules. Content doesn't (and should never) mean that. For "content" read "service provision". Those services may be informational, technical, commercial, whatever - if your website provides it and lots of people think it provides it well, then it ought to be found by an SE.
Perhaps that is why TJ finds that simply describing his services using natural language seems to work well - it's not the words, it's the service.
Perhaps you're right. I never thought about it like that, but when I first started out building websites, a couple of years ago, I used to examine things like keyword density.
Then when I knew a little more about what I was doing, I realised that it's so unimportant that I could ignore it and just write what I felt looked good and worked best grammatically. I like to make my content easy to read. I am aware of keywords, and I do use them in the text, but I do not deliberately place them. They're in my mind when I'm writing, and they naturally fall onto the page as a result.
I've learned, possibly, all there is to know about SEO, and I've learned it right here. But I think the most important thing I've learned here, and I see this from reading the Florida threads, is that I do not *practice* SEO. I used to *think* that I did, but actually I just follow Brett's "12 month guide...." and the W3C guidelines and do my best to write useful articles and build worthwhile sites.
And they all make it to the top 10 for their targeted phrases and keywords, and most of them to #1 (unless I'm competing with Yahoo!). Eventually. This approach takes longer, I'm sure, but it has more longevity. No movement whatsoever from Florida, and Google referrals going up by about 50%, as testament.
TJ
Too often webmasters assume "content" means "verbiage" obeying some arbitrary rules. Content doesn't (and should never) mean that. For "content" read "service provision". Those services may be informational, technical, commercial, whatever - if your website provides it and lots of people think it provides it well, then it ought to be found by an SE.
Like many terms, "content" is subject to various interpretations. To a person who designs search engines, it obviously means content that can be crawled, a.k.a. "spider food." To a person who's editing a directory like DMOZ or Yahoo!, it might refer to what the site offers as well as what's on its pages.
It just isn't reasonable to think that a spidered search engine like Google can deduce what services a site provides unless it finds tangible indicators on the site's pages. If your site offers widget consulting, then you'd better make sure that "widget consulting" is displayed prominently on your pages, because Google won't be able to figure out that "evaluations and implementation of widget-related marketing strategies" means the same thing.
Sorry for the newbie question, but how do you count keyword density? I had thought it was just for visible text on a page, but the keyword analyzer offered in msg #5 seems to include keywords in on-page links (i.e., anchor text). Should I be counting the text in links? Also, what about title and meta tags?
And while I'm at it, if I have a multiple-word keyword (I guess you'd call it a keyphrase?), how do I count that? Say the keyphrase is 3 words (e.g., soft fuzzy widgets) -- do I count 3 for each instance of the keyphrase or just one? Also, does the denominator of the keyword ratio include the keywords, or does it comprise only the non-keywords?
Also, if someone could expand on the idea presented that proper keyword density is dependent on anchor text I'd be appreciative. Not quite sure what you mean.
Thanks.
GuinnessGuy
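Conventions differ from tool to tool, so none of this is definitive, but a common approach is: count each word of a multi-word keyphrase in the numerator, and include every word (keywords and all) in the denominator. A minimal Python sketch of that convention, counting only the plain text it is given (whether to include anchor text, titles, or meta tags is the tool's choice):

```python
import re

def keyword_density(text, phrase):
    """Keyword density as a percentage, using one common convention:
    (occurrences of the phrase * words in the phrase) / total words.
    Counts only the text passed in; anchor text, title, and meta
    handling vary between tools."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Slide a window of length n over the word list and count matches.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words)

text = "Soft fuzzy widgets are on sale. Our soft fuzzy widgets ship free."
print(round(keyword_density(text, "soft fuzzy widgets"), 1))  # → 50.0
```

Here the 3-word phrase appears twice in a 12-word passage, so the density is 2 * 3 / 12 = 50%. A tool that counted each phrase match as one hit instead would report 2 / 12 ≈ 16.7%, which is why figures from different analyzers rarely agree.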
[thinking]how do I measure 12% with common sense?[/thinking]
[edit]software was WebPosition Gold[/edit]
[edit]never mind
htt*://www.searchengineworld.com/cgi-bin/kwda.cgi
[/edit]
My point is that finding out how other people (possibly millions of them) describe your page can provide a better frame of reference than any on-page text - i.e., the links into a page take precedence over the text on the page, with a check of, say, the title to ensure that the page is in the general context of the search.
No system is perfect, whether on-page or off-page, but focusing more on the inbound links may capture more abstract views of the data not initially thought of when the page was published.
Here's my beef. How is it possible for 6 of the top 10 competitors in my category to have ZERO mentions of the keyword on their page or in the meta tags and get such high rankings?
Did you look at their backlinks, the words/phrases used in their inbounds, if they're in Yahoo! Dir and DMOZ dir, where they rank on other engines and directories, and what categories they're listed in?
We've positioned several sites in the top five for keywords and keyphrases that yield 500,000-1,000,000+ total results - and besides the content - they only have about 3-6 solid keyword-relevant inbounds from Yahoo!, DMOZ, etc. and a couple of affiliates or sister sites.
It almost seems to me that a psychologist should be involved in developing the Google algorithm (and could very well be). Keywords as search terms are the way a searcher expresses his or her interest in a particular subject. Every time Google or any other search engine provides results for a search it has essentially responded, "this is what I thought that you were looking for when you typed in 'theories of the origin of blue widgets.'"
The question is, who is doing a better job getting into the head of the searcher - Google or the Webmaster? By optimizing for Google (and to optimize or not to optimize seems to be the real question at this point), we are assuming that they have the best insights into the mind of the searcher. And since they are the intermediary between the searcher and the Webmaster, we have no choice but to cooperate with G, since they are the arbiter. But if keywords (the searcher's own expression of the information of interest) have become an insignificant component of the algo, then Google must have some fascinating methodology to determine what the searcher really wants.
Are they, in fact, doing that? And who's doing a better job - the responsible SEO/Webmaster or Google? I think the jury is out at this point.
Just read your post. Good insights. I don't mean to over-simplify by focusing exclusively on keywords. My real point is that if you optimize for Google, then you assume that Google has the searcher's intent figured out. If you optimize for the searcher, then you assume that you are more in touch with their thought process than G and hope that at some point G catches up to you. Or, you say "the hell with it, I'm just going to put my site together using my instincts and common sense."
Which begs the question, is there really such a thing as SEO anymore?
I intend to do more research to find out what those other sites have going for them. Your criteria will be very helpful. Unfortunately for many of us, you live by the G or die by the G.