If so, have you noticed that you have a lower or higher overall density than the sites ranking under you?
This may not be the case, but for the set of SERPs I'm watching, I've noticed that sites with a lower density seem to be outperforming sites with a higher density.
Maybe there is a spam filter? Repetition of a certain word equals some sort of ban.
When you use some meta checkers on a site, they always say that repeating a kw more than 3 times is considered spam. Maybe the new Google algo is taking that point more seriously?
As Brett said, I normally don't even look at density and haven't for a long time. But, I was trying to determine why I dropped like a rock for some terms, and held my own for others.
For the terms I dropped on, I feel the links with anchor text I had are no longer counted for whatever reason, and it did impact some of my SERPs. The older, more established sites that had a lot of anchor text links prior to Feb seem to have remained steady. I'm trying to see if it has to do with the density of the targeted anchor text I've been using.
To go a bit further, I've noticed for the terms that dropped, an inset page showed up that had a much lower density. It did not show up to replace the position that was lost, but it did show up.
It may be a bit premature since the dance is not done, but I think what is there is a pretty good indication as to how it will settle.
[edited by: mrguy at 4:25 pm (utc) on June 17, 2003]
I highly doubt Google would go back to using the keyword tag for ranking. :) Actually, the only engine that claims to use them at all anymore is INK.
Do what you think is right; however, IMO, I would not waste much time analyzing meta tags...
I'm seeing newly created pages storm into the top with no links, nothing. They are making it strictly with what is on the page.
I've seen it talked about here and there that perhaps anchor text and links in general have been reduced in value for this go around. I think there is some merit to that.
As all of you are, I'm just trying to get a handle on what I need to do to get my pages back to where they were.
So far, I've noticed pages that were doing well for more than say 3 or 4 terms are no longer doing that.
Yep, but traditional %on%the%page% density? I don't think it does.
>recognize (and discount) unnatural keyword densities.
Run a density checker across say 30,000 top ranked pages on top sites. It will change your mind about "what is density and how is it calculated" in a big hurry. It is all over the map. Some of the top ranked pages have obscene levels of density, right next to some that don't even have the kw in the page text. If density is at work, there are so many qualifiers and exceptions to the rule that there is little way you could determine where, and to what extent, density played a part in the ranking.
I just looked at a popular keyword today due to this discussion. The #1 page has a KWD of 16%; the #2 has 1.6%.
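For anyone wanting to check their own pages the same way, here is a minimal sketch of what the density checkers being discussed compute: occurrences of the keyword divided by total words, as a percentage. The tokenizing regex and the sample text are my own assumptions; real checkers differ on what counts as a "word."

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# illustrative sample, not from any real page
sample = "Widgets for sale. Our widgets are the best widgets around."
print(round(keyword_density(sample, "widgets"), 1))  # 3 of 10 words -> 30.0
```

Tools that count phrases, stemmed variants, or tag text (title, alt, anchors) separately will report different numbers for the same page, which is one reason reported densities are "all over the map."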
RFDgxm1 made a good point. I do believe that a somewhat higher keyword density helps an awful lot with INK. Of course, this will only matter if Yahoo starts using INK results in a meaningful way. If that day does come, however, the SEO world will change quite rapidly.
I am thinking of changing my title tag to "Widget X | Y". "X and Y Widgets" might sound better, but Google's algo (sigh) doesn't recognize "widgets" (plural) as relevant to the queries for my content, which are primarily singular.
I will let you all know whether I jump back up into a top spot once a new crawl picks up my revised title tags without the repeated keyword in title and anchor text.
It is my perception, based on www-fi SERPs, that repeating a keyword in the title or in anchor text leads to a downgrade in placement.
I had one page on a content site get booted with this last update. The only thing I can see wrong with it was that it had a very high keyword density. Some of the other pages in the site went up or down a bit in the SERPs, but none of the other ones lost 100 positions like this one did. KW density is the only difference I can see between this page and the rest, so I suspect it tripped some kind of new spam filter.
>Run a density checker across say 30,000 top ranked pages on top sites. It will change your mind about "what is density and how is it calculated" in a big hurry. It is all over the map.
I should have said "keyword usage patterns" (in the context of natural language patterns) rather than "keyword density."
Also, Google could have a cutoff point beyond which additional keyword density wouldn't be counted. This wouldn't involve a penalty; it would simply prevent blatant keyword repetition from boosting a page's placement for a given keyword or keyphrase.
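That cutoff idea can be sketched in a couple of lines. This is purely speculative, not anything Google has confirmed; the cap value is an arbitrary assumption for illustration:

```python
def saturated_score(density_pct: float, cap: float = 5.0) -> float:
    """Credit keyword density only up to `cap` percent. Repetition beyond
    the cap is simply ignored: no penalty, just no extra boost."""
    return min(density_pct, cap)

print(saturated_score(1.6))   # below the cap, counted in full: 1.6
print(saturated_score(16.0))  # everything past the cap is wasted: 5.0
```

Under a scheme like this, the 16% page and a 5% page would score identically on density, which would fit the observation that heavily stuffed pages sit right next to lightly optimized ones.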
Google probably has multicondition code like the credit card companies. Every time I buy gas and then make a big purchase, the credit card company calls me to make sure it was really me using my card. Buying gas alone or making a big purchase alone doesn't trigger the call, but the combination does. It seems thieves like to test cards out at a small place where they can make a quick exit, and if they get away with it, move on to a bigger purchase right after that.
So the Google folks probably have a large number of conditions they check for, but I do suspect having a keyword density too high is at least one of their negative ranking point criteria.
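The gas-then-big-purchase idea translates into code pretty directly: signals that are harmless on their own combine to trip a flag. Everything here, the signal names, thresholds, and the two-signal rule, is a hypothetical illustration of the combo logic, not a claim about Google's actual filters:

```python
def looks_spammy(density_pct: float, kw_in_title: int, kw_in_anchors: int) -> bool:
    """Toy multicondition check: no single signal is decisive,
    but two or more firing together raise the spam flag."""
    signals = [
        density_pct > 15.0,    # assumed threshold, purely illustrative
        kw_in_title > 2,       # keyword repeated in the title
        kw_in_anchors > 50,    # keyword repeated across inbound anchors
    ]
    return sum(signals) >= 2

print(looks_spammy(16.0, 1, 10))  # one signal alone -> False
print(looks_spammy(16.0, 3, 10))  # density + title combo -> True
```

This would explain why a high-density page can still rank #1: on its own, density is the "buying gas" half of the pattern, suspicious only in combination with something else.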
Anyway, in all honesty it probably wasn't one of my better pages and deserved to get the boot, but it is too bad because it was selling a lot of stuff compared to the time it took to put it up. I've changed the pages around a bit so I suspect the next time it gets indexed it will do much better in the SERPs.