| 8:41 pm on Aug 1, 2006 (gmt 0)|
|In conclusion, i believe the results are now dire due to the onpage density knob being turned so low that the serps relevance has been affected by it. This started about two/three updates back. |
In April, I did a KW density comparison on the top 10 sites and found a pattern ranging from the #1 site (approx 3.9%) to lower KW densities. After May, there was NO pattern, so you may be correct. I think I'll do another comparison check.
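For anyone wanting to repeat that kind of comparison, a keyword-density count can be sketched in a few lines. This is a rough illustration only - the sample text and keyword are made up, and a real check would run over the fetched page text:

```python
import re

def keyword_density(text, keyword):
    """Percentage of the words in `text` accounted for by `keyword`.

    For a multi-word phrase, each occurrence counts as
    len(phrase words) words, one common convention.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw_words = keyword.lower().split()
    n = len(kw_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == kw_words)
    return 100.0 * hits * n / len(words)

page = ("Blue widgets are popular. We stock blue widgets "
        "in many sizes. Browse our widget range today.")
print(round(keyword_density(page, "blue widgets"), 1))  # → 25.0
```

Note that tools differ on whether a phrase occurrence counts as one word or as the phrase length, so published "density" figures aren't always comparable.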
| 9:24 pm on Aug 1, 2006 (gmt 0)|
One of my sites on a US server (not my own server, I hasten to add) was hacked a few days ago and the index page was replaced with a political extremist's rant. Before I could put it right the site was spidered by Y! and now ranks on the first page for an important keyphrase, despite containing just that mindless rant, whereas before the hack it was nowhere. Honestly, you couldn't make it up.
| 9:45 pm on Aug 1, 2006 (gmt 0)|
yep that sounds about right.
At a guess, your URL was fine and the on-page content obviously had nothing to do with the keywords; the rant probably contained one of the words, and that half-percent density was enough to rank you well.
Your previous dedicated page, meanwhile, won't rank because it gave too much information for the search string and had pushed your density too high.
We have one site where various non-relevant pages rank well in the Yahoo serps, and the reason is that these pages contain a footer with some links to the dedicated pages; those links are enough to give the page just enough keyword density to rank. Meanwhile the dedicated pages no longer get a look in!
| 10:15 am on Aug 3, 2006 (gmt 0)|
I have found that a number of my sites are high in the Y! UK serps for the strangest search terms. These terms appear in outgoing link text only. None of them at all appear in the heading, description, body text or incoming links. Not only weak page density, but no page density at all!
| 2:41 pm on Aug 3, 2006 (gmt 0)|
|but no page density at all |
aha... you notice it too? I've been saying it for so long now. A couple of the top 10 or top 20 sites have absolutely NO relevance and not even a single keyword!
| 5:20 pm on Aug 3, 2006 (gmt 0)|
That's what Yahoo has done, hence why it's hopeless.
If you want to rank for, say, "Blue Widgets", you need a page with some text on it and the keyword "Blue Widgets" appearing a maximum of once, if at all, on the page.
Have a page title tag like "Selection of Blue and sometimes other widgets"
Don't include the keywords together - that's certain death. Just put a few sentences together like "Today you will often find something blue available on this site", yada yada, then further down mention the word widgets, again on its own, in another paragraph of text.
In all, mention the word "Blue" about 9 times (no more than about 3.5% density), and overall you want zero density, or something minimal like 0.1% density, for "Blue Widgets" as the two words together.
Doing this will ensure you start to rank for the "Blue Widgets" keywords.
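The numbers above can be turned into a simple pre-publish check. Purely a sketch of the rules as stated in this post (a ~3.5% single-word ceiling, ~0.1% for the phrase) - these are one poster's observations, not anything Yahoo has documented, and the function names are made up:

```python
import re

# Limits taken from the post above - observed behaviour, not
# anything Yahoo has published.
MAX_WORD_DENSITY = 3.5    # e.g. "blue" on its own
MAX_PHRASE_DENSITY = 0.1  # e.g. "blue widgets" together

def density(text, phrase):
    """Percent of the page's words used by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == target)
    return 100.0 * hits * n / len(words) if words else 0.0

def check_page(text, word, phrase):
    """True if the page stays under both claimed density ceilings."""
    ok_word = density(text, word) <= MAX_WORD_DENSITY
    ok_phrase = density(text, phrase) <= MAX_PHRASE_DENSITY
    return ok_word and ok_phrase
```

For example, a 300-word page mentioning "blue" nine times and never using the exact phrase passes, while a page built around "blue widgets" fails the phrase ceiling.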
Yahoo's current policy must be that if you have any SEO on the pages it must be spam, yet the reality is that an experienced webmaster has established pages that say what they are for the end user. I call a spade a spade. If the page is about "Blue Widgets" I say so. The title tag includes it, the description includes it, H1, H2 on page, etc.
I've seen this loads of times recently with the Yahoo serps. The laugh of it is that on a number of the sites we work on, the pages that rank have thin, off-topic content with low density as mentioned above, whilst the quality pages all about the subject have dropped in the serps because they are correctly marked up with headings stating what the page is about, mention the keywords (after all, they are pages about the keywords!) and have high density as a result!
Meanwhile the spammers clean up, because the idiots at Yahoo think low density removes spam when in fact all it does is drop good sites that have dedicated pages about the subject whilst spam sites rise nicely to the top. The serps end up next to useless, hence why Yahoo's revenue is in decline.
Half the spammers steal content from other sites anyway and hash it together on one page, so they often have low-density pages, and the Yahoo serps feature loads of spam sites and junk - it's no wonder they are losing money.
| 11:03 pm on Aug 5, 2006 (gmt 0)|
|Meanwhile the spammers clean up because the idiots at Yahoo think the low density removes spam when in fact all it does is drop good sites that have dedicated pages about the subject whilst spam sites rise nicely to the top. The serps end up next to useless hence why Yahoos revenue is in decline |
This is always a frustration for me. It often seems that when a page is finished being written and I then look at the keyword density, it is way too high, and I have to find ways to get the keyword out of the page in question. Once I was working on a site where the keywords were in the company's name and the name of the dealership for the company. Every sentence or two seemed to have the keywords for one reason or another, and it was brutal to rewrite the page so that it was compelling for the reader while not using the key phrase as often.
I find the whole way SEs handle SEO to be evil. If I don't do SEO and create things naturally, they come out looking to the SEs like they have been over-optimized, so I have to waste valuable time doing SEO so that my sites don't look SEO'd. This is just perverse. If I wasn't so worried about getting banned for over-optimization, I wouldn't even pay attention to most SEO details.
| 1:26 pm on Aug 6, 2006 (gmt 0)|
That's exactly my point too.
I'm writing a section about blue widgets for my site users. I start with my title and heading "Blue Widgets", and the page is /blue-widgets.html as it's dedicated to the subject. It's all hand-coded, etc., and I include some links to, say, "greyish blue widgets", "pinky blue widgets", etc. that I think will be relevant to visitors who may need information on something closely related.
When I've finished, like you I look at the keyword density and think: oh no, look how high that is, I need to water this down for the search engines (not the end user!). Do I change my outgoing links to just "Pinky" instead of "Pinky blue widgets"? But then it doesn't look right or clear enough to my users. Do I perhaps have fewer links about the keyword going out? But again, my users need as much available information as I can give them. Do I try to water down my use of the keywords in the text? But then that can read poorly.
The point is you start venturing into SEO rather than giving your users what they want, just because search engines the likes of Yahoo decide they want low density to rank you!
The commercial spammers with zillions of sites clean up on Yahoo because they have a range of spammy pages on different sites at different density levels, so one of them always ranks.
I'm still amazed at the number of directory sites that rank well when all they do is scrape paragraphs of text, with bits of keywords in, from other sites to hash together a page that fits Yahoo's search criteria. Meanwhile, quality sites that write content for their users get stuffed by Yahoo for keyword over-use - you couldn't make it up!
| 1:45 pm on Aug 6, 2006 (gmt 0)|
The problem with designing your site to suit the SEs is that the algo could change completely overnight. I sometimes wonder if the SEs do this deliberately just to 'shake up' the SERPs a bit - the same sites at the top, month after month may be a bit boring for some people, and how do you really decide which sites deserve to be number one out of the millions out there? Best to design your site to suit your customers IMO, there's not much point getting the hits if you don't convert those hits to cash. The fact is, the spammers aren't going to go away and if they have to produce thousands, millions or ultimately billions of sites that's what they'll do, as long as they're making a profit from them. Running a search engine is gonna get harder and harder in future and those that stand still will go the way of AV and all the rest from just a few years back.
| 1:59 pm on Aug 6, 2006 (gmt 0)|
|The problem with designing your site to suit the SEs is that the algo could change completely overnight. I sometimes wonder if the SEs do this deliberately just to 'shake up' the SERPs a bit - |
For the most part I try to ignore the SEs and this has served me very well for the better part of a decade. The problem is that on July 27th my site got delisted and I had to search hard to figure out why. I don't know the real reason, but I did make changes that I thought could be contributing factors.
In other words, I did SEO so that my site wouldn't be banned for things that could potentially be flagged as over-optimization. Basically, I was forced into doing SEO so that I wouldn't be accused of doing SEO. This is just perverted.
| 2:19 pm on Aug 6, 2006 (gmt 0)|
This is the problem with the virtual monopoly that a small number of SEs have got and we are forced to consider changing our sites to suit their arbitrary rules. Dunno if this situation will remain for much longer though, there are a lot of newcomers pushing for a place in the sun and things change quickly on the web. It's not so long ago I was optimising a site for Lycos! I don't expect Y! or anyone else to solve the spammer problem but if they don't at least control it, and improve their results considerably, the future for them looks interesting to say the least. I confess that the logic behind the current algo leaves me very puzzled.