Forum Moderators: mack
This is a bad thing to me because it devalues content. Their task should be to index the world's content/information, not the world's blurbs and doorways. Simple pages designed to sell something are outranking articles and informative pages. Why?
Of course this experiment comes at a price: reducing the content within a document just to rank better on MSN means ranking on fewer longtail phrases in Google (since Google actually indexes and ranks pages based on content), thus I am seeing a sharp drop in traffic overall (down 50% from the previous 3 Sundays). This drop also implies that despite having wonderful rankings in MSN, their traffic levels are simply not worth the time and energy right now. Time to revert the page back to my pre-experiment content.
You'll need more than a few KW's in your Meta Tags.
The density there is also important.
If you only put a few KW's in Meta Tags without any other wording, your KW density will still be too high.
It'll take some thought, but repeat no word more than 2 times.
Write something generic...Most respected (KW) breeder in (your city) with (years) in business. (KW) breeds are the most popular...etc.
Suggestion: KW density in Meta Tags (a KW used twice should = 8% density; words used once should be 4%).
Put your main KW's up front close to the beginning, put lesser KW's toward the end.
I'm sure one could go to 10% & 5% BUT...IMO being conservative will be protection against another Algo spam update.
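To put rough numbers on that suggestion: a keyword used twice at 8% density implies a meta description of about 25 words. Here is a quick sketch for checking your own density (the sample text, keyword, and the counting convention are illustrative; real density tools differ on how they handle punctuation and multi-word phrases):

```python
def keyword_density(text, phrase):
    """Percent of words in `text` taken up by occurrences of `phrase`.

    One common way to count density: (hits * words-in-phrase) / total words.
    """
    words = text.lower().split()
    target = phrase.lower().split()
    n, total = len(target), len(words)
    if total == 0 or n == 0:
        return 0.0
    # Slide a window of len(target) words across the text and count matches.
    hits = sum(1 for i in range(total - n + 1) if words[i:i + n] == target)
    return hits * n / total * 100

# Hypothetical 25-word meta description with the keyword used twice:
desc = ("Most respected poodle breeder in Austin with 20 years in business. "
        "Poodle breeds are the most popular companion dogs for families "
        "seeking loyal gentle pets")
print(round(keyword_density(desc, "poodle"), 1))  # 8.0  (2 of 25 words)
```

At 25 words, each extra use of a one-word keyword adds 4% density, which is where the 8%-for-twice and 4%-for-once figures line up.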
You can test a few pages both ways.
All I can say is this has worked well for our sites with no further penalties.
By the way...Yahoo likes low KW's density also.
Good Luck, Fish Texas
Anyway, it's a very informative article, complete with a "how to do it," "how to interpret results," and a detailed example the user can reference. It's taking a concept taught in college and telling someone how to do it in the working world.
What is described in the article is something I do for a living - and it's a very desirable skill. I stand ZERO chance of getting any referrals from MSN. I expect first page rankings everywhere else.
BillyS...if you are getting good traffic elsewhere, I wouldn't touch a thing. You might consider developing another page with a condensed version of your article...MSN might pick it up.
Maherphil...yep, it works.
Fish Texas
You'll need more than a few KW's in your Meta Tags.
The density there is also important.
If you only put a few KW's in Meta Tags without any other wording, your KW density will still be too high.
I don't believe meta tags have anything to do with what I am talking about. All other things being equal, reducing article length (document size) from 22k down to 5k caused the page's rank for a competitive phrase to move from page 5 to page 1. I have been observing this for a year now in MSN. Meta tags were unchanged, template unchanged, images unchanged. Just on-page text, paragraphs, etc. were deleted.
Most of the pages ranking in the top 20 have little vertical scrolling (very small pages). This is affecting ranks.
I expanded this test to another site of mine, and boom. The same thing. It went from position 41 to 12 for the competitive key phrase.
Anyway, it's a very informative article, complete with a "how to do it," "how to interpret results," and a detailed example the user can reference. It's taking a concept taught in college and telling someone how to do it in the working world...I stand ZERO chance of getting any referrals from MSN. I expect first page rankings everywhere else.
And if you reduced it down to a one-paragraph summary (removing all of the useful information), you may rank much, much higher for the important phrases (especially those contained in any anchor text pointing to the page). Fortunately, you leave the document unchanged because it is valuable to your visitors; unfortunately, MSN doesn't value the content.
I think MSN's reasoning is that small pages score better on usability. But some of the most informative documents out there are long, hard on the eyes, and require lots of scrolling...I still want them to be listed if I am looking for them, not suppressed in favor of small, uninformative blurbs and blogspots.
It is 2007, but there's no reason to be sad that keyword density is a big factor in some search engines. Keyword density indicates what an article is about. An article that is really about purple widgets will have a higher KWD, and all other things being equal, probably should be at the top of the search results when you type in purple widgets.
The problem is that people manipulate the keywords to spam a search engine. Just like they manipulate whatever they think Google likes (links and more links). I personally am sad that my guest book and a forum on one of my sites became over-run with links to sites about viagra and mortgage deals - and since I didn't have hours a day to police them, I decided to close them. Most spammers were too lazy or stupid to determine if there is a no-follow on those sites. Would tens of thousands of previously useful sites have been vandalized in this way if not for Google and their algorithm? Doubt it.
A heavy reliance on links is just as bad as a heavy reliance on keywords - except that weakness ultimately favors big corporations and professional spammers. I craft a 1000 or 2000 word article on a topic where I have expertise, include valuable original content, references, links to other valuable information, and get beat in Google by scores of useless articles that mention the key word once in passing. But some are from international news corporations or ecommerce sites with a budget to do obnoxious SEO ... they have page rank, they have links from their megasite or from black-hat bots. And they mention purple widgets once on the page, so that must be what people are looking for. Except it isn't.
All machine search engine algorithms shuck. They shuck in different ways, and some shuck worse than others. But they are all utterly devoid of the intelligence that we expect from a reasonably bright human being. It's 2007, and no machine search engine can reliably distinguish between shirt and shineola.
[edited by: nonni at 1:26 am (utc) on Feb. 13, 2007]
I disagree that short-short is better on MSN. I get more traffic from MSN than Google, and most is from moderate to lengthy, well written pages that deserve to be on the first page of results. These are pages written to really inform the audience - I research, analyze, have photos and illustrations, and take time to write professionally.
There are many of us who write professionally. Unfortunately, in my industry, I have observed that very short content pages outrank the longer, more informative ones (on competitive phrases; the same may not be true for more obscure ones).
That is very interesting because my site's author is a well researched expert in the area. Maybe MSN is on to something?
Not sure what you mean, but an algorithm has no way of knowing whether something is written by a Ph.D., a high school student, an "expert" or a novice. Granted, things like grammar and spelling can be indicators of quality, but I just don't think they should assign value based on document length, which is what I seem to be observing in some cases. Again, I had two test cases with the exact same result. I think that says something, though it doesn't complete the picture.
Now that's an old approach...
I'm with Simey on this one...no way in hell would I chop a page in half to please MSN. I might do other things, but it would be much more subtle, and I sure as heck wouldn't waste my time with a KW density analyzer. I quit using those over five years ago and have thousands of top tens.
...an algorithm has no way of knowing whether something is written by a Ph.D., a high school student, an "expert" or a novice.
It should be possible to determine, without being given explicit meta keywords: what an article is about (and thus to classify it by subject), whether it is in scholarly or informal or opinion style (giving some indication of its likely merits), and whether it is well written in terms of spelling, grammar, and usage (giving some indication of the author's educational level and/or writing expertise).
The few factors that Google has publicly acknowledged are considered by its algorithm, and the additional factors which outside people speculate might be considered by both the Google and MS algos are remarkably simple and simplistic compared to what could technically be done.
At various times, both Microsoft and Google have been reputed to have active natural language research departments, which one would expect to be studying this area. It is puzzling that there isn't overwhelming evidence of them putting the results of such research to practical use as soon as they can. Maybe they're both waiting until they've "got it nailed" before they start implementing it, but the technical know-how to perform basic analysis of this type has been around for years. Or maybe it's so computationally intensive that they can't spare the time it takes to process a page that way.
One thing I haven't seen speculated about much so far is the possibility that they're intentionally introducing random elements to confound those who would guess their algos and explicitly attempt to exploit them.
[edited by: SteveWh at 11:33 am (utc) on Feb. 13, 2007]
I'm with Simey on this one...no way in hell would I chop a page in half to please MSN.
Under normal circumstances I would agree. I had my reasons for changing the content on the page, and in the interim it was reduced to just 5k as I stated. The temporary content reduction gave me an opportunity to see what would happen, since I had already been observing small pages outranking longer documents. During the 2 to 3 day period when the homepage content on two of my sites was being changed, MSN responded the same way to both. The content, though changed/rewritten, has since been added back to my pages, so I expect to see a resulting drop in MSN rankings on the next crawl.
The experiment really happened because of other circumstances, but it was still interesting to see. As has been expressed above, it is not recommended that you make sweeping changes JUST to please MSN. MSN simply doesn't drive enough traffic to justify the loss of indexing on longtail phrases at Google.
I try to break up a topic where I can, but it's not always possible. And personally, I have no intention of breaking up these pages for MSN. If I thought they were providing good results I might consider creating smaller pages as suggested - but there is no way I'm changing a thing based on the results I see.
At various times, both Microsoft and Google have been reputed to have active natural language research departments, which one would expect to be studying this area.
Yes, this is what I'm getting at. An algo can see HOW a subject is discussed. What are the words used to describe the topic, how are they inter-related.
A lawyer would talk differently to other lawyers (using their jargon) than one would talk to a client.
Natural language analysis combined with statistical analysis of, as you said, grammar and spelling, plus vocabulary, should have the potential of making a pretty good guess in that regard about any article, given a pool of similar articles written at various skill levels.
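To illustrate the idea: even very crude surface features can separate specialist jargon from casual sales copy. This is a toy sketch, not how MSN or Google actually work, and the sample texts and thresholds are entirely made up; real engines, if they do this at all, would use far more sophisticated statistical models:

```python
import re

def style_profile(text):
    """Crude surface-level style fingerprint: average sentence length
    (in words) and type-token ratio (distinct words / total words).
    Longer sentences loosely correlate with formal or specialist prose.
    Note the type-token ratio is length-sensitive, so it is only
    meaningful when comparing texts of similar length."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-zA-Z']+", text.lower())
    avg_len = len(words) / len(sentences) if sentences else 0.0
    richness = len(set(words)) / len(words) if words else 0.0
    return {"avg_sentence_len": round(avg_len, 1),
            "vocab_richness": round(richness, 2)}

lawyer = ("The party of the first part shall indemnify the party of the "
          "second part against all claims arising from the aforementioned "
          "covenant, notwithstanding any prior waiver thereof.")
casual = "Hey! Nice dog. Want to buy one? Call me. Cheap pups here."

print(style_profile(lawyer))  # {'avg_sentence_len': 27.0, 'vocab_richness': 0.74}
print(style_profile(casual))  # {'avg_sentence_len': 2.4, 'vocab_richness': 1.0}
```

Given a pool of documents already graded by skill level, features like these could feed a simple classifier; whether any engine bothered to in 2007 is exactly the open question in this thread.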
An algo can see HOW a subject is discussed
Natural language analysis is a great concept. But if the algorithm ranks a page simply because it is smaller (perhaps making it "user friendly"), then the language hasn't truly been analyzed (has it?); there may not be any substance to analyze on extremely small pages (blurbs, sales copy, etc.). I am not a statistician, nor do I purport to know anything about language analysis. I am simply trying to understand why preference may be given to smaller documents, all other things being equal. Again, this has been my observation over the past year and I have seen it work when tested. Was the test worth it? No.
if the algorithm ranks a page simply because it is smaller (perhaps making it "user friendly"), then the language hasn't truly been analyzed (has it?);
I am simply trying to understand why preference may be given to smaller documents, all other things being equal.
Again, this has been my observation over the past year and I have seen it work when tested.
The fact that an explicit test brought that result is most interesting.
I admit to not having followed closely discussions about what factors are used in G or MS algorithms, but something I do seem to detect -- can it be? -- is that people now seem to care more about the MS algo and how they rank in Live Search than they did even a few months ago.