
Bing Search Engine News Forum

Small pages rank best
Not a good thing
crobb305
msg:3249581
11:52 pm on Feb 11, 2007 (gmt 0)

Has anyone else noticed that very small pages rank highest in MSN? This could explain why doorways and blogspots rank so well. To test this, I reduced the text on my homepage to just a paragraph or so, with links to my internal pages. The page shrank from 22k to about 5k. Within 2 days, it went from page 5 to page 1 for a competitive phrase.

This is a bad thing in my view because it devalues content. Their task should be to index the world's content and information, not the world's blurbs and doorways. Simple pages designed to sell something are outranking articles and informative pages. Why?

Of course this experiment comes at a price: reducing the content in a document just to rank better on MSN means ranking for fewer longtail phrases in Google (since Google actually indexes and ranks pages based on content), so I am seeing a sharp drop in traffic overall (down 50% from the previous 3 Sundays). The drop also shows that, despite wonderful rankings in MSN, their traffic levels are simply not worth the time and energy right now. Time to revert the page to my pre-experiment content.

 

BillyS
msg:3249678
2:50 am on Feb 12, 2007 (gmt 0)

I've seen this too. MSN does not seem to care about the number of words on a page. I saw an "About" page rank very high for a competitive phrase - all the page had was about two paragraphs and the phrase was mentioned once.

crobb305
msg:3249693
3:34 am on Feb 12, 2007 (gmt 0)

It is just sad that a reasonably sized document (150 to 200 words) about a certain topic would get knocked down 50 or more spots in the SERPs, while some 3-sentence blurb about a product gets the top 10 spots. The purpose of a search engine is to help people find information, right? Penalizing pages because they have "too many" words (or rewarding pages with very few) is just ridiculous. MSN is never going to get it right [webmasterworld.com], are they?

Fish_Texas
msg:3249715
4:28 am on Feb 12, 2007 (gmt 0)

It's not the number of words IMO, it's the number of keywords... density.
I went from top 10 down to 50 last Aug (penalty). I lowered on-page KW density to 3% and cut KWs in the description meta tag to only 2 uses per KW.
Bingo... all pages back up in the top 10 in September 06.
It might be that smaller pages naturally have lower KW density.
Hope this helps.
Fish Texas
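
(A minimal sketch of the density calculation Fish_Texas is describing, assuming density means keyword occurrences divided by total words, times 100 - the function name, sample text, and the 3% target are illustrative, not anything MSN published:)

    import re

    def keyword_density(text, keyword):
        # Fraction of the words on a page that are the keyword, as a percent.
        # Handles single-word keywords only; phrases would need n-gram counts.
        words = re.findall(r"[a-z0-9']+", text.lower())
        if not words:
            return 0.0
        hits = sum(1 for w in words if w == keyword.lower())
        return 100.0 * hits / len(words)

    page_text = "Purple widgets are popular. Our purple widgets ship fast."
    print(round(keyword_density(page_text, "widgets"), 1))  # 22.2 - Fish_Texas aims for ~3% on a real page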

crobb305
msg:3249731
5:16 am on Feb 12, 2007 (gmt 0)

I suppose it does come down to keyword density. So well-written, informative articles will only rank well if the writer deliberately increases keyword density. MSN needs to advance its search algorithm a bit. If I am writing about a particular breed of dog, I'd better make sure I mention that breed every few sentences and be as redundant as possible - or cut out all the information, just put "This [breed] is a great breed... The End," and watch it rank #1.

Shurik
msg:3249775
6:02 am on Feb 12, 2007 (gmt 0)

You can also register a keyworded domain with MSNBot disallowed in robots.txt.
You can rank well for that keyword with just a few links.
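
(For reference, the block Shurik describes would look like this in robots.txt - msnbot was the user-agent of MSN's crawler at the time:)

    User-agent: msnbot
    Disallow: /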

Fish_Texas
msg:3250212
4:53 pm on Feb 12, 2007 (gmt 0)

Crobb305... for the MSN algo:

You'll need more than a few KWs in your meta tags.
The density there is also important.
If you only put a few KWs in the meta tags without any other wording, your KW density will still be too high.

It'll take some thought, but repeat no word more than 2 times.
Write something generic... "Most respected (KW) breeder in (your city) with (years) in business. (KW) breeds are the most popular..." etc.

Suggestion: for KW density in the meta tags, a KW used twice should come to about 8% density; a word used once, about 4%.
Put your main KWs up front, close to the beginning, and lesser KWs toward the end.

I'm sure one could go to 10% and 5%, BUT... IMO being conservative is protection against another algo spam update.
You can test a few pages both ways.

All I can say is this has worked well for our sites, with no further penalties.
By the way... Yahoo likes low KW density also.
Good Luck, Fish Texas
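
(To make that arithmetic concrete, assuming density means occurrences divided by total words: a KW used twice at 8% density implies a tag of 2 / 0.08 = 25 words, and one use in those 25 words is 1/25 = 4%. So the suggestion works out to a meta tag of roughly 25 words.)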

Fish_Texas
msg:3250218
4:57 pm on Feb 12, 2007 (gmt 0)

The post and suggestion above are for meta tags only...
Overall PAGE KW density should be around 3%.
Fish Texas

iThink
msg:3250280
5:44 pm on Feb 12, 2007 (gmt 0)

This is 2007, and stuff like keyword density in meta tags can still be used to manipulate rankings in MSN. Kinda sad, isn't it?

maherphil
msg:3250281
5:45 pm on Feb 12, 2007 (gmt 0)

It's not the number of words IMO, it's the number of keywords... density.

Yes, I would agree with that as well. I've got a site that does amazingly well in MSN and has very long on-topic pages, but the overall keyword density is low.

BillyS
msg:3250294
5:54 pm on Feb 12, 2007 (gmt 0)

I just got done putting together a 1,400-word article on a very specific topic. There was no way to avoid repeating the keyphrase - that's just how it has to be written (trust me...).

Anyway, it's a very informative article, complete with a "how to do it," "how to interpret results," and a detailed example the user can reference. It's taking a concept taught in college and telling someone how to do it in the working world.

What is described in the article is something I do for a living - and it's a very desirable skill. I stand ZERO chance of getting any referrals from MSN. I expect first page rankings everywhere else.

Fish_Texas
msg:3250357
6:51 pm on Feb 12, 2007 (gmt 0)

iThink... sad but real. All we can do is give the algo what it wants. After all, we didn't write it.

BillyS... if you are getting good traffic elsewhere, I wouldn't touch a thing. You might consider developing another page with a condensed version of your article... MSN might pick it up.

Maherphil... yep, it works.

Fish Texas

crobb305
msg:3250602
11:02 pm on Feb 12, 2007 (gmt 0)

You'll need more than a few KWs in your meta tags.
The density there is also important.
If you only put a few KWs in the meta tags without any other wording, your KW density will still be too high.

I don't believe meta tags have anything to do with what I am talking about. All other things being equal, reducing article length (document size) from 22k down to 5k moved the page from page 5 to page 1 for a competitive phrase. I have been observing this for a year now in MSN. Meta tags were unchanged, template unchanged, images unchanged. Only on-page text - paragraphs, etc. - was deleted.

Most of the pages ranking in the top 20 require little vertical scrolling (very small pages). This is affecting rankings.

I expanded this test to another site of mine, and boom - the same thing. It went from position 41 to 12 for the competitive keyphrase.

Anyway, it's a very informative article, complete with a "how to do it," "how to interpret results," and a detailed example the user can reference. It's taking a concept taught in college and telling someone how to do it in the working world...I stand ZERO chance of getting any referrals from MSN. I expect first page rankings everywhere else

And if you reduced it to a one-paragraph summary (removing all of the useful information), you might rank much, much higher for the important phrases (especially those contained in any anchor text pointing to the page). Fortunately, you'll leave the document unchanged because it is valuable to your visitors; unfortunately, MSN doesn't value the content.

I think MSN's reasoning is that small pages score better on usability. But some of the most informative documents out there are long, hard on the eyes, and require lots of scrolling... I still want them listed when I am looking for them, not suppressed in favor of small, uninformative blurbs and blogspots.

nonni
msg:3250726
1:21 am on Feb 13, 2007 (gmt 0)

I disagree that short-short is better on MSN. I get more traffic from MSN than Google, and most of it goes to moderate-to-lengthy, well-written pages that deserve to be on the first page of results. These are pages written to really inform the audience - I research, analyze, add photos and illustrations, and take the time to write professionally.

It is 2007, but there's no reason to be sad that keyword density is a big factor in some search engines. Keyword density indicates what an article is about. An article that is really about purple widgets will have a higher KWD and, all other things being equal, probably should be at the top of the search results when you type in purple widgets.

The problem is that people manipulate keywords to spam a search engine, just like they manipulate whatever they think Google likes (links and more links). I personally am sad that my guest book and a forum on one of my sites became overrun with links to sites about viagra and mortgage deals - and since I didn't have hours a day to police them, I decided to close them. Most spammers were too lazy or stupid to check whether those pages used nofollow. Would tens of thousands of previously useful sites have been vandalized this way if not for Google and its algorithm? I doubt it.

A heavy reliance on links is just as bad as a heavy reliance on keywords - except that weakness ultimately favors big corporations and professional spammers. I craft a 1,000- or 2,000-word article on a topic where I have expertise, include valuable original content, references, and links to other valuable information, and get beaten in Google by scores of useless articles that mention the keyword once in passing. But some are from international news corporations or ecommerce sites with a budget for obnoxious SEO... they have PageRank, they have links from their megasite or from black-hat bots. And they mention purple widgets once on the page, so that must be what people are looking for. Except it isn't.

All machine search engine algorithms shuck. They shuck in different ways, and some shuck worse than others. But they are all utterly devoid of the intelligence we expect from a reasonably bright human being. It's 2007, and no machine search engine can reliably distinguish shirt from shineola.

[edited by: nonni at 1:26 am (utc) on Feb. 13, 2007]

Fish_Texas
msg:3250730
1:23 am on Feb 13, 2007 (gmt 0)

Crobb305...
I understand your situation, good luck...
Fish Texas

crobb305
msg:3250790
2:29 am on Feb 13, 2007 (gmt 0)

I disagree that short-short is better on MSN. I get more traffic from MSN than Google, and most of it goes to moderate-to-lengthy, well-written pages that deserve to be on the first page of results. These are pages written to really inform the audience - I research, analyze, add photos and illustrations, and take the time to write professionally.

There are many of us who write professionally. Unfortunately, in my industry, I have observed that very short content pages outrank the longer, more informative ones (on competitive phrases; the same may not be true for more obscure ones).

maherphil
msg:3250802
2:44 am on Feb 13, 2007 (gmt 0)

There are many of us who write professionally.

That is very interesting, because my site's author is a well-researched expert in the area. Maybe MSN is on to something?

crobb305
msg:3250832
3:31 am on Feb 13, 2007 (gmt 0)

That is very interesting because my site's author is a well researched expert in the area. Maybe MSN is on to something?

Not sure what you mean, but an algorithm has no way of knowing whether something is written by a Ph.D., a high school student, an "expert," or a novice. Granted, things like grammar and spelling can be indicators of quality, but I just don't think they should assign value based on document length, which is what I seem to be observing in some cases. Again, I had two test cases with the exact same result. I think that says something, though it doesn't complete the picture.

TrustNo1
msg:3250861
4:30 am on Feb 13, 2007 (gmt 0)

.

[edited by: TrustNo1 at 4:39 am (utc) on Feb. 13, 2007]

simey
msg:3250943
6:33 am on Feb 13, 2007 (gmt 0)

All I know is... personally, I wouldn't make any changes to my site in order to please MSN - unless the site was not ranking or was banned in both Yahoo and Google.

AndAgain
msg:3250964
7:39 am on Feb 13, 2007 (gmt 0)

Keyword density?

Now that's an old approach...

I'm with Simey on this one... no way in hell would I chop a page in half to please MSN. I might do other things, but they would be much more subtle, and I sure as heck wouldn't waste my time with a KW density analyzer. I quit using those over five years ago and have thousands of top tens.

adsoft13
msg:3251044
9:19 am on Feb 13, 2007 (gmt 0)

Not only MSN.

Check the top 10 or 20 for phentermine in Google - a highly spammed word...
NO content, a lot of links.

creative craig
msg:3251054
9:36 am on Feb 13, 2007 (gmt 0)

This is one way of ensuring that Wikipedia won't dominate the search results.

SteveWh
msg:3251105
11:30 am on Feb 13, 2007 (gmt 0)

...an algorithm has no way of knowing if something is written by a Ph.d, a high school student, an "expert" or a novice.

Natural language analysis combined with statistical analysis of, as you said, grammar and spelling, plus vocabulary, should have the potential of making a pretty good guess in that regard about any article, given a pool of similar articles written at various skill levels to use as the reference data for the analysis.

It should be possible to determine, without being given explicit meta keywords: what an article is about (and thus to classify it by subject), whether it is in scholarly or informal or opinion style (giving some indication of its likely merits), and whether it is well written in terms of spelling, grammar, and usage (giving some indication of the author's educational level and/or writing expertise).

The few factors that Google has publicly acknowledged its algorithm considers, and the additional factors outsiders speculate might be considered by both the Google and MS algos, are remarkably simple - simplistic, even - compared to what could technically be done.

At various times, both Microsoft and Google have been reputed to have active natural language research departments, which one would expect to be studying this area. It is puzzling that there isn't overwhelming evidence of them putting the results of such research to practical use as soon as they can. Maybe they're both waiting until they've "got it nailed" before they start implementing it, but the technical know-how to perform basic analysis of this type has been around for years. Or maybe it's so computing-intensive that they can't spare the time it takes to process a page that way.

One thing I haven't seen speculated about much so far is the possibility that they're intentionally introducing random elements to confound those who would guess their algos and explicitly attempt to exploit them.

[edited by: SteveWh at 11:33 am (utc) on Feb. 13, 2007]
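
(A toy illustration of the kind of statistical signals SteveWh is describing - vocabulary richness and sentence length are about the crudest possible starting point, and the feature set here is purely illustrative, not a factor either engine has confirmed:)

    import re

    def style_signals(text):
        # Two crude stylometric features: type-token ratio (vocabulary richness)
        # and average sentence length. A real system would add spelling, grammar,
        # and part-of-speech statistics, compared against a reference pool of
        # articles written at known skill levels.
        words = re.findall(r"[A-Za-z']+", text)
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        ttr = len({w.lower() for w in words}) / max(len(words), 1)
        avg_len = len(words) / max(len(sentences), 1)
        return {"type_token_ratio": ttr, "avg_sentence_length": avg_len}

    print(style_signals("The quick brown fox. The quick brown fox jumps again."))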

crobb305
msg:3251214
1:07 pm on Feb 13, 2007 (gmt 0)

I'm with Simey on this one... no way in hell would I chop a page in half to please MSN.

Under normal circumstances I would agree. I had my reasons for changing the content on the page, and in the interim it was reduced to just 5k, as I stated. The temporary content reduction gave me an opportunity to see what would happen, since I had already been observing small pages outranking longer documents. During the 2-to-3-day period when the homepage content on two of my sites was being changed, MSN responded the same way to both. The content, though changed/rewritten, has since been added back to my pages, so I expect a corresponding drop in MSN rankings on the next crawl.

The experiment really happened because of other circumstances, but it was still interesting to see. As has been said above, I don't recommend making sweeping changes JUST to please MSN. MSN simply doesn't drive enough traffic to justify losing indexing on longtail phrases at Google.

BillyS
msg:3251328
2:46 pm on Feb 13, 2007 (gmt 0)

I make pages as long as they have to be to cover a topic thoroughly. That ranges anywhere from around 100 words for definitions to around 1,500 words for more complex topics. My goal is to keep pages under 1,000 words. I've got helpful online tools that have instructions that are around 300 words. MSN hates them too. The others love those pages.

I try to break up a topic where I can, but it's not always possible. And personally, I have no intention of breaking up these pages for MSN. If I thought they were providing good results, I might consider creating smaller pages as suggested - but there is no way I'm changing a thing based on the results I see.

maherphil
msg:3251440
4:06 pm on Feb 13, 2007 (gmt 0)

At various times, both Microsoft and Google have been reputed to have active natural language research departments, which one would expect to be studying this area.

Yes, this is what I'm getting at. An algo can see HOW a subject is discussed: what words are used to describe the topic, and how they are interrelated.

A lawyer talking to other lawyers (using their jargon) would talk differently than to a client.

crobb305
msg:3251582
6:33 pm on Feb 13, 2007 (gmt 0)

Natural language analysis combined with statistical analysis of, as you said, grammar and spelling, plus vocabulary, should have the potential of making a pretty good guess in that regard about any article, given a pool of similar articles written at various skill levels
and
An algo can see HOW a subject is discussed

Natural language analysis is a great concept. But if the algorithm ranks a page simply because it is smaller (perhaps treating that as "user friendly"), then the language hasn't truly been analyzed, has it? There may not be any substance to analyze on extremely small pages (blurbs, sales copy, etc.). I am not a statistician, nor do I purport to know anything about language analysis. I am simply trying to understand why preference may be given to smaller documents, all things being equal. Again, this has been my observation over the past year, and I have seen it work when tested. Was the test worth it? No.

SteveWh
msg:3252174
8:18 am on Feb 14, 2007 (gmt 0)

if the algorithm ranks a page simply because it is smaller (perhaps treating that as "user friendly"), then the language hasn't truly been analyzed, has it?

The "user friendly" part might be a bit of an over-reach... or maybe not?: Consider the target audience. What if MS considers that topic to be primarily of interest to school kids doing research for a paper? I'm not really suggesting that's the case, but your questions about why shorter could ever be considered better brought to mind the idea that there might be in some cases logical reasons for it. Most of those reasons, like the one above, assume a greater amount of page analysis that it seems likely anyone does. How "concisely" a page treats a topic, while retaining the same quality, would take a pretty advanced analysis(!).

I am simply trying to understand why preference may be given to smaller documents, all things considered equal.

I have some very long articles on my site, pages 30 to 70 KB, all text, that rank higher than I would expect for some keyword searches (they have to be fairly specific to get them to turn up, but not so specific as to eliminate similar pages from other sites). But the page that ranks highest, for a slightly less specific keyword search, is the index page to those pages, which is short and doesn't really address the topic at all, being a summary and index page. One reason it rates higher in this case might be that it was detected as an index page, which will lead to all the others when followed.

Again, this has been my observation over the past year and I have seen it work when tested.

The fact that an explicit test brought that result is most interesting.

I admit I haven't followed closely the discussions about what factors the G or MS algorithms use, but something I do seem to detect - can it be? - is that people now seem to care more about the MS algo and how they rank in Live Search than they did even a few months ago.

s0crates9
msg:3257106
6:56 pm on Feb 19, 2007 (gmt 0)

Perhaps this has something to do with page loading time and less with strictly page size and content length? Was this experiment run on a mostly CSS-coded page or a standard HTML page?

Just a couple of considerations that may very well play a part in the algo.
