Welcome to WebmasterWorld
Forum Moderators: mack
And where does MSN stand? MSN certainly offers webmasters a wonderful opportunity by having virtually no limits on ranking (unlike Google's supposed Sandbox). Whenever you put up a page, it seems to rank almost immediately in the top 10 on MSN for the desired keyword, leading to lots of cheers from webmasters. Webmasters who rank in MSN long before they do in Google automatically start to shout that MSN provides far better results, just because their site ranks. But it's not true.
Look at the results. Search a bit and you'll find that MSN is filled with crap, spammy sites, and scrapers, and often completely misses the authority sites that both Yahoo and Google recognise.
Of course this is just the beginning; Microsoft has the budget to produce (or just buy) the best search engine, and it will eventually beat Google on its own ground. That, and that alone, is the reason I still keep track of my MSN listings.
The only time I use MSN is to check our websites' positions and to see if results have improved.
The results are amazingly poor. I would say the lack of authority sites is the problem. If you do a popular keyword search, the sites that should be there are not, and it can't provide multi-keyword search results because the bot only skims the top surface of a site; unless the page is one level deep, it won't feature.
The best way to rank is to have a keyword domain, a keyword title, and a keyword link, and bingo: top of the results. The page will rank above any authority that hasn't got the keyword in the domain, despite the fact that the authority may have loads of quality content about the subject matter.
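The behaviour described above can be pictured as a toy scoring function. Everything here is invented for illustration — the names, the weights, and the idea that content depth barely counts are assumptions modelling the complaint, not MSN's actual algorithm:

```python
def toy_score(keyword: str, domain: str, title: str,
              link_texts: list[str], content_pages: int) -> int:
    """Hypothetical score where exact-keyword signals dwarf content depth."""
    score = 0
    if keyword.replace(" ", "") in domain:
        score += 10                      # keyword domain dominates
    if keyword in title.lower():
        score += 5                       # keyword title
    if any(keyword in t.lower() for t in link_texts):
        score += 5                       # keyword anchor text
    score += min(content_pages, 3)       # content depth barely counts
    return score

thin_site = toy_score("blue widgets", "bluewidgets.example",
                      "Blue Widgets", ["blue widgets"], 1)
authority = toy_score("blue widgets", "widget-institute.example",
                      "Widget Research", ["widget resources"], 500)
print(thin_site > authority)  # prints True: the one-page keyword site wins
```

Under a weighting like this, a one-page site with the keyword in its domain beats a 500-page authority every time — which is exactly the complaint.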
Now my home page is set to MSN Search (I use Y too). I do occasionally use G, but only for comparison, with my JS and cookies turned off.
Since I had never in a million years thought that I would change search engines, I am sure there are many who are actively looking for a substitute. When MSN Search is integrated with MS's suite of products, I expect their search volume to jump significantly.
Today, G thinks directory sites are bad (one reason being that since G has its own directory, no one else should have a directory on the web).
Tomorrow, G will choose something else to penalize and tell the world that since they provide the service, why go anywhere else?
It's suffocating competition.
G will give you the most traffic. Know that, but don't put all your eggs in one basket, especially if you don't want your business affected by ridiculous updates.
If only G knew how to communicate effectively with stakeholders like webmasters ... things would be a lot different.
And just try to look for a hotel in a city. Both MSN and Yahoo have Google beat HANDS down here.
MSN has gotten my vote for freshness and new authorities. I am glad that it shows different results and pushes stuffy old authority sites down where they really belong...
Keep up the great work, MSN... You guys seem to know how to create a search engine...
You have GOT to be kidding about that. There's been no search engine in history that's done more to communicate with webmasters; and I hate to say this, being a webmaster, but the biggest portion of stakeholders are not webmasters, because if some sites disappear others will take their place.
Granted, things would be a lot different for some of the stakeholders - if they listened to what Google has been trying to communicate all along.
On the other hand, there's something really true about this "authority spam". If you are one of the big players in your niche, you can add unlimited extra keywords to your site and catch visitors for totally unrelated topics. This is very frustrating and, imho, a big problem with at least some of the Google SERPs.
LOL, they'd go a funny colour if they heard you say that. Depth of data (or more precisely, its lack) is a big issue for them. There is no remedy but time for that. And possibly buying L$
...an objective means of ranking search engines for quality. Why, that would be as easy as coming up with the best search engine... Do that and you'll knock Google off its lofty perch.
In my sector the top 20 contains just one authority site; the rest are sites with a low number of pages that offer the user next to zero value.
If you type in "plumbers", "roofing contractors", "electricians", etc., between 5 and 8 results will all come from this directory.
I don't know if this is deliberate or just the result of a poor algo.
[edited by: jatar_k at 4:04 pm (utc) on Aug. 20, 2005]
[edit reason] removed site mention [/edit]
I'd vote MSN is better than both at the moment; almost 70% of my traffic comes from there and I haven't even optimised the site. Perhaps MSN is doing something right, not wrong, with regards to spamming etc.
Better for you maybe, but it has nothing to do with whether MSN is doing it right or not.
The new ones don't stand a chance there, but they rule the SERPs on MSN Search! And sure enough, the older sites are quite visible on MSN Search as well, but only in my rather small region, using my local search.msn site.
However, since there are so very few users, I just can't wait for that search integration.
What is missing from this is what has been missing all along, i.e. an objective means of ranking search engines for quality. We can offer our opinions about this as often as we like, but until someone comes up with a truly objective comparison we are wasting our time.
There are some things that could be measured objectively, for example, the percent of top 10 pages in each engine that use cloaking to boost their SERP while subjecting surfers to obnoxious ads. Or the number of times redundant scraper sites appear in the top 100 results. There is general agreement among users of search engines that such results are of low value.
But overall search engine quality will always be subjective, just like the list of the 100 best films, or the books that every educated person should read. How could anyone objectively rank 10,000,000 pages on "yellow widgets" and say that this page is the best, this one is second best, and so on? It can't be done objectively by people, much less by a general algorithm.
"Best" and "better" imply a perspective or set of criteria for judging, which must be either subjective or arbitrary. How much should spelling errors ding an entry? When are more words better, and when is a brief, concise description preferable? Should a technical article written for college students be valued more than a general article on the same topic that is written on a 6th grade level?
Search engines will change, and people will use the one that meets their needs best (or is heavily advertised with the catchiest jingles).
>How could anyone objectively rank 10,000,000 pages on "yellow widgets" and say that this page is the best, this one is second best, and so on? It can't be done objectively by people, much less by a general algorithm.
I think it can be done well enough, and objectively enough, by people. What would be required is testing on the top 20 or 30 results of defined searches in different categories. For example, in the tourist sector the results of searches like "accommodation widgetville" could be compared and scored for relevance.
Relevance criteria could be defined for each of the selected categories, and these criteria applied to the results from all the engines to give them a score. If the criteria are defined properly, then the subjectivity is largely removed. This may not be perfect, but it would be a reasonable SERPs quality indicator.
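The benchmarking idea above can be sketched minimally: define a rubric of pass/fail criteria per category, apply it to each engine's top-N results, and average into a score. The rubric, the page attributes, and the sample data here are all placeholders — a real comparison would need human-written criteria and actual fetched results:

```python
def score_engine(results: list[dict], criteria: list) -> float:
    """Average fraction of rubric criteria met across the top results."""
    if not results:
        return 0.0
    total = sum(
        sum(1 for check in criteria if check(page)) / len(criteria)
        for page in results
    )
    return total / len(results)

# Hypothetical rubric for "accommodation widgetville"-style searches.
criteria = [
    lambda p: p["mentions_location"],            # page is about the place
    lambda p: p["lists_actual_accommodation"],   # page offers real listings
    lambda p: not p["is_scraper"],               # page is not a scraper
]

engine_a = [{"mentions_location": True, "lists_actual_accommodation": True,
             "is_scraper": False}]
engine_b = [{"mentions_location": True, "lists_actual_accommodation": False,
             "is_scraper": True}]
print(score_engine(engine_a, criteria), score_engine(engine_b, criteria))
```

The point of fixing the rubric up front is that every engine gets judged by the same yardstick, rather than by whichever engine happens to rank the judge's own site.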
It would certainly be better than just saying that Google is better than MSN because I think so ;)
Duh. This is what a search engine does. Anyone who was capable of doing a better job of it than the search engines already are ... wouldn't waste his time with this chicken outfit, he'd be writing his own search engine.
>If the criteria are defined properly then the subjectivity is largely removed.
It doesn't matter how the criteria are defined. The subjectivity is removed in any case. And, in any case, it is replaced by irrelevant arbitrariness.
>This may not be perfect but it would be a reasonable SERPs quality indicator.
The same thing would happen to it that happens blue widgets to any other mechanical measure of relevance blue widgets. If the "most relevant pages" are deemed to blue widgets have the keywords exactly ten percent of the blue widgets time, then spammers will be generating pages with "blue widgets" occurring every ten words. And so on ad blue widgets infinitum. It would become irrelevant to the subject "blue widgets" that every ten words included (once!) the keywords
"blue widgets." And moreover, every other measure of relevance to "blue widgets" would almost immediately be gamed the same blasted "blue widgets" way. It's what serp perps do.
And it's not exactly an easy problem -- it's hard enough that just describing how to implement a new, good idea in the field will earn you a doctorate from Stanford.