
Bing Search Engine News Forum

    
How can MSN improve their search engine results?
Let's help them do it better!
RichTC - msg:3278910 - 2:56 am on Mar 12, 2007 (gmt 0)

OK, I thought it was about time we discussed how MSN could improve its search engine results.

We all know that they currently haven't got a clue and that the MSN serps are, at best, absolutely dire - but what can we do about it? I have a few ideas, and I'm sure some of you do too.

Let's face it, we need MSN to be a contender in search. It's no good having this market controlled by one search engine (Google), but unless they get it right, that's the way it's going to stay!

Here are my thoughts:

Problem 1.
Deep crawling is still an issue and needs looking at - without the data, they can't even begin to produce a half-decent set of serps. I have yet to find a site with all of its pages cached in MSN.

Solution 1.
Perhaps they should introduce sitemap submission? This would help them collect all the data, and webmasters would submit only the important pages, so their bot wouldn't have to hunt for pages or waste time crawling pages that aren't needed or are of little use to the end user.
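
For what it's worth, the sitemaps.org XML format (which Google at least already accepts) would do the job - MSN would only need to start reading it. A rough Python sketch of building one; the example.com URLs and priority values are made up purely for illustration:

# Minimal sketch: generate a sitemaps.org-style sitemap.xml listing only
# the pages you actually want crawled. URLs and priorities are invented.
from xml.sax.saxutils import escape

urls = [
    ("http://www.example.com/", "1.0"),
    ("http://www.example.com/blue-widgets.html", "0.8"),
    ("http://www.example.com/contact.html", "0.3"),
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for loc, priority in urls:
    lines.append("  <url>")
    lines.append("    <loc>%s</loc>" % escape(loc))
    lines.append("    <priority>%s</priority>" % priority)
    lines.append("  </url>")
lines.append("</urlset>")

with open("sitemap.xml", "w") as f:
    f.write("\n".join(lines))

All MSN would then need is somewhere for us to submit the resulting sitemap.xml.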

Problem 2.
Unless the domain name says "bluewidgets.com", MSN can't tell that the site is about blue widgets. If a site has a brand name like "nightsky.com", MSN doesn't know that site is about blue widgets - yet it could be the authority site on the subject.
MSN also currently ditches quality sites in error because it can't tell quality from junk or spam. It's in a real mess now, with more than 60% of quality sites missing from its serps, imo.

Solution 2.
Use "Bcentral" data. Currently its directory doesn’t even accept UK submissions, yet this has to be a great way for msn to know if a site is junk or quality – if one of their staff has reviewed the site. Why don’t they charge a few hundred pounds a year to edit a site to get listed in the directory - they could then use the data within their algorithm. This way they would know what the site was about would improve quality. Also it would be all in-house so very little chance of corrupt data editing by its own staff.

Problem 3.
Because MSN has trouble recognizing an authority site, it can't tell whether a site in its index has backlinks from other authority or important sites, or whether the links are just from blogs and junk sites. Currently some junk sites rank high in MSN with nothing more than junk backlinks.

Solution 3.
Introduce some kind of rating system for a site - I'm not saying "page rank", but some kind of "site rank" has to be used. Currently it looks like only .gov and .edu domains carry weight in MSN, which is why you see a .gov page that merely mentions a keyword ranking above a dedicated authority site on the subject. They need a way of rating site pages.
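
To illustrate what I mean by "site rank": even a crude, PageRank-style pass over a site-level link graph would let one link from a well-linked authority outweigh a pile of links from orphan junk blogs. A toy Python sketch; the site names, link graph and numbers are invented, and this is nowhere near a production algorithm:

# Toy "site rank": iterate rank over a made-up site-level link graph.
# A site linked from a well-linked authority ends up scoring higher than
# a site collecting links only from junk blogs nobody links to.
links = {                        # site -> sites it links out to
    "uni1.edu": ["authority.gov"],
    "uni2.edu": ["authority.gov"],
    "uni3.edu": ["authority.gov"],
    "authority.gov": ["nightsky.com"],
    "nightsky.com": [],
    "junkblog1.com": ["spamsite.com"],
    "junkblog2.com": ["spamsite.com"],
    "spamsite.com": [],
}

damping = 0.85
rank = {site: 1.0 / len(links) for site in links}

for _ in range(30):                                   # power iteration
    new_rank = {site: (1 - damping) / len(links) for site in links}
    for site, outlinks in links.items():
        if not outlinks:
            continue
        share = damping * rank[site] / len(outlinks)  # pass rank along links
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

for site, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print("%-16s %.3f" % (site, score))

Run it and "nightsky.com" (one link from a well-linked .gov site) comes out ahead of "spamsite.com" (two links from junk blogs), which is exactly the distinction MSN seems unable to make at the moment.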

Problem 4.
They don't seem to think they have a problem with their search results, and that is worrying. They believe their own hype and think that MSN search is relevant when it's absolutely dire. They either can't see, or don't want to see, any problems with it.

Solution 4.
They should hold their hands up and admit they need to improve the search. Take note that the search facility is currently next to useless, and act on it! Webmasters can help them improve the search; after all, it was webmasters who started using Google in the first place and helped make it the market leader it is now. They should embrace us and let us work more actively with them to up their game. If webmasters start using MSN and telling clients about it, word quickly snowballs; add the ability to retain those users rather than have them defect to Google, and they will make serious ground. Currently I don't know many webmasters who would use or recommend Live Search - the search is a laughing stock in the webmaster community, and that needs to change.

The list is endless, but I think this makes a good start...

 

willybfriendly - msg:3278969 - 4:32 am on Mar 12, 2007 (gmt 0)

They could follow G's lead and just list Wikipedia in the top 5 spots :o

WBF

asiaseo - msg:3279047 - 6:29 am on Mar 12, 2007 (gmt 0)

I agree - if only the search engines would think about working together with webmasters.
The chances of anyone listening are, in my opinion, low.
I don't think anyone has a problem playing by the rules; the problem is that we don't know what the rules are!
We have never bought links. We suffer, I guess, because we don't have many links - we just make local content websites for clients, with information for the user.
It has been a learning curve understanding what the search engines want; trouble is, they keep changing the curve!
There are millions of perfectly good, informative websites that never get into the top 500, because of course their owners don't realise just how much we are all expected to do, and of course the different search engines require things done differently.
Sadly, I think the chances of any of the search engines listening are almost non-existent.

RichTC - msg:3279191 - 10:31 am on Mar 12, 2007 (gmt 0)

willy

Yes, I would agree. Google's serps have been going downhill since September last year. The high ranking for Wikipedia and the introduction of this everflux duplicate content filter show that, despite Google's experience in search, even they can mess things up - but as market leader they now only need mediocre serps to retain pole position. MSN could take full advantage of this situation, yet they are not!

Unless MSN ups its game, surfers will have to put up with sub-standard search results from both Google and MSN. Of the three, I think Yahoo is currently making the most progress!

asia

We can only try. I agree it's likely that nothing will change, but MSN engineers do read WebmasterWorld periodically, so you never know!

Reno - msg:3279310 - 1:08 pm on Mar 12, 2007 (gmt 0)

Thank you RichTC for starting a positive thread that deals with ways for LIVE to improve itself. My only hope is that they avoid trying to "be like" Google and actually try to be "better than" G. That means doing many of the things that you listed, and not following Google's ridiculous policy of punishing/penalizing an entire site because of infractions on only certain pages.

Solution: Bring back the pages that do not break rules and ignore the others.

Solution 2: Put into place a mechanism whereby the siteowner/webmaster could receive some sort of fair warning that there are a number of pages on his/her site in violation of specific rules.

Solution 3: Here's a really radical idea -- Tell people what the rules are!

...................................

crobb305 - msg:3279584 - 5:33 pm on Mar 12, 2007 (gmt 0)

1) I have thought it would be nice for MSN support to have a "live chat" available for selected, trusted webmasters to serve as the eyes for MSN, watching their searches and helping monitor for junk. This would not be used to tattle on competitors (that is where the "trust" comes in - MSNdude has seen many of us in action over the years). Rather, we could help monitor for the type of junk we see discussed in these threads over and over. This would be free quality-monitoring labor, and many of us webmasters would love to participate to help make MSN search better. There are so many of us out here, and we are watching different phrases/industries. We try to communicate the really bad stuff to MSN, and even occasionally get stickies from MSNdude asking for examples. A realtime serp-quality-reporting system COULD be very effective. Just some thoughts. Of course, I could see abuse problems, so the selection program would be invitation only.

2) Increase spidering. It looks like it takes 14 days, on average, for well-ranking sites to get a new cache.

3) Develop a sitemap program, or at least get the url-submit tool fixed. With such slow crawl rates, the url-submit option seemed to produce faster indexing response times than waiting for msnbot to "discover" your page.

4) Get rid of blogspot and deceptive-redirect junk once and for all. Just do a sweep and get rid of it. If a url uses a refresh to another page, then index the destination page or nothing at all; stop indexing (and ranking) the url that immediately redirects to more junk (see the sketch after this list).

5) Create longer-term serp stability. Well-ranking websites (even older/established ones) shouldn't be here today, gone tomorrow, then back in two weeks after the next crawl. There needs to be some stability; your users expect that, especially when they know the website they are looking for but can't remember its exact url.
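
On the deceptive-redirect point in 4), detecting the most common trick - an instant meta refresh - is not hard, which is what makes it so frustrating that these pages still rank. A rough sketch of the kind of check a crawler could run; the example URL is hypothetical, and a real crawler would use a proper HTML parser rather than a quick regex:

# Rough sketch: fetch a page and, if it carries an instant meta refresh,
# return the destination URL that should be indexed instead of the page
# itself (or None if the page stands on its own).
import re
import urllib.request
from urllib.parse import urljoin

META_REFRESH = re.compile(
    r"<meta[^>]+http-equiv=[\"']?refresh[\"']?[^>]*"
    r"content=[\"']?\s*(\d+)\s*;\s*url=([^\"'>]+)",
    re.IGNORECASE)

def resolve_meta_refresh(url):
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    match = META_REFRESH.search(html)
    if match and int(match.group(1)) <= 1:           # "instant" refresh
        return urljoin(url, match.group(2).strip())  # index this instead
    return None

# Example (hypothetical doorway page):
# print(resolve_meta_refresh("http://www.example.com/doorway.html"))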

[edited by: crobb305 at 5:48 pm (utc) on Mar. 12, 2007]

centime - msg:3279610 - 5:45 pm on Mar 12, 2007 (gmt 0)

Give us specialised webmaster accounts like the other two of the big three have.

It creates a formal route for webmasters to deal directly with MSN corporate.

crobb305 - msg:3279622 - 5:49 pm on Mar 12, 2007 (gmt 0)

/ It creates a formal route for webmasters to deal directly with MSN corporate

Yes, I agree - and much of what I suggested above would be solved. I still like the live chat idea, but at the very least, a webmaster console should have a "Report spam" and a "Reinclusion request" option that delivers our comments to a real place/person.

CainIV - msg:3280812 - 5:47 pm on Mar 13, 2007 (gmt 0)

Maybe we should start with active responses to emails sent to either the MSN rep here or webspam@microsoft.com, as communication with webmasters has dropped sharply.

RichTC - msg:3282166 - 9:56 pm on Mar 14, 2007 (gmt 0)

Let's face it, none of the big three are strong on communicating with webmasters - MSN could change this, but it's unlikely.

Also, I don't think any search engine wants human input either, despite my idea about using Bcentral data in the algo. I guess the algo needs to take care of all issues without human involvement, BUT webmasters submitting sitemaps could be a major win for MSN, since it would at least give them site data they currently can't collect. I can't see why they don't offer a sitemap submission service - it's a no-brainer.

I think a complete purge of blogspam junk is a great idea. Sorry, but I don't see any room for blogs in the natural serps, full stop - no doubt some will disagree.

As for a webmaster console, I think this is another good idea, and they could do it far better than Google's offering. That would certainly turn more webmasters towards using them, that's for sure.

Also, I still don't see why some sort of site-review-for-a-fee policy couldn't be introduced. If you run a quality site that's an authority on a subject, MSN has ditched it for some reason unknown to you, and you are 100% within their guidelines, I can't see any reason why it can't be reviewed. If they got it wrong, they need to put it right and learn why their algo ditched a good quality site in error, to improve the search quality for others.

I'm sure there are many other ideas, but so far so good - something has to change if they are serious about being the best.

roxyyo - msg:3289016 - 10:28 pm on Mar 21, 2007 (gmt 0)

Too bad bCentral submissions are closed.

And good luck trying to get customer service about that.

Actually, I had a form where you could report spammy results, and now I can't find the link to it - does anyone know where I can find that?
