Forum Moderators: mack
We all know that they currently haven't got a clue and that the MSN SERPs are, at best, absolutely dire, but what can we do about it? I have a few ideas, and I'm sure some of you have too.
Let's face it, we need MSN to be a contender in search; it's not good having this market controlled by one search engine (Google), but unless they get it right, that's the way it's going to stay!
Here are my thoughts:
Problem 1.
Deep crawling is still an issue and needs looking at; without the data, they can't even start to produce a half-decent set of SERPs. I have yet to find any site with all its pages cached in MSN.
Solution 1.
Perhaps they should introduce sitemap submission? This would help them collect all the data, and webmasters would be submitting only the important pages, so their bot wouldn't have to hunt for pages or waste time crawling pages on a site that aren't required or are of little use to the end user.
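To make the idea concrete, here's a rough sketch of what a submitted sitemap could look like, generated with a few lines of Python. I'm assuming the sitemaps.org XML format here, and the URLs are made-up examples; MSN hasn't said what format they'd actually want.

```python
# Sketch: build a minimal sitemap in the sitemaps.org XML format.
# The URLs below are hypothetical examples, not real sites.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a sitemaps.org-style XML document listing the given URLs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc in urls:
        lines.append('  <url><loc>%s</loc></url>' % escape(loc))
    lines.append('</urlset>')
    return '\n'.join(lines)

print(build_sitemap(['http://www.example.com/',
                     'http://www.example.com/blue-widgets.html']))
```

A webmaster would list just the pages that matter and upload the file once; the bot then knows exactly what to fetch instead of discovering pages by chance.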
Problem 2.
Unless the domain name says "bluewidgets.com", MSN can't tell if the site is about blue widgets. If a site is a brand name like "nightsky.com", MSN doesn't know that site is about blue widgets, yet it could be the authority site on the subject.
MSN also currently ditches quality sites in error because it can't tell quality from junk or spam. It's in a real mess now, with more than 60% of quality sites missing from its SERPs, imo.
Solution 2.
Use "Bcentral" data. Currently its directory doesn't even accept UK submissions, yet this has to be a great way for MSN to tell whether a site is junk or quality, because one of their own staff has reviewed it. Why don't they charge a few hundred pounds a year to review a site for a listing in the directory? They could then use that data within their algorithm. That way they would know what a site was about, which would improve quality. And because it would all be in-house, there would be very little chance of the data being corruptly edited by its own staff.
Problem 3.
Because MSN has trouble recognizing an authority site, it can't tell whether a site in its index has backlinks from other authority or important sites, or whether the links are just from blogs and junk sites. Currently some junk sites rank high in MSN with nothing more than junk backlinks.
Solution 3.
Introduce some kind of rating system for a site. I'm not saying "PageRank", but some kind of "site rank" has to be used. Currently it looks like only .gov and .edu carry weight in MSN, and this is why you see a .gov page that merely mentions a keyword ranking high in the results, above a dedicated authority site on the subject matter. They need a way of rating site pages.
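Just to illustrate the kind of thing I mean by "site rank", here's a toy Python sketch that spreads rank around a tiny made-up link graph by simple iteration. This is obviously not MSN's algorithm (or Google's); the graph, the damping factor, and the site names are all invented for the example. The point is just that a junk site with no decent inbound links ends up near the baseline, while linked authority sites float to the top.

```python
# Sketch: a toy "site rank" computed by iterating over a link graph.
# Everything here (graph, damping, names) is illustrative only.

def site_rank(links, damping=0.85, iterations=50):
    """links maps each site to the list of sites it links out to."""
    sites = list(links)
    n = len(sites)
    rank = {s: 1.0 / n for s in sites}
    for _ in range(iterations):
        new = {s: (1.0 - damping) / n for s in sites}
        for src, outs in links.items():
            if not outs:
                continue
            share = damping * rank[src] / len(outs)
            for dst in outs:
                if dst in new:        # ignore links pointing outside the graph
                    new[dst] += share
        rank = new
    return rank

graph = {
    'authority.example': ['niche.example'],
    'niche.example': ['authority.example'],
    'junkblog.example': ['authority.example'],  # nobody links back to it
}
ranks = site_rank(graph)
```

Run that and the junk blog, which nothing links to, settles at the baseline rank, while the two sites that cite each other carry real weight; that's the sort of signal that would stop junk backlinks propping a site up.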
Problem 4.
They don't seem to think they have a problem with their search results, and this is worrying. They believe their own hype and think MSN Search is relevant when it's absolutely dire. They either can't see, or don't want to see, any problems with it.
Solution 4.
They should hold their hands up and admit they need to improve the search. Take note that the search facility is currently next to useless, and act on it! Webmasters can help them improve the search; after all, it was webmasters using Google in the first instance that helped make Google the market leader it is now. They should embrace us and let us work more actively with them to up their game. If webmasters start using MSN and telling clients about it, it quickly snowballs; add to that being able to retain users rather than have them defect to Google, and they will make serious ground. Currently I don't know many webmasters who would use or recommend Live Search; the search is a laughing stock in the webmaster community, and that needs to change.
The list is endless, but I think this makes a good start...
Yes, I would agree. Google's SERPs have been going downhill since September last year. The high ranking for wiki pages and the introduction of this everflux duplicate-content filter show that, despite Google's experience in search, even they can mess things up. But as market leader they now only need mediocre SERPs to retain pole position. MSN could take full advantage of this situation, yet they are not!
Unless MSN up their game, surfers will have to put up with substandard search results from both Google and MSN. Of the three, I think Yahoo is currently making the most progress!
asia
We can only try. I agree that it's likely nothing will change, but MSN engineers do read WebmasterWorld periodically, so you never know!
Solution 1: Bring back the pages that do not break the rules and ignore the others.
Solution 2: Put into place a mechanism whereby the siteowner/webmaster could receive some sort of fair warning that there are a number of pages on his/her site in violation of specific rules.
Solution 3: Here's a really radical idea -- Tell people what the rules are!
...................................
2) Increase spidering. It looks like 14 days, on average, for well-ranking sites to get a new cache.
3) Develop a sitemap program, or at least get the URL-submit tool fixed. With such slow crawl rates, the URL-submit option seemed to have faster indexing response times than waiting for msnbot to "discover" your page.
4) Get rid of blogspot and deceptive-redirect junk once and for all. Just do a sweep and get rid of it. If a URL uses a refresh to another page, then index the destination page or nothing at all; stop indexing (and ranking) the URL that immediately redirects to more junk.
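For what it's worth, spotting those immediate meta-refresh redirects isn't hard. Here's a small Python sketch that pulls the destination URL out of a page so an indexer could follow it instead of indexing the redirecting URL; the regex and the sample HTML are my own illustration, not anything msnbot actually does.

```python
# Sketch: detect an immediate meta-refresh redirect in a fetched page.
# The sample page below is a made-up example.
import re

REFRESH_RE = re.compile(
    r'<meta[^>]+http-equiv=["\']?refresh["\']?[^>]*'
    r'content=["\']?\s*(\d+)\s*;\s*url=([^"\'>]+)',
    re.IGNORECASE)

def refresh_target(html):
    """Return the redirect URL if the page refreshes immediately, else None."""
    m = REFRESH_RE.search(html)
    if m and int(m.group(1)) == 0:  # only treat 0-second refreshes as redirects
        return m.group(2).strip()
    return None

page = ('<html><head><meta http-equiv="refresh" '
        'content="0;url=http://dest.example/page.html"></head></html>')
```

With something like that, the engine can index dest.example's page (or drop the lot) rather than ranking the doorway URL itself.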
5) Create longer-term SERP stability. Well-ranking websites (even older, established ones) shouldn't be here today, gone tomorrow, then back in two weeks after the next crawl. There needs to be some stability; your users expect that, especially when they know the website they are looking for but can't remember its exact URL.
[edited by: crobb305 at 5:48 pm (utc) on Mar. 12, 2007]
/creates a formal route for webmasters to communicate directly with MSN corporate
Yes, I agree. And much of what I suggested above would be solved. I still like the live-chat idea, but at the very least, a webmaster console should have a "Report spam" and a "Reinclusion request" option that delivers our comments to a real place/person.
Also, I don't think any search engine wants human input either, despite my idea regarding using Bcentral data in its algo. I guess the algo needs to take care of all issues without human involvement, BUT webmasters submitting sitemaps could be a major win for MSN, in at least having site data submitted which they currently can't collect. I can't see why they don't offer a sitemap-submit service; it's a no-brainer.
I think a complete purge of blog-spam junk is a great idea. Sorry, but I don't see any room for blogs in the natural SERPs, full stop; no doubt some will disagree.
As for a webmaster console, I think this is another good idea, and they could do it far better than Google's offering. That would certainly turn more webmasters towards using them, that's for sure.
Also, I still don't see why some sort of site-review-for-a-fee policy couldn't be introduced. If you run a quality site that's an authority on a subject, and MSN has ditched it for some reason unknown to you while you are 100% within their guidelines, I can't see any reason why it can't be reviewed. If they got it wrong, they need to put it right and learn why their algo ditched a good quality site in error, to improve the search quality for others.
I'm sure there are many other ideas, but so far so good. Something has to change if they are serious about being the best.