MSN is making 6 major investments in the Fall Update
- Coverage - index size
- Query Intent
- Query Refinement
- Structured Information Extraction
- Rich Answers
P.S. The information from this event was embargoed until 2100. Apparently they pushed that up to 1700.
To see the new live in action...
Now, use a commercial term for your search query. ;)
I've found some issues with 301 redirects appearing as a URL title,
so maybe that's where the issue is with my page totals. I've had the redirect on the site for years now to stop any duplicate issues in G; hopefully that will sort itself out over the next few days.
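For anyone setting up the same kind of duplicate-content redirect, here's a minimal Apache sketch. This is my own illustration (assuming mod_rewrite and a non-www-to-www canonicalisation — the hostnames and the exact rule will differ on your site), not the poster's actual config:

```apache
# .htaccess — hypothetical example: 301 non-www requests to the www host
# so engines only ever see one canonical URL per page.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The point of the permanent (301) status is that a well-behaved crawler should fold the old URL into the new one rather than index both — which is why it's surprising to see the redirected URLs surface as URL-only titles.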
It's the same old junk results as ever, and the same old spin we keep hearing from them about how great the facility is.
In the grand scale of things in terms of search quality the order is as follows (top 10 with MSN at the back of the pack):-
I see absolutely zero improvements to the search, and they have absolutely zero chance of pulling market share from either Google or Yahoo whilst they continue to churn out this junk yet maintain it's a great search facility.
Once they wake up to the fact that the search facility is dire and want to seriously look at improving it, we may then have a chance at a serious contender to Google's hold on the market. Currently I think they believe their own marketing BS.
What a waste!
As The Register says [theregister.co.uk]
Feel free to chuckle.
Also from this blog [oilman.ca]
new answers platform
answers and multimedia are incorporated in the main serp
main improvements to news, images, answers, local and maps
Example - san jose weather - results in page. Don't need to click away.
Example - stocks - real time data.
Example - Barack Obama - integrated news
Example - space shuttle video - video in results
Where's the realtime data?
And where's my celeb ranking data?
The super new live search must not be live yet....
EDIT: The search is still behind the scenes so you have to go to this url first to set a cookie.
The stock results look like Google's, except a bit smaller.
Altavista uses Yahoo Search, and their web search results are identical. Other searches are different. News seems to be better, images worse.
I prefer the Altavista UI. It also has some nice touches like inferring your country from your Accept-Language header, rather than your IP.
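That Accept-Language trick is easy to approximate. A minimal Python sketch (my own illustration of the idea, not Altavista's actual logic) that pulls the region subtag out of the highest-quality language range in the header:

```python
def country_from_accept_language(header):
    """Guess a country code from an Accept-Language header.

    Parses language ranges like "en-GB,en;q=0.8", orders them by
    their q-value (default 1.0), and returns the first two-letter
    region subtag found, or None if no range carries one.
    """
    ranges = []
    for part in header.split(','):
        piece = part.strip()
        if not piece:
            continue
        lang, _, qpart = piece.partition(';')
        try:
            q = float(qpart.split('=')[1]) if qpart else 1.0
        except (IndexError, ValueError):
            q = 1.0  # malformed q-value: treat as default weight
        ranges.append((q, lang.strip()))
    for _, lang in sorted(ranges, key=lambda t: -t[0]):
        subtags = lang.split('-')
        if len(subtags) > 1 and len(subtags[1]) == 2:
            return subtags[1].upper()
    return None
```

So a browser sending "en-GB,en;q=0.8" would be treated as GB — no IP geolocation database needed, though users who never set their locale will just fall through to a default.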
Many of my pages (ALL white hat, NO problems) that are indexed and showing on Live right now, are not indexed in the New Live (using the link posted in this thread)
All that's showing in the serps is my index page, and a few other pages. Other perfectly good pages are not even indexed.
Oh h*ll, here we go again...chasing our tails.
Same old BS serps, RichTC...
[edited by: Fish_Texas at 6:46 pm (utc) on Sep. 27, 2007]
Hire the best talent in the world, pay them top dollar plus. Steal them from Google, Yahoo and others. Hire the top marketing men, PR guys, etc, etc.
If they and Yahoo don't act soon, Google will be so far ahead they will never catch up!...KF
Found the source of some of my page bloat totals: the new index seems to be including URLs that are blocked in my robots.txt. I've thousands of URL-only listings for 301 redirects and pages that are blocked by my bot file.
The new Live index reports 112064 pages for my site. Which is possible, but only if it includes all excluded pages (duplicates and honeypots).
The worst thing is that it seems to ignore both robots.txt (when I exclude a page, I'm expecting that it's not included in the index at all) and robots meta tags: I'm using wildcards in robots.txt, and since msnbot doesn't understand these, I duplicate the setting in meta tags ("noindex"). But I find plenty of these pages in the new index, with a snippet and the correct title (so it has read the page but ignores the meta tag).
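For reference, the wildcard behaviour at issue: crawlers that support the extended robots.txt syntax treat `*` as "any run of characters" and `$` as an end-of-path anchor, while older bots (msnbot at the time) match the rule as a literal path prefix. A rough Python sketch of the two interpretations — my own approximation, not any engine's actual code:

```python
import re

def wildcard_blocked(path, rule):
    """Extended robots.txt matching: '*' = any run of characters,
    '$' = end-of-path anchor, anchored at the start of the path."""
    pattern = re.escape(rule).replace(r'\*', '.*').replace(r'\$', '$')
    return re.match(pattern, path) is not None

def literal_blocked(path, rule):
    """Older behaviour: the rule is a plain path prefix; '*' is literal."""
    return path.startswith(rule)
```

So `Disallow: /search*` blocks `/search?q=foo` under the extended rules, but a literal-prefix bot only blocks paths beginning with the literal string `/search*` — which matches nothing real, hence the meta-tag fallback the poster describes.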
The worst thing is that it seems to ignore both robots.txt (when I exclude a page, I'm expecting that it's not included in the index, at all)
Not going to happen if you are using the robots.txt method to exclude pages. With Google, you get a URI-only listing, but it is there and counted in the overall numbers. You're performing advanced searches which uncover this type of stuff, so don't expect the general public to be digging back there and seeing your blocked content. Anyone can just browse to /robots.txt and see what is going on (in most instances).
Not going to happen if you are using the robots.txt method to exclude pages.
What's the point of including URL-only references? How would it help users who are fed this type of result? And how does the SE know that they are relevant to the search query?
MSN is indexing thousands of my honeypot pages. That's plain stupid. Google does not.
And if these results only appear with "inurl" or "site" commands, why include them at all in the index?
They may still have some work to do, but I really have to say: way to go, Microsoft. In our sector, as long as you can get more traffic to your engine in general, we may finally have some Google competition.