Forum Moderators: mack


Anyone else seeing big changes at MSN?


Marcia

11:18 am on Aug 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's up and down like a freight elevator every few weeks at MSN, but some pages have moved up, others down a bit, and just about nothing I've got is the same as it was a few days ago. And it looks like a lot of the junk is now gone.

Anyone else notice anything?

DaveAtIFG

11:59 am on Aug 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yep. The keyword stuffed pages added by the last tweak are gone and the remaining sites have been reranked in an area I follow.

Liane

12:07 pm on Aug 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The results actually look pretty good! It appears they have actually separated the wheat from the chaff this time around.

Wonder how long this will last?

Marcia

12:16 pm on Aug 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The last MSN update [webmasterworld.com] was on July 24, which was two weeks ago Sunday, and people don't seem to get too interested in it, but I'm finding it almost as exciting as when Alta Vista used to have their weekly Monday updates.

MSN delivers some fine targeted traffic for some sites, and they sure do keep their index fresh. NICE job they're doing for being the new kids on the block, imho.

PaulPA

12:37 pm on Aug 9, 2005 (gmt 0)

10+ Year Member



Well, I still find MSN baffling. In my area, beyond the first 3 or 4 listings the results are absolutely ridiculous. And we are talking about a common business area. Yes, I do have a personal grudge, since on a very basic one-word business term my site is #2 in Google and #1 in Yahoo and I'm nowhere to be found in the first 300 of MSN, but beyond that the results are no match for the other two engines. Sometimes I wonder whether their algo is directed more toward consumer searches than business searches.

Import Export

12:47 pm on Aug 9, 2005 (gmt 0)

10+ Year Member




In some areas, looks like a step forward.

Unfortunately, in more areas I'm finding SERPs that resemble hijacked-browser-type spyware after a search for CASINOS...

ska_demon

2:33 pm on Aug 9, 2005 (gmt 0)

10+ Year Member



I am seeing something I have never seen before. I do a search for site:www.example.com.

Page 1 - 1-10 of 148 results.

I then move to page 2.

Page 2 - 11-20 of 1034 results.

Move on to page 3

Page 3 - 21-30 of 779 results.

This just seems weird to me. Does anyone else see this or have an explanation as to why it might be happening.

Ska

bostonBeans

2:47 pm on Aug 9, 2005 (gmt 0)

10+ Year Member



This just seems weird to me. Does anyone else see this or have an explanation as to why it might be happening.

Ska -

I have worked on some very large-scale search engines in the past, and we had the same 'issue'. In our case it was caused by a tiered architecture and an estimation algorithm that was used to estimate the total count when enough results were returned from the primary tier (which included the 'best' content). Only in cases where more results were needed than could be served by the first tier would results actually be served from lower tiers. At that point, our total count would change, since we would have actual data about 2 tiers and only have to estimate the count for the remaining N-1 tiers. The net effect was that the total count would converge toward the actual number of results as you paged into the result set.

Not sure that is the exact cause here, but I am pretty sure it would be related to some sort of estimation that improves as you drill into the result set. Search engines are always looking for ways to make searches more efficient.
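To make the mechanism concrete, here is a toy sketch of tiered retrieval with count estimation. This is not MSN's (or any real engine's) actual code; the class, the tier sizes, and the averaging heuristic are all invented for illustration. The point is only that the reported total is an estimate until paging deep enough forces every tier to be queried.

```python
class TieredIndex:
    """Toy tiered index: tier 0 holds the 'best' documents,
    lower tiers are only queried when needed."""

    def __init__(self, tiers):
        # tiers: list of lists of doc ids, in descending authority
        self.tiers = tiers

    def search(self, offset, limit):
        """Return (results, estimated_total) for one result page."""
        gathered = []
        exact = 0      # exact count from tiers actually queried
        queried = 0
        for tier in self.tiers:
            queried += 1
            gathered.extend(tier)
            exact += len(tier)
            if len(gathered) >= offset + limit:
                break  # enough to serve this page; stop early
        remaining = len(self.tiers) - queried
        if remaining:
            # crude estimate: assume each unqueried tier holds the
            # average number of docs seen in the queried tiers
            estimate = exact + remaining * (exact // queried)
        else:
            estimate = exact  # all tiers queried: count is exact
        return gathered[offset:offset + limit], estimate


# 148 docs split across three tiers, like the site: example above
idx = TieredIndex([list(range(50)),
                   list(range(50, 120)),
                   list(range(120, 148))])

_, t1 = idx.search(0, 10)    # page 1: only tier 0 queried -> 150
_, t2 = idx.search(60, 10)   # deeper page: two tiers -> 180
_, t3 = idx.search(140, 10)  # deepest page: all tiers -> exactly 148
```

Note the reported totals jump around between pages rather than shrinking smoothly, which matches the 148 / 1034 / 779 behavior Ska observed: the estimate depends on which tiers happened to be queried, and only the deepest pages report the exact count.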

-Bb

ska_demon

2:59 pm on Aug 9, 2005 (gmt 0)

10+ Year Member



bostonBeans

Your explanation sounds more than feasible, as the site in question is actually a multi-tiered directory. Thanks for that. It makes a lot more sense now.

Sorry if I hijacked the thread a bit. I thought it may be relevant ;OP

Ska

Marcia

12:19 am on Aug 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Results are still shifting around, with individual pages moving between first and second page in some cases, and while I generally don't hover over search results with updates or changes, what's been fascinating with MSN is to watch how they handle clustering of pages or indenting results.

In our case it was caused by a tiered architecture and an estimation algorithm that was used to estimate the total count when enough results were returned from the primary tier (which included the 'best' content).

bB, how would that relate to how individual sites are handled?

I'm asking because when I did a site:example.com search early this morning, for a site that has about 20-25 or so pages, the first pages returned in the results were the index pages of the /directories/ in the site, with the other pages following. I also saw the bot grabbing just those pages in one pass within the day or so prior to this current change. I've yet to see that happen at any engine, though it did seem vaguely suspect that Google was crawling by directory structure at one time.

Also, at the same time I saw some number of pages returned out of 90 for that site, and there's nowhere near 90 pages; MSN currently has just under 20. So was that some kind of guesstimate based on a tiered directory structure?

Putting that together with how they cluster and seem to analyze linking patterns between site pages, I'm wondering how much actual site architecture and directory structure comes into play.

Added:

Now it's saying 10 pages out of 20 (not 90), so it's right this time - but this is one of the reasons MSN does need watching during update periods, if only to catch little things like this when they happen. And they are still showing the /directory/ index pages first - very interesting.

ska_demon

8:45 am on Aug 10, 2005 (gmt 0)

10+ Year Member



Does this mean we really can't be certain exactly how many pages are in the index?

Ska

bostonBeans

11:48 am on Aug 10, 2005 (gmt 0)

10+ Year Member



In our case it was caused by a tiered architecture and an estimation algorithm that was used to estimate the total count when enough results were returned from the primary tier (which included the 'best' content)

I should have been more clear. The tiered architecture was on the search engine side. Documents were placed into different tiers based on their perceived importance (authority). If sufficient results were returned from the first tier - which was the case for most standard queries (a two- or three-term query and 10 results requested) - the lower tiers were never queried, since that would have been inefficient (slower and more expensive), and an estimation was done to get the total count.

With the 'site:' examples here, you could get estimated counts if (1) the search engine is using similar architecture or estimation logic to improve performance and (2) not all the results from the given site are in the same tier. In most cases, not all pages on a site are of the same value, so it is likely the content could be spread across multiple tiers.
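A small sketch of that tier-assignment step, purely hypothetical: the function, the threshold values, and the example URLs and authority scores are all made up. It just shows how bucketing pages by an authority score would naturally spread one site across several tiers, with the directory index pages landing in the top tier, which would match Marcia's observation that the /directory/ index pages come back first.

```python
def assign_tiers(docs, thresholds=(0.8, 0.5)):
    """docs: dict mapping url -> authority score in [0, 1].
    Returns a list of tiers, tier 0 = highest authority."""
    tiers = [[] for _ in range(len(thresholds) + 1)]
    for url, score in docs.items():
        for i, threshold in enumerate(thresholds):
            if score >= threshold:
                tiers[i].append(url)
                break
        else:
            tiers[-1].append(url)  # below every threshold: bottom tier
    return tiers


# invented scores for one small site
docs = {
    "example.com/": 0.9,             # root index: high authority
    "example.com/widgets/": 0.85,    # directory index pages score high
    "example.com/widgets/a.html": 0.6,
    "example.com/widgets/b.html": 0.3,
}
tiers = assign_tiers(docs)
# tier 0 holds the / and /widgets/ index pages, so a site: query
# served mostly from tier 0 surfaces those first and estimates
# the rest of the count
```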