I'm sure it's mentioned but I couldn't find it on an initial scan through.
Any vaguely scientific report would cover details such as these.
If Google commissioned a report which showed it produced the most relevant results, I wouldn't question it. The problem here is that it defies common sense. Therefore it is necessary to tear the research to pieces and see whether anything significant stands at the end.
At the end of the day I am sure that if the results were bad, Inktomi would not have had them published, and then no one would be any the wiser - too many cynics round here ;)
However, it's all about how you display the results and the methods you use.
I can do a simple test to find the best SEO company in the UK, but one of the main rules is that they have to be based within a 3 mile radius of my office ;)
The test looks OK, only flawed in that it's based on Ink's basic query log. In my experience, users on Ink run slightly different queries, a bit more adapted to Ink. I would have loved it if they had used WT's all-time base or something like that. Also, Ink's base is smaller than G's, so final result relevancy should ultimately be weighted by the size of the search base.
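To make the "weighted by the size of the search base" idea concrete, here's a minimal sketch. The relevancy scores and index sizes are made-up numbers, and the log-based weighting is my own invention for illustration - nothing here comes from the study itself:

```python
import math

# Purely hypothetical scores and index sizes - not figures from the study
engines = {
    "Google":  {"relevance": 0.80, "index_size": 3_000_000_000},
    "Inktomi": {"relevance": 0.82, "index_size": 2_000_000_000},
}

# One made-up way to weight a relevancy score by index size:
# scale each score by the log of the engine's index relative
# to the largest index in the comparison.
max_size = max(e["index_size"] for e in engines.values())
for name, e in engines.items():
    weight = math.log(e["index_size"]) / math.log(max_size)
    print(f"{name}: raw {e['relevance']:.2f}, "
          f"size-weighted {e['relevance'] * weight:.3f}")
```

The point isn't that particular formula; it's that a smaller index can post a slightly better raw score and still come out behind once you account for how much of the web it actually covers.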
The simple fact is Google's algo is not half as good as all the diehard fans religiously believe.
It's a good search engine, and it's true that no other engine is better than Google. They've got the largest index and fresher results than the competition.
But algo-wise, ATW, INK and AV are all close or on par.
Everything else is just PR :)
If the results are that close, it may be a good thing for Yahoo, and it may take a little pressure off the many of us who 'live and breathe' by the whims of a single company.
I 2nd everything Heini says here!
I'm glad they provided the keywords, because it enables any of us here to do random testing. Admittedly I only did five searches (picked at random from their list). However, as I understand "relevance", Google has a slight edge, not simply with regard to the content but also the presentation of results. This can be important, although this factor was eliminated from the test and from what was presented to the judges - since they only saw the queries and the list of links, and NOT how the results are displayed. Google results tend to highlight and repeat the keyword phrases in a way that makes it easier for me to decide whether I even want to visit the page to determine relevance. But this could be a subjective preference.
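For anyone who wants to repeat that kind of spot check, here's a minimal sketch. The file name `study_queries.txt` is my own placeholder for wherever you save the published keyword list:

```python
import random

# Load the published study queries, one per line
# (the file name is a placeholder, not from the study)
with open("study_queries.txt") as f:
    queries = [line.strip() for line in f if line.strip()]

# Pick five at random to judge by hand, as described above
for query in random.sample(queries, 5):
    print(query)
```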
Coincidentally, when AltaVista commissioned eTesting Labs to compare search engines, AltaVista was judged the most relevant:
Strangely enough, when Ask Jeeves commissioned eTesting Labs to compare self-service search interfaces, the AskJeeves product came out on top:
It helps a lot to pick the ground rules, what queries to throw out, etc. :)
Perhaps if I commission a study for my own site, it'll beat Google, too?
Somehow, I don't think a simple "study" will get people to switch to 'brand x' from Google.
Even with a bias in the study, imho, the results from Inktomi are 2nd only to Google.
Now, if they would only kick out all the paid-for spam (XML CPC stuff), they'd have quite an engine.
Not to mention that I'm sure the FTC will do something, at some point, about all the non-labeled advertising being floated around...
If you look at relevancy historically, Inktomi and FAST have been doing much better than Google. Google is a good search engine - they just haven't improved much over the last year or so. FAST and Inktomi have.
Anyway, this study shows the need for a standard for relevancy, as spelled out by Danny Sullivan some weeks ago in ClickZ. If the search engines do not work together on a standard, others will - and the engines most likely won't like it.
Whenever search engines come out with a new version they brag about:
1. How fresh they are - and they have numbers to prove it
2. How big they are - and they have numbers to prove it
3. How relevant they are - but they have NO numbers to prove it
So, dear search engines, get your act together and make that standard. Also, remember to show all results - both good and bad - and not like we have seen with the NPD studies, where only the good results are made public.
As to the results of the portal test you mention: that comes as no real surprise to any open-minded user.
Google is vastly overrated, even by experienced web users.
I think it's true that Google has not substantially improved their relevancy for a long time. They are still good though. The question is just how scalable their algo is.
So the question of how relevant Inktomi results are would only be useful in convincing Yahoo! to ditch Google for its search results. For the average user, it is (next to) useless!
Only this way can you present the result URLs in an SE-independent style, so that the judges can make unbiased decisions.
And of course you'd need a wide variety of judges, to represent all internet users. Difficult, but it would be worth the effort.
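Here's a rough sketch of such a blind judging harness, assuming you've already fetched each engine's top URLs for a query. All the names and placeholder URLs below are mine, not anything from the study:

```python
import random

# Top results per engine for one query (placeholder data)
results = {
    "engine_a": ["http://example.com/1", "http://example.com/2"],
    "engine_b": ["http://example.org/1", "http://example.org/2"],
}

# Strip the engine labels and shuffle, so the judge sees bare URLs
# in random order with no hint of which engine produced them.
pool = [(engine, url) for engine, urls in results.items() for url in urls]
random.shuffle(pool)

judgments = {}
for engine, url in pool:
    verdict = input(f"Relevant? (y/n) {url} ")  # judge never sees the engine name
    judgments.setdefault(engine, []).append(verdict.lower() == "y")

# Only after all verdicts are in do we unblind and score each engine
for engine, marks in judgments.items():
    print(engine, sum(marks) / len(marks))
```

Collect verdicts from a wide pool of judges this way and the presentation bias - highlighting, snippets, branding - drops out of the comparison entirely.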
There is another industry that is a hobby of mine, and there are several magazines that cover it. But there is one that is independent and gives unaltered results, and for that reason I trust it more than the others.
When you have your own search engine or directory, it is easy to point out all of your good points and none of the bad, which is what I am seeing here. Another downside to this study is that most of these engines are pay-per-click/performance. Google is not, and the rest, I think, are paid. So automatically you will have skewed results. IMHO this points out that even at Inktomi's best they are still below Google's index, because Google is free and Inktomi charges for listings, as do the rest.
There is an old saying one of my Gunnys told me: "You can't polish a turd." I think it fits in this instance. No matter how hard Inktomi tries, they will still be subpar compared with Google as long as they are providing paid search results against Google's free ones.
However, the good side of this biased test is that Inktomi can "say" that they provide the best search results - but only just barely.