Alternative Search Engines Forum
Matt Wells interview
Gigablast search engine
robjones · msg:460840 · 4:53 am on Apr 17, 2004 (gmt 0)

I hope it's all right to link to this:
[acmqueue.org...]

I thought it was an interesting read anyway. He gives some insight into what it's like to run your own search engine and the problems he faces, and he talks about Google and his view of the future of search.

 

Marcia · msg:460841 · 5:53 am on Apr 17, 2004 (gmt 0)

Very, very interesting article. Nicely done, with very understandable explanations by Matt Wells.

Freedom · msg:460842 · 6:08 am on Apr 17, 2004 (gmt 0)

I do not think Google's results are the best anymore, but the other engines really are not offering enough for searchers to switch.

Fascinating article altogether.

<Added> I'll be happy for him when a big buyout or venture capital comes his way. If anyone deserves it...

[edited by: Freedom at 6:10 am (utc) on April 17, 2004]

nakulgoyal · msg:460843 · 6:08 am on Apr 17, 2004 (gmt 0)

Indeed a good article. Thanks for the link.

sidyadav · msg:460844 · 11:43 am on Apr 17, 2004 (gmt 0)

Some excellent information in there.

Especially informative for people like me :)

Sid

Robert Charlton · msg:460845 · 7:12 am on Apr 18, 2004 (gmt 0)

Great article... fascinating at many levels, from nuts and bolts to visionary. It gives me added respect for Matt Wells, who was already kind of a hero.

I thought that this was a particularly intriguing observation...

I would suppose that the amount of information stored on the Internet is around the level of the adult human brain. Now we just need some higher-order functionality to really take advantage of it. At one point we may even discover the protocol used in the brain and extend it with an interface to an Internet search engine.

This brings to mind a related observation by Albert-Laszlo Barabasi [webmasterworld.com] that (paraphrasing) "the human body is an example of a dynamic network that has evolved over millions of years. The internet is less than 30 years old."

mbauser2 · msg:460846 · 10:03 am on Apr 18, 2004 (gmt 0)

I thought the interview was rather amateurish, myself (Come on, "tell a little about your background"? That's the textbook example of a lame question.), but I saw one interesting thing: Matt's apparently changed his mind about XML. He used to think it was "bloated and ugly", but now he thinks it will replace SQL.

Anyway, if anyone cares, Slashdot found the interview, and many posters there think it's all hype.

Freedom · msg:460847 · 10:30 am on Apr 18, 2004 (gmt 0)

The interview reads as amateurish because of who was conducting it. He has never been known for his professional journalism skills; rather, for something else.

gethan · msg:460848 · 11:04 am on Apr 18, 2004 (gmt 0)

Great article - I hope Matt can get some deal arranged with someone and give Google a real competitor.

Interesting opinions on the death of PageRank - the problem of not using PR is that we go back to the keyword-laden domain names scenario... not pretty.

Chris_D · msg:460849 · 2:05 am on Apr 20, 2004 (gmt 0)

It's actually quite interesting to compare Gigablast with other 'up-and-coming' engines like Nutch.

Nutch says:

Our current goal is to create a good-sized public demo that can handle moderate traffic. Even this takes a fair amount of hardware and bandwidth. Fortunately, the Internet Archive has donated bandwidth, so all that we need now is hardware. We estimate that a two-hundred-million page demo system that can handle moderate traffic will require less than $200,000 in hardware.

[nutch.org...]

Gigablast says:

Gigablast is a search engine that I've been working on for about the last three years. I wrote it entirely from scratch in C++. The only external tool or library I use is the zlib compression library. It runs on eight desktop machines, each with four 160-GB IDE hard drives, two gigs of RAM, and one 2.6-GHz Intel processor. It can hold up to 320 million Web pages (on 5 TB), handle about 40 queries per second and spider about eight million pages per day. Currently it serves half a million queries per day to various clients, including some meta search engines and some pay-per-click engines.

[acmqueue.com...]

and according to www.gigablast.com, Gigablast has

273,661,136 pages indexed

I know which one will have the lower overhead cost structure - Gigablast is already over 200 million pages, running on 8 PCs!
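
As a quick sanity check on those figures, here is a back-of-envelope calculation in Python using the numbers quoted above; the ~$1,000 per-machine price is an assumed 2004 desktop cost, not a number from either source:

```python
# Back-of-envelope comparison of the two clusters, using the figures
# quoted above. The $1,000-per-box price is an assumption for a 2004
# desktop with 2 GB RAM and four IDE drives, not a published number.

GIGABLAST_MACHINES = 8
GIGABLAST_CAPACITY_PAGES = 320_000_000   # "up to 320 million Web pages"
GIGABLAST_STORAGE_TB = 5                 # "(on 5 TB)"
GIGABLAST_INDEXED_PAGES = 273_661_136    # from www.gigablast.com at the time
ASSUMED_COST_PER_MACHINE = 1_000         # hypothetical 2004 desktop price

NUTCH_DEMO_PAGES = 200_000_000           # "two-hundred-million page demo"
NUTCH_DEMO_BUDGET = 200_000              # "less than $200,000 in hardware"

# Storage budget per page at full capacity: 5 TB / 320M pages
bytes_per_page = GIGABLAST_STORAGE_TB * 1e12 / GIGABLAST_CAPACITY_PAGES
print(f"Gigablast storage budget: ~{bytes_per_page / 1024:.1f} KB per page")

# Hardware dollars per million indexed pages, under the assumed price
giga_cost = GIGABLAST_MACHINES * ASSUMED_COST_PER_MACHINE
print(f"Gigablast: ~${giga_cost / (GIGABLAST_INDEXED_PAGES / 1e6):.0f} per million pages")
print(f"Nutch demo: ~${NUTCH_DEMO_BUDGET / (NUTCH_DEMO_PAGES / 1e6):.0f} per million pages")
```

Under that assumption the gap is roughly $29 versus $1,000 per million pages - though, as the next post points out, the comparison hinges on how much query traffic each setup is sized for.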

cutting · msg:460850 · 3:58 pm on May 12, 2004 (gmt 0)

I'm not convinced this is a fair comparison.

The use of the word "moderate" on Nutch's site is ambiguous. I meant several million searches per day (~50/second peak). One can also use Nutch to build a 200M page web search engine for less than $10,000, but it probably wouldn't be able to handle more than a few queries per second. I'll clarify that on the website.

Collecting and searching 200M pages is not very expensive. What's expensive is handling lots of traffic.

I have no idea how much traffic Gigablast sees, but I'd be surprised if they're handling millions of searches per day over 200M pages on just $8,000 of hardware.
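
To make that cost-versus-traffic point concrete, here is a rough sizing sketch in Python; the peak factor and per-machine query rate are illustrative assumptions, not figures from either project:

```python
# Rough sizing sketch for the point above: serving cost scales with
# traffic, not just collection size. Both parameters are illustrative
# assumptions, not Nutch or Gigablast specifications.

def machines_needed(queries_per_day, peak_factor=2.0, qps_per_machine=3.0):
    """Estimate front-line search machines for a given daily query load.

    peak_factor: assumed ratio of peak to average traffic.
    qps_per_machine: assumed sustained queries/sec one box can serve.
    """
    avg_qps = queries_per_day / 86_400
    peak_qps = avg_qps * peak_factor
    return max(1, round(peak_qps / qps_per_machine))

# A few queries per second is cheap; millions per day is not:
for load in (100_000, 500_000, 5_000_000):
    print(f"{load:>9,} queries/day -> ~{machines_needed(load)} machines")
```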

takagi · msg:460851 · 9:37 am on May 13, 2004 (gmt 0)

Hello cutting, welcome to WebmasterWorld.

I have no idea how much traffic Gigablast sees, but I'd be surprised if they're handling millions of searches per day over 200M pages on just $8,000 of hardware.

Well, most of the answers are in the message just before your posting:

  • Currently Gigablast serves half a million queries per day
  • Gigablast can handle about 40 queries per second

Handling 40 queries per second for a whole day works out to about 3.5 million queries a day. I'm not sure if Gigablast can really sustain 40 queries per second for 24 hours, but even at 60% of that peak load it would still handle more than 2 million queries per day. BTW, Gigablast is now searching 321 million pages, which is about the maximum on the 5 TB Matt Wells mentioned in the interview.
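
For reference, the arithmetic behind those figures as a quick Python check (the 60% sustained-load factor is the assumption stated above):

```python
# The arithmetic above, spelled out. 40 qps is the peak figure Matt
# Wells gives in the interview; 60% is takagi's assumed sustained load.

PEAK_QPS = 40
SECONDS_PER_DAY = 24 * 60 * 60            # 86,400

at_peak = PEAK_QPS * SECONDS_PER_DAY      # 3,456,000 -> ~3.5M queries/day
sustained = at_peak * 0.60                # 2,073,600 -> ~2M queries/day

print(f"At peak all day: {at_peak:,} queries")
print(f"At 60% of peak:  {sustained:,.0f} queries")
```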

ByronM · msg:460852 · 3:02 pm on May 13, 2004 (gmt 0)

500k queries a day is roughly 6 queries per second... not too shabby.

6 queries per second over 250 million pages is comparable to Nutch/Lucene on 10 servers if you tweak the configuration well enough. You have to figure that a good percentage of the queries can be cached by the OS based on trends, as not every query is unique.

However, 40 queries per second seems a little high... I can squeeze that through Nutch running a benchmark program based on dictionary terms (as they would be cached, "warmed up", over time).

When you do 40 queries per second, is that a single term, or with joins, an "and" or "or" type query?
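
To illustrate the caching point above, here is a minimal sketch of a query-result LRU cache in Python; the cache size and the search_backend hook are illustrative assumptions, not details from Nutch or Gigablast:

```python
# Minimal sketch of query-result caching: because query traffic is
# heavily skewed toward repeated terms, even a small cache absorbs a
# large share of the load. Capacity and backend are assumptions.

from collections import OrderedDict

class QueryCache:
    def __init__(self, capacity=10_000):
        self.capacity = capacity
        self.entries = OrderedDict()   # query -> results, in LRU order

    def get(self, query, search_backend):
        if query in self.entries:
            self.entries.move_to_end(query)      # mark as recently used
            return self.entries[query]
        results = search_backend(query)          # cache miss: hit the index
        self.entries[query] = results
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)     # evict least recently used
        return results
```

With a heavily skewed query distribution, a cache like this turns many of the raw queries per second into cache hits, which is one way sustained throughput can run well above what the index alone could serve.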
