|Mahalo Goes Public Today (Alpha)|
We've Created the Top 4,000 SERPs - So Far
Human-powered search engine Mahalo.com launches with investors including Sequoia Capital, Elon Musk, Newscorp, CBS Corporation and Burda Media.
Company Name: Mahalo.com, Inc.
Located: 10,000 square foot factory in Santa Monica.
Category: Human-powered search.
Launch Time/Date: 3PM PST, May 30th 2007.
Launch location: Wall Street Journal's D Conference, Four Seasons Aviara (northern San Diego). D Conference Website: http://d.wsj.com.
Product Launch Schedule: Currently in Alpha with the Internet's 4,000 most popular searches; will move to Beta at the end of 2007 with 10,000 pages, and launch shortly thereafter.
Founder & CEO: Jason McCabe Calacanis, 36 years old.
Previously: CEO of Weblogs, Inc. Weblogs, Inc. was purchased by AOL/TW for a reported $30M; CEO of Silicon Alley Reporter magazine and Venture Reporter; purchased by Dow Jones.
Investment: Two rounds, amount undisclosed.
The full press release [mahalo.com]
Must be nice to have friends in high places.
Unfortunately I think Jason made the mistake of talking down to the SEO community. His link bait campaign over the past year or so has been somewhat questionable. Based on the negative responses I've seen in many fora, ole Jason is going to have to rely on advertising dollars for some exposure. I doubt that the SEO community is going to provide much love.
I really like the concept though. I just wish someone different were at the helm. Based on what I've read from Jason, he thinks all of us SEOs are going to be out of work once his human-powered search engine becomes popular.
|We're not the destination; we don't produce the content. Rather, we look at all the content that's out there, and we organize it and help people find it. |
|Also, the DMOZ and Yahoo! Directory have not been maintained over the past decade, so their results are hit and miss. We only include the *best* links on our pages; nothing borderline. So, I think our results are much higher quality. |
Hmmm, just wait until money crosses the table.
BTW Jason, please, no more videos, please?
<added> Just realized you posted this topic on 2007 May 31. It took 12 days for the first response. That should give us some indication as to how much support Jason is going to get from the Webmaster communities. ;)
|Human-powered search engine |
Why are they calling this thing a search engine? Isn't it more like a human-edited directory?
And adding 6,000 more pages by the end of the year doesn't sound especially ambitious; it's something like 30 - 40 pages a day. If it takes them a "couple hours" to build a page, they have what, maybe 7 - 10 "guides" working on building new pages?
Or is my math wrong?
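A quick back-of-envelope check of that math. Every number below is an assumption pulled from the thread (4,000 pages at launch, 10,000 by year end, "a couple hours" per page), not an official Mahalo figure:

```python
# Rough check of the post's math; all inputs are guesses from the thread,
# not official Mahalo numbers.

pages_needed = 10_000 - 4_000            # alpha (4,000) to beta (10,000)
days_remaining = 210                     # roughly May 30 to Dec 31
pages_per_day = pages_needed / days_remaining   # about 28.6 pages/day

hours_per_page = 3                       # "a couple hours" per page
pages_per_guide_per_day = 8 / hours_per_page    # about 2.7 in an 8-hour day
guides_needed = pages_per_day / pages_per_guide_per_day

print(f"{pages_per_day:.1f} pages/day, {guides_needed:.1f} guides")
```

So the "7 - 10 guides" guess is in the right ballpark, give or take how long a page really takes.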
Search for Mahalo Greenhouse.
|Oh yeah, if we accept your search result we will pay you $10 to $15 per search result. |
Hmmm, just how much funding did they get? :)
|Please note that at this time we can only pay U.S. citizens. If you are not a U.S. citizen, we will make a donation to the Wikimedia Foundation in your name. |
|will move to Beta at the end of 2007 with 10,000 pages |
Is this a joke?! I developed a specialized search engine, and it adds 50,000 'cached' (filtered, mined, processed) pages a day to a web interface, and it crawls over 300,000 pages a day over an ordinary 6 Mbps cable Internet connection.
They have manually edited search results there, hence the very low number of unique keywords, and "SERPs" rather than "pages": nothing is indexed the way a proper search engine does it. This is a Web 2.0 directory. Pretty amazing what you can raise money for these days.
|company hopes to reach 10,000 search terms by the end of the year |
Now it's clear: it's not 10,000 pages.
This reminds me of some well-known bankruptcies from the past few years where the bankruptcy was planned from the start: spread the idea, run good advertisements, take some investment, hit the stock market, let the company die... and the money just travels from one pocket to another...
The founder of this company appears to be a good guy, but he certainly does not fully appreciate the challenges of building a proper search engine. Then again, he is not building one: reportedly, for queries with no manual result they will fall back to Google's results. In that case it seems simpler to use Google in the first place, but time will soon tell whether his approach is successful.
Sounds like an attempt to prove the Infinite Number Of Monkeys theory. :) I still have grave reservations about this kind of venture given the shifts in the web landscape that happen daily. The competition between the big search engines is centered on fresh and accurate SERPs. This venture will be providing stale SERPs.
Mahalo (thanks) for the great feedback everyone.
1. In terms of the number of SERPs we are over 6,000 now.
2. We have launched a program for the public to be involved called the Mahalo Greenhouse: [greenhouse.mahalo.com....]
3. Over 100 people are approved and working in the Greenhouse and we're about to break 50 accepted SERPs.
4. Note: each SERP represents a couple dozen searches (i.e. Paris Hotels = Hotels in Paris, Paris Lodging, Paris Hostels, etc.), so we are not doing just "10,000" searches. We will have 10,000 pages, which I guess will service 100,000 to 250,000 searches by the end of the year.
5. We use a three step process for creating SERPs:
--- a) Part-time Guides (PTGs) create pages in the Greenhouse
--- b) Full-time Guides (FTGs) in our office in Santa Monica do quality control on these SERPs (i.e. taking out spam or low-quality links that might have slipped in, fixing typos, removing dead links, etc.).
--- c) The public debates the SERPs on the discuss page and submits links they think are missing.
So, it's an organic directory/search service and a process for keeping it up to date. It will take three years to build, but when it's done it will be very, very helpful for people.
If you want to help please submit links and/or join the Greenhouse!
all the best,
Nice to see you posting on the thread Jason.
This is one potential flaw that I can immediately see. The rate of change of the web makes it very difficult for a human-based tracking system to keep a SERP page's dead/bad links up to date. The process has to be automated, with some kind of human overview. DMOZ used to link check, but it did so on such a long cycle that dropped and re-registered domains slipped right through its link quality process. With the gTLDs this is a recognised pattern; with ccTLDs, the problem is less clear. Even Google has difficulty with re-registered ccTLDs.
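A minimal sketch of what that automated pass with human overview might look like. This is hypothetical code, not anything Mahalo has described, and the status-code thresholds are my own assumptions:

```python
from typing import Callable, Optional

def classify(status: Optional[int]) -> str:
    """Map an HTTP status (None = no response at all) to an action."""
    if status is None:
        return "dead"          # timeout / DNS failure: drop the link
    if status == 200:
        return "ok"
    if status in (301, 302):
        return "review"        # redirect may hide a re-registered domain
    if status >= 400:
        return "dead"          # hard client/server error
    return "review"            # anything odd goes to a human guide

def sweep(urls: list[str],
          fetch: Callable[[str], Optional[int]]) -> dict[str, str]:
    """Automated pass over a page's links; only 'review' items need a human."""
    return {url: classify(fetch(url)) for url in urls}
```

The point is the split: the machine handles the unambiguous cases (200s and hard errors) on a short cycle, and the human guides only see the redirects and oddities, which is exactly where a dropped-and-re-registered domain could otherwise slip through.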
|So, it's an organic directory/search service and a process for keeping it up to date. |
Theoretically it will never be completed. The journey is the reward. :)
|It will take three years to build, but when it's done it will be very, very helpful for people. |