Steve joked about being the new kid on the block but went on to build expectations for Wisenut's relaunched site. He said that since acquiring the engine, Looksmart has been rebuilding its infrastructure. Today, they are rolling out their enhanced site and updated index. The updated index contains 1.1 billion documents. Steve asserted that by Q2 they would have a 30 day refresh cycle in place.
In response to various questions to the whole panel about crawler features, Steve's answers indicate that Looksmart is putting significant effort into building a feature rich, highly competitive engine. [anyone who was there want to add specifics?]
Gemignani also claimed that one of their key differentiators is the Looksmart human-powered directory. They use this directory to seed their crawls. Beyond that, he seemed to be suggesting that the directory plays a role in their ranking algorithm [I'm mostly inferring this. If anyone got a specific quote here, please add it.]
When an audience member asked, basically, why should we care about Wisenut, Steve declined to offer specifics on distribution deals but very directly said something along the lines of "watch for big things from us" (loose quote).
This post is meant to summarize Looksmart's presentation and give those who weren't there the gist of the news. Steve was a bit less reserved in his comments than some of the other crawler reps, but all of the reps tend towards the cryptic. So, check out Wisenut again, judge for yourself and--I guess--watch for big things...
I personally would like to see Wisenut do well, we need a bit more competition in this industry.
The 30 day updates will be a start, and using the LookSmart directory (especially the Zeal non-commercial cats) will add a human touch to the SERPs (depending how it affects the algo, of course).
Google is now well past 3 billion, and when WiseNut launched it was competitive, numbers-wise.
To top that off, Google refreshes their index (3 times that size) every 30 days...and LookSmart says 'they will try to do 1/3 what Google does'...hm, that doesn't sound like good news.
Still, it's not a bad engine...if they really put some work into it. The bit about the servers is very important. :)
Way too slow to catch on to the general surfing public.
The wisenut rep was claiming that they now had a much more robust and speedy platform. I believe he said that the new index was currently 1.1 billion. He did not say this was as big as it would be getting. He specifically stated that they don't handle dynamic pages very well right now and that they are working on this. That would presumably boost the index size significantly.
Ya gotta think LS is trying to win an expanded or even exclusive deal at MSN, if not an outright acquisition by MS. Do they have a good enough relationship there that they've convinced MS to give them time to get Wisenut ready for primetime before it makes the move that so many people seem to be expecting?
Pure speculation on my part...
They can index 10 billion documents with every day refresh, and use CRAY servers, and it won't mean a thing to most of us, because Wisenut still won't be on the radar in our server logs!
LookSmart has survived because of the MSN deal, period!
They got a lot of negative PR with Webbies when they pulled their revenue model out from underneath a lot of companies that had bought into the LS directory listing annual subscription.
LS says they have made up the lost ground financially after switching to a CPC model, but after MSN, I hardly consider BlowSearch a quality distribution partner.
using the looksmart directory (especially the zeal non commercial cats) will add a human touch to the SERPS...
In the case of Looksmart, human=$, and if the algo of a search engine that owns Looksmart gives a boost to Looksmart subscribers, there's a skewing of the playing field that might make the FTC take notice. Where, then, do the paid listings leave off and the organic results begin?
We already have the situation where people are buying PageRank on Yahoo, and we not only live with the situation, but some of us use it. If Google owned Yahoo, though, I think the situation would get a little more clouded.
But Steve, if you see this post, tell your bot to ease up a bit. It is bad webmaster relations to send it out at these speeds. I saw this little guy:
Mozilla/4.0 compatible ZyBorg/1.0 (firstname.lastname@example.org; WISEnutbot.com)
pull down something like 6,000 pages an hour for a couple of hours the other night. Today she/he/it pulled down something like 2,000-4,000 an hour for 15 straight hours. By comparison, Miss Googlebot (I'm convinced it's a she), be she Miss Deep or her twin sister Freshy, pulls down no more than 1 page every 3 or 4 seconds - 1,000 or so per hour.
If Steve is looking at freshness, a crawl like that every month, while not fatal, is gonna cause some problems. I can't even imagine what will happen if Miss Deep and ZyBorg show up at the same time every month. Most of the content is dynamic and involves a VB script with 8 queries through MS Index Server, so that many pages is a heavy load.
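For anyone else getting hammered like this, one option is a Crawl-delay line in robots.txt. A hedged sketch only: Crawl-delay is a non-standard directive that some crawlers honor and others silently ignore, and whether ZyBorg respects it (or what user-agent token it matches on) is an assumption here - check the bot's own documentation before relying on it.

```
# Hypothetical robots.txt throttle.
# The "ZyBorg" token and the delay values are assumptions;
# Crawl-delay is non-standard and not every bot honors it.
User-agent: ZyBorg
Crawl-delay: 10

User-agent: *
Crawl-delay: 5
```

At 10 seconds between requests that works out to roughly 360 pages an hour, which would be a lot kinder to a dynamic, database-backed site than 6,000.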
The clarification is appreciated -> but, as we've been saying, big talk is just big talk until something happens.
Previously, if you searched, most of WiseNut's results seemed to be derived from spidering other SE's...perhaps the larger Web Graph is a result of WiseNut switching from a meta engine derived base to the Zeal / LookSmart directory as a seed for their spidering.
Time will tell if something happens. Though, after trying more than a year ago to sell family and friends on WiseNut, I'm not going to do that again till I know that they have at least 10,000 servers and a few more gerbils to run them. :)
They are slow as dirt. Really. Have you tried it before?
Honestly, if they can't get the speed up to par, they don't have a chance of making anybody switch.
<--- I like what they have, a lot. But I can't stand it when I'm doing research and I get results back a minute-plus later.
It's unbearable. Google has set the standard there, searching needs to be fast. It needs to happen at the speed of the internet.