jmccormac - 2:14 am on May 1, 2012 (gmt 0)
Thanks all. I think the real USP for a good SE is a combination of new, relevant sites and clean content. For the last ten years Google has been the 800-pound gorilla, but it is content quality that has really hurt the small and large search engines that tried to compete with Google/Yahoo/Microsoft. The Searchwikia venture, with its Social Media involvement, was a good test, but it paid very little attention to search index quality: all sorts of junk (in addition to compromised sites) went into the deployed index.

The other problem is that many small SEs adopt the same blind crawler approach used by Google and the larger SEs. It is akin to a Brute Force Attack in cryptography, trying every possible key to break a code: it requires a lot of computing power and a lot of servers. Since I don't have Google's technological resources and find the BFA method of site discovery inelegant, I reckon this project will just have to innovate, adapt and overcome (or at least be well enough designed to survive). But the marketing angle is where Google has the advantage.