
Should we force Googlebot to become Open Source?

8:59 pm on May 4, 2012 (gmt 0)

Googlebot is a phenomenal crawler, the best at what it does, and I can't see it being beaten any time soon. However, I can't help wondering whether, in order to create a fair market, Google should be forced to become a raw data provider, with a market of search engines using that data to produce their own search results. They would be able to choose different data providers to enhance their results, and Google Search would become one interpretation of Googlebot's raw data alongside competing search engines that might choose to interpret the data another way. This could create a fair and competitive market in search.
2:43 am on May 5, 2012 (gmt 0)

WebmasterWorld Senior Member lucy24
Google should be forced

Inquiring minds want to know: How?
10:43 am on May 5, 2012 (gmt 0)

WebmasterWorld Senior Member andy_langton
Crawling is actually one of the more straightforward elements of running a search engine; it's primarily data retrieval. There's a whole host of good Open Source options if you want to run your own crawler, from Nutch to Heritrix. You still need the bandwidth and storage space, of course, which may be what you're hinting at.
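To illustrate why the crawling side is "primarily data retrieval": the core loop of any crawler is just fetch a page, extract its links, and queue the ones you haven't seen. A minimal sketch of that loop using only Python's standard library is below; the `fetch` callable is a hypothetical stand-in for real HTTP retrieval (and a production crawler would also need robots.txt handling, politeness delays, and so on).

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, fetch, max_pages=10):
    """Breadth-first crawl starting from start_url.

    `fetch` is any callable mapping a URL to its HTML string
    (a stand-in for real HTTP retrieval). Returns a dict of
    {url: html} for every page visited.
    """
    seen = {start_url}
    queue = deque([start_url])
    pages = {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        html = fetch(url)
        pages[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)  # resolve relative links
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return pages
```

The hard parts Google faces, scale aside, live outside this loop: scheduling recrawls, deduplication, and everything downstream that turns the fetched data into rankings.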

But the crawler side (in terms of the software, at least) isn't what provides Google's commercial advantage. The key is in processing the data that the crawler uncovers. I think you'd be hard-pressed to argue that Bing's problems lie with a lack of data.

And as lucy24 says, even if there were a good argument in favour (which I don't believe there is), how would you force Google to do such a thing?