Google now operates a global infrastructure of more than 250,000 Linux-based servers of its own design, according to one Google executive I spoke with, and it is becoming a major consumer of electrical power, computer hardware, and telecommunications bandwidth.
Microsoft's Challenge: Web search
Google has already built the loyalty and branding for Web search
Microsoft has the advantage of local computer, network, and software application search.
If I were Google, I would not put this off for very long. Microsoft can definitely start to brand its search technology from these applications. A new software application may include a Microsoft search for files and info, and extend out to the web to retrieve other info about the software being installed. This could even be used to make MSN the default search engine automatically when the software is installed. This is a new market that has not been fully exploited, and a niche in which Microsoft already has the advantage.
Instead of competing through incremental improvements in the quality and range of their search services, Microsoft, Google, and Yahoo will be forced into a winner-take-all competition for control of industry standards.
This is my feeling as well. Google is always improving and that's great. They were allowed to introduce new features as and when they thought they were ready. They could work on their own timeline and do things slowly but surely because for the longest time, nobody else did a darned thing to improve search!
But, the next go round is going to be big! Bill Gates doesn't think small. He is going to release what he thinks is the best thing since "search" itself. There's only going to be one big winner. Whichever one of the big three is able to set the world on its collective ear ... will win this war.
Nobody had better jump the gun before they are certain they will dazzle the world. But the first one to do it, and do it well, will be the hands-down winner. The others will just be also-rans riding on the winner's coattails.
It's going to be fun to watch as it unfolds. Place your bets, ladies and gentlemen. I think I'll sit this one out myself. I am a big fan of Google and would prefer they win ... but I wouldn't bet against Bill Gates on this one!
Web search is a service, and in a service, the quality of service matters. MS has never won a service war, only a product war.
From what I have seen of Google's strategy so far, it seems to be sound:
1. Index deep.
2. Go beyond the web
3. Earn revenue from increased distribution
4. Make search convenient: fast, desktop, etc.
5. Build a WebOS
6. Don't be evil
These 6 are common sense strategies and if they stick to them, they should have a sound future.
For all that is said, MS also sticks to some common sense strategies that have seen it win many battles:
1. Make everything easy to use
2. Provide reasonably good quality
3. Provide it cheap
4. Push it to the max
5. Get developers on your side
It beat Netscape, Apple, Novell, IBM, etc. using just these five strategies. But these strategies are blunt against Google, because Google is already doing the first four, and there isn't much scope for the fifth in search.
The big question is what will happen when MS provides integrated desktop search? The answer is that Google still wins if it follows its own points 1 and 2 and stays ahead of MS. People who are searching will go to Google.
Further, we are moving to the high bandwidth era, where we are using more web applications than ever before. If Google can successfully engineer some key applications (such as Gmail) to be equivalent to desktop software (such as Outlook), people will automatically migrate to web apps as they are completely portable.
I am also surprised that the author hasn't spoken about patent acquisition as a strategic advantage. We have seen many tech wars won as a result of patents (Minolta vs Carl Zeiss, for example). This important factor could decide the MS vs Google battle. Both players realize the importance of patents and must be amassing them. Google, of course, has a head start in this as far as search and a WebOS go.
As far as APIs are concerned, I believe, Google will provide full fledged APIs when it can successfully offer a WebOS. Possibly just before Longhorn.
Let us not underestimate the Linux factor in all this. In one or two years, Linux will be as friendly to use as Windows (there are still some issues with fonts, installations, etc.). When the time comes for people to discard Windows XP, the big question is whether they will go for Longhorn or the new Linux. In my opinion, it will be the new Linux.
The future: People will "upgrade" from Windows to Linux; and use more web apps as compared to desktop apps.
Has MS considered building a WebOS? No news there so far. If they do, then we are talking serious competition to Google in a few years.
I find the author's strategy strange. His attaching all value to APIs appears to be misplaced.
Interesting post, Namaste. If you haven't read Ferguson's book, I'd recommend it. It gives a lot of context that explains how he arrived at those thoughts.
(I'm traveling, but had a few minutes to browse..)
MIT sells its soul to the advertisers? This is a particularly good example of how not to win friends and influence people. Probably great content, but a terrible, headache-inducing website!
P.S. If anyone has any tips on how to nuke these adverts let me know.
The Firefox Adblock extension is your friend. Block ads using wildcards; after a few times you won't see any ads. For example, once doubleclick.com/* is blocked, you will never see a DoubleClick ad again. I never saw the ad in question, since I blocked that ad site long ago. You can shut it all off: Google AdSense, whatever you want.
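For reference, a few wildcard filters of the kind Adblock for Firefox accepted. The doubleclick.com pattern is from the post above; the other hostnames are my own illustrative guesses at common ad-serving domains of the time, not an authoritative blocklist:

```
doubleclick.com/*
*.doubleclick.net/*
pagead2.googlesyndication.com/*
```

Each pattern blocks any URL it matches, so one line per ad network is usually enough.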
Since I had the ads turned off, I'll fill you in. Typical contentless drivel, half baked pseudo thinking, pathetic conjectures. No wonder those guys went out of business. In other words you would be about as well off looking at the flash ads as reading this if you're looking for a deeper grasp of the question. Read the whole thing in about 10 minutes, hoping that maybe something worthwhile would come on the next page, 1, 2, 3... got to 9 and nothing.
Some random thoughts...
Control the enterprise API and you've got people locked in. I grew up in an industry that made billions of dollars for several decades by controlling terminal emulation APIs (think HLLAPI). I know of at least one super-duper SEO here (hope he's reading this) who will attest to the value of APIs in the enterprise space.
The desktop market is firmly controlled by MS because of APIs like COM, or whatever it's morphed into.
If you make your APIs accessible, easy to use and develop to, people will write the applications to use your services.
Amazon remains one of the best examples of what a company can do by releasing APIs.
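Part of why Amazon's API caught on is that it was plain REST: a request was just a URL you could paste into a browser. As a sketch, here is how a client might build an ItemSearch request. The endpoint and parameter names reflect my recollection of the 2004-era E-Commerce Service and should be treated as assumptions; the subscription ID is a placeholder.

```python
from urllib.parse import urlencode

# Assumed endpoint of Amazon's E-Commerce Service (ECS) circa 2004.
ENDPOINT = "http://webservices.amazon.com/onca/xml"

def item_search_url(keywords, search_index="Books",
                    subscription_id="INSERT-KEY-HERE"):
    """Build an ItemSearch request URL (no network call is made here)."""
    params = {
        "Service": "AWSECommerceService",   # assumed service name
        "SubscriptionId": subscription_id,  # the developer's key
        "Operation": "ItemSearch",
        "SearchIndex": search_index,
        "Keywords": keywords,
    }
    return ENDPOINT + "?" + urlencode(params)

print(item_search_url("google microsoft search"))
```

The response came back as XML that a site could restyle however it liked, which is exactly the low barrier to entry the post is pointing at.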
I've not looked at eBay's API, but have heard good things about it.
>> Build a WebOS
Which will need huge amounts of middleware, which will need standard APIs.
Most of the Java and .NET stuff is beyond the grasp of simpletons like me.
If they are not careful, they risk being regulated to the same extent as IBM was. It was regulation that cost IBM the PC hardware market and, much worse, the PC software business.
Regulation, or rather the fear of regulation, could cost Microsoft the search market. Interestingly enough, the search space over the long term could be of greater significance than the browser/OS/Office markets.
Microsoft must know that they won't get away with a limp smack on the wrist next time. If they abuse their position again, they get the full treatment, if not in the States, then elsewhere, probably Europe. After all, Europe doesn't have a big economic stake in Microsoft's future. Quite the opposite, in fact.
I doubt it will be as one sided as many believe.
Microsoft has already announced that it intends to provide third-party developers with APIs to its new search engine, enabling them to construct applications based on it.
But I don't regard 'Web Search' as that important a function to require APIs. I still regard it as a way to quickly access the statically delivered content on the web. Obviously Google already have a developer API (with very limited usage) and AdSense websearch. Amazon have their new e-commerce web service that allows you access to their search results (Google-derived and Alexa-enhanced), which is in beta and may be subject to a charge in the future. Both of these seem to allow websites to incorporate their own websearch facilities. I don't think they will be taken up in large enough numbers to have a big impact on searches done.
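To make the "very limited usage" point concrete, here is a purely hypothetical sketch of a site-side wrapper around such a developer search API. Every name here is invented for illustration; the quota mirrors the roughly 1,000-queries-per-day limit Google's developer API had at the time, and the stubbed result stands in for a real SOAP/REST call.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SearchResult:
    title: str
    url: str
    snippet: str

class WebSearchClient:
    """Hypothetical client for a rate-limited developer search API."""

    def __init__(self, license_key: str, daily_limit: int = 1000):
        self.license_key = license_key
        self.daily_limit = daily_limit  # e.g. ~1,000 queries/day
        self.used = 0

    def search(self, query: str) -> List[SearchResult]:
        if self.used >= self.daily_limit:
            raise RuntimeError("daily query quota exhausted")
        self.used += 1
        # A real client would send a SOAP/REST request here;
        # this stub just returns a placeholder result.
        return [SearchResult(title=f"Result for {query!r}",
                             url="http://example.com",
                             snippet="...")]
```

With a quota that small, a site of any real traffic couldn't build its search box on the API, which is why take-up stayed limited.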
What he is talking about is the next generation of search - the one that includes the 'hidden web' desktop PC file systems, emails, handhelds, and Linux. To provide a cross-platform access to all of this would be nice - but hardly a 'killer app'. I haven't bothered to download Google or MS desktop search - I know where my files are and what they contain and can use windows explorer to check them. I only need a deep search of previous web pages/emails/files about every 2 weeks. If you said I could search and access the text of any book ever written, any software, any album details (cover/real lyrics/ track listing/sample), access MP3s of my music and the music collections of any friends (wishing all my vinyl was converted to MP3) - then I would be excited both in my working and home life. But copyright prevents a lot of this and I couldn't afford to actually purchase these as products.
I just fail to make that jump from search being a quick, sometimes frustrating, way to access web content to being the nervous system that unifies my informational world. In the long term (10-20 years?) it will be that. But for the next few years, when the Google/MS competition will take place, it is web search that will be the battleground. I used to think the real crux could be how you access the search: when the browser disappears from Windows and becomes part of the desktop, Microsoft can make it awkward for people to change the default search from MSN to Google. It didn't work last time with the built-in IE search, but maybe it will work better this time. Luckily, Google should have the financial clout to quickly stop Microsoft using any unfair tactics, unlike some other companies in the past who have had to wait 5 years for their multi-million dollar settlements that are just loose change for Mr Gates.
I realise that I am holding up my hand and saying I just don't have the imagination/foresight to see how APIs and extending the search content is the next step. Given that Microsoft won't be able to just leverage control of the major operating system to eliminate Google, I keep coming back to the thought that for the next 5 years it's the same old, same old thing: quality of the search results. All the pain of the Florida update, the obfuscations that have reduced the power of PageRank, the 'filters/sandbox/hilltop/anchor text/over-optimisation penalties', have failed to produce better Google search results. Google needed to move from keyword-based search with PageRank to something else (now that PageRank was understood and spammed rather than reflecting natural web linking). I really believed that Google was going to move to the next stage and figure out what a page was about before serving it as a result, as opposed to ever more elaborate counts/weightings of keywords in the document. But they have failed. I think that Yahoo and Teoma may now be its equal and that Microsoft may catch up in a year. Google could become a minor player long before the big battle over control of access to digital content is fought, if one of the other players comes up with a search engine that actually understands something about what the user is searching for.
While this is no more an official number than what was mentioned here [webmasterworld.com] it is still a higher number than the "more than 10,000" we see referred to so often.
Assuming a harddrive size of 100Mb and an average page size of 10Kb (HTML only); with the current published index size of 8,058,044,651 pages, that's a compression ratio of around 100-to-31% (assuming that every single bit of the HTML is stored).
Although this seems reasonable, I still believe that they've got more machines than that, and also more storage capacity than that (for old indexes and so on).
A typo, I assume; I'd assume an average hard drive size of 80 gigabytes, and an average page size of 20 kB. Minus the OS and swap partitions, and allowing for 10-15% free space on the disks to maximize disk life, say roughly 70 gigabytes available per machine. With maybe a 3-to-1 compression ratio.
250,000 machines @ 70 GB should make room for roughly 875,000,000,000 pages at zero compression, assuming the average page size is 20 kB and not 10 kB.
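The back-of-the-envelope figures above check out. A quick sketch, using the thread's assumptions (250,000 machines, ~70 GB usable each, 20 kB average page size, and the suggested 3-to-1 compression):

```python
# Capacity estimate from the thread's assumptions.
machines = 250_000
usable_per_machine = 70 * 10**9   # ~70 GB usable, in bytes
page_size = 20 * 10**3            # 20 kB average page, in bytes

total_bytes = machines * usable_per_machine
pages = total_bytes // page_size            # pages at zero compression
compressed_pages = pages * 3                # with ~3-to-1 compression

print(pages)             # 875,000,000,000
print(compressed_pages)  # 2,625,000,000,000
```

Either figure dwarfs the roughly 8 billion pages in the published index, which supports the point that the machines are doing far more than storing one copy of the crawl.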
Search APIs will be a part of a new generation standard (whatever that is) and will be a small part.
Anyways, good article, puts a lot in perspective for me as an individual business owner. The emphasis on architecture is brilliant and rightly should extend into whatever your business is -- affiliate, publisher, vendor.
These are all "APIs"... but unofficially promoted by Google. If they wanted, they could shut the tap on these anytime.