TheMadScientist - 3:18 pm on Jan 13, 2013 (gmt 0)
They can't even index the whole thing, so I think treating Chrome visits to sites/pages that have 'surfaced', or even Analytics data, as 'telling and definitive' grossly underestimates the size of the Internet and the relative % of pages and sites people actually visit ... In 2005, based on size estimates, Schmidt said it would take 300 years to index it all even if all growth stopped. (Of course that was before Caffeine, so maybe they got it down to something reasonable, like 30 years - assuming there aren't still 700,000 pages a minute being added.)
The Internet comprises approximately 78 million servers spanning the globe (that number is quite possibly very low). Information on the Internet is measured in Terabytes, and a Terabyte is 1,000 Gigabytes. One 2005 estimate by Eric Schmidt, CEO of Google, put the figure at nearly five million Terabytes of information on the Web.
For a sense of how large that really is: Google's search engines had managed to index about 200 Terabytes in seven years (as of 2005), and 200 Terabytes is only 0.004% of five million Terabytes!
Meanwhile, 700,000 new pages of information per minute are added to that tally. Even if the Internet stopped all forward progress, it would take another 300 years for Google to index it all.
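For what it's worth, the 0.004% figure from the quote checks out; a quick back-of-the-envelope sketch (using the quoted 2005 numbers, which are of course only estimates):

```python
# Sanity-check the indexed-fraction figure quoted above (2005 estimates).
indexed_tb = 200          # Terabytes Google had reportedly indexed by 2005
total_tb = 5_000_000      # Schmidt's 2005 estimate of information on the Web

# Percentage of the estimated Web that 200 TB represents.
fraction_indexed = 100 * indexed_tb / total_tb
print(fraction_indexed)   # → 0.004
```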