Here's an interesting article about the power consumption of supercomputers. Google's CEO is quoted as saying the "power requirements of his company's Linux-based infrastructure were a major concern."
"A Linux cluster operating at 360 teraflops would require 15 to 20 megawatts," says IBM's Turek. "That's an electricity bill of $20 million to $30 million a year."
That's a big power bill.
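IBM's numbers roughly check out. A quick sanity check, assuming a flat commercial electricity rate of about $0.15/kWh (my assumption, a plausible 2004-era figure, not stated in the article):

```python
# Sanity-check IBM's estimate: a continuous 15-20 MW draw, billed
# at an assumed flat rate of $0.15/kWh.
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_power_bill(megawatts, dollars_per_kwh=0.15):
    """Annual electricity cost for a load drawn continuously all year."""
    kilowatt_hours = megawatts * 1000 * HOURS_PER_YEAR
    return kilowatt_hours * dollars_per_kwh

low = annual_power_bill(15)   # ~ $19.7 million
high = annual_power_bill(20)  # ~ $26.3 million
print(f"${low/1e6:.1f}M to ${high/1e6:.1f}M per year")
```

At that rate the 15-20 MW range lands right around the quoted $20-30 million a year.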
oldskool79
1:30 am on Dec 8, 2004 (gmt 0)
Is that the power requirement to run the computer, or to keep it cool?
If it's the latter... perhaps it would be cost-effective to move the G'plex to the south pole (hey... I doubt the engineers will mind... they don't get out much anyway)
TheDoctor
12:47 pm on Dec 9, 2004 (gmt 0)
the south pole
Or outer space? There was some talk back in the 70s, before the coming of the microcomputer, about the feasibility of putting giant mainframes in geostationary orbit. One of the arguments was the ease of keeping the machines cool at near-zero temperatures.