Saturday, December 1, 2007


So why would GOOG be serious about investing in promising new energy technologies? The amount of electric power consumed by these mega-data centers/server farms is staggering...

By last year, data centers scattered from northern Virginia to Washington State were consuming 1.5 percent of the nation’s electricity supply, the E.P.A. study says, straining the system in areas where power demand is high.

Companies tend to be secretive about how much it costs to run their servers, but several experts said that energy can account for 40 percent of the cost of operating a data center. In three years, the cost of running a server can top its purchase price. A rack of servers typically draws 10,000 watts or more, generating heat with every watt. For every watt used to run the computers, the farm may need nearly another watt for air-conditioning. It costs $4.5 billion a year for the electricity to run the nation’s server farms, according to the E.P.A., a sum ultimately picked up by consumers.
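A quick back-of-envelope check makes these numbers feel concrete. The sketch below takes the article's figures (a 10,000-watt rack, roughly one watt of cooling per watt of compute) and an assumed electricity rate of $0.10/kWh, which is a hypothetical placeholder and not from the article, to estimate annual power cost:

```python
# Back-of-envelope estimate of a server rack's annual electricity cost,
# using the article's figures plus one assumed rate (marked below).

HOURS_PER_YEAR = 24 * 365          # 8,760 hours

it_load_watts = 10_000             # article: a rack draws 10,000 W or more
cooling_factor = 2.0               # article: ~1 W of cooling per 1 W of compute
rate_per_kwh = 0.10                # ASSUMPTION: $0.10/kWh, not from the article

total_kw = it_load_watts * cooling_factor / 1000      # 20 kW total draw
annual_kwh = total_kw * HOURS_PER_YEAR                # 175,200 kWh/year
annual_cost = annual_kwh * rate_per_kwh               # ~$17,500/year

print(f"Total draw: {total_kw:.0f} kW")
print(f"Annual energy: {annual_kwh:,.0f} kWh")
print(f"Annual electricity cost: ${annual_cost:,.0f}")
```

At those assumed numbers, a single rack's electricity runs around $17,500 a year, which helps explain how running a server for three years can exceed its purchase price, and why the aggregate national bill reaches billions.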
Taming the Guzzlers That Power the World Wide Web
