SAN FRANCISCO — Data centers’ unquenchable thirst for electricity has been slaked by the global recession and by a combination of new power-saving technologies, according to an independent report on data center power use from 2005 to 2010.
The study suggests that Google’s centers are more efficient than most.
The report, by Jonathan G. Koomey, a consulting professor in the civil and environmental engineering department at Stanford University, found that the number of computer servers in use grew far more slowly than anticipated through 2010, which in turn lowered demand for electricity. He attributed the slowdown to the financial crisis of 2008 and to the emergence of technologies like more efficient computer chips and server virtualization, which allows fewer servers to run more programs.
The slowing of growth in consumption contradicts a 2007 forecast by the Environmental Protection Agency that the explosive expansion of the Internet and the computerization of society would lead to a doubling of power consumed by data centers from 2005 to 2010.
In the new study, prepared at the request of The New York Times, Mr. Koomey found that electricity used by data centers worldwide still grew substantially, but by only about 56 percent from 2005 to 2010. In the United States, power consumption increased by 36 percent over the same period, according to his report, titled “Growth in Data Center Power Use 2005 to 2010.”
“Mostly because of the recession, but also because of a few changes in the way these facilities are designed and operated, data center electricity consumption is clearly much lower than what was expected, and that’s really the big story,” said Mr. Koomey.