Tuesday, February 20, 2007

Computer Servers Use 1.2% of Total US Electricity

I have written before about how server farms are replacing aluminum smelters as the major consumers of cheap electricity. This report looks at the extent of server energy use.

Total power used by servers represented about 0.6% of total U.S. electricity consumption in 2005. When cooling and auxiliary infrastructure equipment are included, that number grows to 1.2%, an amount comparable to that for color televisions. The total power demand in 2005 (including associated infrastructure) is equivalent (in capacity terms) to about five 1000 MW power plants for the U.S. and 14 such plants for the world. The total electricity bill for operating those servers and associated infrastructure in 2005 was about $2.7 billion and $7.3 billion for the U.S. and the world, respectively.
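The power-plant comparison is easy to sanity-check. Here's a quick back-of-the-envelope in Python (a sketch, using the 45 and 123 billion kWh annual totals quoted further down in this post; the implied ~6 cents/kWh electricity price is my derivation from the quoted bills, not a figure stated in the report):

```python
HOURS_PER_YEAR = 365 * 24  # 8,760

def plants_equivalent(annual_kwh, plant_mw=1000):
    """Average power draw expressed as a count of 1000 MW plants
    running flat out (the report's 'capacity terms' comparison)."""
    avg_power_mw = annual_kwh / HOURS_PER_YEAR / 1000  # kWh/yr -> avg kW -> MW
    return avg_power_mw / plant_mw

print(plants_equivalent(45e9))    # US: ~5.1 plants
print(plants_equivalent(123e9))   # world: ~14.0 plants

# Implied average electricity price from the quoted bills:
print(2.7e9 / 45e9)    # ~$0.060/kWh (US)
print(7.3e9 / 123e9)   # ~$0.059/kWh (world)
```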

Aggregate electricity use for servers doubled over the period 2000 to 2005 both in the U.S. and worldwide.

If power per server remains constant, those trends would imply an increase in electricity used by servers worldwide of about 40% by 2010. If in addition the average power use per unit goes up at the same rate for each class as our analysis indicates that it did from 2000 to 2005, total electricity used by servers by 2010 would be 76% higher than it was in 2005.
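Those two projections can be reconciled with a little arithmetic. A quick Python sketch (the 40% and 76% growth factors come straight from the quoted passage; treating total growth as installed base times power per unit is my reading of it):

```python
# Historical growth: server electricity use doubled over 2000-2005
cagr = 2 ** (1 / 5) - 1
print(f"{cagr:.1%}")  # ~14.9% per year

# Decomposing the 2010 projections quoted above:
growth_const_power = 1.40   # +40% if power per server stays flat
growth_total = 1.76         # +76% if power per unit keeps rising
per_unit = growth_total / growth_const_power
print(f"{per_unit - 1:.0%}")            # implies ~26% growth in power per server over 5 years
print(f"{per_unit ** (1 / 5) - 1:.1%}") # ~4.7% per year
```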
These numbers are probably underestimates, as they don't take networking equipment or tape backups into account. They also exclude Google, which doesn't use server-class machines in its server farms (ironic, isn't it?). The report estimates that including Google would raise total worldwide electricity use by servers by about 1.7%.

There are 10.3 million servers in the US and 27.3 million in the world (the US has about 38% of the world's total). This works out to roughly 1 server per 30 Americans. The servers use 45 billion kWh of electricity per year in the US and 123 billion kWh worldwide, which works out to 150 kWh per American.
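The per-capita math, assuming a US population of roughly 300 million (the mid-2000s figure, which the post doesn't state explicitly):

```python
US_POPULATION = 300e6  # rough 2005 figure, assumed for the per-capita math

us_servers = 10.3e6
us_server_kwh = 45e9

print(US_POPULATION / us_servers)     # ~29 -> about 1 server per 30 Americans
print(us_server_kwh / US_POPULATION)  # 150.0 kWh per American per year
```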

via Business 2.0 (.pdf) via ecoIron
