In 2014, data centers in the United States consumed approximately 100 billion kilowatt hours (kWh) of energy. To add insult to injury, the power needed to support this rapidly growing demand comes from an electrical grid that is wildly inefficient and is based on infrastructure that was created, in large part, more than a century ago. Just how significant is this waste? It turns out that the power grid supplies 150W of power to meet the demands of a digital chip that may need only 100W. Moreover, the amount of wasted energy is even greater, because every watt of power lost through power conversion is converted into heat, which must then be removed from the server farm by expensive, energy-intensive air conditioning. It takes about 1W of air conditioning to remove each watt of power loss, effectively doubling the inefficiency of this power conversion process.
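The figures above can be checked with a quick back-of-the-envelope calculation. This is only a sketch using the numbers quoted in the article (150W drawn from the grid, 100W delivered to the chip, roughly 1W of cooling per watt of loss); none of it is measured data.

```python
# Back-of-the-envelope check of the conversion-loss figures quoted above.
grid_power_w = 150.0   # power drawn from the grid per chip (article's figure)
chip_power_w = 100.0   # power the digital chip actually needs

conversion_loss_w = grid_power_w - chip_power_w   # 50 W lost as heat
efficiency = chip_power_w / grid_power_w          # about 67% end-to-end

# The article estimates ~1 W of air conditioning per watt of heat removed
cooling_power_w = conversion_loss_w * 1.0         # another 50 W

# Total overhead per 100 W of useful power: 50 W loss + 50 W cooling = 100 W,
# which is the "effective doubling" the article describes.
total_overhead_w = conversion_loss_w + cooling_power_w

print(f"Conversion efficiency: {efficiency:.1%}")
print(f"Overhead per 100 W chip: {total_overhead_w:.0f} W")
```

The 50W of conversion loss plus 50W of cooling means every 100W of useful chip power costs roughly another 100W in overhead.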
New materials have emerged that can convert electricity more efficiently and at a lower cost. By eliminating the inefficiencies in this final stage of the server farm power architecture, we can realize a direct saving of 7 billion kWh per year. This is doubled when air conditioning energy costs are added, bringing the total to about 14 billion kWh, or 14 percent of the total energy consumed by servers in the US alone. The cost savings are also significant. At the average cost of $0.12 per kWh, that’s a savings of about $1.7 billion annually, and that figure does not include the additional savings in system cost from fewer power converters and air conditioners.
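The savings arithmetic above works out as follows. Again, this is a sketch built solely from the figures quoted in the article (100 billion kWh total consumption, 7 billion kWh of direct savings, $0.12/kWh); the variable names are illustrative, not from any source.

```python
# Sketch of the savings arithmetic, using only the article's quoted figures.
us_datacenter_kwh = 100e9    # total US data-center consumption, 2014
direct_savings_kwh = 7e9     # savings from eliminating final-stage losses

# Each kWh of conversion loss avoided also avoids ~1 kWh of air conditioning
total_savings_kwh = direct_savings_kwh * 2        # 14 billion kWh

share_of_total = total_savings_kwh / us_datacenter_kwh   # 14% of the total

price_per_kwh = 0.12         # average US electricity price, $/kWh
annual_savings = total_savings_kwh * price_per_kwh       # ~$1.7 billion

print(f"Share of data-center energy saved: {share_of_total:.0%}")
print(f"Annual savings: ${annual_savings / 1e9:.2f} billion")
```

Note that 14 billion kWh at $0.12/kWh comes to $1.68 billion, which the article rounds to $1.7 billion.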
December 15, 2015
By: Alex Lidow