Arch Rock Energy Optimizer* in Action: Data Center
Recent trends in Information Technology such as dense blade computing, server and storage virtualization, and "cloud" computing have raised the priority of energy and thermal efficiency in data center design. Data centers operate 24 hours a day; their energy consumption in the U.S. was estimated [EPA] in 2006 at 1.5 percent of total U.S. electricity consumption and could potentially double by 2011.
For many data center owners and operators, what is at stake in this trend is not only the cost of energy consumption but often the viability of a site itself, if the energy supply cannot keep pace with growing demand. The challenges these trends create for energy production, transport, and cost underscore the importance of conservation and efficiency efforts. Conservation starts with measurement: establishing a baseline of industry metrics such as Power Usage Effectiveness (PUE) and Data Center infrastructure Efficiency (DCiE), two metrics promoted by the Green Grid organization [GGRID], and continuing with ongoing tracking of those metrics.
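The two Green Grid metrics named above are simple ratios of facility power to IT-equipment power. A minimal sketch of the calculations, using hypothetical meter readings (the function names and example values are illustrative, not from the report):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT equipment power.

    A PUE of 1.0 would mean every watt entering the facility reaches the IT load.
    """
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Data Center infrastructure Efficiency: the reciprocal of PUE, as a percentage."""
    return 100.0 * it_equipment_kw / total_facility_kw

# Hypothetical readings: 1,200 kW at the utility feed, 600 kW at the IT load.
print(pue(1200.0, 600.0))   # → 2.0
print(dcie(1200.0, 600.0))  # → 50.0
```

Tracking these ratios over time, rather than as a one-off snapshot, is what turns a baseline measurement into an ongoing conservation program.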
To improve data center operations, reduce energy consumption, and reap the associated cost savings, an open monitoring, reporting, and alerting framework is key to helping facility and IT managers improve their data center's environmental efficiency. A key requirement for wide-scale acceptance of such a framework is the ease with which it can be introduced and managed, without the prohibitively costly and disruptive impacts that have so far been characteristic of wired instrumentation systems. A monitoring framework must have zero negative impact on the operation of servers and of network and storage equipment; it must also be easy to introduce and be based on open IT paradigms such as IP networking and Web Services interfaces.
Read the full Arch Rock Energy Optimizer* in Action: Data Center Report.