Cloud-based computing is made possible by many thousands of data centers located around the globe. A data center is a building whose specific function is to house computer servers. Data centers come in all sizes, some holding many thousands of servers, and all of them are tremendous consumers of energy.

Data Center Effectiveness
The data center industry measures efficiency with PUE, or power usage effectiveness: the ratio of total facility energy to the energy delivered to the IT equipment. A PUE of 2.0 means that for every watt of IT power, an additional watt is consumed to cool the equipment and distribute power to it; a PUE closer to 1.0 means nearly all of the energy is used for computing. Many of the major data centers have been working to drive their PUE down to around 1.2, but the largest data centers account for only roughly 5 percent of the total energy consumed by data centers in existence. The vast majority of data centers are small- and medium-sized and are generally much less efficient.
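To make the ratio concrete, here is a minimal sketch that computes PUE from annual meter readings. The figures are hypothetical, invented for the example rather than drawn from the article.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical annual readings for a small server room (illustrative only).
it_load_kwh = 400_000    # energy delivered to servers, storage, and network gear
facility_kwh = 760_000   # everything on the meter: IT load plus cooling, fans, UPS, lights

print(f"PUE = {pue(facility_kwh, it_load_kwh):.2f}")  # -> PUE = 1.90
```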
Clearly this leaves a lot of room for retrofit opportunities in smaller data centers, but where to begin? It helps to first understand how the energy-consumption pie is divided in a typical data center. As the accompanying chart from the Natural Resources Defense Council's assessment shows, the IT equipment itself consumes about 50 percent of a data center's power. Cooling is the next biggest user at 25 percent, followed by air movement at 12 percent. The uninterruptible power supply (10 percent) and lighting (3 percent) use the least energy.
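Treating everything other than the IT load as overhead ties this breakdown back to the PUE metric above. A quick sketch of that arithmetic, using the article's round percentages:

```python
# Typical breakdown of data center energy use (percent of total), per the breakdown above.
breakdown = {
    "IT equipment": 50,
    "Cooling": 25,
    "Air movement": 12,
    "UPS": 10,
    "Lighting": 3,
}

total = sum(breakdown.values())                  # 100
implied_pue = total / breakdown["IT equipment"]  # 100 / 50 = 2.0
print(f"Implied PUE for this profile: {implied_pue:.1f}")
```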
Energy-reduction Strategies
Because the largest energy user in a data center is the IT equipment itself, it makes sense to start exploring energy-efficiency opportunities there. IT professionals are looking at strategies that range from simply turning off unused equipment to installing more energy-efficient servers.
One would assume that all servers in a data center are working at maximum capacity, but almost the exact opposite is true: server utilization generally ranges between 12 and 18 percent. Raising utilization makes it possible to reduce the number of servers required. By some accounts, as many as 30 percent of computer servers are categorized as "comatose", meaning they have not delivered information or computing services in six months or more. There are many reasons these servers sit underutilized, from a company's changing needs to simply forgetting that a server exists or is still operational. Yet many of them continue to run 24 hours a day, seven days a week, consuming energy. Identifying comatose servers and taking them offline is proving to be an effective energy-efficiency strategy.
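Finding those machines comes down to flagging anything with no delivered work in roughly six months. A minimal sketch of that idea, assuming an inventory export that records a last-activity timestamp per server (the server names and dates here are hypothetical):

```python
from datetime import datetime, timedelta

COMATOSE_THRESHOLD = timedelta(days=183)  # roughly six months, per the definition above

# Hypothetical inventory export: server name -> last time it delivered work.
last_activity = {
    "app-01": datetime(2025, 11, 2),
    "legacy-db-07": datetime(2024, 9, 14),
    "batch-12": datetime(2025, 3, 30),
}

def comatose(inventory: dict[str, datetime], now: datetime) -> list[str]:
    """Return servers with no delivered work in six months or more."""
    return [name for name, seen in inventory.items() if now - seen >= COMATOSE_THRESHOLD]

print(comatose(last_activity, now=datetime(2025, 12, 1)))  # -> ['legacy-db-07', 'batch-12']
```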
Keeping It Cool

One of the biggest enemies of computer systems is heat; too much of it can severely shorten the lifespan of the equipment. Keeping the equipment at a cooler temperature is a key goal of IT professionals, but it comes at a cost in the form of large, sophisticated, energy-consuming HVAC equipment.
In general, cooler air is taken in at the front of a server and hot air is expelled by a fan at the rear. Anyone who has put a hand behind a computer knows how much heat a single machine throws off; multiply that by 100 or even 1,000 and it is easy to understand why cooling matters.
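For a rough sense of scale, essentially all of the electrical power a server draws ends up as heat that the cooling system must remove. The sketch below assumes an average draw of 500 watts per server, an illustrative figure only:

```python
WATTS_PER_SERVER = 500       # assumed average draw; real values vary widely by workload
BTU_PER_HR_PER_WATT = 3.412  # conversion factor: 1 watt of heat = 3.412 BTU/hr

for count in (1, 100, 1_000):
    heat_kw = count * WATTS_PER_SERVER / 1_000
    heat_btu_hr = count * WATTS_PER_SERVER * BTU_PER_HR_PER_WATT
    print(f"{count:>5} servers -> {heat_kw:7.1f} kW of heat ({heat_btu_hr:,.0f} BTU/hr)")
```

Under that assumption, 1,000 servers reject roughly half a megawatt of heat continuously, which is why cooling is the second-largest slice of the energy pie.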
One cooling strategy being implemented in many existing data centers is thermal segregation, essentially creating "hot aisles" and "cold aisles". A hot aisle is the space between two racks of servers whose fans expel hot air toward each other; a cold aisle, conversely, is the space where the fronts of two rows of servers face each other, with no heat expelled into it. Cooling can then be concentrated in the cold aisles without intermixing with the servers' waste heat. Some data centers go as far as separating the hot and cold aisles with plastic curtains, similar to those at the entrance of a walk-in cooler or freezer, to further segregate the server areas thermally.
Because the servers need to operate within a defined temperature range, most excess heat has typically been ventilated from the data center. However, this is a tremendous waste of energy, particularly in instances where the heat can be reclaimed and reused. A strategy of reclaiming the excess heat is particularly suited for private or corporate data centers that are located inside of office buildings in colder climates. Waste heat that is generated in a data center can be reclaimed through a heat-exchange process and used to heat occupied areas of the building.
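As a back-of-the-envelope illustration of why reclaiming that heat is attractive, the sketch below uses assumed figures; the IT load, recovery efficiency, and heating-season hours are all invented for the example:

```python
# Assumed figures for a small corporate data center (illustrative only).
it_load_kw = 100              # average IT power draw, nearly all of it rejected as heat
recovery_efficiency = 0.6     # fraction of waste heat a heat-exchange loop captures (assumed)
heating_season_hours = 5_000  # hours per year the building actually calls for heat (assumed)

reclaimed_kwh = it_load_kw * recovery_efficiency * heating_season_hours
print(f"Roughly {reclaimed_kwh:,.0f} kWh of heat available to the building each year")
# Heat that would otherwise have to be generated by the building's own heating plant.
```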
Paying the Costs
Generally, the cost/benefit analysis of implementing energy-efficiency measures guides decisions about which strategies to pursue first—often those with the shortest payback period. However, one aspect that needs to be considered is available incentives from utility companies.
Nearly every utility company offers monetary incentives to customers who reduce the energy used in their buildings. The reason is simple: it is more cost-effective to pay a customer to reduce energy use than to invest in expensive upgrades to existing infrastructure. Data center upgrades of this nature do not usually fall into the prescriptive categories of utility incentive programs but rather into a custom category. The process typically involves the utility company, or a trusted consultant, analyzing the potential energy savings and determining an appropriate incentive amount. In some areas, the incentives can reach 30 percent of the cost of implementing the energy-efficiency equipment or strategy, drastically reducing payback periods. One resource to check for available incentives in your area is the Database of State Incentives for Renewables & Efficiency, or DSIRE.
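To see how much an incentive can shorten a payback period, here is a simple-payback sketch; the project cost and annual savings are assumed values chosen only to illustrate the arithmetic:

```python
def simple_payback_years(project_cost: float, incentive: float, annual_savings: float) -> float:
    """Simple payback: net out-of-pocket cost divided by annual energy-cost savings."""
    return (project_cost - incentive) / annual_savings

cost = 200_000    # assumed retrofit cost
savings = 40_000  # assumed annual energy-cost savings

print(f"No incentive:  {simple_payback_years(cost, 0, savings):.1f} years")             # -> 5.0
print(f"30% incentive: {simple_payback_years(cost, 0.30 * cost, savings):.1f} years")   # -> 3.5
```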
Implementation
Data centers are complex users of energy, often with microenvironments that can be extremely challenging to analyze and assess. New technology develops so rapidly that equipment can be out of date almost as soon as it is installed. Hiring design and consulting professionals who understand these nuances is key to a successful retrofit project and to cost savings for data center owners.
Charts: Natural Resources Defense Council, Data Center Efficiency Assessment, August 2014