Understanding data center cooling, energy usage and reduction methods


Data center energy usage has risen dramatically over the past decade and will continue to grow in step with the processor-intensive applications that support business in the modern world.

The growth of technology has driven the data center into a new phase of expansion, and while data centers vary across industry segments, there are common pressures, including the need to do more with the same resources or, in some cases, with less.

To this end, much has been done to increase server efficiency and IT space utilization, but the space and cooling infrastructure supporting these intensified loads has often not kept pace. This is an important oversight, since cooling can represent up to 42% of a data center's energy usage.
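As a rough illustration of why that share matters, the sketch below converts a 42% cooling fraction into annual energy and cost for a hypothetical facility. The total facility load and electricity rate are assumptions chosen only for illustration; only the 42% figure comes from the text above.

```python
# Rough illustration only: what a 42% cooling share can mean in absolute terms.
# The facility load and electricity rate are assumed, not figures from the article.
total_facility_kw = 1000      # assumed average total facility draw
cooling_fraction = 0.42       # cooling share of energy usage quoted above
rate_per_kwh = 0.10           # assumed electricity price, $/kWh
hours_per_year = 8760

cooling_kw = total_facility_kw * cooling_fraction
annual_cooling_kwh = cooling_kw * hours_per_year

print(f"Cooling load:          {cooling_kw:.0f} kW")
print(f"Annual cooling energy: {annual_cooling_kwh:,.0f} kWh")
print(f"Annual cooling cost:   ${annual_cooling_kwh * rate_per_kwh:,.0f}")
```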

There are many factors that must be considered before deciding on a cooling approach for a given data center. Energy usage, installation specifics such as the location of the data center itself, density on a per-rack and kW-per-square-foot basis, and other user-specific requirements will all influence this decision.

CRAC/CRAH

CRAC and CRAH units are the most common approach and have been successfully deployed in a wide variety of installations. Reliability aside, and in light of the energy requirements discussed earlier, these systems may not be the most cost-effective way to cool a data center.

Containment offers energy benefits even when used with CRAC/CRAH technology: it would appear that a minimum saving of 7.3% could be realized when evaluated against a standard CRAC/CRAH deployment. The size of the data center will affect the overall savings, but even a small facility could reduce its overall energy usage with this approach. Cold aisle containment can be retrofitted into any data center with traditional raised-floor cooling, while hot aisle containment or a chimney approach could make sense in data centers with a duct return infrastructure already in place.
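To put that minimum figure into absolute terms, here is a back-of-the-envelope sketch. The baseline cooling energy and electricity price are assumptions for illustration; only the 7.3% saving comes from the text above.

```python
# Sketch of what a 7.3% cooling-energy reduction from containment could mean.
# Baseline energy and tariff are assumed; only the percentage is from the article.
baseline_cooling_kwh = 1_500_000   # assumed annual cooling energy for a mid-size room
containment_saving = 0.073         # minimum saving quoted for containment
rate_per_kwh = 0.10                # assumed electricity price, $/kWh

saved_kwh = baseline_cooling_kwh * containment_saving
print(f"Energy saved: {saved_kwh:,.0f} kWh/yr")
print(f"Cost saved:   ${saved_kwh * rate_per_kwh:,.0f}/yr")
```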

Liquid cooled racks

A typical liquid cooling retrofit into a data center with standard 45°F chilled water and no other optimization could expect roughly an 18% saving compared to CRAH-type units, without the retrofit work required for containment. These racks also offer much greater per-rack density than is usually seen in a traditional CRAC/CRAH deployment (30kW vs. 4-6kW). This can be a benefit where floor space is limited and may make it possible to extend the life of an existing data center instead of building a new one.
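The density difference translates directly into floor space, as the quick calculation below shows. The total IT load and the per-rack footprint are assumptions; the per-rack densities are the figures quoted above.

```python
# Floor-space comparison: racks needed to host the same IT load at the two
# densities quoted above. Load and footprint values are assumed for illustration.
import math

total_it_load_kw = 600         # assumed IT load to be housed
traditional_rack_kw = 5        # middle of the 4-6 kW range quoted above
liquid_cooled_rack_kw = 30     # per-rack density quoted for liquid-cooled racks
rack_footprint_sqft = 10       # assumed footprint per rack including clearances

traditional_racks = math.ceil(total_it_load_kw / traditional_rack_kw)
liquid_racks = math.ceil(total_it_load_kw / liquid_cooled_rack_kw)

print(f"Traditional racks needed:   {traditional_racks} "
      f"(~{traditional_racks * rack_footprint_sqft} sq ft)")
print(f"Liquid-cooled racks needed: {liquid_racks} "
      f"(~{liquid_racks * rack_footprint_sqft} sq ft)")
```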

Installation typically requires only a chilled water source, or the space to install one. These units ultimately reject heat to the outside air, so the chiller installation must be carefully considered. Once the chilled water system is in place, installation becomes as simple as connecting the racks to the chilled water source.

The true energy savings potential for this type of system becomes clear when chilled water temperature is increased in systems that can accept warmer temperatures without de-rating capacity. Compared to traditional CRAH units, the energy savings would increase to almost 40%. The addition of free cooling units increases the savings to 49%, and going to an evaporative free cooling system expands it further – to 55%.
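The escalating savings are easier to compare when applied to a common baseline. In the sketch below, the baseline CRAH cooling energy is an assumed figure; the percentages are the ones quoted above (treating "almost 40%" as 40%).

```python
# The quoted savings versus traditional CRAH, applied to an assumed baseline
# so the step-ups are easier to compare side by side.
baseline_cooling_kwh = 1_000_000   # assumed annual CRAH cooling energy

scenarios = {
    "Liquid-cooled racks, 45F chilled water": 0.18,
    "Warmer chilled water":                   0.40,
    "Plus free cooling":                      0.49,
    "Plus evaporative free cooling":          0.55,
}

for name, saving in scenarios.items():
    remaining = baseline_cooling_kwh * (1 - saving)
    print(f"{name:40s} saves {saving:4.0%}, leaving {remaining:,.0f} kWh/yr")
```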

Free cooling systems are location-dependent with regard to operational windows. The user should evaluate the data center location against readily available bin data from ASHRAE to determine the feasibility and potential ROI for these systems.
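A minimal sketch of that evaluation might look like the following. The bin hours and the economizing temperature threshold are hypothetical placeholders; real ASHRAE bin data for the specific site, and the actual system's limits, should be substituted.

```python
# Minimal sketch of a bin-data feasibility check: count the hours per year
# below a dry-bulb threshold at which the economizer could carry the load.
# The bin data and threshold below are made up for illustration only.
bin_hours = {                      # hypothetical bins: mid-point degF -> hours/yr
    15: 200, 25: 500, 35: 900, 45: 1300,
    55: 1700, 65: 1800, 75: 1500, 85: 700, 95: 160,
}
free_cooling_limit_f = 50          # assumed max outdoor temp for full economizing

free_hours = sum(h for t, h in bin_hours.items() if t <= free_cooling_limit_f)
total_hours = sum(bin_hours.values())

print(f"Estimated free-cooling window: {free_hours} of {total_hours} hours "
      f"({free_hours / total_hours:.0%} of the year)")
```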

Active and passive rear door cooling units

Active and passive rear door units can take advantage of all of the technologies discussed, including waterside economizing, with passive units also eliminating one of the fan sources. Active systems show a 57% improvement over traditional water-cooled CRAH units, and passive systems deliver a 66% improvement, so this approach may make the most sense in data centers capable of deploying these components.

From an installation perspective, chilled water is required in some form, and the economizing side of the installation should be planned to take full advantage of the system's capabilities. Fitting the doors to existing racks can drastically reduce installation time, since there is no need to move equipment into new racks.

Pumped refrigerant-based systems

As a supplemental solution, these systems offer clear energy savings when compared to conventional CRAH-type systems. A 36% saving can be realized with this type of system, thanks to a more effective means of moving energy away from the servers and the absence of continuous humidification of the air stream, since the cooling provided is 100% sensible. Locating the units close to the racks also reduces fan energy usage, even with the additional pump required to move the refrigerant. These systems also offer relatively simple retrofit capability: installation typically takes place over or on top of the rack, providing cool air in close proximity to the servers. The system does, however, rely on 45°F chilled water, which limits the possibility of waterside economizing.
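The fan-versus-pump trade-off can be illustrated with a simple net calculation. Every figure below is an assumption chosen only to show the shape of the comparison, not a measured value from the text.

```python
# Hedged illustration of the trade-off described above: close-coupled units
# shorten the air path and cut fan power, at the cost of a refrigerant pump.
# All figures are assumptions for illustration only.
crah_fan_kw_per_rack = 1.2           # assumed fan power attributable to one rack, CRAH layout
close_coupled_fan_kw_per_rack = 0.5  # assumed fan power with units over/on the rack
pump_kw_per_rack = 0.2               # assumed share of refrigerant pump power per rack

net_saving_kw = (crah_fan_kw_per_rack
                 - close_coupled_fan_kw_per_rack
                 - pump_kw_per_rack)
print(f"Net air-moving/pumping saving: {net_saving_kw:.1f} kW per rack "
      f"({net_saving_kw * 8760:,.0f} kWh/yr)")
```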

Airside economizing

When it comes to energy usage, airside economizing offers clear benefits: a 48% energy saving can be realized when implementing this type of system. It would typically require a new data center build, however, as large volumes of air must be brought into and out of the facility. Because the data center space becomes, more or less, an extension of the outdoors, greater fluctuation in internal temperature and humidity levels should be expected.

Direct-to-chip or board cooling

This technology is used to a limited extent for IT component cooling, but typically only in a hybrid air/direct approach, which still relies on less efficient cooling technologies. Considering the best-case scenario, in which 100% of the heat rejection is handled by the direct method, the savings are quite large: 82% of the energy required in the cooling process is eliminated, dramatically reducing the cost of cooling the data center. This approach does come with drawbacks. There are currently few commercially available servers that are cooled 100% by liquid, and such a design would require a dramatic rethinking of server design criteria. It would also create a subset of servers cooled in this manner, since air-cooled systems would still be required for those that do not adopt the technology. While the energy savings are very large, it is my opinion that the full application of this technology in everyday data center use is still a few years off.

Conclusion

There are many methods available to cool a data center, all with varying degrees of effectiveness and energy efficiency. The traditional CRAC/CRAH systems now deployed are reaching the limits of their capacity, requiring the adoption of new technologies to cool growing data center loads efficiently. Although these technologies differ in some ways, they often share many common components, which reduces the difficulty of installing them into existing infrastructures. A single-system approach may not make sense for every user, but the integration of these systems is key to reducing energy costs while handling the increasing requirements of users and applications.

There are many benefits of deploying environmentally responsible data center cooling technologies, and their importance will continue to grow moving forward. The products and techniques described in this paper can help with many of the critical issues facing the IT industry and the world at large, including saving energy and money, reducing carbon footprints and limiting greenhouse gas emissions.

Extract from

http://www.bdstrategy.asia/data-center/power-and-cooling/68-understanding-data-centre-cooling-energy-usage-and-reduction-methods
