If you live in Fargo, Duluth, or anywhere else near the Canadian border, heat is very welcome. Even in these icy regions, though, heat can do a lot of damage to a data center. Beyond high temperatures, humidity levels outside the recommended range can increase operational risk, accelerate equipment failure, and even cause fires.
Multiple methods are available to help data center operators maintain the right temperature and humidity levels within their facilities, and newer approaches give data center managers more choice in finding the best fit for their environment.
Data center cooling methods all follow a similar strategy: remove excess heat from IT equipment and the surrounding space, reducing the risk of equipment damage and downtime. Take a look at some proven methods.
Keep Cool with Deployment Visibility
To manage deployments effectively, data center staff must be able to monitor environmental conditions.
CoreSite’s Customer Service Delivery Platform includes a component called CoreINSITE®, which provides near-real-time reporting on data center environmental data, including power consumption, circuit data, and temperature and humidity readings.
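To illustrate what that kind of visibility enables, here is a minimal sketch of aggregating environmental sensor readings into a dashboard-style summary. The data format and field names are invented for illustration; CoreINSITE’s actual reporting interface is not shown here.

```python
# Hypothetical sensor readings, one dict per cabinet (field names assumed).
readings = [
    {"cabinet": "A1", "temp_f": 72.5, "humidity_pct": 45.0, "power_kw": 3.2},
    {"cabinet": "A2", "temp_f": 78.1, "humidity_pct": 41.0, "power_kw": 4.8},
    {"cabinet": "B1", "temp_f": 69.9, "humidity_pct": 48.5, "power_kw": 2.1},
]

# Find the cabinet running hottest and the total power draw across the room.
hottest = max(readings, key=lambda r: r["temp_f"])
total_power_kw = sum(r["power_kw"] for r in readings)

print(f"Hottest cabinet: {hottest['cabinet']} at {hottest['temp_f']} F")
print(f"Total draw: {total_power_kw:.1f} kW")
```

Even a simple rollup like this makes hot spots and rising load visible before they become downtime.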
Computer room air conditioners (CRACs) and computer room air handlers (CRAHs) cool data center server rooms with air, much the way commercial and residential buildings are cooled. Both CRACs and CRAHs pass a cooling medium, usually water or refrigerant, through a coil to remove heat from the air, then transfer that heat to external equipment such as a condenser, chiller, or cooling tower.
Many chiller plants use water-side economization to achieve free cooling, particularly in cooler climates. When the outside temperature is low enough, condenser water and chilled water exchange heat in a plate-and-frame heat exchanger, bypassing the mechanical cooling cycle inside the chiller. Heat carried back from the data center by the returning chilled water can then be rejected to the atmosphere using much less energy.
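The economizer decision above can be sketched in a few lines. This is a simplified model, not a plant controller: free cooling is possible roughly when the cooling tower can produce condenser water colder than the returning chilled water by at least the heat exchanger’s approach temperature. All temperatures and approach values here are illustrative assumptions.

```python
def free_cooling_available(outdoor_wet_bulb_f, chilled_water_return_f,
                           tower_approach_f=7.0, hx_approach_f=2.0):
    """True if the plate-and-frame heat exchanger can bypass the chiller.

    Approach values are assumed for illustration; real plants tune these.
    """
    # Coolest condenser water the tower can make at this wet-bulb temperature.
    condenser_supply_f = outdoor_wet_bulb_f + tower_approach_f
    # The exchanger needs condenser water colder than the chilled-water return
    # by at least its own approach to move heat in the right direction.
    return condenser_supply_f + hx_approach_f < chilled_water_return_f

print(free_cooling_available(outdoor_wet_bulb_f=40.0, chilled_water_return_f=60.0))  # cold day: True
print(free_cooling_available(outdoor_wet_bulb_f=70.0, chilled_water_return_f=60.0))  # warm day: False
```

The key design point is that economization depends on wet-bulb temperature, not dry-bulb, which is why cooler (and drier) climates get the most free-cooling hours.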
Liquid is far more efficient than air at transferring heat. This allows for higher server densities and supports high-power chips that generate a great deal of heat. Liquid cooling comes in two main forms:
- Liquid immersion cooling submerges an electronic device in dielectric fluid within a closed loop. The fluid absorbs heat from the device and rejects it, via a heat exchanger, to another medium (usually water). In sealed two-phase immersion cooling, heat from the server boils the fluid into vapor; water-cooled heat exchangers then cool and condense the vapor, allowing the cycle to continue.
- Direct-to-chip liquid cooling, on the other hand, delivers nonflammable dielectric liquid directly to the components that generate the most heat in the server, such as the CPU. The fluid absorbs heat and turns into vapor; the vapor is carried away and condensed before being returned to the chip as cool fluid.
Raised Floor Plenums
The flexibility and reliability of raised-floor cooling make it a popular choice for colocation environments. Air handlers pressurize the underfloor plenum with cold air. Perforated tiles in front of the server cabinets let cold air escape the pressurized plenum and be drawn into the IT equipment. The tiles come in different sizes to deliver more or less airflow to the server cabinets, depending on density and CFM requirements.
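To put the CFM requirement in concrete terms, here is a back-of-the-envelope airflow estimate using the standard sensible-heat relation for air, BTU/hr = 1.085 × CFM × ΔT(°F). The cabinet load and temperature rise below are illustrative; real tile selection also depends on plenum pressure and tile open area.

```python
def required_cfm(it_load_watts, delta_t_f=20.0):
    """Approximate airflow (CFM) needed to carry away a cabinet's heat load.

    delta_t_f is the assumed air temperature rise across the servers.
    """
    btu_per_hr = it_load_watts * 3.412        # convert watts to BTU/hr
    return btu_per_hr / (1.085 * delta_t_f)   # sensible-heat equation for air

# A hypothetical 5 kW cabinet with a 20 F rise across the servers:
print(round(required_cfm(5000)))  # roughly 786 CFM
```

The same math explains why high-density cabinets need larger or higher-open-area tiles: airflow demand scales linearly with IT load.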
Hot & Cold Aisle Layouts
In a typical air-cooled computer room, racks and server cabinets are arranged in rows, creating alternating aisles of cold and hot air. In a slab-floor room, cold air is directed to the server intakes along the cold aisle; hot air exhausted from the backs of the servers is drawn to the CRAC/CRAH return coils, which carry the heat out of the room. In a raised-floor environment, CRAC or CRAH cold air flows into the plenum beneath the floor, and the perforated tiles mentioned earlier release it into the cold aisles, ideally right in front of the server intakes. After passing through the servers, the heated air collects in the hot aisle and returns to the CRAC/CRAH for cooling, usually after some mixing with cold air.
For a hot/cold aisle layout to succeed, several best practices must be followed, including installing blanking panels and cabinet side panels.
New ASHRAE Cooling Guideline
It’s not enough to simply chill data centers to high-tech meat-locker temperatures. Research has pointed to new approaches for cooling data centers. Conventional wisdom held that the temperature should be kept as low as possible, usually around 55 degrees Fahrenheit.
The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) publishes temperature and humidity recommendations for data centers. In its latest guidelines, ASHRAE suggests that data centers be kept between 64.4 and 80.6 degrees Fahrenheit, with relative humidity of up to 60 percent.
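The cited envelope can be expressed as a simple compliance check. This is a sketch against the figures quoted above only, not a full implementation of the ASHRAE guideline (which also bounds dew point); the function and reading format are invented for illustration.

```python
# ASHRAE recommended envelope as quoted in the text above.
TEMP_RANGE_F = (64.4, 80.6)
MAX_RH_PCT = 60.0

def within_ashrae_recommended(temp_f, rh_pct):
    """True if a reading falls inside the quoted recommended envelope."""
    return TEMP_RANGE_F[0] <= temp_f <= TEMP_RANGE_F[1] and rh_pct <= MAX_RH_PCT

print(within_ashrae_recommended(72.0, 45.0))  # typical modern setpoint: True
print(within_ashrae_recommended(55.0, 45.0))  # old "meat locker" setpoint: False
```

Note that the old 55 °F convention falls well outside the recommended range, which is exactly the point of the updated guidance.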
This higher recommended temperature range is in line with CoreSite’s standard environmental SLAs. Cooling systems can account for up to 50% of the energy a data center consumes, so adjusting temperature setpoints reduces energy use while still protecting IT equipment and preventing downtime.
We have even developed a “Dos and Don’ts” checklist for customers, a guide to maximizing cooling and energy efficiency for your cabinets and cages. We’re all in this together!