Monthly Bites

Data Center Cooling Optimization

In a data center environment, the management of utility power is vital, yet a significant portion of the opex is often attributable to cooling. This normally happens because of the over-cautious attitude of the data center manager, whose primary responsibility is to ensure 100% availability.

However, with growing pressure to reduce cost as well as carbon footprint, which is also an underlying social obligation, the DC operator now has to take on the responsibility of optimizing cooling and being innovative, rather than conservative, in their approach to operations.

At this juncture it should be understood that, unlike other losses in electrical and mechanical equipment, which can be tackled at the initial stages of the DC build, cooling inefficiencies keep creeping in and sometimes build up over time. This is accentuated as the baton is passed from the design team to the project management team and finally to the various operations and management teams that run the DC over its life.

Cooling of the DC and its utilities can be managed throughout the service life of the facility to save energy. DC managers should set up an operations strategy that delivers focused rather than indiscriminate cooling, which otherwise erodes the profitability of the DC.

A cooling optimization strategy can be put in place by implementing some or all of the following measures, as applicable to the environment:

  • Cool utility equipment based on the equipment data sheets

    Cooling should be provided to DC equipment based on the equipment data sheet. We generally see the same temperature maintained for the server racks, the UPS and the UPS batteries, yet the cooling needs of the three are different: typically 22°C for servers, 35°C for the UPS and 27°C for the batteries (see the setpoint sketch after this list).

  • Provide sufficient air in front of the server racks based on the power consumed in the rack, not its rated power

    Provisioning a sufficient quantity of air, in cubic feet per minute (CFM), in front of the rack at 22°C is important to avoid hot spots as well as high-temperature alarms from the servers. The required CFM is based on the operating heat load of the servers installed in the rack, which in turn is calculated from the operating power of the servers (a worked airflow example follows this list).

  • Use false-floor tiles with a higher percentage of perforation

    Perforated tiles with a high percentage of open area are needed to deliver the required CFM. In their absence the requirement has to be met by running more precision air conditioners (PACs) in the server hall, wasting energy (see the tile-count sketch after this list).

  • Keep a check on the PAC return air temperature setting

    Keep a record of the return air temperature settings of the PACs so that the units operate at an efficient point. This saves energy for the DC.

  • Prevent leakage in the false-floor system

    Check that the false floor is sealed at the surface so that cold air is not lost through short-cycling.

  • Use blockers under the false floor to restrict airflow to the required area

    Install air blockers below the false floor to channel the air to the required space and prevent it from spreading into unused areas, so that more CFM is delivered at the false-floor openings through the perforated tiles.
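
To make the first measure concrete, the short Python sketch below flags zones that are being cooled well below their data-sheet setpoint. The setpoint values are the ones quoted above; the zone names, the 2°C margin and the measured readings are illustrative assumptions, not figures from this article.

    # Zone-wise cooling setpoints quoted above (confirm against the actual
    # equipment data sheets before applying).
    SETPOINTS_C = {"server_hall": 22.0, "ups_room": 35.0, "battery_room": 27.0}

    def overcooled_zones(measured_c, margin_c=2.0):
        """Return zones being held well below their required setpoint."""
        return {zone: temp for zone, temp in measured_c.items()
                if temp < SETPOINTS_C[zone] - margin_c}

    # Example: UPS and battery rooms held at server-hall temperature
    print(overcooled_zones({"server_hall": 22.0, "ups_room": 23.0, "battery_room": 23.0}))
    # {'ups_room': 23.0, 'battery_room': 23.0}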
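
For the rack airflow measure, a commonly used sensible-heat relation is CFM = heat load in BTU/hr divided by (1.08 × temperature rise in °F). The sketch below applies it to a hypothetical rack drawing 6 kW with an assumed 12°C air temperature rise across the servers; both figures are illustrative assumptions, not values from this article.

    def rack_cfm(operating_power_w, delta_t_c=12.0):
        """Required airflow (CFM) for a rack from its operating power.

        Uses the sensible-heat relation Q[BTU/hr] = 1.08 * CFM * dT[degF].
        delta_t_c is the assumed air temperature rise across the servers.
        """
        heat_btu_hr = operating_power_w * 3.412       # watts to BTU/hr
        delta_t_f = delta_t_c * 9.0 / 5.0             # temperature rise in degF
        return heat_btu_hr / (1.08 * delta_t_f)

    # Hypothetical rack drawing 6 kW (measured), not its nameplate rating
    print(round(rack_cfm(6000)))    # about 878 CFM at a 12 degC rise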
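
For the perforated-tile measure, the same airflow figure can be turned into a tile count. The per-tile delivery assumed below (about 900 CFM, in the range of a high-open-area grate tile) is only an assumption; actual delivery depends on the tile type and the under-floor pressure, so the tile vendor's data should be used in practice.

    import math

    def tiles_needed(required_cfm, cfm_per_tile=900.0):
        """Perforated tiles needed to deliver the required airflow.

        cfm_per_tile is an assumed delivery for a high-open-area tile at a
        typical under-floor pressure; use the tile vendor's data in practice.
        """
        return math.ceil(required_cfm / cfm_per_tile)

    # A row of five racks needing roughly 878 CFM each (see the airflow sketch)
    print(tiles_needed(5 * 878))    # 5 tiles at the assumed 900 CFM per tile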

A DC operator who implements the above steps in a systematic and ongoing manner can expect to save 10-15% of cooling energy.
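
As a rough illustration of what a 10-15% saving can mean, the sketch below applies that range to a hypothetical 1 MW cooling load running year-round; the load figure is an assumption, not a figure from this article.

    # Hypothetical facility: a 1 MW cooling load running all year
    cooling_kw = 1000.0
    annual_kwh = cooling_kw * 24 * 365              # about 8.76 million kWh

    for saving in (0.10, 0.15):                     # the 10-15% range above
        print(f"{saving:.0%} saving -> {annual_kwh * saving:,.0f} kWh per year")
    # 10% saving -> 876,000 kWh per year
    # 15% saving -> 1,314,000 kWh per year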

Optimizing any operation or equipment is not a one-time activity; it has to be embedded in the way one operates. The underlying philosophy, or the delivery framework of processes and systems, should drive the on-site operations team and aid them in making such dynamic changes. This brings not only one-time savings but also incremental savings that can then be sustained.

Suresh Dikshit,
Datacenter Veteran,
The speaker has 26 years of experience in data center design and management and has led the country’s largest critical data centers.