Creating a better and greener working environment

What do the experts say about Evaporative Cooling?

Many leading experts in the data centre and IT field are embracing new technologies as they strive to reduce running costs and cut carbon emissions. The following is a small selection of comments on the subject.
The first article covers these issues, with some interesting comments from Steve Ballmer of Microsoft.
From Data Center Knowledge, June 2010.
Moving Beyond ‘Babysitting Servers’
June 7th, 2010 : Rich Miller
The interior of a 40-foot container inside the new Microsoft Chicago data center, packed with servers on either side of a center aisle.
Are servers spoiled? Microsoft CEO Steve Ballmer thinks so. “There shouldn’t be people babysitting all these machines,” Ballmer said recently in discussing Microsoft’s push into cloud computing.
After years of living in air-conditioned, brightly-lit rooms, servers are now seeing a change in their surroundings. At the largest data center builders, servers now reside in warmer, darker environs, sometimes encased in shipping containers.
Is the data center industry ready to get out of the “babysitting” business? Or will management and customer uneasiness limit the impact of this trend? These were hot topics of discussion at both the Uptime Institute Symposium 2010 and the Tier 1 Datacenter Transformation Summit.
As mounting power bills push energy efficiency to the fore, data center designers continue to set aside long-held beliefs about operating environments for IT gear. As Ballmer’s comments suggest, Microsoft has been among the most aggressive in pushing the boundaries, particularly with the temperature inside its IT PAC data center containers.

Pushing to 45 Degrees C

“We’ve gone all the way to 45 degrees C (about 113 degrees Fahrenheit) and seen no significant decrease in reliability,” said Dan Costello, Microsoft’s Director of Data Center Research.

Raising the baseline temperature inside the data center can save money spent on air conditioning. Data center managers can save as much as 4 percent in energy costs for every degree of upward change.
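As a rough illustration of how that rule of thumb compounds, the sketch below estimates the cooling bill after a setpoint increase. The baseline cost is an illustrative assumption, not a figure from the article, and compounding the saving per degree is only one way to read the claim.

```python
# Hedged sketch of the "~4% saving per degree" rule of thumb, compounded.
# The baseline cost below is hypothetical, not Microsoft data.

def cooling_cost_after_raise(baseline_cost, degrees_raised, saving_per_degree=0.04):
    """Estimate annual cooling cost after raising the setpoint.

    baseline_cost: annual cooling cost at the original setpoint
    degrees_raised: degrees of upward change in the setpoint
    saving_per_degree: fractional saving per degree (the article cites ~4%)
    """
    return baseline_cost * (1 - saving_per_degree) ** degrees_raised

# A hypothetical £100,000/year cooling bill with the setpoint raised 5 degrees
print(f"£{cooling_cost_after_raise(100_000, 5):,.0f}")  # ~£81,537
```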
While most of Microsoft’s data center containers are managed remotely, the warmer temperatures require some design accommodations so servers can be maintained by staff in the cold aisle, rather than the hot aisle as is common in many server and rack designs. “We’re working with front access designs,” said Costello.
“Servers don’t need much air conditioning,” agreed KC Mares of Megawatt Consulting, who has designed and built data centers for Yahoo and Google. “We humans like it pretty comfortable. Servers could care less. They’re a block of metal. Most equipment is warrantied to 85 degrees and vendors will still support it.”
The ‘Benchmark’ Has Moved
 
IBM Distinguished Scientist and former ASHRAE chairman Roger Schmidt has recently spoken about how the ASHRAE allowable conditions within data centres have been widened to allow for the use of external air (what Americans call airside economizers). The specification has moved from 20°C – 25°C to a more realistic and environmentally friendly 18°C – 27°C.
This provides the perfect conditions in the UK to fully utilise evaporative cooling. The move by ASHRAE was undertaken primarily in a bid to save energy in data centres.
Computerweekly.com, January 2009
Green datacentre consortium The Green Grid has introduced a free online tool and maps to help datacentre and facilities managers calculate whether they can use outside air to cool datacentre server rooms.
The software determines how much outside air – known as free cooling – is available for individual datacentres. The updates extend coverage of the tool and maps to 33 European countries. The Green Grid said the tool can help datacentre managers in Europe lower energy consumption and related costs. This could extend the life and improve the energy efficiency of datacentre facilities.
“Finding cooling options that use less power is critical not only for organisations that don’t have resources to build new facilities but also for those that want to save money,” said Vic Smith, Dell representative and EMEA technical work group chair of The Green Grid.
Datacentre managers can enter data about their facilities – including local energy costs, IT load, and facility load – to determine energy savings for individual facilities. The software also provides information about savings that could be obtained using water-side economisers.
According to the Green Grid, a one-megawatt (1,000kW) datacentre in Luton, with power at a cost of 8.6p per kWh, could save £312,000 per year using free cooling, or £192,000 per year using a water-side economiser.
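For readers who want to sanity-check figures like these, a minimal back-of-envelope version of the kind of estimate the tool performs is sketched below. The chiller power and free-cooling hours are illustrative assumptions chosen to land near the Luton figure; only the 8.6p/kWh tariff comes from the article, and the Green Grid tool itself works from local climate data and facility inputs.

```python
# Minimal sketch of a free-cooling savings estimate. Inputs are assumptions,
# not the Green Grid's actual model.

def free_cooling_savings(chiller_kw, free_hours, price_per_kwh):
    """Value of the chiller energy avoided while outside air does the cooling.

    chiller_kw: electrical draw of the mechanical cooling plant when running
    free_hours: annual hours where outside air alone can cool the facility
    price_per_kwh: local electricity price in GBP/kWh
    """
    return chiller_kw * free_hours * price_per_kwh

# Hypothetical 450 kW cooling plant, 8,000 free-cooling hours, 8.6p/kWh
print(f"£{free_cooling_savings(450, 8000, 0.086):,.0f}")  # ~£309,600
```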
Members of The Green Grid have access to a high-resolution graphical map of the estimated hours of air-side and water-side economisation possible for Europe. Lower-resolution maps of European free cooling estimates are available to the public at The Green Grid Web site.

Green IT: Microsoft takes the roof off its datacentre

Thursday 29 January 2009 12:13


Arne Josefsberg, general manager of infrastructure services at Global Foundation Services, Microsoft, says it is vital to monitor the average Power Usage Effectiveness (PUE) across all the company’s datacentres to understand how well datacentre operations are under control, and to allow the company to make the right business decisions.

Microsoft’s current annual global average PUE is 1.60, says Josefsberg.

PUE is a standard measurement for the power efficiency of datacentres, as recommended by global consortium The Green Grid.
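The metric itself is simple division: total power drawn at the wall divided by power delivered to the IT equipment. A minimal sketch, with illustrative numbers:

```python
# PUE per The Green Grid's definition: total facility power / IT load power.
# A PUE of 1.0 would mean every watt at the wall reaches the IT equipment.

def pue(total_facility_kw, it_load_kw):
    return total_facility_kw / it_load_kw

# Illustrative figures: 1,600 kW at the wall powering a 1,000 kW IT load
print(pue(1600, 1000))  # 1.6, matching the average Josefsberg cites
```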

But all is not as it seems, explains Alan Priestly, consortium spokesperson. “The initial focus was on energy efficiency, meaning facility load versus the IT load. Power comes in at the wall but only a certain percentage is delivered to the IT equipment. But how much of the energy [going to the servers] is converted to useful work? An IT department can have a datacentre that is very energy-efficient but is not doing a lot of useful work. Is that really efficient?

“The more complex thing is to measure the effective workload that the datacentre produces,” says Priestly.

The Green Grid is working on the concept, but for the moment PUE is the best measurement IT companies have, and it is the easiest thing to measure.

Carbon emissions

It does introduce a gap, however. As a whole, IT can increase its efficiency, but that does not mean carbon emissions will fall.

Microsoft recognises this mismatch between measurement and reality, and is keen to reduce the impact of its datacentres for the entire lifecycle and along the whole supply chain.

“We want to bring the concept of Moore’s law to energy efficiency in computing. We want to grow computing power but keep energy demands constant or reduced,” says Francois Ajenstat, Microsoft’s director of environmental sustainability.

Ajenstat says he is pushing the vision through all operations including original equipment manufacturers and suppliers.

“That is how we will manage the global growth in the IT industry,” he says.

Currently the company has more or less completed what it calls Generation 2 datacentres, with a focus on improved efficiency.

 

Its Dublin facility, for example, will use outside air all year round except for one or two days, and the company is exploring a broader range of operating environments in order to deploy chiller-less datacentres for huge power savings.

 

“At any given time we can see the carbon footprint of one datacentre compared to another. We can even go down and compare the footprints of servers such as Hotmail compared to Messenger,” says Ajenstat.

Through such metrics the company can charge its internal business groups by power consumption rather than floor or rack space.

“Business groups are becoming aware of their energy consumption and are making different decisions based on energy as a key consideration,” says Ajenstat.

Ajenstat’s law

Microsoft is now building Generation 3 (best represented by the Chicago, Illinois facility), founded on Ajenstat’s law of doubling efficiency every 18-24 months.

“The key concepts for our Generation 3 design are increased modularity and greater concentration around energy efficiency and scale. This facility will seem very foreign compared to the traditional datacentre,” says Microsoft’s Michael Manos on his Loose Bolts blog.

Key to efficiency savings is containerisation. Servers are pre-packed in lorry-transported containers complete with everything required to simply plug them into a modular datacentre. This offers interesting sustainability benefits because suppliers compete to provide the most efficient designs.

“Think of it like building blocks, where the datacentre will be composed of modular units of prefabricated mechanical, electrical, security components, etc, in addition to containerised servers,” writes Manos.

And other useful concepts have emerged. Designs for the next-generation, Gen 4, datacentres have no roof. According to Manos, a roof was entirely unnecessary. “How much energy goes into making concrete? How much energy goes into the fuel of the construction vehicles? We are asking, ‘how can we build a datacentre with less building?'”

Microsoft is gathering pace on its software as a service strategy, and through the application of Ajenstat’s efficiency law hopes IT carbon emissions will remain static at worst. The company is helping replace inefficient customer datacentres with state-of-the-art facilities.

“As we grow into the cloud computing space we will increase our carbon footprint as a datacentre operator,” says Ajenstat. “But it becomes an increasing value proposition for our customers where they can choose to use Microsoft’s infrastructure, rather than build their own.”

The company does not yet have plans to charge customers by energy consumption, however.

Reduce Data Center Cooling Cost by 75%

By Keith Dunnavant, P.E.; Mark Fisher; C. Mike Scofield, P.E., FASHRAE; and Tom Weaver, P.E.

April 1, 2009



New technology in the field of air-to-air indirect (dry) evaporative cooling (IEC) heat exchangers, coupled with the newly expanded ASHRAE cold aisle requirements, can answer data center cooling concerns. A recirculation air conditioning by evaporation (RACE) unit with a cooling energy efficiency ratio (EER) above 50 is illustrated in Figure 1. The heart of this central station air handler is a polymer air-to-air indirect evaporative cooling heat exchanger, shown in Figure 2. Hot aisle return air is pushed through the inside of the horizontal tubes and is sensibly cooled by a “scavenger” ambient airstream drawn upward across the wetted exterior surface of the tubes. Sufficient surface is provided to yield a 70% approach of the 100° hot aisle return air temperature to the ambient wetbulb (wb) condition of the outdoor air. The dry-side static pressure penalty for this dry cooling device is in the range of 0.5 to 0.8 in. w.g. Wet-side pressure losses are in the range of 0.8 to 1.3 in. w.g.3

FIGURE 2. The first stage of sensible cooling is provided by the EPX polymer tube air-to-air heat exchanger, using ambient wb temperatures for heat rejection through indirect evaporative cooling.

Figure 3 shows a psychrometric chart listing the cooling process points for Sacramento, CA, on the ASHRAE 0.4% wb design day. The first stage of cooling is IEC from 102° down to 81.4° using the 72° wb on the wet side of the heat exchanger to produce 18.9 tons of cooling for the 10,000 cfm recirculation air.

Only 5.4 tons of cooling remain to be provided by the direct expansion (DX) onboard refrigeration system. Rather than rejecting the heat of compression to a condenser coil on the roof where ambient drybulb (db) temperatures are 97° or higher, the condenser coil is located in the humid but cool 79.4° airstream off the sprayed IEC heat exchanger.

Almost like an evaporative cooled condenser heat rejection design, this system will have EERs in the range of 12 to 15 when refrigeration is required during high ambient humidity conditions. Unlike the evaporative-cooled condenser design, water treatment problems are not a concern, since the finned condenser coil remains dry. In addition to higher EERs, the benefits of this onboard DX design include higher compressor capacity at a lower refrigeration condensing temperature and increased compressor life due to the reduced temperature lift.4

A quick calculation indicates that, with a 70% wb depression efficiency, this RACE heat exchanger will produce all the cooling required at ambient wb temperatures below 64° when return air hot aisle temperatures are assumed to be 100° db.
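That quick calculation follows directly from the wet-bulb depression effectiveness relationship; a small sketch of it, with temperatures in °F as in the article:

```python
# Dry-side leaving temperature of an indirect evaporative cooler (IEC):
# the recirculated air approaches the ambient wet bulb with the stated
# wet-bulb depression effectiveness (70% for this heat exchanger).

def iec_leaving_db(return_db, ambient_wb, effectiveness=0.70):
    return return_db - effectiveness * (return_db - ambient_wb)

# 100°F hot-aisle return air against a 64°F ambient wet bulb:
print(round(iec_leaving_db(100, 64), 1))  # 74.8, i.e. the 75° cold-aisle target
```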

Using typical meteorological year (TMY2) weather data developed by the National Renewable Energy Laboratory (NREL) in Golden, CO, Figure 4 shows the number of hours per year that refrigeration could be eliminated for a 24/7/365 duty cycle in 35 cities throughout the U.S. During these hours, this IEC heat exchanger may provide a 75° cold aisle supply condition. The right side of Figure 4 shows the percent of mechanical cooling reduction, assuming a 2° fan heat addition, at the 0.4% ASHRAE wb design.

Since ambient db design temperatures always coincide with a lower wb condition, the cooling capacity of the heat exchanger is higher at the db design and the residual cooling tons left to refrigeration are lower. Like a cooling tower, the wb design is the critical design criterion for a RACE unit.

A NEW INDIRECT EVAPORATIVE COOLING HEAT EXCHANGER

Figure 2 shows the construction of a new IEC air-to-air heat exchanger. Polymer airfoil-shaped tubes are used to minimize air-side static pressure parasitic losses on the wetside of the tubes. The polymer material meets Underwriters Laboratories (UL) Standard 94V-0 flame spread. The heat exchanger has been tested and is compliant with UL900 Class II. Compliance with these standards is essential, since the heat exchanger is located within the building supply air duct system. A unique sealing method bonds the tubes to the tube-sheet, preventing water leakage from the wetside of the heat exchanger to the recirculated airflow on the dry side.

Hard water and high temperature differences require a robust air-to-air heat exchanger that can shed the mineral deposits caused by the indirect evaporative cooling process. The wb depression across the tubes (Figure 2) ranges from only 20° in humid climates to more than 40° in more arid climates. Water evaporation rates are consistent with those of cooling towers with comparable heat rejection. Required bleed rates for the spray water recirculation sump are a function of the evaporation rate and the water chemistry of the makeup water at the site. For most potable makeup water systems, a bleed rate equal to the evaporation rate will maintain sump dissolved solids at an acceptable level.
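The reasoning behind that bleed-rate guidance is the standard cycles-of-concentration balance: with bleed equal to evaporation, the sump runs at two cycles, roughly twice the makeup-water TDS. A sketch with illustrative flow and water-quality numbers:

```python
# Steady-state sump TDS for an evaporative sump with continuous bleed.
# Cycles of concentration = (evaporation + bleed) / bleed, so bleed equal
# to evaporation gives 2 cycles. The values below are illustrative only.

def sump_tds(makeup_tds_mg_l, evaporation_gpm, bleed_gpm):
    cycles = (evaporation_gpm + bleed_gpm) / bleed_gpm
    return makeup_tds_mg_l * cycles

# Bleed set equal to evaporation: sump stabilises at twice the makeup TDS
print(sump_tds(makeup_tds_mg_l=500, evaporation_gpm=3.0, bleed_gpm=3.0))  # 1000.0
```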

A 2002 installation of this IEC module, located in Death Valley, CA, has been monitored for water hardness contamination of the wet side of the polymer tubes. Total dissolved solids (TDS) in Death Valley potable water range from 240 to 19,104 mg/L, with an average of about 1,940 mg/L. During a site visit in October 2005, it was discovered that flexing of the polymer tubes during fan startup and shutdown had effectively worked to shed water hardness deposits into the sump. This self-cleaning feature extends the heat exchanger’s life expectancy, particularly in extremely hard water environments.5

WATER TREATMENT

FIGURE 3. The heat rejection process is plotted on a psychrometric chart for the Sacramento, CA, ASHRAE 0.4% wb design condition.

For a mission critical application such as a data center, the cooling system water treatment should be sustainable and fail-safe. A new, non-chemical water treatment that uses pulse-power technology is recommended for the IEC sump recirculation water.6

Originally developed for cold pasteurization in the food industry, this system encapsulates water hardness minerals and particles into a non-adherent powder that is harmlessly deposited in the bottom of the sump. The device controls scaling of the wetted heat exchanger tubes and biological growth in the sump water. Under proper operation, the pulse-power component will maintain clean sump water with low bacteria counts free of bio-film and eliminate the breeding ground for the amplification of Legionella and other waterborne pathogens.7

For multiple roof-mounted IEC units, a central sump may be designed to accumulate the spray water. One central sump reduces pumping energy. A single set of dual pumps, for redundancy, replaces the recirculation pump at each unit on the roof. A variable volume pump would maintain the required system head pressure in response to a demand at each unit for spray water to wet the IEC heat exchanger. Water treatment costs are reduced, and the weight of the sump water at each roof-mounted unit is eliminated. The total blow down water consumption for the evaporative cooled system is reduced, thereby reducing the demand for potable makeup water. The central sump may also be used as a gray water reservoir for flushing toilets and landscape irrigation, since the water treatment system does not add any chemicals.

EER CALCULATION FOR SACRAMENTO

EER is defined as the cooling energy delivered in Btuh divided by the Watts (W) of electrical energy consumed to produce the cooling effect. For the Sacramento example of a 10,000 cfm cooling design in Figure 3, the parasitic losses for the IEC heat exchanger consist of the following:

Energy to push the air through the dry side of the IEC: 1,350 W
Energy to pull the air through the wet side of the IEC: 2,170 W
Spray water recirculation pump energy: 750 W
Total energy consumed: 4,270 W

The sensible cooling produced by the IEC at the 0.4% ASHRAE wb design condition is equal to 231,000 Btuh; the EER therefore calculates to 54.1.

The refrigeration portion of the sensible cooling effect required to reach the 75° cold aisle delivery temperature is calculated to be 66,000 Btuh for the 10,000 cfm. The compressor energy input is 3,850 W, and the fan energy required to overcome the condenser coil static pressure loss is calculated to be 870 W. The mechanical cooling EER calculates to be 14.

The overall cooling EER for both IEC and refrigeration cooling pencils out to be 33. Compare this to a conventional CRAC system rejecting data center heat, on the ASHRAE 0.4% db design day, with an air cooled refrigeration design at an EER of only 10 to 12.

During winter operation, when the air-to-air heat exchanger operates without the spray pump energy loss, the EER increases to 67.6. Operating speed for the VFD on the outdoor air fan is reduced at low wb ambient conditions and lower db temperatures, saving fan energy during cold weather.
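The EER arithmetic in this section can be reproduced directly from the quoted figures; the only liberty taken below is rounding (the winter figure computes to 67.5 against the article's 67.6):

```python
# EER = cooling delivered (Btuh) / electrical input (W), using the
# Sacramento design-day figures quoted above.

def eer(cooling_btuh, power_w):
    return cooling_btuh / power_w

iec_w = 1350 + 2170 + 750                    # dry-side fan + scavenger fan + spray pump
print(round(eer(231_000, iec_w), 1))         # 54.1 - IEC stage alone
print(round(eer(66_000, 3850 + 870), 1))     # 14.0 - DX trim stage
print(round(eer(297_000, iec_w + 4720), 1))  # 33.0 - combined IEC + DX
print(round(eer(231_000, iec_w - 750), 1))   # 67.5 - winter, spray pump off
```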

COLD-AISLE TEMPERATURE CONTROL

Since data center cooling systems are essentially constant volume, close control of the cold aisle supply air temperature is essential. With a recirculation air design and sensible cooling of the supply air, room dewpoint conditions will not change except through moisture migration in or out of the space. During warm ambient conditions, where the air db temperatures are above 45°, the water recirculation pump and sprays will be on to wet the scavenger air side of the heat exchanger (Figures 1 and 2). The scavenger air fan VFD will control the mass flow of air on the wet side of the heat exchanger to maintain the 75° db supply air setpoint to the cold aisle.

FIGURE 4. Using TMY2 hour-by-hour weather data, these two bar charts show, for various U.S. cities, the number of hours per year (left) where mechanical cooling may be eliminated and the percentage reduction (right) of mechanical cooling at the ASHRAE 0.4% wb design condition.

With a rise above the 75° setpoint, and the scavenger air fan at full flow, the first stage of DX refrigeration cooling will be activated to maintain the 75° setpoint. The heat of compression is rejected to a condenser coil in the scavenger air exhaust, located downstream of the moisture eliminator (Figures 1 and 3).

Refrigeration EERs for the DX cooling stage are a function of the local design ambient wb condition at the coincident db temperature. The more arid the local climate, the higher the refrigeration EERs, since there will be a greater drop in the ambient air db temperature within the wet side of the air-to-air heat exchanger upstream of the refrigeration condenser coil. Figure 1 shows that Sacramento would provide a 17.6° reduction in the ambient db temperature in which to reject the refrigeration heat of compression.

At ambient db conditions below 45°, the water sprays and recirculation pump would not be required to reject the data center heat. With a 50% dry-to-dry heat transfer effectiveness, the scavenger air fan would again modulate the scavenger air across the heat exchanger at a mass flow sufficient to maintain the 75° cold aisle delivery temperature. Below 40° ambient db temperatures, the sump would be drained to protect against freezing.
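Pulled together, the staging described in this section amounts to a short control sequence. The sketch below paraphrases it: the 40°, 45°, and 75° setpoints come from the article, but the proportional fan-speed law is a placeholder assumption, not the manufacturer's actual control algorithm.

```python
# Simplified cold-aisle control sequence for a RACE unit, paraphrased from
# the article. The proportional fan law is a placeholder assumption.

SUPPLY_SETPOINT_F = 75.0

def control_step(ambient_db_f, supply_db_f):
    """Return (drain_sump, spray_pump_on, scavenger_fan_fraction, dx_stage_on)."""
    drain_sump = ambient_db_f < 40.0                  # freeze protection
    spray_pump_on = ambient_db_f > 45.0 and not drain_sump
    # Scavenger fan VFD modulates toward the 75°F supply setpoint
    error_f = supply_db_f - SUPPLY_SETPOINT_F
    fan_fraction = max(0.2, min(1.0, 0.6 + 0.2 * error_f))  # placeholder P-control
    # DX trim refrigeration engages only once the scavenger fan is at full flow
    dx_stage_on = fan_fraction >= 1.0 and error_f > 0.0
    return drain_sump, spray_pump_on, fan_fraction, dx_stage_on

print(control_step(ambient_db_f=95.0, supply_db_f=77.0))  # hot day: sprays, full fan, DX
```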

Data center architecture is critical to the successful application of the cold aisle airflow to the inlet of the electronics being cooled. Unfortunately, this design detail is often outside the province of the mechanical consultant engineer. Without effective separation of the hot aisle and cold aisle airflow paths, a data center is condemned to furnishing lower supply air temperatures and higher airflow rates. Data center design professionals need to work together closely to ensure that hot aisle air is not recirculated to the inlet of the electronics and that cold aisle air is not short-circuited to the hot aisle without passing through the electronics being cooled.

SPACE PRESSURE AND HUMIDITY CONTROL

A positive room pressure is required within the data center to reduce infiltration of outdoor air. A separate AHU that would introduce, filter, and condition the outdoor air is indicated for this task. When ambient humidity levels are below the Class I and Class II data center minimum dewpoint condition of 41.9°, the outdoor air must be humidified. When outdoor air dewpoints are above the 59° maximum, the outdoor air introduced needs to be dehumidified.

In cold climates where there are many annual hours of cold, dry outdoor air conditions to deal with, a unit with an adiabatic direct evaporative cooling/humidifying component should be considered. Because data centers generate so much heat, a 12-in.-deep wetted media pad selected at 400 fpm face velocity will provide free humidification and additional data center heat rejection.8

PARTICULATE AND GASEOUS CONTAMINATION

The introduction of outdoor air for data center cooling with an air economizer saves cooling energy but increases the exposure of the electronic equipment to contamination, corrosion, and humidity excursions. A recent ASHRAE Transactions paper discussed the effect of corrosive particulates and gases on computer reliability.9 The paper points out that dust that settles on printed circuit boards can lead to short circuiting in the presence of ambient moisture (humidity). The electrical shorting occurs when ionic bridges are created by the dust particles, accelerated by moisture from the environment.

The most important parameter controlling corrosion and short circuiting is the relative humidity at the inlet to the electronics. Research by the authors indicates that corrosion becomes negligible below 50% relative humidity. Data centers with airside economizers require real-time monitoring of the outdoor air. In the event of a sudden rise in the level of dust or gaseous contaminant in the outdoor air, the system should close off the external air source and revert back to a recirculation mode.
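That monitoring requirement is essentially an interlock on the economizer dampers. A minimal sketch of the decision logic follows; the dust threshold is an illustrative assumption, and only the 50% relative humidity figure comes from the research cited above.

```python
# Economizer interlock sketch: revert to recirculation on a contamination
# spike or high inlet relative humidity. The dust limit is a made-up threshold.

DUST_LIMIT_UG_M3 = 150   # hypothetical particulate alarm level
RH_LIMIT_PCT = 50        # corrosion reportedly negligible below 50% RH

def economizer_permitted(outdoor_dust_ug_m3, inlet_rh_pct, gas_alarm):
    """True if outside air may be used; False means close dampers and recirculate."""
    if gas_alarm or outdoor_dust_ug_m3 > DUST_LIMIT_UG_M3:
        return False
    return inlet_rh_pct < RH_LIMIT_PCT

print(economizer_permitted(40, 45, gas_alarm=False))   # True: economizer may run
print(economizer_permitted(400, 45, gas_alarm=False))  # False: recirculate
```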

SUMMARY

Figure 4 summarizes the data center cooling impact of the RACE design in 35 cities throughout the U.S. Humid climates such as Honolulu, Tampa, and New Orleans have the lowest percentage of annual hours where refrigeration may be eliminated. Surprisingly, these same cities, at the ASHRAE 0.4% wb design condition, would yield a better than 50% reduction in the refrigeration tons required to deliver 75° to the cold aisle. Northern, western, and higher-elevation locations show the greatest promise of energy savings. Cities where mechanical cooling is eliminated and where the onboard refrigeration system could serve as a backup include Anchorage, Colorado Springs, Helena, Reno, Redmond, Casper, and Cheyenne.

RACE units ensure the integrity of the electronic equipment by controlling data center dewpoint and limiting external contamination while offering a very efficient method of heat rejection.

Our Latest Jobs

Aberystwyth Council

Three years ago Celsius installed an EcoCooling data centre cooling system for Aberystwyth Council at their Aberaeron data centre. It worked so well, delivering all the savings promised (over 80 tonnes of carbon per annum) as part of their carbon reduction programme, that when the cooling system in the main data centre at their state-of-the-art head office required a new back-up system, they approached us for our innovative solution.

Read more

Go Outdoors

Celsius have recently installed evaporative cooling systems for six Go Outdoors stores that suffered from severe summer overheating problems. The first was at Taunton, and as this was so successful, the Oxford store soon followed. The Celsius EcoCooling evaporative cooling approach has now been established as Go Outdoors’ chosen solution for all of their stores; it fits in perfectly with their low-carbon approach.

Read more