The Data Center Temperature Debate
Though never directly articulated by any data center authority, the prevailing practice surrounding these critical facilities has often been "the colder, the better." However, some leading server manufacturers and data center efficiency experts share the opinion that data centers can run far hotter than they do today without sacrificing uptime, with large savings in both cooling-related costs and CO2 emissions. One server manufacturer recently announced that its rack of servers can operate with inlet temperatures of 104 deg F.

Why does it feel the need to push the envelope? The cooling infrastructure is an energy hog. This system, operating 24x7x365, consumes a great deal of electricity to create the "optimal" computing environment, which may hover anywhere between 55 and 65 deg F. (The current recommended range from ASHRAE is 18-27 deg C, or 64.4 to 80.6 deg F.) To achieve efficiencies, a number of influential end users are running their data centers warmer and are advising their contemporaries to follow suit. But the process isn't as simple as raising the thermostat in your home. Here are some of the key arguments and considerations:

Position: Raising server inlet temperature will realize significant energy savings.

Arguments For:
· Sun Microsystems, both a prominent hardware manufacturer and data center operator, estimates a 4% savings in energy costs for every one (1) degree increase in server inlet temperature (Miller, 2007). A rough back-of-the-envelope sketch of this heuristic appears after the position summaries below.
· A higher temperature setting can mean more hours of free cooling through air-side or water-side economizers. This is especially compelling in an area like San Jose, California, where outside air (dry-bulb) temperatures are at or below 70 deg F for 82% of the year. Depending on geography, the annual savings from economization could exceed six figures.

Arguments Against:
· The cooling infrastructure has certain design setpoints. How do we know that raising server inlet temperature won't result in false economy, causing additional, unnecessary consumption in other components like the server fans, pumps, or compressors?
· Free cooling, while attractive for new data centers, is an expensive proposition for existing ones. The entire cooling infrastructure would require re-engineering and may be cost prohibitive and unnecessarily complex.
· Costs from thermal-related equipment failures or downtime will offset the savings realized from a higher temperature setpoint.

Position: Raising server inlet temperature complicates reliability, recovery, and equipment warranties.

Arguments For:
· Inlet air and exhaust air frequently mix in a data center. Temperatures are kept low to offset this mixing and to keep the server inlet temperature within ASHRAE's recommended range. Raising the temperature could exacerbate existing hotspots.
· Cool temperatures provide an envelope of cool air in the room, an asset in the case of a cooling system failure. The staff may have more time to diagnose and repair the problem and, if necessary, shut down equipment gracefully.
· In the case of the 104 deg F server, what is the chance that every piece of equipment, from storage to networking, would perform reliably? Would all warranties remain valid at 104 deg F?

Arguments Against:
· Raising the data center temperature is part of an efficiency program. The temperature increase must follow best practices in airflow management: using blanking panels, sealing cable cutouts, eliminating cable obstructions under the raised floor, and implementing some form of air containment.
These measures can effectively reduce the mixing of hot and cold air and allow for a safe, practical temperature increase.
· The 104 deg F server is an extreme case that encourages thoughtful discussion and critical inquiry among data center operators. After such a study, a facility that once operated at 62 deg F might now operate at 70 deg F. These changes can significantly improve energy efficiency without compromising availability or equipment warranties.

Position: Servers are not as fragile and sensitive as one may think. Studies performed in 2008 underscore the resiliency of modern hardware.

Arguments For:
· Microsoft ran servers in a tent in the damp Pacific Northwest from November 2007 through June 2008. They experienced no failures.
· Using an air-side economizer, Intel subjected 450 high-density servers to the elements, with temperatures as high as 92 deg F and relative humidity ranging from 4 to 90%. The server failure rate during this experiment was only marginally higher than in Intel's enterprise facility.
· Data centers can operate with temperatures in the 80s and still be ASHRAE compliant. The upper limit of ASHRAE's recommended temperature range increased to 80.6 deg F (up from 77 deg F).

Arguments Against:
· High temperatures, over time, affect server performance. Server fan speed, for instance, will increase in response to higher temperatures. This wear and tear can shorten the device's life.
· Studies from data center behemoths like Microsoft and Intel may not be relevant to all businesses:
  o Their enormous data center footprint is more tolerant of the occasional server failure that may result from excessive heat.
  o They can leverage their buying power to receive gold-plated warranties that permit higher temperature settings.
  o They are most likely refreshing their hardware at a more rapid pace than other businesses. If a server is completely spent after 3 years, that is no big deal; a smaller business may need that server to last longer than 3 years.

Position: Higher inlet temperatures may result in uncomfortable working conditions for data center staff and visitors.

Arguments For:
· Consider the 104 deg F rack. The hot aisle could be anywhere from 130 to 150 deg F. Even the higher end of ASHRAE's operating range (80.6 deg F) would result in hot aisle temperatures around 105-110 deg F. Staff servicing these racks would endure very uncomfortable working conditions.
· Responding to higher temperatures, server fans will speed up to move more air. The increased fan speed raises the noise level in the data center. The noise may approach or exceed OSHA sound limits, requiring occupants to wear ear protection.

Arguments Against:
· It goes without saying that as the server inlet temperature increases, so does the hot aisle temperature. Businesses must carefully balance worker comfort against energy efficiency efforts in the data center.
· Not all data center environments have high user volume. Some high-performance/supercomputing applications operate in a lights-out environment and contain a homogeneous collection of hardware. These applications are well suited to higher temperature setpoints.
· The definition of "data center" is more fluid than ever. The traditional brick-and-mortar facility can add instantaneous compute power through a data center container without a costly construction project. The container, segregated from the rest of the building, can operate at higher temperatures and achieve greater efficiencies. (Some close-coupled cooling products function similarly.)
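Before turning to conclusions, here is a minimal back-of-the-envelope sketch of the 4%-per-degree heuristic cited above (Miller, 2007). The function name, the baseline cooling-energy figure, and the choice to apply the heuristic as a compounding reduction are illustrative assumptions, not from the source; actual savings depend on the cooling plant, climate, and airflow management.

```python
def estimated_cooling_savings(baseline_kwh: float,
                              setpoint_increase_deg_f: float,
                              savings_per_degree: float = 0.04) -> float:
    """Rough cooling-energy savings from raising the inlet setpoint.

    Applies the ~4%-per-degree heuristic attributed to Sun Microsystems
    (Miller, 2007) as a compounding reduction. Whether the effect is
    linear or compounding in a given facility is an open question; this
    is only a first-order screen, not an engineering model.
    """
    remaining_fraction = (1.0 - savings_per_degree) ** setpoint_increase_deg_f
    return baseline_kwh * (1.0 - remaining_fraction)


if __name__ == "__main__":
    # Hypothetical figures: a facility spending 1,000,000 kWh per year on
    # cooling, weighing a move from a 62 deg F to a 70 deg F inlet setpoint.
    baseline = 1_000_000            # annual cooling energy, kWh (assumed)
    increase = 70 - 62              # degrees F of setpoint increase
    saved = estimated_cooling_savings(baseline, increase)
    print(f"Estimated annual savings: {saved:,.0f} kWh "
          f"({saved / baseline:.0%} of cooling energy)")
```

Treating the heuristic as compounding keeps the estimate conservative for larger setpoint increases; either reading should only be used to decide whether a detailed engineering and airflow study is worth commissioning.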
Conclusions

The movement to raise data center temperatures is gaining momentum, but it will face opposition until these concerns are addressed. Reliability and availability sit at the top of any IT professional's performance plan. For this reason, most have so far decided to err on the side of caution: keep it cool at all costs. Yet higher temperatures and reliability are not mutually exclusive. There are ways to safeguard your data center investments and become more energy efficient.

Temperature is inseparable from airflow management; data center professionals must understand how air gets around, into, and through their server racks. Computational fluid dynamics (CFD) can help by analyzing and charting projected airflow on the data center floor, but because cooling equipment does not always perform to spec and the input data can miss key obstructions, onsite monitoring and adjustment are critical to ensure that the CFD data and calculations remain accurate.

Data centers with excess cooling are prime environments in which to raise the temperature setpoint. Those with hotspots or insufficient cooling can start with low-cost remedies like blanking panels and grommets. Close-coupled cooling and containment strategies are especially relevant, because server exhaust air, so often the cause of thermal challenges, is isolated and prevented from entering the cold aisle.

With airflow addressed, users can focus on finding their sweet spot: the ideal temperature setting that aligns with business requirements and improves energy efficiency. Finding it requires proactive measurement and analysis. But the rewards are well worth the effort: smaller energy bills, an improved carbon footprint, and a message of corporate responsibility.

Bibliography

Miller, R. (2007, September 24). Data Center Cooling Set Points Debated. Retrieved February 19, 2009, from Data Center Knowledge: http://www.datacenterknowledge.com/archives/2007/09/24/data-center-cooling-set-points-debated