Data Centers Work Hard to Be Cool

Data centers now house more efficient, densely packed servers and IT gear that nonetheless releases more heat, creating a green challenge

September 21, 2010

NEW YORK -- Network operators face major challenges in keeping data centers cool, particularly as they add advanced equipment that is more densely packed but also consumes more power and throws off more heat, industry experts noted at Light Reading's Green Broadband Event today.

But progress is being made, and US telecom operators are catching up to the rest of the world, says Chuck Graff, director of corporate network and technology for Verizon Communications Inc. (NYSE: VZ). He admits he was embarrassed three years ago, at an international event in Berlin, by how far behind the US was. Today, 37 percent of US carriers say they have created a green strategy and an implementation plan.

"I think that's a positive number, given where we were a few years ago," Graff says. "We are moving in the right direction. Legacy equipment is a challenge, because we don't have the wherewithal to just replace everything that is out there with things that are more efficient."

Some popular alternative cooling methods such as Kyoto cooling, which uses outside "free" air to cool a data center, are really greenfield technologies that need to be built in when a facility is created.
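
By way of illustration only -- this is not vendor logic described at the event -- the basic free-cooling decision such a system makes can be sketched in a few lines of Python. The supply setpoint and heat-exchanger approach temperature below are assumed figures:

```python
# Illustrative free-cooling sketch: outside air can carry the load only
# when it is sufficiently cooler than the desired supply temperature.
# Both thresholds below are assumptions, not figures from the event.

SUPPLY_SETPOINT_F = 75.0  # assumed cold-aisle supply temperature
APPROACH_F = 5.0          # assumed heat-exchanger approach temperature

def cooling_mode(outdoor_temp_f: float) -> str:
    """Choose a cooling mode for a given outdoor temperature."""
    if outdoor_temp_f <= SUPPLY_SETPOINT_F - APPROACH_F:
        return "free cooling"       # outside air alone carries the load
    return "mechanical (CRAC)"      # compressors make up the difference

for temp_f in (40, 68, 72, 85):
    print(f"{temp_f}F outside -> {cooling_mode(temp_f)}")
```

That decision only matters if the air path exists, which is why such systems are best planned at design time: the ductwork and heat-exchange path between outside air and the data hall has to be built into the facility before the control logic has anything to act on.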

For the most part, telecom operators are looking for ways to retrofit data centers, central offices, and headends in ways that aren't so expensive, admits Peter Hayden, director of sustainable power services for Alcatel-Lucent (NYSE: ALU)'s Network and Systems Integrator unit.

Where once equipment was designed for heat densities of 1 to 2 kW per rack, today's data gear consumes more like 10 to 20 kW per rack, Hayden says. That means traditional cooling methods, which pump cool air down from ducts or up from the floor, are often not enough, forcing computer room air conditioners (CRACs) to work harder and consume more energy.
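
To put those densities in perspective, here is a minimal back-of-the-envelope sketch in Python (an illustration, not anything presented at the event) using the common sensible-heat rule of thumb for sea-level air: airflow in CFM is roughly 3.16 times the load in watts divided by the supply/return temperature difference in degrees Fahrenheit. The 20-degree difference used below is an assumption:

```python
# Back-of-the-envelope cooling airflow per rack. Rule of thumb for
# sea-level air: BTU/hr = 1.08 x CFM x delta_T(F), and 1 W = 3.412 BTU/hr,
# which reduces to CFM ~= 3.16 x watts / delta_T. Illustrative only.

def required_cfm(heat_load_watts: float, delta_t_f: float = 20.0) -> float:
    """Approximate airflow (cubic feet/min) needed to remove a heat load."""
    return 3.16 * heat_load_watts / delta_t_f

for load_kw in (2, 10, 20):
    print(f"{load_kw:>2} kW rack -> ~{required_cfm(load_kw * 1000):,.0f} CFM")
```

At 20 kW per rack, the airflow requirement is roughly ten times what the same rack needed at 2 kW, which is why the traditional ducted or raised-floor approach runs out of headroom.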

There are ways to address the issue without too much expense, Hayden says. Companies are taking steps such as arranging equipment to create hot and cold aisles within the data center; installing blanking panels in unused rack slots to keep hot exhaust from recirculating; or hanging clear plastic baffles to channel the airflow.

Each option has disadvantages, Hayden admits, such as creating areas where the data center is uncomfortably warm. Some solutions can be difficult to implement with old equipment or may require additional duct work to move warm air back to the CRAC to be cooled.

But such options are becoming critically important, says David Quirk, senior engineer with Verizon Wireless.

"We are getting densities where cooling is critical. We rely on it to maintain service," Quirk says. Equipment failure or shortened equipment life are distinct possibilities if cooling isn't handled properly.

As IT equipment is pushed out into more facilities, network operators find themselves with many different power supply options and requirements, he says. The need to build out quickly and meet market demand can even force service providers to accept equipment that doesn't meet their usual standards -- AC-powered equipment instead of traditional DC gear, for example.

Operators are adopting different approaches to arranging equipment, balancing the cost of various solutions against the need to get services to market quickly, Quirk says.

Alcatel-Lucent also offers a Modular Cooling Solution designed for retrofits, using copper piping and hoses to bring low-pressure pumped refrigerant close to the equipment cabinets themselves for more efficient cooling, Hayden says.

— Carol Wilson, Chief Editor, Events, Light Reading
