IBM's green demo data centre (part 2)
Cooling efficiency in action
(Continued from part 1.)
IBM's energy-reducing technologies
IBM's Tivoli management product now includes energy measurement in its functionality. It can automate energy optimisation according to a policy set by the IT staff, for example switching servers off overnight or during other quiet periods.
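As a rough illustration of what such a policy might look like, here is a minimal sketch in Python. The names (QuietWindow, should_power_off) and the window logic are assumptions for illustration only, not Tivoli's actual API or behaviour:

```python
from datetime import time

class QuietWindow:
    """A daily window during which non-essential servers may be switched off.
    Hypothetical construct, not an IBM Tivoli object."""
    def __init__(self, start: time, end: time):
        self.start = start
        self.end = end

    def contains(self, now: time) -> bool:
        if self.start <= self.end:
            return self.start <= now < self.end
        # Window wraps past midnight, e.g. 22:00-06:00.
        return now >= self.start or now < self.end

def should_power_off(now: time, window: QuietWindow, essential: bool) -> bool:
    """Policy decision: switch off only non-essential servers inside the window."""
    return not essential and window.contains(now)

# An overnight quiet period of the kind the article describes.
overnight = QuietWindow(time(22, 0), time(6, 0))
print(should_power_off(time(2, 30), overnight, essential=False))  # True
print(should_power_off(time(2, 30), overnight, essential=True))   # False
print(should_power_off(time(12, 0), overnight, essential=False))  # False
```

The point of pushing this into a policy object rather than a cron job is that IT staff can declare intent (which servers, which hours) and let the management layer act on it.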
IBM's 'Cool Battery' is a fluid/solid thermal storage device installed between a chiller and the air-conditioning system. The idea is to switch chillers off when they are not needed by storing their cooled output in a kind of battery. Data centre air-conditioning needs a constant supply of cooling liquid, which until now has meant that chillers must run constantly as well, despite being inefficient power users.
So the chiller runs at full power until the water-based fluid in the battery, with added chemicals to vary its freezing point, freezes. (It's called a phase-change material.) Then the chiller is switched off and the air-conditioning fluid is cooled by passing through the cool battery, which gradually melts as it takes up heat from the air-con. When melting reaches a set point, the chiller is switched on again.
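This freeze/melt cycle is essentially hysteresis control. The sketch below models it in Python with made-up thresholds and rates; the actual control logic and setpoints are IBM's and are not given in the article:

```python
def chiller_state(melt_fraction: float, chiller_on: bool,
                  refreeze_at: float = 0.8, freeze_done: float = 0.0) -> bool:
    """Hysteresis control for the cool battery (illustrative thresholds).
    Run the chiller until the phase-change material is fully frozen, then
    let the battery absorb heat (melting) until it is mostly melted."""
    if chiller_on:
        # Keep chilling until the battery is fully frozen again.
        return melt_fraction > freeze_done
    # Battery is discharging; restart the chiller once mostly melted.
    return melt_fraction >= refreeze_at

# Toy simulation: the battery melts while the chiller is off
# and refreezes, faster, while it is on.
state = False   # chiller off, battery fully frozen
melt = 0.0
log = []
for _ in range(40):
    state = chiller_state(melt, state)
    melt += -0.1 if state else 0.05   # arbitrary freeze/melt rates
    melt = min(max(melt, 0.0), 1.0)
    log.append(state)
```

Running this, the chiller stays off for long stretches while the battery discharges, then switches on in short bursts to refreeze it, which is where the claimed energy saving comes from.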
Inside IBM, intermittent chilling with a cool battery cut energy draw by 45 percent compared with an unaided chiller over two years.
IBM can deliver cooling water direct to a rack through a heat exchanger installed in the rack itself, similar in concept to water-cooling an engine. It's reckoned to be more effective than air-cooling, as Porsche recognised with the water-cooled 996 iteration of its until-then air-cooled 911 sportscar range.
IBM is also working with suppliers of non-IT data centre kit to raise power efficiency: Emerson Network Power, Schneider Electric, Eaton Corporation and GE Consumer & Industrial, for example.
The South Bank IBM data centre
IBM has built a more energy-efficient data centre at its London South Bank office to exemplify its green approach. One wall is glass so that you can see how the racks are laid out inside.
Power and network cabling are integrated but run separately in the top or bottom of the racks, not under the floor. The underfloor space is used to get cooling air to the front of the racks. Unused slots in the racks are covered with blanking plates to improve the airflow within and between them.
These racks are taller and deeper than standard racks, at 45U high and 1.2 metres deep. IBM terms this an Integrated Racking Solution (IRS); the racks are, in effect, rooms within the data centre room. They are laid out in a cold aisle/hot aisle design with strong airflow, and there is 25 kilowatts of cooling capability per rack.
The data centre room itself is built from pre-fabricated components in a quasi-flat-pack design. It could be installed inside an existing data centre and save space by incorporating high-density blade-based computing inside the racks.
As a data centre it has a greater IT capability than a traditional data centre but with lower running costs, meaning lower carbon emissions, partly through a smaller number of CRAC (computer room air conditioner) units.
IBM's approach differs from that of APC. Steve Bowden, a green computing consultant in IBM's systems and technology group, says that APC doesn't use the cool battery concept and encloses the hot aisles, whereas IBM encloses the cold aisles.
He asserts: "IBM is best-positioned with its technology to help the world with power and cooling data centre issues."
IBM's general recommendation is to take a holistic approach to improving data centre efficiency rather than looking at things in isolation.