Serving up power saving

Need to cut back on power consumption? Check the data centre.
By Philip Hampton, CTO at Powermode.
Johannesburg, 17 Jul 2008

Most medium and large companies have experienced a sevenfold increase in power demands related to their IT infrastructures over the last decade. This estimate is based on the typical growth of the data centre to accommodate new technologies.

But this presents a problem. With Eskom calling for a 10% reduction in power consumption, and steep electricity price increases in the pipeline, companies are now looking for ways to reduce electrical usage and costs.

Does this mean putting the brakes on expansion?

Not necessarily. In many instances the data centre is responsible for a significant percentage - up to 40% - of a company's total electricity consumption.

This makes the data centre a good place to launch a 'best practices' energy specification exercise in a bid to reduce power demands.

In order to be successful, it is important to identify the factors that contribute to power consumption within the data centre.

It is generally accepted that IT equipment consumes about 50% of the power and the air-conditioning system a further 30%, with supplementary systems accounting for the remaining 20%.

A comprehensive plan to efficiently and cost-effectively address data centre power and cooling is necessary in order to achieve a meaningful (at least 10%) reduction in energy consumption.
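To illustrate how those shares translate into an overall saving, the back-of-the-envelope sketch below (in Python) applies the 50/30/20 split to an assumed 100kW facility with assumed reduction targets - only the split itself is taken from the figures above.

```python
# Illustrative only: the 50/30/20 split quoted above, applied to an
# assumed 100 kW facility. The reduction targets are also assumptions.
total_kw = 100.0
split = {
    "IT equipment": 0.50,
    "air-conditioning": 0.30,
    "supplementary systems": 0.20,
}
reductions = {
    "IT equipment": 0.15,          # e.g. consolidation, better power supplies
    "air-conditioning": 0.10,      # e.g. better airflow, economisers
    "supplementary systems": 0.00,
}

saved_kw = sum(total_kw * share * reductions[name]
               for name, share in split.items())
print(f"Saved: {saved_kw:.1f} kW of {total_kw:.0f} kW "
      f"({saved_kw / total_kw:.1%})")   # roughly 10.5 kW, or about 10.5%
```

The knock-on effect is usually larger still, since every kilowatt shaved off the IT load also reduces the amount of heat the air-conditioning has to remove.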

As the IT equipment is clearly the biggest consumer of electricity, it's the first place to look for energy savings.

But before savings can be realised, it's important to measure the energy consumption of the servers and other key elements of the centre. Very few companies do this.
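As a minimal sketch of what that measurement involves - the readings, sampling interval and tariff below are assumed purely for illustration - spot power readings taken at regular intervals can be converted into energy consumed and cost:

```python
# Minimal sketch: converting regularly sampled power readings (watts)
# into energy (kWh) and cost. All figures are assumed for illustration.
readings_w = [4200, 4350, 4100, 4500, 4280]   # e.g. hourly readings from a metered PDU
interval_h = 1.0                              # hours between readings
tariff = 0.45                                 # assumed tariff, in rand per kWh

energy_kwh = sum(readings_w) * interval_h / 1000.0
print(f"Energy: {energy_kwh:.1f} kWh, cost: R{energy_kwh * tariff:.2f}")
```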

Power supplies/servers

One of the least efficient pieces of equipment in a data centre is the server power supply. Almost 25% of the energy used to power 1U and 2U servers could be saved by simply improving server power supply efficiency.

Most power supplies manufactured before 2005 did not break the 70% power efficiency barrier. Today, best-in-class power supplies are available that deliver efficiencies of close to 90%.

The difference between a pre-2005 and a new power supply could be as much as 11% in actual power consumption.
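A quick illustration of where such a figure can come from, assuming a 400W server load - only the two efficiency figures and the roughly 50% IT share mentioned earlier are taken from the discussion above:

```python
# Illustrative arithmetic with an assumed 400 W DC load per server.
dc_load_w = 400.0
old_input_w = dc_load_w / 0.70   # ~571 W drawn from the mains at 70% efficiency
new_input_w = dc_load_w / 0.90   # ~444 W drawn from the mains at 90% efficiency

server_saving = 1 - new_input_w / old_input_w   # ~22% less power per server
facility_saving = server_saving * 0.50          # IT gear is ~50% of the total load
print(f"Per server: {server_saving:.0%}, facility-wide: {facility_saving:.0%}")
# -> roughly 22% per server, or about 11% of total facility consumption
```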

Another approach would be to simply use fewer servers. Many companies are opting for this solution today by implementing server consolidation projects.

Virtualisation software extends the benefits of physical consolidation even further. With this approach, applications run on virtual machines - several virtual servers on one physical box - which consume computing resources based on an application's needs. This allows for even more efficient use of a server's capabilities.

Virtualisation can produce significant results, often reducing the number of physical servers by 80% or more.
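A rough sketch of the arithmetic behind such a consolidation - only the 80% reduction in physical servers comes from above; the wattages are assumed, with the remaining hosts assumed to draw more because they run busier:

```python
# Rough consolidation estimate. Only the 80% server-count reduction is
# taken from the text; all wattages are assumed for illustration.
servers_before = 100
draw_before_w = 350.0                 # assumed draw of a lightly loaded 1U/2U server

servers_after = servers_before // 5   # 80% fewer physical boxes
draw_after_w = 550.0                  # assumed draw of a busier virtualisation host

before_kw = servers_before * draw_before_w / 1000
after_kw = servers_after * draw_after_w / 1000
print(f"Server power: {before_kw:.0f} kW before, {after_kw:.0f} kW after "
      f"({1 - after_kw / before_kw:.0%} less)")   # and less heat to cool
```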

Similar benefits can often be realised with data storage consolidation.

Managing the environment

By managing the data centre environment more efficiently, energy costs can be reduced further.

The cooling system needs particular attention. In this regard, effective sealing of the data centre is critical - as is the installation of a vapour barrier that isolates the controlled environment from the building environment.

Obviously, humidity from outside the centre makes cooling more challenging - and more costly.

The most efficient vapour seals use plastic film and incorporate special wall and floor coverings. Vapour-retardant paint is also effective.

Another key area of focus should be the airflow within the data centre. Ideally, airflow should be arranged to exploit the natural (design) flow of air from the front to the back of the equipment.

In other words, the racks should be set up so that there are distinct 'hot aisle' and 'cold aisle' features to the floor plan.

This approach arranges racks front-to-front so the cooling air rising into the cold aisle is pulled through the front of the racks on both sides of the aisle and exhausted at the back of the racks into the hot aisle.

From a construction perspective, only cold aisles should have perforated tiles, while floor-mounted cooling should be placed at the end of the hot aisles - not parallel to the row of racks.

Parallel placement can cause air from the hot aisle to be drawn across the top of the racks and to mix with the cold air, causing insufficient cooling to equipment at the top of racks and reducing overall energy efficiency.

Under-floor airflow management should also be taken into account. Cabling is often a problem in raised-floor data centres because it obstructs airflow and hinders heat dissipation.

Winter cooling

As it is mid-winter, IT managers (particularly on the highveld) have an ideal opportunity to augment traditional data centre air-conditioning systems with outside air.

But it's not as simple as opening the windows. So-called 'economiser systems' are necessary. There are two types available.

* Air-side economisers allow calibrated amounts of filtered outside air to enter a data centre to aid cooling.
* Fluid-side economisers incorporate a supplementary chilled-water or glycol-based cooling system.

Studies have found that, on average, economisers can reduce power consumption by 13%.

Part-load efficiency

Because data centre air-conditioning systems are designed with some level of built-in redundancy in terms of capacity, they often operate in a 'partial-load' condition.

This presents an opportunity to optimise these systems for efficient operation at partial load, rather than only at full capacity.

Sensible heat vs latent heat

As data centres become crammed with more equipment - and as server densities rise - so more sensible heat is generated.

Sensible heat is the dry heat - given off mainly by the IT equipment - that raises the air temperature. It must not be confused with latent heat, which is associated with moisture and comes from outdoor humidity and from people working in the data centre.

Usually, the latent heat load remains fairly constant; it is the sensible heat load that keeps climbing as equipment is added.

Therefore, placing the emphasis on cooling solutions that can operate at maximum sensible heat capacity - dehumidifying only when it is actually needed - will result in energy savings.
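A common way to express this is the sensible heat ratio (SHR): the share of a cooling unit's work that goes into dry (sensible) cooling rather than dehumidification. The load figures in this small sketch are assumed purely for illustration:

```python
# Sensible heat ratio (SHR) = sensible load / (sensible + latent load).
# The closer to 1.0, the less capacity is spent on dehumidification.
sensible_kw = 90.0   # assumed dry heat load from the IT equipment
latent_kw = 10.0     # assumed moisture load from people and outside air

shr = sensible_kw / (sensible_kw + latent_kw)
print(f"Sensible heat ratio: {shr:.2f}")   # 0.90 for these assumed loads
```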

With this in mind, it is also important to coordinate the operation of various cooling systems and units.

For example, air-conditioners may be operating in different areas and require individual temperature and humidity control. On the north side of a building, temperatures will be higher than on the south side, which may also have lower relative humidity.

Control systems should be deployed that coordinate these units, with power saving as the key goal.
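As a very rough illustration of the kind of coordination rule such a control system might apply - the zones, readings, setpoint and deadband here are all hypothetical - a shared humidity target with a generous deadband stops neighbouring units from humidifying and dehumidifying at the same time:

```python
# Hypothetical coordination rule: a shared humidity setpoint and a wide
# deadband prevent neighbouring units from doing opposite work at once.
zones_rh = {"north": 38.0, "south": 58.0}   # assumed relative humidity readings (%)
target_rh, deadband = 45.0, 10.0            # assumed shared setpoint and deadband

for zone, rh in zones_rh.items():
    error = rh - target_rh
    if abs(error) <= deadband:
        action = "hold"          # within the deadband: do nothing, save power
    elif error > 0:
        action = "dehumidify"
    else:
        action = "humidify"
    print(f"{zone}: RH {rh:.0f}% -> {action}")
```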

Procurement

Finally, it's important to establish an IT equipment procurement policy that exploits the energy-efficiency benefits of low power processors and high-efficiency power supplies.

Today, inefficient data centre systems are being phased out by the manufacturers and replaced with higher efficiency units.

Purchasing these units will help create a solid platform for an energy-optimised data centre.

* Philip Hampton is CTO at Powermode.
