Data centers are the massive engines under the hood of the mobile internet economy. And it is no secret that they demand a lot of energy: with power capacities ranging from 10 MW to 100 MW, they can draw up to 80,000 times more power than a typical US home.

And yet, you do not have to be a genius to figure out how those enormous energy bills could be reduced. The main energy gobblers are the CRACs (Computer Room Air Conditioners) or their alternative, the CRAHs (Computer Room Air Handlers). Most data centers still rely on some form of mechanical cooling. To an outsider it looks pretty wasteful, even stupid, that a data center consumes energy to cool servers down while the outside air in a mild climate is more than cold enough most of the time (less than 20°C/68°F).

Free cooling

Quite a few data centers have embraced "free cooling" completely, i.e. cooling with outside air. Microsoft's data center in Dublin uses large air-side economizers and makes good use of the lower temperature of the outside air.

Microsoft's data center in Dublin: free cooling with air economizers (source: Microsoft)

The air-side economizers bring outside air into the building and distribute it via a series of dampers and fans; hot air is simply flushed outside. As mechanical cooling typically accounts for 40-50% of a traditional data center's energy consumption, it is clear that enormous energy savings are possible with "free cooling".
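The damper logic described above can be sketched in a few lines. This is a minimal, hypothetical controller: the temperature thresholds are illustrative (real economizer controllers also account for humidity, air quality, and hysteresis).

```python
# Hypothetical air-side economizer control sketch. Thresholds are
# illustrative assumptions, not values from any real controller.

def cooling_mode(outside_temp_c: float, supply_setpoint_c: float = 24.0) -> str:
    """Pick a cooling mode based on the outside air temperature."""
    if outside_temp_c <= supply_setpoint_c - 4.0:
        # Outside air is comfortably below the supply setpoint:
        # open the dampers, push outside air in, flush hot air out.
        return "free cooling"
    elif outside_temp_c <= supply_setpoint_c:
        # Marginal: mix outside air with some mechanical cooling.
        return "partial free cooling"
    else:
        # Too warm outside: fall back to the CRAC/CRAH units.
        return "mechanical cooling"

print(cooling_mode(15.0))  # → free cooling
print(cooling_mode(30.0))  # → mechanical cooling
```

In a mild climate where the outside air stays below 20°C most of the year, a controller like this would run the first branch the vast majority of the time, which is exactly where the savings come from.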

Air economizers in the data center

This is easy to illustrate with the most important - although far from perfect - benchmark for data centers: PUE, or Power Usage Effectiveness. PUE is simply the ratio of the total energy consumed by the data center as a whole to the energy consumed by the IT equipment alone. Ideally it is 1, which means that all energy goes to the IT equipment. Most data centers that host third-party IT equipment are in the range of 1.4 to 2. In other words, for each watt consumed by the servers/storage/network equipment, another 0.4 to 1 watt is needed for cooling, ventilation, UPS, power conversion and so on.

The "single-tenant" data centers of Facebook, Google, Microsoft and Yahoo that use "free cooling" to its full potential are able to achieve an astonishing PUE of 1.15-1.2. You can imagine that the internet giants save massive amounts of energy this way. But as you might have guessed, most enterprises and "multi-tenant" data centers cannot simply copy the data center technologies of the internet giants. According to a survey of more than 500 data centers conducted by The Uptime Institute, the average PUE is 1.8. There is still a lot of room for improvement.
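A back-of-the-envelope calculation shows what that gap is worth. Assuming a hypothetical facility with a constant 1 MW IT load (an illustrative figure, not from the survey), moving from the average PUE of 1.8 to the internet giants' 1.2 saves:

```python
def facility_power_kw(it_kw: float, pue: float) -> float:
    """Total facility power implied by an IT load and a PUE."""
    return it_kw * pue

it_kw = 1000.0                                  # assumed constant 1 MW IT load
before = facility_power_kw(it_kw, 1.8)          # 1800 kW total at PUE 1.8
after = facility_power_kw(it_kw, 1.2)           # 1200 kW total at PUE 1.2
savings_kwh_per_year = (before - after) * 24 * 365
print(savings_kwh_per_year)  # → 5256000.0
```

Over 5 GWh per year for a single megawatt of IT load, without touching the servers themselves.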

Let's see what the hurdles are and how buying the right servers could lead to much more efficient data centers and ultimately an Internet that requires much less energy.

Comments

  • bobbozzo - Tuesday, February 11, 2014 - link

    "The main energy gobblers are the CRACs"

    Actually, the IT equipment (servers & networking) use more power than the cooling equipment.
    ref: http://www.electronics-cooling.com/2010/12/energy-...
    "The IT equipment usually consumes about 45-55% of the total electricity, and total cooling energy consumption is roughly 30-40% of the total energy use"

    Thanks for the article though.
  • JohanAnandtech - Wednesday, February 12, 2014 - link

That is the whole point, isn't it? IT equipment uses power to be productive; everything else is supporting the IT equipment and thus overhead that you have to minimize. Of the facility power, the CRACs are the biggest power gobblers.
  • bobbozzo - Tuesday, February 11, 2014 - link

    So, who is volunteering to work in a datacenter with 35-40C cool aisles and 40-45C hot aisles?
  • Thud2 - Wednesday, February 12, 2014 - link

80,000, that sounds like a lot.
  • CharonPDX - Monday, February 17, 2014 - link

    See also Intel's long-term research into it, at their New Mexico data center: http://www.intel.com/content/www/us/en/data-center...
  • puffpio - Tuesday, February 18, 2014 - link

On the first page you mention "The "single-tenant" data centers of Facebook, Google, Microsoft and Yahoo that use "free cooling" to its full potential are able to achieve an astonishing PUE of 1.15-1.2."

    This article says that Facebook has achieved a PUE of 1.07 (https://www.facebook.com/note.php?note_id=10150148...
  • lwatcdr - Thursday, February 20, 2014 - link

    So I wonder when Google will build a data center in say North Dakota. Combine the ample wind power with cold and it looks like a perfect place for a green data center.
  • Kranthi Ranadheer - Monday, April 17, 2017 - link

    Hi Guys,

    Does anyone by chance have recorded data of temperature and processor speed in a server room? Or can someone give me the high-end and low-end values measured in any server room, considering temperature versus processor speed?
