Green Computing Helps in Zero Energy Equation

April 14, 2010

Photo of two men watching as a third man goes over blueprints in the data center of NREL's Research Support Facility.

Kevin Donovan, data center manager for NREL's Research Support Facility, goes over blueprints with NREL IT project manager Craig Robben and NREL Data Center Coordinator Justin Peltz in the building's data center. The building's cooling system, taking advantage of Colorado's cool nights, is considered key to achieving net-zero energy use when the facility opens this summer.
Credit: Pat Corkery

It's a daunting challenge erecting the largest net-zero-energy office building in the world.

It's especially daunting when that building will be full of people computing, teleconferencing and churning through teraflops of renewable energy calculations.

The National Renewable Energy Laboratory aims to generate as much energy as it uses in the new 222,000-square-foot U.S. Department of Energy Research Support Facility (RSF) on the NREL campus. When completed this summer, it will house more than 800 people and a data center that stores and manages mountains of information on computer servers.

A net-zero-energy building produces as much energy as it consumes.

To get there, "every watt has to count," says Craig Robben, Information Technology project manager for the RSF.

The RSF can't get to net-zero without credit for the energy produced by the sun and wind on its Golden, Colo., campus.

But neither can it get there if the actual energy use in the new building gets above about 250 watts per person. "That's four or five light bulbs per person — for everything — computers, servers, building systems, the exercise room, everything," Robben told a roomful of employees at a recent informational gathering.
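
The arithmetic is easy to check. Here is a quick sketch in Python, using the article's figures of roughly 800 occupants at 250 watts each; the 60-watt incandescent bulb is an assumed figure for illustration, not from NREL.

    # Back-of-the-envelope check of the RSF power budget described above.
    # Figures from the article: roughly 800 occupants at 250 watts each.
    # The 60-watt incandescent bulb is an assumed figure for illustration.

    OCCUPANTS = 800
    WATTS_PER_PERSON = 250
    BULB_WATTS = 60  # assumed typical incandescent bulb

    total_watts = OCCUPANTS * WATTS_PER_PERSON
    bulbs_per_person = WATTS_PER_PERSON / BULB_WATTS

    print(f"Whole-building budget: {total_watts / 1000:.0f} kW")           # 200 kW
    print(f"Light-bulb equivalent: ~{bulbs_per_person:.1f} bulbs/person")  # ~4.2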

The engineers and scientists from NREL's Building Technology Program set the energy criteria and the energy design strategies that are making it possible for the RSF to use no more carbon-based energy than is produced by renewables.

Even a couple of years ago, there would have been no hope of per-capita energy use that low in a modern office building. The electrical needs of desktop computers, servers, scanners, printers and more would have mushroomed above those numbers.

Still, Robben is confident the RSF will meet those goals, with a big boost from smarter use of information technology.

Smart Cooling, Virtualized Servers Key to Energy Savings

Plans to get the RSF there employ an intelligent cooling system, natural lighting, virtualized servers and common-sense measures to conserve, switch off and think twice.

"We're wasting a lot of energy going out the back of the desktops as heat," he said. "Desktop computers are not supposed to be space heaters."

So, employees will be encouraged to continue the switch to laptops, which average only about 35 watts, roughly a third the average wattage of desktops. And laptops are getting so powerful that they exceed the needs of most users; newer mobile processors can ramp up their processing speeds as more number crunching is required.
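
As a rough sketch of what that switch buys, consider the following back-of-the-envelope calculation. The 35-watt laptop figure is from the article, and the roughly 105-watt desktop figure follows from the "about a third" comparison; the usage hours and electricity price are assumptions for illustration.

    # Rough per-seat savings from the laptop switch described above.
    # The 35-watt laptop figure is from the article; the ~105-watt desktop
    # figure follows from "about a third the average wattage of desktops."
    # Usage hours and electricity price are assumptions for illustration.

    LAPTOP_W, DESKTOP_W = 35, 105
    HOURS_PER_YEAR = 2000   # assumed: ~8 hours/day, 250 workdays
    PRICE_PER_KWH = 0.10    # assumed utility rate, dollars per kWh

    kwh_saved = (DESKTOP_W - LAPTOP_W) * HOURS_PER_YEAR / 1000
    print(f"Per seat: {kwh_saved:.0f} kWh/yr (${kwh_saved * PRICE_PER_KWH:.0f}/yr)")
    print(f"Across 800 seats: {kwh_saved * 800 / 1000:.0f} MWh/yr")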

Only LCD monitors will be going into the RSF, and a good portion of those will use more energy-efficient LED backlights. LCDs have always been more energy efficient than cathode-ray-tube monitors, but manufacturers are stepping up with new technologies that are more efficient still. As time goes on, Robben expects other display technologies, such as organic LED (OLED) displays, to replace LCDs.

All-in-One Printer/Copier/Scanner to be the Norm

Photo of five people in construction vests and hard hats gazing toward the ceiling of the data center in NREL's Research Support Facility.

NREL IT managers and supervisors from Haselden Construction eye the spot where the racks will be positioned in the data center of the new Research Support Facility.
Credit: Pat Corkery

There will be very few, if any, local or group printers in the RSF. Scanners, copiers and fax machines will grow all but extinct. New all-in-one devices that can fax, scan, print, e-mail and copy in one unit will save huge amounts of energy, Robben said. The new protocol will be just one or two of those units per wing, compared to the current situation, in which there are some 600 printers for 1,800 employees, plus hundreds of other devices to scan, fax and copy.

The new units can print up to 50 pages per minute, print in high-resolution color on both sides of up to tabloid-sized paper, and perform high-resolution color scans that can be e-mailed.

Phone calls will be made via the Voice over Internet Protocol (VoIP) system, which uses less energy while affording more functionality. The phone system gives users the option of turning the computer into a virtual handset. "Your computer becomes your phone," Robben said.

RSF employees will be encouraged to switch to motion-detector or similar "smart" power strips that will sense when someone isn't in the office, and then switch off the devices that aren't needed — say, the label printer or task lights.

"It's scary how much power is wasted there," he said.

Cooling Down the Teraflops

The greatest challenge is achieving net-zero energy in a building that has a large data center.

Typically, servers rest in racks, chomping up and spitting out information. Cool air blows through, trying to keep the processors from frying.

To create all that cool air, data centers typically employ chillers, basically big air-conditioning units that sit outside on pads. The chillers run around the clock, pumping chilled water into the building. The cool liquid passes through a big radiator-and-fan unit, cooling the air that flows through the data center and over the servers.

The RSF will employ several strategies to dampen the data center's energy needs, including taking advantage of Colorado's climate and some ingenious engineering to minimize the hours the chillers must run.

"The building has been designed from the beginning to take advantage of our environment," Robben said.

Underneath the RSF is a concrete labyrinth, looking like a Rube Goldberg mousetrap, that stores thermal energy. The labyrinth becomes a giant "battery," storing cool air during summer nights and warm air during winter days.

Air will circulate through that concrete maze constantly. The cold high-altitude air captured at night will remain in the labyrinth during the warm hours of the following day. During the winter, the hot air from the data center will be dumped into the labyrinth to aid in heating the building.

Meanwhile, efficient evaporative chillers, running cold water over pads, will cool the servers when needed. And for the rare occasion when outside air is too hot and humid for evaporative chillers to work effectively — estimated to be fewer than 10 hours per year — a more traditional central chilled water plant will cool the data center. Right now the data center chillers operate 24 hours a day, seven days a week.
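
The decision among those three cooling modes can be sketched as a simple rule, shown below in Python. The temperature and wet-bulb thresholds are illustrative assumptions, not the RSF's actual setpoints.

    # A simplified sketch of the three-tier cooling decision described above:
    # outside air first, evaporative cooling next, and the conventional
    # chiller plant only when it's too hot and humid. The thresholds are
    # illustrative assumptions, not the RSF's actual setpoints.

    def cooling_mode(dry_bulb_f: float, wet_bulb_f: float) -> str:
        if dry_bulb_f <= 65:   # assumed: outside air alone is cool enough
            return "outside-air economizer (labyrinth)"
        if wet_bulb_f <= 55:   # assumed: dry enough for evaporative cooling
            return "evaporative chiller"
        return "central chilled-water plant"  # rare: <10 hours/yr per the article

    for dry_bulb, wet_bulb in [(50, 40), (85, 50), (95, 70)]:
        print((dry_bulb, wet_bulb), "->", cooling_mode(dry_bulb, wet_bulb))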

The Key: Running Many Virtual Servers on One Physical Server

Photo of a monolith-shaped metal structure about 14 feet high sitting on a concrete pad outside the Research Support Facility.

An air intake structure outside the west wing of the Research Support Facility — nicknamed "the football" — takes in chilly night air to cool the data center. Such air-intake structures are key to lowering energy costs in the new building.
Credit: Pat Corkery

In the RSF's data center, the individual rack-mount server will go the way of the dinosaurs.

Each rack-stacked server plugs into its own power supply and must stay switched on even when, for most hours, it is handling, say, just 5 percent of its processing capacity. A humming, albeit underused, server still piles up the wattage.

Blade servers are game-changers, though. They're smaller and more streamlined, yet still packed with great capacity. And they'll be the workhorses of the RSF.

Sixteen blade servers can fit in the space taken up by a few older servers. More important, all 16 are in a single blade chassis, sharing power supplies, cooling fans and circuit boards.

Along with blades, virtualization is the biggest energy saver at the server level. "Virtualization is taking multiple logical servers and running them on a single physical server," Robben said. "We're averaging 20 virtual servers per blade. We can run so many virtualized systems on one blade because most systems are only running at about 5 percent of their potential. With that kind of load we could potentially run 320 servers" off a single chassis fully loaded with blades.
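
The consolidation arithmetic in that quote checks out, as a quick sketch shows. The per-server wattages below are illustrative assumptions added for the power comparison.

    # The consolidation arithmetic from the quote above: 20 virtual servers
    # per blade, 16 blades per chassis. The per-server wattages are
    # illustrative assumptions added for the power comparison.

    VMS_PER_BLADE = 20
    BLADES_PER_CHASSIS = 16
    total_vms = VMS_PER_BLADE * BLADES_PER_CHASSIS
    print(f"Servers per fully loaded chassis: {total_vms}")  # 320

    RACK_SERVER_W = 300  # assumed draw of one legacy rack server
    BLADE_W = 250        # assumed per-blade draw with shared chassis overhead
    legacy_kw = total_vms * RACK_SERVER_W / 1000
    blade_kw = BLADES_PER_CHASSIS * BLADE_W / 1000
    print(f"Legacy: {legacy_kw:.0f} kW vs. one blade chassis: {blade_kw:.0f} kW")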

Any virtualized server can be moved from one physical system to another without interrupting services. That means the virtual servers on an underutilized host can be migrated elsewhere and the physical server put to sleep until it is needed again.
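
In spirit, that consolidation is a packing problem: gather the virtual servers onto as few physical hosts as will hold them, and sleep the rest. A minimal first-fit sketch, with illustrative numbers:

    def consolidate(vm_loads, host_capacity=1.0):
        """Pack VM loads (fractions of one host) onto as few hosts as possible."""
        hosts = []  # each host is a list of the VM loads running on it
        for load in sorted(vm_loads, reverse=True):
            for host in hosts:
                if sum(host) + load <= host_capacity:
                    host.append(load)  # migrate the VM onto a host that's awake
                    break
            else:
                hosts.append([load])   # no room anywhere: wake another host
        return hosts

    # 12 virtual servers, each near the ~5 percent utilization the article cites.
    awake = consolidate([0.05] * 12)
    print(f"12 VMs fit on {len(awake)} host(s); the other hosts can sleep.")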

"Technology continues to provide us with the tools we need to make IT energy efficient," Robben said. "The RSF will demonstrate how using these methods, standards and tools in an office environment can help a building reach net-zero."

Learn more about green IT, Sustainable NREL and the Research Support Facility.

Haselden Construction is building the 222,000-square-foot Research Support Facility, which was designed by RNL and is intended as a model for sustainable, high-performance building design. It will provide DOE-owned work space for administrative staff who currently occupy leased space in the nearby Denver West Office Park. Stantec Consulting served as the project's engineering consultant.

NREL is the U.S. Department of Energy's primary national laboratory for renewable energy and energy efficiency research and development. NREL is operated for DOE by The Alliance for Sustainable Energy, LLC.

— Bill Scanlon