A Decade of Greener Computing Blooms Inside NREL's Data Center

Researchers Push Boundaries To Lead Industry in Sustainable Directions

Aug. 2, 2023 | By Brooke Van Zandt


Some things are treacherous when mixed, like water and electricity. So when advanced-computing experts at the National Renewable Energy Laboratory (NREL) decided more than a decade ago to use water to cool the central processing units in the laboratory's data center, it is understandable that some people instinctively thought it was a bad idea.

"Most people balked at the idea of having water anywhere near a computer," said Aaron Andersen, NREL's group manager for advanced computing operations. He understood why the computing industry had doubts. But because NREL is the U.S. Department of Energy's (DOE's) laboratory dedicated to energy-efficiency research, NREL was uniquely positioned to challenge conventional energy-intensive mechanisms for cooling, like air conditioning.

It was a calculated risk to install NREL's first water-based cooling system inside its Energy Systems Integration Facility (ESIF) data center in 2012. NREL's collaboration with Hewlett-Packard (HP) on innovative liquid-cooled supercomputing was recognized with a 2014 R&D 100 Award and a 2014 R&D 100 Editor's Choice Award for Sustainability, which honors the best of the 100 winners. "Water is much better than air at conducting heat, and getting the liquid cooling close to the hot computer chips was essential to the high efficiency of the cooling approach," said Steve Hammond, NREL's computational science center director from 2002 to 2019.

That risk paid off, and not just in the $4.05 million operational-cost savings to date.

By not relying on air conditioning, the data center has avoided emitting 22,829 metric tons of carbon dioxide over the past 10 years. That is equivalent to the carbon dioxide emissions from burning more than 25.2 million pounds of coal.

But NREL researchers know that water is a precious resource too. Since introducing the innovative and energy-efficient thermosyphon cooling system to the ESIF data center in 2018, NREL has saved over 6.5 million gallons of water.

These efficiency achievements have kept the NREL data center's power usage effectiveness (PUE) as low as 1.028. PUE measures energy efficiency by dividing the total amount of power entering a data center by the power needed to run the IT equipment within it. The computing industry's average PUE has ranged from 1.57 to 1.67 since 2013.
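For readers who want to check the arithmetic, here is a minimal sketch of how PUE is computed from two metered quantities. The wattages are hypothetical, chosen only to land near the figures quoted above.

```python
# A minimal sketch of the PUE arithmetic. The loads below are hypothetical,
# chosen only to land near the figures quoted in the article.

def power_usage_effectiveness(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / IT equipment power; 1.0 would mean zero overhead."""
    return total_facility_kw / it_equipment_kw

# Hypothetical loads: 1,000 kW of IT equipment plus 28 kW of cooling and other
# overhead gives a PUE near NREL's reported 1.028. An industry-average facility
# with the same IT load would draw roughly 1,570-1,670 kW in total (PUE 1.57-1.67).
print(power_usage_effectiveness(total_facility_kw=1028.0, it_equipment_kw=1000.0))  # 1.028
```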

As a result of these efforts and others, the ESIF data center has been recognized with numerous awards, including the Data Center Dynamics Data Center Eco-Sustainability Award, often called the "Oscars" of the data center industry, which honors innovative and pioneering approaches to sustainable data center design. The award established NREL's place as a recognized leader in the field, and the ESIF regularly hosts visitors from across the computing industry so the laboratory can share what it has learned.

"We benefitted from a team of talented, world-class researchers matched with strong industry partners," said Ray Grout, NREL's center director for computational science. "We are always searching for opportunities to share what we've learned with others. Collaboration is one way we find computing's next big questions—and the answers."

How To Raise the Supercomputing Bar

A paradigm-shift mentality has kept NREL's researchers pushing the boundaries of computing. When DOE's Office of Energy Efficiency and Renewable Energy installed the Peregrine supercomputer at NREL in 2013, the advanced computing team had four major objectives involving the responsible stewardship of energy, water, and waste products. They also knew they needed to stay nimble as they pressed forward. "The team left room to adjust course to accommodate discoveries made along the way," Andersen said.

That mentality left the door open for achievements like thermosyphon cooling to break through. "The efficient use of water was always part of the plan," explained Andersen, "but the thermosyphon approach came after four years of operations."

Thermosyphon cooling reduces the data center's reliance on evaporative cooling, an effective technique for managing the massive amounts of heat that supercomputers produce but one that consumes large volumes of water. The thermosyphon instead uses dry sensible cooling: the return (hot) water from the data center flows through a flooded shell-and-tube evaporator filled with a refrigerant. Automatic controls vary the fan speed to regulate the refrigerant's evaporation cycle and reject much of the sensible heat to the atmosphere, further cooling the return water and reducing or eliminating the evaporation of water, depending on outside atmospheric conditions.
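The idea can be pictured with a short, illustrative sketch. The setpoint, the dry-cooler "approach" temperature, and the decision function itself are assumptions made for illustration, not NREL's or Johnson Controls' actual control logic; the point is simply that cooler outdoor air lets the system stay in dry mode and avoid evaporating water.

```python
# A minimal sketch of the dry-versus-evaporative decision described above.
# The setpoint and "approach" temperature are hypothetical, and real controllers
# modulate fan speed continuously; this only illustrates why cool outdoor air
# lets the system reject heat dry and avoid evaporating water.

def choose_cooling_mode(outdoor_air_f: float,
                        supply_setpoint_f: float = 75.0,
                        dry_approach_f: float = 10.0) -> str:
    """Return 'dry' when the thermosyphon alone can hit the supply-water setpoint."""
    achievable_supply_f = outdoor_air_f + dry_approach_f  # assumed dry-cooler approach temperature
    if achievable_supply_f <= supply_setpoint_f:
        return "dry"          # fans plus the passive refrigerant loop; no water evaporated
    return "evaporative"      # fall back to the water-consuming cooling towers

print(choose_cooling_mode(outdoor_air_f=55.0))  # dry
print(choose_cooling_mode(outdoor_air_f=90.0))  # evaporative
```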

NREL, with partners Johnson Controls and Sandia National Laboratories, initially deployed this innovative thermosyphon cooler as a test bed on the roof of NREL's ESIF. A key ingredient to its success was the addition of the Johnson Controls BlueStream Hybrid Cooling System—an advanced dry cooler that uses refrigerant in a passive cycle to dissipate heat.

In its first two years of operation, the thermosyphon saved the equivalent of more than three Olympic-size pools of water. This accomplishment earned NREL and its partners a 2018 Federal Energy and Water Management Award and the Data Center Dynamics 2018 Eco-Sustainability Award.

Efficient use of energy and water made up the first two objectives, which NREL researchers tackled simultaneously. A typical data center needs to dedicate 70% of its energy consumption to keeping its equipment cool. Within 10 years, NREL whittled that percentage down to just 3%, not only for its data center but for the entire ESIF, a Leadership in Energy and Environmental Design (LEED) Platinum-certified facility where 14 labs, a data visualization center, a control room, and a supercomputer occupy 182,500 square feet.

"The third objective focused on recycling or reusing waste products," Andersen said, "and the ESIF's ability to heat the NREL campus with its waste heat is an illustration of that."

NREL collaborated with HP to develop the HP Apollo 8000 System. This innovative system uses component-level warm-water cooling to dissipate heat generated by the supercomputer, eliminating the need for expensive and inefficient chillers in the data center. Water circulates through heat exchangers in the high-performance computing (HPC) systems to efficiently capture waste heat. The HPC systems heat the water to around 100°F, and that water then serves as a heating source for laboratory and office spaces.
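A rough, back-of-the-envelope sketch shows why that captured heat is worth reusing. The flow rate and temperature rise below are assumptions made for illustration; only the roughly 100°F supply temperature comes from the description above.

```python
# A back-of-the-envelope sketch of the heat carried by a warm-water cooling loop,
# using Q = m_dot * c_p * delta_T. The flow rate and temperature rise are
# hypothetical; only the roughly 100°F supply temperature comes from the text.

WATER_LB_PER_GALLON = 8.34     # approximate weight of a gallon of water
WATER_SPECIFIC_HEAT = 1.0      # Btu to raise 1 lb of water by 1°F

def recovered_heat_btu_per_hr(flow_gpm: float, delta_t_f: float) -> float:
    """Heat picked up by the cooling loop, in Btu per hour."""
    lb_per_hr = flow_gpm * 60.0 * WATER_LB_PER_GALLON
    return lb_per_hr * WATER_SPECIFIC_HEAT * delta_t_f

# Hypothetical loop: 500 gallons per minute warmed 20°F by the racks (for example,
# ~80°F supply returning near 100°F) carries about 5 million Btu/hr, roughly 1.5 MW,
# that can offset campus heating instead of being rejected outdoors.
print(f"{recovered_heat_btu_per_hr(flow_gpm=500.0, delta_t_f=20.0):,.0f} Btu/hr")  # ~5,004,000
```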

R&D Magazine recognized NREL and HP's achievement with a 2014 Laboratory of the Year award.

But researchers will not rest on yesterday's advanced computing achievements; they are using them as stepping stones on the path to their next big breakthrough.

Prepared for Future Partnerships

Andersen credits NREL's success to the unwavering dedication of his team members, who are "empowered by NREL's mission, which allows us to take the bold steps forward that many organizations with large data centers can't take based on the risk." Experts from across disciplines are drawn to NREL's living laboratory ecosystem, where taking risks is an important part of scientific discovery.

NREL's pursuit of more energy-efficient computing is not over. In March 2022, NREL partnered with the Joint Institute for Strategic Energy Analysis to launch its Green Computing Catalyzer to explore algorithmic energy consumption improvements and measurements, data center efficiency, and the waste cycle of computers and materials.

Advancing the science of computing is complemented by NREL's dedication to providing world-class HPC capabilities to its researchers and partner organizations. Advanced-computing researchers are searching for ways to reduce the energy consumption of powerful research applications, like DOE's ExaWind code used to create wind turbine models and simulations, or the Vienna Ab Initio Simulation Package (VASP) code used for atomic-scale materials modeling. As NREL continues to lead on the frontiers of artificial intelligence and machine learning, more efficient algorithms are needed to optimize GPU usage and ultimately yield faster time to solution.

As NREL's digital-twin capability grows, it offers partners the chance to simulate and evaluate tweaks to a data center configuration before investing millions of dollars in upgrades. (A typical data center upgrade investment can cost $70 to $80 million.) In support of the Advanced Research Projects Agency-Energy's COOLERCHIPS program, NREL is creating a digital twin to develop testing protocols that evaluate energy-efficient cooling technologies for data center operations.

The thermal-energy potential of the NREL data center is now being explored as a way to provide residential hot-water heating at higher thermal temperatures. The Kestrel supercomputer, built by Hewlett Packard Enterprise (HPE), is now installed in the ESIF data center. It will amplify NREL's data center advancements and support NREL's journey to net-zero-lab status.

NREL is where bold ideas and collaborative partnerships bring the paradigm-shifting changes our future demands. To learn how NREL's advanced computing capabilities can support your mission, contact Steve Gorin.

Tags: Computational Science, Partnerships