High Performance Computing and Cutting-Edge Analysis Can Open New Scientific Realms

March 1, 2018

Two people looking at a 3D interactive graphical data display in the Visualization Center at the Energy Systems Integration Facility (ESIF).

When I started my career in biology more than three decades ago, the tools for research were well-established: test tubes and flasks were part of the lab, and a new generation of cheap chips heralded the era of low-cost supercomputers. And while those flasks are still about the same, computing has evolved to the point where high performance computing (HPC) has opened vast new areas of knowledge.

That is why we at the National Renewable Energy Laboratory (NREL) recently created the Scientific Computing and Energy Analysis (SCEA) directorate, pairing our expertise in HPC with our deep knowledge of energy analysis. This move allows us to combine our supercomputing and energy analysis capabilities in a crosscutting way across the laboratory. Now, computing informs analysis, analysis informs computing, and both do so across all the science that we do. The initiative opens whole new vistas of research that can provide actionable solutions. The timing couldn’t be better, because our global energy challenges are complex. We can’t expect simple answers, and we don’t have the luxury of waiting.

People, cities, nations, and indeed the whole world need reliable advice, and they need it now. If a city or metropolitan district is trying to figure out the best ways to adopt advanced energy technologies, including renewable energy, it needs research at a higher level than ever. Instead of just looking at one or two scenarios, we at NREL can run thousands of simulations and apply advanced analytical techniques. This process can determine which solutions best fit a client’s needs and give decision-makers the best possible advice.

And this HPC-analysis trend will continue. Merging the science done with exascale computing (computing systems capable of a billion billion calculations per second) and advanced analytics on real-world data creates enormous opportunities for our mission work. We can hit our research targets faster and harder.

What does this look like? We’re already using our current capabilities to visualize complex, 3D images of the wakes from multiple wind turbines so that we can better understand the dynamics in a wind farm. But we want to go further. We are currently collaborating with other labs on an Office of Science project to use exascale computing to run much more detailed and accurate simulations of wind farms. What if we could combine that capability with data sets taken from sensors in the field, along with automated learning techniques, to optimize the energy output of a wind farm in real time under variable conditions? We could then potentially feed this data into an immersive center for decision makers, allowing those facing challenges to understand such complicated issues visually. It’s a long way from a test tube and beaker.
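As a rough illustration of that closed loop, here is a minimal Python sketch. It assumes a toy Jensen-style wake model, a simple linear wake-deflection rule, and a random number standing in for a live wind sensor; every constant, function name, and modeling choice here is an illustrative stand-in, not the high-fidelity exascale simulations, field data, or NREL tools described above.

import math
import random

# Toy wake-steering loop: read a (simulated) wind sensor, then search for
# turbine yaw offsets that raise total farm power under a simplified wake
# model. All constants and the deflection rule are illustrative assumptions.

TURBINE_SPACING = 5.0  # downstream spacing in rotor diameters (assumed)
WAKE_DECAY = 0.05      # Jensen-style wake expansion constant (assumed)

def wake_deficit(upstream_yaw_deg: float, distance: float) -> float:
    """Fractional wind-speed deficit at the next turbine downstream.

    Simplified Jensen-style decay; yawing the upstream rotor is assumed to
    deflect its wake linearly, reducing the deficit it imposes downstream.
    """
    base = (1.0 / (1.0 + 2.0 * WAKE_DECAY * distance)) ** 2
    deflection = max(0.0, 1.0 - abs(upstream_yaw_deg) / 30.0)
    return base * deflection

def farm_power(yaws: list[float], wind_speed: float) -> float:
    """Total power (arbitrary units) for a single row of turbines."""
    total, speed = 0.0, wind_speed
    for yaw in yaws:
        # Cosine loss: a yawed rotor captures less of the oncoming wind.
        total += (speed * math.cos(math.radians(yaw))) ** 3
        # Only the nearest upstream wake is modeled, for simplicity.
        speed = wind_speed * (1.0 - wake_deficit(yaw, TURBINE_SPACING))
    return total

def optimize_yaws(n_turbines: int, wind_speed: float) -> list[float]:
    """Greedy front-to-back search over a coarse grid of yaw offsets."""
    yaws = [0.0] * n_turbines
    for i in range(n_turbines):
        yaws[i] = max(
            (float(y) for y in range(-25, 26, 5)),
            key=lambda y: farm_power(yaws[:i] + [y] + yaws[i + 1:], wind_speed),
        )
    return yaws

if __name__ == "__main__":
    wind = 8.0 + random.uniform(-1.0, 1.0)  # stand-in for a live sensor feed
    yaws = optimize_yaws(n_turbines=4, wind_speed=wind)
    print(f"wind {wind:.1f} m/s -> yaw offsets {yaws}")
    print(f"steered power  {farm_power(yaws, wind):.1f}")
    print(f"baseline power {farm_power([0.0] * 4, wind):.1f}")

Even this toy version shows the shape of the problem: steering upstream rotors slightly off the wind sacrifices a little of their output to recover more downstream, and the search must be rerun whenever conditions change. At exascale, with validated physics and live field data, that same loop could run continuously across an entire farm.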

Think about how this capability might also be applied to our expanding options for autonomous vehicles. With all the sensing, computational, analytical, and visualization capability used to help orchestrate the flow of millions of such vehicles, we will have at our disposal a means to guide this dawning autonomous transportation era by giving decision makers and stakeholders a clarity of comprehension they simply don’t have now.

We’re looking ahead at NREL, energized by this new realm of insight derived from HPC and analysis. By pairing the new tools, the best in computing and the deepest analysis, we at the lab will be shining a spotlight farther down the road than we ever imagined possible. Because of that, we will be able to find collective solutions to our most pressing energy challenges.

—Written by Martin Keller, Director of NREL

Tags: Energy Analysis, Energy Systems Integration