
NREL is using the RedMesa supercomputer at Sandia National Laboratories to perform its complex computations, including calculations of the physical properties of thousands of potential solar-cell materials. NREL will have its own supercomputer when the Energy Systems Integration Facility, now under construction, is completed late this year.
Photo provided by Sandia National Laboratories

Reinventing Materials Science

It's not often that scientists set out to reinvent an entire field of study, but that is exactly what the Center for Inverse Design is pursuing in the field of materials science. The vision of the center is to revolutionize the discovery of functional materials by developing an "inverse design" approach, powered by theory that guides experiment.

The Center for Inverse Design was established as an Energy Frontier Research Center, funded by the U.S. Department of Energy's Office of Basic Energy Sciences. The National Renewable Energy Laboratory (NREL) is the lead organization in partnership with Michael Toney of the Stanford Linear Accelerator Center; professors Arthur Freeman, Thomas Mason, and Kenneth Poeppelmeier of Northwestern University; professors Douglas Keszler and John Wager of Oregon State University; and Alex Zunger, formerly of NREL and now at the University of Colorado at Boulder.

Historically, the development of new materials for technology has been largely based on trial-and-error searches or even accidental discoveries. The Center for Inverse Design is founded on the key principle that modern theory, combined with high-throughput and targeted experimentation, can directly address the challenge of designing novel materials and nanostructures. This principle reverses the conventional paradigm: given the desired property of a material, find the structure that provides it. Theory can then guide experiment, rather than simply describe its results.

This Holy Grail of materials science is becoming a reality due to the availability of ultra-high-speed computing, the creation of high-throughput experimental tools and iterative techniques, and the development by Zunger and his colleagues of "inverse band structure"—a method of deriving crystal structures from predetermined electronic and optical properties—as well as other theoretical approaches. This powerful combination had not existed before.
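
To make the reversal concrete, here is a minimal toy sketch of an inverse search loop in Python: start from a target property (a bandgap, say) and keep only the candidate structures that deliver it. The `predicted_bandgap` stand-in and all of its numbers are illustrative assumptions; the center's actual methods evaluate candidates with first-principles electronic-structure calculations.

```python
# Toy sketch of inverse design: given a target property, search candidate
# structures for ones that produce it, rather than computing the properties
# of a structure chosen in advance.
import random

def predicted_bandgap(structure):
    """Stand-in for an expensive first-principles calculation (e.g., DFT)."""
    return sum(structure) / len(structure)  # placeholder physics, not real

def inverse_search(candidates, target_ev, tolerance=0.05):
    """Return (structure, bandgap) pairs whose bandgap lies near the target."""
    hits = []
    for structure in candidates:
        gap = predicted_bandgap(structure)
        if abs(gap - target_ev) <= tolerance:
            hits.append((structure, gap))
    return hits

# Screen 10,000 random "structures" (here just parameter vectors) for a
# target bandgap of 1.5 eV, a typical value for solar absorbers.
pool = [[random.uniform(0.5, 3.0) for _ in range(4)] for _ in range(10_000)]
print(len(inverse_search(pool, target_ev=1.5)), "candidates match the target")
```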

"The Center for Inverse Design represents a new way of approaching science," says William Tumas, the center's director. "Our approach to inverse design can be pivotal by coupling theory and experiment. This can significantly accelerate the rate of basic science discovery."

The Key to High-Throughput Computation: Supercomputers and Algorithms

NREL Research Fellow David Ginley (center) and NREL post-doc Paul Ndione operate NREL's new pulsed-laser deposition system, which allows them to deposit multiple gradated films on top of one another. This process can create thousands of different compounds on a single 2-inch-by-2-inch substrate.
Photo by Dennis Schroeder, NREL

"Historically, if you told other material scientists that you were going to do inverse design, they would look at you and laugh," says NREL Research Fellow David Ginley, the center's chief scientist for experiment. "That's literally true, because there wasn't enough computational power—doing one material was hard, doing 100,000 or a million was just ridiculous. You couldn't think about doing that much calculation in a reasonable way."

Supercomputers have allowed the center to succeed in its new approach, but so have a number of algorithms that allow the researchers to perform those calculations more efficiently. For the center's initial focus on semiconductor materials used in solar cells, many of those algorithms were developed at NREL, drawing on its 30-year history of developing solar cells and learning how solar cells and other semiconductor-based devices function. By combining supercomputers and efficient algorithms, the center has opened the door to a new materials science.

"In the old days, if you wanted somebody to calculate the properties of a cluster of atoms, they'd come back and say, 'We can do ten atoms,'" says Ginley. "Well, ten atoms do not a material make, and so calculating bulk materials properties was tough. But by going to new ways of doing the calculations, we now can do hundreds to thousands of atoms, and those are much more representative of a real material."

Stephan Lany leads the NREL theory component of the center. He and his colleagues have developed and applied methods to predict material properties from first principles, using calculations of electronic structure. They have implemented approaches to predict, with a high degree of reliability, experimentally relevant properties of semiconductor materials, such as crystal structure, doping behavior, and carrier concentrations, as well as bandgaps and absorption spectra. This predictive capability is essential for designing materials with target properties and functionalities.

"By coupling these tools to today's high-performance computing resources, we can quickly evaluate a large number of potential materials in a way that we've never done before," says Lany. "This enables inverse design to work and will lead to the realization of new and optimized functional materials through an iterative approach that draws successively on theory, experimental synthesis, and characterization."

High-Throughput Synthesis: Like Spraying Paint

To make a broad spectrum of potential materials, the Center for Inverse Design uses combinatorial methods to build "composition-spread libraries" of materials. To do this, researchers create 2-inch-by-2-inch squares that gradually change in composition from top to bottom and from side to side. As a result, each point on the square represents a unique material composition. That may sound difficult, but in practice, it's not.

"It's really easy," says Ginley. "It's like we have a yellow, a blue, and a red spray-paint can, and we spray overlapping color circles onto the substrate. Where the circles overlap, we get every combination."

In reality, instead of using spray paint, a process called sputtering deposits the materials on the substrate. In sputtering, energetic particles free the atoms from a piece of source material, and then these freed atoms pass through a nozzle—a "sputter gun"—that sends them hurtling onto the substrate. During sputtering, "basically, each sputter gun acts like a spray source for a particular atom or compound," says Ginley, "and when you co-deposit these materials, you get intimate mixing, and so you get everything in the range of materials that you're looking for."
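
The spray-paint picture translates directly into a simple model of the library. The sketch below assumes three sources aimed at different spots on a 2-inch-by-2-inch substrate, each with a Gaussian flux falloff; the aim points and falloff width are hypothetical, but the idea is the same: normalize the overlapping fluxes to get a distinct composition at every point.

```python
# Sketch of a composition-spread library: three deposition sources (the
# "spray-paint cans") aimed at different spots, each contributing a flux
# that falls off with distance; the local composition is the normalized mix.
import numpy as np

# Hypothetical aim points (in inches) for sources A, B, and C on a
# 2-inch-by-2-inch substrate; the real gun geometry differs.
SOURCES = {"A": (0.3, 0.3), "B": (1.7, 0.3), "C": (1.0, 1.7)}

def composition(x, y, width=0.9):
    """Fraction of each source material deposited at point (x, y)."""
    flux = {name: np.exp(-((x - sx) ** 2 + (y - sy) ** 2) / (2 * width ** 2))
            for name, (sx, sy) in SOURCES.items()}
    total = sum(flux.values())
    return {name: f / total for name, f in flux.items()}

# Every point on the square is a distinct composition:
print(composition(0.5, 0.5))  # close to source A, so A-rich
print(composition(1.0, 1.0))  # a more even three-way mix near the center
```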

Researchers can also deposit the materials using a process called pulsed-laser deposition, which uses pulses of laser light to vaporize the surface of a source material. Some materials can even be dissolved in solvents or suspended in solutions and deposited using ink-jet printers.

To explore an even wider range of materials, researchers can also change the conditions under which the film is deposited, perhaps by changing the deposition temperature across the width of the square, or by varying the partial pressure of oxygen in the deposition chamber across the height of the square.

High-Throughput Analysis and the Data Crunch

Once researchers have created the 2-inch-by-2-inch compositional library, there are several analytical techniques they can use to measure its properties. Because each point on the square represents a different material composition, taking an array of measurements across its width and breadth yields a wealth of data.

For example, researchers might measure the library's electrical conductivity, light absorption, light reflection, and its work function, which is the minimum energy needed to free an electron from its surface. Considering that each point in the library could have a couple of process variables and four or five physical property measurements associated with it, that's a lot of variables to consider.

If you wanted to create a plot of how all those variables relate to each other, the plot could easily have seven or more dimensions. That is why finding the sweet spot for a material with the desired properties turns out to be one of the center's greatest challenges.
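
As a simplified picture of that data-mining problem, the sketch below treats the measured library as a stack of property grids and locates a sweet spot with a single figure of merit. The placeholder data, units, and merit function are illustrative assumptions; a real analysis would weigh many more variables.

```python
# Sketch of mining a measured library: one 2-D grid per property, combined
# into a figure of merit whose maximum marks the "sweet spot."
import numpy as np

rng = np.random.default_rng(0)
n = 50  # a 50 x 50 measurement grid over the 2-inch square

# One 2-D array per measured property (placeholder data, not real results).
conductivity = rng.lognormal(mean=0.0, sigma=1.0, size=(n, n))  # S/cm
absorption   = rng.uniform(0.2, 0.95, size=(n, n))              # fraction
work_fn      = rng.uniform(4.0, 5.5, size=(n, n))               # eV

# An invented figure of merit: favor high conductivity and absorption,
# with the work function near a hypothetical 4.8 eV target.
merit = np.log(conductivity) + absorption - np.abs(work_fn - 4.8)

i, j = np.unravel_index(np.argmax(merit), merit.shape)
print(f"sweet spot at grid point ({i}, {j}): "
      f"{conductivity[i, j]:.2f} S/cm, {absorption[i, j]:.0%} absorbed")
```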

"It's like doing this huge array of material science in one fell swoop," says Ginley. "Of course, you generate terabits of data, and the real question is: how do you take these huge data files and mine them appropriately to get the real information out? And that's still an evolving area.

"People are okay thinking about one, two, or three dimensions, but when you're up to 10, not a chance! So how you mine the data and how you present it in a way that you can make logical conclusions from it is non-trivial."

Zeroing in on an ideal material is also an iterative process, so there may be several material libraries loaded with data for each investigation. For instance, if an initial material library examines an oxide with a cobalt content ranging from zero to 100%, and finds a material sweet spot around 25% cobalt, a second material library might examine a cobalt content running from 20% to 30%. A third might zero in on a tighter range, such as 22%–23%. Each step is like zooming in with a microscope on a particular area of interest.
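
A minimal sketch of that zoom-in loop, with a placeholder merit function standing in for a full deposit-and-characterize cycle (the peak near 22.5% cobalt is invented for illustration):

```python
# Sketch of the iterative zoom: sample a composition range, find the best
# point, then narrow the range around it for the next library.
def measure_merit(cobalt_fraction):
    """Placeholder: real merit comes from depositing and measuring a library."""
    return -(cobalt_fraction - 0.225) ** 2  # invented peak near 22.5% cobalt

def zoom_search(lo=0.0, hi=1.0, points=11, rounds=3):
    for r in range(rounds):
        grid = [lo + (hi - lo) * k / (points - 1) for k in range(points)]
        best = max(grid, key=measure_merit)
        span = (hi - lo) / 4  # tighten the window around the best point
        lo, hi = max(0.0, best - span), min(1.0, best + span)
        print(f"round {r + 1}: best near {best:.1%} cobalt, "
              f"next range {lo:.1%} to {hi:.1%}")
    return best

zoom_search()
```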

Once the center's researchers find an apparently "ideal" composition, they revert to more traditional materials science approaches: synthesizing that ideal composition, characterizing the material, and applying theoretical analysis to fine-tune its properties.

Proving the Model of Inverse Design

The center's work on spinels has already proved that its approach can work: by first using theory to derive a set of "design rules" for p-type spinels, researchers applied those rules to achieve a 10,000-fold increase in conductivity in one potential material.

"The approach and tools that we've developed through the extensive capabilities within the Center for Inverse Design are starting to show the value of the inverse design approach for materials discovery," says Tumas.

Despite the data challenges, the center has already accomplished far more than would have been possible just a few years ago.

"We can cover in a week what used to take a researcher's entire post-doctoral career, literally," says Ginley. "Being able to do material discovery as fast as the theorists can do the theory creates the ability to really carry out inverse design."

And although the center is currently focused on improving solar cell materials, its work could potentially establish inverse design as a new, proven approach to materials science in general—an approach that could be applied to all types of materials challenges.

"We are focusing on synthesizing predicted materials for solar energy conversion and also looking into other desired properties of materials," said Tumas. "We still have a ways to go to make inverse design truly a reality that will be adopted by the materials science community. Larry Kazmerski, our program integrator, is leading our efforts to reach out to that community. So the solar cell materials—while important and significant in their own right—are also the means to an end of actually validating that inverse design works."

Related Links

Center for Inverse Design

Photovoltaics: New Materials, Devices, & Processes

Deliberate Science

Winter 2012 / Issue 2

Editorial Team

  • Kim Adams | Managing Editor
  • Bill Gillies | Creative Director
  • Dennis Schroeder | Photographer
  • Jennifer Josey | Editor
  • Michael Oakley | Web Development
  • Amy Glickson | Web Development