An NREL-led Research Center Aims to Reinvent Material Science



NREL Research Fellow David Ginley (center) and NREL post-doc Paul Ndione operate NREL's new pulsed-laser deposition system, which allows them to deposit multiple gradated films on top of one another. This process can create thousands of different compounds on a single 2-inch-by-2-inch substrate.
Photo by Dennis Schroeder

It's not often that scientists set out to reinvent an entire field of study, but that is exactly what the Center for Inverse Design is pursuing in materials science. The vision of the center is to revolutionize the discovery of functional materials by developing an "inverse design" approach, powered by theory that guides experiment.

The Center for Inverse Design was established as an Energy Frontier Research Center, funded by the U.S. Department of Energy's Office of Basic Energy Sciences. The National Renewable Energy Laboratory (NREL) is the lead organization in partnership with Michael Toney of the Stanford Linear Accelerator Center; professors Arthur Freeman, Thomas Mason, and Kenneth Poeppelmeier of Northwestern University; professors Douglas Keszler and John Wager of Oregon State University; and Alex Zunger, formerly of NREL and now at the University of Colorado at Boulder.

Historically, the development of new materials for technology has been largely based on trial-and-error searches or even accidental discovery. The Center for Inverse Design is founded on the key principle that modern theory, combined with high-throughput and targeted experimentation, can directly address the challenge of creating novel materials and nanostructures by design. This principle takes the conventional paradigm and reverses it: given the desired property of a material, find the structure that produces it. Theory can then guide experiment, rather than simply describe its results.

This Holy Grail of materials science is becoming reality due to the availability of ultra-high-speed computing, the creation of high-throughput experimental tools and iterative techniques, and the development by Zunger and his colleagues of "inverse band structure"—a method of deriving crystal structures based on pre-determined electronic and optical properties—as well as other theoretical approaches. This powerful combination had not existed before.
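To picture the reversed paradigm, the sketch below shows the general shape of an inverse-design search in a few lines of Python. The candidate structures, the bandgap values, and the helper functions are made-up placeholders standing in for the center's actual theoretical tools; only the overall pattern (start from a target property, then search for structures that deliver it) reflects the approach described here.

```python
# A minimal, illustrative sketch of the inverse-design pattern: specify a
# target property first, then search candidate structures for ones that
# deliver it. All names and numbers here are hypothetical placeholders,
# not the center's actual codes or data.

def generate_candidate_structures():
    """Return a small, made-up pool of candidate structures."""
    return [
        {"name": "candidate_A", "composition": {"Co": 2, "Zn": 1, "O": 4}},
        {"name": "candidate_B", "composition": {"Co": 2, "Ni": 1, "O": 4}},
    ]

def predict_bandgap_ev(structure):
    """Placeholder for a first-principles property calculation."""
    fake_results = {"candidate_A": 2.3, "candidate_B": 1.9}
    return fake_results[structure["name"]]

def inverse_design(target_gap_ev, tolerance_ev=0.2):
    """Return candidates whose predicted bandgap lies near the target."""
    matches = []
    for structure in generate_candidate_structures():
        gap = predict_bandgap_ev(structure)
        if abs(gap - target_gap_ev) <= tolerance_ev:
            matches.append((structure["name"], gap))
    return matches

print(inverse_design(target_gap_ev=2.2))  # -> [('candidate_A', 2.3)]
```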

"The Center for Inverse Design represents a new way of approaching science," says William Tumas, the center's director. "Our approach to inverse design can be pivotal by coupling theory and experiment. This can significantly accelerate the rate of basic science discovery."

The Key to High-Throughput Computation: Supercomputers and Algorithms


NREL is using the RedMesa supercomputer at Sandia National Laboratories to perform its complex computations, including calculations of the physical properties of thousands of potential solar-cell materials. NREL will have its own supercomputer when the Energy Systems Integration Facility, now under construction, is completed late this year.
Photo from Sandia National Laboratories.

"Historically, if you told other material scientists that you were going to do inverse design, they would look at you and laugh," says NREL Research Fellow David Ginley, the center's chief scientist for experiment. "That's literally true, because there wasn't enough computational power—doing one material was hard, doing 100,000 or a million was just ridiculous. You couldn't think about doing that much calculation in a reasonable way."

Supercomputers have allowed the center to succeed in its new approach, but so have a number of algorithms that allow the researchers to perform those calculations more efficiently. For the center's initial focus on semiconductor materials used in solar cells, many of those algorithms were developed at NREL, drawing on the laboratory's 30-year history of building solar cells and learning how solar cells and other semiconductor-based devices function. By combining supercomputers and efficient algorithms, the center has opened the door to a new kind of materials science.

"In the old days, if you wanted somebody to calculate the properties of a cluster of atoms, they'd come back and say, 'We can do ten atoms,'" says Ginley. "Well, ten atoms do not a material make, and so calculating bulk materials properties was tough. But by going to new ways of doing the calculations, we now can do hundreds to thousands of atoms, and those are much more representative of a real material."

Stephan Lany leads the NREL theory component of the center. He and his colleagues have developed and applied methods to predict material properties based on first principles, using calculations of electronic structures. They have implemented approaches to predict, with a high degree of reliability, experimentally relevant properties of semiconductor materials, such as the structure, doping, and electron-carrier concentrations, as well as bandgaps or absorption spectra. This predictive capability is essential to design materials with target properties and functionalities.

"By coupling these tools to today's high-performance computing resources, we can quickly evaluate a large number of potential materials in a way that we've never done before," says Lany. "This enables inverse design to work and will lead to the realization of new and optimized functional materials through an iterative approach that draws successively on theory, experimental synthesis, and characterization."

High-Throughput Synthesis: Like Spraying Paint

An example of high-throughput materials science is shown in this ternary plot of electrical conductivity (in siemens per centimeter) for one thousand oxides containing zinc, nickel, and cobalt, all deposited on a single 2-inch-by-2-inch substrate as overlapping, gradated films of the three metallic elements. The logarithm of the conductivity ranges from about -2 near the zinc corner to about +2 near the composition Co2NiO4, which was found to be the most conductive sample.
Illustration by John Perkins, NREL

To make a broad spectrum of potential materials, the Center for Inverse Design uses combinatorial methods to build "composition-spread libraries" of materials. To do this, researchers create 2-inch-by-2-inch squares whose composition changes gradually from top to bottom and from side to side, so that each point on the square represents a unique material composition. That may sound difficult, but in practice, it's not.

"It's really easy," says Ginley. "It's like we have a yellow, a blue, and a red spray-paint can, and we spray overlapping color circles onto the substrate. Where the circles overlap, we get every combination."

In reality, instead of using spray paint, a process called sputtering deposits the materials on the substrate. In sputtering, energetic particles free the atoms from a piece of source material, and then these freed atoms pass through a nozzle—a "sputter gun"—that sends them hurtling onto the substrate. During sputtering, "basically, each sputter gun acts like a spray source for a particular atom or compound," says Ginley, "and when you co-deposit these materials, you get intimate mixing, and so you get everything in the range of materials that you're looking for."

Researchers can also deposit the materials using a process called pulsed-laser deposition, which uses pulses of laser light to vaporize the surface of a source material. Some materials can even be dissolved in solvents or suspended in solutions and deposited using ink-jet printers.

To explore an even wider range of materials, researchers can also change the conditions under which the film is deposited, perhaps by changing the deposition temperature across the width of the square, or by varying the partial pressure of oxygen in the deposition chamber across the height of the square.
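A rough way to picture the resulting library is as a mapping from position on the substrate to a local mixture of the co-deposited elements. The sketch below assumes three sources aimed at different parts of a 2-inch-by-2-inch substrate and a simple Gaussian flux profile for each; the geometry and numbers are illustrative assumptions, not the actual deposition physics.

```python
# Illustrative composition-spread library: three co-deposited sources, each
# centered near a different edge of the 2" x 2" substrate, produce a unique
# mixture at every point. Source positions and the Gaussian flux profile are
# simplifying assumptions for illustration only.
import math

SOURCE_POSITIONS = {"Co": (0.0, 0.0), "Ni": (2.0, 0.0), "Zn": (1.0, 2.0)}  # inches

def local_composition(x, y, spread=1.5):
    """Return the cation fractions at substrate position (x, y)."""
    flux = {
        element: math.exp(-((x - sx) ** 2 + (y - sy) ** 2) / (2 * spread ** 2))
        for element, (sx, sy) in SOURCE_POSITIONS.items()
    }
    total = sum(flux.values())
    return {element: value / total for element, value in flux.items()}

# Every point on the square is a different composition in the Co-Ni-Zn system.
print(local_composition(0.5, 0.5))  # cobalt-rich region near the Co source
print(local_composition(1.0, 1.0))  # more balanced mixture near the center
```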

High-Throughput Analysis and the Data Crunch

Partnering to Solve Solar Cell Material Challenges

The Stanford Linear Accelerator Center (SLAC), one of the partners in the Center for Inverse Design, is providing the center with access to its Stanford Synchrotron Radiation Lightsource, which produces extremely bright X-rays that can be used to study materials on the atomic scale.


The NREL-led Center for Inverse Design is employing the Stanford Synchrotron Radiation Lightsource (SSRL) at SLAC to study solar cell materials on the atomic scale. The SSRL accelerates electrons to near the speed of light in an oval path defined by a ring of electromagnets, called a synchrotron. The electrons emit extremely bright X-rays as they follow this path, and this radiation is used to study matter.
Photo from Brad Plummer, SLAC National Accelerator Laboratory

"SLAC has capabilities for measuring atomic positions and which elements are in particular positions that we can't do elsewhere—that you basically can't do anywhere that doesn't have a big synchrotron-based X-ray source," says John Perkins, a senior NREL researcher for the center. "The interactions with SLAC are critical to closing the loop, allowing us to determine if the atomic structures are causing the resultant material properties in the way that we expect them to."

One example is the center's work on spinels, a class of oxides with the formula A2BO4, in which A and B are positively charged atoms. Because spinels can be transparent and can conduct electricity, they are used to form the top layers of solar cells. This top layer conducts electricity out of the cell while allowing light to pass through to the heart of the solar cell. The center is currently investigating "p-type" spinels, which are better at conducting the positively charged "holes" left behind by missing electrons than at conducting the electrons themselves. This capability allows the spinels to serve as a positive terminal. In contrast, "n-type" materials conduct negatively charged electrons better than holes and serve better as negative terminals.

"If we take a prototypical spinel material, such as Co2ZnO4, our calculations predict that if the zinc ends up on a cobalt site in the crystal lattice, that's good—that creates free holes that lead to p-type conductivity. However, if the cobalt ends up on a zinc site, that's neutral and it doesn't make any difference in the material," says Perkins. "And it's predicted theoretically that the number of cobalt atoms on zinc sites will be far, far greater than the number of zinc atoms on cobalt sites, but because the cobalt atoms on zinc will be neutral, the material will still turn out to be a p-type conductor.

"We can determine that it's a p-type conductor at NREL, but we can't determine how many cobalt atoms are on zinc sites and how many zinc atoms are on the cobalt sites, so we can't determine whether or not the mechanism that's been predicted theoretically is actually the mechanism at play that's causing this to happen. That's why we have to go out to SLAC. That's generally true for the materials being worked on at the center: basically, the resultant material properties will depend in detail on which atoms are in which sites, and that information is the information we get out of SLAC."

According to NREL Research Fellow David Ginley, the collaboration has been even more successful than anticipated.

"It has been so beneficial that it has become addictive," says Ginley. "Originally we had a couple people going out to SLAC; now we have people going out all the time. They can examine atomic structure and site occupancy, they can examine electronic properties, and they can do it all at a level of resolution that we just can't touch at NREL.

"This is really a true collaboration. We're not just going out there and running samples, we're intimately working with the people out there, and that work is accelerating our understanding and our rate of progress."

Once researchers have created the 2-inch-by-2-inch compositional library, there are several analytical techniques they can use to measure its properties. Because each point on the square represents a different material composition, taking an array of measurements across its length and breadth yields a wealth of data.

For example, researchers might measure the library's electrical conductivity, light absorption, light reflection, and work function, which is the minimum energy needed to free an electron from the surface. With at least a couple of process variables and four or five physical property measurements associated with each point in the library, that's a lot of variables to track.

If you wanted to plot how all those variables relate to one another, the plot could easily have seven or more dimensions. This is why finding the sweet spot for a material with the desired properties turns out to be one of the center's greatest challenges.

"It's like doing this huge array of material science in one fell swoop," says Ginley. "Of course, you generate terabits of data, and the real question is: how do you take these huge data files and mine them appropriately to get the real information out? And that's still an evolving area.

"People are okay thinking about one, two, or three dimensions, but when you're up to 10, not a chance! So how you mine the data and how you present it in a way that you can make logical conclusions from it is non-trivial."

Zeroing in on an ideal material is also an iterative process, so there may be several material libraries loaded with data for each investigation. For instance, if an initial material library examines an oxide with a cobalt content ranging from zero to 100%, and finds a material sweet spot around 25% cobalt, a second material library might examine a cobalt content running from 20% to 30%. A third might zero in on a tighter range, such as 22%–23%. Each step is like zooming in with a microscope on a particular area of interest.
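The zoom-in described above can be thought of as a simple narrowing loop: measure a coarse library, find the best point, then deposit the next library over a tighter range around it. The sketch below fakes the measurement with a made-up response curve that peaks near 25% cobalt, purely to show the pattern.

```python
# Illustrative zoom-in loop: each round spans a narrower cobalt range centered
# on the best point from the previous library. The measurement function is a
# made-up response with a sweet spot near 25% cobalt.

def measure_conductivity(x_co):
    """Fake measurement: conductivity peaks at a cobalt fraction of 0.25."""
    return 100.0 - 1000.0 * (x_co - 0.25) ** 2

def zoom_in(lo=0.0, hi=1.0, points_per_library=11, rounds=3):
    """Iteratively narrow the composition window around the best sample."""
    best = lo
    for _ in range(rounds):
        step = (hi - lo) / (points_per_library - 1)
        grid = [lo + i * step for i in range(points_per_library)]
        best = max(grid, key=measure_conductivity)
        half_width = (hi - lo) / 4          # tighten the window each round
        lo, hi = max(best - half_width, 0.0), min(best + half_width, 1.0)
    return best

print(f"best cobalt fraction after zooming in: {zoom_in():.3f}")
```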

Once the center's researchers find an apparently "ideal" composition, they revert to more traditional material science approaches: synthesizing that ideal composition, characterizing the material, and applying theoretical analysis to fine-tune its material properties.

Proving the Model of Inverse Design

The center's work on spinels has already proved that its approach can work: by first using theory to derive a set of "design rules" for p-type spinels, researchers applied those rules to achieve a 10,000-fold increase in conductivity in one potential material.

"The approach and tools that we've developed through the extensive capabilities within the Center for Inverse Design are starting to show the value of the inverse design approach for materials discovery," says Tumas.

Despite the challenges that remain, the center has already accomplished far more than would have been possible just a few years ago.

"We can cover in a week what used to take a researcher's entire post-doctoral career, literally," says Ginley. "Being able to do material discovery as fast as the theorists can do the theory creates the ability to really carry out inverse design."

And although the center is currently focused on improving solar cell materials, its work could establish inverse design as a new, proven approach to materials science in general—an approach that could be applied to all types of materials challenges.

"We are focusing on synthesizing predicted materials for solar energy conversion and also looking into other desired properties of materials," said Tumas. "We still have a ways to go to make inverse design truly a reality that will be adopted by the materials science community. Larry Kazmerski, our program integrator, is leading our efforts to reach out to that community. So the solar cell materials—while important and significant in their own right—are also the means to an end of actually validating that inverse design works."

—Kevin Eber

Related Links

Center for Inverse Design

Photovoltaics: New Materials, Devices, & Processes
