“Finding new materials has traditionally been guided by intuition and trial and error,” said Turab Lookman, a physicist and materials scientist in the Physics of Condensed Matter and Complex Systems group at Los Alamos National Laboratory. “But with increasing chemical complexity, the combination possibilities become too large for trial-and-error approaches to be practical.”
Computational and theoretical materials science is playing an increasingly important role in advancing the search for novel materials and understanding the properties of existing ones. Computational research uses complex models in a variety of ways, all of which advance materials science and engineering. Modern computational hardware and software enable faculty to create “virtual laboratories,” where materials are tested and properties predicted computationally. “Problems that used to take years to solve can now be solved in a month,” says Srikanth Patala, a materials science and engineering researcher at NC State.
Researchers are also employing various machine (or statistical) learning methods to accelerate the discovery of new materials. The materials discovery process can be significantly expedited and simplified if we can learn effectively from available knowledge and data. Recently, researchers at Northwestern University used AI to figure out how to make new metal-glass hybrids 200 times faster than would have been possible through lab experiments alone.
The Toyota Research Institute (TRI) is investing $35 million in an artificial-intelligence project to help in the hunt for new advanced battery materials and fuel cell catalysts. TRI Chief Science Officer Eric Krotkov said: “Toyota recognizes that artificial intelligence is a vital basic technology that can be leveraged across a range of industries, and we are proud to use it to expand the boundaries of materials science. Accelerating the pace of materials discovery will help lay the groundwork for the future of clean energy and bring us even closer to achieving Toyota’s vision of reducing global average new-vehicle CO2 emissions by 90 percent by 2050.”
Researchers at the Center for Nanoscale Materials and the Advanced Photon Source, both U.S. Department of Energy (DOE) Office of Science User Facilities at DOE’s Argonne National Laboratory, announced the use of machine learning tools to accurately predict the physical, chemical and mechanical properties of nanomaterials.
Now, Northeastern professor Yongmin Liu has developed a new method for quickly discovering materials that have desirable qualities. In a paper published recently in ACS Nano, Liu and his co-authors describe a machine learning algorithm they developed and trained to identify new metamaterial structures.
Kevin Ryan, Jeff Lengyel and Michael Shatruk at Florida State University, US, have developed a deep learning neural network and trained the network on 50,000 inorganic crystal structures without giving it any knowledge of chemical theory, leaving it to figure out the chemistry from the geometrical arrangements of atoms in crystals alone. Tests revealed that the network learned to recognise the similarities within the groups of elements in the periodic table.
The network uses the chemical knowledge it gained during training to identify which hypothetical, combinatorially generated crystal structures are the most reasonable. The network’s success was judged by how well it could identify examples of real crystals it had never seen during training from large sets of decoys. The researchers found that in 30% of the cases it ranked at least one known compound among the 10 most likely possibilities.
‘The resulting prediction model is appealing due to its nearly real-time evaluation that can be carried out on affordable personal computers. To perform a prediction, the user simply enters a desired set of chemical elements, and the program returns a list of results in seconds,’ Ryan explains. This list provides a manageable set of suggestions among the astronomical number of possible combinations of three or more elements. These may not be correct and definitive answers, but researchers can readily test them experimentally, and they may lead to the discovery of new materials with interesting properties.
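The evaluation protocol described above, checking whether a known real compound lands among the network's 10 most likely suggestions out of a large decoy set, amounts to a top-k hit-rate computation. The sketch below uses invented scores, not the actual network's outputs:

```python
# Top-k hit-rate evaluation: for each query (an element combination),
# a model scores candidate structures; we count how often a known real
# compound appears among the k highest-scoring candidates.

def top_k_hit_rate(queries, k=10):
    """queries: list of (scores, real_ids) pairs; scores maps candidate
    structure -> model score, real_ids is the set of known compounds."""
    hits = 0
    for scores, real_ids in queries:
        top = sorted(scores, key=scores.get, reverse=True)[:k]
        if real_ids & set(top):
            hits += 1
    return hits / len(queries)

# Toy usage: decoy scores fall off as 1/(rank+1); in the first query the
# real compound scores high enough to crack the top 10, in the second
# it does not.
decoys = {f"decoy{i}": 1.0 / (i + 1) for i in range(20)}
q_hit = ({**decoys, "NaCl": 0.9}, {"NaCl"})
q_miss = ({**decoys, "XY3": 0.001}, {"XY3"})
print(top_k_hit_rate([q_hit, q_miss]))  # 0.5
```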
Accelerating materials property predictions using machine learning
The standard approaches adopted thus far involve either expensive and lengthy Edisonian synthesis-testing experimental cycles, or laborious and time-intensive computations performed on a case-by-case basis.
“Owing to the staggering compositional and configurational degrees of freedom possible in materials, it is fair to assume that the chemical space of even a restricted subclass of materials (say, involving just two elements) is far from being exhausted, and an enormous number of new materials with useful properties are yet to be discovered. Given this formidable chemical landscape, a fundamental bottleneck to an efficient materials discovery process is the lack of suitable methods to rapidly and accurately predict the properties of a vast array (within a subclass) of new yet-to-be-synthesized materials.”
Machine learning methods may be used to establish a mapping between a suitable representation of a material (i.e., its ‘fingerprint’ or its ‘profile’) and any or all of its properties using known historic, or intentionally generated, data. Subsequently, once the profile property mapping has been established, the properties of a vast number of new materials within the same subclass may then be directly predicted (and correlations between properties may be unearthed) at negligible computational cost, thereby completely bypassing the conventional laborious approaches towards material property determination alluded to above. In its most simplified form, this scheme is inspired by the intuition that (dis)similar materials will have (dis)similar properties.
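In its simplest form, the fingerprint-to-property mapping above can be sketched as nearest-neighbour regression in fingerprint space, directly encoding the intuition that similar materials have similar properties. The fingerprints and band-gap values below are invented for illustration:

```python
# Sketch of similarity-based property prediction: predict a property for
# a new material as the inverse-distance-weighted average over its k
# nearest neighbours in fingerprint space.
import math

def predict_property(fingerprint, training_data, k=2):
    """training_data: list of (fingerprint_vector, property_value)."""
    dists = sorted(((math.dist(fingerprint, fp), prop)
                    for fp, prop in training_data), key=lambda t: t[0])
    nearest = dists[:k]
    # Inverse-distance weighting; a small epsilon avoids division by zero
    weights = [1.0 / (d + 1e-9) for d, _ in nearest]
    return sum(w * p for w, (_, p) in zip(weights, nearest)) / sum(weights)

# Toy training set: 3-component fingerprints -> band gap (eV), all invented.
train = [([0.1, 0.9, 0.2], 1.1),
         ([0.2, 0.8, 0.3], 1.3),
         ([0.9, 0.1, 0.7], 3.2)]
print(round(predict_property([0.15, 0.85, 0.25], train), 2))  # 1.2
```

Real fingerprinting schemes use far richer descriptors (composition, structure, electronic features), and production models replace the nearest-neighbour rule with kernel or neural regressors, but the mapping idea is the same.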
Researchers are employing machine (or statistical) learning methods trained on quantum mechanical computations in combination with the notions of chemical similarity for efficient and accurate prediction of a diverse set of properties of material systems. Harnessing such learning paradigms extends recent efforts to systematically explore and mine vast chemical spaces, and can significantly accelerate the discovery of new application-specific materials.
Machine learning enables predictive modeling of 2-D materials
In a study published in The Journal of Physical Chemistry Letters, a team of researchers led by Argonne computational scientist Subramanian Sankaranarayanan described their use of machine learning tools to create the first atomic-level model that accurately predicts the thermal properties of stanene, a two-dimensional (2-D) material made up of a one-atom-thick sheet of tin.
“Predictive modeling is particularly important for newly discovered materials, to learn what they’re good for, how they respond to different stimuli and also how to effectively grow the material for commercial applications—all before you invest in costly manufacturing,” said Argonne postdoctoral researcher Mathew Cherukara, one of the lead authors of the study.
“We input data obtained from experimental or expensive theory-based calculations, and then ask the machine, ‘Can you give me a model that describes all of these properties?'” said Badri Narayanan, an Argonne postdoctoral researcher and another lead author of the study. “We can also ask questions like, ‘Can we optimize the structure, induce defects or tailor the material to get specific desired properties?'”
Unlike most past models, the machine learning model can capture bond formation and breaking events accurately; this not only yields more reliable predictions of material properties (e.g. thermal conductivity), but also enables researchers to capture chemical reactions accurately and better understand how specific materials can be synthesized.
Another advantage of building models using machine learning is that the process is not material-dependent, meaning researchers can look at many different classes of materials and apply machine learning to various other elements and their combinations.
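The workflow Narayanan describes, feeding expensive reference data to a cheap learned model, can be illustrated with a toy pair potential. Everything below is a stand-in: the reference function imitates "expensive theory-based calculations," and the surrogate is a simple polynomial fit rather than the rich many-body models used at Argonne:

```python
# Toy "machine-learned potential": sample energies from a reference pair
# potential (Lennard-Jones, standing in for expensive quantum-mechanical
# data) and fit a cheap surrogate that predicts energies at new bond
# lengths almost instantly.
import numpy as np

def reference_energy(r):
    """Stand-in for expensive ab initio data: Lennard-Jones pair energy."""
    return 4.0 * ((1.0 / r) ** 12 - (1.0 / r) ** 6)

# "Training data": energies at a handful of interatomic distances.
r_train = np.linspace(0.95, 2.5, 40)
e_train = reference_energy(r_train)

# Cheap surrogate: polynomial in 1/r (the degree is an assumption).
coeffs = np.polyfit(1.0 / r_train, e_train, deg=12)

def surrogate_energy(r):
    return np.polyval(coeffs, 1.0 / r)

# Inside the training range the surrogate tracks the reference closely.
print(abs(surrogate_energy(1.3) - reference_energy(1.3)))
```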
Scientists use artificial neural networks to predict new stable materials
“Predicting the stability of materials is a central problem in materials science, physics and chemistry,” said senior author Shyue Ping Ong, a nanoengineering professor at the UC San Diego Jacobs School of Engineering. “On one hand, you have traditional chemical intuition such as Linus Pauling’s five rules that describe stability for crystals in terms of the radii and packing of ions. On the other, you have expensive quantum mechanical computations to calculate the energy gained from forming a crystal that have to be done on supercomputers. What we have done is to use artificial neural networks to bridge these two worlds.”
By training artificial neural networks to predict a crystal’s formation energy using just two inputs — electronegativity and ionic radius of the constituent atoms — Ong and his team at the Materials Virtual Lab have developed models that can identify stable materials in two classes of crystals known as garnets and perovskites. These models are up to 10 times more accurate than previous machine learning models and are fast enough to efficiently screen thousands of materials in a matter of hours on a laptop. The team details the work in a paper published Sept. 18 in Nature Communications.
“Garnets and perovskites are used in LED lights, rechargeable lithium-ion batteries, and solar cells. These neural networks have the potential to greatly accelerate the discovery of new materials for these and other important applications,” noted first author Weike Ye, a chemistry Ph.D. student in Ong’s Materials Virtual Lab. The team has made their models publicly accessible via a web application at http://crystals.ai. This allows other people to use these neural networks to compute the formation energy of any garnet or perovskite composition on the fly. The researchers are planning to extend the application of neural networks to other crystal prototypes as well as other material properties.
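The descriptor-to-energy idea can be illustrated with a toy network. This is not the Materials Virtual Lab model available at crystals.ai, and the data below are synthetic; only the choice of two inputs per sample, mirroring electronegativity and ionic radius, follows the paper:

```python
# Toy neural network mapping two atomic descriptors to a
# formation-energy-like target, trained by plain gradient descent.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training set": 200 materials, two descriptors each; the
# target is an invented formation-energy-like quantity.
X = rng.uniform([0.8, 0.4], [4.0, 2.0], size=(200, 2))
y = -1.5 * X[:, 0] + 0.7 * X[:, 1] ** 2 + 0.1 * rng.normal(size=200)

# One hidden tanh layer, trained with full-batch gradient descent.
W1 = rng.normal(scale=0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=16);      b2 = 0.0
lr = 0.01
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    pred = h @ W2 + b2                  # predicted "formation energy"
    err = pred - y
    gW2 = h.T @ err / len(X); gb2 = err.mean()    # backprop: output layer
    gh = np.outer(err, W2) * (1.0 - h ** 2)       # backprop: hidden layer
    gW1 = X.T @ gh / len(X); gb1 = gh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training MSE: {mse:.3f}")
```

Once trained, evaluating the network on thousands of candidate compositions is just a few matrix multiplications, which is why such models can screen materials in hours on a laptop.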
Army uses AI to identify fuel-efficient materials
A new system of algorithmic bots could tackle complex challenges beyond human experimental capabilities. Building on striking successes in artificial intelligence, such as systems that can win a game like Jeopardy, Army-funded researchers at Cornell University developed a system called CRYSTAL to explore new materials for long-lasting power for Soldiers.
Researchers seeking to improve fuel cells for cars are searching for a catalyst that would allow them to replace hydrogen, which is difficult to store, with methanol, which could be far more efficient. But because no known materials are efficient catalysts for methanol oxidation, a new material is needed, said co-author John Gregoire, Ph.D., a staff scientist at the California Institute of Technology.
“If a viable catalyst exists, it’s going to need to be discovered by combining elements of the periodic table, and the number of combinations is so vast that it can’t be done with traditional experimentation,” Gregoire said. Researchers also need to understand the crystal structure, or phase, of the material, because solids may have multiple phase structures and each one behaves differently as a catalyst.
“Humans can solve the phase map for simple composition systems containing two elements,” Gregoire said, “but whenever there are more than two elements, it’s too much information for humans to process, and we need AI to assist.”
CRYSTAL relies on a collective of algorithmic bots that sift through hundreds of thousands of combinations of elements, a number so vast that it is inaccessible through traditional experimentation. The system is able to obey the laws of physics and chemistry, where existing machine learning approaches fail, and could identify the next generation of material breakthroughs that will equip Soldiers on the future battlefield.
Using the system, researchers were able to identify a unique catalyst, composed of three elements crystallized into a certain structure, which is effective for methanol oxidation and could be incorporated into methanol-based fuel cells.
“The exciting part about basic science research is you can’t always predict where the results will lead,” said Dr. Purush Iyer, division chief, network sciences at the Army Research Office. “We funded this research to better understand collective intelligence (wisdom of crowds). While materials science applications, such as the design of novel alloys, were always on the cards, the serendipitous nature of the eventual outcome, a catalyst to aid in designing better fuel cells, solves a problem of immense importance for the Army: battery power in the field. It shows the importance of investing in basic research.” The work was published in Materials Research Society Communications.
New Algorithm Can Discover Materials With Unusual Characteristics
The algorithm Liu and his team built was trained with a data set of 30,000 different samples, each representing a specific relationship between a metamaterial structure and corresponding optical property. Once the algorithm learned those relationships, it was able to predict new ones.
“Searching through all possible parameter combinations for materials is nearly impossible. By introducing artificial intelligence to the metamaterial design, I believe the potential of metamaterials will be fully realized,” said Shuang Zhang, a professor of physics at the University of Birmingham. “Prof. Liu’s research points to a new research direction which will be followed by many groups in this field.”
Engineers can now use the algorithm to discover new materials with specific useful characteristics. For example, current solar panels can only convert 20 to 30 percent of sunlight to energy. Liu is interested in finding a material capable of 100 percent light absorption to create more efficient solar panels.
“With this algorithm, we can design new metamaterial properties on demand,” said Liu, an assistant professor of mechanical and industrial engineering. “These novel optical materials will serve as the foundation for a variety of functional devices.”
“Here, we report a deep-learning-based model, comprising two bidirectional neural networks assembled by a partial stacking strategy, to automatically design and optimize three-dimensional chiral metamaterials with strong chiroptical responses at predesignated wavelengths.
The model can help to discover the intricate, nonintuitive relationship between a metamaterial structure and its optical responses from a number of training examples, which circumvents the time-consuming, case-by-case numerical simulations in conventional metamaterial designs. This approach not only realizes the forward prediction of optical performance much more accurately and efficiently but also enables one to inversely retrieve designs from given requirements.”
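The forward/inverse workflow described in that abstract can be sketched with a stand-in forward model. The linear pitch-to-resonance relation below is invented for illustration; the paper's forward model is a trained neural network, and its inverse network retrieves designs directly rather than by search:

```python
# Forward prediction vs. inverse retrieval: once a fast forward model
# exists, a design can be retrieved by searching it for the candidate
# whose predicted response best matches a target.

def forward_model(pitch_nm):
    """Toy forward model: resonance wavelength grows with helix pitch
    (coefficients invented)."""
    return 2.1 * pitch_nm + 150.0   # resonance wavelength in nm

def inverse_design(target_wavelength_nm, candidates):
    """Retrieve the design whose predicted response is closest to target."""
    return min(candidates,
               key=lambda p: abs(forward_model(p) - target_wavelength_nm))

# Usage: find the pitch whose resonance lands nearest 800 nm.
pitches = range(100, 501, 10)          # candidate pitches, 100-500 nm
best = inverse_design(800.0, pitches)
print(best, round(forward_model(best), 1))
```

The payoff is the same as in the paper: the expensive per-design electromagnetic simulation is replaced by a model cheap enough to evaluate over every candidate.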
Machine learning based adaptive design strategy accelerates the discovery of new materials
In 2016, researchers demonstrated how an informatics-based adaptive design strategy, tightly coupled to experiments, can accelerate the discovery of new materials with targeted properties, according to a paper published in Nature Communications.
They developed a framework that, starting with a relatively small data set of well-controlled experiments, uses uncertainties to iteratively guide the subsequent experiments toward the material with the desired target property. “The goal is to cut in half the time and cost of bringing materials to market,” said Lookman. “What we have demonstrated is a data-driven framework built on the foundations of machine learning and design that can lead to discovering new materials with targeted properties much faster than before.” The work made use of Los Alamos’ high-performance supercomputing resources.
The authors applied their framework to search for a shape-memory alloy (SMA) with very low thermal hysteresis (or dissipation). The functionalities of SMAs, including the shape-memory effect and superelasticity, arise from the reversible martensitic transformation between high-temperature austenite and low-temperature martensite phases.
Heating and cooling across the martensitic transformation temperature results in hysteresis (ΔT), because the transformation temperatures on heating and on cooling do not coincide, giving rise to fatigue. An obstacle to developing low-ΔT SMAs is the large search space, because the vast majority of transition metals can be alloyed with Ni50Ti50.
As the authors report: “Even in our constrained pseudo-quaternary composition space, there are N = 797,504 potential alloys. Such a vast space is difficult to explore with high-throughput experiments or ab initio calculations. However, our design loop is able to discover Ti50.0Ni46.7Cu0.8Fe2.3Pd0.2 with a low ΔT of 1.84 K in the sixth out of nine iterations of our loop.”
Prior knowledge, including data from previous experiments and physical models, and relevant features are used to describe the materials. This information is used within a machine learning framework to make predictions that include error estimates. The results are used by an experimental design tool (for example, global optimization) that suggests the next experiments to perform (synthesis and characterization), with the dual goals of model improvement and materials discovery. The results feed into a database, which provides input for the next iteration of the design loop.
The key ingredients of the loop, as the authors exercise it, are as follows:
- A training data set of alloys, each described by features and with a desired property (that is, ΔT) that has been measured;
- An inference model (regressor) that uses the training data to learn the feature–property relationship, with associated uncertainties;
- Application of the trained model to the search space of unexplored alloy compositions (for which the property has not been measured), to predict ΔT with associated uncertainties;
- Design or global optimization (selector) that provides the next candidate alloy for experiment by balancing the trade-off between exploitation (choosing the material with the best predicted property) and exploration (using the predicted uncertainties to study regions of search space where the model is less accurate); and
- Feedback from experiments allowing the subsequent iterative improvement of the inference model.
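The steps above can be sketched end-to-end on a made-up one-dimensional problem. The bootstrap ensemble of nearest-neighbour regressors and the simple mean-minus-spread selector below are stand-ins for the paper's regressors and global-optimization selector, and the hidden "experiment" is an invented function:

```python
# Minimal adaptive design loop: minimize an unknown hysteresis-like
# function of composition, balancing exploitation and exploration.
import random
random.seed(1)

def run_experiment(x):
    """Stand-in for synthesis + measurement: hidden DT-like ground truth."""
    return (x - 0.63) ** 2 + 0.02 * random.random()

def ensemble_predict(x, data, n_models=10):
    """Bootstrap ensemble of 1-nearest-neighbour models -> (mean, spread)."""
    preds = []
    for _ in range(n_models):
        boot = [random.choice(data) for _ in data]
        nearest = min(boot, key=lambda d: abs(d[0] - x))
        preds.append(nearest[1])
    mu = sum(preds) / len(preds)
    sigma = (sum((p - mu) ** 2 for p in preds) / len(preds)) ** 0.5
    return mu, sigma

data = [(x, run_experiment(x)) for x in (0.0, 0.5, 1.0)]  # initial experiments
candidates = [i / 100 for i in range(101)]                # unexplored compositions

def acquisition(c):
    mu, sigma = ensemble_predict(c, data)
    return mu - sigma     # optimistic score: exploit low mu, explore high sigma

for _ in range(15):                                       # design-loop iterations
    tried = {x for x, _ in data}
    x_next = min((c for c in candidates if c not in tried), key=acquisition)
    data.append((x_next, run_experiment(x_next)))         # feedback into data set

best_x, best_y = min(data, key=lambda d: d[1])
print(f"best composition found: x = {best_x:.2f}")
```

Each pass mirrors the listed ingredients: train on measured points, predict with uncertainty over the unexplored space, let the selector pick the next "experiment," and feed the result back.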
Lookman and his colleagues focused on nickel-titanium-based shape-memory alloys, but the strategy can be used for any materials class (polymers, ceramics or nanomaterials) or target properties (e.g., dielectric response, piezoelectric coefficients and band gaps). This becomes important when experiments or calculations are costly and time-consuming.
Although the work focused on the chemical exploration space, it can be readily adapted to optimize processing conditions when there are many “tuning knobs” controlling a figure of merit, as in advanced manufacturing applications. Similarly, it can be generalized to optimize multiple properties, such as, in the case of the nickel-titanium-based alloy, low dissipation as well as a transition temperature several degrees above room temperature.
Artificial intelligence aids materials fabrication
In 2017, a team of researchers at MIT, the University of Massachusetts at Amherst, and the University of California at Berkeley reported a step toward closing the materials-science automation gap: a new artificial-intelligence system that pores through research papers to deduce “recipes” for producing particular materials.
“Computational materials scientists have made a lot of progress in the ‘what’ to make—what material to design based on desired properties,” says Elsa Olivetti, the Atlantic Richfield Assistant Professor of Energy Studies in MIT’s Department of Materials Science and Engineering (DMSE). “But because of that success, the bottleneck has shifted to, ‘Okay, now how do I make it?'”
The researchers envision a database that contains materials recipes extracted from millions of papers. Scientists and engineers could enter the name of a target material and any other criteria—precursor materials, reaction conditions, fabrication processes—and pull up suggested recipes.
As a step toward realizing that vision, Olivetti and her colleagues have developed a machine-learning system that can analyze a research paper, deduce which of its paragraphs contain materials recipes, and classify the words in those paragraphs according to their roles within the recipes: names of target materials, numeric quantities, names of pieces of equipment, operating conditions, descriptive adjectives, and the like.
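The word-role classification step can be illustrated with a deliberately crude rule-based tagger. The actual system uses trained machine-learning models, and the patterns and role names below are invented for illustration:

```python
# Toy tagger: label each token of a synthesis sentence with its role in
# the recipe (material, quantity, unit, operation, or other).
import re

ROLE_PATTERNS = [
    ("QUANTITY",  re.compile(r"^\d+(\.\d+)?$")),
    ("UNIT",      re.compile(r"^(g|mg|ml|mL|h|°C|K)$")),
    ("OPERATION", re.compile(r"^(heated|calcined|stirred|dried|mixed)$", re.I)),
    # Crude chemical-formula test: capitalized element symbols with counts
    ("MATERIAL",  re.compile(r"^[A-Z][a-z]?\d*([A-Z][a-z]?\d*)+$")),
]

def tag_tokens(sentence):
    tags = []
    for token in sentence.split():
        for role, pattern in ROLE_PATTERNS:
            if pattern.match(token):
                tags.append((token, role))
                break
        else:
            tags.append((token, "OTHER"))
    return tags

print(tag_tokens("TiO2 was heated at 500 °C for 2 h"))
```

A rule list like this breaks quickly on real papers, which is exactly why the MIT team trained statistical models on annotated text instead.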
In a paper appearing in the latest issue of the journal Chemistry of Materials, they also demonstrate that a machine-learning system can analyze the extracted data to infer general characteristics of classes of materials—such as the different temperature ranges that their synthesis requires—or particular characteristics of individual materials—such as the different physical forms they will take when their fabrication conditions vary. Much of Olivetti’s prior research has concentrated on finding more cost-effective and environmentally responsible ways to produce useful materials, and she hopes that a database of materials recipes could abet that project.
“This is landmark work,” says Ram Seshadri, the Fred and Linda R. Wudl Professor of Materials Science at the University of California at Santa Barbara. “The authors have taken on the difficult and ambitious challenge of capturing, through AI methods, strategies employed for the preparation of new materials. The work demonstrates the power of machine learning, but it would be accurate to say that the eventual judge of success or failure would require convincing practitioners that the utility of such methods can enable them to abandon their more instinctual approaches.”
Machine-learning tool can analyze failed reactions to guide synthesis of new materials
In 2016, a team of researchers from Haverford College reported that they had developed a machine-learning tool that, by analyzing data from past failed chemical reactions, can guide the choices for synthesizing new materials. Because the formation of compounds is not fully understood, the synthesis of new materials has relied primarily on exploratory syntheses.
“Failed reactions contain a vast amount of unreported and unextracted information,” says Alex Norquist, a materials-synthesis researcher at Haverford College in Pennsylvania, who is part of the team that has reported the work in Nature. “There are far more failures than successes, but only the successes generally get published.” The lessons that can be learnt from failed reactions remain hidden in the lab notebooks of individual researchers. “Here we demonstrate an alternative approach that uses machine-learning algorithms trained on reaction data to predict reaction outcomes for the crystallization of templated vanadium selenites.”
“We used information on ‘dark’ reactions–failed or unsuccessful hydrothermal syntheses–collected from archived laboratory notebooks from our laboratory, and added physicochemical property descriptions to the raw notebook information using cheminformatics techniques,” the researchers wrote in a letter to Nature.
They trained an algorithm on data from almost 4,000 attempts to make the crystals under different reaction conditions (such as temperature, concentration, reactant quantity and acidity). That work included transcribing information on dark, failed reactions from the team’s archived lab notebooks into a format that a machine could analyse.
“We used the resulting data to train a machine-learning model to predict reaction success.” The machine-learning tool was designed to analyze these data and predict whether a set of reagents would yield a crystalline material when combined with a solvent and heated.
Decision tree
Having generated evidence to suggest the algorithm can best human intuition, the team built a decision tree to guide their own decisions in the future. The tree allows researchers to work through questions such as “is sodium present?” to reach a prediction regarding the likelihood of a reaction resulting in a crystal.
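A toy version of such a tree might look as follows. The questions and thresholds here are invented for illustration; the real tree was learned from the group's reaction data:

```python
# Walk yes/no questions about a planned hydrothermal synthesis to reach
# a predicted outcome, as a researcher would with the printed tree.

def predict_crystal(reaction):
    """reaction: dict with keys 'elements', 'temperature_C', 'pH'."""
    if "Na" in reaction["elements"]:            # "is sodium present?"
        if reaction["temperature_C"] >= 120:
            return "crystal likely"
        return "crystal unlikely"
    if reaction["pH"] < 4:
        return "crystal likely"
    return "crystal unlikely"

print(predict_crystal({"elements": {"Na", "V", "Se"},
                       "temperature_C": 150, "pH": 6}))  # crystal likely
```

The appeal of a decision tree over a black-box model is exactly this: each branch is a human-readable question, so chemists can apply it at the bench without a computer.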
To test the algorithm, the team picked out previously untried combinations of reactants and tried to guess the best processing conditions for making selenite materials. When asked to recommend reactions within this niche, the conditions suggested by the algorithm generated a crystalline product 89% of the time, whereas researchers with a combined 10 years of experience with the materials were right 78% of the time.
The team has set up a website, called the Dark Reactions Project, to encourage others to share — in a machine-readable format — their own failed attempts to make new crystals. “The planning and development of such tools is essential if we are to eventually make full use of our ‘failed’ experiments,” adds Richard Cooper, a crystallographer at Oxford University, UK.
References and Resources also include:
- http://www.fiercebiotech.com/it/machine-learning-tool-mines-lab-notebooks-for-lessons-from-failed-reactions
- http://www.sciencenewsline.com/news/2016050920310015.html
- http://www.nature.com/ncomms/2016/160415/ncomms11241/full/ncomms11241.html
- http://phys.org/news/2016-06-materials-screening-reveal-nitride-semiconductors.html
- http://phys.org/news/2016-10-simulations-graphene-defects-assets.html
- https://phys.org/news/2017-11-artificial-intelligence-aids-materials-fabrication.html
- https://pubs.acs.org/doi/ipdf/10.1021/acsnano.8b03569
- https://www.chemistryworld.com/news/ai-teaches-itself-to-identify-materials–and-predict-new-ones-too/3009188.article
- https://www.sciencedaily.com/releases/2018/09/180918082102.htm
- https://www.army.mil/article/225351/innovative_ai_system_could_help_make_army_fuel_cells_more_efficient