Ekin Dogus Cubuk, Staff Research Scientist, Google Brain
Wednesday December 7th, 10am (USA/Pacific)
Materials science data and computational resources are growing rapidly. While machine learning offers a promising set of tools for learning models from large volumes of independent and identically distributed (IID) data, these tools are not inherently good at generalizing to new, out-of-distribution (OOD) data. For this reason, the impact of deep learning on the computational discovery of stable inorganic materials has been limited. Two observations from deep learning encourage optimism: 1) scaling up neural networks with more data and compute can monotonically improve their IID generalization, and 2) while OOD performance is almost always worse than IID performance, better IID performance is correlated with better OOD performance.
We explored these two directions to investigate whether stable-material discovery can be made more efficient via deep learning. I will introduce our pipelines for scaling up DFT calculations and graph neural networks, which allow us to discover a large number of inorganic crystals that are stable relative to both the Materials Project and the Open Quantum Materials Database. Because we employed a diverse set of candidate-generation algorithms, we discovered thousands of novel crystal prototypes. By screening these crystals for electronic and energy applications, we find that our dataset increases the number of promising candidates for certain applications by more than an order of magnitude. Finally, I will present the scaling behavior of our networks for robustness and for zero-shot predictions on tasks they were not trained on.
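The stability criterion mentioned above — a crystal being "stable relative to" a database of known phases — is conventionally measured as energy above the convex hull of formation energies. The following is a minimal, self-contained sketch of that idea for a hypothetical binary A-B system, with made-up formation energies; it is not the speakers' pipeline. In practice one would use DFT-computed energies and database entries (e.g. via pymatgen's phase-diagram tools) rather than the toy numbers here.

```python
# Sketch: stability screening via energy above the convex hull.
# All numbers below are hypothetical; real screening uses DFT formation
# energies for every known phase in the chemical system.

def hull_energy(points, x):
    """Evaluate the lower convex envelope of (composition, energy) points at x."""
    pts = sorted(points)
    # Build the lower hull (monotone-chain construction, lower part only).
    hull = []
    for p in pts:
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            # Pop the last point unless the turn to p is strictly counter-clockwise.
            if (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1) <= 0:
                hull.pop()
            else:
                break
        hull.append(p)
    # Linearly interpolate on the hull segment containing x.
    for (xa, ya), (xb, yb) in zip(hull, hull[1:]):
        if xa <= x <= xb:
            t = (x - xa) / (xb - xa)
            return ya + t * (yb - ya)
    raise ValueError("x is outside the composition range")

# Known phases as (fraction of B, formation energy per atom in eV):
# the two elements plus one known stable compound at 50% B.
known = [(0.0, 0.0), (0.5, -0.40), (1.0, 0.0)]
candidate = (0.25, -0.25)  # hypothetical new crystal at 25% B

e_above_hull = candidate[1] - hull_energy(known, candidate[0])
print(f"energy above hull: {e_above_hull:+.3f} eV/atom")
# Negative => the candidate sits below the current hull, i.e. it is stable
# relative to the known phases and would itself become a hull vertex.
```

In real pipelines the composition space is multi-component, so the hull is computed in higher dimensions, but the screening decision is the same: candidates at or below the hull of all known phases count as stable discoveries.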
A recording of this seminar will be available at a future date, likely in January.
If you are unable to ask questions live, feel free to post them here after the talk; we will ask the speaker to check afterwards. Whether they can respond will depend on their availability.