Brown U. awarded $3.5 million to speed up atomic-scale computer simulations
"Simulations provide insights into materials and chemical processes that we can't readily get from experiments," said Andrew Peterson, an associate professor in Brown's School of Engineering who will lead the work.
"Computational power is growing rapidly, which lets us perform larger and more realistic simulations. But as the size of the simulations grows, the time involved in running them can grow exponentially. This paradox means that even with the growth in computational power, our field still cannot perform truly large-scale simulations. Our goal is to speed those simulations up dramatically - ideally by orders of magnitude - using machine learning."
The Department of Energy grant provides $3.5 million for the work over four years. Peterson will work with two Brown colleagues - Franklin Goldsmith, assistant professor of engineering, and Brenda Rubenstein, assistant professor of chemistry - as well as researchers from Carnegie Mellon, Georgia Tech and MIT.
The idea behind the work is that different simulations often have the same sets of calculations underlying them. Peterson and his colleagues aim to use machine learning to find those underlying similarities and fast-forward through them.
"What we're doing is taking the results of calculations from prior simulations and using them to predict the outcome of calculations that haven't been done yet," Peterson said. "If we can eliminate the need to do similar calculations over and over again, we can speed things up dramatically, potentially by orders of magnitude."
The team will initially focus on simulations of electrocatalysis - the kinds of chemical reactions that are important in devices like fuel cells and batteries. These are complex, often multi-step reactions that are fertile ground for simulation-driven research, Peterson says.
Atomic-scale simulations have already proven useful in Peterson's own work on the design of new catalysts. In a recent example, Peterson worked with Brown chemist Shouheng Sun on a gold nanoparticle catalyst that can perform a reaction necessary for converting carbon dioxide into useful forms of carbon. Peterson's simulations showed that the sharp edges of the oddly shaped catalyst were particularly active for the desired reaction.
"That led us to change the geometry of the catalyst to a nanowire - something that's basically all edges - to maximize its reactivity," Peterson said. "We might have eventually tried a nanowire by trial and error, but because of the computational insights we were able get there much more quickly."
The researchers will use a software package that Peterson's research group developed previously as a starting point. The software, called AMP (Atomistic Machine-learning Package), is open source and already widely used in the simulation community, Peterson says.
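For readers curious how such a tool is used, here is a minimal sketch based on AMP's publicly documented workflow; the network size, file names, and training data are illustrative assumptions rather than details from the grant work.

    # Minimal sketch of training an AMP calculator on previously computed
    # structures (hyperparameters and file names here are assumptions).
    from amp import Amp
    from amp.descriptor.gaussian import Gaussian
    from amp.model.neuralnetwork import NeuralNetwork

    # Fit a neural-network potential to an ASE trajectory of atomic
    # configurations with energies from earlier quantum-mechanical runs.
    calc = Amp(descriptor=Gaussian(),
               model=NeuralNetwork(hiddenlayers=(10, 10)))
    calc.train(images='prior_calculations.traj')

    # The trained potential can later be reloaded and attached to new atomic
    # structures as an ASE calculator, standing in for repeated expensive
    # calculations, e.g.:
    #     new_atoms.calc = Amp.load('amp.amp')
    #     energy = new_atoms.get_potential_energy()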
The Department of Energy grant will bring atomic-scale simulations - and the insights they produce - to bear on ever larger and more complex systems. And while the work under the grant will focus on electrocatalysis, the tools the team develops should be widely applicable to other types of materials and chemical simulations.
Peterson is hopeful that the federal government's investment in machine learning will be repaid through better use of valuable computing resources.
"Modern supercomputers cost millions of dollars to build, and simulation time on them is precious," Peterson said. "If we're able to free up time on those machines for additional simulations to be run, that translates into vastly increased return-on-investment for those machines. It's real money."