"Machine learning in chemistry and beyond" (ChE-650) seminar by Prof. Andrew White (University of Rochester): Making cool stuff with deep learning
|Date||05.10.2021 – 15:15 › 16:15|
|Speaker||Andrew White graduated from Rose-Hulman Institute of Technology in 2008 with a BS in chemical engineering. While at Rose, he spent a year studying at the Otto-von-Guericke Universität and the Max Planck Institute for Dynamics of Complex Technical Systems in Magdeburg, Germany. Dr. White completed a PhD in chemical engineering at the University of Washington in 2013; his thesis topic was the creation of non-fouling biomimetic surfaces with computational modeling. Next, Dr. White worked with Professor Greg Voth at the University of Chicago as a postdoctoral fellow in the Institute for Biophysical Dynamics from 2013 to 2014. In Chicago, he developed new methods for combining simulations and experiments. Dr. White joined the University of Rochester in Chemical Engineering in 2015 and is currently an associate professor. He has joint appointments in the Chemistry Department, Biophysics, Materials Science, and Data Science programs. Dr. White received a National Science Foundation CAREER award in 2018 and an Outstanding Young Investigator Award from the National Institutes of Health in 2020. Dr. White has authored a textbook on deep learning for molecules and materials, which is freely available at https://whitead.github.io/dmol-book.|
|Category||Conferences - Seminars|
Deep learning has sparked a renaissance in chemistry and materials. We can devise and fit models to predict molecular properties in a few hours and deploy them in a web browser. We can create, in an afternoon, novel generative models that were previously PhD theses. In my group, we’re exploring deep learning in soft materials and molecules. We are focused on two major problems: interpretability and data scarcity. Now that we can make deep learning models to predict any molecular property ad nauseam, what can we learn? I will discuss our recent efforts on interpreting deep learning models through symbolic regression and counterfactuals. Data scarcity is a common problem in chemistry: how can we learn new properties without the significant expense of experiments? One method is the judicious choice of experiments, which can be done with active learning. Another approach is pre-training or meta-learning, which tries to exploit related data. I will cover recent progress in these areas. Finally, one consequence of the state of deep learning is that you can just make cool things in chemistry with minimal effort. I’ll review a few fun projects, including making molecules by banging on the keyboard, doing math with emojis, finding chemical entities in HTML, and doing molecular dynamics with ImageNet-derived potentials.
If you are on the faculty job market, Andrew would love to discuss an opening at the University of Rochester and answer questions. Please contact the organizer (Kevin Jablonka) or Andrew to schedule a discussion.