2022 MRS Fall Meeting & Exhibit

Tutorial DS02-1: An Introduction to End-to-End Differentiable Atomistic Simulations with JAX MD

Sunday, November 27
1:30 pm – 5:00 pm
Hynes, Level 2, Room 206

The high-level goal of this tutorial is to introduce researchers to differentiable programming and to demonstrate how differentiable simulations can open qualitatively new avenues for research in materials science. 

As the title suggests, we will use JAX MD, a novel software library for differentiable molecular dynamics, as the platform for the tutorial. The entire tutorial will take the form of Python notebooks (hosted on Google Colab) so that participants can follow along interactively.

The tutorial has several specific goals.

  1. Introduce Automatic Differentiation (AD) to materials science researchers who may not be familiar with it. This will be done in the context of JAX; a short warm-up sketch follows this list.
  2. Show how AD can make existing simulations simpler to express. For example, show that we can compute forces, stresses, elastic moduli, and phonon spectra automatically from knowledge of the energy function. 
  3. Show how point 2 makes it easy to integrate sophisticated neural networks into traditional simulations.
  4. Show how to construct more exotic experiments that optimize materials properties by directly differentiating through simulations. We will also highlight the risks of this approach.
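
As a warm-up for point 1, here is a minimal sketch of AD in JAX (a hypothetical example for orientation, not taken from the tutorial notebooks): jax.grad transforms a scalar energy function into the function that computes its gradient.

    import jax
    import jax.numpy as jnp

    # A toy scalar "energy": a double-well potential summed over coordinates.
    def energy(x):
        return jnp.sum(x**4 - x**2)

    # jax.grad transforms the energy function into its gradient function;
    # the force is the negative of this gradient.
    grad_energy = jax.grad(energy)

    x = jnp.array([0.5, -1.0, 2.0])
    print(energy(x))        # scalar energy
    print(-grad_energy(x))  # forces, computed exactly by AD, no finite differences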

Introduction

Samuel Schoenholz, Google Research

The introductory portion of the tutorial will describe AD and how it can change the way we think about atomistic simulations. It will include a short introduction to JAX, followed by an introduction to JAX MD.
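
For a first flavor of JAX MD itself, here is a minimal sketch (assuming a recent jax-md release; the tutorial notebooks contain the canonical versions): define a periodic space, build a Lennard-Jones pair energy from it, and obtain forces without writing any derivative code.

    import jax.numpy as jnp
    from jax import random
    from jax_md import space, energy, quantity

    # A periodic simulation box; space.periodic returns displacement and
    # shift functions that encode the boundary conditions.
    box_size = 10.0
    displacement_fn, shift_fn = space.periodic(box_size)

    # A pairwise Lennard-Jones energy built on top of the displacement function.
    energy_fn = energy.lennard_jones_pair(displacement_fn)

    # Forces come for free: quantity.force differentiates the energy via AD.
    force_fn = quantity.force(energy_fn)

    # Random positions purely for illustration (overlaps give large energies).
    key = random.PRNGKey(0)
    R = random.uniform(key, (64, 2), maxval=box_size)
    print(energy_fn(R))  # total potential energy
    print(force_fn(R))   # per-particle forces, shape (64, 2)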

Physical Quantities Made Easy

Carl Goodrich, IST Austria

The next part of the tutorial will show how a wide range of quantities can be computed efficiently with AD by taking derivatives of the Hamiltonian. These include forces, stress and pressure, elastic constants, and phonon spectra.
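
The pattern, in a hypothetical toy written in plain JAX rather than JAX MD: one energy function, with forces obtained from jax.grad and a dynamical matrix for phonons from jax.hessian.

    import jax
    import jax.numpy as jnp

    def pair_energy(R):
        # Harmonic soft-sphere energy over all pairs (open boundaries);
        # adding the identity to the squared distances keeps the diagonal
        # self-interaction terms harmlessly outside the cutoff.
        dr = R[:, None, :] - R[None, :, :]
        dist = jnp.sqrt(jnp.sum(dr**2, axis=-1) + jnp.eye(R.shape[0]))
        return 0.5 * jnp.sum(jnp.where(dist < 1.0, (1.0 - dist)**2, 0.0))

    R = jnp.array([[0.0, 0.0], [0.8, 0.0], [0.0, 0.9]])

    forces = -jax.grad(pair_energy)(R)     # F = -dU/dR
    hessian = jax.hessian(pair_energy)(R)  # all second derivatives at once
    D = hessian.reshape(R.size, R.size)    # dynamical matrix (unit masses)
    omega = jnp.sqrt(jnp.abs(jnp.linalg.eigvalsh(D)))  # phonon frequencies

Stress and elastic constants follow the same pattern, with derivatives taken with respect to a strain on the simulation box rather than the particle coordinates.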

Neural Network Potentials

Amil Merchant, Stanford University

We will show how easy it is to combine modern neural networks with atomistic simulations when everything has been built to support AD from the ground up. This will involve instantiating, and beginning to train, a state-of-the-art equivariant graph neural network. After this, we will demonstrate the use of this network in several practical settings.
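
As a deliberately tiny stand-in (hypothetical; the session uses an equivariant graph network, not this toy), here is a pair-distance MLP potential in plain JAX. The point is the last line: forces flow through both the network weights and the geometry in a single jax.grad call.

    import jax
    import jax.numpy as jnp
    from jax import random

    def init_mlp(key, sizes=(1, 16, 16, 1)):
        # Plain list-of-(W, b) parameters for a small fully connected net.
        keys = random.split(key, len(sizes) - 1)
        return [(random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
                for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

    def mlp(params, x):
        for W, b in params[:-1]:
            x = jnp.tanh(x @ W + b)
        W, b = params[-1]
        return x @ W + b

    def nn_energy(params, R):
        # Sum a learned pair term over all distinct pairs of distances.
        dr = R[:, None, :] - R[None, :, :]
        dist = jnp.sqrt(jnp.sum(dr**2, axis=-1) + 1e-12)
        mask = 1.0 - jnp.eye(R.shape[0])
        pair_e = mlp(params, dist[..., None])[..., 0]
        return 0.5 * jnp.sum(mask * pair_e)

    params = init_mlp(random.PRNGKey(0))
    R = random.normal(random.PRNGKey(1), (8, 3))
    forces = -jax.grad(nn_energy, argnums=1)(params, R)  # forces through the net

Training then reduces to differentiating a loss (for example, force matching) with respect to params using the same machinery.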

Composability and Extensible Simulations

Carl Goodrich, IST Austria

To prepare for the final section of the tutorial on meta-optimization, we will see how primitive operations in molecular dynamics can be composed with JAX’s automatic vectorization to produce a wide range of simulation environments and tools. In particular, we will go through the construction of simulations with temperature gradients and the nudged elastic band method for identifying saddle points between local minima.
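
A hypothetical illustration of the pattern (not the temperature-gradient or nudged-elastic-band examples themselves): write a single-system routine once, then turn it into an ensemble tool with one application of jax.vmap.

    import jax
    import jax.numpy as jnp
    from jax import random

    def relax(R, steps=100, lr=1e-2):
        # Gradient-descent minimization of a toy energy for one configuration.
        def energy(R):
            return jnp.sum((jnp.sum(R**2, axis=-1) - 1.0)**2)
        def step(R, _):
            return R - lr * jax.grad(energy)(R), None
        R_final, _ = jax.lax.scan(step, R, None, length=steps)
        return energy(R_final)

    # One line of vmap maps the routine over a batch of initial conditions.
    keys = random.split(random.PRNGKey(0), 32)
    batch_R = jax.vmap(lambda k: random.normal(k, (10, 3)))(keys)
    batch_energies = jax.vmap(relax)(batch_R)  # shape (32,)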

Meta Optimization

Ella King, Harvard University

The final session of the day will focus on optimization through simulation environments. Here we will show how to use JAX’s automatic differentiation to perform gradient-based optimization through several standard molecular dynamics techniques, such as Langevin dynamics and phonon spectrum calculations.
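
A hypothetical toy that captures the flavor (the tutorial’s actual examples are more involved): run a short overdamped Langevin trajectory under jax.lax.scan and differentiate a property of the final state with respect to a parameter of the potential.

    import jax
    import jax.numpy as jnp
    from jax import random

    def energy(sigma, R):
        # Harmonic soft spheres whose diameter sigma we wish to tune.
        dr = R[:, None, :] - R[None, :, :]
        dist = jnp.sqrt(jnp.sum(dr**2, axis=-1) + jnp.eye(R.shape[0]))
        mask = 1.0 - jnp.eye(R.shape[0])
        return 0.5 * jnp.sum(mask * jnp.where(dist < sigma, (sigma - dist)**2, 0.0))

    def simulate_then_measure(sigma, key, dt=1e-3, kT=0.01, steps=200):
        k_init, k_sim = random.split(key)
        R = random.uniform(k_init, (16, 2), maxval=4.0)
        def step(R, k):
            # One overdamped Langevin step: deterministic force plus noise.
            F = -jax.grad(energy, argnums=1)(sigma, R)
            noise = random.normal(k, R.shape)
            return R + dt * F + jnp.sqrt(2.0 * kT * dt) * noise, None
        R, _ = jax.lax.scan(step, R, random.split(k_sim, steps))
        return energy(sigma, R)  # the "measured" property being optimized

    # One call differentiates the entire simulation with respect to sigma.
    g = jax.grad(simulate_then_measure)(1.0, random.PRNGKey(0))

This is also where the risks mentioned in point 4 arise: gradients propagated through long or chaotic trajectories can become poorly behaved.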

Tutorial Schedule

1:30 pm
Introduction
Samuel Schoenholz, Google Research

2:30 pm
Physical Quantities Made Easy
Carl Goodrich, IST Austria

3:00 pm
Break

3:30 pm
Neural Network Potentials
Amil Merchant, Stanford University

4:00 pm
Composability and Extensible Simulations
Carl Goodrich, IST Austria

4:30 pm
Meta Optimization
Ella King, Harvard University