2022 MRS Fall Meeting & Exhibit

Tutorial DS02-2: Graph Neural Networks for Materials Design

 

Sunday, November 27, 2022
8:30 AM - 12:00 PM
Hynes, Level 2, Room 206

The high-level goal of this tutorial is to demonstrate the versatility of graph neural networks for inverse design of materials. Different strategies for materials design will be explored.

The specific objectives of the tutorial are:

  1. Introduction to graph neural networks - In this section, we will introduce the fundamentals of graph neural networks, including their essential components and tasks. The tutorial will describe the basics of molecules and crystals as graphs and cover graph connectivity, representations, centralities, and message passing for predicting materials properties.
  2. Graph neural network potential for the periodic table - In this section, we will demonstrate the use of GNNs as interatomic potentials for general crystals. This part will cover the basics of computational materials science, running the M3GNet model for structural relaxations and property predictions, and molecular dynamics to obtain diffusivity and conductivity.
  3. Hierarchical GNNs for design and analysis of Zintl phases - In this section, we will use the family of materials called Zintl phases to illustrate how hierarchical graph neural networks can be used to automatically identify structural motifs that can aid in the prediction of materials properties and design of new materials.
  4. Reinforcement learning with GNNs for inverse material design - In this section, we will demonstrate using GNNs as a surrogate objective function in inverse design applications. This will consist of a series of simple code demos, where we optimize over structure space for a simple crystal system. We will demonstrate how to specify the search space, construct a GNN policy model, and optimize a given GNN surrogate energy function.

Introduction to Graph NN

Taylor Sparks, University of Utah

This section will cover the fundamentals of graph networks: components, directionality, and tasks; molecules and crystals as graphs; matrix representations; graph connectivity and centralities; message passing; node/edge/graph representations; and comments on advanced alternative GNNs (multigraphs, hierarchical graphs, etc.).
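The core message-passing operation covered in this section can be sketched in a few lines of plain Python. This is an illustrative toy with a dictionary-based graph, mean aggregation, and an additive update; the names and the update rule are assumptions for illustration, not any particular library's API:

```python
def message_passing_step(node_feats, adjacency):
    """One message-passing round: each node averages its neighbors'
    features (the aggregated message) and adds the result to its own
    feature (a simple additive update rule)."""
    updated = {}
    for node, feat in node_feats.items():
        neighbors = adjacency.get(node, [])
        if neighbors:
            message = sum(node_feats[n] for n in neighbors) / len(neighbors)
        else:
            message = 0.0
        updated[node] = feat + message
    return updated

# Toy graph: a triangle of atoms (0, 1, 2) with a pendant atom 3
adjacency = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
feats = {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0}
feats = message_passing_step(feats, adjacency)
```

Stacking several such rounds lets information propagate beyond nearest neighbors, which is what allows a readout over the final node features to predict graph-level materials properties.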

GNN Universal Interatomic Potential for Materials Design

Chi Chen, Microsoft Quantum

This section will cover the basics of computational materials science and tools, running M3GNet for structural relaxation and property predictions, and running M3GNet for molecular dynamics, diffusivity, and conductivity calculations.
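As a conceptual stand-in for what such a relaxation run does, the sketch below relaxes a one-dimensional Lennard-Jones dimer by steepest descent on the energy surface. This is a hypothetical toy with an analytic potential in place of the GNN; it is not the M3GNet API:

```python
def lj_energy(r, eps=1.0, sigma=1.0):
    """Lennard-Jones pair energy, standing in for a GNN energy model."""
    return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

def lj_force(r, eps=1.0, sigma=1.0):
    """Analytic force F = -dE/dr (an ML potential would use autodiff)."""
    return 4 * eps * (12 * sigma ** 12 / r ** 13 - 6 * sigma ** 6 / r ** 7)

def relax(r, step=1e-3, tol=1e-8, max_iter=100_000):
    """Steepest-descent relaxation: follow the force until it vanishes."""
    for _ in range(max_iter):
        f = lj_force(r)
        if abs(f) < tol:
            break
        r += step * f
    return r

# The LJ minimum sits at r = 2**(1/6) * sigma, about 1.122
r_min = relax(1.5)
```

A production relaxation (as in M3GNet) additionally relaxes cell parameters and uses a quasi-Newton optimizer rather than plain gradient descent.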

Zintl Phases and Hierarchical Graph NN

Prashun Gorai, Colorado School of Mines, NREL; Qian Yang, University of Connecticut

This section will include an introduction to Zintl phases and structural motifs, hierarchical graph neural networks for crystals, and examples of running code and results from a trained model, including visualization of automatically identified motifs for Zintl phases.
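The two-level readout at the heart of a hierarchical model can be caricatured with plain mean-pooling. All names and groupings below are illustrative assumptions, not the tutorial's actual code:

```python
def pool(features, groups):
    """Mean-pool scalar features over each group of indices."""
    return [sum(features[i] for i in g) / len(g) for g in groups]

# Atom-level features for a toy six-atom cell
atom_feats = [0.2, 0.4, 0.6, 1.0, 1.2, 1.4]

# Atoms 0-2 form one structural motif, atoms 3-5 another
motifs = [[0, 1, 2], [3, 4, 5]]
motif_feats = pool(atom_feats, motifs)          # approximately [0.4, 1.2]

# Crystal-level readout pools over all motifs
crystal_feat = pool(motif_feats, [[0, 1]])[0]   # approximately 0.8
```

In the hierarchical models discussed here, the motifs are identified automatically rather than hand-assigned, which is what makes them useful for analysis and design.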

Reinforcement Learning with Graph NN

Peter St. John, National Renewable Energy Lab

This section will include an introduction to the graph-env and rlcrystal packages, specification of an action space for a reduced crystal-structure search, construction of a GNN policy model, and running an RL search for top-performing candidates.
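The overall search problem can be sketched independently of the graph-env and rlcrystal APIs (whose interfaces are not reproduced here). In the toy below, an action appends one element to a composition and a hand-written surrogate scores the result; a real run would replace the exhaustive enumeration with a learned GNN policy and the scoring function with a GNN surrogate energy:

```python
import itertools

ELEMENTS = ["Na", "K", "Cl", "Br"]   # toy action space

def surrogate_energy(composition):
    """Hand-written stand-in for a GNN surrogate energy model:
    favor compositions with exactly one cation and one anion."""
    cations = sum(e in ("Na", "K") for e in composition)
    anions = sum(e in ("Cl", "Br") for e in composition)
    return -1.0 if (cations, anions) == (1, 1) else 0.0

# Enumerate every two-action sequence and keep the lowest-energy candidate
candidates = itertools.product(ELEMENTS, repeat=2)
best = min(candidates, key=surrogate_energy)
```

The design choice that matters here is the separation between the action space (which structures are reachable), the policy (which action to take), and the surrogate objective (how candidates are scored).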


Tutorial DS02-1: An Introduction to End-to-End Differentiable Atomistic Simulations with JAX MD

Sunday, November 27, 2022
1:30 PM - 5:00 PM
Hynes, Level 2, Room 206

The high-level goal of this tutorial is to introduce researchers to differentiable programming and to demonstrate how differentiable simulations can open qualitatively new avenues for research in materials science. 

As described in the title, we will use JAX MD, a novel software library for differentiable molecular dynamics, as a platform for the tutorial. The entire tutorial will take the form of Jupyter notebooks (hosted on Google Colab) that will allow participants to follow along interactively.

The tutorial has several specific goals:

  1. Introduce Automatic Differentiation (AD) to materials science researchers who may not be familiar with it. This will be done in the context of JAX. 
  2. Show how AD can make existing simulations simpler to express. For example, show that we can compute forces, stresses, elastic moduli, and phonon spectra automatically from knowledge of the energy function. 
  3. Show how point 2. makes it easy to integrate sophisticated neural networks into traditional simulations. 
  4. Show how to construct more exotic experiments that optimize materials properties by directly differentiating through simulations, and highlight the risks of this approach.
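Point 2 can be made concrete with a minimal forward-mode AD sketch: once the energy is written in a differentiable way, the force F = -dE/dx comes for free. The Dual class below is a toy illustration of the mechanism, not JAX itself; in the tutorial this is what jax.grad provides, far more generally:

```python
class Dual:
    """A number carrying a value and its derivative (forward-mode AD)."""
    def __init__(self, val, grad=0.0):
        self.val, self.grad = val, grad

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.grad + other.grad)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.grad + self.grad * other.val)  # product rule
    __rmul__ = __mul__

def energy(x):
    """Harmonic potential E = 0.5 * k * x**2 with k = 3."""
    return 0.5 * 3.0 * x * x

def force(x):
    e = energy(Dual(x, 1.0))   # seed the derivative dx/dx = 1
    return -e.grad             # F = -dE/dx

f = force(2.0)   # analytic answer: -k * x = -6.0
```

Note that `force` never touches the algebraic form of the energy; any differentiable energy function, including a neural network, would work unchanged.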

 

Introduction

Samuel Schoenholz, Google Research

The introductory portion of the tutorial will describe AD and how it can be used to change how we think about atomistic simulations. It will include a short introduction to JAX and then it will introduce JAX MD.


Physical Quantities Made Easy

Carl Goodrich, IST Austria
The next part of the tutorial will show how many quantities can be computed efficiently using AD by taking derivatives of the Hamiltonian. This will include forces, stress and pressure, elastic constants, and phonon spectra.
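One of those quantities can be illustrated with a toy calculation: the curvature of the energy at a minimum sets the vibrational (phonon) frequency, omega = sqrt(E''/m). The sketch below uses a finite difference for the second derivative; in JAX MD one would instead differentiate the energy function twice with AD:

```python
import math

def energy(x, k=4.0):
    """Harmonic well with spring constant k; minimum at x = 0."""
    return 0.5 * k * x * x

def second_derivative(f, x, h=1e-4):
    """Central finite-difference estimate of f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2

# omega = sqrt(k / m); with k = 4 and m = 1 this gives 2
mass = 1.0
omega = math.sqrt(second_derivative(energy, 0.0) / mass)
```

For a real crystal the same idea applies to the full Hessian of the energy with respect to all atomic positions, whose eigenvalues give the phonon spectrum.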

 

Neural Network Potentials

Amil Merchant, Stanford University

We will show how easy it is to combine state-of-the-art neural networks with atomistic simulations when everything has been built to support AD from the ground up. This will involve instantiating and beginning to train a state-of-the-art equivariant graph neural network. After this, we will demonstrate the use of this network in several practical settings.
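At its very simplest, a neural network potential is just a small network mapping geometry to energy. The sketch below uses a fixed-weight 1-2-1 multilayer perceptron on a single interatomic distance; the weights are made-up numbers, and real models (such as the equivariant GNN used in this section) act on full atomic graphs and are trained on reference data:

```python
import math

# Hand-picked weights for a tiny 1-2-1 network (a trained model learns these)
W1, B1 = [1.0, -1.0], [0.0, 1.0]
W2, B2 = [0.5, 0.5], -0.5

def nn_energy(r):
    """Energy of a single pair distance r under the toy MLP potential."""
    hidden = [math.tanh(w * r + b) for w, b in zip(W1, B1)]
    return sum(w * h for w, h in zip(W2, hidden)) + B2

e = nn_energy(1.0)
```

Because the whole expression is differentiable, forces follow directly from AD, which is what makes such models drop-in replacements for classical potentials in a differentiable simulator.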

Composability and Extensible Simulations

Carl Goodrich, IST Austria

To prepare for the final section of the tutorial on meta-optimization, we will see how primitive operations in molecular dynamics can be composed with JAX’s automatic vectorization to produce a wide range of simulation environments and tools. In particular, we will go through the construction of simulations with temperature gradients and the nudged elastic band method for identifying saddle points between optima.
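The vectorization half of this idea can be caricatured in plain Python: write the single-system operation once, then lift it over a batch. The helper below is a hypothetical stand-in for jax.vmap, which performs this batching without the explicit loop (and compiles it efficiently):

```python
def cool_step(temperature, rate=0.5):
    """One step of a toy annealing schedule for a single system."""
    return temperature * (1.0 - rate)

def vectorize(f):
    """Poor man's vmap: lift a single-system function to a batch of systems."""
    return lambda batch: [f(x) for x in batch]

batched_step = vectorize(cool_step)
temps = batched_step([300.0, 400.0, 500.0])   # [150.0, 200.0, 250.0]
```

The payoff of composing primitives this way is that the same single-system code serves, unchanged, as the building block for replicas at different temperatures or for the images of an elastic band.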

 

Meta Optimization

Ella King, Harvard University


The final session of the day will focus on optimization through simulation environments. Here we will show how to use JAX’s automatic differentiation to perform gradient-based optimization through several standard molecular dynamics techniques, such as Langevin dynamics and phonon spectra calculations.
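The key trick of this session, differentiating through a rolled-out simulation, can be sketched by carrying a sensitivity through each step by hand. Here a noise-free overdamped step toward a spring's rest position x0 stands in for the Langevin dynamics mentioned above, and the loop accumulates d(x_final)/d(x0) via the chain rule, which is exactly what jax.grad automates:

```python
def simulate_with_sensitivity(x, x0, dt=0.1, steps=50):
    """Unroll x_{t+1} = x_t - dt * (x_t - x0) and track d(x)/d(x0)."""
    dx_dx0 = 0.0   # sensitivity of the state w.r.t. the parameter x0
    for _ in range(steps):
        x = x - dt * (x - x0)
        dx_dx0 = dx_dx0 - dt * (dx_dx0 - 1.0)   # chain rule, step by step
    return x, dx_dx0

x_final, grad = simulate_with_sensitivity(x=5.0, x0=0.0)
```

With this gradient in hand, one can run gradient descent on x0 (or any other simulation parameter) to steer the simulated outcome toward a target, which is the essence of meta-optimization.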

 

 

Tutorial Schedule

1:30 pm
Introduction
Samuel Schoenholz, Google Research

2:30 pm
Physical Quantities Made Easy
Carl Goodrich, IST Austria

3:00 pm
Break

3:30 pm
Neural Network Potentials
Amil Merchant, Stanford University

4:00 pm
Composability and Extensible Simulations
Carl Goodrich, IST Austria

4:30 pm 
Meta Optimization
Ella King, Harvard University