Instructor: R. Stanley Williams, Texas A&M University
The goal of this tutorial is to provide an overview of recent advances in brain-inspired, or neuromorphic, computing, with an emphasis on the continuing importance of discovering new materials and improving existing ones to perform the fundamental synaptic and neuronal operations used for computation.
- Historical overview and comparison of the different types of neural networks and computing paradigms that have been proposed and demonstrated based on inspiration from neurophysiology and psychology.
- Introduction to nonlinear dynamics and the concepts of local activity and edge of chaos.
- Key descriptors of dynamical materials and their application to neuromorphic computing.
- Discussion of how advances in materials design can be translated into new devices and circuits.
R. Stanley Williams, Texas A&M University
With the saturation of Moore’s-law scaling of transistors, there has been an explosion of activity and creativity in the search for new modes of computation that will continue to scale exponentially with time, even as transistor circuits now improve only modestly. Much of the inspiration for new ways of computing comes from what little we understand about the brain. We do not actually know how the brain computes, but many possibilities have been proposed: multinary logic (ternary and higher), neural networks of all kinds, extensions of Hebbian learning via spike-timing-dependent plasticity, Boltzmann/Ising machines, Hopfield networks, and Bayesian inference with Markov chains, to name a few. These possibilities are not necessarily mutually exclusive; the brain may use some combination of them, or even a higher-order generalization that contains several of them, since many share mathematical similarities.

How to express these computational modes in hardware is a significant challenge. Since the brain itself is a highly nonlinear dynamical system, an appropriate focus for research is nonlinear dynamical circuit theory. This is the realm of the Principle of Local Activity, which provides a basis for understanding and building new generations of neuron-like amplifiers and chaotic oscillators, and for designing circuits biased at the Edge of Chaos, where complexity and emergent behavior are found.

What new types of devices will be used to construct these circuits? Can we emulate, or even surpass, neural data processing and computation using new types of dynamical electronic-ionic-thermal devices that express similar behavior? Finally, what materials will we use to build these new devices and incorporate them into existing commercial integrated-circuit foundries? I will present a brief historical survey of brain-inspired computation that begins with research in neurophysiology and psychology in the 1920s.
I will show what constitutes the present state of the art, and how primitive it actually is compared to the human brain, despite the hype surrounding current machine learning. I will describe some major opportunities for new computing paradigms based on emerging hardware and electronic devices such as memristors, which have the potential for many orders of magnitude improvement in time to solution and energy consumption compared with purely digital systems. I will also highlight some of the most exciting research, and the groups performing it, in laboratories around the world.
9:45 am BREAK