Incorporating Symmetries into Deep Learning Models (two papers)
Ian DesJardin - UMD
Board & Brew (Offsite)
Thursday, December 9, 2021, 5:00-6:00 pm
Abstract

For our last meeting of the year, we're focusing on two papers: Incorporating Symmetry into Deep Dynamics Models for Improved Generalization (Wang et al., 2021) and A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups (Finzi et al., 2021).

--

Paper 1: Incorporating Symmetry into Deep Dynamics Models for Improved Generalization by Wang et al. (2021)

URL 1: https://arxiv.org/abs/2002.03061

Paper 1 Abstract: "Recent work has shown deep learning can accelerate the prediction of physical dynamics relative to numerical solvers. However, limited physical accuracy and an inability to generalize under distributional shift limit its applicability to the real world. We propose to improve accuracy and generalization by incorporating symmetries into convolutional neural networks. Specifically, we employ a variety of methods each tailored to enforce a different symmetry. Our models are both theoretically and experimentally robust to distributional shift by symmetry group transformations and enjoy favorable sample complexity. We demonstrate the advantage of our approach on a variety of physical dynamics including Rayleigh-Bénard convection and real-world ocean currents and temperatures. Compared with image or text applications, our work is a significant step towards applying equivariant neural networks to high-dimensional systems with complex dynamics. We open-source our simulation, data, and code at this https URL."
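The symmetry property the paper builds on can be checked numerically before the meeting: a convolution commutes with translations, so shifting the input and then convolving gives the same result as convolving and then shifting. A minimal NumPy sketch (the circular convolution and its name are ours, not code from the paper):

```python
import numpy as np

def circular_conv1d(x, k):
    """Circular cross-correlation of signal x with kernel k.

    Circular boundary conditions make the operation exactly
    translation-equivariant, with no edge effects to worry about.
    """
    n = len(x)
    return np.array(
        [sum(x[(i + j) % n] * k[j] for j in range(len(k))) for i in range(n)]
    )

rng = np.random.default_rng(0)
x = rng.random(8)
k = np.array([1.0, -2.0, 1.0])
shift = 3

# Equivariance: conv(shift(x)) == shift(conv(x))
lhs = circular_conv1d(np.roll(x, shift), k)
rhs = np.roll(circular_conv1d(x, shift=0 * 0 + 0) if False else circular_conv1d(x, k), shift)
assert np.allclose(lhs, rhs)
```

The paper enforces further symmetries (rotation, scaling, uniform motion) with layers designed so that an analogous commutation identity holds for each group.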

--

Paper 2: A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups by Finzi et al. (2021)

URL 2: https://arxiv.org/abs/2104.09459

Paper 2 Abstract: "Symmetries and equivariance are fundamental to the generalization of neural networks on domains such as images, graphs, and point clouds. Existing work has primarily focused on a small number of groups, such as the translation, rotation, and permutation groups. In this work we provide a completely general algorithm for solving for the equivariant layers of matrix groups. In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before, including O(1,3), O(5), Sp(n), and the Rubik's cube group. Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems. We release our software library to enable researchers to construct equivariant layers for arbitrary matrix groups."
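The core idea, solving a linear constraint for the equivariant layers of a matrix group, can be sketched for the small permutation group S_3. A linear map W is equivariant iff PW = WP for every group element P, which vectorizes to (P ⊗ P − I) vec(W) = 0; stacking these constraints and taking the null space by SVD recovers the classical two-dimensional space of permutation-equivariant maps, span{I, 11ᵀ}. This is only an illustration of the general recipe, not the authors' released library:

```python
import numpy as np
from itertools import permutations

n = 3
# Permutation-matrix representation of every element of S_3
# (the generators alone would suffice, as the paper notes).
group = [np.eye(n)[list(p)] for p in permutations(range(n))]

# Equivariance P W = W P  <=>  (P ⊗ P - I) vec(W) = 0,
# using vec(A X B) = (Bᵀ ⊗ A) vec(X) with B = Pᵀ = P⁻¹.
C = np.vstack([np.kron(P, P) - np.eye(n * n) for P in group])

# Null space of the stacked constraint matrix via SVD.
_, s, Vt = np.linalg.svd(C)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]  # rows span the space of equivariant vec(W)

# S_n-equivariant linear maps on R^n form a 2-dim space: span{I, 11ᵀ}
assert null_basis.shape[0] == 2
```

The paper generalizes exactly this computation: it solves the analogous constraint efficiently for arbitrary matrix groups and representations, including infinite groups handled through their Lie algebra generators.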

For more information and our full schedule, see our website (https://leesharma.com/physics-ai-reading-group/)

This talk is organized by Lee Sharma