PhD Proposal: A Reconciliation of Deep Learning and the Brain: Towards Hybrid Biologically-Augmented Recurrent Neural Networks for Temporal Sensory Perception
Matthew Evanusa
Remote
Monday, December 7, 2020, 2:30-4:30 pm
Abstract
In 1958, Frank Rosenblatt conceived of the Perceptron in an effort to fulfill the dream of connectionism: to explain and recreate brain phenomena such as learning and behavior through simple learning rules applied to simple neurons. After his tragic death, the A.I. winter, and the resurgence that followed, his more brain-focused network was distilled into the standardized feed-forward deep multi-layer perceptrons, or deep artificial neural networks, that we are more familiar with today. However, even in proposing the perceptron, Rosenblatt hinted that hierarchical and temporal information was what really mattered: an intuitively clear point, as all the data we experience arrives in the temporal domain. Backpropagation continues to dominate, although it appears ill-suited to recurrent network training. This is reinforced by the fact that backpropagation-trained feed-forward Transformer networks outperform RNNs on temporal tasks, causing RNNs to lose favor in the ML and AI communities for temporal data. Reservoir computing, a type of recurrent neural network that keeps random recurrent connections fixed and trains only a readout layer, avoids the pitfalls of backpropagation through recurrence while showing strong performance, but needs further development, especially in hierarchical or deep variants, to compete with the state of the art.
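The reservoir computing idea described above can be sketched in a few lines: drive a fixed random recurrent network with the input, then fit only a linear readout on its states. The minimal echo state network below is an illustrative sketch, not the architecture from the talk; the sizes, spectral-radius scaling, ridge penalty, and the sine-prediction toy task are all assumptions chosen for clarity.

```python
# Minimal echo state network (reservoir computing) sketch.
# The reservoir weights are random and never trained; only the
# linear readout W_out is fit (here via ridge regression).
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

# Fixed random input and recurrent weights.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Scale the recurrent matrix so its spectral radius is below 1,
# a common heuristic for the "echo state" property.
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u_seq):
    """Drive the reservoir with an input sequence; collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next value of a sine wave.
t = np.linspace(0, 8 * np.pi, 400)
u, y = np.sin(t[:-1]), np.sin(t[1:])

X = run_reservoir(u)
# Train only the readout: ridge regression on the collected states.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
print("train MSE:", np.mean((pred - y) ** 2))
```

Because only the readout is trained, fitting reduces to a single linear solve, which is what lets reservoir methods sidestep backpropagation through time entirely.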

My proposed dissertation aims to pick up where the perceptron left off, in motivation and spirit, continuing to look to the brain to construct networks in the connectionist philosophy. Believing that recurrent connections remain a powerful tool for temporal learning, and looking to the biology, I will propose a new class of recurrent neural networks, which I call B-RNNs, short for Biologically-Augmented RNNs. These build on the success of the reservoir computing paradigm but go further by incorporating reservoirs into new hybrid hierarchical architectures trained with backpropagation alternatives and deep reinforcement learning, and by drawing on new insights and findings from neuroscience. The B-RNN nomenclature will also serve as a taxonomical umbrella for these networks. I will show in completed work that even simple hybridizations can beat deep LSTMs and GRUs at complex temporal classification tasks, and I will propose several more complex B-RNNs in development and beyond. I will also lay out a framework for how B-RNNs can serve as a standardization for spiking neural network architectures.

Examining Committee:

Chair:       Dr. Yiannis Aloimonos
Dept rep:    Dr. James Reggia
Members:     Dr. Cornelia Fermüller
             Dr. Michelle Girvan
             Dr. Daniel Butts
Bio

Matthew Evanusa is a PhD student in Computer Science, advised by Yiannis Aloimonos and co-advised by Cornelia Fermüller. He is a fellow of the NSF-funded COMBINE program at UMD. His interests sit at the intersection of A.I. and neuroscience: developing brain-inspired artificial neural networks and learning mechanisms for sensory perception, and finding the bridge between deep learning and brain learning. His work involves reservoir computing, spiking neural networks, deep reinforcement learning, and neuromorphic computing. He is also a retired conductor of the UMD Gamer Symphony Orchestra, for which he arranged pieces as well, and a member of the Terp Wushu club.

This talk is organized by Tom Hurst