Tensor Methods: A new paradigm for training probabilistic models and neural networks
Monday, October 26, 2015, 4:00-5:00 pm
Abstract

Tensors are rich structures for modeling complex higher-order relationships in data-rich domains such as social networks, computer vision, the internet of things, and so on. Tensor decomposition methods are embarrassingly parallel and scalable to enormous datasets. They are guaranteed to converge to the global optimum and yield consistent estimates for many probabilistic models, such as topic models, community models, and hidden Markov models. I will also demonstrate how tensor methods can yield rich discriminative features for classification tasks and provide a guaranteed method for training neural networks.
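As a rough illustration of the kind of method the abstract refers to, the sketch below (an assumption on my part, not material from the talk) runs the tensor power iteration on a synthetic orthogonally decomposable symmetric third-order tensor T = Σᵢ wᵢ aᵢ⊗aᵢ⊗aᵢ. For such tensors the iteration provably recovers the components, which is the sense in which these decompositions reach a global optimum:

```python
import numpy as np

# Hypothetical minimal sketch: recover one component of an orthogonally
# decomposable symmetric 3rd-order tensor via the tensor power method.
rng = np.random.default_rng(0)
d, k = 5, 3
A, _ = np.linalg.qr(rng.standard_normal((d, d)))
A = A[:, :k]                      # orthonormal components a_1 .. a_k
w = np.array([3.0, 2.0, 1.0])     # positive component weights

# Build T = sum_i w_i * a_i (x) a_i (x) a_i
T = np.einsum('i,ai,bi,ci->abc', w, A, A, A)

# Tensor power iteration: v <- T(I, v, v) / ||T(I, v, v)||
v = rng.standard_normal(d)
v /= np.linalg.norm(v)
for _ in range(100):
    v = np.einsum('abc,b,c->a', T, v, v)
    v /= np.linalg.norm(v)

# v converges (quadratically, near a fixed point) to one of the a_i;
# the corresponding weight is recovered as T(v, v, v).
lam = np.einsum('abc,a,b,c->', T, v, v, v)
match = np.max(np.abs(A.T @ v))   # |cosine| with best-matching component
```

In practice one deflates (subtracts the recovered rank-one term) and repeats to extract the remaining components; for latent variable models the tensor T itself is estimated from low-order moments of the data.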

Bio

Anima Anandkumar has been a faculty member in the EECS Department at U.C. Irvine since August 2010. Her research interests are in the areas of large-scale machine learning and high-dimensional statistics. She received her B.Tech in Electrical Engineering from IIT Madras in 2004 and her PhD from Cornell University in 2009. She was a visiting faculty member at Microsoft Research New England in 2012 and a postdoctoral researcher in the Stochastic Systems Group at MIT from 2009 to 2010. She is the recipient of the Alfred P. Sloan Fellowship, the Microsoft Faculty Fellowship, AFOSR and ARO Young Investigator Awards, the NSF CAREER Award, the IBM Fran Allen PhD Fellowship, the Best Thesis Award from the ACM SIGMETRICS society, and paper awards from the ACM SIGMETRICS and IEEE Signal Processing societies.

This talk is organized by Adelaide Findlay.