Effectiveness of Local Search in Machine Learning
Tuesday, March 6, 2018, 11:00 am-12:00 pm

Many modern machine learning problems are solved using complex over-parametrized models on large datasets and require scalable algorithms with low computation and memory complexity. This often leads to the use of non-convex methods, which do not necessarily have good computational or statistical foundations.

In this talk, we will explore the role of simple local search algorithms, such as stochastic gradient descent, in optimizing non-convex functions, and leverage this to design efficient algorithms for a large and important class of machine learning problems: those involving the fitting of low-rank matrices to data. We will then discuss a surprising role these local search methods play in implicitly regularizing the complexity of over-parametrized models, in problems involving both low-rank matrices and deep neural networks.
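The low-rank matrix fitting problem mentioned above can be illustrated with a minimal sketch: running plain stochastic gradient descent on a factorized (hence non-convex) objective to complete a matrix from a subset of its entries. The dimensions, rank, step size, and iteration counts below are illustrative choices, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth rank-r matrix M = U* V*^T, with only some entries observed.
n, m, r = 30, 20, 2
U_star = rng.normal(size=(n, r))
V_star = rng.normal(size=(m, r))
M = U_star @ V_star.T

# Observe a random subset of entries (the matrix-completion setting).
mask = rng.random((n, m)) < 0.5
obs = np.argwhere(mask)  # array of observed (i, j) index pairs

# Factorized model: minimize sum over observed (i, j) of (u_i . v_j - M_ij)^2.
# This objective is non-convex in (U, V), yet simple SGD finds a good fit.
U = 0.1 * rng.normal(size=(n, r))
V = 0.1 * rng.normal(size=(m, r))
lr = 0.02
for epoch in range(300):
    rng.shuffle(obs)  # shuffle rows: visit observed entries in random order
    for i, j in obs:
        err = U[i] @ V[j] - M[i, j]
        U[i], V[j] = U[i] - lr * err * V[j], V[j] - lr * err * U[i]

# Root-mean-square error on the observed entries.
train_err = np.sqrt(np.mean([(U[i] @ V[j] - M[i, j]) ** 2 for i, j in obs]))
```

With enough observed entries relative to the rank, the recovered factors typically fit the data closely despite the non-convexity, which is the phenomenon the talk examines.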

Srinadh Bhojanapalli is currently a research assistant professor at the Toyota Technological Institute at Chicago. He obtained his Ph.D. in Electrical and Computer Engineering from The University of Texas at Austin in 2015. He has spent summers as an intern at Microsoft Research and eBay Research Labs.

His research is primarily focused on designing statistically efficient algorithms for large-scale machine learning problems. He is interested in matrix and tensor factorization, non-convex optimization, neural networks, and sub-linear-time algorithms.

This talk is organized by Brandi Adams.