PhD Proposal: Quantum algorithms for machine learning and optimization
Tongyang Li
Wednesday, April 11, 2018, 11:00 am-1:00 pm
Abstract

The theories of machine learning and optimization answer foundational questions in computer science and lead to new algorithms for practical applications. While these topics have been extensively studied in the context of classical computing, their quantum counterparts are far from well-understood. For my thesis, I propose to study quantum time and sample complexities of learning and optimization, focusing on three topics: learning distributions, convex optimization, and spectral analysis.

Learning distributions aims to estimate properties or latent variables of unknown distributions. These are fundamental questions in machine learning, statistics, and information theory, given that much of science relies on samples furnished by nature. I propose to develop a better understanding of quantum algorithms for learning discrete distributions. In particular, tight bounds on the number of samples needed to estimate Shannon and Rényi entropies have been established in the classical setting. We gave the first quantum algorithms for estimating entropies, with up to a quadratic speedup in the number of samples. In the future, I plan to study quantum algorithms for other properties, as well as quantum algorithms for learning mixtures of discrete distributions.
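As a point of reference for the entropy-estimation task above, here is the standard classical plug-in estimator: estimate each probability by its empirical frequency and evaluate the entropy formula on the estimate. This is only the classical baseline that the sample-complexity bounds improve upon, not the quantum algorithm itself.

```python
import math
from collections import Counter

def empirical_entropy(samples):
    """Plug-in estimate of Shannon entropy H(p) = -sum_i p_i log p_i
    (in nats) from i.i.d. samples of a discrete distribution."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# A fair coin has entropy log 2 ≈ 0.693 nats.
samples = [0, 1] * 5000
print(empirical_entropy(samples))  # → 0.693... (log 2 exactly, since counts are balanced)
```

The plug-in estimator is biased for small sample sizes; the classical tight bounds mentioned above come from more refined estimators, and the quantum algorithms assume quantum access to the sampling process.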

Convex optimization has been a central topic in theoretical computer science and operations research, given that convex problems admit polynomial-time solvers. I propose to understand quantum speedups for convex optimization. In particular, for low-rank semidefinite programs (SDPs), we gave a quantum algorithm whose running time is only poly-logarithmic in the dimension, an exponential speedup over classical algorithms. In the future, I propose to study applications of our quantum algorithm for solving SDPs, and I also plan to find quantum algorithms for solving general convex optimization problems.
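To make the low-rank SDP setting concrete, here is a minimal worked instance (my own illustrative example, not from the proposal): maximize Tr(CX) subject to Tr(X) = 1 and X positive semidefinite. Its optimum is the largest eigenvalue of C, attained at the rank-1 matrix X = vvᵀ for the leading eigenvector v, which is exactly the low-rank regime where the quantum speedup applies.

```python
import numpy as np

# Tiny SDP: max Tr(C X)  s.t.  Tr(X) = 1,  X PSD.
# The optimum equals the largest eigenvalue of C, attained at the
# rank-1 solution X = v v^T for the leading eigenvector v.
C = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # eigenvalues 1 and 3

eigvals, eigvecs = np.linalg.eigh(C)  # ascending eigenvalue order
v = eigvecs[:, -1]                    # leading eigenvector
X = np.outer(v, v)                    # rank-1 feasible point, Tr(X) = 1

print(np.trace(C @ X))  # → 3.0, the top eigenvalue of C
```

Here the optimal solution is computed in closed form via eigendecomposition; general SDP solvers (classical or quantum) handle arbitrary linear constraints, where no such closed form exists.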

Spectral analysis aims at understanding the spectrum of data. In particular, principal component analysis finds the subspace with the largest variance in the data, a ubiquitous task in unsupervised learning, feature generation, and data visualization. Inspired by the power method, we gave a quantum algorithm that finds the leading eigenvector of a sparse matrix with an exponential speedup in the dimension compared to classical algorithms. We also proved that the exponential speedup still holds if the data are given online. In the future, I propose to improve the time complexity of our algorithms using more advanced optimization techniques such as the stochastic variance reduced gradient (SVRG) method. I also plan to study quantum algorithms for higher-order spectral analysis, i.e., efficient quantum algorithms for tensor decomposition.
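The classical power method that inspired the quantum algorithm above can be sketched in a few lines: repeatedly apply the matrix to a random vector and renormalize, so that the component along the leading eigenvector dominates. This sketch is the classical baseline only; the quantum version replaces the matrix-vector products with quantum operations.

```python
import numpy as np

def power_method(A, iters=200, seed=0):
    """Classical power iteration: repeatedly apply A and normalize.
    Converges (up to sign) to the eigenvector of the eigenvalue
    with the largest absolute value, for a generic starting vector."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = A @ v
        v /= np.linalg.norm(v)
    return v

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # leading eigenvector (1, 1)/sqrt(2), eigenvalue 3
v = power_method(A)
print(v)  # close to ±(1/√2, 1/√2)
```

Each iteration costs one matrix-vector product, which is cheap for sparse matrices; the convergence rate depends on the gap between the top two eigenvalues.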

Examining Committee:

  Chair:      Dr. Andrew Childs
  Dept. rep:  Dr. Furong Huang
  Members:    Dr. Aravind Srinivasan
              Dr. Xiaodi Wu
This talk is organized by Tom Hurst