PhD Proposal: New Efficient Bilevel Optimization Algorithms and their Applications in Federated Learning
Junyi Li
Friday, August 23, 2024, 3:30-4:30 pm
Abstract

https://umd.zoom.us/j/5433752507?pwd=LgafNlRqlrP5JcjZoxeiZuFoFnoJVR.1

Bilevel optimization has witnessed notable progress recently, with new efficient algorithms emerging. Beyond computationally expensive double-loop algorithms, several single-loop algorithms that alternately optimize the inner and outer variables have recently been proposed. However, these algorithms are not yet fully single-loop, as they overlook the loop needed to evaluate the hyper-gradient for a given inner and outer state. Based on this observation, we propose a novel Hessian-inverse-free Fully Single Loop Algorithm (FSLA) for bilevel optimization problems: we first identify a general approximation formulation of hyper-gradient computation that encompasses several common previous approaches, and then introduce a new state variable to maintain historical hyper-gradient information. Combining this new formulation with the alternating update of the inner and outer variables yields FSLA. We theoretically show that our algorithm converges at a rate of $O(\epsilon^{-2})$. We next study bilevel optimization in the Federated Learning setting, where many problems exhibit a bilevel structure. Specifically, we define Federated Bilevel Optimization problems and propose a communication-efficient algorithm, named Comm-FedBiO. Since the high communication cost of hyper-gradient evaluation is a major bottleneck, we propose two communication-efficient subroutines to estimate the hyper-gradient. Convergence analysis of the proposed algorithms is also provided. Finally, we apply the proposed algorithms to solve the noisy-label problem in Federated Learning.
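
As a rough illustration of the fully single-loop idea described above, the sketch below alternates single gradient steps on the inner variable, on an auxiliary state variable that tracks the hyper-gradient linear system, and on the outer variable, applied to a toy quadratic bilevel problem. This is not the FSLA algorithm presented in the talk; the problem instance, step sizes, and variable names (A, H, b, v, alpha, eta, beta) are illustrative assumptions only.

import numpy as np

rng = np.random.default_rng(0)
d = 5
A = np.eye(d) + 0.1 * rng.standard_normal((d, d))  # well-conditioned coupling (assumed toy data)
H = A @ A.T + np.eye(d)                             # inner Hessian, positive definite
b = rng.standard_normal(d)

# Inner problem:  g(x, y) = 0.5 * y^T H y - y^T A x   =>   y*(x) = H^{-1} A x
# Outer problem:  f(x, y) = 0.5 * ||y - b||^2
# Hyper-gradient: grad_x f - grad_xy g @ (grad_yy g)^{-1} grad_y f = A^T H^{-1} (y - b)
x = np.zeros(d)
y = np.zeros(d)
v = np.zeros(d)          # state variable approximating (grad_yy g)^{-1} grad_y f
alpha, eta, beta = 0.3, 0.3, 0.1

for t in range(1000):
    y = y - alpha * (H @ y - A @ x)     # one gradient step on the inner variable
    v = v - eta * (H @ v - (y - b))     # one step toward solving H v = grad_y f
    hypergrad = A.T @ v                 # here grad_x f = 0 and grad_xy g = -A^T
    x = x - beta * hypergrad            # one step on the outer variable

# Closed-form minimizer of F(x) = 0.5 * ||H^{-1} A x - b||^2, for comparison only
M = np.linalg.solve(H, A)
x_star = np.linalg.solve(M.T @ M, M.T @ b)
print("distance to x*:", np.linalg.norm(x - x_star))   # shrinks as the loop runs longer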


Bio

Junyi Li is a PhD student in the Department of Computer Science at the University of Maryland, College Park, advised by Dr. Heng Huang. His research focuses on developing efficient optimization algorithms for machine learning problems and providing theoretical guarantees for them. His research areas include, but are not limited to, nonconvex optimization, bilevel optimization, and federated learning. Junyi has published many papers at top-tier machine learning conferences, including NeurIPS, ICLR, and KDD.

This talk is organized by Migo Gui