PhD Defense: New Efficient Algorithms for Nested Machine Learning Problems
Junyi Li
Thursday, April 3, 2025, 2:00-4:00 pm
Abstract

In recent years, machine learning (ML) has achieved remarkable success by training large-scale models on vast datasets. However, building these models involves multiple interdependent tasks, such as data selection, hyperparameter tuning, and model architecture search. Optimizing these tasks jointly often leads to challenging nested objectives, where each task both influences and depends on the others. In this talk, I will start by formalizing nested ML problems as bilevel optimization tasks and presenting efficient algorithms, with theoretical guarantees, that solve them. Then, I will extend these ideas to the federated learning setting, examining how the algorithmic designs must be adapted to meet the challenges of that environment.
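To make the bilevel structure concrete, here is a minimal toy sketch (not taken from the talk; all names and the specific problem are illustrative). An outer variable `lam` (think of a hyperparameter) is optimized through the solution of an inner problem, using the chain rule through the inner minimizer, which is the basic pattern behind hypergradient methods:

```python
def inner_solution(lam):
    # Inner problem: w*(lam) = argmin_w (w - lam)^2 + w^2,
    # which has the closed-form minimizer w* = lam / 2.
    return lam / 2.0

def outer_loss(lam):
    # Outer objective evaluated at the inner minimizer: F(lam) = (w*(lam) - 1)^2.
    return (inner_solution(lam) - 1.0) ** 2

def hypergradient(lam):
    # Chain rule through the inner solution:
    # dF/dlam = 2 * (w*(lam) - 1) * dw*/dlam, with dw*/dlam = 1/2.
    return 2.0 * (inner_solution(lam) - 1.0) * 0.5

# Gradient descent on the outer variable only; the inner problem is
# re-solved (here, analytically) at every step.
lam = 0.0
for _ in range(200):
    lam -= 0.5 * hypergradient(lam)

# lam converges to 2, where w*(lam) = 1 and the outer loss is 0.
```

In realistic ML settings the inner problem has no closed form, so practical bilevel algorithms replace `inner_solution` with a few gradient steps and estimate the hypergradient approximately; designing such approximations with convergence guarantees is the kind of question the talk addresses.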

Bio

Junyi Li is currently a PhD candidate in the Department of Computer Science at the University of Maryland, College Park, advised by Prof. Heng Huang. His research focuses on developing machine learning models and algorithms with theoretical foundations, encompassing areas such as federated learning, foundation models, artificial general intelligence (AGI), large-scale distributed optimization, trustworthy AI, and efficient machine learning. Junyi's research has resulted in many papers in top-tier machine learning and AI venues, including NeurIPS, ICML, ICLR, KDD, AAAI, CVPR, and NAACL.

This talk is organized by Migo Gui.