PhD Proposal: Submodular Optimization in Learning
Samira Goudarzi
IRB-5105 or https://umd.zoom.us/j/2456753152?pwd=cGVhd3R1cWRHUUxQYTVid2hGL2poQT09
Tuesday, June 17, 2025, 12:30-2:00 pm
Abstract

Submodular optimization is a well-established area of combinatorial optimization and a powerful, widely used framework underpinning many foundational problems in machine learning and data science.
A set function is said to be submodular if it satisfies the diminishing returns property, which captures the intuitive idea that the benefit of adding an element to a set decreases as the set grows.
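The diminishing-returns property can be checked concretely. A minimal sketch, assuming a toy coverage function f(S) = |union of the sets indexed by S| (coverage functions are a standard example of submodular functions; the particular sets below are made up for illustration):

```python
def coverage(ground_sets, S):
    """f(S): number of universe items covered by the sets chosen in S."""
    covered = set()
    for i in S:
        covered |= ground_sets[i]
    return len(covered)

# Hypothetical ground set: three indexed subsets of a small universe.
ground_sets = {0: {1, 2, 3}, 1: {3, 4}, 2: {4, 5, 6}}

A = {0}        # smaller set
B = {0, 1}     # a superset of A
e = 2          # element to add to each

gain_A = coverage(ground_sets, A | {e}) - coverage(ground_sets, A)
gain_B = coverage(ground_sets, B | {e}) - coverage(ground_sets, B)

# Diminishing returns: adding e to the smaller set helps at least as much.
assert gain_A >= gain_B
print(gain_A, gain_B)  # → 3 2
```

Here e covers {4, 5, 6}; B already covers 4 through its second set, so the marginal gain of e shrinks as the set grows, exactly as the definition requires.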


This property enables the design of efficient approximation algorithms for maximizing or minimizing such functions under various constraints, despite the underlying problems often being NP-hard.
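One classic illustration of such an approximation algorithm is the greedy method of Nemhauser, Wolsey, and Fisher (1978), which achieves a (1 - 1/e)-approximation for maximizing a monotone submodular function under a cardinality constraint. A minimal sketch, using a toy coverage objective chosen only for illustration:

```python
def greedy_max(f, ground, k):
    """Greedily add the element with the largest marginal gain, k times."""
    S = set()
    for _ in range(k):
        best, best_gain = None, 0
        for e in ground - S:
            gain = f(S | {e}) - f(S)
            if gain > best_gain:
                best, best_gain = e, gain
        if best is None:  # no remaining element adds value
            break
        S.add(best)
    return S

# Hypothetical instance: f(S) counts universe items covered by chosen sets.
sets = {0: {1, 2}, 1: {2, 3, 4}, 2: {4, 5}, 3: {1, 5}}
f = lambda S: len(set().union(*(sets[i] for i in S))) if S else 0

print(greedy_max(f, set(sets), 2))  # → {1, 3}
```

The greedy rule first picks set 1 (covering three items), then set 3 (marginal gain two), reaching a solution of value 5; submodularity is what guarantees this simple rule is never worse than (1 - 1/e) of the optimum.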


Submodular functions naturally arise in a wide range of applications across machine learning, economics, operations research, and network science. In machine learning, they are central to tasks such as data subset selection, data summarization, active learning, feature selection, diversity-aware recommendations, and influence maximization in networks.

Traditional algorithms for problems such as submodular maximization and submodular cover typically assume access to a static ground set. However, the growing prevalence of ever-evolving datasets has spurred significant interest in designing algorithms that operate effectively in large-scale and time-varying environments.


In this proposal, we investigate a range of optimization problems involving submodularity in the fully dynamic setting, where the elements of the ground set may be inserted or deleted over time. Specifically, we address key problems, including submodular maximization under a matroid constraint, weighted submodular cover, and non-monotone submodular maximization under a cardinality constraint, by developing dynamic algorithms that efficiently maintain high-quality approximate solutions as their input changes through a stream of updates.

Bio

Samira Goudarzi is a fourth-year Ph.D. student working under the supervision of Mohammad Hajiaghayi. Her research focuses on theoretical computer science, particularly on problems with applications in machine learning. Her work investigates the design of big data algorithms and the computational complexity of fundamental problems related to submodular optimization, replicability, and related topics.

This talk is organized by Migo Gui.