- Series
- ACO Student Seminar
- Time
- Friday, September 21, 2018 - 1:05pm for 1 hour (actually 50 minutes)
- Location
- Skiles 005
- Speaker
- Matthew Fahrbach – CS, Georgia Tech – matthew.fahrbach@gmail.com – https://www.cc.gatech.edu/~mfahrbac/
- Organizer
- He Guo
As a generalization of many classic problems in combinatorial optimization, submodular optimization has found a wide range of applications in machine learning (e.g., feature engineering and active learning). For many large-scale optimization problems, we are often concerned with the adaptivity complexity of an algorithm, which quantifies the number of sequential rounds in which polynomially many independent function evaluations can be executed in parallel. While low adaptivity is ideal, it is not sufficient for a (distributed) algorithm to be efficient, since in many practical applications of submodular optimization the cost of function evaluations becomes prohibitive. Motivated by such applications, we study the adaptivity and query complexity of submodular optimization.
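To make the two complexity measures concrete, below is a minimal sketch of the classic greedy algorithm of Nemhauser, Wolsey, and Fisher, which achieves the $(1-1/e)$-approximation but needs $k$ adaptive rounds and $O(nk)$ oracle calls. The oracle `f` is a hypothetical placeholder for a monotone submodular set function, and `ground_set` is assumed to be a Python set with at least $k$ elements.

```python
def greedy(f, ground_set, k):
    """Classic greedy: repeatedly add the element with the best marginal gain.

    Each loop iteration is one adaptive round: the marginal-gain queries
    inside it are independent and can run in parallel, but the k rounds
    themselves are sequential. Total: k adaptive rounds, O(n * k) queries.
    """
    S = set()
    for _ in range(k):
        # One adaptive round: all of these oracle queries are independent.
        base = f(S)
        gains = {x: f(S | {x}) - base for x in ground_set - S}
        S.add(max(gains, key=gains.get))
    return S
```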
Our main result is a distributed algorithm for maximizing a monotone submodular function subject to a cardinality constraint $k$ that achieves a $(1-1/e-\varepsilon)$-approximation in expectation. Furthermore, this algorithm runs in $O(\log(n))$ adaptive rounds and makes $O(n)$ calls to the function evaluation oracle in expectation. All three of these guarantees are optimal, and the query complexity is substantially lower than in previous works. Finally, to show the generality of our simple algorithm and techniques, we extend our results to the submodular cover problem.
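For intuition, here is a simplified sketch of the general low-adaptivity pattern behind results of this kind: a geometrically decreasing threshold, with one parallel filtering round per step. This is an illustrative sketch under assumed placeholders (`f` is a monotone submodular oracle, `V` the ground set as a Python set, `eps` an accuracy parameter), not the algorithm analyzed in the paper.

```python
import random

def threshold_sampling_sketch(f, V, k, eps=0.1):
    """Illustrative low-adaptivity pattern for monotone submodular maximization.

    Each while-iteration is one adaptive round: the marginal-gain queries
    inside it are independent and can run in parallel. The threshold tau
    decreases geometrically, so only logarithmically many distinct
    threshold values are ever used.
    """
    S = set()
    tau = max(f({x}) for x in V)  # one parallel round of n singleton queries
    while len(S) < k and tau > 1e-9:
        # One adaptive round: filter remaining candidates by marginal gain.
        base = f(S)
        candidates = [x for x in V - S if f(S | {x}) - base >= tau]
        if candidates:
            # Add a random batch of survivors, capped at the remaining budget.
            batch = random.sample(candidates, min(len(candidates), k - len(S)))
            S |= set(batch)
        else:
            tau *= 1 - eps  # no survivors: lower the threshold and retry
    return S
```

Note that this naive sketch re-evaluates all remaining candidates in every round, so it does not achieve the $O(n)$ total query bound; obtaining optimal adaptivity and query complexity simultaneously is the point of the result above.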
Joint work with Vahab Mirrokni and Morteza Zadimoghaddam (arXiv:1807.07889).