Differentiable Greedy Submodular Maximization: Guarantees, Gradient Estimators, and Applications
Motivated by, e.g., sensitivity analysis and end-to-end learning, the demand
for differentiable optimization algorithms has been significantly increasing.
In this paper, we establish a theoretically guaranteed versatile framework that
makes the greedy algorithm for monotone submodular function maximization
differentiable. We smooth the greedy algorithm via randomization, and prove
that it almost recovers original approximation guarantees in expectation for
the cases of cardinality and k-extensible system constraints. We also
show how to efficiently compute unbiased gradient estimators of any expected
output-dependent quantities. We demonstrate the usefulness of our framework by
instantiating it for various applications.
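The core idea of smoothing greedy via randomization can be illustrated with a minimal sketch: instead of deterministically picking the element with the largest marginal gain, sample each element with probability proportional to a softmax of the gains, which makes the expected output vary smoothly with the function's parameters. Note this is an assumed illustration, not the paper's exact scheme; the function `smoothed_greedy`, the `temperature` parameter, and the coverage example are all hypothetical choices for demonstration.

```python
import math
import random

def smoothed_greedy(ground_set, f, k, temperature=1.0, rng=None):
    """Randomized ("smoothed") greedy sketch for monotone submodular
    maximization under a cardinality constraint |S| <= k.

    At each step, rather than taking the argmax of the marginal gain
    (ordinary greedy), sample an element with probability proportional
    to exp(gain / temperature) -- a softmax relaxation.  As temperature
    approaches 0, this recovers the deterministic greedy choice.
    """
    rng = rng or random.Random(0)
    S = set()
    for _ in range(k):
        candidates = [e for e in ground_set if e not in S]
        if not candidates:
            break
        # Marginal gains f(S + e) - f(S) for each remaining element.
        gains = [f(S | {e}) - f(S) for e in candidates]
        weights = [math.exp(g / temperature) for g in gains]
        # Sample one candidate proportionally to its softmax weight.
        r = rng.random() * sum(weights)
        acc = 0.0
        for e, w in zip(candidates, weights):
            acc += w
            if r <= acc:
                S.add(e)
                break
    return S

# Hypothetical example: weighted coverage, a monotone submodular function.
sets = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"d"}}

def coverage(S):
    return len(set().union(*(sets[i] for i in S)) if S else set())

picked = smoothed_greedy(sets.keys(), coverage, k=2, temperature=0.1)
```

With a low temperature the sampled solution concentrates on high-gain elements, while the randomness is what makes expected quantities of the output differentiable with respect to the underlying gains.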