Small Errors in Random Zeroth Order Optimization are Imaginary
The vast majority of zeroth-order optimization methods try to imitate first-order methods via some smooth approximation of the gradient. Here, the smaller the smoothing parameter, the smaller the gradient approximation error. We show that for the majority of zeroth-order methods this smoothing parameter cannot, however, be chosen arbitrarily small, as numerical cancellation errors will dominate. As such, theoretical and numerical performance can differ significantly. Using classical tools from numerical differentiation, we propose a new smoothed approximation of the gradient that can be integrated into general zeroth-order algorithmic frameworks. Since the proposed smoothed approximation does not suffer from cancellation errors, the smoothing parameter (and hence the approximation error) can be made arbitrarily small. Sublinear convergence rates for algorithms based on our smoothed approximation are proved. Numerical experiments are also presented to demonstrate the superiority of algorithms based on the proposed approximation.
Comment: New: Figure 3.
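The cancellation phenomenon, and the complex-step remedy that the title alludes to, can be illustrated in a few lines. This is a standard numerical-differentiation demonstration under our own choice of test function, not the paper's actual gradient estimator: a forward difference subtracts two nearly equal function values, so once the step h nears machine precision the rounding error eps/h dominates, whereas the complex-step formula Im f(x + ih)/h involves no subtraction and tolerates arbitrarily small h (for real-analytic f).

```python
import numpy as np

def f(x):
    """Smooth test function (our choice) with known derivative cos(x) + 2x."""
    return np.sin(x) + x**2

def forward_diff(f, x, h):
    # Subtracting nearly equal values: cancellation dominates for tiny h.
    return (f(x + h) - f(x)) / h

def complex_step(f, x, h):
    # No subtraction, hence no cancellation; h can be arbitrarily small.
    return np.imag(f(x + 1j * h)) / h

x0 = 1.0
exact = np.cos(x0) + 2 * x0
for h in (1e-4, 1e-8, 1e-12, 1e-16):
    fd_err = abs(forward_diff(f, x0, h) - exact)
    cs_err = abs(complex_step(f, x0, h) - exact)
    print(f"h={h:.0e}  forward-diff err={fd_err:.2e}  complex-step err={cs_err:.2e}")
```

At h = 1e-16 the forward difference returns 0 exactly (x0 + h rounds back to x0), while the complex step stays accurate to machine precision, which is the gap between theoretical and numerical performance the abstract describes.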
Information-theoretic lower bounds for convex optimization with erroneous oracles
We consider the problem of optimizing convex and concave functions with access to an erroneous zeroth-order oracle. In particular, for a given function x → f(x) we consider optimization when one is given access to absolute error oracles that return values in [f(x) − ε, f(x) + ε], or relative error oracles that return values in [(1 − ε)f(x), (1 + ε)f(x)], for some ε > 0. We show stark information-theoretic impossibility results for minimizing convex functions and maximizing concave functions over polytopes in this model.
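The two oracle models from the abstract can be sketched as simple wrappers. Note the function, the noise bound, and the uniform noise distribution below are all illustrative choices; the impossibility results in the paper hold even against adversarially chosen errors, so random noise is only the friendliest instance of the model:

```python
import random

def absolute_error_oracle(f, eps, rng=random.Random(0)):
    # Answers lie in [f(x) - eps, f(x) + eps].
    return lambda x: f(x) + rng.uniform(-eps, eps)

def relative_error_oracle(f, eps, rng=random.Random(1)):
    # Answers lie in [(1 - eps) f(x), (1 + eps) f(x)].
    return lambda x: f(x) * (1 + rng.uniform(-eps, eps))

# Hypothetical convex objective for illustration.
f = lambda x: (x - 0.3)**2 + 1.0
noisy = absolute_error_oracle(f, eps=0.01)
vals = [noisy(0.3) for _ in range(5)]
assert all(abs(v - f(0.3)) <= 0.01 for v in vals)
```

The crux of the lower bounds is that even a tiny ε lets the oracle flip comparisons between near-equal function values, so an optimizer querying such an oracle cannot reliably distinguish near-optimal points.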
Informational Substitutes
We propose definitions of substitutes and complements for pieces of
information ("signals") in the context of a decision or optimization problem,
with game-theoretic and algorithmic applications. In a game-theoretic context,
substitutes capture diminishing marginal value of information to a rational
decision maker. We use the definitions to address the question of how and when
information is aggregated in prediction markets. Substitutes characterize
"best-possible" equilibria with immediate information aggregation, while
complements characterize "worst-possible", delayed aggregation. Game-theoretic
applications also include settings such as crowdsourcing contests and Q&A
forums. In an algorithmic context, where substitutes capture diminishing
marginal improvement of information to an optimization problem, substitutes
imply efficient approximation algorithms for a very general class of (adaptive)
information acquisition problems.
In tandem with these broad applications, we examine the structure and design
of informational substitutes and complements. They have equivalent, intuitive
definitions from disparate perspectives: submodularity, geometry, and
information theory. We also consider the design of scoring rules or
optimization problems so as to encourage substitutability or complementarity,
with positive and negative results. Taken as a whole, the results give some
evidence that, in parallel with substitutable items, informational substitutes
play a natural conceptual and formal role in game theory and algorithms.
Comment: Full version of FOCS 2016 paper. Single-column, 61 pages (48 main text, 13 references and appendix).
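The algorithmic claim, that diminishing marginal value of information implies efficient approximation, rests on submodularity: greedy selection of a monotone submodular set function achieves a (1 − 1/e) approximation (Nemhauser, Wolsey, Fisher). The sketch below is only this classical guarantee on a toy coverage-style value function of our own devising, not the paper's adaptive information-acquisition algorithm:

```python
def greedy_max(ground_set, value, k):
    """Greedily pick k elements maximizing marginal gain of `value`.

    When `value` is monotone submodular (diminishing marginal value,
    the "substitutes" condition), this is a (1 - 1/e) approximation.
    """
    chosen = []
    for _ in range(k):
        best = max((s for s in ground_set if s not in chosen),
                   key=lambda s: value(chosen + [s]) - value(chosen))
        chosen.append(best)
    return chosen

# Toy model: each "signal" reveals a set of facts; the value of a bundle
# is the number of distinct facts revealed (a submodular function).
signals = {"a": {1, 2}, "b": {2, 3}, "c": {3}, "d": {4}}
cover = lambda S: len(set().union(*[signals[s] for s in S]))
print(greedy_max(list(signals), cover, 2))
```

Under substitutes, each additional signal helps less given what is already known, which is exactly why the myopic greedy choice loses only a bounded factor against the best bundle.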