Gordon's inequality and condition numbers in conic optimization
The probabilistic analysis of condition numbers has traditionally been
approached from different angles; one is based on Smale's program in complexity
theory and features integral geometry, while the other is motivated by
geometric functional analysis and makes use of the theory of Gaussian
processes. In this note we explore connections between the two approaches in
the context of the biconic homogeneous feasibility problem and the condition
numbers motivated by conic optimization theory. Key tools in the analysis are
Slepian's and Gordon's comparison inequalities for Gaussian processes,
interpreted as monotonicity properties of moment functionals, and their
interplay with ideas from conic integral geometry.
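For reference, the first of the two comparison tools named above has a standard form (not spelled out in the abstract itself): for centered Gaussian processes $(X_t)_{t\in T}$ and $(Y_t)_{t\in T}$ with matching variances, smaller correlations force a larger expected supremum.

```latex
% Slepian's comparison inequality (standard statement):
% if \mathbb{E}X_t^2 = \mathbb{E}Y_t^2 for all t \in T, then
\mathbb{E}[X_s X_t] \le \mathbb{E}[Y_s Y_t]
\ \text{for all } s,t \in T
\quad\Longrightarrow\quad
\mathbb{E}\sup_{t\in T} X_t \ \ge\ \mathbb{E}\sup_{t\in T} Y_t.
```

Gordon's inequality extends this to minimax functionals $\mathbb{E}\min_{s}\max_{t} X_{s,t}$ of doubly indexed Gaussian processes, which is the form used in conic feasibility analysis.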
Bayesian Inference of Log Determinants
The log-determinant of a kernel matrix appears in a variety of machine
learning problems, ranging from determinantal point processes and generalized
Markov random fields, through to the training of Gaussian processes. Exact
calculation of this term is often intractable when the size of the kernel
matrix exceeds a few thousand. In the spirit of probabilistic numerics, we
reinterpret the problem of computing the log-determinant as a Bayesian
inference problem. In particular, we combine prior knowledge in the form of
bounds from matrix theory and evidence derived from stochastic trace estimation
to obtain probabilistic estimates for the log-determinant and its associated
uncertainty within a given computational budget. Beyond its novelty and
theoretic appeal, the performance of our proposal is competitive with
state-of-the-art approaches to approximating the log-determinant, while also
quantifying the uncertainty due to budget-constrained evidence.
Comment: 12 pages, 3 figures
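One of the abstract's core ingredients, stochastic trace estimation of a log-determinant, can be sketched as follows. This is a minimal illustration, not the paper's method: it combines a Hutchinson trace estimator with a truncated Taylor series for the matrix logarithm, and the matrix-theoretic bounds and Bayesian machinery described in the abstract are omitted. All names and parameter choices are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random SPD "kernel" matrix with eigenvalues in (0, 2), so that the
# series log(A) = -sum_{k>=1} (I - A)^k / k converges.
n = 200
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = rng.uniform(0.2, 1.8, size=n)
A = (Q * eigs) @ Q.T  # Q diag(eigs) Q^T


def logdet_hutchinson(A, num_probes=100, order=60, rng=rng):
    """Stochastic estimate of log det(A) for SPD A with spectrum in (0, 2).

    Uses log det(A) = tr(log A) = -sum_k tr((I - A)^k) / k, estimating
    each trace as the average of z^T (I - A)^k z over Rademacher probe
    vectors z; only matrix-vector products with A are needed.
    """
    n = A.shape[0]
    total = 0.0
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=n)
        v = z.copy()
        acc = 0.0
        for k in range(1, order + 1):
            v = v - A @ v          # v = (I - A)^k z
            acc -= (z @ v) / k
        total += acc
    return total / num_probes


exact = np.linalg.slogdet(A)[1]
approx = logdet_hutchinson(A)
print(f"exact: {exact:.2f}, stochastic estimate: {approx:.2f}")
```

In the paper's setting the point is that each probe costs only matrix-vector products, so the estimate (and, with the priors described above, its uncertainty) remains tractable when exact factorization of the kernel matrix is not.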