Non-Asymptotic Analysis of Tangent Space Perturbation
Constructing an efficient parameterization of a large, noisy data set of
points lying close to a smooth manifold in high dimension remains a fundamental
problem. One approach is to recover a local parameterization using the
local tangent plane. Principal component analysis (PCA) is often the tool of
choice, as it returns an optimal basis in the case of noise-free samples from a
linear subspace. To process noisy data samples from a nonlinear manifold, PCA
must be applied locally, at a scale small enough that the manifold is
approximately linear, yet large enough that structure can be discerned from
noise. Using eigenspace perturbation theory and non-asymptotic
random matrix theory, we study the stability of the subspace estimated by PCA
as a function of scale, and bound (with high probability) the angle it forms
with the true tangent space. By adaptively selecting the scale that minimizes
this bound, our analysis reveals an appropriate scale for local tangent plane
recovery. We also introduce a geometric uncertainty principle quantifying the
limits of noise-curvature perturbation for stable recovery. With the purpose of
providing perturbation bounds that can be used in practice, we propose plug-in
estimates that make it possible to directly apply the theoretical results to
real data sets.
Comment: 53 pages. Revised manuscript with new content addressing the application of the results to real data sets.
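As a concrete illustration of the local PCA step described in this abstract, the sketch below estimates the tangent plane at a reference point by running PCA on the samples within a chosen scale. This is a minimal sketch under assumed conventions (the function name, the fixed-radius neighborhood rule, and NumPy are illustrative choices, not the authors' code); the paper's contribution is the perturbation bound used to select the scale r adaptively.

import numpy as np

def local_pca_tangent(points, center, r, d):
    # Illustrative sketch, not the authors' implementation.
    # Keep only the samples within scale r of the reference point.
    nbrs = points[np.linalg.norm(points - center, axis=1) <= r]
    # Center the neighborhood and form its empirical covariance.
    X = nbrs - nbrs.mean(axis=0)
    cov = X.T @ X / max(len(nbrs) - 1, 1)
    # The top-d eigenvectors span the estimated tangent plane.
    _, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    return eigvecs[:, -d:]             # columns form an orthonormal basis

In the spirit of the analysis, one would evaluate the high-probability angle bound at several scales r and keep the basis computed at the minimizing scale.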
A Non-Asymptotic Analysis for Stein Variational Gradient Descent
We study the Stein Variational Gradient Descent (SVGD) algorithm, which optimises a set of particles to approximate a target probability distribution π ∝ exp(−V) on ℝ^d. In the population limit, SVGD performs gradient descent in the space
of probability distributions on the KL divergence with respect to π, where the
gradient is smoothed through a kernel integral operator. In this paper, we provide a
novel finite-time analysis of the SVGD algorithm. We provide a descent lemma
establishing that the algorithm decreases the objective at each iteration, and rates
of convergence for the averaged Stein Fisher divergence (also referred to as the Kernel
Stein Discrepancy). We also prove that the finite-particle system corresponding to the
practical implementation of SVGD converges to its population version.
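To make the update rule concrete, here is a minimal NumPy sketch of one SVGD iteration with an RBF kernel; the step size, bandwidth h, and function names are illustrative assumptions rather than the paper's implementation. The first term is the kernel-smoothed score pushing particles toward high target density; the kernel-gradient term repels particles from one another.

import numpy as np

def svgd_step(x, grad_log_p, step=0.1, h=1.0):
    # One SVGD update for particles x of shape (n, d);
    # grad_log_p(x) returns the score ∇log π = −∇V at each particle.
    diffs = x[:, None, :] - x[None, :, :]                 # pairwise x_i − x_j
    k = np.exp(-np.sum(diffs**2, axis=-1) / (2 * h**2))   # RBF kernel matrix
    drive = k @ grad_log_p(x)                             # smoothed score term
    repel = np.sum(k[:, :, None] * diffs, axis=1) / h**2  # repulsive term
    return x + step * (drive + repel) / x.shape[0]

For a standard Gaussian target, for example, grad_log_p = lambda x: -x, and iterating svgd_step spreads an arbitrary particle cloud into an approximate Gaussian sample.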
Non-Asymptotic Analysis of Privacy Amplification via Renyi Entropy and Inf-Spectral Entropy
This paper investigates the privacy amplification problem and compares two
existing bounds: the exponential bound derived by one of the authors and the
min-entropy bound derived by Renner. It turns out that the exponential bound is
better than the min-entropy bound when the security parameter is small relative
to the block length, and that the min-entropy bound is better than the
exponential bound when the security parameter is large relative to the block length.
Furthermore, we present another bound that interpolates between the exponential
bound and the min-entropy bound through a hybrid use of the Renyi entropy and the
inf-spectral entropy.
Comment: 6 pages, 4 figures
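For context, the min-entropy bound mentioned in this abstract is usually stated in the leftover-hash form below; this is the standard statement, not a reproduction of the paper's exponential or hybrid bounds, and the notation (universal hash family F, output length ℓ, side information E) is filled in here for illustration:

\[
  d\bigl(F(X) \mid F, E\bigr) \;\le\; \frac{1}{2}\sqrt{2^{\,\ell - H_{\min}(X \mid E)}}
\]

so roughly ℓ ≈ H_min(X|E) − 2 log₂(1/ε) nearly uniform bits can be extracted at security level ε. The trade-off between the security parameter ε and the output length relative to the block length is exactly the regime in which the two bounds above are compared.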