Error analysis for denoising smooth modulo signals on a graph
In many applications, we are given access to noisy modulo samples of a smooth
function, with the goal of robustly unwrapping the samples, i.e., estimating
the original samples of the function. In a recent work, Cucuringu and Tyagi
proposed denoising the modulo samples by first representing them on the unit
complex circle and then solving a smoothness-regularized least squares problem
-- with smoothness measured w.r.t. the Laplacian of a suitable proximity graph
G -- on the product manifold of unit circles. This problem is a quadratically
constrained quadratic program (QCQP) which is nonconvex, hence they proposed
solving its sphere relaxation, which leads to a trust-region subproblem (TRS).
In terms of theoretical guarantees, error bounds were derived for (TRS); these
bounds are however weak in general and do not actually demonstrate the
denoising performed by (TRS).
In this work, we analyse (TRS) as well as an unconstrained relaxation of
(QCQP). For both estimators we provide a refined analysis in the setting of
Gaussian noise, and derive noise regimes where they provably denoise the
modulo observations w.r.t. the l_2 norm. The analysis is performed in a
general setting where G is any connected graph.
Comment: 36 pages, 2 figures. Added Section 5 (Simulations) and made minor
changes as per reviewers' comments.
Tangent space estimation for smooth embeddings of Riemannian manifolds
Numerous dimensionality reduction problems in data analysis involve the
recovery of low-dimensional models or the learning of manifolds underlying sets
of data. Many manifold learning methods require the estimation of the tangent
space of the manifold at a point from locally available data samples. Local
sampling conditions such as (i) the size of the neighborhood (sampling width)
and (ii) the number of samples in the neighborhood (sampling density) affect
the performance of learning algorithms. In this work, we propose a theoretical
analysis of local sampling conditions for the estimation of the tangent space
at a point P lying on an m-dimensional Riemannian manifold S in R^n. Assuming a
smooth embedding of S in R^n, we estimate the tangent space T_P S by performing
a Principal Component Analysis (PCA) on points sampled from the neighborhood of
P on S. Our analysis explicitly takes into account the second order properties
of the manifold at P, namely the principal curvatures as well as the higher
order terms. We consider a random sampling framework and leverage recent
results from random matrix theory to derive conditions on the sampling width
and the local sampling density for an accurate estimation of tangent subspaces.
We measure the estimation accuracy by the angle between the estimated tangent
space and the true tangent space T_P S and we give conditions for this angle to
be bounded with high probability. In particular, we observe that the local
sampling conditions are highly dependent on the correlation between the
components in the second-order local approximation of the manifold. Finally, we
provide numerical simulations to validate our theoretical findings.
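The local PCA estimator analysed here admits a short numerical sketch. In the example below (our illustration; the manifold and sampling parameters are not from the paper), we estimate the tangent plane of the paraboloid z = x^2 + y^2 in R^3 at the origin, where the true tangent space T_P S is the xy-plane, and measure the largest principal angle between the estimated and true subspaces.

```python
import numpy as np

def tangent_space_pca(points, p, radius, m):
    """Estimate the m-dimensional tangent space at p via PCA on the
    points sampled within `radius` of p (the local sampling width)."""
    nbrs = points[np.linalg.norm(points - p, axis=1) <= radius]
    centered = nbrs - nbrs.mean(axis=0)
    # the top-m right singular vectors span the estimated tangent space
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return Vt[:m].T                            # n x m orthonormal basis

def max_principal_angle(U, V):
    """Largest principal angle between the subspaces spanned by U and V."""
    s = np.linalg.svd(U.T @ V, compute_uv=False)
    return np.arccos(np.clip(s.min(), -1.0, 1.0))

rng = np.random.default_rng(1)
# paraboloid z = x^2 + y^2: curvature tilts the PCA subspace away from
# the true tangent plane unless the sampling width is small enough
xy = rng.uniform(-0.5, 0.5, size=(2000, 2))
pts = np.column_stack([xy, (xy**2).sum(axis=1)])
P = np.zeros(3)
T_true = np.eye(3)[:, :2]                      # basis of the xy-plane
for r in (0.4, 0.1):                           # shrinking sampling width
    T_hat = tangent_space_pca(pts, P, r, m=2)
    print(r, max_principal_angle(T_hat, T_true))
```

Shrinking the radius reduces the curvature-induced bias, at the cost of fewer samples in the neighborhood; this is exactly the width/density trade-off the analysis quantifies.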
On denoising modulo 1 samples of a function
Consider an unknown smooth function f, and say we are given noisy mod 1
samples of f, i.e., samples of the form (f(x_i) + eta_i) mod 1, where eta_i
denotes noise. Given the samples, our goal is to recover smooth, robust
estimates of the clean samples f(x_i). We formulate a natural approach for
solving this problem which works with representations of mod 1 values over the
unit circle. This amounts to solving a quadratically constrained quadratic
program (QCQP) with non-convex constraints involving points lying on the unit
circle. Our proposed approach is based on solving its relaxation, which is a
trust-region subproblem and hence efficiently solvable. We demonstrate the
robustness of our approach to noise via extensive simulations on several
synthetic examples, and provide a detailed theoretical analysis.
Comment: 19 pages, 13 figures. To appear in AISTATS 2018. Corrected typos, and
made minor stylistic changes throughout. Main results unchanged. Added
section I (and Figure 13) in appendix.
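The motivation for working with unit-circle representations can be seen in a toy example (ours, not from the paper): naive averaging of mod 1 values fails near the wrap-around point 0 = 1, while averaging the lifted points on the circle does not.

```python
import numpy as np

def circle_mean(y_mod1):
    """Average mod 1 values via their unit-circle representation;
    unlike the naive mean, this respects the wrap-around 0 == 1."""
    z = np.exp(2j * np.pi * np.asarray(y_mod1)).mean()
    return (np.angle(z) / (2 * np.pi)) % 1.0

samples = [0.98, 0.99, 0.01, 0.02]   # all close to 0 (mod 1)
print(np.mean(samples))              # naive mean: 0.5, far from every sample
print(circle_mean(samples))          # circle mean: ~0 (mod 1), as desired
```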
Multi-kernel unmixing and super-resolution using the Modified Matrix Pencil method
Consider L groups of point sources or spike trains, with the l-th group represented by x_l. For a function g, let g_l denote a point spread function with scale mu_l > 0, with mu_1 < mu_2 < ... < mu_L. With y given by the sum over the groups of the convolutions g_l * x_l, our goal is to recover the source parameters given samples of y, or given the Fourier samples of y. This problem is a generalization of the usual super-resolution setup, wherein L = 1; we call it the multi-kernel unmixing super-resolution problem. Assuming access to Fourier samples of y, we derive an algorithm for estimating the source parameters of each group, along with precise non-asymptotic guarantees. Our approach estimates the group parameters sequentially, in order of increasing scale parameters, i.e., from group 1 to group L. In particular, the estimation process at stage l involves (i) carefully sampling the tail of the Fourier transform of y, (ii) a deflation step wherein we subtract the contribution of the groups processed thus far from the obtained Fourier samples, and (iii) applying Moitra's modified Matrix Pencil method to a deconvolved version of the samples in (ii).
Learning linear dynamical systems under convex constraints
We consider the problem of identification of linear dynamical systems from a
single trajectory. Recent results have predominantly focused on the setup where
no structural assumption is made on the system matrix A*, and have consequently
analyzed the ordinary least squares (OLS) estimator in detail. We assume that
prior structural information on A* is available, which can be captured in the
form of a convex set K containing A*. For the solution of the ensuing
constrained least squares estimator, we derive non-asymptotic error bounds in
the Frobenius norm which depend on the local size of the tangent cone of K at
A*. To illustrate the usefulness of this result, we instantiate it for the
settings where (i) K is a d-dimensional subspace of R^{n x n}, or (ii) A* is
k-sparse and K is a suitably scaled l_1 ball. In the regimes where d, k << n^2,
our bounds improve upon those obtained from the OLS estimator.
Comment: 17 pages.
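A toy instantiation of case (i), with K taken to be the subspace of diagonal matrices (our illustrative choice of structure, not the paper's): the constrained least squares estimator then decouples into n scalar regressions, and its Frobenius error is typically smaller than that of OLS, which fits all n^2 entries from the same single trajectory.

```python
import numpy as np

rng = np.random.default_rng(2)
n, T = 10, 200
A_star = np.diag(rng.uniform(-0.5, 0.5, n))    # true system matrix (diagonal)
# prior knowledge K: the subspace of diagonal matrices

# one trajectory of x_{t+1} = A* x_t + w_t with Gaussian process noise
X = np.zeros((T + 1, n))
for t in range(T):
    X[t + 1] = A_star @ X[t] + rng.standard_normal(n)
Xt, Xt1 = X[:-1], X[1:]

# unconstrained OLS estimator over all of R^{n x n}
A_ols = np.linalg.lstsq(Xt, Xt1, rcond=None)[0].T

# constrained LS over K: minimizing over diagonal matrices decouples
# into one scalar regression per coordinate
diag_hat = (Xt * Xt1).sum(axis=0) / (Xt * Xt).sum(axis=0)
A_con = np.diag(diag_hat)

print(np.linalg.norm(A_ols - A_star), np.linalg.norm(A_con - A_star))
```

Here the tangent cone of K at A* is the n-dimensional subspace itself, so the constrained estimator's error scales with n rather than n^2 parameters, matching the spirit of the bounds described above.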