Approximation of Eigenfunctions in Kernel-based Spaces
Kernel-based methods in Numerical Analysis have the advantage of yielding
optimal recovery processes in the "native" Hilbert space $\mathcal{H}$ in which they
are reproducing. Continuous kernels on compact domains have an expansion into
eigenfunctions that are both $L_2$-orthonormal and orthogonal in $\mathcal{H}$
(Mercer expansion). This paper examines the corresponding eigenspaces and
proves that they have optimality properties among all other subspaces of
$\mathcal{H}$. These results have strong connections to $n$-widths in Approximation
Theory, and they establish that errors of optimal approximations are closely
related to the decay of the eigenvalues.
Though the eigenspaces and eigenvalues are not readily available, they can be
well approximated using the standard $n$-dimensional subspaces spanned by
translates of the kernel with respect to nodes or centers. We give error
bounds for the numerical approximation of the eigensystem via such subspaces. A
series of examples shows that our numerical technique, via a greedy point
selection strategy, allows the eigensystems to be calculated with good accuracy.
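As a rough illustration of the numerical technique described above, the Mercer eigensystem of a continuous kernel can be approximated from the Gram matrix on a finite node set (the Nystroem idea). The sketch below uses a Gaussian kernel, uniformly spaced nodes, and a scale parameter `eps`, all of which are illustrative assumptions rather than the paper's setup, and in particular it replaces the greedy point selection with a fixed grid.

```python
import numpy as np

def gaussian_kernel(x, y, eps=1.0):
    # Gaussian RBF kernel K(x, y) = exp(-eps^2 |x - y|^2) on 1-D point sets
    return np.exp(-(eps * (x[:, None] - y[None, :])) ** 2)

# n nodes on [0, 1]; Mercer eigenpairs are approximated by scaled
# eigenpairs of the kernel Gram matrix on these nodes
n = 200
x = np.linspace(0.0, 1.0, n)
K = gaussian_kernel(x, x)

# symmetric eigendecomposition; np.linalg.eigh returns ascending order
w, V = np.linalg.eigh(K)
w, V = w[::-1], V[:, ::-1]        # reorder: largest eigenvalue first

# the quadrature weight 1/n turns matrix eigenvalues into estimates of
# the Mercer eigenvalues; eigenvectors are rescaled to unit L2 norm
mercer_eigs = w / n
eigfuns = V * np.sqrt(n)          # approximate eigenfunctions at the nodes

print(mercer_eigs[:5])            # the leading eigenvalues decay rapidly
```

The rapid decay of `mercer_eigs` is what ties the optimal approximation errors to the eigenvalue decay in the paper's results.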
Recovering edges in ill-posed inverse problems: optimality of curvelet frames
We consider a model problem of recovering a function from noisy Radon data. The function to be recovered is assumed smooth apart from a discontinuity along a curve, that is, an edge. We use the continuum white-noise model, with noise level $\varepsilon$.
Traditional linear methods for solving such inverse problems behave poorly in the presence of edges. Qualitatively, the reconstructions are blurred near the edges; quantitatively, they give in our model mean squared errors (MSEs) that tend to zero with the noise level only as $O(\varepsilon^{1/2})$ as $\varepsilon \to 0$. A recent innovation -- nonlinear shrinkage in the wavelet domain -- visually improves edge sharpness and improves MSE convergence to $O(\varepsilon^{2/3})$. However, as we show here, this rate is not optimal.
In fact, essentially optimal performance is obtained by deploying the recently-introduced tight frames of curvelets in this setting. Curvelets are smooth, highly anisotropic elements ideally suited for detecting and synthesizing curved edges. To deploy them in the Radon setting, we construct a curvelet-based biorthogonal decomposition of the Radon operator and build "curvelet shrinkage" estimators based on thresholding of the noisy curvelet coefficients. In effect, the estimator detects edges at certain locations and orientations in the Radon domain and automatically synthesizes edges at corresponding locations and directions in the original domain.
We prove that the curvelet shrinkage can be tuned so that the estimator will attain, within logarithmic factors, the MSE $O(\varepsilon^{4/5})$ as the noise level $\varepsilon \to 0$. This rate of convergence holds uniformly over a class of functions which are $C^2$ except for discontinuities along $C^2$ curves, and (except for log terms) is the minimax rate for that class. Our approach is an instance of a general strategy which should apply in other inverse problems; we sketch a deconvolution example.
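The shrinkage step, thresholding noisy frame coefficients, can be shown in miniature with plain soft thresholding of a sparse coefficient vector. The signal, noise level, and universal threshold below are illustrative assumptions; the paper's estimator thresholds curvelet coefficients of Radon-domain data, which this sketch does not attempt to reproduce.

```python
import numpy as np

def soft_threshold(coeffs, t):
    # soft thresholding: shrink each coefficient toward zero by t,
    # setting coefficients with magnitude below t exactly to zero
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

rng = np.random.default_rng(0)
eps = 0.1                                     # noise level
theta = np.zeros(1000)
theta[:10] = 5.0                              # a few large "edge" coefficients
y = theta + eps * rng.standard_normal(1000)   # noisy coefficients

# universal threshold eps * sqrt(2 log n), as in wavelet shrinkage
t = eps * np.sqrt(2.0 * np.log(len(y)))
theta_hat = soft_threshold(y, t)

mse_shrink = np.mean((theta_hat - theta) ** 2)
mse_linear = np.mean((y - theta) ** 2)        # keeping all noisy coefficients
```

Because the true coefficient vector is sparse, thresholding removes almost all of the noise while paying only a small bias on the few large coefficients, which is the mechanism behind the improved MSE rates quoted above.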
An Efficient Streaming Algorithm for the Submodular Cover Problem
We initiate the study of the classical Submodular Cover (SC) problem in the
data streaming model which we refer to as the Streaming Submodular Cover (SSC).
We show that any single pass streaming algorithm using sublinear memory in the
size of the stream will fail to provide any non-trivial approximation
guarantees for SSC. Hence, we consider a relaxed version of SSC, where we only
seek to find a partial cover.
We design the first Efficient bicriteria Submodular Cover Streaming
(ESC-Streaming) algorithm for this problem, and provide theoretical guarantees
for its performance supported by numerical evidence. Our algorithm finds
solutions that are competitive with the near-optimal offline greedy algorithm
despite requiring only a single pass over the data stream. In our numerical
experiments, we evaluate the performance of ESC-Streaming on active set
selection and large-scale graph cover problems.

Comment: To appear in NIPS'1
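A minimal single-pass sketch conveys the flavour of streaming partial cover for a coverage (hence submodular) objective. The fixed marginal-gain threshold `tau` and coverage fraction `alpha` are illustrative assumptions; this is a simplified thresholding heuristic, not the ESC-Streaming algorithm itself, and it carries none of that algorithm's bicriteria guarantees.

```python
def stream_partial_cover(stream, universe, alpha=0.9, tau=2):
    """One pass over a stream of sets, keeping any set whose marginal
    coverage gain is at least tau, until an alpha-fraction of the
    universe is covered (a partial cover)."""
    covered, solution = set(), []
    target = alpha * len(universe)
    for s in stream:                      # single pass; sets seen once
        gain = len(s - covered)           # marginal gain of adding s
        if gain >= tau and len(covered) < target:
            solution.append(s)
            covered |= s
    return solution, covered

# toy instance: cover at least 90% of a 10-element universe
universe = set(range(10))
stream = [{0, 1, 2}, {2, 3}, {3, 4, 5, 6}, {6, 7}, {7, 8, 9}, {0, 9}]
sol, cov = stream_partial_cover(stream, universe)
```

Memory here is proportional to the retained solution and covered set, not to the stream length, which is the regime where the paper's lower bound forces the relaxation to partial covers.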
Bootstrap-Based Inference for Cube Root Asymptotics
This paper proposes a valid bootstrap-based distributional approximation for
M-estimators exhibiting a Chernoff (1964)-type limiting distribution. For
estimators of this kind, the standard nonparametric bootstrap is inconsistent.
The method proposed herein is based on the nonparametric bootstrap, but
restores consistency by altering the shape of the criterion function defining
the estimator whose distribution we seek to approximate. This modification
leads to a generic and easy-to-implement resampling method for inference that
is conceptually distinct from other available distributional approximations. We
illustrate the applicability of our results with four examples in econometrics
and machine learning.
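For contrast with the method proposed in the paper, the standard nonparametric bootstrap for an M-estimator looks as follows; the sample median (a root-n, not cube-root, estimator) is used purely as an illustrative assumption. The paper's point is that for estimators with Chernoff-type limits this plain resampling recipe is inconsistent unless the criterion function is reshaped, a modification this sketch does not implement.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(500)          # observed sample

def m_estimate(sample):
    # the sample median minimises the criterion sum_i |x_i - m|,
    # a textbook M-estimator (standard root-n asymptotics here)
    return np.median(sample)

theta_hat = m_estimate(x)

# standard nonparametric bootstrap: resample with replacement and
# recompute the estimator B times to approximate its distribution
B = 2000
boot = np.array([m_estimate(rng.choice(x, size=x.size, replace=True))
                 for _ in range(B)])

# percentile-type confidence interval for the centred estimator
ci = np.quantile(boot - theta_hat, [0.025, 0.975])
```

In the cube-root regime (e.g. the maximum score estimator), the empirical distribution of `boot - theta_hat` fails to converge to the Chernoff-type limit, which is precisely what motivates the criterion-reshaping correction proposed in the paper.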