Fast Gibbs sampling for high-dimensional Bayesian inversion
Solving ill-posed inverse problems by Bayesian inference has recently
attracted considerable attention. Compared to deterministic approaches, the
probabilistic representation of the solution by the posterior distribution can
be exploited to explore and quantify its uncertainties. In applications where
the inverse solution is subject to further analysis procedures, this can be a
significant advantage. Alongside theoretical progress, various new
computational techniques make it possible to sample very high-dimensional posterior
distributions: In [Lucka2012], a Markov chain Monte Carlo (MCMC) posterior
sampler was developed for linear inverse problems with -type priors. In
this article, we extend this single component Gibbs-type sampler to a wide
range of priors used in Bayesian inversion, such as general priors
with additional hard constraints. Besides a fast computation of the
conditional, single component densities in an explicit, parameterized form, a
fast, robust, and exact sampling from these one-dimensional densities is key
to obtaining an efficient algorithm. We demonstrate that a generalization of slice
sampling can utilize their specific structure for this task and illustrate the
performance of the resulting slice-within-Gibbs samplers on several computed
examples. These new samplers allow us to perform sample-based Bayesian
inference in high-dimensional scenarios with certain priors for the first time,
including the inversion of computed tomography (CT) data with the popular
isotropic total variation (TV) prior.
Comment: submitted to "Inverse Problems"
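The abstract's core computational ingredient is a single-component Gibbs sweep in which each one-dimensional conditional is sampled by slice sampling. The following is a minimal generic sketch of that slice-within-Gibbs pattern; the interface `logpdf_cond` and the step-out width are assumptions for illustration, and the paper's exploitation of the explicit parameterized form of the conditionals is not reproduced here.

```python
import math
import random

def slice_sample_1d(logpdf, x0, w=1.0, max_steps=50):
    """One slice-sampling update for a univariate log-density (Neal-style
    step-out and shrinkage). Generic sketch, not the paper's specialized
    sampler for the explicit conditional densities."""
    # Draw the vertical slice level under the density at x0.
    log_y = logpdf(x0) + math.log(random.random())
    # Step out: expand an interval of width w until both ends fall below the slice.
    left = x0 - w * random.random()
    right = left + w
    for _ in range(max_steps):
        if logpdf(left) < log_y:
            break
        left -= w
    for _ in range(max_steps):
        if logpdf(right) < log_y:
            break
        right += w
    # Shrinkage: sample uniformly from the interval, shrinking on rejection.
    while True:
        x1 = left + (right - left) * random.random()
        if logpdf(x1) >= log_y:
            return x1
        if x1 < x0:
            left = x1
        else:
            right = x1

def gibbs_sweep(logpdf_cond, x):
    """One single-component Gibbs sweep: each coordinate is updated from its
    full conditional via slice sampling. logpdf_cond(i, t, x) evaluating the
    i-th conditional log-density at t is a hypothetical interface."""
    for i in range(len(x)):
        x[i] = slice_sample_1d(lambda t: logpdf_cond(i, t, x), x[i])
    return x
```

In the high-dimensional setting of the paper, the efficiency hinges on evaluating each conditional cheaply in closed, parameterized form rather than through the full posterior.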
Guaranteed passive parameterized macromodeling by using Sylvester state-space realizations
A novel state-space realization for parameterized macromodeling is proposed in this paper. A judicious choice of the state-space realization is required in order to account for the assumed smoothness of the state-space matrices with respect to the design parameters. This technique is used in combination with suitable interpolation schemes to interpolate a set of state-space matrices, and hence the poles and residues indirectly, in order to build accurate parameterized macromodels. The key points of the novel state-space realization are the choice of a proper pivot matrix and a well-conditioned solution of a Sylvester equation. Stability and passivity are guaranteed by construction over the design space of interest. Pertinent numerical examples validate the proposed Sylvester realization for parameterized macromodeling.
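Since the realization hinges on solving a Sylvester equation AX + XB = C, here is a minimal sketch of one standard solution route, via the vectorization identity (I ⊗ A + Bᵀ ⊗ I) vec(X) = vec(C). This is adequate for the small state-space dimensions typical of macromodeling examples; it is a generic method, not the paper's specific well-conditioned construction, and a Bartels-Stewart solver would be preferred for larger systems.

```python
import numpy as np

def solve_sylvester_kron(A, B, C):
    """Solve the Sylvester equation A X + X B = C by vectorization:
    vec(AX) = (I (x) A) vec(X) and vec(XB) = (B^T (x) I) vec(X),
    using column-major (Fortran-order) stacking for vec."""
    n, m = A.shape[0], B.shape[0]
    K = np.kron(np.eye(m), A) + np.kron(B.T, np.eye(n))
    x = np.linalg.solve(K, C.flatten(order="F"))
    return x.reshape((n, m), order="F")
```

A unique solution exists when A and -B share no eigenvalues; conditioning of the linear system K is exactly the kind of issue the abstract's "well-conditioned solution" refers to.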
Refined Complexity of PCA with Outliers
Principal component analysis (PCA) is one of the most fundamental procedures
in exploratory data analysis and is the basic step in applications ranging from
quantitative finance and bioinformatics to image analysis and neuroscience.
However, it is well-documented that the applicability of PCA in many real
scenarios could be constrained by an "immune deficiency" to outliers such as
corrupted observations. We consider the following algorithmic question about
PCA with outliers: for a set of points in , how can one select a
subset of points, say 1% of the total, such that the remaining points are
best fit by some unknown -dimensional subspace? We provide a rigorous
algorithmic analysis of the problem. We show
that the problem is solvable in time . In particular, for constant
dimension the problem is solvable in polynomial time. We complement the
algorithmic result with a lower bound: unless the Exponential Time
Hypothesis fails, in time , for any function of , it is
impossible not only to solve the problem exactly but even to approximate it
within a constant factor.
Comment: To be presented at ICML 201
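To make the problem statement concrete, here is a brute-force sketch of PCA with outliers: remove k points so that the rest are best fit (in sum of squared residuals) by an r-dimensional linear subspace, found via the SVD. The exhaustive search over outlier subsets only illustrates the objective and its combinatorial nature; it is exponential in k and is not the algorithm from the paper.

```python
import itertools
import numpy as np

def pca_with_outliers(points, k, r):
    """Exhaustively try every subset of k points to discard; score the rest
    by the energy left outside their top-r singular subspace (subspace
    through the origin). Illustration of the objective, not an efficient
    algorithm."""
    n = len(points)
    best_cost, best_out = float("inf"), None
    for outliers in itertools.combinations(range(n), k):
        keep = np.array([p for i, p in enumerate(points) if i not in outliers])
        # Residual cost = sum of squared singular values beyond the top r.
        s = np.linalg.svd(keep, compute_uv=False)
        cost = float((s[r:] ** 2).sum())
        if cost < best_cost:
            best_cost, best_out = cost, outliers
    return best_out, best_cost
```

The hardness result in the abstract says, roughly, that no algorithm can avoid a running time that grows super-polynomially with the dimension, even for constant-factor approximation.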