Bayesian nonlinear hyperspectral unmixing with spatial residual component analysis
This paper presents a new Bayesian model and algorithm for nonlinear unmixing
of hyperspectral images. The model proposed represents the pixel reflectances
as linear combinations of the endmembers, corrupted by nonlinear (with respect
to the endmembers) terms and additive Gaussian noise. Prior knowledge about the
problem is embedded in a hierarchical model that describes the dependence
structure between the model parameters and their constraints. In particular, a
gamma Markov random field is used to model the joint distribution of the
nonlinear terms, which are expected to exhibit significant spatial
correlations. An adaptive Markov chain Monte Carlo algorithm is then proposed
to compute the Bayesian estimates of interest and perform Bayesian inference.
This algorithm is equipped with a stochastic optimisation adaptation mechanism
that automatically adjusts the parameters of the gamma Markov random field by
maximum marginal likelihood estimation. Finally, the proposed methodology is
demonstrated through a series of experiments on synthetic and real data, with
comparisons against competing state-of-the-art approaches.
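As a toy illustration of the observation model described above, pixel reflectances can be simulated as linear mixtures plus a nonlinear residual and Gaussian noise. The endmember matrix, the bilinear form of the nonlinearity, and all parameter values below are illustrative assumptions; the paper models the nonlinear terms generically.

```python
import numpy as np

rng = np.random.default_rng(0)

L, R = 50, 3                          # spectral bands, number of endmembers
M = rng.uniform(0, 1, size=(L, R))    # hypothetical endmember signatures

def generate_pixel(a, gamma, sigma):
    """Pixel reflectance: linear mixture + nonlinear residual + Gaussian noise.

    a     : abundance vector (non-negative, sums to one)
    gamma : amplitude of the nonlinear residual term
    sigma : noise standard deviation
    """
    linear = M @ a
    # One common choice of nonlinearity (bilinear endmember interactions);
    # purely illustrative, the paper leaves the residual terms generic.
    nonlinear = sum(
        gamma * a[i] * a[j] * M[:, i] * M[:, j]
        for i in range(R) for j in range(i + 1, R)
    )
    noise = sigma * rng.standard_normal(L)
    return linear + nonlinear + noise

a = np.array([0.6, 0.3, 0.1])         # abundances sum to one
y = generate_pixel(a, gamma=0.5, sigma=0.01)
```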
Lidar waveform based analysis of depth images constructed using sparse single-photon data
This paper presents a new Bayesian model and algorithm for depth and
intensity profiling from full time-correlated single-photon counting (TCSPC)
waveforms in the limit of very low photon counts. The model
proposed represents each Lidar waveform as a combination of a known impulse
response, weighted by the target intensity, and an unknown constant background,
corrupted by Poisson noise. Prior knowledge about the problem is embedded in a
hierarchical model that describes the dependence structure between the model
parameters and their constraints. In particular, a gamma Markov random field
(MRF) is used to model the joint distribution of the target intensity, and a
second MRF is used to model the distribution of the target depth, which are
both expected to exhibit significant spatial correlations. An adaptive Markov
chain Monte Carlo algorithm is then proposed to compute the Bayesian estimates
of interest and perform Bayesian inference. This algorithm is equipped with a
stochastic optimization adaptation mechanism that automatically adjusts the
parameters of the MRFs by maximum marginal likelihood estimation. Finally, the
benefits of the proposed methodology are demonstrated through a series of
experiments using real data.
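The waveform model above can be sketched with a short simulation. The Gaussian shape of the impulse response and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

T = 200                 # number of TCSPC time bins
t = np.arange(T)

def impulse(depth, width=4.0):
    """Hypothetical Gaussian instrumental impulse response, shifted to the
    target depth and normalized to unit area (the paper assumes it known)."""
    h = np.exp(-0.5 * ((t - depth) / width) ** 2)
    return h / h.sum()

def lidar_waveform(depth, intensity, background):
    """Photon-count histogram: intensity-weighted impulse response plus a
    constant background, corrupted by Poisson noise."""
    rate = intensity * impulse(depth) + background
    return rng.poisson(rate)

counts = lidar_waveform(depth=80, intensity=30.0, background=0.05)
```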
Estimating the granularity coefficient of a Potts-Markov random field within an MCMC algorithm
This paper addresses the problem of estimating the Potts parameter B jointly
with the unknown parameters of a Bayesian model within a Markov chain Monte
Carlo (MCMC) algorithm. Standard MCMC methods cannot be applied to this problem
because performing inference on B requires computing the intractable
normalizing constant of the Potts model. In the proposed MCMC method the
estimation of B is conducted using a likelihood-free Metropolis-Hastings
algorithm. Experimental results obtained for synthetic data show that
estimating B jointly with the other unknown parameters leads to estimation
results that are as good as those obtained with the actual value of B. On the
other hand, assuming that the value of B is known can degrade estimation
performance significantly if this value is incorrect. To illustrate the
interest of this method, the proposed algorithm is successfully applied to real
two-dimensional SAR and three-dimensional ultrasound images.
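A likelihood-free Metropolis-Hastings update for the parameter of an Ising (2-state Potts) field can be sketched as follows: an auxiliary field simulated at the proposed value makes the intractable normalizing constants cancel (the exchange-algorithm idea). The lattice size, proposal step, and Gibbs settings are illustrative, and a short Gibbs run stands in for an exact sampler.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 16  # toroidal lattice size (illustrative)

def neighbor_agree(x):
    """Sufficient statistic S(x): number of agreeing neighbor pairs."""
    return np.sum(x == np.roll(x, 1, 0)) + np.sum(x == np.roll(x, 1, 1))

def gibbs_sample(B, sweeps=50, x=None):
    """Approximate draw from a 2-state Potts (Ising) field with parameter B."""
    if x is None:
        x = rng.integers(0, 2, size=(N, N))
    for _ in range(sweeps):
        for i in range(N):
            for j in range(N):
                s = sum(x[(i + di) % N, (j + dj) % N] == 1
                        for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)])
                p1 = np.exp(B * s)          # x[i,j] = 1 agrees with s neighbors
                p0 = np.exp(B * (4 - s))    # x[i,j] = 0 agrees with the rest
                x[i, j] = rng.random() < p1 / (p0 + p1)
    return x

def exchange_step(B, x_obs, step=0.1):
    """One likelihood-free MH update of B: the auxiliary field z, simulated
    at the proposed value, replaces the intractable normalizing constant."""
    B_prop = B + step * rng.standard_normal()
    if B_prop < 0:
        return B
    z = gibbs_sample(B_prop)
    log_alpha = (B_prop - B) * (neighbor_agree(x_obs) - neighbor_agree(z))
    return B_prop if np.log(rng.random()) < log_alpha else B

x_obs = gibbs_sample(0.6, sweeps=30)   # synthetic "observed" field
B_chain = [0.5]
for _ in range(5):
    B_chain.append(exchange_step(B_chain[-1], x_obs))
```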
Simulation in Statistics
Simulation has become a standard tool in statistics because it may be the
only tool available for analysing some classes of probabilistic models. We
review in this paper simulation tools that have been specifically derived to
address statistical challenges and, in particular, recent advances in the areas
of adaptive Markov chain Monte Carlo (MCMC) algorithms and approximate
Bayesian computation (ABC) algorithms.
Comment: Draft of an advanced tutorial paper for the Proceedings of the 2011
Winter Simulation Conference.
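The ABC idea can be illustrated with the textbook rejection sampler: draw parameters from the prior, simulate data, and keep the draws whose summary statistic lands close to the observed one. The Gaussian toy model, prior, summary statistic, and tolerance below are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy setup: observations assumed to come from N(theta, 1), theta unknown.
y_obs = rng.normal(2.0, 1.0, size=100)

def abc_rejection(n_draws=20000, eps=0.05):
    """ABC by rejection: accept prior draws whose simulated sample mean
    falls within eps of the observed sample mean."""
    accepted = []
    s_obs = y_obs.mean()
    for _ in range(n_draws):
        theta = rng.normal(0.0, 10.0)              # draw from the prior
        y_sim = rng.normal(theta, 1.0, size=100)   # simulate from the model
        if abs(y_sim.mean() - s_obs) < eps:
            accepted.append(theta)
    return np.array(accepted)

post = abc_rejection()   # approximate posterior sample for theta
```

Because the sample mean is a sufficient statistic here, the accepted draws approximate the true posterior; in realistic ABC applications the summary statistics are not sufficient and the tolerance introduces an additional approximation.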
Computing the Cramer-Rao bound of Markov random field parameters: Application to the Ising and the Potts models
This report considers the problem of computing the Cramer-Rao bound for the
parameters of a Markov random field. Computation of the exact bound is not
feasible for most fields of interest because their likelihoods are intractable
and have intractable derivatives. We show here how it is possible to formulate
the computation of the bound as a statistical inference problem that can be
solved approximately, but with arbitrarily high accuracy, by using a Monte
Carlo method. The proposed methodology is successfully applied to the Ising
and Potts models.
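For an exponential-family field such as the Ising model, the Fisher information of the parameter equals the variance of the sufficient statistic, which can be estimated by Monte Carlo even though the likelihood itself is intractable. The sketch below illustrates that identity with a plain Gibbs sampler; the lattice size and sampler settings are illustrative, and this is not the report's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 8  # small toroidal lattice (illustrative)

def agree(x):
    """Sufficient statistic S(x): number of agreeing neighbor pairs."""
    return np.sum(x == np.roll(x, 1, 0)) + np.sum(x == np.roll(x, 1, 1))

def gibbs_sweep(x, B):
    """One Gibbs sweep over the 2-state Potts (Ising) field at parameter B."""
    for i in range(N):
        for j in range(N):
            s = sum(x[(i + di) % N, (j + dj) % N] == 1
                    for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)])
            x[i, j] = rng.random() < 1.0 / (1.0 + np.exp(B * (4 - 2 * s)))
    return x

def crb_monte_carlo(B, n_samples=150, burn=50, thin=4):
    """For p(x|B) proportional to exp(B*S(x)), the Fisher information is
    Var_B[S(x)]; estimate it from MCMC samples and invert for the bound."""
    x = rng.integers(0, 2, size=(N, N))
    stats = []
    for sweep in range(burn + n_samples * thin):
        x = gibbs_sweep(x, B)
        if sweep >= burn and (sweep - burn) % thin == 0:
            stats.append(agree(x))
    fisher = np.var(stats)
    return 1.0 / fisher   # Cramer-Rao bound on the variance of unbiased estimators

bound = crb_monte_carlo(0.4)
```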
Improving Simulation Efficiency of MCMC for Inverse Modeling of Hydrologic Systems with a Kalman-Inspired Proposal Distribution
Bayesian analysis is widely used in science and engineering for real-time
forecasting, decision making, and unraveling the processes that explain
the observed data. These data are deterministic and/or stochastic
transformations of the underlying parameters. A key task is then to summarize
the posterior distribution of these parameters. When models become too
difficult to analyze analytically, Monte Carlo methods can be used to
approximate the target distribution. Of these, Markov chain Monte Carlo (MCMC)
methods are particularly powerful. Such methods generate a random walk through
the parameter space and, under strict conditions of reversibility and
ergodicity, will successively visit solutions with frequency proportional to
the underlying target density. This requires a proposal distribution that
generates candidate solutions starting from an arbitrary initial state. The
rate at which the sampled chains converge to the target distribution
deteriorates rapidly, however, as parameter dimensionality increases. In this paper, we
introduce a new proposal distribution that enhances significantly the
efficiency of MCMC simulation for highly parameterized models. This proposal
distribution exploits the cross-covariance of model parameters, measurements
and model outputs, and generates candidate states much like the analysis step
in the Kalman filter. We embed the Kalman-inspired proposal distribution in the
DREAM algorithm during burn-in, and present several numerical experiments with
complex, high-dimensional or multi-modal target distributions. Results
demonstrate that this new proposal distribution can greatly improve simulation
efficiency of MCMC. Specifically, we observe a speed-up on the order of 10-30
times for groundwater models with more than one hundred parameters.
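The analysis-step idea can be sketched as follows, with a hypothetical linear forward model standing in for a groundwater simulator. This is an illustrative EnKF-style update built from ensemble cross-covariances, not the DREAM implementation, and all dimensions and noise levels are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

d, m = 10, 5                          # parameter and observation dimensions
A = rng.standard_normal((m, d))       # hypothetical linear forward model

def forward(x):
    return A @ x

x_true = rng.standard_normal(d)
y_obs = forward(x_true) + 0.1 * rng.standard_normal(m)

def kalman_proposal(ensemble, y_obs, obs_sd=0.1):
    """Shift each ensemble member toward the data along the cross-covariance
    of parameters and simulated outputs, much like a Kalman analysis step."""
    X = np.asarray(ensemble)                    # (n, d) parameter ensemble
    Y = np.array([forward(x) for x in X])       # (n, m) simulated outputs
    Xc = X - X.mean(0)
    Yc = Y - Y.mean(0)
    n = len(X)
    C_xy = Xc.T @ Yc / (n - 1)                  # cross-covariance (d, m)
    C_yy = Yc.T @ Yc / (n - 1) + obs_sd**2 * np.eye(m)
    K = C_xy @ np.linalg.inv(C_yy)              # Kalman-like gain (d, m)
    perturbed = y_obs + obs_sd * rng.standard_normal((n, m))
    return X + (perturbed - Y) @ K.T            # candidate states

ensemble = rng.standard_normal((30, d))
candidates = kalman_proposal(ensemble, y_obs)
```

In an MCMC setting each candidate would still pass through a Metropolis acceptance step, so the sketch only covers the proposal mechanism, not the full sampler.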
Joint state-parameter estimation of a nonlinear stochastic energy balance model from sparse noisy data
While nonlinear stochastic partial differential equations arise naturally in
spatiotemporal modeling, inference for such systems often faces two major
challenges: sparse noisy data and ill-posedness of the inverse problem of
parameter estimation. To overcome the challenges, we introduce a strongly
regularized posterior by normalizing the likelihood and by imposing physical
constraints through priors of the parameters and states. We investigate joint
parameter-state estimation by the regularized posterior in a physically
motivated nonlinear stochastic energy balance model (SEBM) for paleoclimate
reconstruction. The high-dimensional posterior is sampled by a particle Gibbs
sampler that combines MCMC with an optimal particle filter exploiting the
structure of the SEBM. In tests using either Gaussian or uniform priors based
on the physical range of parameters, the regularized posteriors overcome the
ill-posedness and lead to samples within physical ranges, quantifying the
uncertainty in estimation. Due to the ill-posedness and the regularization, the
posterior of parameters presents a relatively large uncertainty, and
consequently, the maximum of the posterior, which is the minimizer in a
variational approach, can have a large variation. In contrast, the posterior of
states generally concentrates near the truth, substantially filtering out
observation noise and reducing uncertainty in the unconstrained SEBM.
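A minimal bootstrap particle filter for a scalar random-walk state illustrates the filtering ingredient of such a sampler. The paper uses an optimal particle filter that exploits the SEBM structure; this generic sketch does not, and its state-space model and noise levels are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def bootstrap_pf(y, n_particles=500, sigma_x=0.3, sigma_y=0.5):
    """Bootstrap particle filter for a scalar random-walk state observed
    with Gaussian noise: propagate, weight by the likelihood, resample."""
    particles = rng.standard_normal(n_particles)
    means = []
    for yt in y:
        particles = particles + sigma_x * rng.standard_normal(n_particles)
        logw = -0.5 * ((yt - particles) / sigma_y) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * particles))       # filtered posterior mean
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]                # multinomial resampling
    return np.array(means)

# Simulate a state trajectory and noisy observations, then filter.
x = np.cumsum(0.3 * rng.standard_normal(50))
y = x + 0.5 * rng.standard_normal(50)
est = bootstrap_pf(y)
```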
Collaborative sparse regression using spatially correlated supports - Application to hyperspectral unmixing
This paper presents a new Bayesian collaborative sparse regression method for
linear unmixing of hyperspectral images. Our contribution is twofold; first, we
propose a new Bayesian model for structured sparse regression in which the
supports of the sparse abundance vectors are a priori spatially correlated
across pixels (i.e., materials are spatially organised rather than randomly
distributed at a pixel level). This prior information is encoded in the model
through a truncated multivariate Ising Markov random field, which also takes
into consideration the facts that pixels cannot be empty (i.e., there is at
least one material present in each pixel), and that different materials may
exhibit different degrees of spatial regularity. Second, we propose an
advanced Markov chain Monte Carlo algorithm to estimate the posterior
probabilities that materials are present or absent in each pixel, and,
conditionally to the maximum marginal a posteriori configuration of the
support, compute the MMSE estimates of the abundance vectors. A remarkable
property of this algorithm is that it self-adjusts the values of the parameters
of the Markov random field, thus relieving practitioners from setting
regularisation parameters by cross-validation. The performance of the proposed
methodology is finally demonstrated through a series of experiments with
synthetic and real data and comparisons with other algorithms from the
literature.
Data augmentation in Rician noise model and Bayesian Diffusion Tensor Imaging
Mapping white matter tracts is an essential step towards understanding brain
function. Diffusion Magnetic Resonance Imaging (dMRI) is the only noninvasive
technique which can detect in vivo anisotropies in the 3-dimensional diffusion
of water molecules, which correspond to nervous fibers in the living brain. In
this process, spectral data from the displacement distribution of water
molecules is collected by a magnetic resonance scanner. From the statistical
point of view, inverting the Fourier transform from such sparse and noisy
spectral measurements leads to a non-linear regression problem. Diffusion
tensor imaging (DTI) is the simplest modeling approach postulating a Gaussian
displacement distribution at each volume element (voxel). Typically the
inference is based on a linearized log-normal regression model that can fit the
spectral data at low frequencies. However such approximation fails to fit the
high frequency measurements which contain information about the details of the
displacement distribution but have a low signal to noise ratio. In this paper,
we work directly with the Rician noise model and cover the full range of
b-values. Using data augmentation to represent the likelihood, we reduce the
non-linear regression problem to the framework of generalized linear models.
Then we construct a Bayesian hierarchical model in order to perform
simultaneously estimation and regularization of the tensor field. Finally the
Bayesian paradigm is implemented using Markov chain Monte Carlo.
Comment: 37 pages, 3 figures
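The Rician model, i.e. the distribution of the magnitude of a complex signal corrupted by Gaussian noise, can be simulated directly. The parameter values below are illustrative; the comparison shows why a Gaussian (log-normal) linearization works at high signal-to-noise ratio but fails at low SNR.

```python
import numpy as np

rng = np.random.default_rng(6)

def rician(nu, sigma, size):
    """Rice-distributed magnitudes: |nu + complex Gaussian noise|,
    the noise model of magnitude MR data."""
    real = nu + sigma * rng.standard_normal(size)
    imag = sigma * rng.standard_normal(size)
    return np.hypot(real, imag)

# At high SNR the Rice distribution is close to Gaussian around nu; at low
# SNR it is strongly skewed and bounded below by zero, which is where the
# linearized log-normal approximation breaks down.
low_snr = rician(nu=0.5, sigma=1.0, size=100_000)
high_snr = rician(nu=10.0, sigma=1.0, size=100_000)
```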