
### Spectral mixture analysis of EELS spectrum-images

Recent advances in detectors and computer science have enabled the
acquisition and the processing of multidimensional datasets, in particular in
the field of spectral imaging. Benefiting from these new developments, earth
scientists try to recover the reflectance spectra of macroscopic materials
(e.g., water, grass, mineral types...) present in an observed scene and to
estimate their respective proportions in each mixed pixel of the acquired
image. This task is usually referred to as spectral mixture analysis or
spectral unmixing (SU). SU aims at decomposing the measured pixel spectrum into
a collection of constituent spectra, called endmembers, and a set of
corresponding fractions (abundances) that indicate the proportion of each
endmember present in the pixel. Similarly, when processing spectrum-images,
microscopists usually try to map elemental, physical and chemical state
information of a given material. This paper reports how a SU algorithm
dedicated to remote sensing hyperspectral images can be successfully applied to
analyze a spectrum-image resulting from electron energy-loss spectroscopy (EELS).
SU generally overcomes standard limitations inherent to other multivariate
statistical analysis methods, such as principal component analysis (PCA) or
independent component analysis (ICA), that have been previously used to analyze
EELS maps. Indeed, ICA and PCA may perform poorly for linear spectral mixture
analysis due to the strong dependence between the abundances of the different
materials. One example is presented here to demonstrate the potential of this
technique for EELS analysis.

Comment: Manuscript accepted for publication in Ultramicroscopy.
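As a rough illustration of the linear mixing model underlying SU (not the paper's specific algorithm), the sketch below mixes two invented endmember spectra into a pixel and recovers the abundances by fully constrained least squares: non-negativity via `scipy.optimize.nnls` and the sum-to-one constraint enforced softly by a heavily weighted extra equation. All spectra and values are synthetic placeholders.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
bands = 50

# Two invented "endmember" spectra and a mixed pixel with known abundances.
E = np.stack([np.linspace(1.0, 2.0, bands),        # endmember 1
              np.exp(-np.linspace(0, 3, bands))])  # endmember 2
true_abundances = np.array([0.3, 0.7])
pixel = true_abundances @ E + 0.001 * rng.standard_normal(bands)

# Fully constrained least squares: non-negativity comes from nnls itself,
# and the sum-to-one constraint is imposed softly by appending a heavily
# weighted row of ones to the system.
delta = 100.0
A = np.vstack([E.T, delta * np.ones(2)])
b = np.append(pixel, delta)
abundances, _ = nnls(A, b)
print(abundances)  # close to [0.3, 0.7]
```

The same decomposition applies whether the "pixel" is a reflectance spectrum from a remote-sensing scene or an EELS spectrum from a spectrum-image.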

### Joint segmentation of wind speed and direction using a hierarchical model

The problem of detecting changes in wind speed and direction is considered. Bayesian priors, with various degrees of certainty, are used to represent relationships between the two time series. Segmentation is then conducted using a hierarchical Bayesian model that accounts for correlations between the wind speed and direction. A Gibbs sampling strategy overcomes the computational complexity of the hierarchical model and is used to estimate the unknown parameters and hyperparameters. Extensions to other statistical models are also discussed. These models allow us to study other joint segmentation problems, including segmentation of wave amplitude and direction. The performance of the proposed algorithms is illustrated with results obtained with synthetic and real data.

### Bayesian orthogonal component analysis for sparse representation

This paper addresses the problem of identifying a lower dimensional space
where observed data can be sparsely represented. This under-complete dictionary
learning task can be formulated as a blind separation problem of sparse sources
linearly mixed with an unknown orthogonal mixing matrix. This issue is
formulated in a Bayesian framework. First, the unknown sparse sources are
modeled as Bernoulli-Gaussian processes. To promote sparsity, a weighted
mixture of an atom at zero and a Gaussian distribution is proposed as prior
distribution for the unobserved sources. A non-informative prior distribution
defined on an appropriate Stiefel manifold is elected for the mixing matrix.
The Bayesian inference on the unknown parameters is conducted using a Markov
chain Monte Carlo (MCMC) method. A partially collapsed Gibbs sampler is
designed to generate samples asymptotically distributed according to the joint
posterior distribution of the unknown model parameters and hyperparameters.
These samples are then used to approximate the joint maximum a posteriori
estimator of the sources and mixing matrix. Simulations conducted on synthetic
data are reported to illustrate the performance of the method for recovering
sparse representations. An application to sparse coding on under-complete
dictionary is finally investigated.

Comment: Revised version. Accepted to IEEE Trans. Signal Processing.
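A minimal sketch of the generative model described above, with invented dimensions: sources drawn from a Bernoulli-Gaussian prior (a weighted mixture of an atom at zero and a Gaussian), mixed by a semi-orthogonal matrix on the Stiefel manifold, here obtained by QR decomposition of a Gaussian matrix rather than by the paper's MCMC inference.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sources, n_obs, n_samples = 4, 8, 200

# Bernoulli-Gaussian sources: each entry is zero with probability 1 - w,
# otherwise drawn from a zero-mean Gaussian (the atom-at-zero mixture prior).
w, sigma = 0.2, 1.0
active = rng.random((n_sources, n_samples)) < w
sources = active * rng.normal(0.0, sigma, (n_sources, n_samples))

# A semi-orthogonal mixing matrix (orthonormal columns), i.e. a point on the
# Stiefel manifold, drawn here via QR decomposition of a Gaussian matrix.
H, _ = np.linalg.qr(rng.standard_normal((n_obs, n_sources)))
observations = H @ sources

print(np.mean(sources == 0.0))                  # roughly 1 - w = 0.8
print(np.allclose(H.T @ H, np.eye(n_sources)))  # True: H is semi-orthogonal
```

The inference task the paper addresses is the reverse direction: recovering `sources` and `H` from `observations` alone.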

### Estimating the number of endmembers in hyperspectral images using the normal compositional model and a hierarchical Bayesian algorithm

This paper studies a semi-supervised Bayesian unmixing algorithm for hyperspectral images. This algorithm is based on the normal compositional model recently introduced by Eismann and Stein. The normal compositional model assumes that each pixel of the image is modeled as a linear combination of an unknown number of pure materials, called endmembers. However, contrary to the classical linear mixing model, these endmembers are supposed to be random in order to model uncertainties regarding their knowledge. This paper proposes to estimate the mixture coefficients of the normal compositional model (referred to as abundances) as well as their number using a reversible jump Bayesian algorithm. The performance of the proposed methodology is evaluated through simulations conducted on synthetic and real AVIRIS images.
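The key difference from the classical linear mixing model can be sketched in a few lines: under the normal compositional model each pixel mixes a fresh Gaussian draw around each mean endmember spectrum, rather than the fixed spectrum itself. The spectra, abundances, and variance below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
bands, n_end = 30, 3

# Hypothetical mean endmember spectra (invented shapes).
means = np.stack([np.linspace(0.2, 0.8, bands),
                  np.linspace(0.9, 0.1, bands),
                  0.5 + 0.3 * np.sin(np.linspace(0, np.pi, bands))])
abundances = np.array([0.5, 0.3, 0.2])  # non-negative, sum to one
sigma = 0.01                            # endmember variability

# Normal compositional model: the endmembers themselves are random, so each
# pixel combines a new Gaussian perturbation of every mean spectrum.
endmember_draws = means + sigma * rng.standard_normal((n_end, bands))
pixel = abundances @ endmember_draws

print(pixel.shape)  # (30,)
```

Setting `sigma = 0` recovers the classical linear mixing model as a special case.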

### Enhancing hyperspectral image unmixing with spatial correlations

This paper describes a new algorithm for hyperspectral image unmixing. Most
of the unmixing algorithms proposed in the literature do not take into account
the possible spatial correlations between the pixels. In this work, a Bayesian
model is introduced to exploit these correlations. The image to be unmixed is
assumed to be partitioned into regions (or classes) where the statistical
properties of the abundance coefficients are homogeneous. A Markov random field
is then proposed to model the spatial dependency of the pixels within any
class. Conditionally upon a given class, each pixel is modeled by using the
classical linear mixing model with additive white Gaussian noise. This strategy
is investigated for the well-known linear mixing model. For this model, the
posterior distributions of the unknown parameters and hyperparameters allow
one to infer the parameters of interest. These parameters include the
abundances for each pixel, the means and variances of the abundances for each
class, as well as a classification map indicating the classes of all pixels in
the image. To overcome the complexity of the posterior distribution of
interest, we consider Markov chain Monte Carlo methods that generate samples
distributed according to the posterior of interest. The generated samples are
then used for parameter and hyperparameter estimation. The accuracy of the
proposed algorithms is illustrated on synthetic and real data.

Comment: Manuscript accepted for publication in IEEE Trans. Geoscience and Remote Sensing.
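The hierarchical structure described above can be sketched generatively with invented numbers: a classification map assigns each pixel to a class, and abundances are drawn around class-specific means. Here the map is fixed to two half-image regions purely for illustration; in the paper it is an unknown Markov random field inferred jointly with the abundances.

```python
import numpy as np

rng = np.random.default_rng(3)
h, w, n_end = 20, 20, 2

# Hypothetical classification map: left half is class 0, right half class 1.
class_map = np.zeros((h, w), dtype=int)
class_map[:, w // 2:] = 1

# Class-conditional abundance statistics: per-class mean and standard
# deviation of the first endmember's abundance (the second is one minus it).
class_means = np.array([0.8, 0.2])
class_std = 0.05

a1 = np.clip(class_means[class_map]
             + class_std * rng.standard_normal((h, w)), 0.0, 1.0)
abundance_maps = np.stack([a1, 1.0 - a1])  # abundances sum to one per pixel

print(abundance_maps[:, 0, 0])   # class 0 pixel: roughly [0.8, 0.2]
print(abundance_maps[:, 0, -1])  # class 1 pixel: roughly [0.2, 0.8]
```

Pixels would then be generated from these abundances via the linear mixing model with additive white Gaussian noise, as the abstract describes.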

### CS Decomposition Based Bayesian Subspace Estimation

In numerous applications, it is required to estimate the principal subspace of the data, possibly from a very limited number of samples. Additionally, it often occurs that some rough knowledge about this subspace is available and could be used to improve subspace estimation accuracy in this case. This is the problem we address herein and, in order to solve it, a Bayesian approach is proposed. The main idea consists of using the CS decomposition of the semi-orthogonal matrix whose columns span the subspace of interest. This parametrization is intuitively appealing and allows for non-informative prior distributions of the matrices involved in the CS decomposition and very mild assumptions about the angles between the actual subspace and the prior subspace. The posterior distributions are derived and a Gibbs sampling scheme is presented to obtain the minimum mean-square distance estimator of the subspace of interest. Numerical simulations and an application to real hyperspectral data assess the validity and performance of the estimator.

### Joint segmentation of multivariate astronomical time series: Bayesian sampling with a hierarchical model

Astronomy and other sciences often face the problem of detecting and characterizing structure in two or more related time series. This paper approaches such problems using Bayesian priors to represent relationships between signals with various degrees of certainty, and not just rigid constraints. The segmentation is conducted by using a hierarchical Bayesian approach to a piecewise constant Poisson rate model. A Gibbs sampling strategy allows joint estimation of the unknown parameters and hyperparameters. Results obtained with synthetic and real photon counting data illustrate the performance of the proposed algorithm.
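A stripped-down, single-series, single-changepoint version of the piecewise constant Poisson rate model admits a closed-form posterior, which the sketch below computes on invented photon counts. With a conjugate Gamma prior on each segment's rate, the rate integrates out analytically; the full method instead segments multiple series jointly with an unknown number of changepoints via Gibbs sampling.

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(4)
# Synthetic photon counts with a rate change at index 60 (invented values).
counts = np.concatenate([rng.poisson(2.0, 60), rng.poisson(8.0, 40)])
n = len(counts)
a, b = 1.0, 1.0  # Gamma(a, b) prior on each segment's Poisson rate

def log_marginal(y):
    """Log marginal likelihood of one segment, the rate integrated out."""
    s, m = y.sum(), len(y)
    return (gammaln(a + s) - gammaln(a) + a * np.log(b)
            - (a + s) * np.log(b + m) - gammaln(y + 1).sum())

# Posterior over the changepoint position (uniform prior over positions):
# the series factors into two independent segments given the changepoint.
log_post = np.array([log_marginal(counts[:t]) + log_marginal(counts[t:])
                     for t in range(1, n)])
tau = 1 + np.argmax(log_post)
print(tau)  # close to the true changepoint at 60
```

Joint segmentation of several series couples such indicator variables across signals through shared hyperparameters, as the abstract describes.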

### Minimum mean square distance estimation of a subspace

We consider the problem of subspace estimation in a Bayesian setting. Since
we are operating in the Grassmann manifold, the usual approach which consists
of minimizing the mean square error (MSE) between the true subspace $U$ and its
estimate $\hat{U}$ may not be adequate as the MSE is not the natural metric in
the Grassmann manifold. As an alternative, we propose to carry out subspace
estimation by minimizing the mean square distance (MSD) between $U$ and its
estimate, where the considered distance is a natural metric in the Grassmann
manifold, viz. the distance between the projection matrices. We show that the
resulting estimator is no longer the posterior mean of $U$ but entails
computing the principal eigenvectors of the posterior mean of $U U^{T}$.
Derivation of the MMSD estimator is carried out in a few illustrative examples
including a linear Gaussian model for the data and a Bingham or von Mises
Fisher prior distribution for $U$. In all scenarios, posterior distributions
are derived and the MMSD estimator is obtained either analytically or
implemented via a Markov chain Monte Carlo simulation method. The method is
shown to provide accurate estimates even when the number of samples is lower
than the dimension of $U$. An application to hyperspectral imagery is finally
investigated.
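The central computation, taking the principal eigenvectors of the posterior mean of $U U^{T}$, can be sketched directly. Since the true posterior samples require the paper's MCMC machinery, the snippet below substitutes invented surrogate samples scattered around a known subspace and then applies the MMSD recipe.

```python
import numpy as np

rng = np.random.default_rng(5)
dim, r, n_mcmc = 10, 2, 500

# Surrogate "posterior samples": orthonormal bases scattered around a true
# subspace (a stand-in for MCMC output, not the paper's sampler).
U_true, _ = np.linalg.qr(rng.standard_normal((dim, r)))
samples = []
for _ in range(n_mcmc):
    U, _ = np.linalg.qr(U_true + 0.1 * rng.standard_normal((dim, r)))
    samples.append(U)

# MMSD estimate: principal eigenvectors of the posterior mean of U U^T,
# i.e. of the averaged projection matrix.
P_mean = np.mean([U @ U.T for U in samples], axis=0)
eigvals, eigvecs = np.linalg.eigh(P_mean)
U_mmsd = eigvecs[:, -r:]  # eigenvectors of the r largest eigenvalues

# Subspace distance via projection matrices, the natural Grassmann metric
# that the MSD criterion is built on.
dist = np.linalg.norm(U_true @ U_true.T - U_mmsd @ U_mmsd.T)
print(dist)  # small: the estimate lies close to the true subspace
```

Note that averaging the samples $U$ themselves would be meaningless (a basis is only defined up to rotation); averaging the projection matrices $U U^{T}$ is what makes the estimator well defined on the Grassmann manifold.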

### Hierarchical Bayesian sparse image reconstruction with application to MRFM

This paper presents a hierarchical Bayesian model to reconstruct sparse
images when the observations are obtained from linear transformations and
corrupted by an additive white Gaussian noise. Our hierarchical Bayes model is
well suited to such naturally sparse image applications as it seamlessly
accounts for properties such as sparsity and positivity of the image via
appropriate Bayes priors. We propose a prior that is based on a weighted
mixture of a positive exponential distribution and a mass at zero. The prior
has hyperparameters that are tuned automatically by marginalization over the
hierarchical Bayesian model. To overcome the complexity of the posterior
distribution, a Gibbs sampling strategy is proposed. The Gibbs samples can be
used to estimate the image to be recovered, e.g. by maximizing the estimated
posterior distribution. In our fully Bayesian approach the posteriors of all
the parameters are available. Thus our algorithm provides more information than
other previously proposed sparse reconstruction methods that only give a point
estimate. The performance of our hierarchical Bayesian sparse reconstruction
method is illustrated on synthetic and real data collected from a tobacco virus
sample using a prototype MRFM instrument.

Comment: v2: final version; IEEE Trans. Image Processing.
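The prior at the heart of the model, a weighted mixture of a mass at zero and a positive exponential distribution, is easy to sketch. The snippet below draws a sparse, positive image from that prior and forms a noisy linear observation of it (identity transform, invented sizes and weights); the paper's contribution is the reverse inference via Gibbs sampling.

```python
import numpy as np

rng = np.random.default_rng(6)
h, w = 32, 32

# Mixture prior: each pixel is exactly zero with probability 1 - w_prob
# (the mass at zero), otherwise drawn from an exponential distribution,
# which enforces both sparsity and positivity.
w_prob, scale = 0.1, 1.0
active = rng.random((h, w)) < w_prob
image = active * rng.exponential(scale, (h, w))

# A noisy linear observation of the sparse image (identity transform here;
# MRFM involves a point-spread-function convolution instead).
noisy = image + 0.05 * rng.standard_normal((h, w))

print(image.min() >= 0.0)          # True: the prior enforces positivity
print(float(np.mean(image == 0)))  # roughly 0.9: most pixels exactly zero
```

In the hierarchical model, `w_prob` and `scale` are themselves assigned priors and marginalized out, which is what makes the hyperparameter tuning automatic.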
