Gradient Scan Gibbs Sampler: an efficient algorithm for high-dimensional Gaussian distributions
This paper deals with Gibbs samplers that include high-dimensional
conditional Gaussian distributions. It proposes an efficient algorithm that
avoids direct high-dimensional Gaussian sampling and instead relies on a random
excursion along a small set of directions. The algorithm is proved to converge,
i.e. the drawn samples are asymptotically distributed according to the target
distribution. Our main motivation is inverse problems related to general
linear observation models and their solution in a hierarchical Bayesian
framework implemented through sampling algorithms. The algorithm finds direct
applications in semi-blind/unsupervised methods as well as in some non-Gaussian
methods. The paper provides an illustration focused on unsupervised estimation
for super-resolution methods.

Comment: 18 pages
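As a rough illustration of the idea (not the paper's gradient-scan rule, whose direction choice is more elaborate), a Gibbs move along a random direction can be made exact for a Gaussian target, because the step size along the direction has a closed-form 1-D Gaussian conditional. The sketch below, with hypothetical names, samples N(0, Q⁻¹) this way:

```python
import numpy as np

def directional_gibbs(Q, n_iter=5000, rng=None):
    """Sample from N(0, Q^{-1}) via exact 1-D Gibbs moves along random
    directions (illustrative; the paper's direction choice is different)."""
    rng = np.random.default_rng(rng)
    dim = Q.shape[0]
    x = np.zeros(dim)
    samples = np.empty((n_iter, dim))
    for t in range(n_iter):
        u = rng.standard_normal(dim)
        u /= np.linalg.norm(u)              # random search direction
        prec = u @ Q @ u                    # 1-D precision along u
        mean = -(u @ Q @ x) / prec          # conditional mean of the step size
        alpha = mean + rng.standard_normal() / np.sqrt(prec)
        x = x + alpha * u                   # exact draw from the 1-D conditional
        samples[t] = x
    return samples
```

Each 1-D move leaves the target invariant, and random directions span the space almost surely, so the chain converges without ever drawing a full high-dimensional Gaussian.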
Estimating hyperparameters and instrument parameters in regularized inversion. Illustration for SPIRE/Herschel map making
We describe regularized methods for image reconstruction and focus on the
question of hyperparameter and instrument parameter estimation, i.e.
unsupervised and myopic problems. We developed a Bayesian framework based on
the posterior density of all unknown quantities, given the observations.
This density is explored by a Markov chain Monte Carlo sampling technique based
on a Gibbs loop that includes a Metropolis-Hastings step. The numerical
evaluation relies on the SPIRE instrument of the Herschel observatory. Using
simulated and real observations, we show that the hyperparameters and
instrument parameters are correctly estimated, which opens up many perspectives
for imaging in astrophysics.
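The Gibbs-loop-with-Metropolis-Hastings structure can be sketched on a toy model (a hypothetical stand-in, not the SPIRE/Herschel instrument model): observations y_i ~ N(mu, 1/lam), with mu drawn from its conjugate Gaussian conditional and lam updated by a random-walk MH step on the log scale, flat priors assumed throughout:

```python
import numpy as np

def mh_within_gibbs(y, n_iter=4000, step=0.3, rng=None):
    """Toy sampler for y_i ~ N(mu, 1/lam): Gibbs draw of mu, then a
    random-walk Metropolis-Hastings step on log(lam) (flat priors assumed)."""
    rng = np.random.default_rng(rng)
    n = len(y)
    mu, lam = 0.0, 1.0
    chain = np.empty((n_iter, 2))
    for t in range(n_iter):
        # Gibbs step: mu | lam, y is Gaussian under a flat prior on mu
        mu = rng.normal(y.mean(), 1.0 / np.sqrt(n * lam))
        # MH step: log-normal random-walk proposal for lam
        prop = lam * np.exp(step * rng.standard_normal())
        def loglik(l):
            return 0.5 * n * np.log(l) - 0.5 * l * np.sum((y - mu) ** 2)
        # flat prior on lam; log(prop/lam) is the Jacobian of the log-scale proposal
        if np.log(rng.random()) < loglik(prop) - loglik(lam) + np.log(prop / lam):
            lam = prop
        chain[t] = mu, lam
    return chain
```

The real sampler replaces the scalar mu-update with a high-dimensional image draw and uses MH for the instrument parameters, whose conditionals have no closed form.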
Bayesian orthogonal component analysis for sparse representation
This paper addresses the problem of identifying a lower dimensional space
where observed data can be sparsely represented. This under-complete dictionary
learning task can be formulated as a blind separation problem of sparse sources
linearly mixed with an unknown orthogonal mixing matrix. This issue is
formulated in a Bayesian framework. First, the unknown sparse sources are
modeled as Bernoulli-Gaussian processes. To promote sparsity, a weighted
mixture of an atom at zero and a Gaussian distribution is proposed as prior
distribution for the unobserved sources. A non-informative prior distribution
defined on an appropriate Stiefel manifold is selected for the mixing matrix.
The Bayesian inference on the unknown parameters is conducted using a Markov
chain Monte Carlo (MCMC) method. A partially collapsed Gibbs sampler is
designed to generate samples asymptotically distributed according to the joint
posterior distribution of the unknown model parameters and hyperparameters.
These samples are then used to approximate the joint maximum a posteriori
estimator of the sources and mixing matrix. Simulations conducted on synthetic
data are reported to illustrate the performance of the method for recovering
sparse representations. An application to sparse coding on an under-complete
dictionary is finally investigated.

Comment: Revised version. Accepted to IEEE Trans. Signal Processing
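The sparsity-promoting prior can be sketched concretely (hypothetical parameter names; the scalar posterior below is a generic coordinate-wise Gibbs ingredient, simplified from the paper's partially collapsed sampler): a Bernoulli-Gaussian draw mixes an atom at zero with a Gaussian, and the posterior probability that a coefficient is active follows from comparing the two mixture components against a noisy observation.

```python
import numpy as np

def sample_bernoulli_gaussian(n, p, sigma, rng=None):
    """Draw n i.i.d. Bernoulli-Gaussian variables: 0 with probability 1-p,
    N(0, sigma^2) with probability p."""
    rng = np.random.default_rng(rng)
    active = rng.random(n) < p
    return np.where(active, rng.normal(0.0, sigma, n), 0.0)

def indicator_posterior(y, p, sigma, noise_sd):
    """Posterior probability that a coefficient is active given a noisy scalar
    observation y = s + e with e ~ N(0, noise_sd^2)."""
    def gauss(x, v):                      # zero-mean Gaussian density, variance v
        return np.exp(-0.5 * x * x / v) / np.sqrt(2.0 * np.pi * v)
    num = p * gauss(y, sigma**2 + noise_sd**2)
    return num / (num + (1.0 - p) * gauss(y, noise_sd**2))
```

Observations near zero are attributed to the atom at zero, while large observations flip the indicator on, which is what drives the sparse recovery.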
Making use of partial knowledge about hidden states in HMMs : an approach based on belief functions.
This paper addresses the problem of parameter estimation and state prediction in Hidden Markov Models (HMMs) based on observed outputs and partial knowledge of hidden states expressed in the belief function framework. The usual HMM model is recovered when the belief functions are vacuous. Parameters are learnt using the Evidential Expectation-Maximization algorithm, a recently introduced variant of the Expectation-Maximization algorithm for maximum likelihood estimation based on uncertain data. The inference problem, i.e., finding the most probable sequence of states based on observed outputs and partial knowledge of states, is also addressed. Experimental results demonstrate that partial information about hidden states, when available, may substantially improve estimation and prediction performance.
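For reference, the classical inference step that the paper generalizes, i.e. finding the most probable state sequence of a standard HMM from observed outputs alone, is the Viterbi recursion, sketched here in the log domain (illustrative code, not the evidential variant):

```python
import numpy as np

def viterbi(log_pi, log_A, log_B):
    """Most probable hidden-state path of a standard HMM.
    log_pi: (K,) initial log-probs; log_A: (K, K) transition log-probs;
    log_B: (T, K) per-time emission log-likelihoods."""
    T, K = log_B.shape
    delta = log_pi + log_B[0]                    # best log-score ending in each state
    psi = np.zeros((T, K), dtype=int)            # backpointers
    for t in range(1, T):
        scores = delta[:, None] + log_A          # (prev state, next state)
        psi[t] = np.argmax(scores, axis=0)
        delta = scores[psi[t], np.arange(K)] + log_B[t]
    path = np.empty(T, dtype=int)
    path[-1] = int(np.argmax(delta))
    for t in range(T - 2, -1, -1):               # trace the backpointers
        path[t] = psi[t + 1, path[t + 1]]
    return path
```

In the belief-function setting, the emission terms are further weighted by the partial knowledge on the states; with vacuous belief functions this reduces to the recursion above.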
Robust partial-learning in linear Gaussian systems
This paper deals with unsupervised and off-line learning of the parameters of linear Gaussian systems, i.e. estimating the transition and noise covariance matrices of a state-space system from a finite series of observations only. In practice, these systems arise from a physical problem in which there is partial knowledge either about the sensors that produce the observations or about the state of the studied system. We therefore propose an Expectation-Maximization-type learning algorithm that takes into account constraints on the parameters, such as the fact that two identical sensors have the same noise characteristics, so that the estimation procedure exploits this knowledge. The algorithms are designed for the pairwise linear Gaussian system, which accounts for supplementary cross-dependences between observations and hidden states with respect to the conventional linear system, while still allowing optimal filtering by means of a Kalman-like filter. The algorithm is made robust through QR decompositions and the propagation of a square root of the covariance matrices instead of the matrices themselves. It is assessed through a series of experiments that compare algorithms with and without partial knowledge, on short as well as long signals.
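The square-root robustification can be illustrated on the covariance time-update: propagating a square root S (with P = S Sᵀ) through a QR factorization avoids forming P explicitly and keeps it positive semi-definite. A minimal sketch with hypothetical names, using a generic transition A rather than the paper's pairwise model:

```python
import numpy as np

def sqrt_predict(S, A, Qc):
    """Time-update of a covariance square root: P_new = A P A^T + Q,
    computed as a QR factorization so P is never formed explicitly.
    S: square root of P (P = S @ S.T); A: transition; Qc: square root of Q."""
    M = np.hstack([A @ S, Qc])          # P_new = M @ M.T by construction
    _, R = np.linalg.qr(M.T)            # M.T = Q_ortho @ R  =>  M @ M.T = R.T @ R
    return R.T                          # triangular square root of P_new
```

The measurement update admits an analogous QR form; working only with square roots is what makes the EM iterations numerically robust on long signals.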
Bayesian image restoration and bacteria detection in optical endomicroscopy
Optical microscopy systems can be used to obtain high-resolution microscopic images of tissue cultures and ex vivo tissue samples. This imaging technique can be translated to in vivo, in situ applications by using optical fibres and miniature optics. Fibred optical endomicroscopy (OEM) can enable optical biopsy in organs inaccessible to any other imaging system, and hence can provide rapid and accurate diagnosis in a short time. The raw data the system produces is difficult to interpret, as it is modulated by a fibre bundle pattern, producing what is called the "honeycomb effect". Moreover, the data is further degraded by the fibre core cross-coupling problem. On the other hand, there is an unmet clinical need for automatic tools that can help clinicians detect fluorescently labelled bacteria in distal lung images. The aim of this thesis is to develop advanced image processing algorithms that address the above-mentioned problems. First, we provide a statistical model for the fibre core cross-coupling problem and the sparse sampling by imaging fibre bundles (honeycomb artefact), which are formulated here as a restoration problem for the first time in the literature. We then introduce a non-linear interpolation method, based on Gaussian process regression, in order to recover an interpretable scene from the deconvolved data. Second, we develop two bacteria detection algorithms, each of which provides different characteristics. The first approach considers a joint formulation of the sparse coding and anomaly detection problems. The anomalies here are considered candidate bacteria, which are annotated with the help of a trained clinician. Although this approach provides good detection performance and outperforms existing methods in the literature, the user has to carefully tune some crucial model parameters. Hence, we propose a more adaptive approach, for which a Bayesian framework is adopted. This approach not only outperforms the proposed supervised approach and existing methods in the literature but also provides computation times that compete with those of optimization-based methods.
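The Gaussian-process interpolation step can be sketched in 1-D (a simplified stand-in for the 2-D fibre-bundle geometry, with a squared-exponential kernel assumed and hypothetical parameter names):

```python
import numpy as np

def gp_interpolate(x_train, y_train, x_test, length=1.0, noise=1e-6):
    """Minimal 1-D GP regression: predictive mean at x_test given samples
    y_train observed at irregular locations x_train (think fibre cores)."""
    def k(a, b):                          # squared-exponential kernel
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)
    K = k(x_train, x_train) + noise * np.eye(len(x_train))
    return k(x_test, x_train) @ np.linalg.solve(K, y_train)
```

In the endomicroscopy setting, the training locations are the deconvolved core values and the test grid is the full pixel lattice, filling in the honeycomb gaps with a smooth, data-driven estimate.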
Semi-Huber Half Quadratic Function and Comparative Study of Some MRFs for Bayesian Image Restoration
The present work introduces an alternative method for digital image restoration in a Bayesian framework; in particular, the use of a new half-quadratic function is proposed whose performance is satisfactory compared with other functions in the existing literature. The Bayesian methodology is based on prior knowledge that allows efficient modelling of the image acquisition process. An adequate model must preserve object edges in the image while smoothing noise. Thus, we use a convexity criterion given by a semi-Huber function to obtain adequate weighting of the (half-quadratic) cost functions to be minimized. The principal objective when using Bayesian methods based on Markov Random Fields (MRFs) in the context of image processing is to eliminate the effects of excessive smoothing in the reconstruction of images that are rich in contours or edges. A comparison between the newly introduced scheme and three other existing schemes, for the cases of noise filtering and image deblurring, is presented. This collection of implemented methods is of course inspired by the use of MRFs such as the semi-Huber, generalized Gaussian, Welch, and Tukey potential functions with granularity control. The obtained results show a satisfactory performance and the effectiveness of the proposed estimator with respect to the other three estimators.