Dropout Sampling for Robust Object Detection in Open-Set Conditions
Dropout Variational Inference, or Dropout Sampling, has been recently
proposed as an approximation technique for Bayesian Deep Learning and evaluated
for image classification and regression tasks. This paper investigates the
utility of Dropout Sampling for object detection for the first time. We
demonstrate how label uncertainty can be extracted from a state-of-the-art
object detection system via Dropout Sampling. We evaluate this approach on a
large synthetic dataset of 30,000 images, and a real-world dataset captured by
a mobile robot in a versatile campus environment. We show that this uncertainty
can be utilized to increase object detection performance under the open-set
conditions that are typically encountered in robotic vision. A Dropout Sampling
network is shown to achieve a 12.3% increase in recall (for the same precision
score as a standard network) and a 15.1% increase in precision (for the same
recall score as the standard network).
Comment: to appear in IEEE International Conference on Robotics and Automation 2018 (ICRA 2018).
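The core mechanism of Dropout Sampling is to keep dropout active at test time and average many stochastic forward passes, using the spread of the predictions as label uncertainty. A minimal numpy sketch, with hypothetical weights and input (not the paper's detection network), might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny network: 4 input features -> 8 hidden units -> 3 classes.
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(8, 3))

def stochastic_forward(x, drop_p=0.5):
    """One forward pass with dropout kept ON at test time."""
    h = np.maximum(x @ W1, 0)              # ReLU hidden layer
    mask = rng.random(h.shape) >= drop_p   # random dropout mask
    h = h * mask / (1 - drop_p)            # inverted dropout scaling
    z = h @ W2
    e = np.exp(z - z.max())
    return e / e.sum()                     # softmax class probabilities

x = np.array([1.0, -0.5, 0.3, 0.8])

# Average T stochastic passes; the sample spread carries the label uncertainty.
T = 100
samples = np.stack([stochastic_forward(x) for _ in range(T)])
mean_probs = samples.mean(axis=0)
entropy = -np.sum(mean_probs * np.log(mean_probs + 1e-12))
```

The entropy of the averaged class distribution is one common uncertainty score; thresholding it is how such a system can reject unknown objects under open-set conditions.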
Enhancing hyperspectral image unmixing with spatial correlations
This paper describes a new algorithm for hyperspectral image unmixing. Most
of the unmixing algorithms proposed in the literature do not take into account
the possible spatial correlations between the pixels. In this work, a Bayesian
model is introduced to exploit these correlations. The image to be unmixed is
assumed to be partitioned into regions (or classes) where the statistical
properties of the abundance coefficients are homogeneous. A Markov random field
is then proposed to model the spatial dependency of the pixels within any
class. Conditionally upon a given class, each pixel is modeled by using the
classical linear mixing model with additive white Gaussian noise. This strategy
is investigated for the well-known linear mixing model. For this model, the
posterior distributions of the unknown parameters and hyperparameters allow
one to infer the parameters of interest. These parameters include the
abundances for each pixel, the means and variances of the abundances for each
class, as well as a classification map indicating the classes of all pixels in
the image. To overcome the complexity of the posterior distribution of
interest, we consider Markov chain Monte Carlo methods that generate samples
distributed according to the posterior of interest. The generated samples are
then used for parameter and hyperparameter estimation. The accuracy of the
proposed algorithms is illustrated on synthetic and real data.
Comment: Manuscript accepted for publication in IEEE Trans. Geoscience and Remote Sensing.
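The linear mixing model underlying this work writes each pixel as an endmember matrix times an abundance vector plus white Gaussian noise, with abundances nonnegative and summing to one. A minimal sketch with synthetic endmembers (the least-squares inversion here is a crude stand-in for the paper's Bayesian sampler):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: L spectral bands, R pure endmember spectra.
L, R = 20, 3
M = np.abs(rng.normal(size=(L, R)))   # endmember matrix (columns = spectra)
a_true = np.array([0.6, 0.3, 0.1])    # abundances: nonnegative, sum to one

# Linear mixing model: observed pixel = M @ a + white Gaussian noise.
y = M @ a_true + rng.normal(scale=0.01, size=L)

# Inversion step sketch: ordinary least squares, then clip negatives and
# renormalize onto the simplex to respect the abundance constraints.
a_hat, *_ = np.linalg.lstsq(M, y, rcond=None)
a_hat = np.clip(a_hat, 0, None)
a_hat /= a_hat.sum()
```

The MRF prior in the paper goes further: pixels in the same class share the means and variances of their abundances, so neighboring pixels regularize one another rather than being inverted independently as above.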
Classification of chirp signals using hierarchical Bayesian learning and MCMC methods
This paper addresses the problem of classifying chirp signals using hierarchical Bayesian learning together with Markov chain Monte Carlo (MCMC) methods. Bayesian learning consists of estimating the distribution of the observed data conditional on each class from a set of training samples. Unfortunately, this estimation requires evaluating intractable multidimensional integrals. This paper studies an original implementation of hierarchical Bayesian learning that estimates the class-conditional probability densities using MCMC methods. The performance of this implementation is first studied via an academic example for which the class-conditional densities are known. The problem of classifying chirp signals is then addressed by using a similar hierarchical Bayesian learning implementation based on a Metropolis-within-Gibbs algorithm.
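A Metropolis-within-Gibbs sampler alternates exact conditional draws (Gibbs steps) with Metropolis-Hastings steps for parameters whose conditionals are not available in closed form. A minimal sketch on a toy Gaussian model, not the paper's chirp model: the variance is drawn from its inverse-gamma conditional, while the mean uses a random-walk Metropolis step.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: infer mean mu and variance v of a Gaussian.
data = rng.normal(loc=2.0, scale=1.5, size=200)
n = len(data)

mu, v = 0.0, 1.0          # initial state of the chain
mus, vs = [], []
for it in range(3000):
    # --- Gibbs step: v | mu, data is inverse-gamma (flat prior assumed) ---
    shape = n / 2
    scale = 0.5 * np.sum((data - mu) ** 2)
    v = scale / rng.gamma(shape)          # inverse-gamma draw via gamma
    # --- Metropolis step: random-walk proposal for mu | v, data ---
    prop = mu + rng.normal(scale=0.3)
    log_ratio = (np.sum((data - mu) ** 2) - np.sum((data - prop) ** 2)) / (2 * v)
    if np.log(rng.random()) < log_ratio:
        mu = prop                         # accept the proposal
    if it >= 1000:                        # discard burn-in samples
        mus.append(mu)
        vs.append(v)

mu_hat, v_hat = np.mean(mus), np.mean(vs)
```

The retained samples approximate the joint posterior, so posterior means (or any other summary) come directly from averaging them, exactly as the class-conditional densities are estimated from generated samples in the paper.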
Generative Supervised Classification Using Dirichlet Process Priors.
Choosing the appropriate parameter prior distributions associated to a given Bayesian model is a challenging problem. Conjugate priors can be selected for simplicity. However, conjugate priors can be too restrictive to accurately model the available prior information. This paper studies a new generative supervised classifier which assumes that the parameter prior distributions conditioned on each class are mixtures of Dirichlet processes. The motivation for using mixtures of Dirichlet processes is their known ability to model accurately a large class of probability distributions. A Monte Carlo method allowing one to sample according to the resulting class-conditional posterior distributions is then studied. The parameters appearing in the class-conditional densities can then be estimated using these generated samples (following Bayesian learning). The proposed supervised classifier is applied to the classification of altimetric waveforms backscattered from different surfaces (oceans, ice, forests, and deserts). This classification is a first step before developing tools allowing for the extraction of useful geophysical information from altimetric waveforms backscattered from nonoceanic surfaces.
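The flexibility of Dirichlet process priors comes from the fact that a DP draw is itself a random discrete distribution. A minimal sketch of the standard truncated stick-breaking construction (with a hypothetical Gaussian base measure, not the paper's altimetric model):

```python
import numpy as np

rng = np.random.default_rng(3)

def stick_breaking(alpha, n_atoms=100):
    """Truncated stick-breaking draw of Dirichlet process weights."""
    betas = rng.beta(1.0, alpha, size=n_atoms)         # stick proportions
    remaining = np.concatenate([[1.0], np.cumprod(1 - betas[:-1])])
    return betas * remaining                           # weight of each atom

w = stick_breaking(alpha=2.0)
atoms = rng.normal(size=100)       # atom locations drawn from a base measure

# Samples from the resulting random distribution: a mixture over the atoms.
draws = rng.choice(atoms, size=1000, p=w / w.sum())
```

Small concentration values of alpha put most mass on a few atoms, while large values spread mass widely; it is this adaptivity that lets DP mixtures model a large class of prior distributions without committing to a conjugate family.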
Adaptive Markov random fields for joint unmixing and segmentation of hyperspectral images
Linear spectral unmixing is a challenging problem in hyperspectral imaging that consists of decomposing an observed pixel into a linear combination of pure spectra (or endmembers) with their corresponding proportions (or abundances). Endmember extraction algorithms can be employed for recovering the spectral signatures while abundances are estimated using an inversion step. Recent works have shown that exploiting spatial dependencies between image pixels can improve spectral unmixing. Markov random fields (MRF) are classically used to model these spatial correlations and partition the image into multiple classes with homogeneous abundances. This paper proposes to define the MRF sites using similarity regions. These regions are built using a self-complementary area filter that stems from mathematical morphology. This kind of filter divides the original image into flat zones where the underlying pixels have the same spectral values. Once the MRF has been established, a hierarchical Bayesian algorithm is proposed to estimate the abundances, the class labels, the noise variance, and the corresponding hyperparameters. A hybrid Gibbs sampler is constructed to generate samples according to the corresponding posterior distribution of the unknown parameters and hyperparameters. Simulations conducted on synthetic and real AVIRIS data demonstrate the good performance of the algorithm.
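The flat zones that define the MRF sites are connected groups of pixels sharing the same spectral value. A minimal single-band sketch of extracting such zones by flood fill (a crude stand-in for the paper's self-complementary area filter, which additionally removes zones below an area threshold):

```python
import numpy as np

# Toy single-band "image": flat zones are connected runs of equal values.
img = np.array([[1, 1, 2],
                [1, 2, 2],
                [3, 3, 2]])

def flat_zones(img):
    """Label 4-connected regions of identical pixel value."""
    labels = -np.ones(img.shape, dtype=int)
    nxt = 0
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            if labels[i, j] >= 0:
                continue                   # pixel already assigned to a zone
            stack = [(i, j)]
            labels[i, j] = nxt
            while stack:                   # flood-fill one flat zone
                r, c = stack.pop()
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if (0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]
                            and labels[rr, cc] < 0
                            and img[rr, cc] == img[i, j]):
                        labels[rr, cc] = nxt
                        stack.append((rr, cc))
            nxt += 1
    return labels, nxt

labels, n_zones = flat_zones(img)
```

Each resulting zone then becomes one MRF site, so the Potts-like spatial prior operates on a handful of homogeneous regions instead of on every pixel individually.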