A Bayesian approach to adaptive detection in nonhomogeneous environments
We consider the adaptive detection of a signal of interest embedded in colored noise when the environment is nonhomogeneous, i.e., when the training samples used for adaptation do not share the same covariance matrix as the vector under test. A Bayesian framework is proposed in which the covariance matrices of the primary and secondary data are assumed to be random, with an appropriate joint distribution. The prior distributions of these matrices require only rough knowledge about the environment. This provides a flexible yet simple knowledge-aided model in which the degree of nonhomogeneity can be tuned through a few scalar variables. Within this framework, an approximate generalized likelihood ratio test is formulated. Accordingly, two Bayesian versions of the adaptive matched filter are presented, in which the conventional maximum likelihood estimate of the primary data covariance matrix is replaced either by its minimum mean-square error estimate or by its maximum a posteriori estimate. Both detectors require generating samples distributed according to the joint posterior distribution of the primary and secondary data covariance matrices, which is achieved through a Gibbs sampling strategy. Numerical simulations illustrate the performance of these detectors and compare it with that of the conventional adaptive matched filter.
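The Gibbs sampling strategy the detectors rely on can be illustrated on a toy model. The sketch below alternates draws from the full conditionals of a bivariate Gaussian (a standard textbook example, not the paper's covariance model), and then forms a posterior-mean (MMSE-style) estimate from the retained draws:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Gibbs sampler: alternate draws from the full conditionals of a
# bivariate Gaussian with correlation rho. The detectors in the paper use
# the same alternating scheme on the joint posterior of the covariance
# matrices; this model is purely illustrative.
rho = 0.8
n_iter = 5000
x = np.zeros(n_iter)
y = np.zeros(n_iter)
for t in range(1, n_iter):
    # x | y ~ N(rho * y, 1 - rho^2)
    x[t] = rng.normal(rho * y[t - 1], np.sqrt(1 - rho**2))
    # y | x ~ N(rho * x, 1 - rho^2)
    y[t] = rng.normal(rho * x[t], np.sqrt(1 - rho**2))

# Discard burn-in, then estimate from the retained draws
burn = 1000
print(np.corrcoef(x[burn:], y[burn:])[0, 1])  # close to rho
```

The same pattern (cycle through the unknowns, drawing each from its conditional given the rest) carries over to matrix-valued unknowns such as the primary and secondary covariance matrices.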
Statistical computation with kernels
Modern statistical inference has seen a tremendous increase in the size and complexity of models and datasets. As such, it has become reliant on advanced computational tools for implementation. A first canonical problem in this area is the numerical approximation of integrals of complex and expensive functions. Numerical integration is required for a variety of tasks, including prediction, model comparison and model choice. A second canonical problem is that of statistical inference for models with intractable likelihoods. These include models with intractable normalisation constants, or models which are so complex that their likelihood cannot be evaluated, but from which data can be generated. Examples include large graphical models, as well as many models in imaging or spatial statistics.
This thesis proposes to tackle these two problems using tools from the kernel methods and Bayesian non-parametrics literature. First, we analyse a well-known algorithm for numerical integration called Bayesian quadrature, and provide consistency and contraction rates. The algorithm is then assessed on a variety of statistical inference problems, and extended in several directions in order to reduce its computational requirements. We then demonstrate how the combination of reproducing kernels with Stein's method can lead to computational tools which can be used with unnormalised densities, including numerical integration and approximation of probability measures. We conclude by studying two minimum distance estimators derived from kernel-based statistical divergences which can be used for unnormalised and generative models.
In each instance, the tractability provided by reproducing kernels and their properties allows us to provide easily-implementable algorithms whose theoretical foundations can be studied in depth.
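A minimal Bayesian quadrature rule can be written in a few lines. The sketch below makes illustrative choices (an RBF kernel with lengthscale `ell` and the uniform measure on [0, 1], not necessarily the thesis' setting): the weights solve a kernel linear system against the kernel mean embedding of the measure, and the estimate is the weighted sum of integrand evaluations.

```python
import numpy as np
from math import erf, sqrt, pi

# Bayesian quadrature sketch. Assumptions (illustrative, not from the
# thesis): RBF kernel k(x, x') = exp(-(x - x')^2 / (2 ell^2)), uniform
# measure on [0, 1], equally spaced nodes.
def bq_estimate(f, n=20, ell=0.3, jitter=1e-8):
    x = np.linspace(0.0, 1.0, n)
    K = np.exp(-0.5 * (x[:, None] - x[None, :])**2 / ell**2)
    K += jitter * np.eye(n)  # stabilise the ill-conditioned kernel matrix
    # Kernel mean embedding: z_i = int_0^1 k(t, x_i) dt, in closed form
    z = np.array([ell * sqrt(pi / 2) *
                  (erf((1 - xi) / (sqrt(2) * ell)) + erf(xi / (sqrt(2) * ell)))
                  for xi in x])
    w = np.linalg.solve(K, z)  # quadrature weights
    return w @ f(x)            # posterior mean of the integral

est = bq_estimate(np.exp)      # int_0^1 e^x dx = e - 1
print(est)
```

Unlike Monte Carlo weights, these weights adapt to both the node locations and the assumed smoothness of the integrand, which is what enables the fast contraction rates studied in the thesis.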
Robust Optimisation Monte Carlo
This paper is on Bayesian inference for parametric statistical models that are defined by a stochastic simulator which specifies how data is generated. Exact sampling is then possible, but evaluating the likelihood function is typically prohibitively expensive. Approximate Bayesian Computation (ABC) is a framework to perform approximate inference in such situations. While basic ABC algorithms are widely applicable, they are notoriously slow, and much research has focused on increasing their efficiency. Optimisation Monte Carlo (OMC) has recently been proposed as an efficient and embarrassingly parallel method that leverages optimisation to accelerate the inference. In this paper, we demonstrate an important, previously unrecognised failure mode of OMC: it generates strongly overconfident approximations by collapsing regions of similar or near-constant likelihood into a single point. We propose an efficient, robust generalisation of OMC that corrects this. It makes fewer assumptions, retains the main benefits of OMC, and can be performed either as post-processing to OMC or as a stand-alone computation. We demonstrate the effectiveness of the proposed Robust OMC on toy examples and tasks in inverse graphics where we perform Bayesian inference with a complex image renderer.
Comment: 8 pages + 6 page appendix; v2: made clarifications, added a second possible algorithm implementation and its results; v3: small clarifications, to be published in AISTATS 202
Audio-Visual Speaker Tracking
Target motion tracking has found application in interdisciplinary fields including, but not limited to, surveillance and security, forensic science, intelligent transportation systems, driving assistance, monitoring of prohibited areas, medical science, robotics, action and expression recognition, individual speaker discrimination in multi-speaker environments, and video conferencing in the fields of computer vision and signal processing. Among these applications, speaker tracking in enclosed spaces has been gaining relevance due to the widespread advances of devices and technologies and the necessity for seamless solutions in real-time tracking and localization of speakers. However, speaker tracking is a challenging task in real-life scenarios, as several distinctive issues influence the tracking process, such as occlusions and an unknown number of speakers. One approach to overcoming these issues is to use multi-modal information, as it conveys complementary information about the state of the speakers compared to single-modal tracking. To use multi-modal information, several approaches have been proposed, which can be classified into two categories, namely deterministic and stochastic. This chapter aims at providing multimedia researchers with a state-of-the-art overview of tracking methods used for combining multiple modalities to accomplish various multimedia analysis tasks, classifying them into different categories and listing new and future trends in this field.
Accelerating MCMC Algorithms
Markov chain Monte Carlo algorithms are used to simulate from complex statistical distributions by way of a local exploration of these distributions. This local feature avoids heavy requests on understanding the nature of the target, but it also potentially induces a lengthy exploration of this target, with a requirement on the number of simulations that grows with the dimension of the problem and with the complexity of the data behind it. Several techniques are available for accelerating the convergence of these Monte Carlo algorithms, either at the exploration level (as in tempering, Hamiltonian Monte Carlo and partly deterministic methods) or at the exploitation level (with Rao-Blackwellisation and scalable methods).
Comment: This is a survey paper, submitted to WIREs Computational Statistics, with 6 figures
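The baseline that these acceleration techniques improve upon is the plain random-walk Metropolis sampler, sketched here on a toy one-dimensional Gaussian target (the target, step size, and starting point are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)

# Random-walk Metropolis on a standard Gaussian target. The slow, local
# exploration visible from the poor starting point is what tempering,
# Hamiltonian Monte Carlo, etc. are designed to accelerate.
def log_target(x):
    return -0.5 * x**2

n_iter, step = 20000, 1.0
chain = np.empty(n_iter)
chain[0] = 5.0  # deliberately poor start: the chain must walk in
for t in range(1, n_iter):
    prop = chain[t - 1] + step * rng.normal()  # local proposal
    # Accept with probability min(1, target(prop) / target(current))
    if np.log(rng.uniform()) < log_target(prop) - log_target(chain[t - 1]):
        chain[t] = prop
    else:
        chain[t] = chain[t - 1]  # reject: stay put

print(chain[5000:].mean(), chain[5000:].std())  # near 0 and 1
```

Exploration-level accelerations change how `prop` is generated (e.g. gradient-informed moves in Hamiltonian Monte Carlo); exploitation-level ones, such as Rao-Blackwellisation, extract more information from the chain that is already produced.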
Regression-based Monte Carlo integration
© Corentin Salaun, Adrien Gruson, Binh-Son Hua, Toshiya Hachisuka & Gurprit Singh | ACM, (2022). This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in ACM Transactions on Graphics, http://dx.doi.org/10.1145/3528223.3530095.
Monte Carlo integration is typically interpreted as an estimator of the expected value using stochastic samples. There exists an alternative interpretation in calculus where Monte Carlo integration can be seen as estimating a constant function, from the stochastic evaluations of the integrand, that integrates to the original integral. The integral mean value theorem states that this constant function should be the mean (or expectation) of the integrand. Since both interpretations result in the same estimator, little attention has been devoted to the calculus-oriented interpretation. We show that the calculus-oriented interpretation actually implies the possibility of using a more complex function than a constant one to construct a more efficient estimator for Monte Carlo integration. We build a new estimator based on this interpretation and relate our estimator to control variates with least-squares regression on the stochastic samples of the integrand. Unlike prior work, our resulting estimator is provably better than or equal to the conventional Monte Carlo estimator. To demonstrate the strength of our approach, we introduce a practical estimator that can act as a simple drop-in replacement for conventional Monte Carlo integration. We experimentally validate our framework on various light transport integrals. The code is available at https://github.com/iribis/regressionmc
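The regression idea can be sketched in one dimension (a simplified illustration, not the authors' implementation): fit a least-squares polynomial to the integrand samples, integrate the fit in closed form, and add a Monte Carlo estimate of the residual, which is exactly the control-variate construction the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(3)

# Regression-based Monte Carlo sketch. Illustrative choices: uniform
# samples on [0, 1], a quadratic least-squares fit as the control variate.
def regression_mc(f, n=64, degree=2):
    x = rng.uniform(0.0, 1.0, n)
    y = f(x)
    coef = np.polyfit(x, y, degree)   # least-squares fit g approximating f
    g = np.poly1d(coef)
    # Integrate the polynomial fit exactly via its antiderivative
    integral_g = np.polyint(g)(1.0) - np.polyint(g)(0.0)
    # Monte Carlo estimate of the residual int (f - g)
    residual = np.mean(y - g(x))
    return integral_g + residual

est = regression_mc(np.exp)  # int_0^1 e^x dx = e - 1
print(est)
```

The closer the regressed function tracks the integrand, the smaller the residual's variance, whereas conventional Monte Carlo corresponds to the degenerate degree-0 case of fitting a constant.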
Detection-assisted Object Tracking by Mobile Cameras
Tracking-by-detection is a class of new tracking approaches that utilizes recent developments in object detection algorithms. This type of approach performs object detection for each frame and uses data association algorithms to associate new observations with existing targets. Inspired by the core idea of the tracking-by-detection framework, we propose a new framework called detection-assisted tracking, where the object detection algorithm provides help to the tracking algorithm only when such help is necessary; thus object detection, a very time-consuming task, is performed only when needed. The proposed framework is also able to handle complicated scenarios where cameras are allowed to move and occlusion or multiple similar objects exist.
We also port the core component of the proposed framework, the detector, onto embedded smart cameras. Contrary to traditional scenarios where the smart cameras are assumed to be static, we allow the smart cameras to move around in the scene. Our approach employs a histogram of oriented gradients (HOG) object detector for foreground detection, enabling more robust detection on mobile platforms. Traditional background subtraction methods are not suitable for mobile platforms, where the background changes constantly.
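The core of the HOG feature, orientation histograms accumulated over image cells, can be sketched in a few lines (a simplified illustration; a full detector such as OpenCV's HOGDescriptor adds block normalisation and a trained SVM on top):

```python
import numpy as np

# Simplified HOG feature sketch: per-cell histograms of unsigned gradient
# orientations, with votes weighted by gradient magnitude. Cell size and
# bin count mirror common defaults (8x8 cells, 9 bins) but are illustrative.
def hog_cells(img, cell=8, bins=9):
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180  # unsigned orientation
    h, w = img.shape
    feats = np.zeros((h // cell, w // cell, bins))
    for i in range(h // cell):
        for j in range(w // cell):
            m = mag[i*cell:(i+1)*cell, j*cell:(j+1)*cell].ravel()
            a = ang[i*cell:(i+1)*cell, j*cell:(j+1)*cell].ravel()
            idx = (a / (180 / bins)).astype(int) % bins
            np.add.at(feats[i, j], idx, m)  # magnitude-weighted votes
    return feats

img = np.zeros((32, 32))
img[:, 16:] = 1.0  # synthetic vertical edge
feats = hog_cells(img)
print(feats.shape)  # one 9-bin histogram per 8x8 cell
```

Because the feature depends on local gradient structure rather than on a background model, it remains usable when the camera, and hence the background, is moving.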
Advisers: Senem Velipasalar and Mustafa Cenk Gurso
Computation of Electromagnetic Fields Scattered From Objects With Uncertain Shapes Using Multilevel Monte Carlo Method
Computational tools for characterizing electromagnetic scattering from objects with uncertain shapes are needed in various applications, ranging from remote sensing at microwave frequencies to Raman spectroscopy at optical frequencies. Often, such computational tools use the Monte Carlo (MC) method to sample a parametric space describing geometric uncertainties. For each sample, which corresponds to a realization of the geometry, a deterministic electromagnetic solver computes the scattered fields. However, for an accurate statistical characterization, the number of MC samples has to be large. In this work, to address this challenge, the continuation multilevel Monte Carlo (CMLMC) method is used together with a surface integral equation solver. The CMLMC method optimally balances statistical errors due to sampling of the parametric space and numerical errors due to the discretization of the geometry, using a hierarchy of discretizations from coarse to fine. The number of realizations of finer discretizations can be kept low, with most samples computed on coarser discretizations to minimize computational cost. Consequently, the total execution time is significantly reduced in comparison to the standard MC scheme.
Comment: 25 pages, 10 figures
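The multilevel idea can be sketched with a toy one-dimensional stand-in for the electromagnetic solver. The `solver` below is hypothetical: its bias decays geometrically in the level, mimicking a discretization hierarchy from coarse to fine, and the estimator telescopes the levels so that most samples land on the cheap coarse level.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical level-dependent approximation P_l of a quantity of interest:
# bias relative to the fine-level answer decays like 2^{-level}, standing in
# for a solver on successively refined surface discretizations.
def solver(x, level):
    return np.sin(x) + 2.0**(-level) * np.cos(x)

def mlmc(levels, n_per_level):
    total = 0.0
    for l, n in zip(range(levels + 1), n_per_level):
        x = rng.normal(size=n)  # random "geometry" parameter samples
        if l == 0:
            total += solver(x, 0).mean()  # coarsest level: plain MC
        else:
            # Correction E[P_l - P_{l-1}]: small variance, so few samples
            total += (solver(x, l) - solver(x, l - 1)).mean()
    return total

# Many samples on the coarse level, few on the expensive fine ones
est = mlmc(3, [100000, 10000, 2000, 500])
print(est)
```

The telescoping sum E[P_L] = E[P_0] + sum of E[P_l - P_{l-1}] matches the fine-level expectation, while the shrinking variance of the corrections is what lets the fine-level sample counts stay low; CMLMC additionally chooses these counts adaptively to balance statistical and discretization error.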