
    Goal-oriented adaptive sampling under random field modelling of response probability distributions

    In the study of natural and artificial complex systems, responses that are not completely determined by the considered decision variables are commonly modelled probabilistically, resulting in response distributions that vary across the decision space. We consider cases where the spatial variation of these response distributions concerns not only their mean and/or variance but also other features, including for instance shape or uni-modality versus multi-modality. Our contributions build upon a non-parametric Bayesian approach to modelling the induced fields of probability distributions, and in particular upon a spatial extension of the logistic Gaussian model. The considered models deliver probabilistic predictions of response distributions at candidate points, allowing, for instance, (approximate) posterior simulation of probability density functions, joint prediction of multiple moments and other functionals of target distributions, and quantification of the impact of collecting new samples on the state of knowledge of the distribution field of interest. In particular, we introduce adaptive sampling strategies that leverage the potential of the considered random distribution field models to guide system evaluations in a goal-oriented way, with a view towards parsimoniously addressing calibration and related problems from non-linear (stochastic) inversion and global optimisation.
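    The sketch below illustrates the adaptive sampling idea in a deliberately simplified form: a scalar Gaussian process surrogate fitted to noisy evaluations, with the next sample placed where predictive uncertainty is largest. The paper itself models entire response distributions with a spatial logistic Gaussian model and uses goal-oriented criteria; the surrogate, kernel, and uncertainty-sampling rule here are illustrative stand-ins, not the authors' method.

        # Simplified adaptive-sampling loop (illustrative; see note above).
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(0)

        def noisy_system(x):
            # Stand-in stochastic response whose distribution varies over x.
            return np.sin(3 * x) + 0.2 * rng.standard_normal(x.shape)

        X = rng.uniform(0, 1, (5, 1))                 # initial design
        y = noisy_system(X).ravel()
        candidates = np.linspace(0, 1, 200).reshape(-1, 1)

        for _ in range(10):
            gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=0.05).fit(X, y)
            _, std = gp.predict(candidates, return_std=True)
            x_next = candidates[[np.argmax(std)]]     # sample where we know least
            X = np.vstack([X, x_next])
            y = np.append(y, noisy_system(x_next).ravel())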

    A Naive Bayes Source Classifier for X-ray Sources

    The Chandra Carina Complex Project (CCCP) provides a sensitive X-ray survey of a nearby starburst region over >1 square degree in extent. Thousands of faint X-ray sources are found, many concentrated into rich young stellar clusters. However, significant contamination from unrelated Galactic and extragalactic sources is present in the X-ray catalog. We describe the use of a naive Bayes classifier to assign membership probabilities to individual sources, based on source location, X-ray properties, and visual/infrared properties. For the particular membership decision rule adopted, 75% of CCCP sources are classified as members, 11% are classified as contaminants, and 14% remain unclassified. The resulting sample of stars likely to be Carina members is used in several other studies, which appear in a Special Issue of the ApJS devoted to the CCCP.
    Comment: Accepted for the ApJS Special Issue on the Chandra Carina Complex Project (CCCP), scheduled for publication in May 2011. All 16 CCCP Special Issue papers are available at http://cochise.astro.psu.edu/Carina_public/special_issue.html through 2011 at least. 19 pages, 7 figures.
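    As a rough illustration of this kind of classifier, the sketch below trains a Gaussian naive Bayes model on synthetic features and applies a three-way decision rule (member / contaminant / unclassified). The features, class balance, and the 0.8/0.2 probability thresholds are invented for illustration; the CCCP paper derives its membership rule from source location together with X-ray and visual/infrared properties.

        # Illustrative naive Bayes membership classifier on synthetic features.
        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        rng = np.random.default_rng(1)
        # Synthetic features: (off-axis distance, X-ray hardness, IR colour).
        members = rng.normal([0.5, 1.0, 0.8], 0.3, (300, 3))
        contaminants = rng.normal([1.5, 2.0, 0.2], 0.4, (100, 3))
        X_train = np.vstack([members, contaminants])
        y_train = np.array([1] * 300 + [0] * 100)   # 1 = member, 0 = contaminant

        clf = GaussianNB().fit(X_train, y_train)

        def classify(features, hi=0.8, lo=0.2):
            # Three-way rule: confident member, confident contaminant,
            # or leave the source unclassified.
            p_member = clf.predict_proba(features.reshape(1, -1))[0, 1]
            if p_member >= hi:
                return "member"
            if p_member <= lo:
                return "contaminant"
            return "unclassified"

        print(classify(np.array([0.6, 1.1, 0.7])))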

    On the use of autonomous unmanned vehicles in response to hazardous atmospheric release incidents

    Recent events have induced a surge of interest in methods of response to releases of hazardous materials or gases into the atmosphere. In the last decade there has been particular interest in mapping and quantifying emissions for regulatory purposes, emergency response, and environmental monitoring. Examples include responding to events such as gas leaks, nuclear accidents, or chemical, biological or radiological (CBR) accidents or attacks, and even exploring sources of methane emissions on the planet Mars. This thesis presents a review of the potential responses to hazardous releases, which includes source localisation, boundary tracking, mapping, and source term estimation. [Continues.]
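    For one of the response modes listed there, source term estimation, a minimal Bayesian sketch is given below: a toy dispersion model plus a grid posterior over the source location. The inverse-square "plume", the sensor layout, and the assumption of a known release rate are placeholders for the atmospheric dispersion modelling and sequential inference used in practice.

        # Toy Bayesian source localisation from concentration readings.
        import numpy as np

        def forward(src_xy, q, sensors):
            # Toy dispersion: concentration decays with squared distance.
            d2 = np.sum((sensors - src_xy) ** 2, axis=1)
            return q / (1.0 + d2)

        rng = np.random.default_rng(2)
        sensors = rng.uniform(0, 10, (8, 2))          # fixed sensor positions
        true_src, true_q, noise = np.array([6.0, 3.0]), 5.0, 0.05
        obs = forward(true_src, true_q, sensors) + noise * rng.standard_normal(8)

        # Grid posterior over source location (known release rate, flat prior).
        xs = ys = np.linspace(0, 10, 101)
        log_post = np.array(
            [[-np.sum((obs - forward(np.array([x, y]), true_q, sensors)) ** 2)
              / (2 * noise ** 2) for x in xs] for y in ys])
        post = np.exp(log_post - log_post.max())
        post /= post.sum()
        iy, ix = np.unravel_index(post.argmax(), post.shape)
        print("MAP source estimate:", xs[ix], ys[iy])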

    Implicit Deep Adaptive Design: Policy-Based Experimental Design without Likelihoods

    We introduce implicit Deep Adaptive Design (iDAD), a new method for performing adaptive experiments in real-time with implicit models. iDAD amortizes the cost of Bayesian optimal experimental design (BOED) by learning a design policy network upfront, which can then be deployed quickly at the time of the experiment. The iDAD network can be trained on any model which simulates differentiable samples, unlike previous design policy work that requires a closed form likelihood and conditionally independent experiments. At deployment, iDAD allows design decisions to be made in milliseconds, in contrast to traditional BOED approaches that require heavy computation during the experiment itself. We illustrate the applicability of iDAD on a number of experiments, and show that it provides a fast and effective mechanism for performing adaptive design with implicit models.
    Comment: 33 pages, 8 figures. Published as a conference paper at NeurIPS 2021.
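    The core idea, a policy network that maps the experiment history directly to the next design so that deployment is a single forward pass, can be sketched as below. The architecture and the permutation-invariant pooling are one simple illustrative choice, and the sketch omits the likelihood-free training objective that is iDAD's actual contribution.

        # Schematic design policy network (illustrative architecture only).
        import torch
        import torch.nn as nn

        class DesignPolicy(nn.Module):
            def __init__(self, design_dim=1, outcome_dim=1, hidden=64):
                super().__init__()
                self.hidden = hidden
                self.encode = nn.Sequential(
                    nn.Linear(design_dim + outcome_dim, hidden), nn.ReLU(),
                    nn.Linear(hidden, hidden))
                self.head = nn.Sequential(
                    nn.Linear(hidden, hidden), nn.ReLU(),
                    nn.Linear(hidden, design_dim))

            def forward(self, designs, outcomes):
                # Pool the encoded history; an empty history maps to zeros.
                if designs.shape[0] == 0:
                    pooled = torch.zeros(self.hidden)
                else:
                    pooled = self.encode(torch.cat([designs, outcomes], -1)).sum(0)
                return self.head(pooled)

        policy = DesignPolicy()
        past_designs = torch.tensor([[0.1], [0.4]])
        past_outcomes = torch.tensor([[1.3], [0.7]])
        next_design = policy(past_designs, past_outcomes)  # one forward pass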

    Deep Gaussian Processes: Advances in Models and Inference

    Hierarchical models are certainly in fashion these days. It seems difficult to navigate the field of machine learning without encountering `deep' models of one sort or another. The popularity of the deep learning revolution has been driven by some striking empirical successes, prompting both intense rapture and intense criticism. The criticisms often centre on the lack of model uncertainty, which leads to sometimes drastically overconfident predictions. Others point to the lack of a mechanism for incorporating prior knowledge and the reliance on large datasets. A widely held hope is that a Bayesian approach might overcome these problems.

    The deep Gaussian process presents a paradigm for building deep models from a Bayesian perspective. A Gaussian process is a prior over functions. A deep Gaussian process uses several Gaussian process functions and combines them hierarchically through composition (that is, the output of one is the input to the next). The deep Gaussian process promises to capture the compositional nature of deep learning while mitigating some of its disadvantages through a Bayesian approach.

    The thesis develops deep Gaussian process modelling in a number of ways. The model is first interpreted differently from previous work: not as a `hierarchical prior' but as a factorized prior with a hierarchical likelihood. Mean functions are suggested to avoid issues of degeneracy and to aid initialization. The main contribution is a new method of inference that avoids the burden of representing the function values directly, through an application of sparse variational inference. This method scales to arbitrarily large data and is shown to work well in practice through experiments.

    The use of variational inference recasts (approximate) inference as optimization of Gaussian distributions. This optimization has an exploitable geometry via the natural gradient. The natural gradient is shown to be advantageous for single-layer non-conjugate models, and for the (final layer of a) deep Gaussian process model.

    Deep Gaussian processes can be a model both for complex associations between variables and for complex marginal distributions of single variables. Incorporating noise in the hierarchy leads to complex marginal distributions through the non-linearities of the mappings at each layer. The inference required for noisy variables cannot be handled with sparse methods, as sparse methods rely on correlations between variables, which are absent for noisy variables. Instead, a more direct approach is developed, using an importance weighted variational scheme.
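    The compositional construction at the heart of the model is easy to sketch: draw a function from one GP prior, then feed its outputs into a second GP as inputs. The kernel, jitter, and input grid below are arbitrary choices for illustration; the thesis's contributions concern inference for such models, not prior sampling.

        # Draw from a two-layer deep GP prior by composing GP samples.
        import numpy as np

        def rbf(a, b, ls=0.5):
            return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

        rng = np.random.default_rng(3)
        x = np.linspace(-2, 2, 100)

        # Layer 1: f1 ~ GP(0, k), evaluated on the inputs.
        K1 = rbf(x, x) + 1e-8 * np.eye(100)
        f1 = rng.multivariate_normal(np.zeros(100), K1)

        # Layer 2: f2 ~ GP(0, k), evaluated on layer 1's outputs (composition).
        K2 = rbf(f1, f1) + 1e-8 * np.eye(100)
        f2 = rng.multivariate_normal(np.zeros(100), K2)
        # f2, viewed as a function of x, is a two-layer deep GP prior draw.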

    Multilevel Delayed Acceptance MCMC with Applications to Hydrogeological Inverse Problems

    Quantifying the uncertainty of model predictions is a critical task for engineering decision support systems. This is particularly challenging in the context of statistical inverse problems, where the model parameters are unknown or poorly constrained and the data are often scarce. Many such problems emerge in the fields of hydrology and hydro-environmental engineering in general, and in hydrogeology in particular. While methods for rigorously quantifying the uncertainty of such problems exist, they are often prohibitively computationally expensive, particularly when the forward model is high-dimensional and expensive to evaluate. In this thesis, I present a Metropolis-Hastings algorithm, namely the Multilevel Delayed Acceptance (MLDA) algorithm, which exploits a hierarchy of forward models of increasing computational cost to significantly reduce the total cost of quantifying the uncertainty of high-dimensional, expensive forward models. The algorithm is shown to be in detailed balance with the posterior distribution of parameters, and its computational gains are demonstrated on multiple examples. Additionally, I present an approach for exploiting a deep neural network as an ultra-fast model approximation in an MLDA model hierarchy. This method is demonstrated in the context of both 2D and 3D groundwater flow modelling. Finally, I present a novel approach to adaptive optimal design of groundwater surveying, in which MLDA is employed to construct the posterior Monte Carlo estimates. This method utilises the posterior uncertainty of the primary problem in conjunction with the expected solution to an adjoint problem to sequentially determine the optimal location of the next datapoint.
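    The delayed-acceptance mechanism that MLDA generalises can be sketched with two levels: a cheap model screens each proposal, and only survivors are checked against the expensive model, with a second-stage correction that keeps the chain targeting the fine posterior. The two toy log-posteriors below are stand-ins for the thesis's coarse and fine groundwater-flow models.

        # Two-level delayed acceptance Metropolis-Hastings (toy models).
        import numpy as np

        rng = np.random.default_rng(4)

        def log_post_fine(theta):      # expensive model (toy)
            return -0.5 * (theta - 1.0) ** 2 / 0.5 ** 2

        def log_post_coarse(theta):    # cheap, slightly biased approximation
            return -0.5 * (theta - 0.9) ** 2 / 0.6 ** 2

        theta, chain, step = 0.0, [], 0.5
        for _ in range(5000):
            prop = theta + step * rng.standard_normal()
            # Stage 1: screen the proposal with the cheap model.
            a1 = min(1.0, np.exp(log_post_coarse(prop) - log_post_coarse(theta)))
            if rng.uniform() < a1:
                # Stage 2: correct with the fine model, preserving detailed
                # balance with respect to the fine posterior.
                a2 = min(1.0, np.exp(
                    (log_post_fine(prop) - log_post_fine(theta))
                    - (log_post_coarse(prop) - log_post_coarse(theta))))
                if rng.uniform() < a2:
                    theta = prop
            chain.append(theta)
        print("posterior mean ~", np.mean(chain[1000:]))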

    Radiation Sensing: Design and Deployment of Sensors and Detectors

    Radiation detection is important in many fields, and it poses significant challenges for instrument designers. Radiation detection instruments, particularly for nuclear decommissioning and security applications, are required to operate in unknown environments and should detect and characterise radiation fields in real time. This book covers both theory and practice, collecting recent advances in radiation detection, with a particular focus on radiation detection instrument design, real-time data processing, radiation simulation and experimental work, robot design, control systems, task planning, and radiation shielding.