145 research outputs found

    Adaptive Sampling For Efficient Online Modelling

    This thesis examines methods enabling autonomous systems to make active sampling and planning decisions in real time. Gaussian Process (GP) regression is chosen as a framework for its non-parametric nature, which allows flexibility in unknown environments. The first part of the thesis focuses on depth-constrained, full-coverage bathymetric surveys in unknown environments. Algorithms are developed to find and follow a depth contour, modelled with a GP, and produce a depth-constrained boundary. An extension to the Boustrophedon Cellular Decomposition, Discrete Monotone Polygonal Partitioning, is developed, allowing efficient planning for coverage within this boundary. Efficient computational methods, such as incremental Cholesky updates, are implemented to allow online hyperparameter optimisation and fitting of the GPs. This is demonstrated in simulation and in the field on a platform built for the purpose. The second part of this thesis focuses on modelling the surface salinity profiles of estuarine tidal fronts. The standard GP model assumes evenly distributed noise, an assumption that does not always hold; this can be handled with heteroscedastic noise models. An efficient new method, Parametric Heteroscedastic Gaussian Process regression, is proposed. It is applied to active sample selection on stationary fronts and to adaptive planning on moving fronts, where a number of information-theoretic methods are compared. The use of a mean function is shown to increase the accuracy of predictions whilst reducing optimisation time. These algorithms are validated in simulation. Algorithmic development focuses on efficient methods allowing deployment on platforms with constrained computational resources. Whilst the application of this thesis is Autonomous Surface Vessels, it is hoped that the issues discussed and solutions provided have relevance to other applications in robotics and to wider fields such as spatial statistics and machine learning in general.
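The incremental Cholesky update mentioned in this abstract is a standard trick for online GP fitting: when one observation is appended, the existing factor can be extended in O(n^2) instead of refactorising the whole covariance in O(n^3). A minimal sketch of the idea (kernel choice and function names are ours, not necessarily the thesis's):

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential covariance between 1-D input arrays
    (an illustrative kernel choice)."""
    d = np.subtract.outer(a, b)
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def chol_append(L, k_new, k_self, noise=1e-2):
    """Grow the Cholesky factor L of a GP covariance by one observation.

    If K = L L^T, the enlarged matrix [[K, k], [k^T, k_self + noise]]
    has factor [[L, 0], [c^T, d]], where L c = k and
    d^2 = k_self + noise - c.c. Cost is O(n^2) per new point.
    """
    c = np.linalg.solve(L, k_new)        # triangular solve for the new row
    d = np.sqrt(k_self + noise - c @ c)  # new diagonal entry
    n = L.shape[0]
    L_new = np.zeros((n + 1, n + 1))
    L_new[:n, :n] = L
    L_new[n, :n] = c
    L_new[n, n] = d
    return L_new
```

The extended factor matches a from-scratch factorisation of the enlarged (noise-regularised) covariance, so posterior predictions can be refreshed after every new sample on a computationally constrained platform.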

    Bayesian modeling with spatial curvature processes

    Spatial process models are widely used for modeling point-referenced variables arising from diverse scientific domains. Analyzing the resulting random surface provides deeper insights into the nature of latent dependence within the studied response. We develop Bayesian modeling and inference for rapid changes on the response surface to assess directional curvature along a given trajectory. Such trajectories or curves of rapid change, often referred to as "wombling" boundaries, occur in geographic space in the form of rivers in a flood plain, roads, mountains, plateaus or other topographic features leading to high gradients on the response surface. We demonstrate fully model-based Bayesian inference on directional curvature processes to analyze differential behavior in responses along wombling boundaries. We illustrate our methodology with a number of simulated experiments, followed by multiple applications featuring the Boston Housing data, the Meuse river data, and temperature data from the Northeastern United States.
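The pointwise quantities this abstract targets are directional derivatives and curvatures of a response surface along a trajectory. The paper does fully model-based Bayesian inference on these through the spatial process; as a purely numerical stand-in, the same quantities can be sketched with central differences (all names here are illustrative):

```python
import numpy as np

def directional_derivative(f, p, u, h=1e-5):
    """Central-difference estimate of the gradient of surface f
    at point p along direction u (normalised internally)."""
    u = np.asarray(u, dtype=float)
    u = u / np.linalg.norm(u)
    p = np.asarray(p, dtype=float)
    return (f(p + h * u) - f(p - h * u)) / (2 * h)

def directional_curvature(f, p, u, h=1e-4):
    """Second directional derivative u^T H(p) u by central differences;
    its behaviour along a curve is what a directional curvature
    process describes along a wombling boundary."""
    u = np.asarray(u, dtype=float)
    u = u / np.linalg.norm(u)
    p = np.asarray(p, dtype=float)
    return (f(p + h * u) - 2.0 * f(p) + f(p - h * u)) / h ** 2
```

In the Bayesian setting these finite differences are replaced by the exact derivative processes induced by a sufficiently smooth covariance function, which is what makes uncertainty quantification along the boundary possible.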

    Potential fields data modeling: new frontiers in forward and inverse problems

    Since the 1950s, potential fields data modeling has played an important role in analyzing the density and magnetization distribution in Earth's subsurface for a wide variety of applications. Examples are the characterization of ore deposits and the assessment of geothermal and petroleum potential, which turned out to be key contributors to economic and industrial development after World War II. Current modeling methods mainly rely on two popular parameterization approaches, involving either a discretization of target geological bodies by means of 2D to 2.75D horizontal prisms with polygonal vertical cross-section (polygon-based approach) or prismatic cells (prism-based approach). Despite the great endeavour made by scientists in recent decades, inversion methods based on these parameterization approaches still suffer from a limited ability to (i) realistically characterize the variability of density and magnetization expected in a study area and (ii) take into account the strong non-uniqueness affecting potential fields theory. The prism-based approach is used in linear deterministic inverse methods, which provide just one single solution, preventing uncertainty estimation and statistical analysis on the parameters we would like to characterize (i.e., density or magnetization). On the contrary, the polygon-based approach is almost exclusively exploited in a trial-and-error modeling strategy, leaving the potential to develop innovative inverse methods untapped. The reason is two-fold: (i) its strongly non-linear forward problem requires an efficient probabilistic inverse modeling methodology to solve the related inverse problem, and (ii) unpredictable cross-intersections between polygonal bodies during inversion represent a challenging task to be tackled in order to achieve geologically plausible model solutions. 
The goal of this thesis is then to contribute to solving the critical issues outlined above, by developing probabilistic inversion methodologies based on the polygon- and prism-based parameterization approaches, aiming to improve our capability to unravel the structure of the subsurface. Regarding the polygon-based parameterization strategy, a deep review of its mathematical framework was first performed, allowing us (i) to restore the validity of a recently criticized mathematical formulation for the 2D magnetic case, and (ii) to find a sign error in the derivation for the 2.75D magnetic case causing potentially wrong numerical results. This preliminary phase allowed us to develop a methodology to independently or jointly invert gravity and magnetic data exploiting the Hamiltonian Monte Carlo approach, thanks to which collections of models allow researchers to appraise different geological scenarios and fully characterize uncertainties on the model parameters. Geological plausibility of results is ensured by automatic checks on the geometries of modelled bodies, which avoid unrealistic cross-intersections among them. Regarding the prism-based parameterization approach, the linear inversion method based on the probabilistic approach considers a discretization of the target geological scenario by prismatic bodies, arranged horizontally to cover it and finitely extended in the vertical direction, making it particularly suitable to model density and magnetization variability inside strata. Its strengths have been proven, for the magnetic case, in the characterization of the magnetization variability expected for the shallower volcanic unit of the Mt. Melbourne Volcanic Field (Northern Victoria Land, Antarctica), significantly helping us to unravel its poorly known inner geophysical architecture.
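The prism-based inversion described above is linear: data are related to cell densities by a sensitivity matrix, and a probabilistic (linear Gaussian) treatment yields a posterior rather than one single deterministic solution. A toy sketch, deliberately simplified by approximating each prism as a point mass at its centre (all names and numbers are illustrative, not the thesis's formulation):

```python
import numpy as np

G = 6.674e-11  # gravitational constant, SI units

def sensitivity(stations, cells, cell_volume=1.0):
    """Vertical-gravity sensitivity matrix for point-mass cells.

    Each prism is collapsed to a point mass at its centre (xc, zc) --
    a deliberate simplification of the true prism forward problem.
    """
    A = np.zeros((len(stations), len(cells)))
    for i, xs in enumerate(stations):
        for j, (xc, zc) in enumerate(cells):
            r2 = (xs - xc) ** 2 + zc ** 2
            A[i, j] = G * cell_volume * zc / r2 ** 1.5
    return A

def linear_gaussian_inverse(A, d, sigma_d, sigma_m):
    """Posterior mean density under a linear Gaussian model:
    m = (A^T A / sigma_d^2 + I / sigma_m^2)^{-1} A^T d / sigma_d^2.
    The inverted matrix is also the posterior covariance, which is
    what enables uncertainty estimation on the parameters."""
    n = A.shape[1]
    H = A.T @ A / sigma_d ** 2 + np.eye(n) / sigma_m ** 2
    return np.linalg.solve(H, A.T @ d / sigma_d ** 2)
```

The polygon-based case has no such linear form, which is why the thesis resorts to Hamiltonian Monte Carlo sampling over polygon geometries instead.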

    High-speed surface profilometry based on an adaptive microscope with axial chromatic encoding

    An adaptive microscope with axial chromatic encoding, named the AdaScope, is designed and developed. With the ability to confocally address any location within the measurement volume, the AdaScope provides the hardware foundation for a cascade measurement strategy to be developed, dramatically accelerating the speed of 3D confocal microscopy.

    Autonomous Exploration of Large-Scale Natural Environments

    This thesis addresses issues which arise when using robotic platforms to explore large-scale, natural environments. Two main problems are identified: the volume of data collected by autonomous platforms and the complexity of planning surveys in large environments. Autonomous platforms are able to rapidly accumulate large data sets. The volume of data that must be processed is often too large for human experts to analyse exhaustively in a practical amount of time or in a cost-effective manner. This burden can create a bottleneck in the process of converting observations into scientifically relevant data. Although autonomous platforms can collect precisely navigated, high-resolution data, they are typically limited by finite battery capacity, data storage and computational resources. Deployments are also limited by project budgets and time frames. These constraints make it impractical to sample large environments exhaustively; to use the limited resources effectively, trajectories which maximise the amount of information gathered from the environment must be designed. This thesis addresses these problems through three primary contributions: a new classifier designed to accept probabilistic training targets rather than discrete training targets; a semi-autonomous pipeline for creating models of the environment; and an offline method for autonomously planning surveys. These contributions allow large data sets to be processed with minimal human intervention and promote efficient allocation of resources. In this thesis, environmental models are established by learning the correlation between data extracted from a digital elevation model (DEM) of the seafloor and habitat categories derived from in-situ images. The DEM of the seafloor is collected using ship-borne multibeam sonar and the in-situ images are collected using an autonomous underwater vehicle (AUV). 
While the thesis specifically focuses on mapping and exploring marine habitats with an AUV, the research applies equally to other applications such as aerial and terrestrial environmental monitoring and planetary exploration.
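The first contribution, a classifier that accepts probabilistic rather than discrete training targets, amounts to training against soft labels: an expert can mark an image as, say, 70% one habitat and 30% another instead of committing to a single class. A minimal sketch of the corresponding objective (framing and names are ours, not the thesis's):

```python
import numpy as np

def soft_cross_entropy(p_pred, p_target, eps=1e-12):
    """Cross-entropy between predicted class probabilities and
    probabilistic (soft) training targets, averaged over examples.

    With one-hot targets this reduces to ordinary log loss; soft
    targets propagate the labeller's uncertainty into training.
    """
    p_pred = np.clip(p_pred, eps, 1.0)
    return float(-np.sum(p_target * np.log(p_pred), axis=-1).mean())
```

The loss is minimised when the predicted distribution equals the target distribution, at which point it equals the entropy of the targets, so confidently labelled examples pull harder on the classifier than ambiguous ones.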