1,174 research outputs found

    On high-dimensional support recovery and signal detection

    Deep probabilistic methods for improved radar sensor modelling and pose estimation

    Radar’s ability to sense under adverse conditions and at long range makes it a valuable alternative to vision and lidar for mobile robotic applications. However, its complex, scene-dependent sensing process and significant noise artefacts make working with radar challenging. Moving past the classical rule-based approaches that have dominated the literature to date, this thesis investigates deep, data-driven solutions across a range of tasks in robotics. Firstly, a deep approach is developed for mapping raw sensor measurements to a grid-map of occupancy probabilities, outperforming classical filtering approaches by a significant margin. A distribution over the occupancy state is captured, additionally allowing uncertainty in the predictions to be identified and managed. The approach is trained entirely using partial labels generated automatically from lidar, without requiring manual labelling. Next, a deep model is proposed for generating stochastic radar measurements from simulated elevation maps. The model is trained by learning the forward and backward processes side-by-side, using adversarial and cyclical-consistency constraints together with a partial alignment loss based on labels generated from lidar. By faithfully replicating the radar sensing process, new models can be trained for downstream tasks using labels that are readily available in simulation. In this case, segmentation models trained on simulated radar measurements, when deployed in the real world, are shown to approach the performance of a model trained entirely on real-world measurements. Finally, the potential of deep approaches applied to the radar odometry task is explored. A learnt feature space is combined with a classical correlative scan-matching procedure and optimised for pose prediction, allowing the proposed method to outperform the previous state of the art by a significant margin. Through a probabilistic formulation, the uncertainty in the pose is also successfully characterised. Building upon this success, properties of the Fourier transform are then utilised to separate the search for translation and angle. It is shown that this decoupled search results in a significant boost to run-time performance, allowing the approach to run in real time on CPUs and embedded devices, whilst remaining competitive with other radar odometry methods proposed in the literature.
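
    The Fourier-based decoupling mentioned above can be illustrated with a standard phase-correlation step. The sketch below is a minimal, hypothetical example (not the thesis implementation) showing how a translation between two radar scans, represented as 2-D grids, can be recovered in the frequency domain independently of any angular search; the array sizes and the phase_correlation helper are assumptions for illustration.

```python
# Minimal sketch of frequency-domain translation estimation between two
# 2-D scan grids, as used in correlative scan matching. Illustrative only.
import numpy as np

def phase_correlation(ref, moved):
    """Estimate the circular shift d (rows, cols) such that moved == roll(ref, d)."""
    F_ref = np.fft.fft2(ref)
    F_mov = np.fft.fft2(moved)
    R = F_mov * np.conj(F_ref)
    R /= np.abs(R) + 1e-12                      # keep phase information only
    corr = np.fft.ifft2(R).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices so shifts past the midpoint come back as negative offsets.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scan_a = rng.random((128, 128))
    scan_b = np.roll(scan_a, shift=(7, -12), axis=(0, 1))   # known translation
    print(phase_correlation(scan_a, scan_b))                # expected (7, -12)
```

    In a decoupled scheme of this kind, rotation would be estimated separately (for example from the rotation-dependent structure of the Fourier magnitude spectrum) before this translation step.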

    A stochastic method for representation, modelling and fusion of excavated material in mining

    The ability to safely and economically extract raw materials such as iron ore from a greater number of remote, isolated and possibly dangerous locations will become more pressing over the coming decades as easily accessible deposits become depleted. An autonomous mining system has the potential to make the mining process more efficient, predictable and safe under these changing conditions. One of the key parts of the mining process is the estimation and tracking of bulk material through the mining production chain. Current state-of-the-art tracking and estimation systems use a deterministic representation of bulk material. This is problematic for wide-scale automation of mine processes, as no measure of the uncertainty in the estimates is provided. A probabilistic representation is critical for autonomous systems to correctly interpret and fuse the available data and make the most informed decision possible without human intervention. This thesis investigates whether bulk material properties can be represented probabilistically through a mining production chain to provide statistically consistent estimates of the material at each stage of the chain. Experiments and methods within this thesis focus on the load-haul-dump cycle. A representation of bulk material using lumped masses is developed, along with a method for tracking and estimating these lumped masses within the mining production chain using an 'Augmented State Kalman Filter' (ASKF). The method ensures that the fusion of new information at different stages provides statistically consistent estimates of the lumped masses. Particular attention is given to the feasibility and practicality of implementing a solution on a production mine site with currently available sensing technology, and to how that technology can be adapted for use within the developed estimation system (with emphasis on remote sensing and volume estimation).
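
    As a rough illustration of the augmented-state idea (a sketch under assumed numbers, not the thesis implementation), the snippet below appends each lumped mass to a joint state vector so that a later measurement of a combination of masses, such as a truck payload weighed downstream, updates every correlated estimate consistently. The AugmentedStateKF class, tonnage values and variances are hypothetical.

```python
# Minimal sketch of a state-augmentation Kalman update for lumped masses.
import numpy as np

class AugmentedStateKF:
    def __init__(self):
        self.x = np.zeros((0, 1))      # stacked lumped-mass estimates [t]
        self.P = np.zeros((0, 0))      # joint covariance

    def add_lump(self, mass, var):
        """Append a new lumped mass with its prior estimate and variance."""
        self.x = np.vstack([self.x, [[mass]]])
        n = self.P.shape[0]
        P = np.zeros((n + 1, n + 1))
        P[:n, :n] = self.P
        P[n, n] = var
        self.P = P

    def update(self, H, z, R):
        """Standard linear KF update for z = H @ x + noise, noise ~ N(0, R)."""
        H, z, R = np.atleast_2d(H), np.atleast_2d(z), np.atleast_2d(R)
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P

# Two bucket loads estimated by a shovel payload monitor, then the truck
# carrying both is weighed: one weighbridge reading refines both lumps.
askf = AugmentedStateKF()
askf.add_lump(42.0, 4.0)    # tonnes, variance assumed for illustration
askf.add_lump(39.0, 4.0)
askf.update(H=[[1.0, 1.0]], z=[[84.5]], R=[[1.0]])   # weighbridge: sum of lumps
print(askf.x.ravel(), np.diag(askf.P))
```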

    Self consistent bathymetric mapping from robotic vehicles in the deep ocean

    Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy at the Massachusetts Institute of Technology and the Woods Hole Oceanographic Institution, June 2005. Obtaining accurate and repeatable navigation for robotic vehicles in the deep ocean is difficult and consequently a limiting factor when constructing vehicle-based bathymetric maps. This thesis presents a methodology to produce self-consistent maps and simultaneously improve vehicle position estimation by exploiting accurate local navigation and utilizing terrain-relative measurements. It is common for errors in the vehicle position estimate to far exceed the errors associated with the acoustic range sensor. This disparity creates inconsistency when an area is imaged multiple times and causes artifacts that distort map integrity. Our technique utilizes small terrain "submaps" that can be pairwise registered and used to additionally constrain the vehicle position estimates in accordance with the actual bottom topography. A delayed state Kalman filter is used to incorporate these submap registrations as relative position measurements between previously visited vehicle locations. The archiving of previous positions in the filter state vector allows for continual adjustment of the submap locations. The terrain registration is accomplished using a two-dimensional correlation and a six-degree-of-freedom point cloud alignment method tailored for bathymetric data. The complete bathymetric map is then created from the union of all submaps that have been aligned in a consistent manner. Experimental results from the fully automated processing of a multibeam survey over the TAG hydrothermal structure at the Mid-Atlantic Ridge are presented to validate the proposed method. This work was funded by the CenSSIS ERC of the National Science Foundation under grant EEC-9986821 and in part by the Woods Hole Oceanographic Institution through a grant from the Penzance Foundation.
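
    The delayed-state update can be sketched as follows (illustrative code, not the thesis software): archived vehicle positions are stacked in one state vector, and a pairwise submap registration enters as a relative measurement between two of them, pulling both positions, and everything correlated with them through the joint covariance, into agreement. The 2-D positions, covariances and registration offset below are made-up values.

```python
# Minimal sketch of a relative-position update in a delayed-state filter.
import numpy as np

def relative_update(x, P, i, j, z_rel, R):
    """Update stacked 2-D positions with a measurement z_rel ~ x[j] - x[i]."""
    n = len(x) // 2
    H = np.zeros((2, 2 * n))
    H[:, 2 * i:2 * i + 2] = -np.eye(2)   # earlier archived position
    H[:, 2 * j:2 * j + 2] = np.eye(2)    # later archived position
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (z_rel - H @ x)
    P_new = (np.eye(2 * n) - K @ H) @ P
    return x_new, P_new

# Three archived positions with growing dead-reckoning uncertainty; a terrain
# registration between submaps 0 and 2 constrains their relative offset.
x = np.array([0.0, 0.0, 10.0, 0.5, 20.0, 1.5])
P = np.diag([0.1, 0.1, 1.0, 1.0, 4.0, 4.0])
x, P = relative_update(x, P, i=0, j=2, z_rel=np.array([19.6, 0.2]),
                       R=0.05 * np.eye(2))
print(x.reshape(-1, 2))
```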

    Test analysis & fault simulation of microfluidic systems

    This work presents a design, simulation and test methodology for microfluidic systems, with particular focus on simulation for test. A Microfluidic Fault Simulator (MFS) has been created, based around COMSOL, which allows a fault-free system model to undergo fault injection and provide test measurements. A post-MFS test analysis procedure is also described. A range of fault-free system simulations have been cross-validated against experimental work to gauge the accuracy of the fundamental simulation approach prior to further investigation and development of the simulation and test procedure. A generic mechanism, termed a fault block, has been developed to provide fault injection and a method of describing a low-abstraction behavioural fault model within the system. This technique has allowed the creation of a fault library containing a range of different microfluidic fault conditions. Each of the fault models has been cross-validated against experimental conditions or published results to determine its accuracy. Two test methods, namely impedance spectroscopy and Levich electro-chemical sensors, have been investigated as general methods of microfluidic test, each of which has been shown to be sensitive to a multitude of faults. Each method has successfully been implemented within the simulation environment and cross-validated by first-hand experimentation or published work. A test analysis procedure based around the Neyman-Pearson criterion has been developed to provide a probabilistic metric for each test applied to a given fault condition, giving a quantitative assessment of each test. These metrics are used to analyse the sensitivity of each test method, which is useful when determining which tests to employ in the final system. Furthermore, these probabilistic metrics may be combined to provide a fault coverage metric for the complete system. The complete MFS method has been applied to two system case studies: a hydrodynamic “Y” channel and a flow cytometry system for prognosing head and neck cancer. Decision trees are trained on the test measurement data and fault conditions as a means of classifying the system's fault condition state. The classification rules created by the decision trees may be displayed graphically or as a set of rules that can be loaded into test instrumentation. During the course of this research, a high-voltage power supply instrument has been developed to aid electro-osmotic experimentation, along with an impedance spectrometer to provide embedded test capability.
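
    A Neyman-Pearson style test metric of the kind described can be sketched as follows, under the simplifying assumption that a test measurement is approximately Gaussian in both the fault-free and faulty conditions and that the fault shifts the measurement upwards; the function name, distributions and numbers are illustrative and not taken from the MFS.

```python
# Minimal sketch: fix an acceptable false-alarm probability on the fault-free
# distribution, then report the probability of detecting a given fault.
import numpy as np
from scipy.stats import norm

def detection_probability(fault_free, faulty, p_false_alarm=0.01):
    """Both inputs are 1-D arrays of simulated test measurements."""
    mu0, sd0 = np.mean(fault_free), np.std(fault_free, ddof=1)
    mu1, sd1 = np.mean(faulty), np.std(faulty, ddof=1)
    # One-sided threshold set on the fault-free (null) distribution.
    threshold = norm.ppf(1.0 - p_false_alarm, loc=mu0, scale=sd0)
    # P(measurement exceeds threshold | fault present)
    return norm.sf(threshold, loc=mu1, scale=sd1)

rng = np.random.default_rng(1)
impedance_ok = rng.normal(100.0, 2.0, 500)       # hypothetical fault-free readings
impedance_blocked = rng.normal(112.0, 3.0, 500)  # hypothetical channel-blockage fault
print(detection_probability(impedance_ok, impedance_blocked))
```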

    Context Aided Tracking with Adaptive Hyperspectral Imagery

    A methodology for the context-aided tracking of ground vehicles in remote airborne imagery is developed in which a background model is inferred from hyperspectral imagery. The materials comprising the background of a scene are remotely identified and used to form this model. Two model formation processes are developed: a manual method, and a method that exploits an emerging adaptive multiple-object-spectrometer instrument. A semi-automated background modeling approach is shown to arrive at a reasonable background model with minimal operator intervention. A novel, adaptive and autonomous approach uses a new type of adaptive hyperspectral sensor and converges to a 66% correct background model in 5% of the time of the baseline (a 95% reduction in sensor acquisition time). A multiple-hypothesis tracker is incorporated, which utilizes background statistics to form track costs and associated track maintenance thresholds. The context-aided system is demonstrated in a high-fidelity tracking testbed and reduces track identity error by 30%.
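
    One way background statistics can feed a track cost, sketched here with made-up spectra rather than the testbed's data, is a Mahalanobis distance between a detection's spectrum and the statistics of the locally identified background material: detections whose spectra are well explained by the background are unlikely targets, and the distance can be folded into the track costs and maintenance thresholds mentioned above.

```python
# Minimal sketch of scoring a detection's spectrum against background statistics.
import numpy as np

def background_mahalanobis(pixel_spectrum, bg_mean, bg_cov):
    """Mahalanobis distance of a detection's spectrum from the local background
    material statistics; large values flag spectra unlikely to be background."""
    diff = pixel_spectrum - bg_mean
    return float(np.sqrt(diff @ np.linalg.solve(bg_cov, diff)))

bands = 30
rng = np.random.default_rng(2)
bg_mean = rng.random(bands)                         # hypothetical background spectrum
bg_cov = 0.01 * np.eye(bands)                       # hypothetical background covariance
vehicle_like = bg_mean + 0.5                        # spectrally distinct from background
asphalt_like = bg_mean + rng.normal(0, 0.05, bands) # close to background material
print(background_mahalanobis(vehicle_like, bg_mean, bg_cov))   # large distance
print(background_mahalanobis(asphalt_like, bg_mean, bg_cov))   # small distance
```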

    US Cosmic Visions: New Ideas in Dark Matter 2017: Community Report

    This white paper summarizes the workshop "U.S. Cosmic Visions: New Ideas in Dark Matter" held at the University of Maryland on March 23-25, 2017.