
    Seafloor Segmentation Based on Bathymetric Measurements from Multibeam Echosounders Data

    Bathymetric data depicts the geomorphology of the seabottom and allows characterization of spatial distributions of apparent benthic habitats. The variability of seafloor topography can be defined as a texture. This invites the application of well-developed image processing techniques for automatic delineation of regions with crucially different physiographic characteristics. In the present paper, histograms of biologically motivated invariant image attributes are used for characterization of local geomorphological features. This technique can be naturally applied across a range of spatial scales. Local feature vectors are then submitted to a procedure which divides the set into a number of clusters, each representing a distinct type of seafloor. Prior knowledge about benthic habitat locations allows the use of supervised classification, by training a Support Vector Machine on a chosen data set and then applying the developed model to the full set. The classification method is shown to perform well on the multibeam echosounder (MBES) data from the Piscataqua River, New Hampshire, USA.
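The supervised step described above (train an SVM on labelled feature histograms, then apply it to the full set) can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the feature histograms, class labels, and separation between "smooth" and "rough" patches are synthetic placeholders.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Each row stands in for a histogram of invariant image attributes
# computed over one seafloor patch (values are invented for illustration).
smooth = rng.normal(loc=0.2, scale=0.05, size=(40, 8))  # e.g. flat sediment
rough = rng.normal(loc=0.8, scale=0.05, size=(40, 8))   # e.g. rocky outcrop
X = np.vstack([smooth, rough])
y = np.array([0] * 40 + [1] * 40)  # ground-truth seafloor type per patch

# Train on a labelled subset (the "prior knowledge" about habitat locations)...
clf = SVC(kernel="rbf").fit(X[::2], y[::2])

# ...then apply the trained model to the full set of feature vectors.
pred = clf.predict(X)
accuracy = (pred == y).mean()
```

With well-separated feature distributions like these, the classifier recovers the patch labels almost perfectly; on real MBES data the histograms overlap and accuracy depends on the chosen spatial scale.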

    Active Sampling-based Binary Verification of Dynamical Systems

    Nonlinear, adaptive, or otherwise complex control techniques are increasingly relied upon to ensure the safety of systems operating in uncertain environments. However, the nonlinearity of the resulting closed-loop system complicates verification that the system does in fact satisfy those requirements at all possible operating conditions. While analytical proof-based techniques and finite abstractions can be used to provably verify the closed-loop system's response at different operating conditions, they often produce conservative approximations due to restrictive assumptions and are difficult to construct in many applications. In contrast, popular statistical verification techniques relax the restrictions and instead rely upon simulations to construct statistical or probabilistic guarantees. This work presents a data-driven statistical verification procedure that instead constructs statistical learning models from simulated training data to separate the set of possible perturbations into "safe" and "unsafe" subsets. Binary evaluations of closed-loop system requirement satisfaction at various realizations of the uncertainties are obtained through temporal logic robustness metrics, which are then used to construct predictive models of requirement satisfaction over the full set of possible uncertainties. As the accuracy of these predictive statistical models is inherently coupled to the quality of the training data, an active learning algorithm selects additional sample points in order to maximize the expected change in the data-driven model and thus, indirectly, minimize the prediction error. Various case studies demonstrate the closed-loop verification procedure and highlight improvements in prediction error over both existing analytical and statistical verification techniques.
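The active-sampling loop described above can be sketched in a simplified form. This is a hypothetical illustration, not the paper's algorithm: the one-dimensional "system" and its safety threshold are invented, the binary safety oracle stands in for an expensive closed-loop simulation plus temporal logic robustness evaluation, and proximity to the classifier's decision boundary is used as a simple proxy for the expected model change.

```python
import numpy as np
from sklearn.svm import SVC

def simulate_safe(theta):
    # Stand-in for a closed-loop simulation + robustness metric:
    # here the requirement holds iff the perturbation magnitude < 0.6.
    return abs(theta) < 0.6

# Discretized set of possible perturbations.
candidates = np.linspace(-1.0, 1.0, 201).reshape(-1, 1)

queried = [0, 100, 200]  # small initial design: endpoints and center
labels = [simulate_safe(candidates[i, 0]) for i in queried]

for _ in range(20):
    # Fit the surrogate separating "safe" from "unsafe" perturbations.
    clf = SVC(kernel="rbf").fit(candidates[queried], labels)
    # Query the candidate nearest the decision boundary (proxy for
    # maximal expected change in the data-driven model).
    margin = np.abs(clf.decision_function(candidates))
    margin[queried] = np.inf  # never re-query an evaluated point
    i = int(np.argmin(margin))
    queried.append(i)
    labels.append(simulate_safe(candidates[i, 0]))

# Final predictive model over the full set of perturbations.
clf = SVC(kernel="rbf").fit(candidates[queried], labels)
pred = clf.predict(candidates)
truth = np.array([simulate_safe(t) for t in candidates[:, 0]])
error_rate = (pred != truth).mean()
```

Because the queries concentrate near the safe/unsafe boundary, the surrogate resolves the threshold with far fewer simulations than uniform sampling of all 201 candidates would require.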