
    Underwater target detection using multiple disparate sonar platforms

    The detection of underwater objects from sonar imagery presents a difficult problem due to factors such as variations in operating and environmental conditions, the presence of spatially varying clutter, and variations in target shape, composition, and orientation. Additionally, collecting data from multiple platforms raises more challenging questions, such as "how should we collaboratively perform detection to achieve optimal performance?", "how many platforms should be employed?", "when do we reach a point of diminishing returns when adding platforms?", or, more importantly, "when does adding an additional platform not help at all?". To perform multi-platform detection and answer these questions, we use the coherent information among all disparate sources and perform detection on the premise that the amount of coherent information will be greater when a target is present in a region of interest within an image than when the observation consists strictly of background clutter. To exploit the coherent information among the different sources, we recast the standard Neyman-Pearson, Gauss-Gauss detector into the Multi-Channel Coherence Analysis (MCA) framework. The MCA framework allows one to optimally decompose the multi-channel data into a new, appropriate coordinate system in order to analyze their linear dependence, or coherence, in a more meaningful fashion. To this end, new expressions for the log-likelihood ratio and the J-divergence are formulated in this multi-channel coordinate system. In the MCA framework, the data of each channel are first whitened individually, removing the second-order information from each channel. A set of linear mapping matrices is then obtained that maximizes the sum of the cross-correlations among the channels in the mapped domain. To perform detection in the coordinate system provided by MCA, we first construct a model suited to this multiple-sensor-platform problem and then represent observations in their MCA coordinates associated with the H1 hypothesis. Performing detection in the MCA framework yields a log-likelihood ratio written in terms of the MCA correlations and mapping vectors as well as a local signal-to-noise ratio matrix. In this coordinate system, the J-divergence, which is a measure of the difference in the means of the likelihood ratio under the two hypotheses, can effectively be represented in terms of the multi-channel correlations and mapping vectors. Using this J-divergence expression, one can obtain a clearer picture of the amount of discriminatory information available for detection by analyzing the amount of coherent information present among the channels. New analytical and experimental results are also presented to provide better insight into the effects of adding a new piece of data to the multi-channel Gauss-Gauss detector represented in the MCA framework. To answer questions like those posed above, one must carefully analyze the amount of discriminatory information brought to the detection process by observations from an additional channel. Rather than attempting to observe the increase (or lack thereof) in the full detection problem, it is advantageous to look at the change incrementally.
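
    Before turning to the incremental updating, the first MCA stage described above (per-channel whitening followed by mappings that maximize the summed cross-correlations) can be illustrated with a short Python sketch. It is a minimal MAXVAR-style eigendecomposition under illustrative assumptions (the array shapes, variable names, and small eigenvalue floor are ours), not the authors' implementation.

    import numpy as np

    def mca_mappings(channels):
        """channels: list of (n_samples, d_i) arrays, one per sonar channel."""
        n = channels[0].shape[0]
        # 1) Whiten each channel individually (remove its second-order structure).
        whitened, isqrts = [], []
        for X in channels:
            X = X - X.mean(axis=0)
            evals, evecs = np.linalg.eigh(X.T @ X / n)
            evals = np.maximum(evals, 1e-10)          # guard against singular channels
            isqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
            whitened.append(X @ isqrt)
            isqrts.append(isqrt)
        # 2) Composite coherence matrix: identity diagonal blocks,
        #    cross-coherences off the diagonal.
        dims = [W.shape[1] for W in whitened]
        offs = np.cumsum([0] + dims)
        C = np.eye(offs[-1])
        for i in range(len(whitened)):
            for j in range(i + 1, len(whitened)):
                Cij = whitened[i].T @ whitened[j] / n
                C[offs[i]:offs[i+1], offs[j]:offs[j+1]] = Cij
                C[offs[j]:offs[j+1], offs[i]:offs[i+1]] = Cij.T
        # 3) The leading eigenvector maximizes the sum of pairwise
        #    cross-correlations; split it into one mapping vector per channel.
        w = np.linalg.eigh(C)[1][:, -1]
        return [isqrts[k] @ w[offs[k]:offs[k+1]] for k in range(len(channels))]
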
    To accomplish this goal, new updating equations for the likelihood ratio are derived that involve linearly estimating the new data from the old (already available) data and updating the likelihood ratio accordingly. In this case, the change in J-divergence can be written in terms of the error covariance matrices under each hypothesis. We then derive a change of coordinate system that can be used to perform dimensionality reduction, which becomes especially useful when the data to be added live in a high-dimensional space. To demonstrate the usefulness of log-likelihood updating, we conduct two simulation studies. The first detects the presence of dynamical structure in observed data and corresponds to a temporal updating scheme. The second is concerned with detecting the presence of a single narrow-band source using multiple linear sensor arrays, in which case we consider a platform (or channel) updating scheme. A comprehensive study of the MCA-based detector is carried out on three data sets acquired from the Naval Surface Warfare Center (NSWC) in Panama City, FL. The first data set consists of one high-frequency (HF) and three broadband (BB) side-looking sonar images coregistered over the same region of the sea floor, captured from an Autonomous Underwater Vehicle (AUV) platform. For this data set we consider three different detection schemes using different combinations of these sonar channels. The next data set consists of one HF and one BB beamformed sonar image, again coregistered over the same region of the sea floor. This data set contains not only target objects but also lobster traps, giving experimental intuition as to how the multi-channel correlations change for different object compositions. The use of multiple disparate sonar images, e.g., a high-frequency, high-resolution sonar with good target definition together with several lower-resolution broadband sonars with good clutter suppression ability, significantly improves the detection and false-alarm rates compared to situations where only a single sonar is utilized. Finally, a data set consisting of synthetically generated images of targets with differing degrees of disparity in signal-to-noise ratio (SNR), aspect angle, resolution, etc., is used to conduct a thorough sensitivity analysis of the effects of different SNRs, target types, and disparateness in aspect angle.
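
    The incremental updating idea also admits a short sketch: rather than recomputing the full Gauss-Gauss log-likelihood ratio when data from a new channel arrive, linearly estimate the new data from the old under each hypothesis and add the innovation's contribution. This is a minimal reading of the update described above; the block-partition names and the zero-mean assumption are ours.

    import numpy as np
    from scipy.stats import multivariate_normal

    def llr_update(x_old, x_new, R0, R1, d_old):
        """Incremental log-likelihood-ratio term for appended data x_new.

        R0, R1: joint covariances of [x_old; x_new] under H0 and H1;
        d_old:  dimension of the already-processed data x_old.
        """
        delta = 0.0
        for R, sign in ((R1, +1.0), (R0, -1.0)):
            R_oo, R_no = R[:d_old, :d_old], R[d_old:, :d_old]
            A = R_no @ np.linalg.inv(R_oo)       # linear estimate of new from old
            P = R[d_old:, d_old:] - A @ R_no.T   # error covariance under this hypothesis
            e = x_new - A @ x_old                # estimation error (innovation)
            delta += sign * multivariate_normal.logpdf(e, mean=np.zeros(len(e)), cov=P)
        return delta  # add this to the LLR already computed from x_old alone

    The change in J-divergence follows from the same quantities, since it depends on the two error covariances P under H0 and H1 rather than on the particular observation.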

    Adaptive Mantel Test for Association Testing in Imaging Genetics Data

    Mantel's test (MT) for association is conducted by testing the linear relationship of the similarity of all pairs of subjects between two observational domains. Motivated by applications to neuroimaging and genetics data, and following the success of shrinkage and kernel methods for prediction with high-dimensional data, we here introduce the adaptive Mantel test as an extension of the MT. By utilizing kernels and penalized similarity measures, the adaptive Mantel test is able to achieve higher statistical power than the classical MT in many settings. Furthermore, the adaptive Mantel test is designed to simultaneously test over multiple similarity measures such that the correct type I error rate under the null hypothesis is maintained without the need to directly adjust the significance threshold for multiple testing. The performance of the adaptive Mantel test is evaluated on simulated data, and the test is used to investigate associations between genetic markers related to Alzheimer's disease and healthy brain physiology with data from a working memory study of 350 college students from Beijing Normal University.
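
    The adaptive construction can be sketched in a few lines: compute a Mantel-type statistic for each candidate similarity measure, take the smallest permutation p-value across measures, and calibrate that minimum against the same permutations so no separate multiplicity correction is needed. The sketch below follows that recipe under our own choices (inner-product statistic, permutation scheme); it is not the authors' code.

    import numpy as np

    def mantel_stat(K, L):
        """Mantel-type statistic: inner product of two similarity matrices."""
        return np.sum(K * L)

    def adaptive_mantel(Ks, L, n_perm=999, seed=0):
        """Ks: candidate similarity matrices for domain 1; L: similarity for domain 2."""
        rng = np.random.default_rng(seed)
        n = L.shape[0]
        obs = np.array([mantel_stat(K, L) for K in Ks])
        null = np.empty((n_perm, len(Ks)))
        for b in range(n_perm):
            p = rng.permutation(n)                        # relabel subjects in one domain
            null[b] = [mantel_stat(K, L[np.ix_(p, p)]) for K in Ks]
        # Per-measure permutation p-values for the observed and null statistics.
        p_obs = (1 + (null >= obs).sum(axis=0)) / (n_perm + 1)
        p_null = (1 + (null[:, None, :] >= null[None, :, :]).sum(axis=0)) / (n_perm + 1)
        # Adaptive statistic: the minimum p-value, recalibrated on the same null.
        return (1 + (p_null.min(axis=1) <= p_obs.min()).sum()) / (n_perm + 1)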

    Regularized estimation of information via canonical correlation analysis on a finite-dimensional feature space

    This paper aims to estimate the information between two random phenomena by using consolidated second-order statistics tools. The squared-loss mutual information, a surrogate of the Shannon mutual information, is chosen due to its property of being expressed as a second-order moment. We first review the rationale for i.i.d. discrete sources, which involves mapping the data onto the simplex space, and we highlight the links with other well-known related concepts in the literature based on local approximations of information-theoretic measures. The problem is then translated to analog sources by mapping the data onto the characteristic space, focusing on the adaptability between the discrete and analog cases and its limitations. The proposed approach gains interpretability and scalability for use on large data sets, providing a unified rationale for the free regularization parameters. Moreover, the structure of the proposed mapping allows resorting to Szegö's theorem to reduce the complexity of high-dimensional mappings, exhibiting a strong duality with spectral analysis. The performance of the developed estimators is analyzed using Gaussian mixtures. This work has been supported by the Spanish Ministry of Science and Innovation through project RODIN (PID2019-105717RB-C22/MCIN/AEI/10.13039/501100011033), by grant 2021 SGR 01033 (AGAUR, Generalitat de Catalunya), and by fellowship FI 2019 of the Secretary for University and Research of the Generalitat de Catalunya and the European Social Fund.
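
    Because the squared-loss mutual information is a second-order moment, its estimate reduces to canonical correlations between feature-mapped sources. The sketch below follows that recipe under our own simplifications (a generic finite-dimensional feature map, ridge regularization, and the 1/2 convention for the SMI); the paper's specific simplex and characteristic mappings would enter through Phi_x and Phi_y.

    import numpy as np

    def smi_via_cca(Phi_x, Phi_y, reg=1e-3):
        """Phi_x, Phi_y: (n_samples, d) feature-mapped data from each source."""
        n = Phi_x.shape[0]
        Phi_x = Phi_x - Phi_x.mean(axis=0)
        Phi_y = Phi_y - Phi_y.mean(axis=0)
        Rxx = Phi_x.T @ Phi_x / n + reg * np.eye(Phi_x.shape[1])
        Ryy = Phi_y.T @ Phi_y / n + reg * np.eye(Phi_y.shape[1])
        Rxy = Phi_x.T @ Phi_y / n

        def isqrt(R):                       # inverse matrix square root
            w, V = np.linalg.eigh(R)
            return V @ np.diag(w ** -0.5) @ V.T

        # Singular values of the coherence matrix are the canonical correlations.
        rho = np.linalg.svd(isqrt(Rxx) @ Rxy @ isqrt(Ryy), compute_uv=False)
        return 0.5 * np.sum(rho ** 2)

    For discrete sources the simplex mapping is just one-hot coding, e.g. Phi_x = np.eye(n_symbols)[x] for integer-coded x.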

    Radar data smoothing filter study

    The accuracy of the current Wallops Flight Facility (WFF) data smoothing techniques for a variety of radars and payloads is examined. Alternative data reduction techniques are given, and recommendations are made for improving radar data processing at WFF. A data-adaptive algorithm, based on Kalman filtering and smoothing techniques, is also developed for estimating payload trajectories above the atmosphere from noisy, time-varying radar data. This algorithm is tested and verified using radar tracking data from WFF.
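
    A minimal sketch of such a filter-and-smooth pipeline, under our own illustrative assumptions (a constant-velocity state model, position-only measurements, and arbitrary noise levels rather than WFF radar parameters), is a forward Kalman filter followed by a Rauch-Tung-Striebel smoother:

    import numpy as np

    def kalman_rts(z, dt=0.1, q=1e-2, r=25.0):
        """z: (T,) noisy position samples; returns the smoothed position track."""
        F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity dynamics
        H = np.array([[1.0, 0.0]])               # position-only measurement
        Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
        x, P = np.array([z[0], 0.0]), np.eye(2) * 100.0
        xf, Pf, xp, Pp = [], [], [], []
        for zk in z:                             # forward Kalman filter
            x_pred, P_pred = F @ x, F @ P @ F.T + Q
            K = P_pred @ H.T / (H @ P_pred @ H.T + r)
            x = x_pred + (K * (zk - H @ x_pred)).ravel()
            P = (np.eye(2) - K @ H) @ P_pred
            xf.append(x); Pf.append(P); xp.append(x_pred); Pp.append(P_pred)
        xs, out = xf[-1], [xf[-1]]
        for k in range(len(z) - 2, -1, -1):      # backward RTS smoothing pass
            G = Pf[k] @ F.T @ np.linalg.inv(Pp[k + 1])
            xs = xf[k] + G @ (xs - xp[k + 1])
            out.append(xs)
        return np.array(out[::-1])[:, 0]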

    Surrogate time series

    Before we apply nonlinear techniques, for example those inspired by chaos theory, to dynamical phenomena occurring in nature, it is necessary to first ask whether the use of such advanced techniques is justified "by the data". While many processes in nature seem very unlikely a priori to be linear, the possible nonlinear nature might not be evident in specific aspects of their dynamics. The method of surrogate data has become a very popular tool to address such a question. However, while it was meant to provide a statistically rigorous, foolproof framework, some limitations and caveats have shown up in its practical use. In this paper, recent efforts to understand the caveats, avoid the pitfalls, and overcome some of the limitations are reviewed and augmented by new material. In particular, we discuss specific as well as more general approaches to constrained randomisation, providing a full range of examples. New algorithms are introduced for unevenly sampled and multivariate data and for surrogate spike trains. The main limitation, which lies in the interpretability of the test results, is illustrated through instructive case studies. We also discuss some implementational aspects of the realisation of these methods in the TISEAN (http://www.mpipks-dresden.mpg.de/~tisean) software package. Comment: 28 pages, 23 figures, software at http://www.mpipks-dresden.mpg.de/~tisean
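
    The simplest member of the family reviewed here, the phase-randomized Fourier surrogate, fits in a few lines: it preserves the power spectrum (hence all linear correlations) while destroying any nonlinear structure, so a statistic computed on the data can be ranked against the same statistic on the surrogates. The test statistic below (time-reversal asymmetry) is one common illustrative choice, not the only one.

    import numpy as np

    def ft_surrogate(x, rng):
        """Phase-randomized surrogate with the same power spectrum as x."""
        X = np.fft.rfft(x)
        phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
        phases[0] = 0.0                  # keep the mean (DC bin) untouched
        if len(x) % 2 == 0:
            phases[-1] = 0.0             # the Nyquist bin must stay real
        return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

    def reversal_asymmetry(x, tau=1):
        """Third-order statistic; zero in expectation for linear Gaussian processes."""
        return np.mean((x[tau:] - x[:-tau]) ** 3)

    def surrogate_test(x, n_surr=99, seed=0):
        rng = np.random.default_rng(seed)
        t0 = abs(reversal_asymmetry(x))
        ts = [abs(reversal_asymmetry(ft_surrogate(x, rng))) for _ in range(n_surr)]
        # Rank-based p-value; a rejection indicates some violation of the
        # linear-Gaussian null, subject to the caveats discussed in the paper.
        return (1 + sum(t >= t0 for t in ts)) / (n_surr + 1)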

    A Primer on Reproducing Kernel Hilbert Spaces

    Reproducing kernel Hilbert spaces are elucidated without assuming prior familiarity with Hilbert spaces. Compared with extant pedagogic material, greater care is placed on motivating the definition of reproducing kernel Hilbert spaces and explaining when and why these spaces are efficacious. The novel viewpoint is that reproducing kernel Hilbert space theory studies extrinsic geometry, associating with each geometric configuration a canonical overdetermined coordinate system. This coordinate system varies continuously with changing geometric configurations, making it well suited for studying problems whose solutions also vary continuously with changing geometry. This primer can also serve as an introduction to infinite-dimensional linear algebra, because reproducing kernel Hilbert spaces have more properties in common with Euclidean spaces than do more general Hilbert spaces. Comment: Revised version submitted to Foundations and Trends in Signal Processing
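
    The coordinate-system viewpoint can be made concrete with a few lines of numerics: the minimum-norm RKHS function taking prescribed values is a combination of kernel sections, its coefficients solve a linear system in the Gram matrix, and evaluation is exactly the reproducing property. The Gaussian kernel and the target values below are our illustrative choices.

    import numpy as np

    def k(x, y, sigma=0.5):
        """Gaussian reproducing kernel."""
        return np.exp(-np.subtract.outer(x, y) ** 2 / (2 * sigma**2))

    xs = np.linspace(-1.0, 1.0, 9)       # interpolation nodes
    ys = np.sin(3.0 * xs)                # values the function must take
    K = k(xs, xs)                        # Gram matrix of kernel sections
    alpha = np.linalg.solve(K, ys)       # f = sum_j alpha_j k(., x_j)

    def f(x):
        """Evaluate f via the reproducing property: f(x) = <f, k(., x)>_H."""
        return k(np.atleast_1d(x), xs) @ alpha

    print(np.allclose(f(xs), ys))        # True: f interpolates the data
    print(float(alpha @ K @ alpha) ** 0.5)  # ||f||_H, the minimum RKHS norm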