
    Real-time processing of radar return on a parallel computer

    NASA is working with the FAA to demonstrate the feasibility of pulse-Doppler radar as a candidate airborne sensor to detect low-altitude windshear. The need to provide the pilot with timely information about possible hazards has motivated a demand for real-time processing of the radar return. Investigated here is parallel processing as a means of accommodating the high data rates required. A PC-based parallel computer built from transputers is used to investigate issues in real-time concurrent processing of radar signals. A transputer network is made up of an array of single-instruction-stream processors that can be connected in a variety of ways. They are easily reconfigured, and software development is largely independent of the particular network topology. The performance of the transputers is evaluated in light of the computational requirements. A number of algorithms have been implemented on the transputers in OCCAM, a language specially designed for parallel processing. These include signal processing algorithms such as the Fast Fourier Transform (FFT), pulse-pair, and autoregressive modeling, as well as routing software to support concurrency. The most computationally intensive task is estimating the spectrum. Two approaches have been taken to this problem, the first and most conventional of which is to use the FFT. By using table look-ups for the basis functions and other optimizing techniques, an algorithm has been developed that is sufficient for real time. The other approach is to model the signal as an autoregressive process and estimate the spectrum from the model coefficients. This technique is attractive because it does not suffer from the spectral leakage inherent in the FFT. Benchmark tests indicate that autoregressive modeling is feasible in real time.
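    The abstract gives no code, but the autoregressive route it describes can be sketched roughly as below. This is a minimal illustration, not the paper's OCCAM implementation: the function names, the plain Yule-Walker solve, the model order and the toy signal are assumptions made here, and a real pulse-Doppler version would operate on complex in-phase/quadrature samples per range gate.

        import numpy as np

        def yule_walker_ar(x, order):
            # Estimate AR coefficients and driving-noise variance from a signal
            # via the Yule-Walker (autocorrelation) equations.
            x = np.asarray(x, dtype=float) - np.mean(x)
            n = len(x)
            r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
            R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
            a = np.linalg.solve(R, -r[1:])       # model: x[k] + a1*x[k-1] + ... = e[k]
            sigma2 = r[0] + np.dot(a, r[1:])     # variance of the driving noise e[k]
            return a, sigma2

        def ar_spectrum(a, sigma2, n_freq=256):
            # Power spectrum implied by the AR model:
            # sigma^2 / |1 + sum_k a_k exp(-j*2*pi*f*k)|^2
            freqs = np.linspace(0.0, 0.5, n_freq)   # normalized frequency, cycles/sample
            k = np.arange(1, len(a) + 1)
            denom = np.abs(1.0 + np.exp(-2j * np.pi * np.outer(freqs, k)) @ a) ** 2
            return freqs, sigma2 / denom

        # Toy check: a noisy sinusoid should give a sharp peak near its frequency,
        # without the sidelobe (leakage) structure a windowed FFT would show.
        rng = np.random.default_rng(0)
        sig = np.sin(2 * np.pi * 0.1 * np.arange(512)) + 0.5 * rng.normal(size=512)
        coeffs, noise_var = yule_walker_ar(sig, order=8)
        f, psd = ar_spectrum(coeffs, noise_var)
        print("spectral peak near normalized frequency", f[np.argmax(psd)])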

    Estimation and Detection


    Human Motion Trajectory Prediction: A Survey

    With growing numbers of intelligent autonomous systems in human environments, the ability of such systems to perceive, understand and anticipate human behavior becomes increasingly important. Specifically, predicting future positions of dynamic agents and planning with such predictions in mind are key tasks for self-driving vehicles, service robots and advanced surveillance systems. This paper provides a survey of human motion trajectory prediction. We review, analyze and structure a large selection of work from different communities and propose a taxonomy that categorizes existing methods based on the motion modeling approach and the level of contextual information used. We provide an overview of the existing datasets and performance metrics. We discuss limitations of the state of the art and outline directions for further research. (Submitted to the International Journal of Robotics Research (IJRR), 37 pages.)

    Optimization of Automatic Target Recognition with a Reject Option Using Fusion and Correlated Sensor Data

    This dissertation examines the optimization of automatic target recognition (ATR) systems when a rejection option is included. First, a comprehensive review of the literature covering ATR assessment, fusion, correlated sensor data, and classifier rejection is presented. An optimization framework for the fusion of multiple sensors is then developed. This framework identifies preferred fusion rules and sensors along with rejection and receiver operating characteristic (ROC) curve thresholds without the explicit misclassification costs required by a Bayes' loss function. This optimization framework is the first to integrate both vertical warfighter output-label analysis and horizontal engineering confusion-matrix analysis. In addition, optimization is performed for the true positive rate, which incorporates the time required by classification systems. The mathematical programming framework is used to assess different fusion methods and to characterize correlation effects both within and across sensors. A synthetic classifier fusion-testing environment is developed by controlling the correlation levels of generated multivariate Gaussian data. This synthetic environment is used to demonstrate the utility of the optimization framework and to assess the performance of fusion algorithms as correlation varies. The mathematical programming framework is then applied to collected radar data. This radar fusion experiment optimizes Boolean and neural network fusion rules across four levels of sensor correlation. Comparisons are presented for the maximum true positive rate and the percentage of feasible thresholds to assess system robustness. Empirical evidence suggests ATR performance may improve by reducing the correlation within and across polarimetric radar sensors. Sensitivity analysis shows ATR performance is affected by the number of forced looks, prior probabilities, the maximum allowable rejection level, and the acceptable error rates.
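    As a rough illustration of a classifier with a reject option, the sketch below withholds a declaration whenever the score falls inside a band around the decision threshold and reports the true-positive rate, the accuracy on declared samples, and the rejection rate. The score model, the band width, and the convention of counting rejected targets as missed are illustrative assumptions, not details taken from the dissertation.

        import numpy as np

        def classify_with_reject(scores, decision_thresh=0.5, reject_band=0.1):
            # Declare 1 (target) or 0 (non-target) from a confidence score in [0, 1],
            # but reject (label -1) samples whose score lies too close to the threshold.
            labels = np.where(scores >= decision_thresh, 1, 0)
            labels[np.abs(scores - decision_thresh) < reject_band] = -1
            return labels

        def operating_point(labels, truth):
            # TPR over all true targets (rejected targets count as missed),
            # accuracy on declared samples, and the fraction rejected.
            declared = labels != -1
            tpr = np.sum((labels == 1) & (truth == 1)) / np.sum(truth == 1)
            acc = np.mean(labels[declared] == truth[declared])
            return tpr, acc, 1.0 - np.mean(declared)

        # Toy sweep of the reject band to expose the accuracy/throughput trade-off.
        rng = np.random.default_rng(1)
        truth = rng.integers(0, 2, size=2000)
        scores = np.clip(0.35 + 0.3 * truth + 0.2 * rng.normal(size=2000), 0.0, 1.0)
        for band in (0.0, 0.05, 0.10):
            tpr, acc, rej = operating_point(classify_with_reject(scores, reject_band=band), truth)
            print(f"reject band {band:.2f}: TPR = {tpr:.2f}, declared accuracy = {acc:.2f}, rejected = {rej:.2f}")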

    Collaborative adaptive filtering for machine learning

    Quantitative performance criteria for the analysis of machine learning architectures and algorithms have long been established. However, qualitative performance criteria, which identify fundamental signal properties and ensure that any processing preserves the desired properties, are still emerging. In many cases, whilst offline statistical tests exist, such as assessment of nonlinearity or stochasticity, online tests which not only characterise but also track changes in the nature of the signal are lacking. To that end, by employing recent developments in signal characterisation, criteria are derived for the assessment of changes in the nature of the processed signal. Through the fusion of the outputs of adaptive filters, a single collaborative hybrid filter is produced. By tracking the dynamics of the mixing parameter of this filter, rather than the actual filter performance, a clear indication of the current nature of the signal is given. Implementations of the proposed method show that it is possible to quantify the degree of nonlinearity within both real- and complex-valued data. This is then extended (in the real domain) from nonlinearity in general to a more specific property, namely sparsity.

    Extensions of adaptive filters from the real to the complex domain are non-trivial, and the differences between the statistics in the real and complex domains need to be taken into account. In terms of signal characteristics, nonlinearity can be both split- and fully-complex, and complex-valued data can be circular or noncircular. Furthermore, by combining the information obtained from hybrid filters of different natures, it is possible to gain a more complete understanding of the nature of the nonlinearity within a signal. This also paves the way for building multidimensional feature spaces and their application in data/information fusion.

    To produce online tests for sparsity, adaptive filters for sparse environments are investigated and a unifying framework for the derivation of proportionate normalised least mean square (PNLMS) algorithms is presented. This is then extended to derive variants with an adaptive step size. To create an online test for noncircularity, a study of widely linear autoregressive modelling is presented, from which a proof of the convergence of the test for noncircularity is given. Applications of this method are illustrated on examples such as biomedical signals, speech and wind data.
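    The mixing-parameter idea at the core of the hybrid filter can be sketched, very roughly, as a convex combination of two LMS sub-filters whose combination weight is adapted by gradient descent and tracked over time. Everything concrete below (the tanh sub-filter, step sizes, clipping, and the AR(1) toy signal) is an assumption for illustration, not the thesis's actual algorithms.

        import numpy as np

        def hybrid_filter_lambda_trace(d, x, order=4, mu=0.01, mu_a=1.0):
            # Convex combination of a linear LMS sub-filter and a tanh-nonlinear LMS
            # sub-filter. The mixing parameter lambda = sigmoid(a) is adapted by a
            # gradient step on the combined error: lambda near 1 suggests the signal
            # is currently well modelled linearly, lambda near 0 a more nonlinear nature.
            n = len(d)
            w_lin = np.zeros(order)
            w_nl = np.zeros(order)
            a = 0.0
            lam_trace = np.zeros(n)
            for k in range(order, n):
                u = x[k - order:k][::-1]                 # regressor, most recent sample first
                y1 = w_lin @ u                           # linear sub-filter output
                y2 = np.tanh(w_nl @ u)                   # nonlinear sub-filter output
                lam = 1.0 / (1.0 + np.exp(-a))
                e = d[k] - (lam * y1 + (1.0 - lam) * y2)
                w_lin += mu * (d[k] - y1) * u                    # LMS update with its own error
                w_nl += mu * (d[k] - y2) * (1.0 - y2 ** 2) * u   # LMS update through the tanh
                a += mu_a * e * (y1 - y2) * lam * (1.0 - lam)    # adapt the mixing parameter
                a = np.clip(a, -4.0, 4.0)                # keep lambda away from hard 0 or 1
                lam_trace[k] = lam
            return lam_trace

        # Toy usage: one-step prediction of a linear AR(1) signal; lambda should drift
        # toward the linear sub-filter as it fits the data better.
        rng = np.random.default_rng(2)
        s = np.zeros(4000)
        for k in range(1, len(s)):
            s[k] = 0.9 * s[k - 1] + rng.normal()
        lam = hybrid_filter_lambda_trace(d=s[1:], x=s[:-1])
        print("final mixing parameter:", round(float(lam[-1]), 3))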