
    Reliability measure assignment to sonar for robust target differentiation

    This article addresses the use of evidential reasoning and majority voting in multi-sensor decision making for target differentiation using sonar sensors. Classification of target primitives which constitute the basic building blocks of typical surfaces in uncluttered robot environments has been considered. Multiple sonar sensors placed at geographically different sensing sites make decisions about the target type based on their measurement patterns. Their decisions are combined to reach a group decision through Dempster-Shafer evidential reasoning and majority voting. The sensing nodes view the targets at different ranges and angles so that they have different degrees of reliability. Proper accounting for these different reliabilities has the potential to improve decision making compared to simple uniform treatment of the sensors. Consistency problems arising in majority voting are addressed with a view to achieving high classification performance. This is done by introducing preference ordering among the possible target types and assigning reliability measures (which essentially serve as weights) to each decision-making node based on the target range and azimuth estimates it makes and the belief values it assigns to possible target types. The results bring substantial improvement over evidential reasoning and simple majority voting by reducing the target misclassification rate. (C) 2002 Pattern Recognition Society. Published by Elsevier Science Ltd. All rights reserved.
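The abstract above combines per-sensor beliefs through Dempster-Shafer reasoning with reliability weights. A minimal sketch of that idea follows, using Shafer's classical discounting operation and Dempster's rule of combination; the target types, mass values, and reliability factors are hypothetical illustrations, not numbers from the article.

```python
from itertools import product

def discount(m, alpha, frame):
    """Shafer discounting: scale each mass by reliability alpha and move
    the remaining (1 - alpha) to the whole frame, i.e. to ignorance."""
    d = {A: alpha * v for A, v in m.items()}
    d[frame] = d.get(frame, 0.0) + (1.0 - alpha)
    return d

def combine(m1, m2):
    """Dempster's rule of combination over frozenset focal elements."""
    raw, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            raw[inter] = raw.get(inter, 0.0) + a * b
        else:
            conflict += a * b  # mass assigned to disjoint hypotheses
    # normalize by 1 - K, where K is the total conflicting mass
    return {A: v / (1.0 - conflict) for A, v in raw.items()}

frame = frozenset({"plane", "corner", "edge"})
# hypothetical beliefs from two sonar nodes with different reliabilities
m1 = discount({frozenset({"plane"}): 0.8, frame: 0.2}, alpha=0.9, frame=frame)
m2 = discount({frozenset({"corner"}): 0.6, frame: 0.4}, alpha=0.5, frame=frame)
fused = combine(m1, m2)
```

After fusion, the more reliable node's hypothesis ("plane") carries more mass than the less reliable node's ("corner"), which is the effect of the reliability weighting described above.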

    Comparative analysis of different approaches to target classification and localization with sonar

    A comparison of different classification and fusion techniques for target classification and localization with sonar is reported. The target localization performance of artificial neural networks (ANN) was found to be better than that of the target differentiation algorithm (TDA) and fusion techniques. The target classification performance of non-parametric approaches was better than that of the parameterized density estimator (PDE) using homoscedastic and heteroscedastic NM for statistical pattern recognition techniques.

    Voting as validation in robot programming

    This paper investigates the use of voting as a conflict-resolution technique for data analysis in robot programming. Voting represents an information-abstraction technique. It is argued that in some cases a voting approach is inherent in the nature of the data being analyzed: where multiple, independent sources of information must be reconciled to give a group decision that reflects a single outcome rather than a consensus average. This study considers an example of target classification using sonar sensors. Physical models of reflections from target primitives that are typical of the indoor environment of a mobile robot are used. Dispersed sensors take decisions on target type, which must then be fused to give the single group classification of the presence or absence and type of a target. Dempster-Shafer evidential reasoning is used to assign a level of belief to each sensor decision. The decisions are then fused by two means. Using Dempster's rule of combination, conflicts are resolved through a group measure expressing dissonance in the sensor views. This evidential approach is contrasted with the resolution of sensor conflict through voting. It is demonstrated that abstraction of the level of belief through voting proves useful in resolving the straightforward conflicts that arise in the classification problem. Conflicts arise where the discriminant data value, an echo amplitude, is most sensitive to noise. Fusion helps to overcome this vulnerability: in Dempster-Shafer reasoning, through the modeling of nonparametric uncertainty and combination of belief values; and in voting, by emphasizing the majority view. The paper gives theoretical and experimental evidence for the use of voting for data abstraction and conflict resolution in areas such as classification, where a strong argument can be made for techniques that emphasize a single outcome rather than an estimated value. Methods for making the vote more strategic are also investigated. 
The paper also addresses reducing the dimension of sets of decision points or decision makers. Through a consideration of combination order, queuing criteria for more strategic fusion are identified.
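The voting-with-reliability scheme described in these abstracts can be sketched as a weighted vote over each node's preference ordering, where a node's reliability weight is spread across its ranked choices (Borda-style) so that ties simple majority voting cannot resolve are broken. The target names, weights, and scoring rule below are illustrative assumptions, not the paper's exact formulation.

```python
from collections import defaultdict

def weighted_vote(decisions):
    """decisions: list of (ranked preference list, reliability weight) per node.
    Each node contributes weight * (n - rank) / n to its rank-th choice,
    so higher-ranked hypotheses from more reliable nodes score more."""
    scores = defaultdict(float)
    for prefs, w in decisions:
        n = len(prefs)
        for rank, target in enumerate(prefs):
            scores[target] += w * (n - rank) / n
    return max(scores, key=scores.get)

# hypothetical node outputs: preference ordering plus reliability weight
nodes = [
    (["plane", "corner", "edge"], 0.9),
    (["corner", "plane", "edge"], 0.4),
    (["plane", "edge", "corner"], 0.7),
]
winner = weighted_vote(nodes)  # the two reliable "plane" voters dominate
```

The single returned label reflects the "single outcome rather than a consensus average" character of voting that the paper argues for.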

    A comparison of different approaches to target differentiation with sonar

    Ankara : The Department of Electrical and Electronics Engineering and the Institute of Engineering and Science of Bilkent University, 2001. Thesis (Ph.D.) -- Bilkent University, 2001. Includes bibliographical references (leaves 180-197). This study compares the performances of different classification schemes and fusion techniques for target differentiation and localization of commonly encountered features in indoor robot environments using sonar sensing. Differentiation of such features is of interest for intelligent systems in a variety of applications such as system control based on acoustic signal detection and identification, map building, navigation, obstacle avoidance, and target tracking. The classification schemes employed include the target differentiation algorithm developed by Ayrulu and Barshan, statistical pattern recognition techniques, the fuzzy c-means clustering algorithm, and artificial neural networks. The fusion techniques used are Dempster-Shafer evidential reasoning and different voting schemes. To solve the consistency problem arising in simple majority voting, different voting schemes including preference ordering and reliability measures are proposed and verified experimentally. To improve the performance of neural network classifiers, different input signal representations, two different training algorithms, and both modular and non-modular network structures are considered. The best classification and localization scheme is found to be the neural network classifier trained with the wavelet transform of the sonar signals. This method is applied to map building in mobile robot environments. Physically different sensors, such as infrared sensors and structured light systems besides sonar sensors, are also considered to improve the performance in target classification and localization. Ayrulu (Erdem), Birsel. Ph.D.

    Computational Analysis of Mass Spectrometric Data for Whole Organism Proteomic Studies

    In recent decades, great breakthroughs have been achieved in the study of genomes, supplying us with vast knowledge of genes and a large number of sequenced organisms. With the availability of genome information, new systematic studies have arisen. One of the most prominent areas is proteomics. Proteomics is a discipline devoted to the study of an organism’s expressed protein content. Proteomics studies are concerned with a wide range of problems. Major proteomics focuses include the study of protein expression patterns, the detection of protein-protein interactions, protein quantitation, protein localization analysis, and the characterization of post-translational modifications. The emergence of proteomics shows great promise for furthering our understanding of cellular processes and the mechanisms of life. One of the main techniques used for high-throughput proteomic studies is mass spectrometry. Capable of detecting the masses of biological compounds in complex mixtures, it is currently one of the most powerful methods for protein characterization. New horizons are opening with developments in mass spectrometry instrumentation, which can now be applied to a variety of proteomic problems. One of the most popular applications of proteomics involves whole-organism high-throughput experiments. However, as new instrumentation is developed, followed by the design of new experiments, we find ourselves needing new computational algorithms to interpret the results of the experiments. As the thresholds of current technology are probed, new algorithmic designs are beginning to emerge to meet the challenges of mass spectrometry data evaluation and interpretation. This dissertation is devoted to the computational analysis of mass spectrometric data, involving a combination of different topics and techniques to improve our understanding of biological processes using high-throughput whole-organism proteomic studies.
It consists of the development of new algorithms to improve the data interpretation of current tools, the introduction of a new algorithmic approach for post-translational modification detection, and the characterization of a set of computational simulations for biological agent detection in a complex organism background. These studies are designed to further our ability to understand the results of high-throughput mass spectrometric experiments and their impact on the field of proteomics.
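A common core step in post-translational modification detection is explaining the mass difference between an observed peptide and its unmodified theoretical mass by a known modification shift. The dissertation's actual algorithm is not given in this abstract; the sketch below is only a generic illustration of that mass-shift matching idea, with standard monoisotopic shift values and hypothetical peptide masses.

```python
# monoisotopic mass shifts (Da) for a few common post-translational modifications
MOD_SHIFTS = {
    "phosphorylation": 79.96633,
    "oxidation": 15.99491,
    "acetylation": 42.01057,
}

def match_modification(observed_mass, theoretical_mass, tol=0.01):
    """Return the modifications whose mass shift explains the
    observed - theoretical difference within the given tolerance (Da)."""
    delta = observed_mass - theoretical_mass
    return [name for name, shift in MOD_SHIFTS.items()
            if abs(delta - shift) <= tol]

# hypothetical example: a peptide observed ~79.97 Da heavier than expected
hits = match_modification(observed_mass=1125.560, theoretical_mass=1045.594)
```

In practice such matching is run over large spectral datasets with instrument-dependent tolerances, which is where efficient computational algorithms become necessary.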

    Multi Sensor Multi Target Perception and Tracking for Informed Decisions in Public Road Scenarios

    Multi-target tracking in public traffic calls for a tracking system with automated track initiation and termination facilities in a randomly evolving driving environment. Besides, the key problem of data association needs to be handled effectively considering the limitations in the computational resources on board an autonomous car. The challenge of the tracking problem is further evident in the use of high-resolution automotive sensors which return multiple detections per object. Furthermore, it is customary to use multiple sensors that cover different and/or overlapping fields of view and fuse sensor detections to provide robust and reliable tracking. As a consequence, in high-resolution multi-sensor settings, the data association uncertainty and the corresponding tracking complexity increase, pointing to a systematic approach to handle and process sensor detections. In this work, we present a multi-target tracking system that addresses target birth/initiation and death/termination processes with automatic track management features. These tracking functionalities can help facilitate perception during common events in public traffic as participants (suddenly) change lanes, navigate intersections, overtake, and/or brake in emergencies, etc. Various tracking approaches, including the ones based on the joint integrated probabilistic data association (JIPDA) filter, the linear multi-target integrated probabilistic data association (LMIPDA) filter, and their multi-detection variants, are adapted to specifically include algorithms that handle track initiation and termination, clutter density estimation, and track management. The utility of the filtering module is further elaborated by integrating it into a trajectory tracking problem based on model predictive control.
To cope with tracking complexity in the case of multiple high-resolution sensors, we propose a hybrid scheme that combines data clustering at the local sensor with multiple-detection tracking schemes at the fusion layer. We implement a track-to-track fusion scheme that de-correlates local (sensor) tracks to avoid double counting and apply a measurement partitioning scheme to re-purpose the LMIPDA tracking algorithm for multi-detection cases. In addition to the measurement partitioning approach, a joint extent and kinematic state estimation scheme is integrated into the LMIPDA approach to facilitate perception and tracking of individual as well as group targets in multi-lane public traffic. We formulate the tracking problem as a two-layer hierarchy. This arrangement enhances multi-target tracking performance in situations including, but not limited to, target initialization (birth process), target occlusion, missed detections, unresolved measurements, and target maneuvers. Also, target groups expose complex individual target interactions that help in situation assessment, which is challenging to capture otherwise. The simulation studies are complemented by experimental studies performed on single and multiple (group) targets. Target detections are collected from a high-resolution radar at a frequency of 20 Hz, whereas RTK-GPS data is made available as ground truth for one of the target vehicle's trajectories.
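At the heart of the data association problem discussed above is deciding which detections belong to which tracks under measurement uncertainty. The JIPDA/LMIPDA machinery is far richer than this, but the basic gating step can be sketched with a chi-square gate on the Mahalanobis distance; the track states, covariances, and gate value below are illustrative assumptions.

```python
import numpy as np

def gate_and_associate(track_means, track_covs, detections, gate=9.21):
    """Greedy gated nearest-neighbor association: each detection goes to the
    track with the smallest Mahalanobis distance, if that distance falls inside
    the chi-square gate (9.21 ~ 99% gate for a 2-D measurement); otherwise the
    detection is left unassigned (clutter or a candidate new track)."""
    assignments = []
    for z in detections:
        best, best_d2 = None, gate
        for i, (x, P) in enumerate(zip(track_means, track_covs)):
            v = z - x
            d2 = float(v @ np.linalg.inv(P) @ v)  # squared Mahalanobis distance
            if d2 < best_d2:
                best, best_d2 = i, d2
        assignments.append(best)
    return assignments

tracks = [np.array([0.0, 0.0]), np.array([10.0, 10.0])]
covs = [np.eye(2), np.eye(2)]
dets = [np.array([0.5, -0.3]), np.array([9.5, 10.2]), np.array([50.0, 50.0])]
assoc = gate_and_associate(tracks, covs, dets)  # last detection falls outside every gate
```

Probabilistic data association filters replace this hard assignment with association probabilities over all gated detections, which is what makes them robust to clutter and missed detections.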

    Experimental Design for Sensitivity Analysis, Optimization and Validation of Simulation Models

    This chapter gives a survey on the use of statistical designs for what-if analysis in simulation, including sensitivity analysis, optimization, and validation/verification. Sensitivity analysis is divided into two phases. The first phase is a pilot stage, which consists of screening or searching for the important factors among (say) hundreds of potentially important factors. A novel screening technique is presented, namely sequential bifurcation. The second phase uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as a metamodel or a response surface. Regression analysis gives better results when the simulation experiment is well designed, using either classical statistical designs (such as fractional factorials) or optimal designs (such as pioneered by Fedorov, Kiefer, and Wolfowitz). To optimize the simulated system, the analysts may apply Response Surface Methodology (RSM); RSM combines regression analysis, statistical designs, and steepest-ascent hill-climbing. To validate a simulation model, again regression analysis and statistical designs may be applied. Several numerical examples and case studies illustrate how statistical techniques can reduce the ad hoc character of simulation; that is, these statistical techniques can make simulation studies give more general results, in less time. Appendix 1 summarizes confidence intervals for expected values, proportions, and quantiles, in terminating and steady-state simulations. Appendix 2 gives details on four variance reduction techniques, namely common pseudorandom numbers, antithetic numbers, control variates or regression sampling, and importance sampling. Appendix 3 describes jackknifing, which may give robust confidence intervals.
    Keywords: least squares; distribution-free; non-parametric; stopping rule; run-length; Von Neumann; median; seed; likelihood ratio
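Among the variance reduction techniques the abstract lists, antithetic numbers are the simplest to illustrate: each uniform draw u is paired with 1 - u, so over- and under-estimates partially cancel for monotone integrands. The sketch below is a minimal generic example, not code from the chapter; the integrand and sample sizes are arbitrary choices.

```python
import random

def mc_mean(f, n, antithetic=False, seed=42):
    """Monte Carlo estimate of E[f(U)], U ~ Uniform(0, 1).
    With antithetic numbers, each draw u is paired with 1 - u and the
    pair's average is used, which reduces variance for monotone f."""
    rng = random.Random(seed)
    if antithetic:
        us = [rng.random() for _ in range(n // 2)]
        samples = [0.5 * (f(u) + f(1.0 - u)) for u in us]
    else:
        samples = [f(rng.random()) for _ in range(n)]
    return sum(samples) / len(samples)

# both estimators target E[U^2] = 1/3; the antithetic one uses the same
# total number of function evaluations but paired, negatively correlated draws
plain = mc_mean(lambda u: u * u, 10000)
anti = mc_mean(lambda u: u * u, 10000, antithetic=True)
```

Common pseudorandom numbers work analogously across system variants (same seed for both configurations), while control variates and importance sampling exploit knowledge about the integrand rather than the input stream.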