
    Fusion of Sensor Data and Intelligence in FITS

    Get PDF
    Proceedings of the 16th International Conference on Information Fusion (FUSION 2013), Istanbul, Turkey, 9-12 July 2013.

    The design and implementation of fusion systems working in real conditions requires functional and performance specification, analysis of the information input and the contextual domain, and the development of testing and validation tools. This paper presents a fusion system recently developed to operate with EW and ISR sensors on board patrol aircraft, whose outputs must be fused with information from other collaborative entities and with intelligence held in databases. The paper describes the overall organization of the system, its modules, and the data flow. The characterization of the data sources and the core algorithms for data alignment, uncertainty representation, and fusion management are detailed and validated in realistic situations.

    This work was supported in part by Projects FITS-DFS (EADS/CASA), MEyC TEC2012-37832-C02-01, MEyC TEC2011-28626-C02-02, and CAM CONTEXTS (S2009/TIC-1485).
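
    The data-alignment step mentioned above can be illustrated with a toy example. The sketch below is a generic illustration, not a detail from the paper; the function name, frame conventions, and values are all assumptions. It rotates and translates a platform-relative range/bearing report into a shared east-north Cartesian frame:

    ```python
    # A minimal sketch of sensor data alignment: converting a platform-relative
    # polar measurement (range, bearing) into a common Cartesian frame.
    # All names and conventions here are illustrative assumptions.
    import math

    def align_report(rng, bearing_deg, platform_xy, platform_heading_deg):
        """Map a (range, bearing) report into the shared east-north frame.

        bearing is clockwise from the platform's nose; heading is clockwise
        from north; the common frame is east-north Cartesian in meters.
        """
        # Absolute bearing in the common frame (clockwise from north).
        abs_bearing = math.radians(platform_heading_deg + bearing_deg)
        east = platform_xy[0] + rng * math.sin(abs_bearing)
        north = platform_xy[1] + rng * math.cos(abs_bearing)
        return (east, north)

    # Example: a contact 10 km off the right wing of a platform heading north.
    print(align_report(10_000.0, 90.0, (0.0, 0.0), 0.0))  # ~(10000.0, 0.0)
    ```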

    Fault detection, identification and accommodation techniques for unmanned airborne vehicles

    Get PDF
    Unmanned Airborne Vehicles (UAVs) are assuming prominent roles in both the commercial and military aerospace industries. The promise of reduced cost and reduced risk to human life is one of their major attractions; however, these low-cost systems have yet to gain acceptance as a safe alternative to manned solutions. The absence of a thinking, observing, reacting, and decision-making pilot reduces a UAV's capability to manage adverse situations such as faults and failures. This paper presents a review of techniques that can be used to track system health onboard a UAV. The review is based on a year-long literature survey aimed at identifying approaches suitable for combating the low reliability and high attrition rates of today's UAVs. The research focuses primarily on real-time, onboard implementations for generating accurate estimates of aircraft health for fault accommodation and mission management (change of mission objectives due to deterioration in aircraft health). The major task of such systems is the detection, identification, and accommodation of faults and failures (FDIA). A number of approaches exist, of which model-based techniques show particular promise. Model-based approaches use analytical redundancy to generate residuals for the aircraft parameters that can be used to indicate the occurrence of a fault or failure. Actions such as switching between redundant components or modifying control laws can then be taken to accommodate the fault. The paper further describes recent work in evaluating neural-network approaches to sensor failure detection and identification (SFDI). The results of simulations with a variety of sensor failures, based on a Matlab non-linear aircraft model, are presented and discussed. Suggestions for improvement are made based on the limitations of this neural-network approach, with the aim of including a broader range of failures while still maintaining an accurate model in their presence.
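
    The residual-generation idea behind the model-based techniques described above can be sketched in a few lines. The following toy example is an assumption-laden illustration, not any specific FDIA scheme from the review: the model predictions, threshold, and signal values are invented, and a real system would use a dynamic aircraft model rather than a precomputed prediction:

    ```python
    # A minimal sketch of model-based residual generation for sensor fault
    # detection: analytical redundancy supplies a predicted sensor value, and
    # a large measured-vs-predicted residual flags a fault. Illustrative only.
    import numpy as np

    def detect_fault(measured, predicted, threshold):
        """Flag samples whose residual |measured - predicted| exceeds threshold."""
        residual = np.abs(np.asarray(measured) - np.asarray(predicted))
        return residual > threshold

    # Example: an airspeed sensor sticks at 80 m/s from sample 3 onward,
    # while the analytical model predicts a steady climb in airspeed.
    predicted = [80.0, 82.0, 84.0, 86.0, 88.0, 90.0]
    measured = [80.2, 81.7, 84.3, 80.0, 80.0, 80.0]
    print(detect_fault(measured, predicted, threshold=2.0))
    # -> [False False False  True  True  True]
    ```

    Once a residual crosses the threshold, accommodation actions like those named above (switching to a redundant sensor, reconfiguring control laws) can be triggered.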

    Biologically Inspired Approaches to Automated Feature Extraction and Target Recognition

    Full text link
    Ongoing research at Boston University has produced computational models of biological vision and learning that embody a growing corpus of scientific data and predictions. Vision models perform long-range grouping and figure/ground segmentation, and memory models create attentionally controlled recognition codes that intrinsically combine bottom-up activation and top-down learned expectations. These two streams of research form the foundation of novel, dynamically integrated systems for image understanding. Simulations using multispectral images illustrate road completion across occlusions in a cluttered scene and information fusion from incorrect labels that are simultaneously inconsistent and correct. The CNS Vision and Technology Labs (cns.bu.edu/visionlab and cns.bu.edu/techlab) are further integrating science and technology through analysis, testing, and development of cognitive and neural models for large-scale applications, complemented by software specification and code distribution.

    Air Force Office of Scientific Research (F40620-01-1-0423); National Geospatial-Intelligence Agency (NMA 201-001-1-2016); National Science Foundation (SBE-0354378, BCS-0235298); Office of Naval Research (N00014-01-1-0624); National Geospatial-Intelligence Agency and the National Science Foundation for Siegfried Martens (NMA 501-03-1-2030, DGE-0221680); Department of Homeland Security graduate fellowship.

    Paradox Elimination in Dempster–Shafer Combination Rule with Novel Entropy Function: Application in Decision-Level Multi-Sensor Fusion

    Get PDF
    Multi-sensor data fusion technology is an important tool in building decision-making applications. Modified Dempster–Shafer (DS) evidence theory can handle conflicting sensor inputs and can be applied without any prior information. As a result, DS-based information fusion is popular in decision-making applications, but the original DS theory produces counterintuitive results when combining highly conflicting evidence from multiple sensors. An effective algorithm for fusing highly conflicting information in the spatial domain is not widely reported in the literature. In this paper, a fusion algorithm is proposed that addresses these limitations of the original Dempster–Shafer (DS) framework. A novel entropy function based on Shannon entropy is proposed, which captures uncertainty better than both Shannon and Deng entropy. An eight-step algorithm is developed that eliminates the inherent paradoxes of classical DS theory. Multiple examples are presented to show that the proposed method is effective in handling conflicting information in the spatial domain. Simulation results show that the proposed algorithm has a convergence rate and accuracy competitive with other methods presented in the literature.
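
    For context, the classical Dempster combination rule that the paper modifies can be sketched as follows. This is not the paper's eight-step algorithm; it shows the standard normalization by total conflict, whose counterintuitive behavior on highly conflicting evidence (here, Zadeh's well-known example) is exactly the paradox the paper targets:

    ```python
    # A minimal sketch of the classical Dempster rule of combination.
    # Focal elements are modeled as frozensets; masses as dict values.
    def dempster_combine(m1, m2):
        """Combine two mass functions (dicts: frozenset -> mass)."""
        combined, conflict = {}, 0.0
        for a, ma in m1.items():
            for b, mb in m2.items():
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + ma * mb
                else:
                    conflict += ma * mb  # mass falling on the empty set
        if conflict >= 1.0:
            raise ValueError("total conflict: sources are incompatible")
        # Normalize by (1 - K), the step that produces the DS paradoxes.
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    # Zadeh's classic example: two sources almost fully disagree, yet the
    # rule assigns essentially all belief to the barely supported 'C'.
    A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
    m1 = {A: 0.99, C: 0.01}
    m2 = {B: 0.99, C: 0.01}
    print(dempster_combine(m1, m2))  # ~ {frozenset({'C'}): 1.0}
    ```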

    Stereo and ToF Data Fusion by Learning from Synthetic Data

    Get PDF
    Time-of-Flight (ToF) sensors and stereo vision systems are both capable of acquiring depth information, but they have complementary characteristics and issues. A more accurate representation of the scene geometry can be obtained by fusing the two depth sources. In this paper, we present a novel framework for data fusion in which the contribution of the two depth sources is controlled by confidence measures that are jointly estimated using a Convolutional Neural Network. The two depth sources are fused by enforcing the local consistency of the depth data, taking into account the estimated confidence information. The deep network is trained on a synthetic dataset, and we show that the classifier is able to generalize to different data, obtaining reliable estimates not only on synthetic data but also on real-world scenes. Experimental results show that the proposed approach increases the accuracy of depth estimation on both synthetic and real data and that it outperforms state-of-the-art methods.
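
    The basic idea of confidence-controlled fusion can be illustrated with a per-pixel weighted average. This is a deliberately simplified sketch under stated assumptions: the paper's framework additionally enforces local consistency, and the confidences below are hard-coded rather than estimated by a CNN:

    ```python
    # A minimal sketch of confidence-weighted fusion of two depth maps.
    # Confidence maps are given here as an assumption; the paper estimates
    # them jointly with a CNN and adds a local-consistency step.
    import numpy as np

    def fuse_depth(depth_tof, depth_stereo, conf_tof, conf_stereo, eps=1e-6):
        """Per-pixel weighted average of two depth maps by their confidences."""
        w_sum = conf_tof + conf_stereo + eps  # eps avoids division by zero
        return (conf_tof * depth_tof + conf_stereo * depth_stereo) / w_sum

    # Example: stereo is trusted on textured regions, ToF on textureless ones.
    d_tof = np.array([[2.0, 2.1], [2.2, 2.3]])
    d_stereo = np.array([[2.4, 2.0], [2.2, 2.5]])
    c_tof = np.array([[0.9, 0.5], [0.5, 0.1]])
    c_stereo = np.array([[0.1, 0.5], [0.5, 0.9]])
    print(fuse_depth(d_tof, d_stereo, c_tof, c_stereo))
    ```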