
    Robust sensor fusion in real maritime surveillance scenarios

    8 pages, 14 figures. Proceedings of the 13th International Conference on Information Fusion (FUSION'2010), Edinburgh, Scotland, UK, Jul 26-29, 2010. This paper presents the design and evaluation of a sensor fusion system for maritime surveillance. The system must exploit the complementary AIS and radar sensing technologies to synthesize a reliable surveillance picture, using a highly efficient implementation in order to operate in dense scenarios. The paper highlights the realistic effects taken into account to achieve robust data combination and system scalability. This work was supported in part by a national project with NUCLEO CC, and by research projects CICYT TEC2008-06732-C02-02/TEC, CICYT TIN2008-06742-C02-02/TSI, SINPROB, CAM CONTEXTS S2009/TIC-1485 and DPS2008-07029-C02-02.

    Uncertainty in Ontologies: Dempster-Shafer Theory for Data Fusion Applications

    Ontologies are nowadays of growing interest in Data Fusion applications. As a matter of fact, ontologies are seen as a semantic tool for describing and reasoning about sensor data, objects, relations and general domain theories. In addition, uncertainty is perhaps one of the most important characteristics of the data and information handled by Data Fusion. However, the fundamental nature of ontologies implies that they describe only asserted and veracious facts about the world. Different probabilistic, fuzzy and evidential approaches already exist to fill this gap; this paper recaps the most popular tools. However, none of these tools exactly meets our purposes. Therefore, we constructed a Dempster-Shafer ontology that can be imported into any specific domain ontology and that enables us to instantiate it in an uncertain manner. We also developed a Java application that enables reasoning about these uncertain ontological instances. Comment: Workshop on Theory of Belief Functions, Brest, France (2010).
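
    The calculus such uncertain ontological instances ultimately feed into is Dempster's rule of combination. Below is a minimal sketch, assuming mass functions are represented as plain Python dictionaries keyed by frozensets; it illustrates the rule itself, not the paper's ontology or Java tool.

```python
def combine_dempster(m1, m2):
    """Combine two basic belief assignments with Dempster's rule."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    # Normalise the remaining mass by 1 - K (the non-conflicting proportion).
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Hypothetical example: two sources reporting on an object's class.
frame = frozenset({"cargo", "tanker"})
m_radar = {frozenset({"cargo"}): 0.6, frame: 0.4}
m_ais = {frozenset({"cargo"}): 0.3, frozenset({"tanker"}): 0.5, frame: 0.2}
print(combine_dempster(m_radar, m_ais))
```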

    Security of Cyber-Physical Systems in the Presence of Transient Sensor Faults

    This paper is concerned with the security of modern Cyber-Physical Systems in the presence of transient sensor faults. We consider a system with multiple sensors measuring the same physical variable, where each sensor provides an interval containing all possible values of the true state. We note that some sensors might output faulty readings and others may be controlled by a malicious attacker. Unlike previous works, in this paper we aim to distinguish between faults and attacks and develop an attack detection algorithm for the latter only. To do this, we note that there are two kinds of faults – transient and permanent; the former are benign and short-lived, whereas the latter may have dangerous consequences on system performance. We argue that sensors have an underlying transient fault model that quantifies the amount of time in which transient faults can occur. In addition, we provide a framework for developing such a model if it is not provided by manufacturers. Attacks can manifest as either transient or permanent faults depending on the attacker’s goal. We provide different techniques for handling each kind. For the former, we analyze the worst-case performance of sensor fusion over time given each sensor’s transient fault model and develop a filtered fusion interval that is guaranteed to contain the true value and is bounded in size. To deal with attacks that do not comply with sensors’ transient fault models, we propose a sound attack detection algorithm based on pairwise inconsistencies between sensor measurements. Finally, we provide a real-data case study on an unmanned ground vehicle to evaluate the various aspects of this work.
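
    A minimal sketch of the two ideas summarised above, under simplifying assumptions: interval readings from correct sensors are fused by intersection, and two sensors whose intervals do not overlap are pairwise inconsistent. The helper names and the example readings are invented for illustration; this is not the paper's filtered fusion interval or its sound detection algorithm.

```python
from itertools import combinations

def fuse_intervals(intervals):
    """Intersect (lo, hi) intervals; valid only if every sensor is correct."""
    lo = max(i[0] for i in intervals)
    hi = min(i[1] for i in intervals)
    if lo > hi:
        raise ValueError("intervals are disjoint; fusion is undefined")
    return (lo, hi)

def pairwise_inconsistencies(intervals):
    """Return index pairs of sensors whose intervals cannot both be correct."""
    return [(i, j)
            for (i, a), (j, b) in combinations(enumerate(intervals), 2)
            if a[1] < b[0] or b[1] < a[0]]

readings = [(9.8, 10.4), (9.9, 10.2), (11.0, 11.5)]  # third sensor suspicious
print(pairwise_inconsistencies(readings))  # [(0, 2), (1, 2)]
print(fuse_intervals(readings[:2]))        # (9.9, 10.2)
```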

    Multi-sensor multi-target tracking using domain knowledge and clustering

    This paper proposes a novel joint multi-target tracking and track maintenance algorithm over a sensor network. Each sensor runs a local joint probabilistic data association (JPDA) filter using only its own measurements. Unlike the original JPDA approach, the proposed local filter utilises the detection amplitude as domain knowledge to improve the estimation accuracy. In the fusion stage, DBSCAN clustering in conjunction with a statistical test is proposed to group all local tracks into several clusters. Each generated cluster represents the local tracks that originate from the same target source, and the global estimate for each cluster is obtained by the generalized covariance intersection (GCI) algorithm. Extensive simulation results clearly confirm the effectiveness of the proposed multi-sensor multi-target tracking algorithm.
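
    The fusion stage can be pictured with the short sketch below: local track estimates are grouped by DBSCAN and tracks inside a cluster are fused. For brevity the sketch uses plain covariance intersection with a fixed weight (the Gaussian special case of GCI) and omits the amplitude feature and the statistical test described in the paper; all numbers are illustrative.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def covariance_intersection(x1, P1, x2, P2, w=0.5):
    """Fuse two estimates with unknown cross-correlation (fixed weight w)."""
    P1_inv, P2_inv = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(w * P1_inv + (1.0 - w) * P2_inv)
    x = P @ (w * P1_inv @ x1 + (1.0 - w) * P2_inv @ x2)
    return x, P

# Local track positions reported by different sensors (two true targets).
tracks = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [5.2, 4.9]])
labels = DBSCAN(eps=1.0, min_samples=1).fit_predict(tracks)
print(labels)  # tracks 0,1 form one cluster and tracks 2,3 another

P = 0.5 * np.eye(2)  # assumed local covariance for the example
x_fused, P_fused = covariance_intersection(tracks[0], P, tracks[1], P)
print(x_fused, P_fused)
```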

    Distributed Random Set Theoretic Soft/Hard Data Fusion

    Research on multisensor data fusion aims at providing the enabling technology to combine information from several sources in order to form a unified picture. The literature on fusion of conventional data provided by non-human (hard) sensors is vast and well established. In comparison to conventional fusion systems, where input data are generated by calibrated electronic sensor systems with well-defined characteristics, research on soft data fusion considers combining human-based data expressed preferably in unconstrained natural language form. Fusion of soft and hard data is even more challenging, yet necessary in some applications, and has received little attention in the past. Being a rather new area of research, soft/hard data fusion is still at a fledgling stage, with even its challenging problems yet to be adequately defined and explored. This dissertation develops a framework to enable fusion of both soft and hard data with Random Set (RS) theory as the underlying mathematical foundation. Random set theory is an emerging theory within the data fusion community that, due to its powerful representational and computational capabilities, is gaining more and more attention among data fusion researchers. Motivated by the unique characteristics of random set theory and the main challenge of soft/hard data fusion systems, i.e. the need for a unifying framework capable of processing both unconventional soft data and conventional hard data, this dissertation argues in favor of a random set theoretic approach as the first step towards realizing a soft/hard data fusion framework. Several challenging problems related to soft/hard fusion systems are addressed in the proposed framework. First, an extension of the well-known Kalman filter within random set theory, called the Kalman evidential filter (KEF), is adopted as a common data processing framework for both soft and hard data. Second, a novel ontology (syntax + semantics) is developed to allow for modeling soft (human-generated) data, assuming target tracking as the application. Third, as soft/hard data fusion is mostly aimed at large networks of information processing, a new approach is proposed to enable distributed estimation of soft as well as hard data, addressing the scalability requirement of such fusion systems. Fourth, a method for modeling trust in the human agents is developed, which enables the fusion system to protect itself from erroneous/misleading soft data by discounting such data on the fly. Fifth, leveraging recent developments in the RS theoretic data fusion literature, a novel soft data association algorithm is developed and deployed to extend the proposed target tracking framework to the multi-target tracking case. Finally, the multi-target tracking framework is complemented by a distributed classification approach applicable to target classes described with soft, human-generated data. In addition, this dissertation presents a novel data-centric taxonomy of data fusion methodologies. In particular, several categories of fusion algorithms are identified and discussed based on the data-related challenging aspect(s) they address. The taxonomy is intended to provide the reader with a generic and comprehensive view of the contemporary data fusion literature, and could also serve as a reference for data fusion practitioners by providing conducive design guidelines, in terms of algorithm choice, regarding the specific data-related challenges expected in a given application.
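
    One ingredient mentioned above, trust-based discounting of soft data, can be illustrated with classical Shafer discounting of a mass function: each focal mass is scaled by a reliability factor and the remainder is moved to total ignorance, so a low-trust human report carries little weight. The frame and report below are invented for illustration; this is not the dissertation's KEF-based machinery.

```python
def discount(mass, frame, alpha):
    """Discount a mass function by reliability alpha in [0, 1]."""
    discounted = {subset: alpha * value for subset, value in mass.items()}
    # Mass removed from the focal elements is reassigned to the whole frame.
    discounted[frame] = discounted.get(frame, 0.0) + (1.0 - alpha)
    return discounted

frame = frozenset({"friendly", "hostile", "neutral"})
soft_report = {frozenset({"hostile"}): 0.7, frame: 0.3}  # human-generated claim
print(discount(soft_report, frame, alpha=0.4))
# {hostile}: 0.28, whole frame: 0.72 -- a low-trust source is nearly ignored
```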

    Combination of Evidence in Dempster-Shafer Theory
