    Extended incidence calculus and its comparison with related theories

    This thesis presents a comprehensive study of incidence calculus, a probabilistic logic for reasoning under uncertainty which extends two-valued propositional logic to a multiple-valued logic. There are three main contributions in this thesis. First of all, the original incidence calculus is extended considerably in three aspects: (a) the original incidence calculus is generalized; (b) an efficient algorithm for incidence assignment based on generalized incidence calculus is developed; (c) a combination rule is proposed for combining both independent and some dependent pieces of evidence. Extended incidence calculus has the advantages of representing information flexibly and combining multiple sources of evidence. Secondly, a comprehensive comparison between extended incidence calculus and the Dempster-Shafer (DS) theory of evidence is provided. It is proved that extended incidence calculus is equivalent to DS theory in representing evidence and combining independent evidence, but superior to DS theory in combining dependent evidence. Thirdly, the relations between extended incidence calculus and assumption-based truth maintenance systems (ATMSs) are discussed. It is proved that extended incidence calculus is equivalent to the ATMS in calculating labels for nodes. Extended incidence calculus can also be used as a basis for constructing probabilistic ATMSs. The study in this thesis reveals that extended incidence calculus can be regarded as a bridge between numerical and symbolic reasoning mechanisms.
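
    The core mechanism of incidence calculus is easy to state: propositions are assigned sets of possible worlds (incidences) rather than point probabilities, and the probability of a formula is the measure of its incidence set. A minimal Python sketch of this idea follows; the worlds, weights, and propositions are made up for illustration and are not taken from the thesis.

        # Hypothetical worlds, weights, and propositions for illustration;
        # none of this data is taken from the thesis.
        worlds = {"w1": 0.3, "w2": 0.4, "w3": 0.2, "w4": 0.1}
        all_worlds = set(worlds)

        # The incidence i(phi) is the set of worlds in which phi is true.
        incidence = {
            "rain":  {"w1", "w2"},
            "windy": {"w2", "w3"},
        }

        def p(world_set):
            # Probability of a formula = total weight of its incidence set.
            return sum(worlds[w] for w in world_set)

        # Incidences propagate through the connectives set-theoretically.
        i_and = incidence["rain"] & incidence["windy"]  # i(rain AND windy)
        i_or  = incidence["rain"] | incidence["windy"]  # i(rain OR windy)
        i_not = all_worlds - incidence["rain"]          # i(NOT rain)

        print(p(i_and), p(i_or), p(i_not))  # 0.4 0.9 0.3 (up to rounding)

    Because sets rather than point values are propagated, correlations between formulas are preserved exactly; finding incidence sets consistent with given probabilities is the incidence-assignment problem the thesis's algorithm addresses.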

    Using Dempster-Shafer’s evidence theory for query expansion based on freebase knowledge

    Query expansion is generally a useful technique for improving search performance. However, some expanded query terms obtained by traditional statistical methods (e.g., pseudo-relevance feedback) may not be relevant to the user's information need, while some relevant terms may not appear in the feedback documents at all. Recent studies utilize external resources to detect terms that are related to the query and then adopt these terms in query expansion. In this paper, we present a study of the use of Freebase, an open-source general-purpose ontology, as a source for deriving expansion terms. Freebase provides a graph-based model of human knowledge, from which a rich, multi-step structure of instances related to the query concept can be extracted as a complement to traditional statistical approaches to query expansion. We propose a novel method, based on the well-principled Dempster-Shafer (D-S) theory of evidence, to measure the certainty of expansion terms derived from the Freebase structure. The expanded query model is then combined with a state-of-the-art statistical query expansion model, the Relevance Model (RM3). Experiments show that the proposed method achieves significant improvements over RM3.
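
    The measure-and-combine step in such evidential approaches rests on Dempster's rule of combination. A small self-contained sketch follows, with an illustrative frame of expansion terms and invented mass values (the paper's actual frames and masses are not shown here).

        # Dempster's rule of combination over frozenset focal elements;
        # the frame and mass values below are illustrative only.
        def combine(m1, m2):
            combined, conflict = {}, 0.0
            for b, mb in m1.items():
                for c, mc in m2.items():
                    inter = b & c
                    if inter:
                        combined[inter] = combined.get(inter, 0.0) + mb * mc
                    else:
                        conflict += mb * mc  # mass lost to disagreement
            if conflict >= 1.0:
                raise ValueError("totally conflicting evidence")
            return {a: v / (1.0 - conflict) for a, v in combined.items()}

        # Two sources of evidence about candidate expansion terms.
        theta = frozenset({"neural", "network", "graph"})   # the frame
        m_freebase = {frozenset({"neural"}): 0.6, theta: 0.4}
        m_feedback = {frozenset({"neural", "graph"}): 0.5, theta: 0.5}
        print(combine(m_freebase, m_feedback))
        # {'neural'}: 0.6, {'neural','graph'}: 0.2, theta: 0.2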

    An Evidential Fractal Analytic Hierarchy Process Target Recognition Method

    Target recognition in uncertain environments is an active research issue, especially in highly uncertain situations where neither the target attribution nor the sensor reports are clearly represented. To address this issue, a model combining fractal theory, Dempster-Shafer evidence theory, and the analytic hierarchy process (AHP) is proposed to classify objects with incomplete information. The basic probability assignment (BPA), or belief function, can be modelled by a conductivity function. The weight of each BPA is determined by AHP. Finally, the collected data are discounted with the weights. The feasibility and validity of the proposed model are verified by an evidential classifier case in which sensory data are incomplete and collected at multiple levels of granularity. The proposed fusion algorithm provides both efficient modelling and efficient combination of uncertain information.
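
    The discounting step mentioned above is standardly done with Shafer's classical discounting, which scales each mass by a reliability weight and moves the remainder onto the whole frame. A sketch under that assumption follows; the frame, masses, and AHP-derived weight are hypothetical.

        # Shafer's classical discounting: scale each mass by the source's
        # reliability alpha and move the remainder onto the whole frame.
        # The frame, masses, and AHP weight below are hypothetical.
        def discount(m, alpha, theta):
            out = {a: alpha * v for a, v in m.items() if a != theta}
            out[theta] = 1.0 - alpha + alpha * m.get(theta, 0.0)
            return out

        theta = frozenset({"tank", "truck", "car"})
        m_sensor = {frozenset({"tank"}): 0.7, theta: 0.3}
        print(discount(m_sensor, alpha=0.8, theta=theta))
        # {'tank'}: 0.56, theta: 0.44  -- masses still sum to 1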

    Data Fusion for Close‐Range Detection

    Two approaches to combining humanitarian mine detection sensors are described in parallel, one based on belief functions and the other on possibility theory. In a first step, different measures are extracted from the sensor data. After that, based on prior information, mass functions and possibility distributions are derived. The combination of possibility degrees, as well as of masses, is performed in two steps: the first applies to all measures derived from one sensor; the second combines the results obtained in the first step across all sensors used. Combination operators are chosen to account for the different characteristics of the sensors. The combination equations of the two approaches are compared as well, and the selection of decision rules is discussed for both. The approaches are illustrated on a set of real mines and non‐dangerous objects using three sensors: an infrared camera, an imaging metal detector, and a ground‐penetrating radar.
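
    The two uncertainty frameworks combine degrees differently. In possibility theory, the basic conjunctive and disjunctive operators are pointwise min and max; a toy sketch with invented distributions for two of the sensors follows.

        # Pointwise min/max combination of possibility distributions for
        # two sensors; the class labels and degrees are invented.
        classes = ["mine", "friendly_object", "clutter"]
        pi_ir  = {"mine": 1.0, "friendly_object": 0.4, "clutter": 0.2}  # infrared camera
        pi_gpr = {"mine": 0.7, "friendly_object": 1.0, "clutter": 0.3}  # ground-penetrating radar

        conj = {c: min(pi_ir[c], pi_gpr[c]) for c in classes}
        h = max(conj.values())  # degree of agreement between the sources
        conj_norm = {c: v / h for c, v in conj.items()}  # renormalized conjunction
        disj = {c: max(pi_ir[c], pi_gpr[c]) for c in classes}

        print(conj_norm)  # conjunctive: both sources assumed reliable
        print(disj)       # disjunctive: at least one source reliable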

    Improving landslide detection from airborne laser scanning data using optimized Dempster-Shafer

    A detailed, state-of-the-art landslide inventory map with precise landslide locations is greatly needed for landslide susceptibility, hazard, and risk assessments. Traditional techniques for landslide detection in tropical regions include field surveys, synthetic aperture radar, and optical remote sensing. However, these techniques are time-consuming and costly. Furthermore, the dense vegetation of tropical forests complicates the generation of accurate landslide location maps in these regions. Given its ability to penetrate vegetation cover, high-resolution airborne light detection and ranging (LiDAR) is typically employed to generate accurate landslide maps. The object-based technique groups many homogeneous pixels together in a meaningful way through image segmentation. To address the limitations of this approach, this research proposes an efficient framework that combines three object-based classifiers using Dempster-Shafer theory (DST): the final decision is made by the DST rule of combination applied to the probabilistic outputs of object-based support vector machine (SVM), random forest (RF), and K-nearest neighbor (KNN) classifiers. An existing supervised approach (i.e., a fuzzy-based segmentation parameter optimizer) was adopted to optimize multiresolution segmentation parameters such as scale, shape, and compactness. Subsequently, a correlation-based feature selection (CFS) algorithm was employed to select the relevant features. Two study sites were selected to implement and evaluate the proposed landslide detection method (subset "A" for implementation and subset "B" for testing transferability). The DST method performed well in detecting landslide locations in tropical regions such as Malaysia, with potential applications in other similarly vegetated regions.
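
    One simple way to realize a DST combination of probabilistic classifier outputs is to treat each classifier's class posteriors as a Bayesian BPA (all mass on singletons), in which case Dempster's rule reduces to a normalized product. The sketch below works under that assumption, with illustrative numbers; the paper's exact BPA construction may differ.

        # Fusing per-class classifier posteriors as Bayesian BPAs: with all
        # mass on singletons, Dempster's rule reduces to a normalized
        # product. The probabilities below are illustrative.
        def fuse(*posteriors):
            classes = posteriors[0].keys()
            joint = {c: 1.0 for c in classes}
            for post in posteriors:
                for c in classes:
                    joint[c] *= post[c]
            z = sum(joint.values())  # normalization, i.e. 1 - conflict
            return {c: v / z for c, v in joint.items()}

        svm = {"landslide": 0.70, "non_landslide": 0.30}
        rf  = {"landslide": 0.60, "non_landslide": 0.40}
        knn = {"landslide": 0.55, "non_landslide": 0.45}
        print(fuse(svm, rf, knn))  # landslide ~0.81 after fusion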

    Sequential two-player games with ambiguity

    If players' beliefs are strictly non-additive, the Dempster-Shafer updating rule can be used to define beliefs off the equilibrium path. We define an equilibrium concept for sequential two-person games in which players update their beliefs with the Dempster-Shafer updating rule. We show that, in the limit as uncertainty tends to zero, our equilibrium approximates Bayesian Nash equilibrium by imposing context-dependent constraints on beliefs under uncertainty.
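
    For reference, the Dempster-Shafer updating rule has a standard closed form for conditioning a belief function Bel on an observed event B (this is the textbook form, not notation taken from the paper):

        \[
          \operatorname{Bel}(A \mid B)
            = \frac{\operatorname{Bel}(A \cup B^{c}) - \operatorname{Bel}(B^{c})}
                   {1 - \operatorname{Bel}(B^{c})},
          \qquad \operatorname{Bel}(B^{c}) < 1,
        \]
        or equivalently, in terms of the plausibility
        \( \operatorname{Pl}(A) = 1 - \operatorname{Bel}(A^{c}) \):
        \[
          \operatorname{Pl}(A \mid B)
            = \frac{\operatorname{Pl}(A \cap B)}{\operatorname{Pl}(B)}.
        \]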

    Context Exploitation in Data Fusion

    Complex and dynamic environments constitute a challenge for existing tracking algorithms. For this reason, modern solutions try to utilize any available information that could help to constrain, improve, or explain the measurements. So-called context information (CI) is understood as information that surrounds an element of interest, knowledge of which may help in understanding the (estimated) situation and in reacting to it. However, context discovery and exploitation are still largely unexplored research topics. Until now, context has been extensively exploited as a parameter in system and measurement models, which has led to numerous approaches for linear and non-linear constrained estimation and target tracking. More specifically, spatial or static context is the most common source of ambient information, i.e. features, utilized for recursive enhancement of the state variables in either the prediction or the measurement update of the filters. In the case of multiple-model estimators, context can be related not only to the state but also to a certain mode of the filter. Common practice for multiple-model scenarios is to represent states and context as a joint distribution of Gaussian mixtures; these approaches are commonly referred to as joint tracking and classification. Alternatively, the usefulness of context has also been demonstrated in aiding measurement data association. The process of formulating a hypothesis that assigns a particular measurement to a track is traditionally governed by empirical knowledge of the noise characteristics of the sensors and the operating environment, i.e. probability of detection, false alarms, and clutter noise, and can be further enhanced by conditioning on context. We believe that interactions between the environment and the object can be classified into actions, activities, and intents, and formed into structured graphs with contextual links translated into arcs. By learning the environment model, we will be able to predict the target's future actions based on its past observations. The probability of a target's future action could be utilized in the fusion process to adjust the tracker's confidence in measurements. By incorporating contextual knowledge of the environment, in the form of a likelihood function, into the filter's measurement update step, we have been able to reduce the uncertainty of the tracking solution and improve the consistency of the track. The promising results demonstrate that the fusion of CI brings a significant performance improvement in comparison to regular tracking approaches.
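
    The last point, context entering the measurement update as a likelihood factor, is easy to illustrate in a toy particle filter: each particle's weight becomes the product of the usual sensor likelihood and a context likelihood. Everything below (the on-road context model, the numbers) is a made-up example, not the authors' system.

        # Toy particle filter whose measurement update multiplies the usual
        # sensor likelihood by a context likelihood (here, an invented
        # "targets stay on the road segment [0, 10]" prior).
        import math
        import random

        def sensor_likelihood(z, x, sigma=1.0):
            return math.exp(-0.5 * ((z - x) / sigma) ** 2)

        def context_likelihood(x):
            return 1.0 if 0.0 <= x <= 10.0 else 0.05  # off-road is unlikely

        particles = [random.uniform(-5.0, 15.0) for _ in range(1000)]
        z = 4.2  # a position measurement

        weights = [sensor_likelihood(z, x) * context_likelihood(x) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]  # normalized posterior weights

        estimate = sum(w * x for w, x in zip(weights, particles))
        print(estimate)  # pulled toward the on-road region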

    A basic probability assignment methodology for unsupervised wireless intrusion detection

    The broadcast nature of Wireless Local Area Networks (WLANs) has made them prone to several types of wireless injection attacks, such as Man-in-the-Middle (MitM) at the physical layer, deauthentication, and rogue access point attacks. The implementation of novel Intrusion Detection Systems (IDSs) is fundamental to providing stronger protection against these wireless injection attacks. Because most attacks manifest themselves through different metrics, current IDSs should leverage a cross-layer approach to help improve detection accuracy. Data fusion based on Dempster-Shafer (D-S) theory has been proven to be an efficient technique for implementing the cross-layer metric approach. However, the dynamic generation of the Basic Probability Assignment (BPA) values used by D-S is still an open research problem. In this paper, we propose a novel unsupervised methodology to dynamically generate the BPA values, based on the Gaussian and exponential probability density functions (pdf), the categorical probability mass function (pmf), and the local reachability density (lrd). D-S is then used to fuse the BPA values to classify whether a Wi-Fi frame is normal (i.e. non-malicious) or malicious. The proposed methodology achieves a 100% True Positive Rate (TPR) and a 4.23% False Positive Rate (FPR) for the MitM attack, and a 100% TPR and a 2.44% FPR for the deauthentication attack, confirming the effectiveness of the dynamic BPA generation methodology.
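
    The paper's exact BPA mapping is not reproduced here, but the general idea, deriving masses for {normal}, {malicious}, and the ignorance set from how well a frame's metric fits a fitted pdf of normal traffic, can be sketched as follows. The Gaussian profile, the fixed ignorance mass, and the example metric are all assumptions for illustration.

        # One illustrative mapping from a frame metric to BPA values over
        # {normal}, {malicious}, and the ignorance set; the Gaussian
        # profile, fixed ignorance mass, and example metric are assumptions.
        import math

        def gaussian_pdf(x, mu, sigma):
            return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

        def bpa_from_metric(x, mu, sigma, m_ignorance=0.2):
            # Normalized fit of the observation to the "normal" profile.
            fit = gaussian_pdf(x, mu, sigma) / gaussian_pdf(mu, mu, sigma)
            m_normal = (1.0 - m_ignorance) * fit
            m_malicious = 1.0 - m_ignorance - m_normal
            return {frozenset({"normal"}): m_normal,
                    frozenset({"malicious"}): m_malicious,
                    frozenset({"normal", "malicious"}): m_ignorance}

        # e.g. a frame's signal strength against the learned normal profile
        print(bpa_from_metric(-62.0, mu=-55.0, sigma=4.0))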

    Integrating Degradation Forecasting and Abatement Framework Into Advanced Distribution Management System

    Future distribution grids are expected to face increasing penetration of heterogeneous distributed energy resources (DERs) and electric vehicles (EVs). This change will pose challenges to the control and management of distribution grids because of the variability of renewable energy resources and EV charging. In addition, multiple DERs dispersed over networks can challenge grid operation and maintenance, as DERs at various locations need to be monitored and managed. At the same time, customers will not accept reductions in power quality, reliability, economy, safety, or security. To enhance the effectiveness of grid control and management, future grids will be given more autonomy in the form of advanced distribution management systems (ADMS). Energy management (EM) is one of the main constituents of ADMS for enhancing system efficiency. EM typically considers only fuel consumption costs. However, grid components degrade over time, and this degradation adds to the system's operation cost. Knowing the degradation behavior of grid components makes it possible to control them in ways that slow their degradation and thereby reduce the total operation cost. In addition, to maintain the highest system reliability, degradation models should be developed along with appropriate decision-making strategies that allow information about components' status to be integrated with the ADMS. This dissertation proposes a framework that integrates a degradation forecasting (DF) layer into the ADMS to abate component degradation, reduce the total operation cost, and enhance system reliability. The DF layer collaborates with EM to find a solution that balances fuel consumption costs and degradation costs.
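
    The trade-off the framework targets can be pictured with a toy dispatch problem: once a forecast degradation cost term is added to the fuel cost, the optimizer prefers operating points that spare the components. The cost models and numbers below are purely illustrative assumptions, not the dissertation's models.

        # Toy dispatch over two identical units: adding a forecast
        # degradation cost to the fuel cost changes the optimal loading.
        # The cost models and numbers are purely illustrative.
        def fuel_cost(p):  # $/h for output p (kW); linear, so any split ties
            return 0.08 * p

        def degradation_cost(p, rated=100.0):
            # Hypothetical wear model: grows quadratically with loading.
            return 2.0 * (p / rated) ** 2

        demand = 150.0  # kW to be split across the two units
        best = min(
            ((p, demand - p) for p in range(0, 151)),
            key=lambda split: sum(fuel_cost(p) + degradation_cost(p) for p in split),
        )
        print("dispatch:", best)  # (75, 75.0): balanced loading minimizes wear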