
    Delta Divergence: A Novel Decision Cognizant Measure of Classifier Incongruence

    In pattern recognition, disagreement between two classifiers regarding the predicted class membership of an observation can be indicative of an anomaly and its nuance. Since, in general, classifiers base their decisions on class a posteriori probabilities, the most natural approach to detecting classifier incongruence is to use a divergence. However, existing divergences are not particularly suitable for gauging classifier incongruence. In this paper, we postulate the properties that such a divergence measure should satisfy and propose a novel divergence measure, referred to as delta divergence. In contrast to existing measures, it focuses on the dominant (most probable) hypotheses and thus reduces the effect of the probability mass distributed over the non-dominant hypotheses (clutter). The proposed measure satisfies other important properties, such as symmetry and independence of classifier confidence. The relationship of the proposed divergence to several baseline measures, and its superiority over them, is demonstrated experimentally.
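The abstract does not give the definition of delta divergence, but the baseline approach it contrasts against can be illustrated directly: measure classifier incongruence as the Kullback-Leibler divergence between the two classifiers' posterior vectors. The sketch below (assuming hypothetical posterior values) shows the effect the abstract criticizes: clutter mass spread over non-dominant hypotheses can inflate the divergence even when both classifiers agree on the dominant class.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Two classifiers' posteriors over four classes: same dominant class
# (index 0), but different clutter mass over the non-dominant classes.
p = [0.70, 0.10, 0.10, 0.10]
q = [0.70, 0.25, 0.03, 0.02]
congruent = kl_divergence(p, q)      # nonzero despite agreement on the winner

# Now the classifiers disagree on which class dominates:
# a genuine incongruence, which should score much higher.
r = [0.10, 0.70, 0.10, 0.10]
incongruent = kl_divergence(p, r)
```

A decision-cognizant measure such as the proposed delta divergence would, by design, suppress the first quantity while preserving the second.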

    Localization-Based Sensor Validation Using The Kullback-Leibler Divergence

    A sensor validation criterion based on the sensor's object localization accuracy is proposed. Assuming that the true probability distribution of an object or event in space, f(x), is known and a spatial likelihood function (SLF) φ(x) for the same object or event in space is obtained from a sensor, the expected value of the SLF, E[φ(x)], is proposed as a suitable validity metric for the sensor, where the expectation is taken over the distribution f(x). It is shown that for the class of increasing linear log-likelihood SLFs, the proposed validity metric is equivalent to the Kullback-Leibler distance between f(x) and the unknown sensor-based distribution g(x), where the SLF φ(x) is an observable increasing function of the unobservable g(x). The proposed technique is illustrated through several simulated and experimental examples.
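The equivalence claimed in the abstract can be checked numerically for the stated class of SLFs. In the sketch below (with hypothetical grid values and sensors), each SLF is an increasing linear function of the log-likelihood, φ(x) = a·log g(x) + b with a > 0, so E_f[φ] is an affine, decreasing function of the KL divergence D(f‖g): the sensor with the higher expected SLF is exactly the one closer to the truth in KL distance.

```python
import numpy as np

def kl(f, g, eps=1e-12):
    """Kullback-Leibler distance D(f || g) on a discrete grid."""
    return float(np.sum(f * np.log((f + eps) / (g + eps))))

# Discrete spatial grid; f is the known true distribution of the object.
f = np.array([0.05, 0.20, 0.50, 0.20, 0.05])

# Two candidate sensors with unobservable distributions g; only the SLF
# phi = a*log(g) + b (a > 0, increasing in g) is observable.
g_good = np.array([0.06, 0.22, 0.44, 0.22, 0.06])   # well localized
g_bad  = np.array([0.40, 0.30, 0.15, 0.10, 0.05])   # poorly localized

a, b = 2.0, 1.0
E_good = float(np.sum(f * (a * np.log(g_good) + b)))  # expected SLF under f
E_bad  = float(np.sum(f * (a * np.log(g_bad) + b)))
```

Because E_f[a·log g + b] = b − a·(D(f‖g) + H(f)) with H(f) fixed, ranking sensors by expected SLF and ranking them by KL distance to f necessarily agree.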
