A Dual Measure of Uncertainty: The Deng Extropy
The extropy has recently been introduced as the dual concept of entropy. Moreover, in the context of the Dempster–Shafer evidence theory, Deng studied a new measure of discrimination, named the Deng entropy. In this paper, we define the Deng extropy, study its relation to the Deng entropy, and propose examples to compare the two. The behaviour of the Deng extropy is studied under changes of focal elements. A characterization result is given for the maximum Deng extropy and, finally, a numerical example in pattern recognition is discussed in order to highlight the relevance of the new measure.
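The Deng entropy that this measure dualizes has a standard closed form over a basic probability assignment (BPA): each focal element A with mass m(A) contributes -m(A) log2(m(A) / (2^|A| - 1)). The sketch below computes only this entropy side (definitions of the Deng extropy itself vary across papers, so it is not reproduced here); the BPAs are illustrative examples, not taken from the paper.

```python
import math

def deng_entropy(bpa):
    """Deng entropy of a basic probability assignment (BPA).

    bpa maps each focal element (a frozenset) to its mass m(A);
    E_d(m) = -sum_A m(A) * log2(m(A) / (2**|A| - 1)).
    """
    return -sum(
        m * math.log2(m / (2 ** len(A) - 1))
        for A, m in bpa.items() if m > 0
    )

# On singleton focal elements, Deng entropy reduces to Shannon entropy:
bayesian = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.5}
print(deng_entropy(bayesian))        # 1.0 bit

# Mass on a compound focal element raises the measured uncertainty:
vacuous = {frozenset({"a", "b"}): 1.0}
print(deng_entropy(vacuous))         # log2(3) ≈ 1.585 bits
```

The compound-set case shows why changes of focal elements matter: moving mass from singletons to larger sets increases the measure even though the total mass is unchanged.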
On extropy of past lifetime distribution
Recently, Qiu et al. (2017) introduced the residual extropy as a measure of uncertainty in residual lifetime distributions, analogous to the residual entropy (1996), and obtained some of its properties and applications. In this paper, we study the extropy as a measure of uncertainty in a past lifetime distribution; this measure is called the past extropy. A characterization result is also shown for the past extropy of the largest order statistic.
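Assuming the past extropy takes the form dual to the residual extropy of Qiu et al., i.e. J̄(X; t) = -(1 / (2 F(t)^2)) ∫_0^t f(x)^2 dx (an assumed definition, stated here only for illustration), it can be approximated numerically:

```python
import math

def past_extropy(pdf, cdf, t, n=100_000):
    """Approximate the (assumed) past extropy
    Jbar(X; t) = -(1 / (2 F(t)^2)) * integral_0^t f(x)^2 dx
    via a midpoint Riemann sum on [0, t]."""
    h = t / n
    integral = sum(pdf((i + 0.5) * h) ** 2 for i in range(n)) * h
    return -integral / (2 * cdf(t) ** 2)

# Uniform(0, 1): under this definition the past extropy is -1/(2t).
f = lambda x: 1.0
F = lambda x: x
print(past_extropy(f, F, 0.5))   # ≈ -1.0
```

The uniform case is a convenient sanity check because the integral is exact even under the crude quadrature.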
Using Landmarks for Explaining Entity Matching Models
The state-of-the-art approaches for performing Entity Matching (EM) rely on machine and deep learning models for inferring pairs of matching/non-matching entities. Although experimental evaluations demonstrate that these approaches are effective, their adoption in real scenarios is limited by the fact that they are difficult to interpret. Explainable AI systems have recently been proposed to complement deep learning approaches. Their application to the EM scenario is still new and requires addressing the specificity of this task, characterized by particular dataset schemas, describing a pair of entities, and imbalanced classes.
This paper introduces Landmark Explanation, a generic and extensible framework that extends the capabilities of a post-hoc perturbation-based explainer to the EM scenario. Landmark Explanation generates perturbations that take advantage of the particular schemas of EM datasets, thus producing explanations that are more accurate and more interesting to users than those generated by competing approaches.
On Cumulative Entropies in Terms of Moments of Order Statistics
In this paper, relations between some kinds of cumulative entropies and moments of order statistics are established. By using some characterizations and the symmetry of a non-negative and absolutely continuous random variable X, lower and upper bounds for the entropies are obtained and illustrative examples are given. Through the relations with the moments of order statistics, a method is shown to compute an estimate of cumulative entropies, and an application to testing whether data are exponentially distributed is outlined.
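One standard way to estimate the cumulative entropy CE(X) = -∫ F(x) log F(x) dx from data is to plug in the empirical CDF over the ordered sample, giving CE_n = -Σ_{i=1}^{n-1} (i/n) log(i/n) (x_(i+1) - x_(i)). This sketch shows that estimator; it is a common construction in the cumulative-entropy literature and is not necessarily the exact method of the paper.

```python
import math

def empirical_cumulative_entropy(sample):
    """Plug-in estimate of the cumulative entropy:
    CE_n = -sum_{i=1}^{n-1} (i/n) * log(i/n) * (x_(i+1) - x_(i)),
    where x_(1) <= ... <= x_(n) is the ordered sample."""
    xs = sorted(sample)
    n = len(xs)
    return -sum(
        (i / n) * math.log(i / n) * (xs[i] - xs[i - 1])
        for i in range(1, n)
    )

print(empirical_cumulative_entropy([0.0, 1.0]))   # (1/2) ln 2 ≈ 0.3466
```

Because each term weights a spacing x_(i+1) - x_(i) by a function of the empirical CDF, the estimate depends on the sample only through its order statistics, which is the connection the paper exploits.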
On Tsallis extropy with an application to pattern recognition
Recently, a new measure of information called extropy has been introduced by
Lad, Sanfilippo and Agr\`o as the dual version of Shannon entropy. In the
literature, Tsallis introduced a measure for a discrete random variable, named
Tsallis entropy, as a generalization of Boltzmann-Gibbs statistics. In this
work, a new measure of discrimination, called Tsallis extropy, is introduced
and some of its properties are then discussed. The relation between Tsallis
extropy and entropy is given and some bounds are also presented. Finally, an
application of this extropy to pattern recognition is demonstrated
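The discrete extropy of Lad, Sanfilippo and Agrò is J(p) = -Σ_i (1 - p_i) ln(1 - p_i). A Tsallis-style generalization of it can be written as JS_α(p) = ((n - 1) - Σ_i (1 - p_i)^α) / (α - 1), which recovers J(p) in the limit α → 1; this α-form is an assumed sketch of the construction, not a quotation of the paper's definition.

```python
import math

def extropy(p):
    """Discrete extropy of Lad, Sanfilippo and Agrò:
    J(p) = -sum_i (1 - p_i) * ln(1 - p_i)."""
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

def tsallis_extropy(p, alpha):
    """Assumed Tsallis-style extropy:
    JS_a(p) = ((n - 1) - sum_i (1 - p_i)**a) / (a - 1),
    reducing to extropy(p) as a -> 1."""
    n = len(p)
    return ((n - 1) - sum((1 - pi) ** alpha for pi in p)) / (alpha - 1)

p = [0.5, 0.5]
print(extropy(p))                    # ln 2 ≈ 0.6931
print(tsallis_extropy(p, 1.0001))    # ≈ 0.6931, approaching the extropy
```

Taking α near 1 and comparing the two values is a quick numerical check that the generalized form degenerates to the dual of Shannon entropy, mirroring how Tsallis entropy degenerates to Shannon entropy.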