On Cognitive Preferences and the Plausibility of Rule-based Models
It is conventional wisdom in machine learning and data mining that logical
models such as rule sets are more interpretable than other models, and that
among such rule-based models, simpler models are more interpretable than more
complex ones. In this position paper, we question this latter assumption by
focusing on one particular aspect of interpretability, namely the plausibility
of models. Roughly speaking, we equate the plausibility of a model with the
likelihood that a user accepts it as an explanation for a prediction. In
particular, we argue that, all other things being equal, longer explanations
may be more convincing than shorter ones, and that the predominant bias for
shorter models, which is typically necessary for learning powerful
discriminative models, may not be suitable when it comes to user acceptance of
the learned models. To that end, we first recapitulate evidence for and against
this postulate, and then report the results of an evaluation in a
crowd-sourcing study based on about 3,000 judgments. The results do not reveal
a strong preference for simple rules, whereas we can observe a weak preference
for longer rules in some domains. We then relate these results to well-known
cognitive biases such as the conjunction fallacy, the representativeness
heuristic, or the recognition heuristic, and investigate their relation to rule
length and plausibility.
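As a brief aside on the conjunction fallacy invoked above (a textbook fact, not a claim from this paper): the probability of a conjunction can never exceed that of either conjunct, yet human raters often judge the more specific conjunction as more probable, which is one mechanism by which a longer rule body can appear more plausible.

```latex
% Conjunction fallacy: the normative inequality that subjects violate
% when they rate a longer, more specific condition as more probable.
P(A \wedge B) \;\le\; \min\{P(A),\, P(B)\}
```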
Median evidential c-means algorithm and its application to community detection
Median clustering is of great value for partitioning relational data. In this
paper, we propose a new prototype-based clustering method, called Median
Evidential C-Means (MECM), which extends median c-means and median fuzzy
c-means to the theoretical framework of belief functions. The
median variant relaxes the restriction of a metric space embedding for the
objects but constrains the prototypes to be in the original data set. Due to
these properties, MECM could be applied to graph clustering problems. A
community detection scheme for social networks based on MECM is investigated
and the obtained credal partitions of graphs, which are more refined than crisp
and fuzzy ones, enable us to have a better understanding of the graph
structures. An initial prototype-selection scheme based on evidential
semi-centrality is presented to avoid local premature convergence and an
evidential modularity function is defined to choose the optimal number of
communities. Finally, experiments on synthetic and real data sets illustrate
the performance of MECM and show how it differs from other methods.
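A minimal sketch of the median-style prototype update the abstract describes: because prototypes are constrained to be members of the original data set, each update is a discrete argmin over the objects, needing only a pairwise dissimilarity matrix rather than a metric embedding. The membership weights `u` stand in for the credal masses of the actual MECM objective, which is not reproduced here.

```python
import numpy as np

def median_prototype_update(d, u, beta=2.0):
    """One median-style prototype update: for each cluster, pick the
    data object minimizing the membership-weighted sum of dissimilarities.

    d : (n, n) pairwise dissimilarity matrix (no metric embedding needed)
    u : (n, k) membership weights of n objects in k clusters (placeholder
        for the credal masses used by the full MECM objective)
    """
    weights = u ** beta                  # fuzzifier exponent, as in fuzzy c-means
    # cost[i, j] = sum_x weights[x, j] * d[x, i]: cost of object i
    # serving as the prototype of cluster j
    cost = d.T @ weights
    return cost.argmin(axis=0)           # index of the prototype object per cluster

# toy usage: 5 objects, 2 clusters
rng = np.random.default_rng(0)
pts = rng.normal(size=(5, 2))
d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
u = rng.dirichlet(np.ones(2), size=5)
print(median_prototype_update(d, u))
```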
Finding Academic Experts on a MultiSensor Approach using Shannon's Entropy
Expert finding is an information retrieval task concerned with the search for
the most knowledgeable people on a given topic, based on documents
describing people's activities. The task involves taking a user query as input
and returning a list of people sorted by their level of expertise regarding the
user query. This paper introduces a novel approach for combining multiple
estimators of expertise based on a multisensor data fusion framework together
with the Dempster-Shafer theory of evidence and Shannon's entropy. More
specifically, we defined three sensors which detect heterogeneous information
derived from the textual contents, from the graph structure of the citation
patterns for the community of experts, and from profile information about the
academic experts. Given the evidence collected, the sensors may nominate
different candidates as experts and consequently disagree on a final
ranking decision. To deal with these conflicts, we applied the Dempster-Shafer
theory of evidence combined with Shannon's Entropy formula to fuse this
information and come up with a more accurate and reliable final ranking list.
Experiments over two datasets of academic publications from the Computer
Science domain attest to the adequacy of the proposed approach compared with
traditional state-of-the-art approaches. We also ran experiments against
representative supervised state-of-the-art algorithms. Results revealed that
the proposed method achieved performance similar to these supervised
techniques, confirming the capabilities of the proposed framework.
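A minimal sketch of the evidence-fusion step the abstract names, assuming a small frame of candidate experts: each sensor is summarized as a mass function over subsets of candidates, and Dempster's rule combines two sensors while renormalizing away their conflict. The entropy-based discounting of sensors described in the paper is omitted, and the masses below are illustrative placeholders.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions given as
    {frozenset(focal_set): mass} dicts over the same frame."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# toy frame of three candidate experts; masses are placeholders
A, B, C = "alice", "bob", "carol"
text_sensor  = {frozenset({A}): 0.6, frozenset({A, B}): 0.3,
                frozenset({A, B, C}): 0.1}
graph_sensor = {frozenset({B}): 0.5, frozenset({A, B}): 0.4,
                frozenset({A, B, C}): 0.1}
print(dempster_combine(text_sensor, graph_sensor))
```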
Aleatoric and Epistemic Uncertainty in Machine Learning: An Introduction to Concepts and Methods
The notion of uncertainty is of major importance in machine learning and
constitutes a key element of machine learning methodology. In line with the
statistical tradition, uncertainty has long been perceived as almost synonymous
with standard probability and probabilistic predictions. Yet, due to the
steadily increasing relevance of machine learning for practical applications
and related issues such as safety requirements, new problems and challenges
have recently been identified by machine learning scholars, and these problems
may call for new methodological developments. In particular, this includes the
importance of distinguishing between (at least) two different types of
uncertainty, often referred to as aleatoric and epistemic. In this paper, we
provide an introduction to the topic of uncertainty in machine learning as well
as an overview of attempts so far at handling uncertainty in general and
formalizing this distinction in particular.
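One widely used formalization of the aleatoric/epistemic split surveyed in such overviews is the entropy decomposition over an ensemble of predictors; the sketch below is a generic illustration of that decomposition, not this paper's specific proposal.

```python
import numpy as np

def uncertainty_decomposition(probs):
    """probs: (m, c) class probabilities from m ensemble members.
    Returns (total, aleatoric, epistemic) in nats, using
    total = H[mean p], aleatoric = mean H[p_i],
    epistemic = total - aleatoric (the mutual information)."""
    eps = 1e-12
    mean_p = probs.mean(axis=0)
    total = -(mean_p * np.log(mean_p + eps)).sum()
    aleatoric = -(probs * np.log(probs + eps)).sum(axis=1).mean()
    return total, aleatoric, total - aleatoric

# members that agree -> low epistemic; members that disagree -> high epistemic
print(uncertainty_decomposition(np.array([[0.7, 0.3], [0.7, 0.3]])))
print(uncertainty_decomposition(np.array([[0.9, 0.1], [0.1, 0.9]])))
```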
Bayesian interpolation
Although Bayesian analysis has been in use since Laplace, the Bayesian method of model comparison has only recently been developed in depth. In this paper, the Bayesian approach to regularization and model comparison is demonstrated by studying the inference problem of interpolating noisy data. The concepts and methods described are quite general and can be applied to many other data modeling problems. Regularizing constants are set by examining their posterior probability distribution. Alternative regularizers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them. “Occam's razor” is automatically embodied by this process. The way in which Bayes infers the values of regularizing constants and noise levels has an elegant interpretation in terms of the effective number of parameters determined by the data set. This framework is due to Gull and Skilling.
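A minimal sketch of the evidence-based model comparison the abstract describes, under simplifying assumptions not taken from the paper: a linear-in-the-parameters model y = Φw + ε with Gaussian prior w ~ N(0, I/α) and noise ε ~ N(0, σ²I) has a closed-form log marginal likelihood (the "evidence"), which can be compared across basis sets. Larger bases are automatically penalized, which is the Occam's razor effect mentioned above.

```python
import numpy as np

def log_evidence(Phi, y, alpha=1.0, sigma2=0.1):
    """Log marginal likelihood of y under y = Phi @ w + noise, with
    prior w ~ N(0, I/alpha) and noise ~ N(0, sigma2 * I), i.e.
    y ~ N(0, C) with C = sigma2 * I + Phi @ Phi.T / alpha."""
    n = len(y)
    C = sigma2 * np.eye(n) + Phi @ Phi.T / alpha
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (y @ np.linalg.solve(C, y) + logdet + n * np.log(2 * np.pi))

# compare polynomial bases of degree 1 vs. 5 on noisy linear data;
# the evidence should favor the simpler degree-1 basis
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 30)
y = 0.8 * x + 0.1 * rng.normal(size=x.size)
for deg in (1, 5):
    Phi = np.vander(x, deg + 1)
    print(deg, log_evidence(Phi, y))
```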
Uncertainty Management of Intelligent Feature Selection in Wireless Sensor Networks
Wireless sensor networks (WSN) are envisioned to revolutionize the paradigm of monitoring complex real-world systems at a very high resolution. However, the deployment of a large number of unattended sensor nodes in hostile environments, frequent changes of environment dynamics, and severe resource constraints pose uncertainties and limit the potential use of WSN in complex real-world applications. Although uncertainty management in Artificial Intelligence (AI) is well developed and well investigated, its implications in wireless sensor environments are inadequately addressed. This dissertation addresses uncertainty management issues of spatio-temporal patterns generated from sensor data. It provides a framework for characterizing spatio-temporal patterns in WSN. Using rough set theory and temporal reasoning, a novel formalism has been developed to characterize and quantify the uncertainties in predicting spatio-temporal patterns from sensor data. This research also uncovers the trade-off among the uncertainty measures, which can be used to develop a multi-objective optimization model for real-time decision making in sensor data aggregation and sampling.
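A minimal sketch of the rough-set machinery the abstract invokes, with hypothetical sensor readings: given an indiscernibility relation (here, equality of discretized attributes), a target pattern X is bracketed by a lower approximation (equivalence classes certainly inside X) and an upper approximation (classes possibly overlapping X); the gap between them is one way to quantify the uncertainty of a predicted pattern.

```python
from collections import defaultdict

def rough_approximations(objects, attrs, target):
    """Lower/upper approximations of the set `target` under the
    indiscernibility relation induced by the attribute map `attrs`."""
    classes = defaultdict(set)
    for o in objects:
        classes[attrs(o)].add(o)          # partition into equivalence classes
    lower = {o for c in classes.values() if c <= target for o in c}
    upper = {o for c in classes.values() if c & target for o in c}
    return lower, upper

# hypothetical discretized readings: node -> (temperature band, humidity band)
readings = {1: ("hot", "dry"), 2: ("hot", "dry"), 3: ("cool", "wet"),
            4: ("cool", "wet"), 5: ("hot", "wet")}
event_nodes = {1, 5}                       # nodes where the pattern was observed
lo, up = rough_approximations(readings, readings.get, event_nodes)
print(lo, up)                              # accuracy of approximation = |lo| / |up|
```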
Counter-Hypothetical Particle Filters for Single Object Pose Tracking
Particle filtering is a common technique for six-degree-of-freedom (6D) pose
estimation due to its ability to tractably represent belief over object pose.
However, the particle filter is prone to particle deprivation due to the
high-dimensional nature of 6D pose. When particle deprivation occurs, it can
cause mode collapse of the underlying belief distribution during importance
sampling. If the region surrounding the true state suffers from mode collapse,
recovering its belief is challenging since the area is no longer represented in
the probability mass formed by the particles. Previous methods mitigate this
problem by randomizing and resetting particles in the belief distribution, but
determining the frequency of reinvigoration has relied on hand-tuning abstract
heuristics. In this paper, we estimate the necessary reinvigoration rate at
each time step by introducing a Counter-Hypothetical likelihood function, which
is used alongside the standard likelihood. Inspired by the notions of
plausibility and implausibility from Evidential Reasoning, the addition of our
Counter-Hypothetical likelihood function assigns a level of doubt to each
particle. The competing cumulative values of confidence and doubt across the
particle set are used to estimate the level of failure within the filter, in
order to determine the portion of particles to be reinvigorated. We demonstrate
the effectiveness of our method on the rigid body object 6D pose tracking task.
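A minimal sketch of the reinvigoration bookkeeping the abstract describes, with hypothetical likelihood values: alongside each particle's standard likelihood (confidence), a counter-hypothetical likelihood assigns it doubt, and the fraction of particles to reset is driven by cumulative doubt relative to cumulative confidence. The concrete likelihood models are placeholders, not the paper's.

```python
import numpy as np

def reinvigoration_fraction(confidence, doubt):
    """Estimate the portion of particles to reinvigorate from the
    competing cumulative confidence and doubt over the particle set."""
    c, d = confidence.sum(), doubt.sum()
    return d / (c + d)                       # in [0, 1]; more doubt -> reset more

rng = np.random.default_rng(0)
n = 100
confidence = rng.uniform(0.0, 0.2, size=n)  # placeholder standard likelihoods
doubt = rng.uniform(0.0, 1.0, size=n)       # placeholder counter-hypothetical values
frac = reinvigoration_fraction(confidence, doubt)
n_reset = int(frac * n)                      # these particles would be re-drawn
print(frac, n_reset)                         # e.g. uniformly over the pose space
```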
A belief function theory based approach to combining different representation of uncertainty in prognostics
In this work, we consider two prognostic approaches for the prediction of the remaining useful life (RUL) of degrading equipment. The first approach is based on Gaussian Process Regression (GPR) and provides the probability distribution of the equipment RUL; the second approach adopts a Similarity-Based Regression (SBR) method for the RUL prediction and belief function theory for modeling the uncertainty on the prediction. The performance of the two approaches is comparable, and we propose a method for combining their outcomes in an ensemble. The least commitment principle is adopted to transform the RUL probability density function supplied by the GPR method into a belief density function. Then, Dempster's rule is used to aggregate the belief assignments provided by the GPR and the SBR approaches. The ensemble method is applied to the problem of predicting the RUL of filters used to clean the sea water entering the condenser of the boiling water reactor (BWR) in a Swedish nuclear power plant. The results of the ensemble method are shown to be more satisfactory than those provided by the individual GPR and SBR approaches from the point of view of the representation of the uncertainty in the RUL prediction.
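A minimal sketch of the least-commitment step the abstract outlines, over a discretized RUL axis: the GPR probability distribution is turned into the consonant (nested-focal-set) mass function whose pignistic transform recovers it, which is one standard reading of the least commitment principle; the transform actually used in the paper may differ. Bin labels and probabilities are illustrative. The resulting masses could then be aggregated with the SBR belief assignment via Dempster's rule, as in the fusion sketch given earlier.

```python
def consonant_bba(pdf):
    """Least-commitment consonant mass function whose pignistic transform
    equals `pdf` ({rul_bin: probability}, summing to 1): with probabilities
    sorted p(1) >= ... >= p(n), m(A_k) = k * (p(k) - p(k+1)), where A_k is
    the set of the k most probable bins and p(n+1) = 0."""
    items = sorted(pdf.items(), key=lambda kv: kv[1], reverse=True)
    masses = {}
    for k in range(1, len(items) + 1):
        p_k = items[k - 1][1]
        p_next = items[k][1] if k < len(items) else 0.0
        if p_k > p_next:                     # skip zero-mass (tied) levels
            focal = frozenset(b for b, _ in items[:k])
            masses[focal] = k * (p_k - p_next)
    return masses

# illustrative discretized RUL distribution (e.g., from a GPR posterior)
rul_pdf = {"0-10h": 0.1, "10-20h": 0.5, "20-30h": 0.3, "30-40h": 0.1}
print(consonant_bba(rul_pdf))
```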