Axiomatic Characterization of Data-Driven Influence Measures for Classification
We study the following problem: given a labeled dataset and a specific
datapoint x, how did the i-th feature influence the classification for x? We
identify a family of numerical influence measures - functions that, given a
datapoint x, assign a numeric value phi_i(x) to every feature i, corresponding
to how altering i's value would influence the outcome for x. This family, which
we term monotone influence measures (MIM), is uniquely derived from a set of
desirable properties, or axioms. The MIM family constitutes a provably sound
methodology for measuring feature influence in classification domains; the
values generated by MIM are based on the dataset alone, and do not make any
queries to the classifier. While this requirement naturally limits the scope of
our framework, we demonstrate its effectiveness on data
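To make the notion of a data-driven influence measure concrete, here is a minimal, purely illustrative sketch: a score phi_i(x) computed from the labeled dataset alone, with no queries to the classifier. The specific scoring rule below (label agreement among points sharing vs. differing in feature i's value) is an invented stand-in, not the MIM family characterized in the paper.

```python
# Illustrative, hypothetical influence score computed from the dataset
# alone (no classifier queries). This is NOT the paper's MIM family,
# just a toy example of the data-driven setting it studies.

def influence(dataset, x, y, i):
    """Score feature i for a point (x, y): how much more often do
    points sharing x's value on feature i carry x's label, compared
    to points with a different value on feature i?"""
    same = [label for feats, label in dataset if feats[i] == x[i]]
    diff = [label for feats, label in dataset if feats[i] != x[i]]
    agree = lambda ls: sum(l == y for l in ls) / len(ls) if ls else 0.5
    return agree(same) - agree(diff)

# Tiny binary dataset: feature 0 determines the label, feature 1 is noise.
data = [((0, 1), 1), ((0, 0), 1), ((1, 1), 0), ((1, 0), 0)]
x, y = (0, 1), 1
print(influence(data, x, y, 0))  # 1.0 -> feature 0 fully explains the label
print(influence(data, x, y, 1))  # 0.0 -> feature 1 is uninformative
```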
How to choose the most appropriate centrality measure?
We propose a new method to select the most appropriate network centrality
measure based on the user's opinion on how such a measure should work on a set
of simple graphs. The method consists of: (1) forming a set of
candidate measures; (2) generating a sequence of sufficiently simple graphs
that distinguish all candidate measures on some pairs of nodes; (3) compiling
a survey with questions on comparing the centrality of test nodes; (4)
completing this survey, which provides a centrality measure consistent with all
user responses. The developed algorithms make it possible to implement this
approach for any finite set of measures. This paper presents its
realization for a set of 40 centrality measures. The proposed method, called
culling, can be used for rapid analysis or combined with a normative approach by
compiling a survey on the subset of measures that satisfy certain normative
conditions (axioms). In the present study, the latter was done for the subsets
determined by the Self-consistency or Bridge axioms.
Comment: 26 pages, 1 table, 1 algorithm, 8 figures
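The four-step procedure above can be sketched in miniature. The sketch below uses two stand-in measures (degree and closeness) instead of the paper's set of 40, an adjacency-dict path graph as the "sufficiently simple" test graph, and an invented survey answer; the culling step keeps only the candidates consistent with every response.

```python
# Hedged sketch of survey-based culling of centrality measures.
# Measures, graphs, and the survey answer are illustrative inventions.

def degree(g, v):
    return len(g[v])

def closeness(g, v):
    # Sum BFS distances from v, then return (n - 1) / total distance.
    frontier, seen, dist, total = {v}, {v}, 0, 0
    while frontier:
        dist += 1
        frontier = {u for w in frontier for u in g[w]} - seen
        seen |= frontier
        total += dist * len(frontier)
    return (len(g) - 1) / total

measures = {"degree": degree, "closeness": closeness}

# Path graph 0-1-2-3-4: simple, and the two measures disagree on it.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}

# One survey response: the user judges node 2 strictly more central
# than node 1 in the path graph.
answers = [(path, 2, 1, ">")]

def sign(x):
    return ">" if x > 0 else "<" if x < 0 else "="

# Cull every candidate measure inconsistent with some user answer.
surviving = {name: m for name, m in measures.items()
             if all(sign(m(g, u) - m(g, v)) == rel for g, u, v, rel in answers)}
print(sorted(surviving))  # degree ties nodes 1 and 2, so only closeness survives
```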
Appraising Diversity with an Ordinal Notion of Similarity: An Axiomatic Approach
This paper provides an axiomatic characterization of two rules for comparing alternative sets of objects on the basis of the diversity that they offer. The framework considered assumes a finite universe of objects and an a priori given ordinal quaternary relation that compares alternative pairs of objects on the basis of their ordinal dissimilarity. Very few properties of this quaternary relation are assumed (besides completeness, transitivity and a very natural form of symmetry). The two rules that we characterize are the maxi-max criterion and the lexi-max criterion. The maxi-max criterion considers that a set is more diverse than another if and only if the two objects that are the most dissimilar in the former are weakly as dissimilar as the two most dissimilar objects in the latter. The lexi-max criterion is defined as usual as the lexicographic extension of the maxi-max criterion. Some connections with the broader issue of measuring freedom of choice are also provided.
Keywords: Diversity, Measurement, Axioms, Freedom of choice
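The two criteria are concrete enough to sketch. Below, a numeric dissimilarity stands in for the paper's purely ordinal quaternary relation (only the ordering of its values is used), and Python's built-in lexicographic list comparison stands in for the lexi-max extension; both substitutions are simplifications for illustration, and the sets compared are assumed to have at least two elements.

```python
# Toy maxi-max and lexi-max comparisons. The numeric d() is an
# illustrative stand-in for the paper's ordinal dissimilarity relation.
from itertools import combinations

def d(a, b):
    return abs(a - b)

def dissimilarities(s):
    # All pairwise dissimilarities of a set, largest first.
    return sorted((d(a, b) for a, b in combinations(s, 2)), reverse=True)

def maximax_at_least(s1, s2):
    # s1 weakly as diverse as s2: its most dissimilar pair is weakly
    # as dissimilar as s2's most dissimilar pair.
    return dissimilarities(s1)[0] >= dissimilarities(s2)[0]

def leximax_at_least(s1, s2):
    # Lexicographic extension of maxi-max (Python list comparison).
    return dissimilarities(s1) >= dissimilarities(s2)

A, B = {0, 10}, {0, 4, 8}
print(maximax_at_least(A, B))  # True: A's extreme pair (10) beats B's (8)
print(leximax_at_least(B, A))  # False: [8, 4, 4] loses to [10] lexicographically
```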
Geospatial Narratives and their Spatio-Temporal Dynamics: Commonsense Reasoning for High-level Analyses in Geographic Information Systems
The modelling, analysis, and visualisation of dynamic geospatial phenomena
has been identified as a key developmental challenge for next-generation
Geographic Information Systems (GIS). In this context, the envisaged
paradigmatic extensions to contemporary foundational GIS technology raise
fundamental questions concerning the ontological, formal representational, and
(analytical) computational methods that would underlie their spatial
information theoretic underpinnings.
We present the conceptual overview and architecture for the development of
high-level semantic and qualitative analytical capabilities for dynamic
geospatial domains. Building on formal methods in the areas of commonsense
reasoning, qualitative reasoning, spatial and temporal representation and
reasoning, reasoning about actions and change, and computational models of
narrative, we identify concrete theoretical and practical challenges that
accrue in the context of formal reasoning about `space, events, actions, and
change'. With this as a basis, and within the backdrop of an illustrated
scenario involving the spatio-temporal dynamics of urban narratives, we address
specific problems and solution techniques chiefly involving `qualitative
abstraction', `data integration and spatial consistency', and `practical
geospatial abduction'. From a broad topical viewpoint, we propose that
next-generation dynamic GIS technology demands a transdisciplinary scientific
perspective that brings together Geography, Artificial Intelligence, and
Cognitive Science.
Keywords: artificial intelligence; cognitive systems; human-computer
interaction; geographic information systems; spatio-temporal dynamics;
computational models of narrative; geospatial analysis; geospatial modelling;
ontology; qualitative spatial modelling and reasoning; spatial assistance
systems
Comment: ISPRS International Journal of Geo-Information (ISSN 2220-9964);
Special Issue on: Geospatial Monitoring and Modelling of Environmental
Change. IJGI. Editor: Duccio Rocchini. (pre-print of article in press)
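As one small, invented illustration of the `qualitative abstraction' step discussed in this abstract, the sketch below maps numeric trajectories of two moving objects to qualitative motion relations that a high-level reasoner could consume; the function, the relation vocabulary, and the tracks are assumptions, not constructs from the paper.

```python
# Illustrative qualitative abstraction: turn numeric coordinate tracks
# into qualitative spatio-temporal relations (approaching/receding/stable).
from math import dist

def qualitative_motion(track_a, track_b):
    """Label each time step with how the distance between two moving
    objects changes: 'approaching', 'receding', or 'stable'."""
    gaps = [dist(a, b) for a, b in zip(track_a, track_b)]
    labels = []
    for before, after in zip(gaps, gaps[1:]):
        if after < before:
            labels.append("approaching")
        elif after > before:
            labels.append("receding")
        else:
            labels.append("stable")
    return labels

bus = [(0, 0), (1, 0), (2, 0), (3, 0)]   # moving east along y = 0
stop = [(3, 0), (3, 0), (3, 0), (3, 0)]  # stationary landmark
print(qualitative_motion(bus, stop))  # ['approaching', 'approaching', 'approaching']
```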
Theory of Effectiveness Measurement
Effectiveness measures provide decision makers feedback on the impact of deliberate actions and affect critical issues such as the allocation of scarce resources, as well as whether to maintain or change existing strategy. Currently, however, there is no formal foundation for formulating effectiveness measures. This research presents a new framework for effectiveness measurement from both theoretical and practical viewpoints. First, accepted effects-based principles, as well as fundamental measurement concepts, are combined into a general, domain-independent effectiveness measurement methodology. This is accomplished by defining effectiveness measurement as the difference, or conceptual distance, from a given system state to some reference system state (e.g. a desired end-state). Then, by developing system attribute measures such that they yield a system state-space that can be characterized as a metric space, differences in system states relative to the reference state can be gauged over time, yielding a generalized, axiomatic definition of effectiveness measurement. The effectiveness measurement framework is then extended to mitigate the influence of measurement error and uncertainty by employing Kalman filtering techniques. Finally, the pragmatic nature of the approach is illustrated by measuring the effectiveness of a notional security force response strategy in a scenario involving a terrorist attack on a United States Air Force base.
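The core idea, effectiveness as distance in a metric state-space from the current state to a desired end-state, tracked over time, can be sketched directly. The state attributes, weights-free Euclidean metric, and trajectory below are invented for illustration; the paper's framework is more general (any metric space) and adds Kalman filtering against measurement noise.

```python
# Hedged sketch: effectiveness as the metric-space gap between the
# measured system state and the desired end-state. Attributes and
# numbers are illustrative inventions.
from math import sqrt

def gap(state, end_state):
    """Euclidean distance to the desired end-state; a shrinking gap
    over time indicates an effective strategy."""
    return sqrt(sum((s - e) ** 2 for s, e in zip(state, end_state)))

end_state = (0.0, 0.0)  # e.g. (threat level, response shortfall), both driven to 0
trajectory = [(4.0, 3.0), (2.0, 2.0), (1.0, 0.0)]  # measured states over time

gaps = [gap(s, end_state) for s in trajectory]
print(gaps)  # [5.0, 2.828..., 1.0] -> the strategy is closing the gap
```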
Data-Informed Calibration and Aggregation of Expert Judgment in a Bayesian Framework
Historically, decision-makers have used expert opinion to compensate for a lack of data. Expert opinion, however, is applied with much caution, because judgment is subjective and contains estimation error with some degree of uncertainty. The purpose of this study is to quantify the uncertainty surrounding the unknown of interest, given an expert opinion, in order to reduce the error of the estimate. This task is carried out by data-informed calibration and aggregation of expert opinion in a Bayesian framework. Additionally, this study evaluates the impact of the number of experts on the accuracy of the aggregated estimate. The objective is to determine the correlation between the number of experts and the accuracy of the combined estimate in order to recommend an expert panel size.
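One standard way to aggregate expert opinions in a Bayesian framework is precision-weighted normal updating, sketched below; the paper's actual calibration model may differ, and the prior, expert estimates, and variances here are invented. The posterior variance shrinks as experts are added, which is one way to study the panel-size question the abstract raises.

```python
# Hedged sketch: Bayesian aggregation of expert estimates under a
# conjugate normal model. Each expert's (mean, variance) is treated as
# an independent noisy observation of the unknown quantity.

def aggregate(prior, experts):
    """prior and each expert are (mean, variance) pairs; returns the
    posterior (mean, variance) via precision weighting."""
    mean, var = prior
    precision = 1.0 / var
    weighted = mean * precision
    for m, v in experts:
        precision += 1.0 / v       # each expert adds precision
        weighted += m / v
    return weighted / precision, 1.0 / precision

# Vague prior plus three experts of varying stated confidence.
post_mean, post_var = aggregate(prior=(10.0, 25.0),
                                experts=[(12.0, 4.0), (11.0, 4.0), (13.0, 9.0)])
print(post_mean, post_var)  # posterior variance is below every expert's own
```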