Tractable approximate deduction for OWL
Acknowledgements This work has been partially supported by the European project Marrying Ontologies and Software Technologies (EU ICT2008-216691), the European project Knowledge Driven Data Exploitation (EU FP7/IAPP2011-286348), and the UK EPSRC project WhatIf (EP/J014354/1). The authors thank Prof. Ian Horrocks and Dr. Giorgos Stoilos for their helpful discussion on role subsumptions. The authors thank Rafael S. Gonçalves et al. for providing their hotspots ontologies. The authors also thank BoC-group for providing their ADOxx Metamodelling ontologies.
Complementary Lipschitz continuity results for the distribution of intersections or unions of independent random sets in finite discrete spaces
We prove that intersections and unions of independent random sets in finite
spaces achieve a form of Lipschitz continuity. More precisely, given the
distribution of a fixed random set, the function mapping any random set
distribution to the distribution of its intersection (under an independence
assumption) with that fixed set is Lipschitz continuous with unit Lipschitz
constant if the space of random set distributions is endowed with a metric
defined as the norm distance between inclusion functionals, also known as
commonalities. Likewise, the function mapping any random set distribution to
the distribution of its union (under an independence assumption) with that
fixed set is Lipschitz continuous with unit Lipschitz constant if the space of
random set distributions is endowed with a metric defined as the norm distance
between hitting functionals, also known as plausibilities.
Using the epistemic random set interpretation of belief functions, we also
discuss the ability of these distances to yield conflict measures. All the
proofs in this paper are derived in the framework of Dempster-Shafer belief
functions. Aside from the discussion of conflict measures, it is
straightforward to transcribe the proofs into general (not necessarily
epistemic) random set terminology.
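The intersection result can be checked numerically on a tiny frame. The sketch below (our illustration, not code from the paper) uses the sup norm as the distance between commonalities, since the abstract does not name the norm, and relies on the standard fact that intersecting independent random sets multiplies their commonality functions pointwise; the mass functions `m_x`, `m_y`, `m_z` are made-up examples.

```python
from itertools import combinations

def powerset(frame):
    """All subsets of the frame of discernment, as frozensets."""
    return [frozenset(c) for r in range(len(frame) + 1)
            for c in combinations(frame, r)]

def commonality(mass, frame):
    """Inclusion functional (commonality): q(A) = sum of m(B) over all B ⊇ A."""
    return {A: sum(m for B, m in mass.items() if A <= B)
            for A in powerset(frame)}

def d_sup(q1, q2):
    """Sup-norm distance between two commonality functions."""
    return max(abs(q1[A] - q2[A]) for A in q1)

frame = ('a', 'b')
# Three example mass functions on the frame {a, b} (each sums to 1).
m_x = {frozenset('a'): 0.3, frozenset('ab'): 0.7}
m_y = {frozenset('b'): 0.5, frozenset('ab'): 0.5}
m_z = {frozenset('a'): 0.2, frozenset('b'): 0.4, frozenset('ab'): 0.4}

q_x, q_y, q_z = (commonality(m, frame) for m in (m_x, m_y, m_z))

# Independent intersection multiplies commonalities pointwise:
# q_{X∩Y}(A) = q_X(A) * q_Y(A). Since every q_X(A) <= 1, the map
# Y -> X∩Y cannot expand commonality distances (unit Lipschitz constant).
lhs = d_sup({A: q_x[A] * q_y[A] for A in q_x},
            {A: q_x[A] * q_z[A] for A in q_x})
rhs = d_sup(q_y, q_z)
print(lhs <= rhs)  # True
```

Here `lhs` is the distance after intersecting both distributions with the same fixed random set, and `rhs` is the distance before; the contraction holds for any choice of masses.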
Continuous Improvement Through Knowledge-Guided Analysis in Experience Feedback
Continuous improvement of industrial processes is increasingly a key element of competitiveness for industrial systems. The management of experience feedback in this framework is designed to build, analyze and facilitate knowledge sharing among the problem-solving practitioners of an organization in order to improve processes and products. During problem-solving processes, the intellectual investment of experts is often considerable, and the opportunities for exploiting expert knowledge are numerous: decision making, problem solving under uncertainty, and expert configuration. In this paper, our contribution relates to the structuring of a cognitive experience feedback framework, which allows a flexible exploitation of expert knowledge during problem-solving processes and the reuse of such collected experience. To that purpose, the proposed approach uses the general principles of root cause analysis for identifying the root causes of problems or events, the conceptual graphs formalism for the semantic conceptualization of the domain vocabulary, and the Transferable Belief Model for the fusion of information from different sources. The underlying formal reasoning mechanisms (logic-based semantics) in conceptual graphs enable intelligent information retrieval for the effective exploitation of lessons learned from past projects. An example illustrates the application of the proposed formalization of experience feedback processes in the transport industry sector.
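The fusion step mentioned above can be illustrated with the Transferable Belief Model's unnormalized conjunctive combination rule, which is standard in the TBM literature. The scenario (two expert opinions about root causes) is our invented example, not taken from the paper:

```python
def conjunctive_combine(m1, m2):
    """Unnormalized conjunctive rule of the Transferable Belief Model:
    m12(A) = sum over all B, C with B ∩ C = A of m1(B) * m2(C).
    Mass assigned to the empty set is kept, as a measure of conflict."""
    combined = {}
    for B, mb in m1.items():
        for C, mc in m2.items():
            A = B & C
            combined[A] = combined.get(A, 0.0) + mb * mc
    return combined

# Hypothetical expert opinions on root causes {machine, operator, material}.
expert1 = {frozenset({'machine'}): 0.6,
           frozenset({'machine', 'operator'}): 0.4}
expert2 = {frozenset({'operator'}): 0.5,
           frozenset({'machine', 'material'}): 0.5}

fused = conjunctive_combine(expert1, expert2)
conflict = fused.get(frozenset(), 0.0)
print(fused)
print('conflict mass:', conflict)  # 0.3
```

The mass left on the empty set (here 0.3) quantifies how much the two sources disagree, which is useful when deciding whether expert opinions can be fused at all.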
Probabilistic Logic Programming with Beta-Distributed Random Variables
We enable aProbLog, a probabilistic logic programming approach, to reason in
the presence of uncertain probabilities represented as Beta-distributed random
variables. We achieve the same performance as state-of-the-art algorithms for
highly specified and engineered domains, while simultaneously maintaining the
flexibility offered by aProbLog in handling complex relational domains. Our
motivation is that faithfully capturing the distribution of probabilities is
necessary to compute an expected utility for effective decision making under
uncertainty: unfortunately, these probability distributions can be highly
uncertain due to sparse data. To understand and accurately manipulate such
probability distributions we need a well-defined theoretical framework, which
is provided by the Beta distribution: it specifies a distribution of
probabilities representing all the possible values of a probability when the
exact value is unknown.
Comment: Accepted for presentation at AAAI 201
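The idea of computing expected utility under a Beta-distributed probability can be sketched as follows. This is a generic illustration of the motivation, not aProbLog's actual inference machinery; the counts, utilities, and decision scenario are all assumptions:

```python
import random

random.seed(0)

# Hypothetical decision: an action yields u_win if an uncertain event
# occurs and u_lose otherwise. From sparse data the event probability is
# not a point value but Beta(alpha, beta) (e.g. alpha-1 successes and
# beta-1 failures under a uniform prior).
alpha, beta = 3, 7          # assumed counts; mean probability = 0.3
u_win, u_lose = 10.0, -2.0

# Utility is linear in p, so the closed form only needs E[p] = a/(a+b).
mean_p = alpha / (alpha + beta)
expected_utility = mean_p * u_win + (1 - mean_p) * u_lose

# Monte Carlo over the Beta distribution agrees with the closed form,
# and the sampled utilities also expose the spread that a single point
# estimate of p would hide.
samples = [random.betavariate(alpha, beta) for _ in range(100_000)]
mc_utility = sum(p * u_win + (1 - p) * u_lose for p in samples) / len(samples)
print(round(expected_utility, 3), round(mc_utility, 3))
```

For utilities that are nonlinear in the probability, the point estimate and the full Beta distribution give genuinely different answers, which is the abstract's argument for carrying the whole distribution through the computation.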
A Tutorial on Bayesian Nonparametric Models
A key problem in statistical modeling is model selection: how to choose a
model at an appropriate level of complexity. This problem appears in many
settings, most prominently in choosing the number of clusters in mixture
models or the number of factors in factor analysis. In this tutorial we
describe Bayesian nonparametric methods, a class of methods that sidesteps
this issue by allowing the data to determine the complexity of the model. This
tutorial is a high-level introduction to Bayesian nonparametric methods and
contains several examples of their application.
Comment: 28 pages, 8 figures
Improving legibility of natural deduction proofs is not trivial
In formal proof checking environments such as Mizar it is not merely the
validity of mathematical formulas that is evaluated in the process of adoption
into the body of accepted formalizations, but also the readability of the
proofs that witness validity. As in the case of computer programs, such proof
scripts may be sometimes more and sometimes less readable. To better
understand the notion of readability of formal proofs, and to assess and
improve their readability, we propose in this paper a method of improving
proof readability based on Behaghel's First Law of sentence structure. Our
method maximizes the number of local references to the directly preceding
statement in a proof linearisation. It is shown that our optimization method
is NP-complete.
Comment: 33 pages
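The objective can be sketched as follows: given which earlier statements each proof step refers to, count the steps whose references include the directly preceding statement, and search over valid linearisations for the maximum. This is our illustration of the optimization criterion, not the paper's algorithm; the five-step proof is a made-up example, and brute force is only viable for tiny proofs given the NP-completeness result:

```python
from itertools import permutations

def local_references(order, deps):
    """Count statements whose references include the statement directly
    preceding them in the given linearisation."""
    pos = {s: i for i, s in enumerate(order)}
    return sum(1 for s in order
               if any(pos[d] == pos[s] - 1 for d in deps.get(s, ())))

def valid(order, deps):
    """A linearisation is valid if every statement comes after
    everything it refers to."""
    pos = {s: i for i, s in enumerate(order)}
    return all(pos[d] < pos[s] for s, ds in deps.items() for d in ds)

# Hypothetical five-step proof: statement -> statements it refers to.
deps = {'s2': {'s1'}, 's3': {'s1'}, 's4': {'s2', 's3'}, 's5': {'s4'}}
steps = ['s1', 's2', 's3', 's4', 's5']

# Brute-force search over all valid linearisations.
best = max((o for o in permutations(steps) if valid(o, deps)),
           key=lambda o: local_references(o, deps))
print(best, local_references(best, deps))
```

In this example no linearisation scores 4, because s2 and s3 both refer only to s1 and cannot both directly follow it; the optimum is 3.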
Parametric matroid of rough set
Rough set theory is mainly concerned with the approximation of objects through
an equivalence relation on a universe. A matroid is a combinatorial
generalization of linear independence in vector spaces. In this paper, we
define a parametric set family, with any subset of a universe as its
parameter, to connect rough sets and matroids. On the one hand, for a universe
and an equivalence relation on the universe, a parametric set family is
defined through the lower approximation operator. This parametric set family
is proved to satisfy the independent set axiom of matroids, so it generates a
matroid, called a parametric matroid of the rough set. Three equivalent
representations of the parametric set family are obtained. Moreover, the
parametric matroid of the rough set is proved to be the direct sum of a
partition-circuit matroid and a free matroid. On the other hand, since
partition-circuit matroids have been well studied through the lower
approximation number, we use the lower approximation number to investigate the
parametric matroid of the rough set. Several characteristics of the parametric
matroid of the rough set, such as independent sets, bases, circuits, the rank
function and the closure operator, are expressed by the lower approximation
number.
Comment: 15 pages
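The two rough-set notions the abstract builds on can be sketched concretely. The code below implements the standard lower approximation operator; the `lower_approximation_number` (the count of equivalence classes contained in a set) is our assumed reading of that term, and the partition and test set are made-up examples:

```python
def lower_approximation(X, partition):
    """Lower approximation of X under the equivalence relation given by
    its partition: the union of the classes entirely contained in X."""
    result = set()
    for block in partition:
        if block <= X:
            result |= block
    return frozenset(result)

def lower_approximation_number(X, partition):
    """Assumed definition: the number of equivalence classes contained
    in X."""
    return sum(1 for block in partition if block <= X)

# Universe {1..6} with equivalence classes {1,2}, {3,4}, {5,6}.
partition = [frozenset({1, 2}), frozenset({3, 4}), frozenset({5, 6})]
X = frozenset({1, 2, 3, 5})

print(lower_approximation(X, partition))        # frozenset({1, 2})
print(lower_approximation_number(X, partition))  # 1
```

A parametric set family would then be built from sets characterized through this operator; the paper's exact construction (and its proof that the family satisfies the matroid independence axioms) is beyond this sketch.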