Two Approaches to Ontology Aggregation Based on Axiom Weakening
Axiom weakening is a novel technique that allows
for fine-grained repair of inconsistent ontologies.
In a multi-agent setting, integrating ontologies corresponding
to multiple agents may lead to inconsistencies.
Such inconsistencies can be resolved after
the integrated ontology has been built, or they can be
prevented from arising while the integrated ontology is being constructed.
We implement and compare these two approaches.
First, we study how to repair an inconsistent
ontology resulting from a voting-based aggregation
of views of heterogeneous agents. Second,
we prevent the generation of inconsistencies by letting
the agents engage in a turn-based rational protocol
about the axioms to be added to the integrated
ontology. We instantiate the two approaches using
real-world ontologies and compare them by measuring
the levels of satisfaction of the agents w.r.t.
the ontology obtained by the two procedures.
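The first approach — aggregate by voting, then repair — can be sketched in miniature. This is only an illustration under strong simplifying assumptions: axioms are plain subclass pairs, "~" marks a negated class, and dropping a least-supported axiom stands in for the paper's fine-grained axiom weakening; all names are hypothetical.

```python
# Sketch of voting-based ontology aggregation followed by repair.

def supers(cls, axioms):
    """All superclasses reachable from cls via the subclass axioms."""
    out = {cls}
    changed = True
    while changed:
        changed = False
        for sub, sup in axioms:
            if sub in out and sup not in out:
                out.add(sup)
                changed = True
    return out

def consistent(axioms):
    """Inconsistent iff some class is forced below both C and ~C."""
    classes = {c for ax in axioms for c in ax}
    return not any("~" + s in supers(c, axioms)
                   for c in classes
                   for s in supers(c, axioms))

def majority_merge(agents):
    """Accept every axiom held by a strict majority of agents."""
    pool = {ax for view in agents for ax in view}
    return {ax for ax in pool
            if sum(ax in view for view in agents) > len(agents) / 2}

def repair(axioms, agents):
    """Crude repair: drop least-supported axioms until consistency is
    restored (full removal standing in for fine-grained weakening)."""
    kept = set(axioms)
    weakest_first = sorted(axioms,
                           key=lambda ax: sum(ax in v for v in agents))
    for ax in weakest_first:
        if consistent(kept):
            break
        kept.discard(ax)
    return kept

# Each agent holds two of three axioms; every individual view is
# consistent, yet the majority merge accepts all three at once.
A = ("Penguin", "Bird")
B = ("Bird", "Flies")
C = ("Penguin", "~Flies")
agents = [{A, B}, {B, C}, {A, C}]
merged = majority_merge(agents)      # inconsistent: Penguin ⊑ Flies, ~Flies
repaired = repair(merged, agents)    # consistent after dropping one axiom
```

The example shows why post-hoc repair is needed at all: consistency of each individual view does not survive axiom-by-axiom majority voting.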
Ontology Merging as Social Choice
The problem of merging several ontologies has important applications in the Semantic Web, medical ontology engineering
and other domains where information from several distinct sources needs to be integrated in a coherent manner. We propose
to view ontology merging as a problem of social choice, i.e. as a problem of aggregating the input of a set of individuals
into an adequate collective decision. That is, we propose to view ontology merging as ontology aggregation. As a first step in
this direction, we formulate several desirable properties for ontology aggregators, we identify the incompatibility of some of
these properties, and we define and analyse several simple aggregation procedures. Our approach is closely related to work
in judgment aggregation, but with the crucial difference that we adopt an open world assumption, by distinguishing between
facts not included in an agent’s ontology and facts explicitly negated in an agent’s ontology.
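The open-world distinction the abstract draws can be made concrete with a small sketch. Here a view maps a fact to True (asserted) or False (explicitly negated), and a missing fact means "unknown"; the aggregator and all fact names are hypothetical illustrations, not the paper's procedures.

```python
def aggregate(views):
    """Majority aggregation under an open world assumption: a fact
    absent from a view is unknown, not rejected, so only views that
    explicitly assert or negate the fact get a vote on it."""
    facts = {f for view in views for f in view}
    merged = {}
    for fact in facts:
        votes = [view[fact] for view in views if fact in view]
        yes = sum(votes)
        if yes * 2 > len(votes):
            merged[fact] = True
        elif yes * 2 < len(votes):
            merged[fact] = False
        # exact ties stay undecided: the fact is left out entirely
    return merged

# Three hypothetical agent views; the third is silent on flying.
views = [
    {"bird(tweety)": True, "flies(tweety)": True},
    {"bird(tweety)": True, "flies(tweety)": False},
    {"bird(tweety)": True},
]
merged = aggregate(views)   # {"bird(tweety)": True}
```

Under a closed-world reading the silent agent would count as rejecting "flies(tweety)" (one yes against two no); under the open-world reading the expressed opinions tie 1–1, so the merged ontology stays silent on it too.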
Logics for modelling collective attitudes
We introduce a number of logics to reason about collective propositional
attitudes that are defined by means of the majority rule. It is well known that majoritarian
aggregation is subject to irrationality, as the results in social choice theory and judgment
aggregation show. The proposed logics for modelling collective attitudes are based on
a substructural propositional logic that allows for circumventing inconsistent outcomes.
Individual and collective propositional attitudes, such as beliefs, desires, and obligations, are
then modelled by means of minimal modalities to ensure a number of basic principles. In
this way, a viable, consistent modelling of collective attitudes is obtained.
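The irrationality of majoritarian aggregation that motivates the abstract is the classic discursive dilemma from judgment aggregation, which a few lines suffice to reproduce (the agenda {p, q, p∧q} and the profile below are the standard textbook example, not taken from the paper):

```python
def majority(profiles):
    """Propositionwise majority over individual judgment sets;
    True means the proposition is collectively accepted."""
    n = len(profiles)
    return {p: sum(j[p] for j in profiles) * 2 > n for p in profiles[0]}

def rational(judgment):
    """Rationality constraint for this tiny agenda: the conjunction
    is accepted exactly when both conjuncts are."""
    return judgment["p and q"] == (judgment["p"] and judgment["q"])

# Three individually rational judgment sets.
profiles = [
    {"p": True,  "q": True,  "p and q": True},
    {"p": True,  "q": False, "p and q": False},
    {"p": False, "q": True,  "p and q": False},
]
collective = majority(profiles)
# p and q are each accepted 2-1, yet "p and q" is rejected 2-1,
# so the collective judgment set violates the constraint.
```

This is exactly the kind of inconsistent outcome that the substructural base logic in the paper is designed to circumvent.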
Coherent Integration of Databases by Abductive Logic Programming
We introduce an abductive method for a coherent integration of independent
data-sources. The idea is to compute a list of data-facts that should be
inserted into the amalgamated database or retracted from it in order to restore
its consistency. This method is implemented by an abductive solver, called
Asystem, that applies SLDNFA-resolution on a meta-theory that relates
different, possibly contradicting, input databases. We also give a pure
model-theoretic analysis of the possible ways to `recover' consistent data from
an inconsistent database in terms of those models of the database that exhibit
as little inconsistent information as reasonably possible. This allows us to
characterize the `recovered databases' in terms of the `preferred' (i.e., most
consistent) models of the theory. The outcome is an abductive-based application
that is sound and complete with respect to a corresponding model-based,
preferential semantics, and -- to the best of our knowledge -- is more
expressive (thus more general) than any other implementation of coherent
integration of databases.
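The "compute facts to retract in order to restore consistency" idea can be illustrated with a brute-force sketch. This is a toy stand-in for the Asystem's abductive SLDNFA search: facts are strings, an integrity constraint is a predicate over the fact set, and the example data and constraint are hypothetical.

```python
from itertools import combinations

def restore(db, constraints):
    """Smallest sets of facts whose retraction satisfies every
    integrity constraint (minimality taken as minimal cardinality)."""
    facts = frozenset(db)
    for k in range(len(facts) + 1):
        found = [set(out) for out in combinations(sorted(facts), k)
                 if all(ok(facts - frozenset(out)) for ok in constraints)]
        if found:
            return found

# Amalgamating two sources yields contradicting salary facts.
db = {"employee(ann)", "salary(ann,50)", "salary(ann,60)"}

def one_salary(facts):
    """Functional dependency: at most one salary fact per employee."""
    return sum(f.startswith("salary(ann") for f in facts) <= 1

repairs = restore(db, [one_salary])
# Two minimal repairs, each retracting one of the clashing facts;
# these correspond to the "preferred" (most consistent) models.
```

Each returned set is one way to recover a consistent database; the model-theoretic analysis in the paper characterizes exactly these minimally-changed alternatives.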
Do Goedel's incompleteness theorems set absolute limits on the ability of the brain to express and communicate mental concepts verifiably?
Classical interpretations of Goedel's formal reasoning imply that the truth
of some arithmetical propositions of any formal mathematical language, under
any interpretation, is essentially unverifiable. However, a language of
general scientific discourse cannot allow its mathematical propositions to be
interpreted ambiguously. Such a language must, therefore, define mathematical
truth verifiably. We consider a constructive interpretation of classical,
Tarskian, truth, and of Goedel's reasoning, under which any formal system of
Peano Arithmetic is verifiably complete. We show how some paradoxical concepts
of Quantum mechanics can be expressed, and interpreted, naturally under a
constructive definition of mathematical truth.
Comment: 73 pages; this is an updated version of the NQ essay; an HTML version
is available at http://alixcomsi.com/Do_Goedel_incompleteness_theorems.ht
Challenges and complexities in application of LCA approaches in the case of ICT for a sustainable future
In this work, three of many ICT-specific challenges of LCA are discussed.
First, inconsistency versus uncertainty is reviewed with regard to the
meta-technological nature of ICT. As an example, the semiconductor technologies
are used to highlight the complexities especially with respect to energy and
water consumption. The need for specific representations and metrics to
separately assess products and technologies is discussed. It is highlighted
that applying product-oriented approaches would result in abandoning or
disfavoring of new technologies that could otherwise help toward a better
world. Second, several hot spots often believed untouchable are highlighted to
emphasize their importance and footprint. The list includes, but is not limited
to, i) User-Computer Interfaces (UCIs), especially screens and displays, ii)
Network-Computer Interfaces (NCIs), such as electronic and optical ports, and
iii) electricity power interfaces. In addition, considering cross-regional
social and economic impacts, and also taking into account the marketing nature
of the demand for many ICT products and services in both hardware and
software form, the complexity of the End of Life (EoL) stage of ICT products,
technologies, and services is explored. Finally, the impact of smart management
and intelligence, and in general software, in ICT solutions and products is
highlighted. In particular, it is observed that, even using the same
technology, the significance of software could be highly variable depending on
the level of intelligence and awareness deployed. With examples from an
interconnected network of data centers managed using Dynamic Voltage and
Frequency Scaling (DVFS) technology and smart cooling systems, it is shown that
the unadjusted assessments could be highly uncertain, and even inconsistent, in
calculating the management component's significance on the ICT impacts.
Comment: 10 pages. Preprint/accepted version of a paper submitted to the ICT4S
Conference
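Why a management layer like DVFS can change an energy assessment so much follows from the standard CMOS dynamic power model: power scales roughly with V²·f while runtime scales with 1/f, so energy for a fixed workload scales with V²·cycles. The sketch below uses hypothetical operating points and drops all constant factors; it is illustrative only, not data from the paper.

```python
def dynamic_energy(cycles, freq_hz, volt):
    """Relative dynamic CPU energy for a CPU-bound job:
    power ~ V^2 * f, runtime = cycles / f, so energy ~ V^2 * cycles."""
    power = volt ** 2 * freq_hz      # relative dynamic power
    runtime = cycles / freq_hz       # seconds
    return power * runtime           # relative energy units

# The same job (fixed cycle count) at two hypothetical DVFS points.
job = 1e9                                  # cycles
e_fast = dynamic_energy(job, 2.0e9, 1.2)   # fast, high voltage
e_slow = dynamic_energy(job, 1.0e9, 0.9)   # slow, low voltage
savings = 1 - e_slow / e_fast              # ~44% less dynamic energy
```

An assessment that models only the hardware at its nominal operating point, ignoring the DVFS policy actually deployed, can therefore misestimate the energy of otherwise identical equipment by a large margin, which is the inconsistency the abstract points to.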
A systematic review of data quality issues in knowledge discovery tasks
Large volumes of data are accumulating because organizations continuously capture data to support better decision-making processes. The most fundamental challenge is to explore these large volumes of data and extract useful knowledge for future actions through knowledge discovery tasks; nevertheless, much of this data is of poor quality. We present a systematic review of data quality issues in knowledge discovery tasks and a case study applied to the agricultural disease known as coffee rust.