BFO and DOLCE: So Far, So Close…
A survey of the similarities and differences between BFO and DOLCE, and of the mutual interactions between Nicola Guarino and Barry Smith
Some Ontological Principles for Designing Upper Level Lexical Resources
The purpose of this paper is to explore some semantic problems related to the
use of linguistic ontologies in information systems, and to suggest some
organizing principles aimed at solving such problems. The taxonomic structure of
current ontologies is unfortunately quite complicated and hard to understand,
especially with regard to the upper levels. I will focus here on the problem
of ISA overloading, which I believe is mainly responsible for these
difficulties. To this end, I will carefully analyze the ontological nature
of the categories used in current upper-level structures, considering the
necessity of splitting them according to more subtle distinctions, or the
opportunity of excluding them because of their limited organizational role.
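As a concrete illustration of what ISA overloading can look like (a hypothetical toy example, not taken from the paper), the sketch below shows a tiny taxonomy in which subsumption, instantiation, and constitution are all collapsed into a single "isa" link, which is the kind of conflation the analysis warns against:

    # Hypothetical toy taxonomy illustrating ISA overloading (illustration only).
    # The three links below are often all labelled "isa", yet they mean different things.
    overloaded_isa = [
        ("Apple", "isa", "Fruit"),           # genuine subsumption: every apple is a fruit
        ("my_lunch_apple", "isa", "Apple"),  # actually instantiation: a particular individual
        ("Statue", "isa", "Clay"),           # actually constitution: a statue is made of clay
    ]

    # Disentangling the overloaded link into distinct, explicitly named relations:
    subclass_of    = [("Apple", "Fruit")]
    instance_of    = [("my_lunch_apple", "Apple")]
    constituted_by = [("Statue", "Clay")]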
Engineering ontologies: Foundations and theories from philosophy and logical theory
Ontology as a branch of philosophy is the science of what is, of the kinds and
structures of objects, properties, events, processes and relations in every area of
reality. ‘Ontology’ is often used by philosophers as a synonym for ‘metaphysics’
(literally: ‘what comes after the Physics’), a term which was used by early students of
Aristotle to refer to what Aristotle himself called ‘first philosophy’. The term ‘ontology’ (or ontologia) was itself coined in 1613, independently, by two
philosophers, Rudolf Göckel (Goclenius), in his Lexicon philosophicum and Jacob
Lorhard (Lorhardus), in his Theatrum philosophicum. The first occurrence in English
recorded by the OED appears in Bailey’s dictionary of 1721, which defines ontology
as ‘an Account of being in the Abstract’
Fondamenti ontologici per una scienza dei servizi (Ontological foundations for a science of services)
Despite the pervasiveness of the notion of service and the recent proposals for a unified Service
Science, there are still numerous inconsistencies among the various definitions of service in use across
the different disciplines (and often even within the same discipline). In particular, although the general
goal of this science should be to allow people and computers to interact easily with services in
everyday life, many approaches to service modeling in computer science (especially those centered on
web services) seem to focus mainly on data-flow aspects, so that services are treated as black boxes
that transform an input into an output and interoperate with one another in predefined ways. This
black-box model certainly has its advantages but, according to Petrie and Bussler, it seems to work
well only within homogeneous contexts, the so-called service parks, where interoperability is
technically possible only because the content and delivery modes of each service are predefined and
shared by all the parties involved
Processes as variable embodiments
In a number of papers, Kit Fine introduced a theory of embodiment which distinguishes between rigid and variable embodiments, and has been successfully applied to clarify the ontological nature of entities whose parts may or may not vary in time. In particular, he has applied this theory to describe a process such as the erosion of a cliff, which would be a variable embodiment whose manifestations are the different states of erosion of the cliff. We find this theory very powerful, and especially appropriate to capture the intuition that the same process may go on at different times. However, its formal principles have been subject to some criticisms, mainly concerning the mereological structure of a variable embodiment. Moreover, since the notion of variable embodiment is very general, simply saying that processes are variable embodiments is not enough to understand their ontological nature. To address these concerns, in this paper we proceed in two phases: first, we propose a revised version of Fine's original theory adapted to the case of processes, which adopts a classical mereology instead of Fine's hylomorphic mereology, and a temporalized constitution relation in place of Fine's function of variable embodiment; second, we go deeper into the ontological nature of processes by revisiting the notions of homogeneity, intentionality, and telicity discussed in the literature, and propose an account based on ontological principles and not on semantic properties of predicates. This allows us to organize processes into a novel taxonomy based exclusively on their unity and individuation principles
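To give a rough sense of what a temporalized constitution relation could look like (an illustrative gloss under generic assumptions; the predicate names Const and PRE are placeholders, and this is not the authors' actual axiomatization), one might write:

    \mathrm{Const}(y, x, t) \quad \text{(``$y$ constitutes process $x$ at time $t$'')}
    \forall x\, \forall t\, \bigl(\mathrm{PRE}(x, t) \rightarrow \exists y\, \mathrm{Const}(y, x, t)\bigr)

Read informally: whenever a process is present at a time, something constitutes it at that time, and the constituting entity may differ from one time to the next, which is the intuition behind variable embodiments.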
In the Defense of Ontological Foundations for Conceptual Modeling
Abstract not available
Semantics, Ontology and Explanation
The terms 'semantics' and 'ontology' are increasingly appearing together with
'explanation', not only in the scientific literature, but also in
organizational communication. However, all of these terms are also being
significantly overloaded. In this paper, we discuss their strong relation under
particular interpretations. Specifically, we discuss a notion of explanation
termed ontological unpacking, which aims at explaining symbolic domain
descriptions (conceptual models, knowledge graphs, logical specifications) by
revealing their ontological commitment in terms of their assumed truthmakers,
i.e., the entities in one's ontology that make the propositions in those
descriptions true. To illustrate this idea, we employ an ontological theory of
relations to explain (by revealing the hidden semantics of) a very simple
symbolic model encoded in the standard modeling language UML. We also discuss
the essential role played by ontology-driven conceptual models (resulting from
this form of explanation process) in properly supporting semantic
interoperability tasks. Finally, we discuss the relation between ontological
unpacking and other forms of explanation in philosophy and science, as well as
in the area of Artificial Intelligence
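As a hedged illustration of the kind of unpacking discussed here (a hypothetical toy example in the spirit of relator-based theories of relations; it is not claimed to be the model analyzed in the paper), a plain "married-to" association between two Person classes can be re-expressed by making its assumed truthmaker, a Marriage relator, explicit:

    # Hypothetical sketch: unpacking a binary "married-to" association by
    # reifying its assumed truthmaker as an explicit Marriage relator.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Person:
        name: str

    @dataclass
    class Marriage:
        # The relator: the entity in virtue of which "a is married to b" is true.
        spouses: List[Person] = field(default_factory=list)

    def married_to(a: Person, b: Person, marriages: List[Marriage]) -> bool:
        # The derived relation holds only if some Marriage mediates both persons.
        return any(a in m.spouses and b in m.spouses for m in marriages)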
On weak truthmaking
Informally speaking, a truthmaker is something in the world in virtue
of which the sentences of a language can be made true. This fundamental philosophical
notion plays a central role in applied ontology. In particular, a recent nonorthodox
formulation of this notion proposed by the philosopher Josh Parsons,
which we labelled weak truthmaking, has been shown to be extremely useful in addressing
a number of classical problems in the area of Conceptual Modeling. In this
paper, after revisiting the classical notion of truthmaking, we conduct an in-depth
analysis of Parsons’ account of weak truthmaking. By doing that, we expose some
difficulties in his original formulation. As the main contribution of this paper, we
propose solutions to address these issues which are then integrated in a new precise
interpretation of truthmaking that is harmonizable with …
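For orientation, one informal gloss often attributed to Parsons (written here with the ad-hoc abbreviation Nat; it is not the precise interpretation developed in the paper) is:

    \mathrm{WTM}(x, p) \;\triangleq\; \Box\bigl(\mathrm{Nat}(x) \rightarrow p\bigr)

where Nat(x) abbreviates "x has the intrinsic nature it actually has": intuitively, p could not have been false without x being intrinsically different.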
Urban Artefacts and Their Social Roles: Towards an Ontology of Social Practices
Cities can be seen as systems of urban artefacts interacting with human activities. Since cities in this sense need to be organized and coordinated, convergences and divergences between the "planned" and the "lived" city have always been of paramount interest in urban planning. The increasing amount of geo big data and the growing impact of the Internet of Things (IoT) in contemporary smart cities are pushing toward a re-conceptualization of urban systems that takes into consideration the complexity of human behaviors. This work contributes to this view by proposing an ontological analysis of urban artefacts and their roles, focusing in particular on the difference between social roles and functional roles through the prism of social practices
Analysis of touch gestures for online child protection
The growth of the Internet and the pervasiveness of ICT have led to a radical change in social relationships. One of the drawbacks of this change is the exposure of individuals to threats during online activities. In this context, the techno-regulation paradigm is inspiring new ways to safeguard legally protected interests by means of tools that hamper breaches of the law. In this paper, we focus on the exposure of individuals to specific online threats when interacting with smartphones. We propose a novel techno-regulatory approach exploiting machine learning techniques to provide safeguards against online threats. Specifically, we study a set of touch-based gestures to distinguish whether an underage user or an adult is accessing a smartphone, and so to guarantee protection. To evaluate the proposed approach's effectiveness, we developed an Android app to build a dataset consisting of more than 9000 touch gestures from 147 participants. We experimented with both single-view and multi-view learning techniques to find the combination of touch gestures best able to distinguish between adults and underage users. Results show that multi-view learning combining scrolls, swipes, and pinch-to-zoom gestures achieves the best ROC AUC (0.92) and accuracy (88%) scores
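As a rough sketch of what combining several gesture "views" for such a classifier might look like (illustrative only: the feature matrices, the model choice, and the simple late fusion by probability averaging are assumptions, not the setup used in the study), one could train one model per gesture type and fuse their scores:

    # Illustrative multi-view sketch: one classifier per gesture view (scrolls,
    # swipes, pinch-to-zoom), fused by averaging predicted probabilities.
    # All data below is synthetic; shapes and features are assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score, accuracy_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 600  # hypothetical number of labelled sessions
    views = {  # one synthetic feature matrix per gesture view
        "scroll": rng.normal(size=(n, 12)),
        "swipe": rng.normal(size=(n, 12)),
        "pinch": rng.normal(size=(n, 8)),
    }
    y = rng.integers(0, 2, size=n)  # 1 = adult, 0 = underage (synthetic labels)

    idx_train, idx_test = train_test_split(np.arange(n), test_size=0.3, random_state=0)

    probs = []
    for name, X in views.items():
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X[idx_train], y[idx_train])
        probs.append(clf.predict_proba(X[idx_test])[:, 1])

    fused = np.mean(probs, axis=0)  # simple late fusion across the three views
    print("ROC AUC:", roc_auc_score(y[idx_test], fused))
    print("Accuracy:", accuracy_score(y[idx_test], (fused >= 0.5).astype(int)))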