10 research outputs found

    A diagrammatic representation for entities and mereotopological relations in ontologies

    In the graphical representation of ontologies, it is customary to use graph theory as the representational background. We claim that the standard graph-based approach has a number of limitations. In particular, we focus on a problem in the graph-based representation of ontologies in complex domains such as biomedicine, engineering and manufacturing: the lack of mereotopological representation. Based on this limitation, we propose a diagrammatic way to represent an entity’s structure and the various forms of mereotopological relationships between entities.

    Transforming semi-structured life science diagrams into meaningful domain ontologies with DiDOn

    Bio-ontology development is a resource-consuming task despite the many open source ontologies available for reuse. Various strategies and tools for bottom-up ontology development have been proposed from a computing angle, yet the most obvious one from a domain expert’s perspective remains unexplored: the abundant diagrams in the sciences. To speed up and simplify bio-ontology development, we propose a detailed, micro-level procedure, DiDOn, to formalise such semi-structured biological diagrams, availing also of a foundational ontology for more precise and interoperable subject domain semantics. The approach is illustrated using Pathway Studio as a case study.

    Blast Theory: Intermedial Performance Praxis and the Generative Conditions for Performance Subjectivity

    The work of the British theatre company Blast Theory explores intermedial dramaturgies that this thesis claims can be categorized as radical because they present a generative characteristic. Intermediality, understood here as the impact of analogue and digital technologies on theatrical performance, establishes complex relationships between physical and virtual spaces, structures that create a rich polyphony of multiple temporal orchestrations, and narratives that present a multiplicity of performative arrangements. Intermedial performance, as a performative and experiential event, encompasses a triad of performative interactions between performers, spectators and the media itself, executed at and concentrated on the moment of the performance encounter. This research argues that this encounter displays a generative character – a moment at which all the attending performance variables come together in a constant process of performative re-activation, thus generating the intermedial performance event. Within this descriptive parameter, this research claims that recent performance conceptualizations fail to account for the work of Blast Theory. Contemporary performance and liveness debates focus principally on the ontology of performance. So, notwithstanding their differences, performance theorists such as Lavender (2002), Fischer-Lichte (2008), and Schechner (2003), and presentness/presence theorists such as Phelan (1993) and Power (2008) all agree that performance is an ontological, ephemeral, and fleeting event. While there are many valid points in these diverse approaches, they offer only a partial account of the specificities of the work of Blast Theory and, by extension, the intermedial performance event. This thesis therefore relocates the terms of the debate onto a constructivist epistemological basis.
In this way, the thesis proposes that an intermedial performance event must be understood beyond the ontological approach by specifically interrogating its conditions of intelligibility; that is, its operative and intelligible architecture of attending elements and the participating subject. The key hypothesis is that, in introducing a constructivist reading of epistemology, as described by Alfred Whitehead and Gilles Deleuze, a new account of intermediality in performance emerges as a radical dramaturgy incorporating generative aspects, and with this, a unique type of intermedial performance subjectivity is enabled.

    Ontological foundations for structural conceptual models

    In this thesis, we aim to contribute to the theory of conceptual modeling and ontology representation. Our main objective is to provide ontological foundations for the most fundamental concepts in conceptual modeling. These foundations comprise a number of ontological theories, which build on established work in philosophical ontology, cognitive psychology, philosophy of language and linguistics. Together these theories amount to a system of categories and formal relations known as a foundational ontology.

    Deploying ontologies in software design

    In this thesis we are concerned with the relation between ontologies and software design. Ontologies are studied in the artificial intelligence community as a means to explicitly represent standardised domain knowledge in order to enable knowledge sharing and reuse. We deploy ontologies in software design with emphasis on a traditional software engineering theme: error detection. In particular, we identify a type of error that is often difficult to detect: conceptual errors. These are related to the description of the domain in which the system will operate, and they require subjective knowledge about correct forms of domain description to detect. Ontologies provide these forms of domain description, and we are interested in applying them and verifying their correctness (chapter 1). After presenting an in-depth analysis of the field of ontologies and software testing as conceived and implemented by the software engineering and artificial intelligence communities (chapter 2), we discuss an approach which enabled us to deploy ontologies in the early phases of software development (i.e., specifications) in order to detect conceptual errors (chapter 3). This is based on the provision of ontological axioms which are used to verify the conformance of specification constructs to the underpinning ontology. To facilitate the integration of an ontology with applications that adopt it, we developed an architecture and built tools to implement this form of conceptual error checking (chapter 4). We apply and evaluate the architecture in a variety of contexts to identify potential uses (chapter 5). An implication of this method for deploying ontologies to reason about the correctness of applications is to raise our trust in the given ontologies. However, when the ontologies themselves are erroneous we might fail to reveal pernicious discrepancies.
To cope with this problem we extended the architecture to a multi-layer form (chapter 4), which gives us the ability to check the ontologies themselves for correctness. We apply this multi-layer architecture to capture errors found in a complex lattice of ontologies (chapter 6). We further elaborate on the weaknesses in ontology evaluation methods and employ a technique stemming from software engineering, that of experience management, to facilitate ontology testing and deployment (chapter 7). The work presented in this thesis aims to improve practice in ontology use and to identify areas in which ontologies could be of benefit beyond the advocated ones of knowledge sharing and reuse (chapter 8).

    Graph Granularity through Bi-intuitionistic Modal Logic

    This thesis concerns the use of a bi-intuitionistic modal logic, UBiSKt, in the field of Knowledge Representation and Reasoning. The logic is shown to be able to represent qualitative spatial relations between subgraphs at different levels of detail, or granularity. The level of detail is provided by the modal accessibility relation R defined on the set of nodes and edges. The connection between modal logic and mathematical morphology is exploited to study notions of granulation on subgraphs, namely the process of changing granularity, and to define qualitative spatial relations between these “granular” regions. In addition, a special case of graph and hypergraph granularity is analysed, namely when the accessibility relation gives rise to a partition of the underlying set of nodes and edges. Different S5 extensions of intuitionistic modal logic are considered and compared in the thesis. It is shown that these logics, and their associated semantics, provide different ways of partitioning a graph, a hypergraph, or, more generally, a partially ordered set.

    Organising knowledge in the age of the semantic web: a study of the commensurability of ontologies

    This study is directed towards the problem of conceptual translation across different data management systems and formats, with a particular focus on those used in the emerging world of the Semantic Web. Increasingly, organisations have sought to connect information sources and services within and beyond their enterprise boundaries, building upon existing Internet facilities to offer improved research, planning, reporting and management capabilities. The Semantic Web is an ambitious response to this growing demand, offering a standards-based platform for sharing, linking and reasoning with information. The imagined result, a globalised knowledge network formed out of mutually referring data structures termed "ontologies", would make possible new kinds of queries, inferences and amalgamations of information. Such a network, though, is premised upon large numbers of manually drawn links between these ontologies. In practice, establishing these links is a complex translation task requiring considerable time and expertise; invariably, as ontologies and other structured information sources are published, many useful connections are neglected. To combat this, in recent years substantial research has been invested into "ontology matching" - the exploration of algorithmic approaches for automatically translating or aligning ontologies. These approaches, which exploit the explicit semantic properties of individual concepts, have registered impressive precision and recall results against human-engineered translations. However, they are unable to make use of background cultural information about the overall systems in which those concepts are housed - how those systems are used, for what purpose they were designed, what methodological or theoretical principles underpinned their construction, and so on.
The present study investigates whether paying attention to these sociological dimensions of electronic knowledge systems could supplement algorithmic approaches in some circumstances. Specifically, it asks whether a holistic notion of commensurability can be useful when aligning or translating between such systems. The first half of the study introduces the problem, surveys the literature, and outlines the general approach. It then proposes both a theoretical foundation and a practical framework for assessing the commensurability of ontologies and other knowledge systems. Chapter 1 outlines the Semantic Web, ontologies and the problem of conceptual translation, and poses the key research questions. Conceptual translation can be treated as, by turns, a social, philosophical, linguistic or technological problem; Chapter 2 surveys a correspondingly wide range of literature and approaches. The methods employed by the study are described in Chapter 3. Chapter 4 critically examines theories of conceptual schemes and commensurability, while Chapter 5 describes the framework itself, comprising a series of specific dimensions, a broad methodological approach, and a means for generating both qualitative and quantitative assessments. The second half of the study then explores the notion of commensurability through several empirical frames. Chapters 6 to 8 apply the framework to a series of case studies. Chapter 6 presents a brief history of knowledge systems, and compares two of these systems - relational databases and Semantic Web ontologies. Chapter 7, in turn, compares several "upper-level" ontologies - reusable schematisations of abstract concepts like Time and Space. Chapter 8 reviews a recent, widely publicised controversy over the standardisation of document formats. This analysis in particular shows how the opaque, dry world of technical specifications can reveal the complex network of social dynamics, interests and beliefs which coordinate and motivate them.
Collectively, these studies demonstrate that the framework is useful in making evident the assumptions which motivate the design of different knowledge systems and, further, in assessing the commensurability of those systems. Chapter 9 then presents a further empirical study; here, the framework is implemented as a software system and pilot tested among a small cohort of researchers. Finally, Chapter 10 summarises the argumentative trajectory of the study as a whole - that, broadly, an elaborated notion of commensurability can tease out important and salient features of translation inscrutable to purely algorithmic methods - and suggests some possibilities for further work.

    The Value of Technics: An Ontogenetic Approach to Money, Markets, and Networks

    This thesis investigates the impact of the digitalization of monetary and financial flows on the political-economic sphere in order to provide a novel perspective on the relations between economic and technological forces at the present global juncture. In the aftermath of the Global Financial Crisis and with the rise of the cryptoeconomy, an increasing number of scholars have highlighted the immanence of market logic to cultural and social life. At the same time, speculative practices have emerged that attempt to challenge the political economy through financial experiments. This dissertation complements these approaches by stressing the need to pair the critical study of finance with scholarship in the philosophy of technology that emphasizes the value immanent to technics and technology – i.e. the normative and genetic role of ubiquitous algorithmic networks in the organization of markets and the socius. In order to explore these events, I propose an interdisciplinary theoretical framework informed largely by Gilbert Simondon’s philosophy of individuation and technics and the contemporary literature on the ontology of computation, supported by insights drawn from the history of finance and economic theory. This novel framework provides the means to investigate the ontogenetic processes at work in the techno-cultural ecosystem following the digitalization of monetary and financial flows. Through an exploration of the fleeting materiality and multifaceted character of digital fiat money, the social power of algorithmic financial logic, and the new possibilities offered by the invention of the Bitcoin protocol, this research aims to challenge some of the bedrock notions of economic orthodoxy – economic and monetary value, liquidity, market rationality – in order to move beyond the overarching narrative of capitalism as a monolithic system. The thesis instead foregrounds the techno-historical contingencies that have led to the contemporary power formation.
Furthermore, it argues that the ontogenetic character of algorithmic technology ushers in novel possibilities for the speculative engineering of alternative networks of value creation and distribution that have the potential to reverse the current balance of power.

    Einstein vs. Bergson

    On 6 April 1922, Einstein met Bergson to debate the nature of time: is the time the physicist calculates the same time the philosopher reflects on? Einstein claimed that only scientific time is real, while Bergson argued that scientific time always presupposes a living and perceiving subject. On that day, nearly 100 years ago, conflict was inevitable. Is it still inevitable today? How many kinds of time are there?
