
    Using philosophy to improve the coherence and interoperability of applications ontologies: A field report on the collaboration of IFOMIS and L&C

    The collaboration of Language and Computing nv (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is guided by the hypothesis that quality constraints on ontologies for software application purposes closely parallel the constraints salient to the design of sound philosophical theories. The extent of this parallel has been poorly appreciated in the informatics community, and it turns out that importing the benefits of philosophical insight and methodology into application domains yields a variety of improvements. L&C’s LinKBase® is one of the world’s largest medical domain ontologies. Its current primary use pertains to natural language processing applications, but it also supports intelligent navigation through a range of structured medical and bioinformatics information resources, such as SNOMED-CT, Swiss-Prot, and the Gene Ontology (GO). In this report we discuss how and why philosophical methods improve both the internal coherence of LinKBase® and its capacity to serve as a translation hub, improving the interoperability of the ontologies through which it navigates.

    Dublin City University at QA@CLEF 2008

    We describe our participation in Multilingual Question Answering at CLEF 2008, using German and English as our source and target languages respectively. The system was built using UIMA (Unstructured Information Management Architecture) as the underlying framework.

    Backwards is the way forward: feedback in the cortical hierarchy predicts the expected future

    Clark offers a powerful description of the brain as a prediction machine, one that makes progress on two distinct levels. First, on an abstract conceptual level, it provides a unifying framework for perception, action, and cognition (including subdivisions such as attention, expectation, and imagination). Second, hierarchical prediction offers progress on a concrete descriptive level for testing and constraining conceptual elements and mechanisms of predictive coding models (estimation of predictions, prediction errors, and internal models).
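    The core mechanism named in the abstract above, an internal estimate revised to minimize precision-weighted prediction error, can be illustrated with a deliberately minimal single-level Gaussian sketch. All names and parameter values here are illustrative assumptions, not taken from the commentary; full predictive coding models stack many such levels hierarchically.

    ```python
    def predictive_coding_estimate(obs, prior, obs_precision=1.0,
                                   prior_precision=1.0, lr=0.1, steps=200):
        """Revise an internal estimate mu by gradient descent on
        F(mu) = 0.5 * obs_precision   * (obs - mu)**2
              + 0.5 * prior_precision * (mu - prior)**2,
        i.e. the sum of squared, precision-weighted prediction errors."""
        mu = prior  # start from the top-down prediction
        for _ in range(steps):
            # two error terms: sensory (mu vs. observation) and prior (mu vs. prediction)
            grad = obs_precision * (mu - obs) + prior_precision * (mu - prior)
            mu -= lr * grad
        return mu
    ```

    With equal precisions the estimate settles halfway between prediction and evidence; raising `obs_precision` (trusting the senses more) pulls it toward the observation, which is the precision-weighting idea in one line.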

    Ontology as the core discipline of biomedical informatics: Legacies of the past and recommendations for the future direction of research

    The automatic integration of rapidly expanding information resources in the life sciences is one of the most challenging goals facing biomedical research today. Controlled vocabularies, terminologies, and coding systems play an important role in realizing this goal, by making it possible to draw together information from heterogeneous sources – for example pertaining to genes and proteins, drugs and diseases – secure in the knowledge that the same terms will also represent the same entities on all occasions of use. In the naming of genes, proteins, and other molecular structures, considerable efforts are under way to reduce the effects of the different naming conventions which have been spawned by different groups of researchers. Electronic patient records, too, increasingly involve the use of standardized terminologies, and tremendous efforts are currently being devoted to the creation of terminology resources that can meet the needs of a future era of personalized medicine, in which genomic and clinical data can be aligned in such a way that the corresponding information systems become interoperable.

    Experiences with the GTU grammar development environment

    In this paper we describe our experiences with GTU (German: Grammatik-Testumgebung; grammar test environment), a tool for the development and testing of natural language grammars. GTU supports four grammar formalisms under a window-oriented user interface. Additionally, it contains a set of German test sentences covering various syntactic phenomena, as well as three types of German lexicons that can be attached to a grammar via an integrated lexicon interface. What follows is a description of the experiences we gained when we used GTU as a tutoring tool for students and as an experimental tool for CL researchers. From these we will derive the features necessary for a future grammar workbench.
    Comment: 7 pages, uses aclap.st

    A Fuzzy Approach to Erroneous Inputs in Context-Free Language Recognition

    Using fuzzy context-free grammars one can easily describe a finite number of ways to derive incorrect strings together with their degree of correctness. However, in general there is an infinite number of ways to perform a certain task wrongly. In this paper we introduce a generalization of fuzzy context-free grammars, the so-called fuzzy context-free K-grammars, to model the situation of making a finite choice out of an infinity of possible grammatical errors during each context-free derivation step. Under minor assumptions on the parameter K this model happens to be a very general framework to describe correctly as well as erroneously derived sentences by a single generating mechanism. Our first result characterizes the generating capacity of these fuzzy context-free K-grammars. As consequences we obtain: (i) bounds on modeling grammatical errors within the framework of fuzzy context-free grammars, and (ii) the fact that the family of languages generated by fuzzy context-free K-grammars shares closure properties very similar to those of the family of ordinary context-free languages. The second part of the paper is devoted to a few algorithms to recognize fuzzy context-free languages: viz. a variant of a functional version of the Cocke-Younger-Kasami algorithm and some recursive descent algorithms. These algorithms turn out to be robust in some very elementary sense and they can easily be extended to corresponding parsing algorithms.
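    The recognition idea behind the Cocke-Younger-Kasami variant mentioned above can be sketched for a plain fuzzy context-free grammar in Chomsky normal form: each production carries a membership degree, a derivation scores the minimum of the degrees it uses, and a string scores the maximum over its derivations. The grammar, names, and degrees below are illustrative assumptions, not taken from the paper.

    ```python
    # Hypothetical fuzzy CNF grammar: (lhs, rhs) -> membership degree in [0, 1].
    # Degrees below 1.0 mark "error" productions deriving slightly wrong strings.
    GRAMMAR = {
        ("S", ("A", "B")): 1.0,
        ("A", ("a",)): 1.0,
        ("A", ("b",)): 0.5,   # erroneous terminal, degree of correctness 0.5
        ("B", ("b",)): 1.0,
    }

    def fuzzy_cyk(word, grammar, start="S"):
        """Return the degree to which `word` belongs to the fuzzy language."""
        n = len(word)
        # table[i][j] maps nonterminal -> best degree for word[i : i+j+1]
        table = [[{} for _ in range(n)] for _ in range(n)]
        for i, ch in enumerate(word):                      # length-1 substrings
            for (lhs, rhs), deg in grammar.items():
                if rhs == (ch,):
                    table[i][0][lhs] = max(table[i][0].get(lhs, 0.0), deg)
        for length in range(2, n + 1):                     # longer substrings
            for i in range(n - length + 1):
                cell = table[i][length - 1]
                for split in range(1, length):
                    left = table[i][split - 1]
                    right = table[i + split][length - split - 1]
                    for (lhs, rhs), deg in grammar.items():
                        if len(rhs) == 2 and rhs[0] in left and rhs[1] in right:
                            d = min(deg, left[rhs[0]], right[rhs[1]])
                            cell[lhs] = max(cell.get(lhs, 0.0), d)
        return table[0][n - 1].get(start, 0.0)
    ```

    Here "ab" recognizes with degree 1.0, the slightly wrong "bb" with degree 0.5 via the error production, and "aa" with degree 0.0; the min/max combination is one common choice of fuzzy semantics, not the only one.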

    To Teach Modal Logic: An Opinionated Survey

    I aim to promote an alternative agenda for teaching modal logic, chiefly inspired by the relationships between modal logic and philosophy. The guiding idea for this proposal is a reappraisal of the interest of modal logic in philosophy, which does not stem mainly from mathematical issues but is motivated by central problems of philosophy and language. I will point out some themes with which to start elaborating a guide for a more comprehensive approach to teaching modal logic, and consider the contributions of dual-process theories in cognitive science in order to explore a pedagogical framework for the proposed point of view.
    Comment: Proceedings of the Fourth International Conference on Tools for Teaching Logic (TTL2015), Rennes, France, June 9-12, 2015. Editors: M. Antonia Huertas, João Marcos, María Manzano, Sophie Pinchinat, François Schwarzentrube

    Cognitive linguistics as a methodological paradigm

    A general direction in which cognitive linguistics is heading at the turn of the century is outlined, and a revised understanding of cognitive linguistics as a methodological paradigm is suggested. The goal of cognitive linguistics is defined as understanding what language is and what language does to ensure the predominance of Homo sapiens as a biological species. This makes cognitive linguistics a biologically oriented empirical science.