16,701 research outputs found

    Knowledge Representation with Ontologies: The Present and Future

    Recently, we have seen an explosion of interest in ontologies as artifacts to represent human knowledge and as critical components in knowledge management, the semantic Web, business-to-business applications, and several other application areas. Various research communities commonly assume that ontologies are the appropriate modeling structure for representing knowledge. However, little discussion has occurred regarding the actual range of knowledge an ontology can successfully represent.

    Grounding Dynamic Spatial Relations for Embodied (Robot) Interaction

    This paper presents a computational model of the processing of dynamic spatial relations in an embodied robotic interaction setup. A complete system is introduced that allows autonomous robots to produce and interpret dynamic spatial phrases (in English) given an environment of moving objects. The model unites two separate research strands: computational cognitive semantics, and commonsense spatial representation and reasoning. For the first time, the model demonstrates an integration of these two strands.
    Comment: In: Pham, D.-N. and Park, S.-B., editors, PRICAI 2014: Trends in Artificial Intelligence, volume 8862 of Lecture Notes in Computer Science, pages 958-971. Springer

    Traditional Crafts: Learning by Doing.


    LTLf and LDLf Monitoring: A Technical Report

    Runtime monitoring is one of the central tasks in providing operational decision support to running business processes, checking on the fly whether they comply with constraints and rules. We study runtime monitoring of properties expressed in LTL on finite traces (LTLf) and in its extension LDLf. LDLf is a powerful logic, obtained by combining regular expressions and LTLf under the syntax of propositional dynamic logic (PDL), that captures all of monadic second-order logic on finite traces. Interestingly, in spite of its greater expressivity, LDLf has exactly the same computational complexity as LTLf. We show that LDLf is able to capture, in the logic itself, not only the constraints to be monitored but also the de facto standard RV-LTL monitors. This makes it possible to declaratively capture monitoring metaconstraints and to check them by relying on standard logical services instead of ad hoc algorithms. This, in turn, makes it possible to flexibly monitor constraints depending on the monitoring state of other constraints, e.g., "compensation" constraints that are checked only when others are detected to be violated. In addition, we devise a direct translation of LDLf formulas into nondeterministic automata, avoiding the detour through Büchi or alternating automata, and we use it to implement a monitoring plug-in for the ProM suite.
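    To make the RV-LTL verdicts mentioned above concrete, here is a minimal Python sketch of how the four verdicts can be read off an automaton, assuming the monitored constraint has already been compiled into a deterministic finite automaton. The DFAs response and absence and the helpers reachable and verdict are illustrative inventions for this sketch, not the report's LDLf-to-NFA construction or its ProM plug-in.

from collections import deque

# Minimal sketch of RV-LTL-style prefix monitoring over a DFA.
# Assumption: the constraint has already been compiled into a DFA; this is
# not the report's LDLf-to-NFA construction, only an illustration of how the
# four verdicts can be read off an automaton state.

def reachable(transitions, start):
    """All states reachable from `start` (including `start`) via any events."""
    seen = {start}
    queue = deque([start])
    while queue:
        s = queue.popleft()
        for (src, _event), dst in transitions.items():
            if src == s and dst not in seen:
                seen.add(dst)
                queue.append(dst)
    return seen

def verdict(dfa, trace):
    """Return the RV-LTL verdict after reading `trace` (a list of events)."""
    state = dfa["initial"]
    for event in trace:
        state = dfa["transitions"][(state, event)]
    accepting = dfa["accepting"]
    future = reachable(dfa["transitions"], state)
    if state in accepting:
        # Satisfied now; permanent only if no continuation can violate it.
        return "permanently satisfied" if future <= accepting else "temporarily satisfied"
    # Violated now; permanent only if no continuation can satisfy it again.
    return "temporarily violated" if future & accepting else "permanently violated"

# Example 1: response(a, b) -- every 'a' must eventually be followed by 'b'.
response = {
    "initial": 0,
    "accepting": {0},          # 0 = no pending 'a', 1 = pending 'a'
    "transitions": {
        (0, "a"): 1, (0, "b"): 0, (0, "c"): 0,
        (1, "a"): 1, (1, "b"): 0, (1, "c"): 1,
    },
}

# Example 2: absence(c) -- 'c' must never occur.
absence = {
    "initial": 0,
    "accepting": {0},          # 1 is a trap state entered on 'c'
    "transitions": {
        (0, "a"): 0, (0, "b"): 0, (0, "c"): 1,
        (1, "a"): 1, (1, "b"): 1, (1, "c"): 1,
    },
}

print(verdict(response, ["a"]))        # temporarily violated (a 'b' may still come)
print(verdict(response, ["a", "b"]))   # temporarily satisfied (a new 'a' may still violate)
print(verdict(absence, ["a", "b"]))    # temporarily satisfied
print(verdict(absence, ["c"]))         # permanently violated (no continuation can repair it)

    Reading verdicts this way is also what a monitoring metaconstraint can observe: a "compensation" constraint, for example, only needs to be checked once another constraint's verdict becomes "permanently violated".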

    What words mean and express: semantics and pragmatics of kind terms and verbs

    Get PDF
    For many years, it has been common ground in semantics and in the philosophy of language that semantics is in the business of providing a full explanation of how propositional meanings are obtained. This orthodox picture seems to be in trouble these days, as an increasing number of authors now hold that semantics does not deal with thought contents. Some of these authors have embraced a “thin meanings” view, according to which lexical meanings are too schematic to enter propositional contents. I will suggest that it is plausible to adopt thin semantics for a class of words. However, I will also hold that some classes of words, such as kind terms, plausibly have richer lexical meanings, and thus that an adequate theory of word meaning may have to combine thin and rich semantics.