79 research outputs found

    From compositional to systematic semantics

    We prove a theorem stating that any semantics can be encoded as a compositional semantics, which means that, essentially, the standard definition of compositionality is formally vacuous. We then show that when compositional semantics is required to be "systematic" (that is, the meaning function cannot be arbitrary, but must belong to some class), it becomes possible to distinguish between compositional and non-compositional semantics. As a result, we believe that the paper clarifies the concept of compositionality and opens the possibility of making systematic formal comparisons of different systems of grammars. Comment: 11 pp., LaTeX.
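
    For orientation, here is a minimal sketch of the standard, homomorphism-style statement of compositionality that results like this are usually read against; the exact formalization used in the paper may differ:

    \[
      \mu\bigl(\sigma(e_1,\dots,e_n)\bigr) = r_\sigma\bigl(\mu(e_1),\dots,\mu(e_n)\bigr)
    \]

    Here \mu is the meaning function, \sigma a syntactic rule, and r_\sigma the semantic operation paired with it. On this reading, the vacuity result says that for an arbitrary \mu one can always construct suitable operations r_\sigma; the definition only gains bite once \mu is required to belong to a fixed ("systematic") class.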

    Structural Ambiguity and Conceptual Information Retrieval


    Natural language programming of industrial robots

    In this paper, we introduce a method for using written natural language instructions to program assembly tasks for industrial robots. In our application, we used a state-of-the-art semantic and syntactic parser together with semantically rich world and skill descriptions to create high-level symbolic task sequences. From these sequences, we generated executable code for both virtual and physical robot systems. Our focus lies on the applicability of these methods in an industrial setting with real-time constraints.
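
    As a rough illustration of the pipeline described above, the following sketch grounds a parsed instruction in a toy skill description and emits one symbolic task step. The skill names, the parser stand-in, and the output format are assumptions for illustration, not the authors' system or API.

# A minimal, hypothetical sketch: parse an instruction, ground it against a
# skill description, and emit a symbolic task step that a code generator
# could later turn into robot commands.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class SkillDescription:
    name: str               # e.g. "pick", "place"
    parameters: List[str]   # slots the skill expects, e.g. ["object", "target"]

# Toy "world and skill descriptions" (assumed format).
SKILLS: Dict[str, SkillDescription] = {
    "pick":  SkillDescription("pick",  ["object"]),
    "place": SkillDescription("place", ["object", "target"]),
}

def parse_instruction(text: str) -> Dict[str, str]:
    """Stand-in for the semantic/syntactic parser: extract a verb and its
    arguments from a simple 'verb the X [on the Y]' instruction."""
    tokens = text.lower().rstrip(".").split()
    parsed: Dict[str, str] = {"verb": tokens[0]}
    if "the" in tokens:
        parsed["object"] = tokens[tokens.index("the") + 1]
    if "on" in tokens:
        parsed["target"] = tokens[tokens.index("on") + 2]  # skip "on the"
    return parsed

def to_task_step(parsed: Dict[str, str]) -> str:
    """Ground the parsed instruction in a skill and emit one symbolic step."""
    skill = SKILLS[parsed["verb"]]
    bound = [f"{p}={parsed[p]}" for p in skill.parameters if p in parsed]
    return f'{skill.name}({", ".join(bound)})'

if __name__ == "__main__":
    for instruction in ["Pick the gearwheel.", "Place the gearwheel on the shaft."]:
        print(to_task_step(parse_instruction(instruction)))
    # -> pick(object=gearwheel)
    # -> place(object=gearwheel, target=shaft)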

    Selective Sampling for Example-based Word Sense Disambiguation

    This paper proposes an efficient example sampling method for example-based word sense disambiguation systems. To construct a database of practical size, a considerable overhead for manual sense disambiguation (overhead for supervision) is required. In addition, the time complexity of searching a large database poses a considerable problem (overhead for search). To counter these problems, our method selectively samples a smaller, effective subset from a given example set for use in word sense disambiguation. Our method is characterized by its reliance on the notion of training utility: the degree to which each example is informative for future example sampling when used for training the system. The system progressively collects examples by selecting those with the greatest utility. The paper reports the effectiveness of our method through experiments on about one thousand sentences. Compared with other example sampling methods, ours reduced both the overhead for supervision and the overhead for search without degrading the performance of the system. Comment: 25 pages, 14 PostScript figures.
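
    The selection loop described above can be sketched as follows. The utility measure here (prediction entropy of a toy co-occurrence model) is only a stand-in for the paper's training-utility criterion; the model and all names are illustrative assumptions, not the authors' implementation.

import math
from typing import Dict, List, Tuple

Example = Tuple[List[str], str]   # (context words, sense label)

def train(labeled: List[Example]) -> Dict[str, Dict[str, int]]:
    """Toy sense model: count how often each context word co-occurs with each sense."""
    counts: Dict[str, Dict[str, int]] = {}
    for words, sense in labeled:
        for w in words:
            counts.setdefault(w, {}).setdefault(sense, 0)
            counts[w][sense] += 1
    return counts

def sense_distribution(model: Dict[str, Dict[str, int]],
                       words: List[str], senses: List[str]) -> Dict[str, float]:
    """Vote-based distribution over senses for one example (add-one smoothing)."""
    votes = {s: 1.0 for s in senses}
    for w in words:
        for s, c in model.get(w, {}).items():
            votes[s] += c
    total = sum(votes.values())
    return {s: v / total for s, v in votes.items()}

def utility(model: Dict[str, Dict[str, int]],
            words: List[str], senses: List[str]) -> float:
    """Stand-in for training utility: prediction entropy (higher = more informative)."""
    return -sum(p * math.log(p)
                for p in sense_distribution(model, words, senses).values() if p > 0)

def selective_sampling(pool: List[Example], senses: List[str], budget: int) -> List[Example]:
    """Progressively move the highest-utility example from the pool into the
    labeled set, retraining the model after every selection."""
    labeled: List[Example] = []
    pool = list(pool)
    while pool and len(labeled) < budget:
        model = train(labeled)
        best = max(pool, key=lambda ex: utility(model, ex[0], senses))
        pool.remove(best)
        labeled.append(best)   # in practice, this is where manual sense-tagging happens
    return labeled

if __name__ == "__main__":
    pool = [(["interest", "rate", "bank"], "financial"),
            (["river", "bank", "fishing"], "geographic"),
            (["loan", "bank", "credit"], "financial"),
            (["bank", "shore", "water"], "geographic")]
    chosen = selective_sampling(pool, ["financial", "geographic"], budget=2)
    print([words for words, _ in chosen])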

    Using Case Prototypicality as a Semantic Primitive
