28 research outputs found

    Evaluating Supervised Semantic Parsing Methods on Application-Independent Data

    No full text

    Towards a Natural Language Driven Automated Help Desk

    No full text

    A fuzzy frame-based knowledge representation formalism

    No full text
    This paper describes a formalism for representing imprecise knowledge that combines traditional frame-based formalisms with fuzzy logic and fuzzy IF-THEN rules. Inference in this formalism is based on unification and the calculus of fuzzy IF-THEN rules, and lends itself to an efficient implementation.
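
    The calculus of fuzzy IF-THEN rules mentioned in the abstract can be illustrated with a minimal sketch (this is a generic Mamdani-style min/max formulation, not the paper's frame-based formalism; the rule names and membership degrees are invented for illustration):

    ```python
    def fuzzy_rule(antecedent_degrees, consequent):
        """One fuzzy IF-THEN rule: the firing strength is the minimum
        of the antecedent membership degrees (fuzzy AND)."""
        return consequent, min(antecedent_degrees)

    def infer(rules):
        """Aggregate fired rules: per consequent, keep the maximum
        firing strength (fuzzy OR across rules)."""
        out = {}
        for consequent, strength in rules:
            out[consequent] = max(out.get(consequent, 0.0), strength)
        return out

    # Hypothetical rule base: membership degrees would come from
    # fuzzy sets attached to frame slots.
    rules = [
        fuzzy_rule([0.8, 0.6], "risk_high"),  # IF temp high AND pressure high
        fuzzy_rule([0.3], "risk_high"),
        fuzzy_rule([0.9], "risk_low"),
    ]
    print(infer(rules))  # {'risk_high': 0.6, 'risk_low': 0.9}
    ```

    The min/max pair is only one choice of t-norm and t-conorm; the paper's formalism may use a different calculus.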

    Finding Optimal 1-Endpoint-Crossing Trees

    No full text

    CLG: A grammar formalism based on constraint resolution

    No full text

    Left Corner Parser for Tree Insertion Grammars

    No full text

    A Graphical Model for Context-Free Grammar Parsing (Compiler Construction)

    No full text
    In the compiler literature, parsing algorithms for context-free grammars are presented using string rewriting systems or abstract machines such as pushdown automata. Unfortunately, the resulting descriptions can be baroque, and even a basic understanding of some parsing algorithms, such as Earley's algorithm for general context-free grammars, can be elusive. In this paper, we present a graphical representation of context-free grammars called the Grammar Flow Graph (GFG) that permits parsing problems to be phrased as path problems in graphs; intuitively, the GFG plays the same role for context-free grammars that nondeterministic finite-state automata play for regular grammars. We show that the GFG permits an elementary treatment of Earley's algorithm that is much easier to understand than previous descriptions of this algorithm. In addition, look-ahead computation can be expressed as a simple inter-procedural dataflow analysis problem, providing an unexpected link between front-end and back-end technologies in compilers. These results suggest that the GFG can be a new foundation for the study of context-free grammars.
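
    For context, a minimal sketch of Earley's algorithm in its standard chart formulation (not the paper's GFG path-problem reformulation; the toy grammar for balanced parentheses is an invented example):

    ```python
    def earley_recognize(grammar, start, tokens):
        """Earley recognizer. grammar: dict nonterminal -> list of
        right-hand-side tuples; terminals are single characters.
        A chart item is (head, body, dot, origin)."""
        n = len(tokens)
        chart = [set() for _ in range(n + 1)]
        for body in grammar[start]:
            chart[0].add((start, body, 0, 0))
        for i in range(n + 1):
            changed = True
            while changed:  # iterate to a fixpoint at position i
                changed = False
                for head, body, dot, origin in list(chart[i]):
                    if dot < len(body):
                        sym = body[dot]
                        if sym in grammar:  # PREDICT
                            for rhs in grammar[sym]:
                                if (sym, rhs, 0, i) not in chart[i]:
                                    chart[i].add((sym, rhs, 0, i))
                                    changed = True
                        elif i < n and tokens[i] == sym:  # SCAN
                            chart[i + 1].add((head, body, dot + 1, origin))
                    else:  # COMPLETE: advance items waiting on `head`
                        for h2, b2, d2, o2 in list(chart[origin]):
                            if d2 < len(b2) and b2[d2] == head:
                                if (h2, b2, d2 + 1, o2) not in chart[i]:
                                    chart[i].add((h2, b2, d2 + 1, o2))
                                    changed = True
        return any(h == start and d == len(b) and o == 0
                   for h, b, d, o in chart[n])

    # Balanced parentheses: S -> ( S ) S | empty
    g = {"S": [("(", "S", ")", "S"), ()]}
    print(earley_recognize(g, "S", "(())"))  # True
    print(earley_recognize(g, "S", "(()"))   # False
    ```

    In the GFG view, the predict/scan/complete steps above become moves along call, scan, and return edges of a single graph.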

    Languages as hyperplanes: grammatical inference with string kernels

    No full text
    Using string kernels, languages can be represented as hyperplanes in a high-dimensional feature space. We present a new family of grammatical inference algorithms based on this idea. We demonstrate that some mildly context-sensitive languages can be represented in this way and that they can be learned efficiently using kernel PCA. We present experiments demonstrating the effectiveness of this approach on standard examples of context-sensitive languages, using small synthetic data sets.
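
    The "languages as hyperplanes" idea can be sketched with the simplest string kernel, the k-spectrum kernel (this is a generic illustration, not the kernels or the kernel-PCA learner of the paper; the a*b* example is invented):

    ```python
    from collections import Counter

    def spectrum_features(s, k=2):
        """Explicit feature map of the k-spectrum string kernel:
        counts of all length-k substrings of s."""
        return Counter(s[i:i + k] for i in range(len(s) - k + 1))

    def spectrum_kernel(s, t, k=2):
        """Kernel value = inner product of the two feature vectors."""
        fs, ft = spectrum_features(s, k), spectrum_features(t, k)
        return sum(fs[u] * ft[u] for u in fs)

    def in_a_star_b_star(s):
        """Over the alphabet {a, b}, the language a*b* is exactly the
        set of strings whose 'ba' coordinate is zero, i.e. the strings
        lying on the hyperplane w . phi(s) = 0, where w is the
        indicator of the 'ba' feature."""
        return spectrum_features(s, 2)["ba"] == 0

    print(spectrum_kernel("abab", "ab"))  # 2  ('ab' occurs twice vs once)
    print(in_a_star_b_star("aabb"))       # True
    print(in_a_star_b_star("abab"))       # False
    ```

    Richer kernels (gap-weighted, subsequence) give richer feature spaces, which is what allows some mildly context-sensitive languages to become linearly separable.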