
    Learning probability distributions generated by finite-state machines

    We review methods for inferring probability distributions generated by probabilistic automata and related models of sequence generation. We focus on methods with provable learning guarantees in the inference-in-the-limit and PAC formal models. The methods we review are state-merging and state-splitting methods for probabilistic deterministic automata and the recently developed spectral method for nondeterministic probabilistic automata. In both cases, we derive them from a high-level algorithm described in terms of the Hankel matrix of the distribution to be learned, given as an oracle, and then describe how to adapt that algorithm to account for the error introduced by a finite sample.
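
    As a rough illustration of the Hankel-matrix view (a minimal Python sketch with toy data and hypothetical prefix/suffix sets, not the authors' implementation): the entry H(u, v) is the probability of the string uv, the matrix is estimated from a finite sample, and its numerical rank, exposed by a truncated SVD, bounds the number of states of an automaton generating the distribution.

        # Sketch only: empirical Hankel block over chosen prefixes/suffixes, followed by an SVD.
        # Spectral methods recover weighted-automaton operators from such a factorization.
        from collections import Counter
        import numpy as np

        def empirical_hankel(sample, prefixes, suffixes):
            """H[i, j] ~ P(prefixes[i] + suffixes[j]), estimated from string frequencies."""
            counts, n = Counter(sample), len(sample)
            return np.array([[counts[u + v] / n for v in suffixes] for u in prefixes])

        sample = ["ab", "ab", "a", "abb", "b", "ab"]          # toy data
        H = empirical_hankel(sample, ["", "a", "ab"], ["", "b", "bb"])
        U, s, Vt = np.linalg.svd(H)                           # numerical rank ~ number of states
        print(np.round(s, 3))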

    Complexity of Grammar Induction for Quantum Types

    Most categorical models of meaning use a functor from the syntactic category to the semantic category. When semantic information is available, the problem of grammar induction can therefore be defined as finding preimages of the semantic types under this forgetful functor, lifting the information flow from the semantic level to a valid reduction at the syntactic level. We study the complexity of grammar induction, and show that for a variety of type systems, including pivotal and compact closed categories, the grammar induction problem is NP-complete. Our approach could be extended to linguistic type systems such as autonomous or bi-closed categories. Comment: In Proceedings QPL 2014, arXiv:1412.810
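
    To make the induction problem concrete (a toy Python sketch with a hypothetical two-word lexicon, not the paper's construction): each word receives a set of candidate syntactic types, and induction must pick one type per word so that the concatenated types reduce to the sentence type, which already invites a combinatorial search over assignments.

        # Toy pregroup-style reduction: simple types are (base, adjoint_order) pairs and an
        # adjacent pair (a, n)(a, n+1) contracts to the empty sequence.  The greedy stack
        # check below is sufficient for this small example.
        from itertools import product

        def reduces_to_s(types):
            stack = []
            for t in types:
                if stack and stack[-1][0] == t[0] and stack[-1][1] + 1 == t[1]:
                    stack.pop()                    # contraction a^(n) a^(n+1) -> 1
                else:
                    stack.append(t)
            return stack == [("s", 0)]

        # Hypothetical candidate type assignments per word; induction = finding a choice
        # whose concatenation reduces to the sentence type s.
        lexicon = {
            "Alice":  [[("n", 0)]],
            "sleeps": [[("n", 1), ("s", 0)],
                       [("n", 1), ("s", 0), ("n", -1)]],
        }
        sentence = ["Alice", "sleeps"]
        for choice in product(*(lexicon[w] for w in sentence)):
            flat = [t for word_types in choice for t in word_types]
            print(choice, "->", reduces_to_s(flat))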

    Calibrating Generative Models: The Probabilistic Chomsky-Schützenberger Hierarchy

    A probabilistic Chomsky–Schützenberger hierarchy of grammars is introduced and studied, with the aim of understanding the expressive power of generative models. We offer characterizations of the distributions definable at each level of the hierarchy, including probabilistic regular, context-free, (linear) indexed, context-sensitive, and unrestricted grammars, each corresponding to familiar probabilistic machine classes. Special attention is given to distributions on (unary notations for) positive integers. Unlike in the classical case, where the "semi-linear" languages all collapse into the regular languages, we show, using analytic tools adapted from the classical setting, that there is no collapse in the probabilistic hierarchy: more distributions become definable at each level. We also address related issues such as closure under probabilistic conditioning.
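
    A standard example of the kind of separation at issue (an illustration with grammars of my own choosing, not an excerpt from the paper): over a unary alphabet, a probabilistic regular grammar can define the geometric distribution, whose probability generating function is rational, while a simple probabilistic context-free grammar already yields an algebraic, non-rational generating function.

        P(\mathbf{1}^{n}) = (1-q)\,q^{n}, \qquad
        \sum_{n \ge 0} P(\mathbf{1}^{n})\,x^{n} = \frac{1-q}{1-qx} \quad \text{(rational)};

        S \xrightarrow{\,p\,} S\,S, \quad S \xrightarrow{\,1-p\,} \mathbf{1}, \quad p \le \tfrac{1}{2}:
        \qquad f(x) = p\,f(x)^{2} + (1-p)\,x
        \;\Longrightarrow\;
        f(x) = \frac{1-\sqrt{1-4p(1-p)\,x}}{2p},

    which is algebraic but not rational, so the corresponding distribution on positive integers is context-free-definable but not regular-definable.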

    Connectionist natural language parsing

    The key developments of two decades of connectionist parsing are reviewed. Connectionist parsers are assessed according to their ability to learn to represent syntactic structures from examples automatically, without being presented with symbolic grammar rules. This review also considers the extent to which connectionist parsers offer computational models of human sentence processing and provide plausible accounts of psycholinguistic data. In considering these issues, special attention is paid to the level of realism, the nature of the modularity, and the type of processing to be found in a wide range of parsers.

    A 4-valued logic of strong conditional

    How can one say no less and no more about the conditional than what is needed? From a logical analysis of necessary and sufficient conditions (Section 1), we argue that a stronger account of the conditional can be obtained in two steps: firstly, by recalling its historical roots in modal logic and set theory (Section 2); secondly, by revising the meaning of logical values, thereby getting rid of the paradoxes of material implication whilst showing the bivalent roots of the conditional as a speech act based on affirmations and rejections (Section 3). Finally, the two main inference rules for the conditional, viz. Modus Ponens and Modus Tollens, are reassessed through a broader definition of logical consequence that encompasses both a normal relation of truth propagation and a weaker relation of falsity non-propagation from premises to conclusion (Section 3).
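
    For orientation (a standard textbook fact, not a claim taken from the paper), the paradoxes of material implication referred to above include the classical validities

        \neg p \vDash p \to q, \qquad q \vDash p \to q,

    i.e. a false antecedent or a true consequent alone already makes the material conditional true; a strong conditional is expected to block such inferences while still supporting Modus Ponens and Modus Tollens.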

    Empirical Risk Minimization with Approximations of Probabilistic Grammars

    Probabilistic grammars are generative statistical models that are useful for modeling compositional and sequential structures. We present a framework, reminiscent of structural risk minimization, for empirical risk minimization of the parameters of a fixed probabilistic grammar using the log-loss. We derive sample complexity bounds in this framework that apply to both the supervised and the unsupervised settings.
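
    In standard notation (a generic statement of log-loss ERM, not a formula quoted from the paper), the estimator studied in such a framework is

        \hat{\theta} \in \operatorname*{arg\,min}_{\theta \in \Theta} \; \frac{1}{n} \sum_{i=1}^{n} -\log p_{\theta}(x_i),

    where in the supervised setting each x_i is an observed derivation and in the unsupervised setting p_\theta(x_i) is obtained by marginalizing over all derivations of the observed string.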

    The Probable and the Provable. By L. Jonathan Cohen
