466 research outputs found

    A Proof-Theoretic Approach to Scope Ambiguity in Compositional Vector Space Models

    We investigate the extent to which compositional vector space models can be used to account for scope ambiguity in quantified sentences (of the form "Every man loves some woman"). Such sentences, containing two quantifiers, admit two readings: a direct scope reading and an inverse scope reading. This ambiguity has been treated in a vector space model using bialgebras by Hedges and Sadrzadeh (2016) and Sadrzadeh (2016), though without an explanation of the mechanism by which the ambiguity arises. We combine a polarised focussed sequent calculus for the non-associative Lambek calculus NL, as described in Moortgat and Moot (2011), with the vector-based approach to quantifier scope ambiguity. In particular, we establish a procedure for obtaining a vector space model for quantifier scope ambiguity in a derivational way.
    Comment: This is a preprint of a paper to appear in: Journal of Language Modelling, 201

    "Not not bad" is not "bad": A distributional account of negation

    With the increasing empirical success of distributional models of compositional semantics, it is timely to consider the types of textual logic that such models are capable of capturing. In this paper, we address shortcomings in the ability of current models to capture logical operations such as negation. As a solution, we propose a tripartite formulation for a continuous vector space representation of semantics and subsequently use this representation to develop a formal compositional notion of negation within such models.
    Comment: 9 pages, to appear in Proceedings of the 2013 Workshop on Continuous Vector Space Models and their Compositionality
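A minimal numerical sketch of the tripartite idea: a word vector is split into a block carrying topical ("domain") content and a block carrying polarity ("value") content, and negation flips only the value block. The dimensions, values, and the sign-flip operator are assumptions for exposition, not the paper's exact formulation; note that a plain sign flip is involutive, whereas the paper's title point ("not not bad" is not "bad") calls for a subtler double-negation account.

```python
import numpy as np

def negate(v, domain_dim):
    # Flip the "value" block of a vector while preserving its "domain" block.
    # The domain/value split at index `domain_dim` is an illustrative assumption.
    out = v.copy()
    out[domain_dim:] = -out[domain_dim:]
    return out

# "bad": two made-up domain features followed by two made-up value features.
bad = np.array([0.9, 0.1, -0.8, -0.3])

not_bad = negate(bad, domain_dim=2)
not_not_bad = negate(not_bad, domain_dim=2)

# The domain (topic) block is preserved under negation:
assert np.allclose(not_bad[:2], bad[:2])
# A bare sign flip makes double negation the identity; capturing the weaker
# reading of "not not bad" requires a richer operator than this sketch.
assert np.allclose(not_not_bad, bad)
```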

    Many Valued Generalised Quantifiers for Natural Language in the DisCoCat Model

    DisCoCat refers to the categorical compositional distributional model of natural language, which combines the statistical vector space models of words with the compositional logic-based models of grammar. It is fair to say that despite existing work on incorporating notions of entailment, quantification, and coordination in this setting, a uniform modelling of logical operations is still an open problem. In this report, we take a step towards an answer. We show how one can generalise our previous DisCoCat model of generalised quantifiers from the category of sets and relations to the category of sets and many-valued relations. As a result, we get a fuzzy version of these quantifiers. Our aim is to extend this model to all other logical connectives and develop a fuzzy logic for DisCoCat. The main contributions are showing that the category of many-valued relations is compact closed, defining appropriate bialgebra structures over it, and demonstrating how one can compute within this setting many-valued meanings for quantified sentences.
    EPSRC Career Acceleration Fellowship EP/J002607/
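As an illustrative sketch (the toy values and the choice of max-min composition are assumptions for exposition, not the report's construction), a many-valued relation between finite sets can be represented as a matrix with entries in [0, 1], and relations compose by the max-min rule of fuzzy relational algebra:

```python
import numpy as np

def compose(R, S):
    # Max-min composition of many-valued relations R: X -> Y and S: Y -> Z:
    #   (R ; S)(x, z) = max_y min(R(x, y), S(y, z))
    return np.max(np.minimum(R[:, :, None], S[None, :, :]), axis=1)

# Made-up degrees of relatedness over two-element sets.
R = np.array([[0.9, 0.2],
              [0.4, 0.7]])
S = np.array([[0.3, 1.0],
              [0.8, 0.5]])

T = compose(R, S)
# Composition stays within [0, 1], as a many-valued relation must.
assert T.min() >= 0.0 and T.max() <= 1.0
```

With crisp ({0, 1}-valued) matrices the same formula reduces to ordinary relational composition, which is the sense in which the fuzzy model generalises the earlier one over sets and relations.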

    Semantic Composition via Probabilistic Model Theory

    Semantic composition remains an open problem for vector space models of semantics. In this paper, we explain how the probabilistic graphical model used in the framework of Functional Distributional Semantics can be interpreted as a probabilistic version of model theory. Building on this, we explain how various semantic phenomena can be recast in terms of conditional probabilities in the graphical model. This connection between formal semantics and machine learning is helpful in both directions: it gives us an explicit mechanism for modelling context-dependent meanings (a challenge for formal semantics), and also gives us well-motivated techniques for composing distributed representations (a challenge for distributional semantics). We present results on two datasets that go beyond word similarity, showing how these semantically-motivated techniques improve on the performance of vector models.
    Schiff Foundation
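The idea of recasting composition as conditional probability can be sketched in a tiny discrete case. Everything here is invented for exposition (a three-type latent entity space and made-up truth probabilities), not the paper's actual graphical model: observing that one predicate holds updates the distribution over latent entities, and the composed meaning is the expected truth of a second predicate under that posterior.

```python
import numpy as np

# Prior over three discrete latent entity types (made-up values).
p_entity = np.array([0.5, 0.3, 0.2])

# Probability that each predicate is true of each entity type (made-up).
p_red   = np.array([0.9, 0.1, 0.2])
p_fruit = np.array([0.8, 0.7, 0.1])

# P(red | fruit): condition the entity distribution on "fruit" being true,
# then take the expected truth of "red".
posterior = p_entity * p_fruit
posterior = posterior / posterior.sum()
p_red_given_fruit = posterior @ p_red

# Conditioning on "fruit" shifts mass toward entity types where "red" is
# likelier than under the prior alone in this toy setup.
assert 0.0 <= p_red_given_fruit <= 1.0
```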

    Idioms and the syntax/semantics interface of descriptive content vs. reference

    This publication is freely accessible with the permission of the rights owner, due to an Alliance licence and a national licence (funded by the DFG, German Research Foundation) respectively.
    The syntactic literature on idioms contains some proposals that are surprising from a compositional perspective. For example, there are proposals that, in the case of verb-object idioms, the verb combines directly with the noun inside its DP complement, and the determiner is introduced higher up in the syntactic structure, or is late-adjoined. This seems to violate compositionality insofar as it is generally assumed that the semantic role of the determiner is to convert a noun to the appropriate semantic type to serve as the argument to the function denoted by the verb. In this paper, we establish a connection between this line of analysis and lines of work in semantics that have developed outside of the domain of idioms, particularly work on incorporation and work that combines formal and distributional semantic modelling. This semantic work separates the composition of descriptive content from that of discourse-referent-introducing material; our proposal shows that this separation offers a particularly promising way to handle the compositional difficulties posed by idioms, including certain patterns of variation in intervening determiners and modifiers.
    Peer Reviewed

    Quantization, Frobenius and Bialgebras from the Categorical Framework of Quantum Mechanics to Natural Language Semantics

    Compact closed categories, Frobenius algebras, and bialgebras have been applied to model and reason about quantum protocols. The same constructions have also been applied to reason about natural language semantics under the name "categorical distributional compositional" semantics, or, in short, the "DisCoCat" model. This model combines the statistical vector models of word meaning with the compositional models of grammatical structure. It has been applied to natural language tasks such as disambiguation, paraphrasing, and entailment of phrases and sentences. The passage from the grammatical structure to vectors is provided by a functor, similar to the quantization functor of Quantum Field Theory. The original DisCoCat model only used compact closed categories. Later, Frobenius algebras were added to it to model long-distance dependencies such as relative pronouns. Recently, bialgebras have been added to the pack to reason about quantifiers. This paper reviews these constructions and their application to natural language semantics. We go over the theory and present some of the core experimental results.
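The passage from grammar to vectors can be illustrated with the standard DisCoCat example of a transitive sentence: the verb is interpreted as a tensor, and the sentence meaning arises by contracting it with the subject and object vectors. In this sketch the verb is a plain matrix, a deliberate simplification of the full noun-sentence-noun tensor, and all dimensions and values are made up for exposition.

```python
import numpy as np

# Noun vectors in a toy 3-dimensional noun space (made-up values).
mary = np.array([1.0, 0.0, 0.5])
john = np.array([0.2, 1.0, 0.0])

# A transitive verb as an N x N matrix (a rank-reduced stand-in for the
# full N (x) S (x) N tensor of the DisCoCat model).
likes = np.array([[0.8, 0.1, 0.0],
                  [0.0, 0.9, 0.2],
                  [0.3, 0.0, 0.7]])

# Sentence meaning by tensor contraction: subject . V . object.
meaning = mary @ likes @ john

# The contraction is asymmetric in general, so word order matters:
assert not np.isclose(mary @ likes @ john, john @ likes @ mary)
```

With a one-dimensional sentence space, as here, the result is a scalar; richer sentence spaces keep the same contraction pattern but yield a sentence vector.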

    A semantic theory of a subset of qualifying "as" phrases in English

    Landman (1989) introduced contemporary linguistics to the as-phrase. An as-phrase is a qualifier, introduced in English by "as." "John is corrupt as a judge," for instance, contains the as-phrase "as a judge." Philosophical discourse is full of examples of as-phrase sentences. Their presence can make it difficult to distinguish valid from invalid arguments, a perennial concern for philosophers. Landman proposed the first formal semantic theory of as-phrases, based on a set of seven intuitively valid patterns of inference involving as-phrases. Szabó (2003), Jaeger (2003), and Asher (2011) each attempt to improve upon Landman's theory. Chapter 1 reviews and criticizes a temporal account of as-phrase semantics, while tracing some precedents and motivations for my approach. Chapters 2-3 criticize Szabó's and Asher's theories. Szabó's theory shows problems handling the future tense and intensional contexts. Asher's complex theory solves these problems, but resorts to the obscure notions of relative identity and bare particulars. Chapter 4 argues that neither Szabó's nor Asher's theory is clearly superior, because implicitly, they focus on different classes of sentences, which I call "Type A" and "Type B." From John Bowers' syntactic research, I argue that the element common to Type A and Type B is Pr, a predication head pronounced "as" in some contexts. Chapter 5 develops a formal semantic theory tailored to Type A sentences that solves the problems of Szabó's theory while avoiding Asher's assumptions. On my approach, the semantic properties of Type A sentences resolve into an interaction among generic quantifiers, determiner-phrase interpretation, and one core quantifier based on a principal ultrafilter. It is the interaction effects of these elements that give rise to the many unusual readings we find in these as-phrase sentences. This result supports my motivating view that linguistic research helps to solve semantic problems of philosophical interest.

    Towards Explainable and Language-Agnostic LLMs: Symbolic Reverse Engineering of Language at Scale

    Large language models (LLMs) have achieved a milestone that undeniably changed many held beliefs in artificial intelligence (AI). However, there remain many limitations of these LLMs when it comes to true language understanding, limitations that are a byproduct of the underlying architecture of deep neural networks. Moreover, and due to their subsymbolic nature, whatever knowledge these models acquire about how language works will always be buried in billions of microfeatures (weights), none of which is meaningful on its own, making such models hopelessly unexplainable. To address these limitations, we suggest combining the strength of symbolic representations with what we believe to be the key to the success of LLMs, namely a successful bottom-up reverse engineering of language at scale. As such we argue for a bottom-up reverse engineering of language in a symbolic setting. Hints on what this project amounts to have been suggested by several authors, and we discuss in some detail here how this project could be accomplished.
    Comment: Draft, preprint

    Efficiency in ambiguity: two models of probabilistic semantics for natural language

    This paper explores theoretical issues in constructing an adequate probabilistic semantics for natural language. Two approaches are contrasted. The first extends Montague Semantics with a probability distribution over models. It has nice theoretical properties, but does not account for the ubiquitous nature of ambiguity; moreover, inference is NP-hard. An alternative approach is described in which a sequence of pairs of sentences and truth values is generated randomly. By sacrificing some of the nice theoretical properties of the first approach it is possible to model ambiguity naturally; moreover, inference now has polynomial-time complexity. Both approaches provide a compositional semantics and account for the gradience of semantic judgements of belief and inference.
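The first approach, a probability distribution over Montague-style models, can be sketched for a tiny finite case. The domain, the single predicate, and the weights below are invented for illustration; the probability that a sentence is true is simply the total weight of the models that satisfy it.

```python
from itertools import product

# A two-entity domain and a single predicate "tall".
entities = ["a", "b"]

# Enumerate all models: each model fixes the extension of "tall".
models = [frozenset(e for e, bit in zip(entities, bits) if bit)
          for bits in product([0, 1], repeat=len(entities))]

# A made-up probability distribution over the four models.
weight = dict(zip(models, [0.1, 0.3, 0.2, 0.4]))

def prob(sentence):
    # Probability of truth = total weight of the satisfying models.
    return sum(w for m, w in weight.items() if sentence(m))

some_tall = lambda m: len(m) > 0                       # "someone is tall"
every_tall = lambda m: all(e in m for e in entities)   # "everyone is tall"

# Entailment shows up as monotonicity of probability:
assert prob(every_tall) <= prob(some_tall)
```

Inference amounts to summing weights over satisfying models, which is why it becomes intractable as the model space grows exponentially, the NP-hardness the paper points to.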