Logical Aspects of Computational Linguistics. Celebrating 20 Years of LACL (1996–2016): 9th International Conference, LACL 2016, Nancy, France, December 5-7, 2016, Proceedings
Edited under the auspices of the Association for Logic, Language and Information (FoLLI), this book constitutes the refereed proceedings of the 9th International Conference on Logical Aspects of Computational Linguistics, LACL 2016, held at LORIA, Nancy, France, in December 2016, celebrating the 20th anniversary of the conference series. The 19 contributed papers, presented together with 4 invited papers and 6 abstracts, were carefully reviewed and selected from 38 submissions. The focus of the conference is the use of type-theoretic, proof-theoretic, and model-theoretic methods for describing and formalising natural language syntax, semantics, and pragmatics, as well as the implementation of the corresponding tools.
Explorations in Subexponential Non-associative Non-commutative Linear Logic
In a previous work we introduced a non-associative non-commutative logic extended by multimodalities, called subexponentials, licensing local application of structural rules. Here, we further explore this system, exhibiting a one-sided multi-succedent classical analogue of our intuitionistic system, following the exponential-free calculi of Buszkowski and of de Groote and Lamarche. A large fragment of the intuitionistic calculus is shown to embed faithfully into the classical fragment.
Classical Copying versus Quantum Entanglement in Natural Language: The Case of VP-ellipsis
In Proceedings CAPNS 2018, arXiv:1811.02701
Explorations in Subexponential non-associative non-commutative Linear Logic
In a previous work we introduced a non-associative non-commutative logic
extended by multimodalities, called subexponentials, licensing local
application of structural rules. Here, we further explore this system,
considering a one-sided multi-succedent classical version of the system,
following the exponential-free calculi of Buszkowski and of de Groote and
Lamarche, and show that the intuitionistic calculus embeds faithfully into
the classical fragment.
Involutive Commutative Residuated Lattice without Unit: Logics and Decidability
We investigate involutive commutative residuated lattices without unit, which
are commutative residuated lattice-ordered semigroups enriched with a unary
involutive negation operator. The logic of these structures is discussed and a
Gentzen-style sequent calculus for it is presented. Moreover, we prove the
decidability of this logic.
A Compositional Vector Space Model of Ellipsis and Anaphora.
PhD Thesis
This thesis discusses research in compositional distributional semantics: if words
are defined by their use in language and represented as high-dimensional vectors
reflecting their co-occurrence behaviour in textual corpora, how should words be
composed to produce a similar numerical representation for sentences, paragraphs
and documents? Neural methods learn a task-dependent composition by generalising
over large datasets, whereas type-driven approaches stipulate that composition
is given by a functional view on words, leaving open the question of what those
functions should do, concretely.
We take on the type-driven approach to compositional distributional semantics
and focus on the categorical framework of Coecke, Grefenstette, and Sadrzadeh
[CGS13], which models composition as an interpretation of syntactic structures as
linear maps on vector spaces using the language of category theory, as well as the
two-step approach of Muskens and Sadrzadeh [MS16], where syntactic structures
map to lambda logical forms that are instantiated by a concrete composition model.
We develop the theory behind these approaches to cover phenomena not dealt with
in previous work, evaluate the models in sentence-level tasks, and implement a tensor
learning method that generalises to arbitrary sentences.
This thesis reports three main contributions. The first, theoretical in nature, discusses
the ability of categorical and lambda-based models of compositional distributional
semantics to model ellipsis, anaphora, and parasitic gaps: phenomena that
challenge the linearity of previous compositional models. Secondly, we perform an
evaluation study on verb phrase ellipsis where we introduce three novel sentence
evaluation datasets and compare algebraic, neural, and tensor-based composition
models, showing that models that resolve ellipsis correlate better with human judgements.
Finally, we generalise the skipgram model [Mik+13] to a tensor-based setting
and implement it for transitive verbs, showing that neural methods to learn tensor
representations for words can outperform previous tensor-based methods on compositional
tasks.
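The tensor-based composition the abstract describes can be sketched in a few lines: a transitive verb is treated as an order-3 tensor, i.e. a bilinear map sending a subject vector and an object vector to a sentence vector. The dimensions and random vectors below are illustrative assumptions for a toy example, not data or code from the thesis.

```python
import numpy as np

# Toy sketch of tensor-based composition for a transitive sentence,
# in the spirit of categorical compositional distributional semantics.
# All vectors here are hypothetical random data, purely for illustration.

d = 4                                      # toy dimension of the noun space
rng = np.random.default_rng(0)

# Nouns are vectors; a transitive verb is an order-3 tensor whose axes are
# (sentence, subject, object).
subject = rng.standard_normal(d)
obj     = rng.standard_normal(d)
verb    = rng.standard_normal((d, d, d))

# Composition = contracting the verb tensor with subject and object vectors:
#   sentence[i] = sum_{j,k} verb[i, j, k] * subject[j] * obj[k]
sentence = np.einsum('ijk,j,k->i', verb, subject, obj)

print(sentence.shape)   # (4,) — a sentence vector in the same space as nouns
```

The contraction makes the verb a linear map in each argument, which is exactly the linearity that phenomena such as ellipsis and parasitic gaps (where an argument is reused) put under pressure.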