On Context Shifters and Compositionality in Natural Languages
My modest aim in this paper is to prove certain relations between a type of hyper-intensional operator, namely context-shifting operators, and compositionality in natural languages. Various authors (e.g. von Fintel & Matthewson 2008; Stalnaker 2014) have argued that context-shifting operators are incompatible with compositionality. In fact, some of them understand Kaplan’s (1989) famous ban on context-shifting operators as a constraint on compositionality. Others (e.g. Rabern 2013) take context-shifting operators to be compatible with compositionality but, unfortunately, do not provide a proof or an argument in favor of their position. The aim of this paper is to do precisely that. Additionally, I provide a new proof that compositionality for propositional content (intension) is a proper generalization of compositionality for character (hyper-intension).
A Study of Metrics of Distance and Correlation Between Ranked Lists for Compositionality Detection
Compositionality in language refers to how much the meaning of some phrase
can be decomposed into the meaning of its constituents and the way these
constituents are combined. Based on the premise that substitution by synonyms
is meaning-preserving, compositionality can be approximated as the semantic
similarity between a phrase and a version of that phrase where words have been
replaced by their synonyms. Different ways of representing such phrases exist
(e.g., vectors [1] or language models [2]), and the choice of representation
affects the measurement of semantic similarity.
We propose a new compositionality detection method that represents phrases as
ranked lists of term weights. Our method approximates the semantic similarity
between two ranked list representations using a range of well-known distance
and correlation metrics. In contrast to most state-of-the-art approaches in
compositionality detection, our method is completely unsupervised. Experiments
with a publicly available dataset of 1048 human-annotated phrases show that,
compared to strong supervised baselines, our approach provides superior
measurement of compositionality using any of the distance and correlation
metrics considered.
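The ranked-list idea can be sketched in a few lines. The term weights below are made up for illustration, and Spearman's rho stands in for the range of distance and correlation metrics the paper compares; the point is only how two ranked lists of term weights are turned into a similarity score:

```python
def ranks(weights):
    # rank positions by weight (1 = highest); tie handling omitted for brevity
    order = sorted(range(len(weights)), key=lambda i: -weights[i])
    r = [0] * len(weights)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(xs, ys):
    # Spearman's rho = Pearson correlation of the rank vectors
    rx, ry = ranks(xs), ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical weights over a shared vocabulary of context terms, for a
# phrase and its synonym-substituted variant (e.g. "red tape" / "red strip"):
phrase  = [0.9, 0.7, 0.4, 0.1]
variant = [0.2, 0.1, 0.8, 0.9]
rho = spearman(phrase, variant)   # ≈ -0.8: strong rank disagreement
```

A low or negative correlation between the two representations signals that substitution was not meaning-preserving, i.e. the phrase is a candidate for non-compositionality.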
A probabilistic framework for analysing the compositionality of conceptual combinations
Conceptual combination performs a fundamental role in creating the broad
range of compound phrases utilised in everyday language. This article provides
a novel probabilistic framework for assessing whether the semantics of conceptual
combinations are compositional, and so can be considered as a function of
the semantics of the constituent concepts, or not. While the systematicity and
productivity of language provide a strong argument in favor of assuming compositionality,
this very assumption is still regularly questioned in both cognitive
science and philosophy. Additionally, the principle of semantic compositionality
is underspecified, which means that notions of both "strong" and "weak"
compositionality appear in the literature. Rather than adjudicating between
different grades of compositionality, the framework presented here contributes
formal methods for determining a clear dividing line between compositional and
non-compositional semantics. In addition, we suggest that the distinction between
these is contextually sensitive. Compositionality is equated with the existence of a joint probability distribution modeling how the constituent concepts in the combination
are interpreted. Marginal selectivity is introduced as a pivotal probabilistic
constraint for the application of the Bell/CH and CHSH systems of inequalities.
Non-compositionality is equated with a failure of marginal selectivity, or violation
of either system of inequalities in the presence of marginal selectivity. This
means that the conceptual combination cannot be modeled in a joint probability
distribution, the variables of which correspond to how the constituent concepts
are being interpreted. The formal analysis methods are demonstrated by applying
them to an empirical illustration of twenty-four non-lexicalised conceptual
combinations.
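The probabilistic test can be sketched concretely. The interpretation probabilities below are hypothetical (a toy stand-in for the paper's empirical data), with A/A' two ways of probing how the first concept is interpreted and B/B' two probes for the second; the sketch checks marginal selectivity and then the CHSH quantity, for which |S| ≤ 2 is necessary if a single joint distribution exists:

```python
def expectation(p):
    # E[XY] for a joint distribution over outcomes in {+1, -1}^2,
    # given as {(x, y): probability}
    return sum(x * y * q for (x, y), q in p.items())

def marginal(p, axis):
    # marginal distribution of one variable (axis 0 or 1)
    m = {+1: 0.0, -1: 0.0}
    for outcome, q in p.items():
        m[outcome[axis]] += q
    return m

# Hypothetical joint interpretation probabilities for the four pairings:
p_AB   = {(+1, +1): 0.4, (+1, -1): 0.1, (-1, +1): 0.1, (-1, -1): 0.4}
p_ABp  = {(+1, +1): 0.4, (+1, -1): 0.1, (-1, +1): 0.1, (-1, -1): 0.4}
p_ApB  = {(+1, +1): 0.4, (+1, -1): 0.1, (-1, +1): 0.1, (-1, -1): 0.4}
p_ApBp = {(+1, +1): 0.1, (+1, -1): 0.4, (-1, +1): 0.4, (-1, -1): 0.1}

# Marginal selectivity: how A is distributed must not depend on whether
# it is paired with B or B' (and symmetrically for B).
assert marginal(p_AB, 0) == marginal(p_ABp, 0)
assert marginal(p_AB, 1) == marginal(p_ApB, 1)

# CHSH quantity: any joint distribution over A, A', B, B' forces |S| <= 2.
S = (expectation(p_AB) + expectation(p_ABp)
     + expectation(p_ApB) - expectation(p_ApBp))   # here S ≈ 2.4 > 2
```

Since marginal selectivity holds but the inequality is violated, this toy combination would be classified as non-compositional under the framework's criterion: no joint distribution over the interpretation variables can reproduce these pairwise statistics.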
Compositionality for Quantitative Specifications
We provide a framework for compositional and iterative design and
verification of systems with quantitative information, such as rewards, time or
energy. It is based on disjunctive modal transition systems where we allow
actions to bear various types of quantitative information. Throughout the
design process the actions can be further refined and the information made more
precise. We show how to compute the results of standard operations on the
systems, including the quotient (residual), which has not been previously
considered for quantitative non-deterministic systems. Our quantitative
framework has close connections to the modal nu-calculus and is compositional
with respect to general notions of distances between systems and the standard
operations.
The myth of occurrence-based semantics
The principle of compositionality requires that the meaning of a complex expression remains the same after substitution of synonymous expressions. Alleged counterexamples to compositionality seem to force a theoretical choice: either apparent synonyms are not synonyms or synonyms do not syntactically occur where they appear to occur. Some theorists have instead looked to Frege’s doctrine of “reference shift”, according to which the meaning of an expression is sensitive to its linguistic context. This doctrine is alleged to retain the relevant claims about synonymy and substitution while respecting the compositionality principle. Thus, Salmon (2006) and Glanzberg and King (2020) offer occurrence-based accounts of variable binding, and Pagin and Westerståhl (2010c) argue that an occurrence-based semantics delivers a compositional account of quotation. Our thesis is this: the occurrence-based strategies resolve the apparent failures of substitutivity in the same general way as the standard expression-based semantics do. So it is a myth that a Frege-inspired occurrence-based semantics affords a genuine alternative strategy.
From compositional to systematic semantics
We prove a theorem stating that any semantics can be encoded as a
compositional semantics, which means that, essentially, the standard definition
of compositionality is formally vacuous. We then show that when compositional
semantics is required to be "systematic" (that is, the meaning function cannot
be arbitrary, but must belong to some class), it is possible to distinguish
between compositional and non-compositional semantics. As a result, we believe
that the paper clarifies the concept of compositionality and opens a
possibility of making systematic formal comparisons of different systems of
grammars.
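The flavour of the encoding trick can be shown in a few lines of Python (a hypothetical mini-grammar, not the paper's actual construction): if semantic values are allowed to be rich enough — here, simply the expressions themselves — then any meaning assignment whatsoever, however arbitrary, can be read off a formally compositional semantics:

```python
# An arbitrary, idiomatic-looking meaning assignment (hypothetical):
mu = {
    "red": "COLOR",
    "tape": "STRIP",
    "red tape": "BUREAUCRACY",   # not a function of the parts' meanings
}

def meaning(expr):
    # "Compositional" encoding: the semantic value of an expression is
    # the expression itself.
    return expr

def combine(m1, m2):
    # The semantic operation paired with syntactic concatenation simply
    # mirrors the syntax.
    return m1 + " " + m2

# The meaning of "red tape" is determined by the meanings of its parts
# and the combination rule, so the semantics is formally compositional:
assert combine(meaning("red"), meaning("tape")) == meaning("red tape")

# ...and yet it encodes the arbitrary mu, which can be applied at the top:
assert mu[meaning("red tape")] == "BUREAUCRACY"
```

This is why an unrestricted compositionality requirement is vacuous: the constraint only bites once the class of admissible meaning functions is restricted, which is what the paper's "systematicity" condition does.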
Teaching Compositionality to CNNs
Convolutional neural networks (CNNs) have shown great success in computer
vision, approaching human-level performance when trained for specific tasks via
application-specific loss functions. In this paper, we propose a method for
augmenting and training CNNs so that their learned features are compositional.
It encourages networks to form representations that disentangle objects from
their surroundings and from each other, thereby promoting better
generalization. Our method is agnostic to the specific details of the
underlying CNN to which it is applied and can in principle be used with any
CNN. As we show in our experiments, the learned representations lead to feature
activations that are more localized and improve performance over
non-compositional baselines in object recognition tasks.
Comment: Preprint appearing in CVPR 201
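One common way to phrase such a compositionality objective — shown here as a minimal 1-D sketch with a toy feature function, not the paper's actual loss or architecture — is to penalize the gap between "features of the masked input" and "masked features of the full input", f(m ⊙ x) ≈ m ⊙ f(x):

```python
def features(x):
    # stand-in "feature extractor" that mixes neighbouring activations,
    # the way a convolution mixes neighbouring pixels
    n = len(x)
    return [x[i] + 0.5 * x[(i + 1) % n] for i in range(n)]

def mask_apply(mask, x):
    # elementwise mask: keep the object, zero its surroundings
    return [m * v for m, v in zip(mask, x)]

def comp_loss(x, mask):
    # squared error between features-of-masked-input and
    # masked-features-of-input; zero iff the two commute
    a = features(mask_apply(mask, x))
    b = mask_apply(mask, features(x))
    return sum((p - q) ** 2 for p, q in zip(a, b))

x = [1.0, 2.0, 3.0, 4.0]   # a tiny 1-D "image"
mask = [1, 1, 0, 0]        # the "object" occupies the first two positions
loss = comp_loss(x, mask)  # > 0: features leak across the mask boundary
```

A perfectly disentangled extractor would make this loss zero; because the toy `features` mixes neighbours across the mask boundary, the loss is positive, which is exactly the entanglement such a training objective drives down.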