Means and covariance functions for geostatistical compositional data: an axiomatic approach
This work focuses on the characterization of the central tendency of a sample
of compositional data. From an axiomatic perspective, it provides new results
on the theoretical properties of means and covariance functions for
compositional data that shed new light on its geostatistical modeling. As a
first result, it is shown
that the weighted arithmetic mean is the only central tendency characteristic
satisfying a small set of axioms, namely continuity, reflexivity and marginal
stability. Moreover, this set of axioms also implies that the weights must be
identical for all parts of the composition. This result has deep consequences
for the spatial multivariate covariance modeling of compositional data. In a
geostatistical setting, it is shown as a second result that the proportional
model of covariance functions (i.e., the product of a covariance matrix and a
single correlation function) is the only model that provides identical kriging
weights for all components of the compositional data. As a consequence of these
two results, the proportional model of covariance functions is the only
covariance model compatible with reflexivity and marginal stability.
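In symbols (the notation here is ours, not the abstract's), the proportional model couples a single spatial correlation function to a constant covariance matrix:

```latex
% Proportional model of covariance functions: one correlation function
% \rho shared by all D parts of the composition (notation assumed).
C(h) = T\,\rho(h), \qquad
T \in \mathbb{R}^{D \times D} \ \text{positive semi-definite}, \qquad
\rho(0) = 1.
```

Because every component shares the same \rho, cokriging any part uses the same weights, which is exactly the property the second result characterises.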
Improved Relation Extraction with Feature-Rich Compositional Embedding Models
Compositional embedding models build a representation (or embedding) for a
linguistic structure based on its component word embeddings. We propose a
Feature-rich Compositional Embedding Model (FCM) for relation extraction that
is expressive, generalizes to new domains, and is easy to implement. The key
idea is to combine (unlexicalized) hand-crafted features with learned word
embeddings. The model directly tackles difficulties faced by
traditional compositional embedding models, such as handling arbitrary types
of sentence annotations and utilizing global information for composition. We
test the proposed model on two relation extraction tasks, and demonstrate that
our model outperforms both previous compositional models and traditional
feature-rich models on the ACE 2005 relation extraction task and the SemEval
2010 relation classification task. The combination of our model and a
log-linear classifier with hand-crafted features gives state-of-the-art
results.
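As a rough sketch of the combination idea (shapes, names, and the scoring form are illustrative, not a reproduction of the paper's exact model), each word contributes the outer product of its hand-crafted feature vector and its embedding, and the summed representation is scored against per-label parameters:

```python
import numpy as np

def fcm_score(feat_vecs, word_embs, label_params):
    """Score one relation label (illustrative shapes, not the paper's exact model).

    feat_vecs:    (n_words, n_feats) binary hand-crafted features per word
    word_embs:    (n_words, d)       pretrained word embeddings
    label_params: (n_feats, d)       learned parameters for this label
    """
    # Sum over words of the outer products f_i (x) e_i, written compactly
    # as one matrix product, then an inner product with the label parameters.
    rep = feat_vecs.T @ word_embs
    return float((rep * label_params).sum())

# Toy usage with random values standing in for real features and embeddings.
rng = np.random.default_rng(0)
score = fcm_score(rng.integers(0, 2, (6, 10)).astype(float),
                  rng.normal(size=(6, 50)),
                  rng.normal(size=(10, 50)))
```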
Compositional Model Repositories via Dynamic Constraint Satisfaction with Order-of-Magnitude Preferences
The predominant knowledge-based approach to automated model construction,
compositional modelling, employs a set of models of particular functional
components. Its inference mechanism takes a scenario describing the constituent
interacting components of a system and translates it into a useful mathematical
model. This paper presents a novel compositional modelling approach aimed at
building model repositories. It furthers the field in two respects. Firstly, it
expands the application domain of compositional modelling to systems that
cannot be easily described in terms of interacting functional components, such as
ecological systems. Secondly, it enables the incorporation of user preferences
into the model selection process. These features are achieved by casting the
compositional modelling problem as an activity-based dynamic preference
constraint satisfaction problem, where the dynamic constraints describe the
restrictions imposed over the composition of partial models and the preferences
correspond to those of the user of the automated modeller. In addition, the
preference levels are represented through the use of symbolic values that
differ in orders of magnitude.
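To make the formulation concrete, here is a toy version of the idea (domains, constraints, and preference levels are all invented for illustration): candidate partial models are CSP variable values, an activity constraint prunes inadmissible combinations, and a preference at a higher order of magnitude dominates any number of lower-level ones:

```python
from itertools import product

# Toy domains: each variable selects a partial model for one phenomenon
# in an ecological scenario.
variables = {
    "growth":    ["logistic", "exponential"],
    "predation": ["holling", "none"],
}
# Preference per choice: (order-of-magnitude level, count at that level).
preference = {
    "growth":    {"logistic": (2, 1), "exponential": (1, 1)},
    "predation": {"holling": (2, 1), "none": (0, 1)},
}
MAX_LEVEL = 2

def consistent(assign):
    # Toy activity constraint: a predation model is only admissible
    # alongside logistic growth.
    return assign["predation"] == "none" or assign["growth"] == "logistic"

def score(assign):
    # Count satisfied preferences per level; lexicographic comparison means
    # one level-2 preference outweighs any number of level-1 preferences.
    counts = [0] * (MAX_LEVEL + 1)
    for var, val in assign.items():
        lvl, cnt = preference[var][val]
        counts[lvl] += cnt
    return tuple(reversed(counts))

candidates = (dict(zip(variables, vals)) for vals in product(*variables.values()))
best = max((a for a in candidates if consistent(a)), key=score)
```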
Semantic Part Segmentation using Compositional Model combining Shape and Appearance
In this paper, we study the problem of semantic part segmentation for
animals. This is more challenging than standard object detection, object
segmentation and pose estimation tasks because semantic parts of animals often
have similar appearance and highly varying shapes. To tackle these challenges,
we build a mixture of compositional models to represent the object boundary and
the boundaries of semantic parts, and we incorporate edge, appearance, and
semantic part cues into the compositional model. Given part-level segmentation
annotation, we develop a novel algorithm to learn a mixture of compositional
models under various poses and viewpoints for certain animal classes.
Furthermore, a linear-complexity algorithm is provided for efficient inference
of the compositional model using dynamic programming. We evaluate our method
for horses and cows on a newly annotated Pascal VOC 2010 dataset with
pixelwise part labels. Experimental results demonstrate the effectiveness of
our method.
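For a flavour of what linear-time inference over such a model can look like (this is a generic Viterbi-style chain DP, not the paper's exact algorithm), a single forward pass with backpointers suffices when part hypotheses form a chain:

```python
import numpy as np

def chain_dp(unary, pairwise):
    """Max-sum dynamic program over a chain (generic Viterbi, illustrative only).

    unary:    (n_parts, n_states) local score for each part hypothesis
    pairwise: (n_states, n_states) compatibility of adjacent parts
    """
    n, k = unary.shape
    score = unary[0].copy()
    back = np.zeros((n, k), dtype=int)
    for t in range(1, n):                  # one linear forward pass
        cand = score[:, None] + pairwise   # best predecessor for each state
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + unary[t]
    path = [int(score.argmax())]           # backtrack the best labelling
    for t in range(n - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1], float(score.max())

rng = np.random.default_rng(0)
labels, total = chain_dp(rng.normal(size=(7, 4)), rng.normal(size=(4, 4)))
```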
Iterated learning and grounding: from holistic to compositional languages
This paper presents a new computational model for studying the origins and evolution of compositional languages grounded through the interaction between agents and their environment. The model is based on previous work on adaptive grounding of lexicons and on the iterated learning model. Although the model is still in a developmental phase, the first results show that a compositional language can emerge in which the structure reflects regularities present in the population's environment.
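Schematically (everything below is a hypothetical toy, not the paper's implementation), the iterated-learning half of such a model is a loop in which each generation's learner induces a lexicon from a limited sample of the previous teacher's utterances:

```python
import random

# Toy meaning space; the paper's agents instead ground meanings through
# interaction with their environment.
meanings = [(shape, color) for shape in ("square", "circle")
            for color in ("red", "green", "blue")]

def invent(meaning):
    # Holistic fallback: an arbitrary signal with no internal structure.
    return "".join(random.choice("ab") for _ in range(4))

def induce(observations):
    # This toy learner merely memorises observed pairs and invents for
    # unseen meanings; a real learner generalises over the sample.
    lexicon = dict(observations)
    return lambda m: lexicon.get(m) or invent(m)

teacher = invent
for generation in range(10):
    sample = random.choices(meanings, k=4)   # transmission bottleneck
    teacher = induce([(m, teacher(m)) for m in sample])
```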
Resolving Lexical Ambiguity in Tensor Regression Models of Meaning
This paper provides a method for improving tensor-based compositional
distributional models of meaning by the addition of an explicit disambiguation
step prior to composition. In contrast with previous research where this
hypothesis has been successfully tested against relatively simple compositional
models, in our work we use a robust model trained with linear regression. The
results of two experiments show the superiority of the prior
disambiguation method and suggest that the effectiveness of this approach is
model-independent.
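A minimal sketch of "disambiguate first, then compose" (dimensions, the sense inventory, and the composition form are assumptions for illustration, with the verb acting as a regression-trained linear map on its argument):

```python
import numpy as np

def pick_sense(context_vec, sense_vecs):
    """Index of the sense prototype closest (by cosine) to the context."""
    sims = (sense_vecs @ context_vec) / (
        np.linalg.norm(sense_vecs, axis=1) * np.linalg.norm(context_vec))
    return int(sims.argmax())

rng = np.random.default_rng(0)
d = 50
noun = rng.normal(size=d)                    # argument-noun vector
context = noun                               # toy context: just the argument
sense_vecs = rng.normal(size=(2, d))         # one prototype per verb sense
verb_matrices = rng.normal(size=(2, d, d))   # regression-trained map per sense

s = pick_sense(context, sense_vecs)          # 1) explicit disambiguation
sentence_vec = verb_matrices[s] @ noun       # 2) then compose linearly
```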
A Generalised Quantifier Theory of Natural Language in Categorical Compositional Distributional Semantics with Bialgebras
Categorical compositional distributional semantics is a model of natural
language; it combines the statistical vector space models of words with the
compositional models of grammar. We formalise in this model the generalised
quantifier theory of natural language, due to Barwise and Cooper. The
underlying setting is a compact closed category with bialgebras. We start from
a generative grammar formalisation and develop an abstract categorical
compositional semantics for it, then instantiate the abstract setting to sets
and relations and to finite dimensional vector spaces and linear maps. We prove
the equivalence of the relational instantiation to the truth theoretic
semantics of generalised quantifiers. The vector space instantiation formalises
the statistical usage of words and enables us, for the first time, to reason
about quantified phrases and sentences compositionally in distributional
semantics.
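For orientation, the truth-theoretic reading that the relational instantiation recovers treats a generalised quantifier as a relation between sets; two standard Barwise-Cooper examples (stated here in textbook form, not quoted from the paper) are:

```latex
% Truth conditions of two generalised quantifiers over sets A, B:
[\![\text{every}]\!](A)(B) = \text{true} \iff A \subseteq B, \qquad
[\![\text{most}]\!](A)(B) = \text{true} \iff |A \cap B| > \tfrac{1}{2}\,|A|.
```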