Better Word Embeddings by Disentangling Contextual n-Gram Information
Pre-trained word vectors are ubiquitous in Natural Language Processing
applications. In this paper, we show that training word embeddings jointly with
bigram and even trigram embeddings results in improved unigram embeddings. We
claim that training word embeddings alongside higher-order n-gram embeddings
helps remove contextual information from the unigrams, resulting in better
stand-alone word embeddings. We empirically show the validity of our
hypothesis by outperforming other competing word representation models by a
significant margin on a wide variety of tasks. We make our models publicly
available.
Comment: NAACL 201
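The joint-training idea the abstract describes can be illustrated by building a single vocabulary in which frequent bigrams receive their own embedding rows alongside the unigrams. This is only a minimal sketch of the vocabulary-construction step, not the paper's actual training procedure; the function name and the `min_bigram_count` threshold are illustrative assumptions.

```python
from collections import Counter

def joint_vocab(sentences, min_bigram_count=2):
    """Collect unigrams plus frequent bigrams as separate vocabulary entries."""
    unigrams = Counter()
    bigrams = Counter()
    for sent in sentences:
        unigrams.update(sent)
        bigrams.update(zip(sent, sent[1:]))
    vocab = list(unigrams)
    # Frequent bigrams become standalone entries; during joint training their
    # embeddings can absorb contextual information, leaving (per the paper's
    # claim) cleaner stand-alone unigram vectors.
    vocab += ["_".join(b) for b, c in bigrams.items() if c >= min_bigram_count]
    return vocab

corpus = [["new", "york", "is", "big"], ["new", "york", "city"]]
print(joint_vocab(corpus))
```

With this corpus, the frequent bigram "new_york" enters the vocabulary as its own unit, while singleton bigrams like "york_is" do not.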
Graph-Embedding Empowered Entity Retrieval
In this research, we improve upon the current state of the art in entity
retrieval by re-ranking the result list using graph embeddings. The paper shows
that graph embeddings are useful for entity-oriented search tasks. We
demonstrate empirically that encoding knowledge-graph information into
(graph) embeddings yields a larger improvement in entity retrieval
effectiveness than plain word embeddings do. We analyze the impact of
the accuracy of the entity linker on the overall retrieval effectiveness. Our
analysis further draws on the cluster hypothesis to explain the observed
advantages of graph embeddings over the more widely used word embeddings for
user tasks involving ranking entities.
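The re-ranking step the abstract describes can be sketched as a linear interpolation between the original retrieval score and a graph-embedding similarity term. The interpolation weight `alpha`, the function names, and the toy embeddings below are illustrative assumptions, not the paper's actual scoring formula.

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def rerank(results, entity_emb, query_emb, alpha=0.5):
    """Re-rank (entity, retrieval_score) pairs by interpolating the original
    score with graph-embedding similarity to the query's linked entity."""
    scored = [
        (ent, alpha * score + (1 - alpha) * cosine(entity_emb[ent], query_emb))
        for ent, score in results
    ]
    return sorted(scored, key=lambda x: x[1], reverse=True)

ranked = rerank([("A", 0.9), ("B", 0.8)],
                {"A": [1.0, 0.0], "B": [0.0, 1.0]},
                query_emb=[0.0, 1.0], alpha=0.3)
print(ranked)
```

In this toy example, entity "B" is initially ranked second but moves to the top because its graph embedding is closest to the query embedding, which is the behavior the abstract attributes to embedding-based re-ranking.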
Accessible points of planar embeddings of tent inverse limit spaces
In this paper we study a class of embeddings of tent inverse limit spaces. We
introduce techniques relying on the Milnor-Thurston kneading theory and use
them to study sets of accessible points and prime ends of given embeddings. We
completely characterize accessible points and prime ends of standard embeddings
arising from the Barge-Martin construction of global attractors. In other
(non-extendable) embeddings we find phenomena that do not occur in the
standard embeddings.
Comment: extended preliminaries on construction of planar embeddings; 58
pages, 23 figures