
    Ranking relations using analogies in biological and information networks

    Analogical reasoning depends fundamentally on the ability to learn and generalize about relations between objects. We develop an approach to relational learning which, given a set of pairs of objects $\mathbf{S}=\{A^{(1)}:B^{(1)}, A^{(2)}:B^{(2)}, \ldots, A^{(N)}:B^{(N)}\}$, measures how well other pairs $A:B$ fit in with the set $\mathbf{S}$. Our work addresses the following question: is the relation between objects A and B analogous to those relations found in $\mathbf{S}$? Such questions are particularly relevant in information retrieval, where an investigator might want to search for analogous pairs of objects that match the query set of interest. There are many ways in which objects can be related, making the task of measuring analogies very challenging. Our approach combines a similarity measure on function spaces with Bayesian analysis to produce a ranking. It requires data containing features of the objects of interest and a link matrix specifying which relationships exist; no further attributes of such relationships are necessary. We illustrate the potential of our method on text analysis and information networks. An application on discovering functional interactions between pairs of proteins is discussed in detail, where we show that our approach can work in practice even if only a small set of protein pairs is provided. Comment: Published at http://dx.doi.org/10.1214/09-AOAS321 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
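    As an illustrative sketch only: the paper's actual model combines a function-space similarity measure with Bayesian analysis, but the ranking idea can be approximated by representing each pair A:B as a relation vector and scoring a candidate pair by its average similarity to the relation vectors of the query set. The difference-vector representation, the cosine scoring, and all function names below are assumptions for illustration, not the authors' method.

```python
import numpy as np

def relation_vector(a_feats, b_feats):
    # Represent the pair A:B by the difference of its feature vectors
    # (a simple illustrative choice, not the paper's Bayesian model).
    return b_feats - a_feats

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def analogy_score(candidate, query_set):
    # Average similarity between the candidate's relation vector and
    # the relation vectors of the pairs in the query set S.
    r = relation_vector(*candidate)
    return np.mean([cosine(r, relation_vector(a, b)) for a, b in query_set])

# Toy usage: rank candidate pairs against a query set S of pairs.
rng = np.random.default_rng(0)
S = [(rng.normal(size=8), rng.normal(size=8)) for _ in range(5)]
candidates = [(rng.normal(size=8), rng.normal(size=8)) for _ in range(3)]
ranked = sorted(candidates, key=lambda p: analogy_score(p, S), reverse=True)
```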

    Learning Analogy-Preserving Sentence Embeddings for Answer Selection

    Answer selection aims at identifying the correct answer for a given question from a set of potentially correct answers. Contrary to previous works, which typically focus on the semantic similarity between a question and its answer, our hypothesis is that question-answer pairs are often in an analogical relation to each other. Using analogical inference as our use case, we propose a framework and a neural network architecture for learning dedicated sentence embeddings that preserve analogical properties in the semantic space. We evaluate the proposed method on benchmark datasets for answer selection and demonstrate that our sentence embeddings indeed capture analogical properties better than conventional embeddings, and that analogy-based question answering outperforms a comparable similarity-based technique. Comment: To appear in CoNLL
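    A minimal sketch of the analogical hypothesis, not the paper's architecture: if correct question-answer pairs share a similar offset in embedding space, a candidate answer can be scored by comparing its offset from the question against a prototype offset computed from known correct pairs. The helper names and the offset formulation below are hypothetical.

```python
import numpy as np

def offset(q_emb, a_emb):
    # Analogical hypothesis: correct QA pairs share a similar offset
    # direction in embedding space (illustrative assumption).
    return q_emb - a_emb

def analogy_select(q_emb, candidate_embs, reference_pairs):
    # Prototype offset averaged over known correct QA pairs.
    proto = np.mean([offset(q, a) for q, a in reference_pairs], axis=0)
    proto /= np.linalg.norm(proto) + 1e-12
    scores = []
    for a_emb in candidate_embs:
        o = offset(q_emb, a_emb)
        scores.append(float(proto @ o / (np.linalg.norm(o) + 1e-12)))
    return int(np.argmax(scores)), scores

# Toy usage with random embeddings standing in for learned ones.
rng = np.random.default_rng(1)
refs = [(rng.normal(size=16), rng.normal(size=16)) for _ in range(4)]
q = rng.normal(size=16)
cands = [rng.normal(size=16) for _ in range(3)]
best, scores = analogy_select(q, cands, refs)
```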

    Search Engines, Social Media, and the Editorial Analogy

    This article deconstructs the “editorial analogy,” and analogical reasoning more generally, in First Amendment litigation involving powerful tech companies.

    Visual Cortex Inspired CNN Model for Feature Construction in Text Analysis

    Recently, biologically inspired models have gradually been proposed to solve problems in text analysis. Convolutional neural networks (CNNs) are hierarchical artificial neural networks composed of multiple stacked layers of perceptron-like units. According to biological research, CNNs can be improved by bringing in the attention modulation and memory processing of the primate visual cortex. In this paper, we employ these properties of the primate visual cortex to improve CNNs and propose a biological-mechanism-driven feature-construction-based answer recommendation method (BMFC-ARM), which recommends the best answer for a given question in community question answering. BMFC-ARM is an improved CNN with four channels, representing questions, answers, asker information, and answerer information respectively, and comprises two stages: biological-mechanism-driven feature construction (BMFC) and answer ranking. BMFC imitates the attention modulation property by introducing the asker and answerer information of given questions and the similarity between them, and imitates the memory processing property by bringing in reputation information for answerers. The feature vector for answer ranking is then constructed by fusing the asker-answerer similarities, the answerer's reputation, and the corresponding vectors of the question, answer, asker, and answerer. Finally, a softmax is applied at the answer ranking stage to select the best answers from the feature vector. Experimental results for answer recommendation on the Stack Exchange dataset show that BMFC-ARM exhibits better performance.
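    A rough sketch of what a four-channel text CNN with fused similarity and reputation features might look like in PyTorch. The layer sizes, pooling choice, and fusion scheme are assumptions made for illustration, not BMFC-ARM's exact design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FourChannelCNN(nn.Module):
    """Illustrative four-channel text CNN in the spirit of BMFC-ARM:
    question, answer, asker, and answerer channels, fused with the
    asker-answerer similarity and the answerer's reputation."""

    def __init__(self, emb_dim=50, n_filters=32, kernel_size=3):
        super().__init__()
        # One 1-D convolution per channel over token embeddings.
        self.convs = nn.ModuleList(
            [nn.Conv1d(emb_dim, n_filters, kernel_size) for _ in range(4)]
        )
        # Fused vector: 4 pooled channels + similarity + reputation scalars.
        self.classifier = nn.Linear(4 * n_filters + 2, 2)

    def forward(self, channels, asker_answerer_sim, answerer_reputation):
        # channels: list of 4 tensors, each (batch, emb_dim, seq_len).
        pooled = [F.relu(conv(x)).max(dim=2).values
                  for conv, x in zip(self.convs, channels)]
        feats = torch.cat(pooled + [asker_answerer_sim, answerer_reputation],
                          dim=1)
        # Softmax over {best answer, not best} at the ranking stage.
        return F.softmax(self.classifier(feats), dim=1)

# Toy usage with random embeddings.
model = FourChannelCNN()
batch, seq_len = 2, 20
chans = [torch.randn(batch, 50, seq_len) for _ in range(4)]
sim = torch.randn(batch, 1)
rep = torch.randn(batch, 1)
probs = model(chans, sim, rep)  # shape: (batch, 2)
```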

    Application of Analogical Reasoning for Use in Visual Knowledge Extraction

    There is a continual push to make Artificial Intelligence (AI) as human-like as possible; however, this is a difficult task because of AI's inability to learn beyond its current comprehension. Analogical reasoning (AR) has been proposed as one method to achieve this goal. The current literature lacks a technical comparison of psychologically inspired and natural-language-processing-based AR algorithms under consistent metrics on multiple-choice word-based analogy problems. Assessment is based on “correctness” and “goodness” metrics, and no one-size-fits-all algorithm emerges for all textual problems. As a contribution to visual AR, a convolutional neural network (CNN) is integrated with the AR vector space model Global Vectors (GloVe) in the proposed Image Recognition Through Analogical Reasoning Algorithm (IRTARA). Given images outside of the CNN's training data, IRTARA produces contextual information by leveraging semantic information from GloVe. IRTARA's quality of results is measured by definition-based, AR-based, and human-factors evaluation methods, which showed consistent results at the extreme ends. The research shows the potential for AR to facilitate a more human-like AI through its ability to understand concepts beyond its foundational knowledge in both textual and visual problem spaces.
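    A minimal sketch of the CNN-plus-GloVe idea, an assumption about the pipeline rather than IRTARA as specified: average the GloVe vectors of the CNN's predicted labels and return the nearest vocabulary terms as contextual information. The function name and embedding table below are hypothetical.

```python
import numpy as np

def contextual_terms(cnn_labels, glove, vocab, top_k=5):
    # Average the GloVe vectors of the CNN's top label predictions,
    # then return the nearest vocabulary terms as contextual info.
    vecs = [glove[w] for w in cnn_labels if w in glove]
    if not vecs:
        return []
    centroid = np.mean(vecs, axis=0)
    centroid = centroid / (np.linalg.norm(centroid) + 1e-12)

    def sim(w):
        v = glove[w]
        return float(centroid @ v / (np.linalg.norm(v) + 1e-12))

    # Exclude the predicted labels themselves from the suggestions.
    pool = [w for w in vocab if w not in set(cnn_labels) and w in glove]
    return sorted(pool, key=sim, reverse=True)[:top_k]

# Toy usage with a stand-in embedding table (real GloVe vectors would
# be loaded from a pretrained file, omitted here).
rng = np.random.default_rng(2)
words = ["dog", "cat", "pet", "car", "leash"]
glove = {w: rng.normal(size=50) for w in words}
print(contextual_terms(["dog", "cat"], glove, words, top_k=2))
```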