    Knowledge graph embedding by dynamic translation

    Knowledge graph embedding aims at representing entities and relations in a knowledge graph as dense, low-dimensional, real-valued vectors. It can efficiently measure semantic correlations of entities and relations in knowledge graphs and improve the performance of knowledge acquisition, fusion and inference. Among the embedding models that have appeared in recent years, translation-based models such as TransE, TransH, TransR and TranSparse achieve state-of-the-art performance. However, the translation principle applied in these models is too strict and cannot deal with complex entities and relations very well. In this paper, by introducing parameter vectors into the translation principle, which treats each relation as a translation from the head entity to the tail entity, we propose a novel dynamic translation principle that supports flexible translation between the embeddings of entities and relations. We use this principle to improve the TransE, TransR and TranSparse models and build new models named TransE-DT, TransR-DT and TranSparse-DT, respectively. Experimental results show that our dynamic translation principle achieves great improvement in both the link prediction task and the triple classification task.
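    The strict translation principle that this abstract says the DT models relax is the standard TransE one: a triple (h, r, t) is plausible when the head embedding plus the relation embedding lands near the tail embedding. A minimal sketch of that baseline scoring (the paper's parameter-vector relaxation is not reproduced here; dimensions and values are illustrative):

    ```python
    import numpy as np

    def transe_score(h, r, t, norm=1):
        """Standard TransE translation principle: h + r ≈ t.
        A lower score means the triple is more plausible."""
        return np.linalg.norm(h + r - t, ord=norm)

    rng = np.random.default_rng(0)
    dim = 50
    h, r = rng.normal(size=dim), rng.normal(size=dim)
    t_good = h + r + rng.normal(scale=0.01, size=dim)  # tail close to h + r
    t_bad = rng.normal(size=dim)                       # unrelated tail

    # A near-perfect translation scores far better than a random one.
    assert transe_score(h, r, t_good) < transe_score(h, r, t_bad)
    ```

    Link prediction then ranks all candidate tails by this score; the DT models replace the fixed h + r with a more flexible, parameterized translation.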

    Background Knowledge Based Multi-Stream Neural Network for Text Classification

    As a foundational and typical task in natural language processing, text classification has been widely applied in many fields. However, most existing corpora are imbalanced, which often biases a classifier toward the categories with more texts. In this paper, we propose a background knowledge based multi-stream neural network to make up for the imbalance or insufficient information caused by the limitations of the training corpus. The multi-stream network mainly consists of a basal stream, which retains the original sequence information, and background knowledge based streams. Background knowledge is composed of keywords and co-occurred words extracted from an external corpus. The background knowledge based streams provide supplemental information and reinforce the basal stream. To better fuse the features extracted from the different streams, an early-fusion strategy and two after-fusion strategies are employed. Results on both a Chinese corpus and an English corpus demonstrate that the proposed background knowledge based multi-stream neural network performs well in classification tasks.
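    One common way to realize the fusion step the abstract mentions is to combine the per-stream feature vectors before the classification layer. A minimal sketch of early fusion by concatenation (the stream names, feature sizes, and random features are assumptions for illustration, not the paper's architecture):

    ```python
    import numpy as np

    def early_fusion(streams):
        """Early fusion: concatenate feature vectors from all streams
        into one vector, which a classifier head would then consume."""
        return np.concatenate(streams)

    rng = np.random.default_rng(0)
    basal = rng.random(64)      # features from the original text sequence
    keywords = rng.random(32)   # background-knowledge keyword stream
    cooccur = rng.random(32)    # co-occurred-word stream from external corpus

    fused = early_fusion([basal, keywords, cooccur])
    assert fused.shape == (128,)
    ```

    After-fusion strategies would instead score each stream separately and merge the per-stream outputs (e.g. by averaging logits), trading richer feature interaction for per-stream modularity.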

    FRS: A Simple Knowledge Graph Embedding Model for Entity Prediction

    Abstract: Entity prediction is the task of predicting a missing entity that has a specific relationship with another given entity. Researchers usually use knowledge graph embedding (KGE) methods to embed triples into continuous vectors for computation and then perform entity prediction tasks. However, KGE models tend to use simple operations to refactor entities and relationships, resulting in insufficient interaction among the components of knowledge graphs (KGs) and thus limiting the performance of entity prediction models. In this paper, we propose a new entity prediction model called FRS (Feature Refactoring Scoring) to alleviate the problem of insufficient interaction and to address information incompleteness in KGs. Different from traditional KGE methods that directly use simple operations, the FRS model provides a feature processing procedure for entity prediction tasks, aligning entities and relationships in the same feature space and improving the performance of the entity prediction model. Although FRS is a simple three-layer network, it outperforms state-of-the-art KGC methods on FB15K and WN18. Through extensive experiments on FRS, we discover several insights; for example, the effect of embedding size and negative candidate sampling probability on the experimental results is in reverse.