Ontology-based approach to semantically enhanced question answering for closed domain: a review
Abstract: For many users of natural language processing (NLP), it can be challenging to obtain
concise, accurate and precise answers to a question. Systems such as question answering (QA) enable
users to ask questions and receive feedback in the form of quick answers to questions posed in
natural language, rather than in the form of lists of documents delivered by search engines. This
task is challenging and involves complex semantic annotation and knowledge representation. This
study reviews ontology-based methods that semantically enhance QA for a
closed domain, surveying the relevant studies published between 2000 and
2020. The review reports that 83 of the 124 papers considered acknowledge the QA approach
and recommend its development and evaluation using different methods. These methods are evaluated
according to accuracy, precision, and recall. An ontological approach to semantically enhancing QA
is found to be adopted in a limited way, as many of the studies reviewed concentrated instead on
NLP and information retrieval (IR) processing. While the majority of the studies reviewed focus on
open domains, this study investigates the closed domain.
A Survey on Knowledge Graphs: Representation, Acquisition and Applications
Human knowledge provides a formal understanding of the world. Knowledge
graphs that represent structural relations between entities have become an
increasingly popular research direction towards cognition and human-level
intelligence. In this survey, we provide a comprehensive review of knowledge
graphs, covering four research topics: 1) knowledge graph representation
learning, 2) knowledge acquisition and completion, 3) temporal knowledge
graphs, and 4) knowledge-aware applications, and we summarize recent
breakthroughs and prospective directions to facilitate future research. We propose a full-view
categorization and new taxonomies on these topics. Knowledge graph embedding is
organized from four aspects of representation space, scoring function, encoding
models, and auxiliary information. For knowledge acquisition, especially
knowledge graph completion, we review embedding methods, path inference, and
logical rule reasoning. We further explore several emerging topics, including
meta relational learning, commonsense reasoning, and temporal knowledge graphs.
To facilitate future research on knowledge graphs, we also provide a curated
collection of datasets and open-source libraries on different tasks. In the
end, we offer a thorough outlook on several promising research directions.
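The scoring-function aspect of the embedding taxonomy mentioned above can be illustrated with TransE, a classic distance-based scoring function in which a relation is modeled as a translation between entity embeddings. A minimal sketch, assuming toy hand-set vectors rather than learned embeddings:

```python
import numpy as np

def transe_score(head, relation, tail):
    """TransE models a plausible triple (h, r, t) as h + r ≈ t.
    The score is the negative Euclidean distance, so more
    plausible triples receive higher (less negative) scores."""
    return -np.linalg.norm(head + relation - tail)

# Toy 2-d embeddings: the relation vector translates head onto tail exactly.
h = np.array([1.0, 0.0])
r = np.array([0.0, 1.0])
t = np.array([1.0, 1.0])
print(transe_score(h, r, t))            # perfect match: distance 0, score 0.0
print(transe_score(h, r, np.zeros(2)))  # mismatched tail: negative score
```

In practice the embeddings are trained with a margin-based ranking loss so that true triples outscore corrupted ones; other scoring functions in the survey's taxonomy (e.g. bilinear models) differ only in how this score is computed.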
Graph Neural Networks for Natural Language Processing: A Survey
Deep learning has become the dominant approach for a wide variety of tasks
in Natural Language Processing (NLP). Although text inputs are typically
represented as sequences of tokens, there is a rich variety of NLP problems
that are best expressed with a graph structure. As a result, there is a surge
of interest in developing new deep learning techniques on graphs for a large
number of NLP tasks. In this survey, we present a comprehensive overview of
Graph Neural Networks (GNNs) for Natural Language Processing. We propose a new
taxonomy of GNNs for NLP, which systematically organizes existing research on
GNNs for NLP along three axes: graph construction, graph representation
learning, and graph-based encoder-decoder models. We further introduce a large
number of NLP applications that exploit the power of GNNs and summarize the
corresponding benchmark datasets, evaluation metrics, and open-source code.
Finally, we discuss various outstanding challenges for making full use of
GNNs for NLP as well as future research directions. To the best of our
knowledge, this is the first comprehensive overview of Graph Neural Networks
for Natural Language Processing.