Storage of Natural Language Sentences in a Hopfield Network
This paper looks at how the Hopfield neural network can be used to store and
recall patterns constructed from natural language sentences. As a pattern
recognition and storage tool, the Hopfield neural network has received much
attention. This attention, however, has come mainly from the field of
statistical physics, due to the model's simple abstraction of spin glass systems. A
discussion is made of the differences, shown as bias and correlation, between
natural language sentence patterns and the randomly generated ones used in
previous experiments. Results are given for numerical simulations which show
the auto-associative competence of the network when trained with natural
language patterns.Comment: latex, 10 pages with 2 tex figures and a .bib file, uses nemlap.sty,
to appear in Proceedings of NeMLaP-
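The storage-and-recall behaviour the abstract describes can be sketched as a minimal Hopfield auto-associator with Hebbian training. The bipolar encoding and pattern sizes below are illustrative assumptions, not the paper's actual sentence-to-pattern construction:

```python
import numpy as np

def train_hopfield(patterns):
    # Hebbian learning: sum of outer products of bipolar (+1/-1) patterns
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W

def recall(W, state, max_steps=10):
    # Synchronous updates until the state stops changing (a fixed point)
    for _ in range(max_steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state
```

With a stored pattern standing in for an encoded sentence, recall from a corrupted version (one flipped unit) converges back to the original, which is the auto-associative competence the simulations measure.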
Knowledge Graph semantic enhancement of input data for improving AI
Intelligent systems designed using machine learning algorithms require a
large amount of labeled data. Background knowledge provides complementary, real
world factual information that can augment the limited labeled data to train a
machine learning algorithm. The term Knowledge Graph (KG) is in vogue because,
for many practical applications, it is convenient and useful to organize this
background knowledge in the form of a graph. Recent academic research and
implemented industrial intelligent systems have shown promising performance for
machine learning algorithms that combine training data with a knowledge graph.
In this article, we discuss the use of relevant KGs to enhance input data for
two applications that use machine learning -- recommendation and community
detection. The KG improves both accuracy and explainability
- …
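One way KG enhancement of input data can work for recommendation is to expand each item's features with its KG neighbours before computing similarities. The toy graph and entity names below are purely illustrative assumptions, not taken from the article:

```python
# Hypothetical toy KG: each item maps to a set of related entities.
toy_kg = {
    "Inception": {"sci-fi", "Christopher Nolan"},
    "Interstellar": {"sci-fi", "Christopher Nolan", "space"},
    "The Prestige": {"Christopher Nolan", "drama"},
}

def enrich(item, kg):
    # KG-enhanced feature set: the item itself plus its graph neighbours
    return {item} | kg.get(item, set())

def jaccard(a, b):
    # Set-overlap similarity between two feature sets
    return len(a & b) / len(a | b)

# Without the KG the two items share no features; with enrichment they
# overlap through a common entity, giving a nonzero similarity.
sim = jaccard(enrich("Inception", toy_kg), enrich("The Prestige", toy_kg))
```

The shared KG entity also serves as an explanation for the recommendation ("both directed by Christopher Nolan"), which is one sense in which background knowledge can improve explainability as well as accuracy.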