Integer Echo State Networks: Hyperdimensional Reservoir Computing
We propose an approximation of Echo State Networks (ESN) that can be
efficiently implemented on digital hardware based on the mathematics of
hyperdimensional computing. The reservoir of the proposed Integer Echo State
Network (intESN) is a vector containing only n-bit integers (where n < 8 is
normally sufficient for satisfactory performance). The recurrent matrix
multiplication is replaced with an efficient cyclic shift operation. The intESN
architecture is verified on typical reservoir-computing tasks: memorizing a
sequence of inputs, classifying time series, and learning dynamic processes.
Such an architecture results in dramatic improvements in memory footprint and
computational efficiency, with minimal performance loss.

Comment: 10 pages, 10 figures, 1 table
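To make the update rule concrete, here is a minimal sketch under assumptions the abstract does not fix: the clipping threshold kappa and the random bipolar encoding of inputs are illustrative choices, not the authors' exact scheme; only the cyclic shift and the small-integer state come from the abstract.

    import numpy as np

    def intesn_update(state, input_vec, kappa=7):
        # one reservoir update: a cyclic shift replaces the recurrent
        # matrix multiply, the encoded input is added, and clipping keeps
        # every entry an n-bit integer (|x| <= 7 fits a signed 4-bit range)
        shifted = np.roll(state, 1)
        return np.clip(shifted + input_vec, -kappa, kappa).astype(np.int8)

    rng = np.random.default_rng(0)
    state = np.zeros(1000, dtype=np.int8)          # the integer reservoir
    for _ in range(20):
        token = rng.choice([-1, 1], size=1000)     # bipolar-encoded input
        state = intesn_update(state, token)

The cyclic shift plays the role of the recurrent weight matrix, so one update costs two O(N) passes over the state instead of an O(N^2) matrix multiply, which is where the claimed efficiency gains come from.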
Malware Classification Using LSTMs
Signature-based and anomaly-based detection have long been quintessential techniques used in malware detection. However, these techniques have become increasingly ineffective as malware becomes more complex. Researchers have therefore turned to deep learning to construct better-performing models. In this project, we create four different long short-term memory (LSTM) models and train each model to classify malware by family type. Our data consists of opcodes extracted from malware executables. We employ techniques used in natural language processing (NLP), such as word embeddings and bidirectional LSTMs (biLSTMs). We also use convolutional neural networks (CNNs). We found that our model consisting of word embedding, biLSTM, and CNN layers performed the best at classifying malware.
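As an illustration of the best-performing combination (word embedding, biLSTM, and CNN layers), here is a minimal Keras sketch; the vocabulary size, family count, layer order, and all hyperparameters are assumptions, since the abstract names only the ingredients.

    from tensorflow.keras import layers, models

    VOCAB, FAMILIES = 256, 20   # hypothetical opcode-vocabulary and family counts

    model = models.Sequential([
        layers.Embedding(VOCAB, 64),                  # word embedding of opcodes
        layers.Conv1D(128, 5, activation="relu"),     # CNN layer over embeddings
        layers.MaxPooling1D(4),
        layers.Bidirectional(layers.LSTM(64)),        # biLSTM over pooled features
        layers.Dense(FAMILIES, activation="softmax")  # family-type classifier
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

The model expects batches of integer-encoded opcode sequences, mirroring how the extracted opcodes are treated like words in an NLP pipeline.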
Evolving neural networks with genetic algorithms to study the String Landscape
We study possible applications of artificial neural networks to examine the
string landscape. Since the field of application is rather versatile, we
propose to dynamically evolve these networks via genetic algorithms. This means
that we start from basic building blocks and combine them such that the neural
network performs best for the application we are interested in. We study three
areas in which neural networks can be applied: to classify models according to
a fixed set of (physically) appealing features, to find a concrete realization
of a computation whose precise algorithm is known in principle but is very
tedious to implement, and to predict or approximate the outcome of an involved
mathematical computation that is too inefficient to apply directly, e.g. in
model scans within the string landscape. We present simple
examples that arise in string phenomenology for all three types of problems and
discuss how they can be addressed by evolving neural networks with genetic
algorithms.

Comment: 17 pages, 7 figures, references added, typos corrected, extended
introductory section
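A minimal sketch of the evolutionary loop described above: the genome here is just a list of layer widths standing in for the basic building blocks, and fitness() is a placeholder for training a network on one of the three tasks and scoring it. Both the encoding and the fitness are illustrative assumptions, not the paper's construction.

    import random

    def fitness(genome):
        # placeholder: in the paper this would be the evolved network's
        # performance on the task; here we just reward a target total size
        return -abs(sum(genome) - 100)

    def mutate(genome):
        g = genome[:]
        i = random.randrange(len(g))
        g[i] = max(1, g[i] + random.choice([-8, 8]))
        return g

    def crossover(a, b):
        cut = random.randrange(1, len(a))            # one-point crossover
        return a[:cut] + b[cut:]

    population = [[random.randint(1, 64) for _ in range(3)] for _ in range(20)]
    for generation in range(50):
        population.sort(key=fitness, reverse=True)
        parents = population[:10]                    # keep the fittest half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(10)]
        population = parents + children
    print(max(population, key=fitness))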
Convolutional Neural Networks over Tree Structures for Programming Language Processing
Programming language processing (similar to natural language processing) is a
hot research topic in the field of software engineering; it has also aroused
growing interest in the artificial intelligence community. However, unlike a
natural language sentence, a program contains rich, explicit, and
complicated structural information. Hence, traditional NLP models may be
inappropriate for programs. In this paper, we propose a novel tree-based
convolutional neural network (TBCNN) for programming language processing, in
which a convolution kernel is designed over programs' abstract syntax trees to
capture structural information. TBCNN is a generic architecture for programming
language processing; our experiments show its effectiveness in two different
program analysis tasks: classifying programs according to functionality, and
detecting code snippets of certain patterns. TBCNN outperforms baseline
methods, including several neural models for NLP.

Comment: Accepted at AAAI-16
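The core operation, a convolution window over an AST node and its direct children, can be sketched as follows. This assumes node vectors are already given, and it replaces TBCNN's position-dependent ("continuous binary tree") kernel weights with a single averaged child matrix for brevity, so it is an illustration of the idea rather than the paper's exact kernel.

    import numpy as np

    class Node:
        def __init__(self, vec, children=()):
            self.vec, self.children = vec, list(children)

    def tree_conv(node, Wp, Wc, b):
        # one convolution window: the node itself plus its direct children,
        # with separate parent/child weight matrices and a tanh nonlinearity
        y = Wp @ node.vec + b
        if node.children:
            y += sum(Wc @ c.vec for c in node.children) / len(node.children)
        return np.tanh(y)

    rng = np.random.default_rng(0)
    d = 8                                            # toy node-feature size
    Wp, Wc, b = rng.normal(size=(d, d)), rng.normal(size=(d, d)), np.zeros(d)
    root = Node(rng.normal(size=d),
                [Node(rng.normal(size=d)), Node(rng.normal(size=d))])
    print(tree_conv(root, Wp, Wc, b))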