    Using Regular Languages to Explore the Representational Capacity of Recurrent Neural Architectures

    The presence of Long Distance Dependencies (LDDs) in sequential data poses significant challenges for computational models. Various recurrent neural architectures have been designed to mitigate this issue. To test these state-of-the-art architectures, there is a growing need for rich benchmarking datasets. However, one drawback of existing datasets is the lack of experimental control over the presence and/or degree of LDDs, which limits the analysis of model performance in relation to the specific challenge posed by LDDs. One way to address this is to use synthetic data with the properties of subregular languages. The degree of LDDs within the generated data can be controlled through the k parameter, the length of the generated strings, and the choice of forbidden strings. In this paper, we explore the capacity of different RNN extensions to model LDDs by evaluating these models on a sequence of synthesized SPk datasets, where each subsequent dataset exhibits longer LDDs. Even though the SPk languages are simple, the presence of LDDs has a significant impact on the performance of recurrent neural architectures, making them prime candidates for benchmarking tasks.
    Comment: International Conference on Artificial Neural Networks (ICANN) 201
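
    As a rough illustration of the data-generation idea: SPk commonly denotes the strictly piecewise languages of the subregular hierarchy, which are defined by forbidden subsequences of length k, so membership can be tested and strings sampled in a few lines. The alphabet, forbidden set, and rejection-sampling scheme below are illustrative choices, not the paper's actual generator.

```python
import random

def contains_subsequence(s, sub):
    """True if `sub` occurs in `s` as a (not necessarily contiguous)
    subsequence -- the membership test defining an SPk language."""
    it = iter(s)
    return all(ch in it for ch in sub)

def sample_spk(alphabet, forbidden, length, positive, rng):
    """Rejection-sample a string that avoids (positive=True) or contains
    (positive=False) a forbidden subsequence. A toy sampler: a real
    benchmark generator would also control class balance and statistics."""
    while True:
        s = "".join(rng.choice(alphabet) for _ in range(length))
        if any(contains_subsequence(s, f) for f in forbidden) != positive:
            return s

rng = random.Random(0)
forbidden = ["ab"]   # one forbidden subsequence of length k = 2;
                     # larger k and longer strings stretch the dependency
pos = [sample_spk("abcd", forbidden, 20, True, rng) for _ in range(2)]
neg = [sample_spk("abcd", forbidden, 20, False, rng) for _ in range(2)]
print("in SP2 (no 'a' ever followed later by 'b'):", pos)
print("out of SP2:", neg)
```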

    On the relevance of the neurobiological analogue of the finite-state architecture

    We present two simple arguments for the potential relevance of a neurobiological analogue of the finite-state architecture. The first assumes the classical cognitive framework, is well known, and is based on the assumption that the brain is finite with respect to its memory organization. The second is formulated within a general dynamical systems framework and is based on the assumption that the brain sustains some level of noise and/or does not utilize infinite-precision processing. We briefly review the classical cognitive framework based on Church-Turing computability and non-classical approaches based on analog processing in dynamical systems. We conclude that the dynamical neurobiological analogue of the finite-state architecture appears to be relevant, at least at an implementational level, for cognitive brain systems.
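
    As a toy illustration of the first argument (not drawn from the paper itself): a finite-state machine processes arbitrarily long input with strictly bounded memory. The parity recognizer below tracks a single state variable, however long the stream.

```python
# A minimal example of bounded-memory sequential processing: accept
# binary strings containing an even number of 1s. The entire "memory"
# is one state drawn from the finite set {"even", "odd"}.

def even_ones(stream):
    state = "even"
    for symbol in stream:
        if symbol == "1":
            state = "odd" if state == "even" else "even"
    return state == "even"

print(even_ones("1011"))   # False: three 1s
print(even_ones("1001"))   # True: two 1s
```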

    Geometric representations for minimalist grammars

    We reformulate minimalist grammars as partial functions on term algebras for strings and trees. Using filler/role bindings and tensor product representations, we construct homomorphisms for these data structures into geometric vector spaces. We prove that the structure-building functions as well as simple processors for minimalist languages can be realized by piecewise linear operators in representation space. We also propose harmony, i.e. the distance of an intermediate processing step from the final well-formed state in representation space, as a measure of processing complexity. Finally, we illustrate our findings by means of two particular arithmetic and fractal representations.
    Comment: 43 pages, 4 figures
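
    The filler/role binding machinery the abstract refers to is Smolensky-style tensor product representation. The sketch below shows only the binding and unbinding step on a hypothetical three-word string; the dimensions, vectors, and orthonormal role basis are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes and vocabulary -- the paper maps minimalist-grammar
# terms into specific spaces; this only demonstrates the binding mechanism.
d_filler, n_roles = 8, 3
fillers = {w: rng.standard_normal(d_filler) for w in ["the", "cat", "sleeps"]}
# Orthonormal role vectors (rows of an orthogonal matrix), one per position.
roles, _ = np.linalg.qr(rng.standard_normal((n_roles, n_roles)))

# Bind each filler to its positional role by an outer product and
# superimpose the bindings: the whole string becomes one order-2 tensor.
string = ["the", "cat", "sleeps"]
T = sum(np.outer(fillers[w], roles[i]) for i, w in enumerate(string))

# Unbinding: contracting the tensor with a role vector recovers that
# position's filler exactly, because the role vectors are orthonormal.
print(np.allclose(T @ roles[1], fillers["cat"]))   # True
```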

    Grammars and cellular automata for evolving neural networks architectures

    IEEE International Conference on Systems, Man, and Cybernetics, Nashville, TN, 8-11 October 2000.
    The class of feedforward neural networks trained with back-propagation admits a large variety of specific architectures applicable to approximation and pattern tasks. Unfortunately, architecture design is still a job for a human expert. In recent years there has been growing interest in automatic methods for determining the architecture of a feedforward neural network, most of them based on the evolutionary computation paradigm. Within this approach, several perspectives can be considered. At one extreme, every connection and node of the architecture can be specified in the chromosome representation using binary bits; this kind of representation scheme is called a direct encoding scheme. To reduce the length of the genotype and the search space, and to make the problem more scalable, indirect encoding schemes have been introduced. An indirect scheme under a constructive algorithm, by contrast, starts with a minimal architecture, and new layers, neurons, and connections are added step by step via some set of rules. The rules and/or some initial conditions are codified into the chromosome of a genetic algorithm. In this work, two indirect constructive encoding schemes, based on grammars and cellular automata respectively, are proposed to find the optimal architecture of a feedforward neural network.
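
    A minimal sketch of the two encoding styles the abstract contrasts; the layer sizes, grammar rules, and chromosome formats are invented for illustration and are not taken from the paper. Direct encoding spends one gene per potential connection, while a grammar-based indirect encoding keeps the genotype short and grows the architecture by rewriting.

```python
import numpy as np

# Direct encoding: one bit per potential connection of a fixed 3-4-2
# feedforward topology, so the chromosome length equals the number of
# weights (3*4 + 4*2 = 20 bits).
def decode_direct(chromosome, shape=(3, 4, 2)):
    masks, pos = [], 0
    for fan_in, fan_out in zip(shape, shape[1:]):
        n = fan_in * fan_out
        masks.append(np.array(chromosome[pos:pos + n]).reshape(fan_in, fan_out))
        pos += n
    return masks            # 0/1 connectivity masks, one per layer

# Indirect, constructive encoding: the chromosome is a short sequence of
# grammar-rule choices; rewriting a start symbol grows the architecture,
# so the genotype stays small however large the network gets.
RULES = {                    # hypothetical rewrite grammar
    "S": [["H", "H"], ["H"]],          # S -> one or two hidden blocks
    "H": [["h4"], ["h8"], ["H", "H"]]  # H -> 4 units, 8 units, or split
}

def decode_grammar(choices, symbol="S", depth=0):
    """Expand `symbol`, consuming the chromosome as a list of rule picks."""
    if symbol.startswith("h"):                 # terminal: a layer size
        return [int(symbol[1:])]
    if depth > 4 or not choices:               # safety cutoff
        return [4]
    rule = RULES[symbol][choices.pop(0) % len(RULES[symbol])]
    return [u for s in rule for u in decode_grammar(choices, s, depth + 1)]

print(decode_direct([1] * 20)[0].shape)        # (3, 4): fully connected layer
print(decode_grammar([0, 2, 0, 1]))            # hidden layer sizes, e.g. [4, 8, 4]
```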