
    Computational capabilities of recurrent neural networks based on their attractor dynamics

    We consider a model of so-called hybrid recurrent neural networks composed of Boolean input and output cells together with sigmoid internal cells. When subjected to an infinite binary input stream, the Boolean output cells necessarily exhibit some attractor dynamics, assumed to be of one of two possible kinds, either meaningful or spurious, which underlies the emergence of spatiotemporal patterns of output discharges. In this context, we show that rational-weighted neural networks are computationally equivalent to deterministic Muller Turing machines, whereas all other models of real-weighted or evolving neural networks are equivalent to each other and strictly more powerful than deterministic Muller Turing machines. In this precise sense, the analog and evolving neural networks are super-Turing. We further provide a precise mathematical characterization of the expressive powers of all these neural models. These results generalize, to the present computational context, those previously obtained for classical as well as interactive computations. They support the idea that recurrent neural networks represent a natural model of computation beyond the Turing limits.
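
    To make the hybrid architecture concrete, here is a minimal sketch of one synchronous update step of such a network: Boolean input cells feed sigmoid internal cells, and designated output cells are read out as Boolean values. It assumes the classical saturated-linear sigmoid and a simple 0.5 threshold readout; the names (step, boolean_output, A, B, c) and all parameter values are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def sigma(v):
        """Saturated-linear sigmoid: clamp each coordinate to [0, 1]
        (an assumption; the paper only says 'sigmoid internal cells')."""
        return np.clip(v, 0.0, 1.0)

    def step(x, u, A, B, c):
        """One synchronous update of the internal state.

        x : current internal state, shape (n,), values in [0, 1]
        u : current Boolean input, shape (m,), values in {0, 1}
        A : internal weight matrix, shape (n, n) (rational entries)
        B : input weight matrix, shape (n, m)
        c : bias vector, shape (n,)
        """
        return sigma(A @ x + B @ u + c)

    def boolean_output(x, out_idx, threshold=0.5):
        """Read the designated output cells as Boolean values."""
        return (x[out_idx] >= threshold).astype(int)

    # Drive the network with a binary input stream (truncated here);
    # the resulting sequence of Boolean output patterns is the attractor
    # dynamics that the paper classifies as meaningful or spurious.
    rng = np.random.default_rng(0)
    n, m = 5, 2
    A = rng.uniform(-1, 1, (n, n))
    B = rng.uniform(-1, 1, (n, m))
    c = rng.uniform(-1, 1, n)
    x = np.zeros(n)
    for t in range(20):
        u = rng.integers(0, 2, m)
        x = step(x, u, A, B, c)
        print(t, boolean_output(x, out_idx=[0, 1]))
    ```

    The distinction drawn in the abstract then amounts to what the entries of A, B, c range over: rational weights give Turing-level power, while real or time-varying ("evolving") weights yield the strictly stronger, super-Turing models.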