228 research outputs found

    Deciding Conditional Termination

    We address the problem of conditional termination, that is, of characterizing the set of initial configurations from which a given program always terminates. We first define the dual set, of initial configurations from which a non-terminating execution exists, as the greatest fixpoint of the function that maps a set of states to its pre-image under the transition relation. This definition allows the weakest non-termination precondition to be computed whenever at least one of the following holds: (i) the transition relation is deterministic, (ii) the descending Kleene sequence overapproximating the greatest fixpoint converges in finitely many steps, or (iii) the transition relation is well founded. We show that this is the case for two classes of relations, namely octagonal and finite monoid affine relations. Moreover, since the closed forms of these relations can be defined in Presburger arithmetic, we obtain the decidability of the termination problem for such loops.
    Comment: 61 pages, 6 figures, 2 tables
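    To make the fixpoint characterization above concrete, here is a minimal Python sketch (not the paper's procedure, which works symbolically over Presburger-definable relations) that computes the set of initial states admitting a non-terminating execution for a small, explicitly enumerated transition relation, by descending Kleene iteration of the pre-image operator. The state space, relation, and helper names are illustrative.

```python
# Minimal sketch (not the paper's symbolic procedure): compute the set of
# states from which a non-terminating execution exists, for a small explicitly
# enumerated transition relation, as the greatest fixpoint of the pre-image
# operator via descending Kleene iteration.

def pre_image(states, relation, target):
    """States with at least one successor inside `target`."""
    return {s for s in states if any((s, t) in relation for t in target)}

def non_termination_set(states, relation):
    """Descending Kleene sequence S, pre(S), pre(pre(S)), ...;
    it stabilizes here because the state space is finite."""
    current = set(states)
    while True:
        nxt = pre_image(states, relation, current)
        if nxt == current:
            return current            # greatest fixpoint reached
        current = nxt

# Toy transition relation: a self-loop on state 0 and a finite chain 1 -> 2 -> 3.
states = {0, 1, 2, 3}
relation = {(0, 0), (1, 2), (2, 3)}
print(non_termination_set(states, relation))   # {0}: only state 0 starts an infinite run
```

    On a finite state space the descending sequence always stabilizes; the paper's contribution is identifying infinite-state classes (octagonal and finite monoid affine relations) for which the same fixpoint remains computable.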

    Computability of Julia sets

    In this paper we settle most of the open questions on the algorithmic computability of Julia sets. In particular, we present an algorithm for constructing quadratics whose Julia sets are uncomputable. We also show that the filled Julia set of a polynomial is always computable.
    Comment: Revised. To appear in Moscow Math. Journal
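    For a rough sense of what approximating a filled Julia set looks like in practice, the sketch below uses the standard escape-time heuristic for p(z) = z^2 + c. This is illustrative only and is not the paper's algorithm; rigorous computability results require certified bounds rather than a fixed iteration cutoff, which is precisely where the uncomputability phenomena arise. The grid size, iteration limit, and choice of c are arbitrary.

```python
# Standard escape-time heuristic (illustrative only, not the paper's algorithm):
# approximate the filled Julia set of p(z) = z^2 + c by iterating each grid
# point and keeping those whose orbit has not escaped after max_iter steps.

def stays_bounded(z, c, max_iter=200, escape_radius=2.0):
    """Heuristic membership test for the filled Julia set of z^2 + c."""
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > escape_radius:
            return False
    return True

def render(c, n=41, extent=1.5):
    """Coarse ASCII rendering of the approximation on an n x n grid."""
    rows = []
    for i in range(n):
        y = extent - 2.0 * extent * i / (n - 1)
        rows.append(''.join(
            '#' if stays_bounded(complex(-extent + 2.0 * extent * j / (n - 1), y), c) else '.'
            for j in range(n)
        ))
    return '\n'.join(rows)

# The parameter c below is an arbitrary illustrative choice.
print(render(c=complex(-0.75, 0.1)))
```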

    Universal discrete-time reservoir computers with stochastic inputs and linear readouts using non-homogeneous state-affine systems

    A new class of non-homogeneous state-affine systems is introduced for use in reservoir computing. Sufficient conditions are identified that guarantee, first, that the associated reservoir computers with linear readouts are causal, time-invariant, and satisfy the fading memory property and, second, that a subset of this class is universal in the category of fading memory filters with stochastic almost surely uniformly bounded inputs. This means that any discrete-time filter that satisfies the fading memory property with random inputs of that type can be uniformly approximated by elements of the non-homogeneous state-affine family.
    Comment: 41 pages
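    As a toy illustration of the kind of system the abstract describes, the following Python sketch runs a non-homogeneous state-affine reservoir x_t = A(u_t) x_{t-1} + b(u_t), with A and b affine in a scalar input, and fits a linear readout by least squares on a simple fading-memory target. The matrices, scaling, target task, and training details are illustrative assumptions, not the construction from the paper.

```python
import numpy as np

# Minimal sketch (illustrative assumptions, not the paper's construction) of a
# non-homogeneous state-affine system driven by a scalar input u_t:
#     x_t = A(u_t) x_{t-1} + b(u_t),    y_t = W x_t
# with A(u) and b(u) affine (degree-1 polynomials) in u, and a linear readout W
# fitted by ordinary least squares.

rng = np.random.default_rng(0)
n = 50                                    # reservoir (state) dimension
A0 = rng.normal(scale=0.2 / np.sqrt(n), size=(n, n))
A1 = rng.normal(scale=0.2 / np.sqrt(n), size=(n, n))
b0 = rng.normal(size=n)
b1 = rng.normal(size=n)

def run_reservoir(u):
    """Iterate the state-affine recursion over the input sequence u."""
    x = np.zeros(n)
    states = []
    for ut in u:
        A = A0 + ut * A1              # A(u): affine in the input
        b = b0 + ut * b1              # b(u): affine in the input
        x = A @ x + b
        states.append(x.copy())
    return np.array(states)

# Toy fading-memory target: y_t = u_{t-1} * u_{t-2}, with i.i.d. uniform inputs.
T = 2000
u = rng.uniform(-1.0, 1.0, size=T)
target = np.concatenate(([0.0, 0.0], u[:-2] * u[1:-1]))

X = run_reservoir(u)
W, *_ = np.linalg.lstsq(X[100:], target[100:], rcond=None)  # linear readout
prediction = X @ W
print("mean squared error:", np.mean((prediction[100:] - target[100:]) ** 2))
```

    In this toy setting the small scaling of A0 and A1 keeps the operator norm of A(u) uniformly below one over the input range, which is a standard sufficient condition for the echo-state and fading-memory behaviour the abstract refers to.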

    A modular architecture for transparent computation in recurrent neural networks

    Journal article published in Neural Networks (Elsevier). Article link: http://dx.doi.org/10.1016/j.neunet.2016.09.001. © 2016 Elsevier Ltd. All rights reserved.

    Dynamical Systems Theory for Transparent Symbolic Computation in Neuronal Networks

    In this thesis, we explore the interface between symbolic and dynamical-system computation, with particular regard to dynamical system models of neuronal networks. In doing so, we adhere to a definition of computation as the physical realization of a formal system, where we say that a dynamical system performs a computation if a correspondence can be found between its dynamics on a vector space and the formal system’s dynamics on a symbolic space. Guided by this definition, we characterize computation in a range of neuronal network models. We first present a constructive mapping between a range of formal systems and Recurrent Neural Networks (RNNs), through the introduction of a Versatile Shift and a modular network architecture supporting its real-time simulation. We then move on to more detailed models of neural dynamics, characterizing the computation performed by networks of delay-pulse-coupled oscillators supporting the emergence of heteroclinic dynamics. We show that a correspondence can be found between these networks and Finite-State Transducers, and use the derived abstraction to investigate how noise affects computation in this class of systems, unveiling a surprising facilitatory effect on information transmission. Finally, we present a new dynamical framework for computation in neuronal networks based on the slow-fast dynamics paradigm, and discuss the consequences of our results for future work, in particular for the fields of interactive computation and Artificial Intelligence.
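    The finite-state transducer abstraction mentioned above can be pictured with the minimal Python sketch below. It is a generic FST, not the oscillator networks or the specific mapping constructed in the thesis, and the toy parity machine is an illustrative example.

```python
# Minimal sketch of a finite-state transducer (generic illustration, not the
# thesis's construction): the machine reads an input symbol, emits an output
# symbol, and moves to a new state, so a network whose dynamics realize such
# transitions can be abstracted as an FST.

class FST:
    def __init__(self, transitions, start):
        # transitions: (state, input_symbol) -> (next_state, output_symbol)
        self.transitions = transitions
        self.start = start

    def run(self, inputs):
        state, outputs = self.start, []
        for symbol in inputs:
            state, out = self.transitions[(state, symbol)]
            outputs.append(out)
        return outputs

# Toy example: a transducer that outputs the running parity of 1s seen so far.
parity = FST(
    transitions={
        ('even', 0): ('even', 'even'),
        ('even', 1): ('odd', 'odd'),
        ('odd', 0): ('odd', 'odd'),
        ('odd', 1): ('even', 'even'),
    },
    start='even',
)
print(parity.run([1, 0, 1, 1]))  # ['odd', 'odd', 'even', 'odd']
```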