
    Recurrent kernel machines: computing with infinite echo state networks

    Echo state networks (ESNs) are large, random recurrent neural networks with a single trained linear readout layer. Despite the untrained nature of the recurrent weights, they are capable of performing universal computations on temporal input data, which makes them interesting for both theoretical research and practical applications. The key to their success lies in the fact that the network computes a broad set of nonlinear, spatiotemporal mappings of the input data, on which linear regression or classification can easily be performed. One can therefore view the reservoir as a spatiotemporal kernel whose mapping to a high-dimensional space is computed explicitly. In this letter, we build on this idea and extend the concept of ESNs to infinite-sized recurrent neural networks, which can be viewed as recursive kernels and used, in turn, to construct recursive support vector machines. We present the theoretical framework, provide several practical examples of recursive kernels, and apply them to typical temporal tasks.
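    To make the reservoir-as-explicit-kernel picture concrete, here is a minimal echo state network in Python: a random, untrained recurrent network whose only trained component is the linear readout. This is an illustrative sketch, not the authors' code; the reservoir size, spectral-radius rescaling, ridge parameter, and the delayed-recall toy task are all assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (assumptions, not from the paper).
n_in, n_res, ridge = 1, 200, 1e-6

# Random, untrained weights; W is rescaled to spectral radius 0.9,
# a common heuristic for obtaining the echo state property.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with inputs u of shape (T, n_in); return states (T, n_res)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        # The nonlinear spatiotemporal mapping of the input data.
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x.copy())
    return np.array(states)

# Toy task: reproduce the input delayed by 5 steps.
T, delay, washout = 1000, 5, 50
u = rng.uniform(-1.0, 1.0, (T, n_in))
target = np.roll(u[:, 0], delay)

X = run_reservoir(u)[washout:]   # discard initial transient (washout)
y = target[washout:]

# The only trained part: a linear readout, fit by ridge regression.
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
print("train MSE:", np.mean((X @ W_out - y) ** 2))
```

    Letting the reservoir size grow without bound, and working with inner products of reservoir states rather than the states themselves, is exactly the step from this explicit feature map to the recursive kernels developed in the letter.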

    Non-equilibrium stochastic dynamics in continuum: The free case

    We study the problem of identifying a proper state space for the stochastic dynamics of free particles in continuum, with their possible birth and death. In this dynamics, the motion of each separate particle is described by a fixed Markov process $M$ on a Riemannian manifold $X$. The main problem arising here is a possible collapse of the system, in the sense that, although the initial configuration of particles is locally finite, there could exist a compact set in $X$ such that, with probability one, infinitely many particles arrive at this set at some time $t>0$. We assume that $X$ has infinite volume and, for each $\alpha\ge 1$, we consider the set $\Theta_\alpha$ of all infinite configurations in $X$ for which the number of particles in a compact set is bounded by a constant times the $\alpha$-th power of the volume of the set. We find quite general conditions on the process $M$ which guarantee that the corresponding infinite-particle process can start at each configuration from $\Theta_\alpha$, will never leave $\Theta_\alpha$, and has càdlàg (or even continuous) sample paths in the vague topology. We consider the following examples of applications of our results: Brownian motion on the configuration space, free Glauber dynamics on the configuration space (or a birth-and-death process in $X$), and free Kawasaki dynamics on the configuration space. We also show that if $X=\mathbb{R}^d$, then for a wide class of starting distributions, the (non-equilibrium) free Glauber dynamics is a scaling limit of (non-equilibrium) free Kawasaki dynamics.
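    Of the three examples, the free Glauber dynamics is the easiest to picture: particles are born in $X$ according to a fixed intensity, and each existing particle dies independently at rate 1, with no interaction. Below is a small Gillespie-style simulation of this birth-and-death process, truncated to a bounded window of $\mathbb{R}^d$; the window, activity $z$, and time horizon are illustrative assumptions, and the restriction to a finite box is a simplification the paper does not make (it works on all of $X$).

```python
import numpy as np

rng = np.random.default_rng(1)

# Free (non-interacting) Glauber dynamics, truncated to a window [0, L]^d
# of X = R^d. z is the birth intensity per unit volume. All parameter
# values here are illustrative assumptions.
d, L, z, T_max = 2, 10.0, 0.5, 20.0
volume = L ** d

# Start from a Poisson configuration in the window.
points = rng.uniform(0.0, L, size=(rng.poisson(z * volume), d))

t = 0.0
while t < T_max:
    n = len(points)
    birth_rate = z * volume        # new particles appear at constant rate
    death_rate = float(n)          # each particle dies independently at rate 1
    total_rate = birth_rate + death_rate
    t += rng.exponential(1.0 / total_rate)  # waiting time to the next event
    if rng.uniform() < birth_rate / total_rate:
        # Birth: place a new particle uniformly in the window.
        points = np.vstack([points, rng.uniform(0.0, L, size=(1, d))])
    else:
        # Death: remove a uniformly chosen particle.
        points = np.delete(points, rng.integers(n), axis=0)

print(f"particles at t={T_max}: {len(points)} (Poisson mean z*|window| = {z * volume})")
```

    The Poisson point process with intensity $z\,dx$ is invariant for this dynamics, which is why Poisson-like growth bounds such as those defining $\Theta_\alpha$ are natural candidates for the state space.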

    Conjugate Projective Limits

    We characterize conjugate nonparametric Bayesian models as projective limits of conjugate, finite-dimensional Bayesian models. In particular, we identify a large class of nonparametric models representable as infinite-dimensional analogues of exponential family distributions and their canonical conjugate priors. This class contains most models studied in the literature, including Dirichlet processes and Gaussian process regression models. To derive these results, we introduce a representation of infinite-dimensional Bayesian models by projective limits of regular conditional probabilities. We show under which conditions the nonparametric model itself, its sufficient statistics, and -- if they exist -- conjugate updates of the posterior are projective limits of their respective finite-dimensional counterparts. We illustrate our results both by application to existing nonparametric models and by the construction of a model on infinite permutations.
    Comment: 49 pages; improved version: revised proof of Theorem 3 (results unchanged), discussion added, exposition revised.
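    A minimal sketch of the projective-limit idea in the Dirichlet process case: restricted to any finite measurable partition, $DP(\alpha, G_0)$ marginalizes to a Dirichlet distribution, and the DP's conjugate posterior update reduces to the familiar Dirichlet-multinomial update on that partition. The values of $\alpha$, $G_0$, the partition, and the counts below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Finite-dimensional marginal of a Dirichlet process DP(alpha, G0): over a
# measurable partition (B_1, ..., B_k), the random measure's cell masses are
# Dirichlet(alpha*G0(B_1), ..., alpha*G0(B_k)) distributed. The values of
# alpha, G0, the partition, and the counts are illustrative assumptions.
alpha = 2.0
G0_cells = np.array([0.5, 0.3, 0.2])   # base-measure mass of each partition cell
prior = alpha * G0_cells               # Dirichlet parameters of the marginal

# Data enter only through the number of observations landing in each cell.
counts = np.array([4, 1, 0])

# Conjugate update: the posterior marginal is again Dirichlet; the counts
# are simply added to the prior parameters.
posterior = prior + counts

print("prior mean:    ", prior / prior.sum())
print("posterior mean:", posterior / posterior.sum())
```

    The content of the projective-limit construction is that these finite-dimensional updates hold simultaneously and consistently across all finite partitions, and together they characterize the conjugate update of the infinite-dimensional model itself.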