
    Echo State Networks with Self-Normalizing Activations on the Hyper-Sphere

    Among the various architectures of Recurrent Neural Networks, Echo State Networks (ESNs) emerged due to their simple and inexpensive training procedure. These networks are known to be sensitive to the setting of hyper-parameters, which critically affect their behaviour. Results show that their performance is usually maximized in a narrow region of hyper-parameter space called the edge of chaos. Finding such a region requires searching the hyper-parameter space in a sensible way: hyper-parameter configurations marginally outside it might yield networks exhibiting fully developed chaos, hence producing unreliable computations. The performance gain from optimizing hyper-parameters can be studied through the memory–nonlinearity trade-off, i.e., the fact that increasing the nonlinear behaviour of the network degrades its ability to remember past inputs, and vice versa. In this paper, we propose a model of ESNs that eliminates critical dependence on hyper-parameters, resulting in networks that provably cannot enter a chaotic regime and, at the same time, exhibit nonlinear behaviour in phase space characterised by a large memory of past inputs, comparable to that of linear networks. Our contribution is supported by experiments corroborating our theoretical findings, showing that the proposed model displays dynamics rich enough to approximate many common nonlinear systems used for benchmarking.
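    As a rough illustration (not the paper's exact construction), the NumPy sketch below implements an echo state network whose state is renormalized onto the unit hyper-sphere after every update; the reservoir size, the spectral radius of 0.9, the input scaling, and the tanh nonlinearity are all assumptions made for this example.

    import numpy as np

    # Minimal ESN sketch with self-normalization: the state is projected back
    # onto the unit hyper-sphere after every update, keeping ||x|| == 1
    # regardless of the scaling hyper-parameters.
    rng = np.random.default_rng(0)
    n_reservoir, n_input = 200, 1              # sizes assumed for illustration

    W = rng.standard_normal((n_reservoir, n_reservoir))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # assumed spectral radius 0.9
    W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_input))

    def step(x, u):
        """One reservoir update followed by projection onto the hyper-sphere."""
        x_new = np.tanh(W @ x + W_in @ u)
        return x_new / np.linalg.norm(x_new)

    x = rng.standard_normal(n_reservoir)
    x /= np.linalg.norm(x)
    for _ in range(100):                       # drive with random input
        x = step(x, rng.standard_normal(n_input))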

    Pursuit-evasion predator-prey waves in two spatial dimensions

    We consider a spatially distributed population dynamics model with excitable predator-prey kinetics, where species propagate in space due to their taxis with respect to each other's gradient, in addition to, or instead of, their diffusive spread. Earlier, we described new phenomena in this model in one spatial dimension that are not found in analogous systems without taxis: reflecting and self-splitting waves. Here we identify new phenomena in two spatial dimensions: unusual patterns of meander of spirals, partial reflection of waves, swelling wavetips, attachment of free wave ends to wave backs, and, as a result, a novel mechanism of self-supporting complicated spatio-temporal activity, unknown in reaction-diffusion population models.
    Comment: 15 pages, 15 figures, submitted to Chaos
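    To make the taxis terms concrete, one common way to write such a pursuit-evasion system is sketched below; the kinetics f and g, the diffusivities, and the taxis coefficients are placeholders, and the exact form used by the authors may differ.

    % Excitable predator-prey kinetics f, g with cross-taxis terms:
    % prey u drift down the predator gradient (evasion), predators v
    % drift up the prey gradient (pursuit); all coefficients nonnegative.
    \begin{aligned}
    \partial_t u &= f(u,v) + D_u \nabla^2 u + h_{+}\, \nabla \cdot (u \nabla v), \\
    \partial_t v &= g(u,v) + D_v \nabla^2 v - h_{-}\, \nabla \cdot (v \nabla u).
    \end{aligned}

    Setting D_u = D_v = 0 gives the "instead of" case, in which the species spread purely by taxis rather than diffusion.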

    Mechanisms of urban change: Regeneration companies or development corporations?

    This article is an early assessment of the role and performance of Urban Regeneration Companies (URCs), benchmarked against the Urban Development Corporation (UDC) model. It identifies weaknesses and vulnerabilities of URCs in relation to control over land.

    Dynamic Adaptive Computation: Tuning network states to task requirements

    Neural circuits are able to perform computations under very diverse conditions and requirements. The required computations impose clear constraints on their fine-tuning: a rapid and maximally informative response to stimuli in general requires decorrelated baseline neural activity. Such network dynamics is known as asynchronous-irregular. In contrast, spatio-temporal integration of information requires maintenance and transfer of stimulus information over extended time periods. This can be realized at criticality, a phase transition where correlations, sensitivity and integration time diverge. Being able to flexibly switch between, or even combine, the above properties in a task-dependent manner would present a clear functional advantage. We propose that cortex operates in a "reverberating regime" because it is particularly favorable for ready adaptation of computational properties to context and task. This reverberating regime enables cortical networks to interpolate between the asynchronous-irregular and the critical state through small changes in effective synaptic strength or excitation-inhibition ratio. These changes directly adapt computational properties, including sensitivity, amplification, integration time and correlation length within the local network. We review recent converging evidence that cortex in vivo operates in the reverberating regime, and that various cortical areas have adapted their integration times to processing requirements. In addition, we propose that neuromodulation enables a fine-tuning of the network, so that local circuits can either decorrelate or integrate, and quench or maintain their input depending on task. We argue that this task-dependent tuning, which we call "dynamic adaptive computation", presents a central organization principle of cortical networks, and we discuss the first experimental evidence.
    Comment: 6 pages + references, 2 figures
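    A driven branching process gives a minimal quantitative picture of this interpolation; the update rule, parameter names, and values below are illustrative assumptions, not the authors' fitted model.

    import numpy as np

    # Sketch: activity evolves as A_{t+1} ~ Poisson(m * A_t + h), where m is
    # the effective synaptic strength. m -> 0 gives input-driven, decorrelated
    # (asynchronous-irregular-like) activity; m -> 1 approaches criticality,
    # and the intrinsic integration time grows as tau = -1 / ln(m) steps.
    rng = np.random.default_rng(1)

    def simulate(m, h=10.0, steps=10_000):
        """Driven branching process with branching parameter m, input rate h."""
        a, trace = h, np.empty(steps)
        for t in range(steps):
            a = rng.poisson(m * a + h)
            trace[t] = a
        return trace

    for m in (0.1, 0.9, 0.98):      # decorrelated, intermediate, reverberating
        tau = -1.0 / np.log(m)
        print(f"m={m:.2f}  mean activity={simulate(m).mean():7.1f}  tau={tau:5.1f} steps")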

    Quantum chaos of a mixed, open system of kicked cold atoms

    The quantum and classical dynamics of particles kicked by a Gaussian attractive potential are studied. Classically, it is an open, mixed system (the motion in some parts of phase space is chaotic, and in some parts it is regular). The fidelity (Loschmidt echo) is found to exhibit oscillations that can be determined from classical considerations but are sensitive to phase-space structures smaller than Planck's constant. Families of quasi-energies are determined from classical phase-space structures. Substantial differences between the classical and quantum dynamics are found for time-dependent scattering. It is argued that the system can be experimentally realized by cold atoms kicked by a Gaussian light beam.
    Comment: 19 pages, 21 figures (accepted for publication in Phys. Rev. E)
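    A classical stroboscopic map for this kind of kicked system can be sketched as follows; the kick strength, beam width, and kick period are assumed values, not those of the paper.

    import numpy as np

    # Free motion between kicks; each kick is the impulse from an attractive
    # Gaussian potential V(x) = -K * exp(-x**2 / (2 * sigma**2)), mimicking a
    # Gaussian light beam. The resulting phase space is mixed and open.
    K, sigma, T = 1.5, 1.0, 1.0     # kick strength, beam width, period (assumed)

    def kick_map(x, p, n_kicks=200):
        """Iterate the kicked map; returns the final (x, p)."""
        for _ in range(n_kicks):
            p -= (K * x / sigma**2) * np.exp(-x**2 / (2 * sigma**2))  # kick
            x += T * p                                                # drift
        return x, p

    # Orbits starting deep in the well can stay trapped (regular or chaotic),
    # while others escape ballistically, reflecting the open, mixed dynamics.
    for x0 in (0.1, 0.8, 2.5):
        print(x0, kick_map(x0, 0.0))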

    Sparsity in Reservoir Computing Neural Networks

    Reservoir Computing (RC) is a well-known strategy for designing Recurrent Neural Networks, notable for its strikingly efficient training. The crucial aspect of RC is to properly instantiate the hidden recurrent layer that serves as dynamical memory for the system. In this respect, the common recipe is to create a pool of randomly and sparsely connected recurrent neurons. While the role of sparsity in the design of RC systems has been debated in the literature, it is nowadays understood mainly as a way to enhance the efficiency of computation by exploiting sparse matrix operations. In this paper, we empirically investigate the role of sparsity in RC network design from the perspective of the richness of the developed temporal representations. We analyze sparsity both in the recurrent connections and in the connections from the input to the reservoir. Our results indicate that sparsity, in particular in the input-reservoir connections, plays a major role in developing internal temporal representations with a longer short-term memory of past inputs and a higher dimensionality.
    Comment: This paper is currently under review
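    A minimal SciPy sketch of the setting studied, with sparsity in both the recurrent and the input-reservoir connections; the sizes, densities, and spectral radius are assumptions for illustration, not the paper's configuration.

    import numpy as np
    from scipy import sparse
    from scipy.sparse.linalg import eigs

    rng = np.random.default_rng(0)
    n_res, n_in = 500, 10            # reservoir and input sizes (assumed)

    def sparse_uniform(rows, cols, density):
        """Random sparse matrix with uniform(-1, 1) nonzero entries."""
        return sparse.random(rows, cols, density=density, format="csr",
                             random_state=rng,
                             data_rvs=lambda n: rng.uniform(-1, 1, n))

    W = sparse_uniform(n_res, n_res, 0.05)     # ~5% recurrent connectivity
    W_in = sparse_uniform(n_res, n_in, 0.10)   # sparse input-reservoir links

    # Rescale W to a target spectral radius (0.9 assumed) for echo states.
    rho = abs(eigs(W, k=1, return_eigenvectors=False)[0])
    W = W * (0.9 / rho)

    x = np.zeros(n_res)
    u = rng.standard_normal(n_in)
    x = np.tanh(W @ x + W_in @ u)              # one sparse reservoir update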

    Echo: Flux, Spring 2017

    Student-produced magazine formerly published as Chicago Arts and Communication, changed to Echo magazine in 1997. Cover articles: Poetryscopes: no rhymes, just reasons; Surviving the stigma: life post-prison; Psych out: a new role for LSD?; Will I forget?: assessing my risk of Alzheimer's. Local insights: paths of birds; Chicago words; weather nerds. 120 pages.
    https://digitalcommons.colum.edu/echo/1037/thumbnail.jp

    Bubble, toil, and trouble

    When people call the dot-com boom a bubble, they imply that investors based their decisions on something other than a good estimate of the future value of the assets they were buying. But some economists say that is not likely, because episodes like the dot-com bust show future value is not always easy to predict, especially when the asset is a new technology. This Commentary shows how both explanations can describe a famous historical bubble that occurred after the introduction of a technology that was new at the beginning of the eighteenth century: a novel macroeconomic theory.
    Keywords: Speculation; Financial crises; Law, John