
    A Survey on Continuous Time Computations

    We provide an overview of theories of continuous time computation. These theories allow us to understand both the hardness of questions related to continuous time dynamical systems and the computational power of continuous time analog models. We survey the existing models, summarize results, and point to relevant references in the literature.
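
    As a toy illustration of the kind of continuous-time analog model such surveys cover (my own example, not taken from the paper), the sketch below emulates a two-integrator analog circuit in the spirit of the General Purpose Analog Computer; its trajectory computes sine and cosine as the solution of x' = -y, y' = x.

    # Hypothetical illustration: a continuous-time "analog" computation of
    # cos(t) and sin(t) by two coupled integrators, emulated with forward Euler.
    import numpy as np

    dt, t_max = 1e-4, 6.3
    x, y = 1.0, 0.0                    # initial condition: (cos 0, sin 0)
    ts = np.arange(0.0, t_max, dt)
    xs = np.empty_like(ts)
    for k in range(len(ts)):
        xs[k] = x
        x, y = x - dt * y, y + dt * x  # simultaneous Euler step of x' = -y, y' = x
    print("max |x(t) - cos(t)|:", np.max(np.abs(xs - np.cos(ts))))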

    Large Scale In Silico Screening on Grid Infrastructures

    Large-scale grid infrastructures for in silico drug discovery open opportunities of particular interest for neglected and emerging diseases. In 2005 and 2006, we were able to deploy large-scale in silico docking within the framework of the WISDOM initiative against Malaria and Avian Flu, requiring about 105 years of CPU time on the EGEE, Auvergrid and TWGrid infrastructures. These achievements demonstrated the relevance of large-scale grid infrastructures for virtual screening by molecular docking. They also allowed us to evaluate the performance of the grid infrastructures and to identify specific issues raised by large-scale deployment. Comment: 14 pages, 2 figures, 2 tables, The Third International Life Science Grid Workshop, LSGrid 2006, Yokohama, Japan, 13-14 October 2006, to appear in the proceedings.

    Information processing using a single dynamical node as complex system

    Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing.
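
    A minimal sketch of the delay-based reservoir idea, assuming a discrete-time emulation with time-multiplexed "virtual nodes" rather than the paper's electronic implementation; the tanh nonlinearity, the scalings ETA and GAMMA, and the one-step-recall task are illustrative assumptions.

    # Single nonlinear node with delayed feedback, emulated in discrete time.
    import numpy as np

    rng = np.random.default_rng(0)
    N_VIRTUAL = 50          # virtual nodes spread along one delay interval
    ETA, GAMMA = 0.5, 0.05  # feedback and input scaling (assumed values)
    MASK = rng.uniform(-1, 1, N_VIRTUAL)   # fixed random input mask

    def run_reservoir(u):
        """Map a scalar input sequence u[t] to virtual-node states."""
        x = np.zeros(N_VIRTUAL)            # states along the delay line
        states = np.empty((len(u), N_VIRTUAL))
        for t, ut in enumerate(u):
            # np.roll imitates the coupling between neighbouring virtual nodes
            # that the finite response time of a physical node provides.
            x = np.tanh(ETA * np.roll(x, 1) + GAMMA * MASK * ut)
            states[t] = x
        return states

    # Toy task: a linear (ridge) readout that recalls the previous input u[t-1].
    u = rng.uniform(-1, 1, 2000)
    states = run_reservoir(u)
    X, y = states[1:], u[:-1]
    W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N_VIRTUAL), X.T @ y)
    print("train NMSE:", np.mean((X @ W_out - y) ** 2) / np.var(y))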

    Evolution of associative learning in chemical networks

    Organisms that can learn about their environment and modify their behaviour appropriately during their lifetime are more likely to survive and reproduce than organisms that do not. While associative learning (the ability to detect correlated features of the environment) has been studied extensively in nervous systems, where the underlying mechanisms are reasonably well understood, mechanisms within single cells that could allow associative learning have received little attention. Here, using in silico evolution of chemical networks, we show that there exists a diversity of remarkably simple and plausible chemical solutions to the associative learning problem, the simplest of which uses only one core chemical reaction. We then asked to what extent a linear combination of chemical concentrations in the network could approximate the ideal Bayesian posterior of the environment given the stimulus history so far. This Bayesian analysis revealed the 'memory traces' of the chemical network. The implication of this paper is that there is little reason to believe that a lack of suitable phenotypic variation would prevent associative learning from evolving in cell signalling, metabolic, gene regulatory, or a mixture of these networks in cells.
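
    The Bayesian readout analysis mentioned above can be sketched as follows; the two-state environment, its switching and stimulus probabilities, and the leaky-integrator "concentrations" are illustrative assumptions standing in for the evolved chemical networks of the paper.

    # Fit a linear combination of toy chemical concentrations to the exact
    # Bayesian posterior of a two-state hidden environment.
    import numpy as np

    rng = np.random.default_rng(1)
    T = 5000
    P_SWITCH = 0.01                     # environment switching probability
    P_STIM = (0.2, 0.8)                 # P(stimulus | environment = 0 or 1)

    # Hidden two-state environment and the binary stimulus stream it emits.
    env = np.zeros(T, dtype=int)
    for t in range(1, T):
        env[t] = env[t - 1] ^ (rng.random() < P_SWITCH)
    stim = (rng.random(T) < np.where(env == 1, P_STIM[1], P_STIM[0])).astype(float)

    # Exact recursive Bayesian posterior P(env = 1 | stimuli so far).
    post = np.zeros(T)
    p = 0.5
    for t in range(T):
        p = p * (1 - P_SWITCH) + (1 - p) * P_SWITCH      # prediction step
        l1 = P_STIM[1] if stim[t] else 1 - P_STIM[1]     # likelihoods
        l0 = P_STIM[0] if stim[t] else 1 - P_STIM[0]
        p = p * l1 / (p * l1 + (1 - p) * l0)             # update step
        post[t] = p

    # Toy "memory traces": leaky integrators of the stimulus with different
    # decay rates, standing in for species concentrations in the network.
    decays = np.array([0.9, 0.97, 0.995])
    conc = np.zeros((T, len(decays)))
    for t in range(1, T):
        conc[t] = decays * conc[t - 1] + (1 - decays) * stim[t]

    # Linear readout of the concentrations onto the posterior (least squares).
    X = np.hstack([conc, np.ones((T, 1))])
    w, *_ = np.linalg.lstsq(X, post, rcond=None)
    print("readout RMSE:", np.sqrt(np.mean((X @ w - post) ** 2)))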

    Spike-Train Responses of a Pair of Hodgkin-Huxley Neurons with Time-Delayed Couplings

    Model calculations have been performed on the spike-train response of a pair of Hodgkin-Huxley (HH) neurons coupled by recurrent excitatory-excitatory couplings with time delay. The coupled, excitable HH neurons are assumed to receive two kinds of spike-train inputs: a transient input consisting of $M$ impulses of finite duration ($M$: integer) and a sequential input with a constant interspike interval (ISI). The distribution of the output ISI $T_{\rm o}$ shows a rich variety depending on the coupling strength and the time delay. A comparison is made between the dependence of the output ISI on the transient inputs and that on the sequential inputs. Comment: 19 pages, 4 figures.
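
    A heavily simplified sketch of this setup: two mutually, excitatorily coupled excitable units with a transmission delay, driven by a sequential input of constant ISI. Leaky integrate-and-fire units stand in for the Hodgkin-Huxley equations, and all parameter values are illustrative assumptions.

    # Two excitable units with delayed mutual excitation; LIF stand-in for HH.
    import numpy as np

    DT, T_MAX = 0.1, 1000.0              # time step and simulation length (ms)
    TAU, V_TH, V_RESET = 20.0, 1.0, 0.0  # membrane time constant, threshold, reset
    G_SYN, DELAY = 1.1, 5.0              # coupling strength and time delay (ms)
    ISI_IN = 30.0                        # sequential input with constant ISI (ms)

    v = np.zeros(2)
    spikes = ([], [])                    # spike times of neuron 0 and neuron 1

    for k in range(int(T_MAX / DT)):
        t = k * DT
        # sequential suprathreshold input pulses, delivered to neuron 0 only
        i_ext = np.array([1.2 if (t % ISI_IN) < DT else 0.0, 0.0])
        # delayed excitatory coupling: a pulse whenever the partner fired DELAY ago
        i_syn = np.array([
            G_SYN if any(abs(t - DELAY - ts) < DT / 2 for ts in spikes[1]) else 0.0,
            G_SYN if any(abs(t - DELAY - ts) < DT / 2 for ts in spikes[0]) else 0.0,
        ])
        v += DT / TAU * (-v) + i_ext + i_syn
        for n in (0, 1):
            if v[n] >= V_TH:
                spikes[n].append(t)
                v[n] = V_RESET

    isi_out = np.diff(spikes[1])
    if isi_out.size:
        print("output ISI of neuron 1 (ms): mean %.1f, std %.1f"
              % (isi_out.mean(), isi_out.std()))

    With the suprathreshold coupling chosen here the pair settles into a reverberation whose output ISI is roughly twice the delay; weaker couplings and different delays give different output ISI statistics, which is the dependence the paper maps out for the full HH model.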

    An associative memory of Hodgkin-Huxley neuron networks with Willshaw-type synaptic couplings

    An associative memory is discussed for neural networks consisting of $N$ (=100) spiking Hodgkin-Huxley (HH) neurons with time-delayed couplings, which memorize $P$ patterns in their synaptic weights. In addition to excitatory synapses whose strengths are modified according to the Willshaw-type learning rule with the 0/1 code for quiescent/active states, the network includes uniform inhibitory synapses which are introduced to reduce cross-talk noise. Our simulations of the HH neuron network in the noise-free state yield a fairly good performance, with a storage capacity of $\alpha_c = P_{\rm max}/N \sim 0.4 - 2.4$ for a low neuron activity of $f \sim 0.04 - 0.10$. This storage capacity of our temporal-code network is comparable to that of the rate-code model with Willshaw-type synapses. Our HH neuron network is found not to be vulnerable to the distribution of time delays in the couplings. The variability of the interspike interval (ISI) of output spike trains in the process of retrieving stored patterns is also discussed. Comment: 15 pages, 3 figures, changed Title.
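
    The Willshaw-type 0/1 learning rule referred to above can be sketched in a plain binary setting, without the HH dynamics or the time delays; N, P, the activity level, the k-winners-take-all retrieval and the noisy-cue test are illustrative assumptions.

    # Willshaw-type associative memory with sparse binary patterns.
    import numpy as np

    rng = np.random.default_rng(2)
    N, P, f = 100, 40, 0.07                        # neurons, patterns, activity
    K = max(1, int(round(f * N)))                  # active units per pattern

    # Sparse binary patterns: exactly K active units each.
    patterns = np.zeros((P, N), dtype=int)
    for mu in range(P):
        patterns[mu, rng.choice(N, K, replace=False)] = 1

    # Willshaw learning: a synapse is 1 if pre and post were ever co-active.
    W = (patterns.T @ patterns > 0).astype(int)
    np.fill_diagonal(W, 0)

    def recall(cue):
        """One-step retrieval keeping the K units with the largest input
        (a k-winners-take-all stand-in for the classical threshold rule)."""
        h = W @ cue
        out = np.zeros(N, dtype=int)
        out[np.argsort(h)[-K:]] = 1
        return out

    # Test retrieval from degraded cues (two active bits deleted).
    overlaps = []
    for mu in range(P):
        cue = patterns[mu].copy()
        cue[rng.choice(np.flatnonzero(cue), 2, replace=False)] = 0
        overlaps.append((recall(cue) & patterns[mu]).sum() / K)
    print("mean retrieval overlap:", np.mean(overlaps))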

    Echo State Property of Deep Reservoir Computing Networks

    In recent years, the Reservoir Computing (RC) framework has emerged as a state-of-the-art approach for efficient learning in temporal domains. Recently, within the RC context, deep Echo State Network (ESN) models have been proposed. Being composed of a stack of multiple non-linear reservoir layers, deep ESNs potentially make it possible to exploit the advantages of a hierarchical temporal feature representation at different levels of abstraction, while preserving the training efficiency typical of the RC methodology. In this paper, we generalize to the case of deep architectures the fundamental RC conditions related to the Echo State Property (ESP), based on the study of stability and contractivity of the resulting dynamical system. Besides providing a necessary condition and a sufficient condition for the ESP of layered RC networks, the results of our analysis also provide insights into the nature of the state dynamics in hierarchically organized recurrent models. In particular, we find that by adding layers to a deep reservoir architecture, the regime of the network's dynamics can only be driven towards (equally or) less stable behaviors. Moreover, our investigation shows an intrinsic ability to differentiate the temporal dynamics at the different levels of a deep recurrent architecture, with higher layers in the stack characterized by less contractive dynamics. Such theoretical insights are further supported by experimental results that show the effect of layering in terms of a progressively increased short-term memory capacity of the recurrent models.
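
    As an illustration of the quantities such conditions are built on (not the paper's exact bounds), the sketch below stacks a few reservoir layers and reports, per layer, the spectral radius of the recurrent matrix (the quantity behind necessary-type ESP conditions) and its largest singular value (behind standard contractivity-based sufficient conditions); layer sizes and weight scalings are assumptions.

    # Deep (stacked) echo state reservoirs and their per-layer stability numbers.
    import numpy as np

    rng = np.random.default_rng(3)
    N_LAYERS, N_UNITS = 4, 100
    RHO_TARGET = 0.9                    # desired spectral radius per layer

    def make_layer():
        W = rng.uniform(-1, 1, (N_UNITS, N_UNITS))
        W *= RHO_TARGET / max(abs(np.linalg.eigvals(W)))   # rescale spectral radius
        W_in = rng.uniform(-0.1, 0.1, (N_UNITS, N_UNITS))  # drive from layer below
        return W, W_in

    layers = [make_layer() for _ in range(N_LAYERS)]

    def deep_esn_step(states, u):
        """One update of the stack: layer l is driven by the already-updated
        state of layer l-1, and the first layer by the external input u."""
        new_states, drive = [], u
        for (W, W_in), x in zip(layers, states):
            x = np.tanh(W @ x + W_in @ drive)
            new_states.append(x)
            drive = x
        return new_states

    # One step, just to show the intended use of the stack.
    states = deep_esn_step([np.zeros(N_UNITS)] * N_LAYERS, rng.uniform(-1, 1, N_UNITS))

    for l, (W, _) in enumerate(layers, start=1):
        rho = max(abs(np.linalg.eigvals(W)))
        sigma = np.linalg.norm(W, 2)                       # largest singular value
        print(f"layer {l}: spectral radius {rho:.3f}, largest singular value {sigma:.3f}")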

    ATPase Inhibitory Factor-1 Disrupts Mitochondrial Ca2+ Handling and Promotes Pathological Cardiac Hypertrophy through CaMKIIδ

    ATPase inhibitory factor-1 (IF1) preserves cellular ATP under conditions of respiratory collapse, yet the function of IF1 under normal respiring conditions is unresolved. We tested the hypothesis that IF1 promotes mitochondrial dysfunction and pathological cardiomyocyte hypertrophy in the context of heart failure (HF). Methods and results: Cardiac expression of IF1 was increased in mice and in humans with HF, downstream of neurohumoral signaling pathways and in patterns that resembled the fetal-like gene program. Adenoviral expression of wild-type IF1 in primary cardiomyocytes resulted in pathological hypertrophy and metabolic remodeling, as evidenced by enhanced mitochondrial oxidative stress, reduced mitochondrial respiratory capacity, and augmentation of extramitochondrial glycolysis. Similar perturbations were observed with an IF1 mutant incapable of binding to ATP synthase (E55A mutation), an indication that these effects occurred independently of binding to ATP synthase. Instead, IF1 promoted mitochondrial fragmentation and compromised mitochondrial Ca2+ handling, which resulted in sarcoplasmic reticulum Ca2+ overloading. The effects of IF1 on Ca2+ handling were associated with the cytosolic activation of calcium-calmodulin kinase II (CaMKII), and inhibition of CaMKII or co-expression of catalytically dead CaMKIIδC was sufficient to prevent IF1-induced pathological hypertrophy. Conclusions: IF1 represents a novel member of the fetal-like gene program that contributes to mitochondrial dysfunction and pathological cardiac remodeling in HF. Furthermore, we present evidence for a novel, ATP-synthase-independent role for IF1 in mitochondrial Ca2+ handling and mitochondrial-to-nuclear crosstalk involving CaMKII.

    Representation of Time-Varying Stimuli by a Network Exhibiting Oscillations on a Faster Time Scale

    Sensory processing is associated with gamma frequency oscillations (30–80 Hz) in sensory cortices. This raises the question whether gamma oscillations can be directly involved in the representation of time-varying stimuli, including stimuli whose time scale is longer than a gamma cycle. We are interested in the ability of the system to reliably distinguish different stimuli while being robust to stimulus variations such as uniform time-warp. We address this issue with a dynamical model of spiking neurons and study the response to an asymmetric sawtooth input current over a range of shape parameters. These parameters describe how fast the input current rises and falls in time. Our network consists of inhibitory and excitatory populations that are sufficient for generating oscillations in the gamma range. The oscillation period is about one-third of the stimulus duration. Embedded in this network are a subpopulation of excitatory cells that respond to the sawtooth stimulus and a subpopulation of cells that respond to an onset cue. The intrinsic gamma oscillations generate a temporally sparse code for the external stimuli. In this code, an excitatory cell may fire a single spike during a gamma cycle, depending on its tuning properties and on the temporal structure of the specific input; the identity of the stimulus is coded by the list of excitatory cells that fire during each cycle. We quantify the properties of this representation in a series of simulations and show that the sparseness of the code makes it robust to uniform warping of the time scale. We find that resetting of the oscillation phase at stimulus onset is important for a reliable representation of the stimulus and that there is a tradeoff between the resolution of the neural representation of the stimulus and robustness to time-warp.

    Author Summary: Sensory processing of time-varying stimuli, such as speech, is associated with high-frequency oscillatory cortical activity, the functional significance of which is still unknown. One possibility is that the oscillations are part of a stimulus-encoding mechanism. Here, we investigate a computational model of such a mechanism, a spiking neuronal network whose intrinsic oscillations interact with external input (waveforms simulating short speech segments in a single acoustic frequency band) to encode stimuli that extend over a time interval longer than the oscillation's period. The network implements a temporally sparse encoding, whose robustness to time warping and neuronal noise we quantify. To our knowledge, this study is the first to demonstrate that a biophysically plausible model of oscillations occurring in the processing of auditory input may generate a representation of signals that span multiple oscillation cycles.

    Funding: National Science Foundation (DMS-0211505); Burroughs Wellcome Fund; U.S. Air Force Office of Scientific Research
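
    A toy illustration of the per-cycle "list code" described above: threshold units stand in for the tuned excitatory cells, a fixed-period gamma clock partitions time into cycles, and the sawtooth shapes, thresholds and warp factor are all assumptions rather than the paper's spiking network.

    # Per-gamma-cycle "list code" for an asymmetric sawtooth stimulus.
    import numpy as np

    T_GAMMA = 25.0                        # intrinsic gamma period (ms), assumed
    N_CELLS = 20
    THRESHOLDS = np.linspace(0.05, 0.95, N_CELLS)

    def sawtooth(duration, rise_frac, dt=0.1):
        """Asymmetric sawtooth: linear rise then linear fall, peak value 1."""
        t = np.arange(0.0, duration, dt)
        t_rise = rise_frac * duration
        s = np.where(t < t_rise, t / t_rise, (duration - t) / (duration - t_rise))
        return t, s

    def list_code(t, s):
        """For each gamma cycle, the set of cells whose threshold is first
        crossed upward during that cycle (at most one spike per cell)."""
        code = {}
        for i, th in enumerate(THRESHOLDS):
            above = s >= th
            crossings = np.flatnonzero(above[1:] & ~above[:-1])
            if crossings.size:
                cycle = int(t[crossings[0] + 1] // T_GAMMA)
                code.setdefault(cycle, set()).add(i)
        return code

    def similarity(code_a, code_b):
        """Fraction of cells assigned to the same gamma cycle in both codes."""
        cell_cycle = lambda c: {i: k for k, cells in c.items() for i in cells}
        a, b = cell_cycle(code_a), cell_cycle(code_b)
        shared = set(a) & set(b)
        return sum(a[i] == b[i] for i in shared) / max(1, len(shared))

    base = list_code(*sawtooth(75.0, rise_frac=0.6))
    warped = list_code(*sawtooth(90.0, rise_frac=0.6))    # uniform 1.2x time-warp
    other = list_code(*sawtooth(75.0, rise_frac=0.2))     # different stimulus shape
    print("same stimulus, time-warped:", similarity(base, warped))
    print("different stimulus shape:  ", similarity(base, other))

    Even in this stripped-down setting, a uniformly time-warped copy of the stimulus keeps most cells in the same gamma cycle, while a differently shaped stimulus does not; this is the flavour of robustness that the paper quantifies with its spiking network simulations.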