
    Reverse engineering of genetic networks with time delayed recurrent neural networks and clustering techniques

    In the iterative process of experimentally probing biological networks and computationally inferring models for them, fast, accurate and flexible computational frameworks are needed for modeling and reverse engineering biological networks. In this dissertation, I propose a novel model that simulates gene regulatory networks using a specific type of time-delayed recurrent neural network. I also introduce a parameter clustering method to select groups of parameter sets from the simulations that represent biologically reasonable networks. Additionally, a general-purpose adaptive function is used to reduce and study the connectivity of small gene regulatory network modules. The model is shown to simulate the dynamics and to infer the topology of gene regulatory networks derived from synthetic and experimental time series gene expression data. I assess the quality of the inferred networks using graph edit distance measurements against the synthetic and experimental benchmarks, and I compare the edit costs of networks inferred with the time-delayed recurrent networks against those of previously described reverse-engineering methods based on continuous-time recurrent neural networks and dynamic Bayesian networks. Furthermore, I address questions of network connectivity and of the correlation between data fitting and inference power by simulating common experimental limitations of the reverse-engineering process, such as incomplete and highly noisy data. The novel time-delayed recurrent neural network model, in combination with parameter clustering, substantially improves the inference power of reverse-engineered networks. Finally, some suggestions for future improvements are discussed, particularly from a data-driven perspective on modeling complex biological systems.
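    To make the modeling idea concrete, here is a minimal sketch of a time-delayed recurrent model of gene regulation, assuming the common form dx_i/dt = -decay_i * x_i + sigma(sum_j W_ij * x_j(t - tau_ij) + b_i); the dissertation's exact equations, parameters and clustering procedure are not reproduced, and all names below (simulate_grn, the toy three-gene network) are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate_grn(W, delays, decay, bias, x0, steps, dt=0.1):
    """Euler simulation of a time-delayed recurrent model of gene expression:
    dx_i/dt = -decay_i * x_i + sigmoid(sum_j W_ij * x_j(t - delays_ij) + bias_i),
    with delays given in integer time steps."""
    n = len(x0)
    max_delay = int(delays.max())
    # history buffer: rows are time steps, columns are genes
    x = np.zeros((steps + max_delay, n))
    x[:max_delay + 1] = x0  # constant history before t = 0
    for t in range(max_delay, steps + max_delay - 1):
        # delayed inputs: regulator j of target i is read delays[i, j] steps back
        delayed = np.array([[x[t - int(delays[i, j]), j] for j in range(n)]
                            for i in range(n)])
        drive = sigmoid((W * delayed).sum(axis=1) + bias)
        x[t + 1] = x[t] + dt * (-decay * x[t] + drive)
    return x[max_delay:]

# toy 3-gene network: gene 0 activates gene 1 with a delay, gene 1 represses gene 2
rng = np.random.default_rng(0)
W = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [0.0, -2.0, 0.0]])
delays = np.array([[0, 0, 0], [3, 0, 0], [0, 5, 0]])
traj = simulate_grn(W, delays, decay=np.ones(3), bias=np.zeros(3),
                    x0=rng.uniform(0, 1, 3), steps=200)
print(traj[-1])
```

    In a reverse-engineering setting, many such parameter sets (W, delays, decay, bias) would be fit against measured expression time series and then grouped, which is where a clustering step over parameter space comes in.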

    Complexity without chaos: Plasticity within random recurrent networks generates robust timing and motor control

    It is widely accepted that the complex dynamics characteristic of recurrent neural circuits contributes in a fundamental manner to brain function. Progress has been slow in understanding and exploiting the computational power of recurrent dynamics for two main reasons: nonlinear recurrent networks often exhibit chaotic behavior, and most known learning rules do not work in a robust fashion in recurrent networks. Here we address both of these problems by demonstrating how random recurrent networks (RRNs) that initially exhibit chaotic dynamics can be tuned through a supervised learning rule to generate locally stable patterns of neural activity that are both complex and robust to noise. The outcome is a novel neural network regime that exhibits both transiently stable and chaotic trajectories. We further show that the recurrent learning rule dramatically increases the ability of RRNs to generate complex spatiotemporal motor patterns, and accounts for recent experimental data showing a decrease in neural variability in response to stimulus onset.
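    As a rough illustration of the approach (not the authors' exact rule), the sketch below assumes a rate network tau * dx/dt = -x + W tanh(x) in the chaotic regime (gain g > 1), records an "innate" trajectory produced by the untrained network, and then applies a FORCE-style recursive least-squares update to the recurrent weights so that perturbed runs relax back toward that trajectory; for brevity a single inverse-correlation matrix P is shared across weight rows, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N, g, dt, tau = 200, 1.5, 1.0, 10.0   # g > 1 puts the untrained network in the chaotic regime
W = g * rng.normal(0, 1.0 / np.sqrt(N), (N, N))

def run(W, x0, steps):
    """Simulate the rate network tau * dx/dt = -x + W @ tanh(x), returning rates."""
    x, traj = x0.copy(), []
    for _ in range(steps):
        x += (dt / tau) * (-x + W @ np.tanh(x))
        traj.append(np.tanh(x))
    return np.array(traj)

# 1) record an "innate" trajectory produced by the chaotic network itself
x0 = rng.normal(0, 0.5, N)
target = run(W, x0, steps=300)

# 2) FORCE-style recursive least squares on the recurrent weights: after each
#    step, nudge W so every unit's rate tracks its innate target
W_train = W.copy()
P = np.eye(N)                          # running estimate of the inverse rate correlation
for epoch in range(10):
    x = x0 + 0.01 * rng.normal(0, 1, N)    # slightly perturbed start each epoch
    for t in range(300):
        r = np.tanh(x)
        x += (dt / tau) * (-x + W_train @ r)
        r_new = np.tanh(x)
        Pr = P @ r_new
        P -= np.outer(Pr, Pr) / (1.0 + r_new @ Pr)
        err = r_new - target[t]            # deviation from the innate trajectory
        W_train -= np.outer(err, P @ r_new)

# after training, perturbed starts should relax back onto the innate pattern
test = run(W_train, x0 + 0.01 * rng.normal(0, 1, N), steps=300)
print("mean deviation from innate trajectory:", np.abs(test - target).mean())
```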

    Photonic Delay Systems as Machine Learning Implementations

    Nonlinear photonic delay systems are interesting implementation platforms for machine learning models: they can be extremely fast, offer a high degree of parallelism, and potentially consume far less power than digital processors. So far they have been successfully employed for signal processing using the Reservoir Computing paradigm. In this paper we show that their range of applicability can be greatly extended by using gradient descent with backpropagation through time on a model of the system to optimize its input encoding. We perform physical experiments demonstrating that the obtained input encodings work well in practice, and we show that the optimized systems perform significantly better than the common Reservoir Computing approach. These results demonstrate that common gradient-descent techniques from machine learning may well be applicable to physical neuro-inspired analog computers.
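    The sketch below is a highly simplified software model of the time-multiplexed "single nonlinear node with delayed feedback" reservoir that such photonic systems implement, using a random binary input mask and a ridge-regression readout — that is, the Reservoir Computing baseline the paper improves upon. The paper's contribution, optimizing the input mask via backpropagation through time on a system model, is not reproduced here; the nonlinearity, parameters and memory task are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def delay_reservoir(u, mask, alpha=0.8, beta=0.3):
    """Time-multiplexed 'single nonlinear node with delay' reservoir.
    Each scalar input u[t] is stretched over len(mask) virtual nodes; each
    node's state mixes the masked input with the state of the same node one
    delay loop (one input step) earlier, through a sin^2 nonlinearity.
    (Inter-node coupling from delay/clock mismatch is omitted for simplicity.)"""
    n_virtual = len(mask)
    states = np.zeros((len(u), n_virtual))
    prev = np.zeros(n_virtual)
    for t, ut in enumerate(u):
        prev = np.sin(alpha * prev + beta * mask * ut) ** 2
        states[t] = prev
    return states

# toy task: reconstruct the input from three steps in the past (a memory task)
T, n_virtual = 2000, 50
u = rng.uniform(-1, 1, T)
y = np.roll(u, 3)
mask = rng.choice([-1.0, 1.0], n_virtual)   # random binary input mask

X = delay_reservoir(u, mask)
# ridge-regression readout, trained on steps 100..1500, tested on the rest
lam = 1e-4
Xtr, ytr = X[100:1500], y[100:1500]
w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(n_virtual), Xtr.T @ ytr)
pred = X[1500:] @ w
print("test MSE:", np.mean((pred - y[1500:]) ** 2))
```

    In the approach the abstract describes, the mask (and possibly other input parameters) would be treated as trainable and optimized by backpropagating the task error through a differentiable model of this dynamics, rather than being drawn at random as above.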