
    An Online Calibration System for Digital Input Electricity Meters Based on Improved Nuttall Window

    Get PDF
    This paper proposes an improved online calibration technique for digital-input electricity meters. The technique employs a double spectral-line interpolation fast Fourier transform (FFT) algorithm with a four-term third-order Nuttall window to reduce the measurement error caused by spectrum leakage, frequency fluctuation, noise pollution, and harmonic interference. A calibration system with a user-friendly human-computer interface is designed in LabVIEW. Simulation and practical results show that the proposed calibration system with the improved Nuttall window algorithm achieves higher accuracy and reliability than the traditional calibration algorithm currently used in industry practice.
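The double spectral-line interpolation idea can be illustrated with a minimal sketch. The paper's Nuttall-window correction uses a fitted polynomial; here, as an assumption for illustration, a Hann window is used instead because its two-line interpolation formula has a simple closed form. Function names and parameters are my own, not the paper's.

```python
import numpy as np

def interp_freq(signal, fs):
    """Estimate a tone's frequency by two-point (double spectral-line)
    interpolation of the windowed FFT.  Hann window is used here because
    its fractional-bin correction is closed-form; a Nuttall window would
    need a polynomial fit for the same step."""
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal * np.hanning(n)))
    k = np.argmax(spec[1:-1]) + 1              # peak bin (skip DC)
    # pick the larger neighbour so the true peak lies between the two lines
    if spec[k + 1] >= spec[k - 1]:
        y1, y2, base = spec[k], spec[k + 1], k
    else:
        y1, y2, base = spec[k - 1], spec[k], k - 1
    beta = y2 / y1
    delta = (2 * beta - 1) / (beta + 1)        # fractional offset, Hann window
    return (base + delta) * fs / n
```

The interpolation recovers the frequency to a small fraction of a bin, which is the mechanism the paper refines against leakage and harmonics.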

    Topological Effects of Synaptic Time Dependent Plasticity

    Full text link
    We show that the local Spike Timing-Dependent Plasticity (STDP) rule has the effect of regulating the trans-synaptic weights of loops of any length within a simulated network of neurons. We show that, depending on STDP's polarity, functional loops are formed or eliminated in networks driven to normal spiking conditions by random, partially correlated inputs, where functional loops comprise weights that exceed a non-zero threshold. We further prove that STDP is a form of loop-regulating plasticity for the case of a linear network comprising random weights drawn from certain distributions. Thus a notable local synaptic learning rule makes a specific prediction about synapses in the brain in which standard STDP is present: that under normal spiking conditions, they should participate in predominantly feed-forward connections at all scales. Our model implies that any deviation from this prediction would require a substantial modification to the hypothesized role for standard STDP. Given its widespread occurrence in the brain, we predict that STDP could also regulate long-range synaptic loops among individual neurons across all brain scales, up to, and including, the scale of global brain network topology. Comment: 26 pages, 5 figures
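The "standard STDP" the abstract refers to is the pair-based exponential window: potentiation when the presynaptic spike precedes the postsynaptic one, depression otherwise. A minimal sketch, with illustrative parameter values that are not taken from the paper:

```python
import numpy as np

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight update as a function of spike-timing
    difference delta_t = t_post - t_pre (ms).  Pre-before-post
    (delta_t > 0) potentiates; post-before-pre depresses.  Amplitudes
    and time constant here are illustrative only."""
    delta_t = np.asarray(delta_t, dtype=float)
    return np.where(delta_t > 0,
                    a_plus * np.exp(-delta_t / tau),
                    -a_minus * np.exp(delta_t / tau))
```

Applied around a loop of synapses, the asymmetry of this window is what biases weights toward feed-forward structure, which is the paper's central claim.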

    Nonlinear Systems

    Get PDF
    Nonlinearity is a challenging notion for theoretical modeling, technical analysis, and numerical simulation in physics and mathematics, as well as in many other fields, as highly correlated nonlinear phenomena, evolving over a large range of time scales and length scales, control the underlying systems and processes in their spatiotemporal evolution. Indeed, available data, be they physical, biological, or financial, and technologically complex systems, as well as stochastic systems such as mechanical or electronic devices, can be treated within the same conceptual framework, both analytically and through computer simulation, using effective nonlinear dynamics methods. The aim of this Special Issue is to highlight papers that show the dynamics, control, optimization, and applications of nonlinear systems. This has recently become an increasingly popular subject, with impressive growth in applications in engineering, economics, biology, and medicine, and can be considered a veritable contribution to the literature. Original papers relating to the objective presented above are especially welcome. Potential topics include, but are not limited to: stability analysis of discrete and continuous dynamical systems; nonlinear dynamics in biological complex systems; stability and stabilization of stochastic systems; mathematical models in statistics and probability; synchronization of oscillators and chaotic systems; optimization methods of complex systems; reliability modeling and system optimization; computation and control over networked systems.

    Dynamical Systems in Spiking Neuromorphic Hardware

    Get PDF
    Dynamical systems are universal computers. They can perceive stimuli, remember, learn from feedback, plan sequences of actions, and coordinate complex behavioural responses. The Neural Engineering Framework (NEF) provides a general recipe for formulating models of such systems as coupled sets of nonlinear differential equations and compiling them onto recurrently connected spiking neural networks, akin to a programming language for spiking models of computation. The Nengo software ecosystem supports the NEF and compiles such models onto neuromorphic hardware. In this thesis, we analyze the theory driving the success of the NEF and expose several core principles underpinning its correctness, scalability, completeness, robustness, and extensibility. We also derive novel theoretical extensions to the framework that enable it to leverage a wide variety of dynamics in digital hardware far more effectively, and to exploit device-level physics in analog hardware. At the same time, we propose a novel set of spiking algorithms built on an optimal nonlinear encoding of time, which we call the Delay Network (DN). Backpropagation across stacked layers of DNs dramatically outperforms stacked Long Short-Term Memory (LSTM) networks, a state-of-the-art deep recurrent architecture, in accuracy and training time on a continuous-time memory task and a chaotic time-series prediction benchmark. The basic component of this network is shown to function on state-of-the-art spiking neuromorphic hardware, including Braindrop and Loihi. This implementation approaches the energy efficiency of the human brain in the former case, and the precision of conventional computation in the latter.
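The NEF's compilation of differential equations onto recurrent networks has a particularly simple form for linear systems: to realize dx/dt = Ax + Bu through synapses that act as first-order low-pass filters with time constant tau, the recurrent and input transforms become A' = tau*A + I and B' = tau*B. A minimal sketch of that mapping (the function name is mine, not Nengo's):

```python
import numpy as np

def nef_recurrent_map(A, B, tau):
    """Map the linear dynamics dx/dt = A x + B u onto a recurrent
    network whose synapses are first-order low-pass filters with time
    constant tau: the network must implement A' = tau*A + I on its
    recurrent connection and B' = tau*B on its input connection."""
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    return tau * A + np.eye(A.shape[0]), tau * B

# Example: a pure integrator dx/dt = u becomes identity recurrent feedback.
Ap, Bp = nef_recurrent_map(np.zeros((1, 1)), np.ones((1, 1)), tau=0.1)
```

The identity feedback for the integrator case shows why the synaptic filter, rather than fighting the dynamics, is recruited to implement them.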

    Reservoir Computing: computation with dynamical systems

    Get PDF
    In the research field of Machine Learning, systems are studied that can learn from examples. Within this field, recurrent neural networks form an important subgroup. These networks are abstract models of the operation of parts of the brain. They are capable of solving very complex temporal problems, but are generally very difficult to train. Recently, a number of similar methods have been proposed that eliminate this training problem. These methods go by the name Reservoir Computing. Reservoir Computing combines the impressive computational power of recurrent neural networks with a simple training method. Moreover, these training methods turn out not to be limited to neural networks, but can be applied to generic dynamical systems. Why these systems work well, and which properties determine their performance, is however not yet clear. For this dissertation, the dynamical properties of generic Reservoir Computing systems were investigated. It was shown experimentally that the idea of Reservoir Computing is also applicable to non-neural networks of dynamical nodes. Furthermore, a measure was proposed that can be used to quantify the dynamic regime of a reservoir. Finally, an adaptation rule was introduced that, for a broad range of reservoir types, can tune the dynamics of the reservoir toward the desired dynamic regime. The techniques described in this dissertation are demonstrated on several academic and engineering applications.
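The core of the approach can be sketched in a few lines: a fixed random recurrent network (the reservoir) is driven by the input, and only a linear readout of its states is ever trained. The spectral radius used below is the usual knob for the "dynamic regime" the thesis studies; all names and values here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n, spectral_radius=0.9):
    """Random recurrent weight matrix rescaled to a target spectral
    radius, the standard way to place a reservoir near the desired
    dynamic regime."""
    W = rng.standard_normal((n, n))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W

def run_reservoir(W, W_in, u):
    """Drive the fixed reservoir with input sequence u and collect the
    state trajectory.  Training would fit only a linear readout (e.g.
    ridge regression) on these states; W and W_in stay untouched."""
    x = np.zeros(W.shape[0])
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)
```

Because the recurrent weights are never adapted, the hard credit-assignment problem of recurrent training disappears, which is exactly the simplification the abstract describes.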

    Information processing in a midbrain visual pathway

    Get PDF
    Visual information is processed in the brain via intricate interactions between neurons. We investigated a midbrain visual pathway (the optic tectum and its isthmic nucleus) that is motion sensitive and is thought to be part of the attentional system. We determined the physiological properties of individual neurons, as well as their synaptic connections, with intracellular recordings. We reproduced the center-surround receptive-field structure of tectal neurons in a dynamical recurrent feedback loop. We show in a computational model that the anti-topographic inhibitory feedback could mediate competitive stimulus selection in a complex visual scene. We also investigated the dynamics of this competitive selection in a rate model. The isthmotectal feedback loop gates the transfer of information from the tectum to the thalamic nucleus rotundus. We discuss the role of a localized feedback projection in the gating mechanism with both experimental and numerical approaches. We further discuss the dynamics of the isthmotectal system by considering the propagation delays between its components. We conclude that the isthmotectal system is involved in attention-like competitive stimulus selection and controls information coding in the motion-sensitive SGC-I neurons by modulating retino-tectal synaptic transmission.
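Competitive selection through broad (anti-topographic) inhibitory feedback can be sketched with a simple rate model: each unit drives a shared inhibitory signal that feeds back onto all units, so the most strongly driven unit suppresses the rest. This is a generic winner-take-all sketch, not the paper's specific model, and all parameters are illustrative.

```python
import numpy as np

def compete(inputs, steps=2000, dt=0.01, w_exc=1.5, w_inh=2.0):
    """Rate-model winner-take-all: each unit receives its stimulus drive,
    local self-excitation, and global inhibition proportional to total
    activity (a stand-in for anti-topographic feedback).  Euler-integrates
    dr/dt = -r + relu(I + w_exc*r - w_inh*sum(r))."""
    r = np.zeros(len(inputs))
    for _ in range(steps):
        drive = np.maximum(inputs + w_exc * r - w_inh * r.sum(), 0.0)
        r += dt * (-r + drive)
    return r

# The unit with the strongest input suppresses the others.
rates = compete(np.array([0.5, 1.5, 0.5]))
```

The instability of the difference mode (self-excitation minus shared inhibition) is what converts a graded input advantage into an all-or-none selection.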

    Complex and Adaptive Dynamical Systems: A Primer

    Full text link
    A thorough introduction is given, at an introductory level, to the field of quantitative complex system science, with special emphasis on emergence in dynamical systems based on network topologies. Subjects treated include graph theory and small-world networks, a generic introduction to the concepts of dynamical system theory, random Boolean networks, cellular automata and self-organized criticality, the statistical modeling of Darwinian evolution, synchronization phenomena, and an introduction to the theory of cognitive systems. It includes chapters on Graph Theory and Small-World Networks; Chaos, Bifurcations and Diffusion; Complexity and Information Theory; Random Boolean Networks; Cellular Automata and Self-Organized Criticality; Darwinian Evolution, Hypercycles and Game Theory; Synchronization Phenomena; and Elements of Cognitive System Theory. Comment: unformatted version of the textbook; published by Springer, Complexity Series (2008; second edition 2010)

    Dynamical systems applied to consciousness and brain rhythms in a neural network

    Get PDF
    This thesis applies the great advances of modern dynamical systems theory (DST) to consciousness. Consciousness, or subjective experience, is approached here in two different ways: from the global dynamics of the human brain and from the integrated information theory (IIT), one of the currently most prestigious theories of consciousness. Before that, a study of a numerical simulation of a network of individual neurons justifies the use of the Lotka-Volterra model for neuron assemblies in both applications. All these proposals are developed following this scheme:

    • First, summarizing the structure, methods, and goal of the thesis.
    • Second, introducing a general background in neuroscience and the global dynamics of the human brain to better understand those applications.
    • Third, conducting a study of a numerically simulated network of neurons. This network, which displays brain rhythms, can be employed, among other objectives, to justify the use of the Lotka-Volterra model for the applications.
    • Fourth, summarizing concepts from mathematical DST, such as the global attractor and its informational structure, as well as their particularization to a Lotka-Volterra system.
    • Fifth, introducing the new mathematical concepts of model transform and instantaneous parameters, which allow the application of simple mathematical models such as Lotka-Volterra to complex empirical systems such as the human brain.
    • Sixth, using the model transform, and specifically the Lotka-Volterra transform, to calculate global attractors and informational structures in the global dynamics of the human brain.
    • Seventh, presenting the probably most prestigious theory of consciousness, the IIT developed by G. Tononi.
    • Eighth, using informational structures to develop a continuous version of IIT.
    • Ninth, establishing some final conclusions and commenting on new open questions arising from this work.

    These nine points correspond to the nine chapters of this thesis.
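The competitive Lotka-Volterra system the thesis builds on, dr_i/dt = r_i (gamma_i - sum_j alpha_ij r_j), is easy to sketch numerically; its global attractor is the object that carries the informational structure discussed above. The following is a minimal Euler-integration sketch with illustrative parameters, not the thesis's own code.

```python
import numpy as np

def lotka_volterra(r0, alpha, gamma, steps=20000, dt=0.001):
    """Euler-integrate the competitive Lotka-Volterra system
        dr_i/dt = r_i * (gamma_i - sum_j alpha_ij * r_j),
    used in the thesis as a model of neuron assemblies."""
    r = np.asarray(r0, dtype=float).copy()
    alpha = np.asarray(alpha, dtype=float)
    gamma = np.asarray(gamma, dtype=float)
    for _ in range(steps):
        r += dt * r * (gamma - alpha @ r)
        r = np.maximum(r, 0.0)          # rates stay non-negative
    return r

# Two symmetrically coupled assemblies settle on the coexistence
# equilibrium r_i = 2/3 of the global attractor.
r = lotka_volterra([0.1, 0.1],
                   alpha=[[1.0, 0.5], [0.5, 1.0]],
                   gamma=[1.0, 1.0])
```

Which equilibria of this system exist and how they are connected is exactly what the global attractor, and hence the informational structure, encodes.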