44 research outputs found

    A Survey on Continuous Time Computations

    We provide an overview of theories of continuous time computation. These theories allow us to understand both the hardness of questions related to continuous time dynamical systems and the computational power of continuous time analog models. We survey the existing models, summarize the main results, and point to relevant references in the literature.

    Computational modeling with spiking neural networks

    This chapter reviews recent developments in the area of spiking neural networks (SNN) and summarizes the main contributions to this research field. We give background information about the functioning of biological neurons, and discuss the most important mathematical neural models along with neural encoding techniques, learning algorithms, and applications of spiking neurons. As a specific application, the functioning of the evolving spiking neural network (eSNN) classification method is presented in detail, and the principles of numerous eSNN-based applications are highlighted and discussed.
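    As background to the mathematical neural models the chapter surveys, the leaky integrate-and-fire (LIF) neuron is among the simplest spiking models. A minimal simulation sketch follows; the parameter values (membrane time constant, threshold, resistance) are illustrative choices, not taken from the chapter.

```python
import numpy as np

def simulate_lif(current, dt=1e-3, tau=0.02, v_rest=-70e-3,
                 v_reset=-70e-3, v_thresh=-54e-3, r_m=10e6):
    """Simulate a leaky integrate-and-fire neuron (forward Euler).

    current: input current per time step (amperes).
    Returns the membrane-potential trace and the spike-time indices.
    """
    v = v_rest
    trace, spikes = [], []
    for t, i_in in enumerate(current):
        # Euler step of  dV/dt = (-(V - V_rest) + R_m * I) / tau
        v += dt * (-(v - v_rest) + r_m * i_in) / tau
        if v >= v_thresh:          # threshold crossing -> emit spike
            spikes.append(t)
            v = v_reset            # reset membrane after the spike
        trace.append(v)
    return np.array(trace), spikes

# A constant 2 nA input drives the neuron to fire regularly.
trace, spikes = simulate_lif(np.full(1000, 2e-9))
```

    With these values the steady-state potential (-50 mV) sits above threshold, so the neuron spikes periodically; rate coding and the temporal codes discussed in the chapter both build on trains of such events.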

    Financial time series prediction using spiking neural networks

    This paper presents a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, to financial time series prediction. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as this. The performance of the spiking neural network was benchmarked against three systems: two "traditional" rate-encoded neural networks, a Multi-Layer Perceptron and a Dynamic Ridge Polynomial neural network, and a standard Linear Predictor Coefficients model. For this comparison, three non-stationary and noisy time series were used: IBM stock data, US/Euro exchange rate data, and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the spiking neural network in terms of Annualised Return and prediction error for 5-step-ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-to-Noise ratio. This work demonstrates the applicability of the Polychronous Spiking Network to financial data forecasting, which in turn indicates the potential of using such networks over traditional systems in difficult-to-manage non-stationary environments. © 2014 Reid et al.
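    The evaluation metrics named in the abstract (Annualised Return, Maximum Drawdown) have standard definitions that can be sketched directly; this is not the paper's code, and the toy equity curve below is invented purely to exercise the formulas.

```python
import numpy as np

def annualised_return(equity, periods_per_year=252):
    """Geometric annualised return of an equity curve."""
    years = (len(equity) - 1) / periods_per_year
    return (equity[-1] / equity[0]) ** (1 / years) - 1

def max_drawdown(equity):
    """Largest peak-to-trough decline, as a fraction of the peak."""
    peaks = np.maximum.accumulate(equity)
    return np.max((peaks - equity) / peaks)

def signal_to_noise(series):
    """Mean over standard deviation of per-step simple returns."""
    r = np.diff(series) / series[:-1]
    return r.mean() / r.std()

# Toy equity curve: steady growth with one 10% dip at step 4.
equity = np.array([100.0, 102, 104, 106, 95.4, 101, 105, 108, 110, 112])
ar = annualised_return(equity)
mdd = max_drawdown(equity)   # the 106 -> 95.4 dip gives exactly 0.1
```

    Annualised return rewards overall growth, while maximum drawdown penalises the worst losing streak, so the two together give a fuller picture of a forecaster's trading performance than prediction error alone.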

    Applying Heuristic Approaches for Predicting Defect-Prone Software Components


    Computational Models for Generic Cortical Microcircuits

    The human nervous system processes a continuous stream of multi-modal input from a rapidly changing environment. A key challenge for neural modeling is to explain how the neural microcircuits (columns, minicolumns, etc.) of the cerebral cortex, whose anatomical and physiological structure is quite similar across many brain areas and species, achieve this enormous computational task. We propose a computational model that could explain these potentially universal computational capabilities and does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory, and can be implemented on generic evolved or found recurrent circuitry. This new approach to understanding neural computation at the micro level also suggests new ways of modeling cognitive processing in larger neural systems. In particular, it questions traditional ways of thinking about neural coding.

    The "Liquid Computer": A Novel Strategy for Real-Time Computing on Time Series

    In this survey article we discuss a new framework for analysing computations on time series, and in particular on spike trains, introduced in (Maass et al. 2002). In contrast to common computational models, this new framework does not require that information be stored in stable states of a computational system. It has recently been shown that such models, in which all events are transient, can be successfully applied to analyse computations in neural systems, and (independently) that the basic ideas can also be used to solve engineering tasks such as the design of nonlinear controllers. Using an illustrative example, we develop the main ideas of the proposed model. This illustrative example is then generalized and cast into a rigorous mathematical model: the Liquid State Machine. A mathematical analysis shows that, in principle, liquid state machines face no computational limitations in the domain of time-series computing. Finally, we discuss several successful applications of the framework in the area of computational neuroscience and in the field of artificial neural networks.
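    The liquid state machine described here uses a spiking "liquid"; as a rough sketch of the same idea, the following uses a rate-based analogue (an echo-state-style random recurrent reservoir) in which only the linear readout is trained. The reservoir size, spectral radius, ridge penalty, and the sine-prediction task are all illustrative assumptions, not details from the article.

```python
import numpy as np

rng = np.random.default_rng(1)

# The "liquid": a fixed random recurrent network. It is never
# trained; only the linear readout on top of it is.
n_res, n_in = 100, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

def run_liquid(u):
    """Drive the reservoir with input sequence u; collect states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x)
    return np.array(states)

# Task: predict u(t+1) from the transient liquid state at time t.
u = np.sin(np.arange(400) * 0.1)
X, y = run_liquid(u)[:-1], u[1:]

# Ridge-regression readout, discarding a 50-step washout period.
A, b = X[50:], y[50:]
w_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n_res), A.T @ b)
pred = X @ w_out
```

    The key point the example makes concrete: no stable states are needed, because the readout extracts the answer from the liquid's transient trajectory, and swapping the task means retraining only `w_out`.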

    Multi-Domain Transfer Component Analysis for Domain Generalization
