
    On String Languages Generative Power of Spiking Neural P Systems with a Generalized Use of Rules

    Membrane computing, an important branch of natural computing, studies computational models abstracted from the structure and function of living cells and of tissues and organs. After nearly twenty years of development the field has produced many fruitful results. In theoretical research, several computational models have been shown to be equivalent in power to Turing machines, and several others can solve NP-complete problems in polynomial time. In practical applications, membrane computing has been successfully applied to robot control mechanisms, fault diagnosis of certain systems, data modelling, and other areas. Spiking neural P systems are a new kind of membrane computing model, inspired by the way neurons communicate with one another using spikes as signals. Traditional spiking neural P systems generally work in the sequential mode (within one time unit, an applicable rule is applied exactly once) or the exhaustive mode (within one time unit, an applicable rule is applied as many times as possible) to... Degree: Master of Engineering. Department and programme: School of Information Science and Technology, Master of Engineering (Computer Technology). Student ID: 2302014115320
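
    The sequential/exhaustive distinction the abstract describes can be made concrete with a small sketch. The following Python snippet (ours, not the thesis's construction) simulates one neuron of a spiking neural P system with a single hypothetical rule that consumes two spikes and emits one, under each of the two modes:

        # One SN P system neuron with a hypothetical rule a^2 -> a:
        # consume 2 spikes, emit 1 spike. Illustrative only.

        def step_sequential(spikes, consume=2, emit=1):
            """Sequential mode: an applicable rule fires at most once per time unit."""
            if spikes >= consume:
                return spikes - consume, emit   # one application
            return spikes, 0                    # rule not applicable

        def step_exhaustive(spikes, consume=2, emit=1):
            """Exhaustive mode: the rule fires as many times as the spikes allow."""
            n = spikes // consume               # maximal number of applications
            return spikes - n * consume, n * emit

        print(step_sequential(5))   # (3, 1): one firing, 3 spikes remain
        print(step_exhaustive(5))   # (1, 2): two firings, 1 spike remains

    The generalized use of rules named in the paper's title sits between these extremes; the two functions above only illustrate the endpoints the abstract contrasts.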

    How Gibbs distributions may naturally arise from synaptic adaptation mechanisms. A model-based argumentation

    This paper addresses two questions in the context of neuronal network dynamics, using methods from dynamical systems theory and statistical physics: (i) how to characterize the statistical properties of the sequences of action potentials ("spike trains") produced by neuronal networks, and (ii) what are the effects of synaptic plasticity on these statistics? We introduce a framework in which spike trains are associated with a coding of membrane potential trajectories and, in important explicit examples (the so-called gIF models), actually constitute a symbolic coding. On this basis, we use the thermodynamic formalism from ergodic theory to show that Gibbs distributions are natural probability measures for describing the statistics of spike trains, given the empirical averages of prescribed quantities. As a second result, we show that Gibbs distributions naturally arise when considering "slow" synaptic plasticity rules, where the characteristic time for synapse adaptation is much longer than the characteristic time for neuron dynamics. Comment: 39 pages, 3 figures
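
    The phrase "natural probability measures ... given the empirical averages of prescribed quantities" is the maximum-entropy characterization of a Gibbs measure. In its simplest finite form, and with notation of our choosing rather than the paper's, it reads:

        % Gibbs distribution over spike patterns \omega, constrained by the
        % empirical averages of observables \phi_k (illustrative notation):
        \[
          \mu[\omega] \;=\; \frac{1}{Z}\,
            \exp\!\Big( \sum_{k} \lambda_k \, \phi_k(\omega) \Big),
          \qquad
          Z \;=\; \sum_{\omega}
            \exp\!\Big( \sum_{k} \lambda_k \, \phi_k(\omega) \Big),
        \]

    where the multipliers \lambda_k are adjusted so that the expectations E_\mu[\phi_k] match the prescribed empirical averages. The thermodynamic-formalism setting used in the paper extends this picture beyond the finite, stationary case.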

    Training Dynamic Exponential Family Models with Causal and Lateral Dependencies for Generalized Neuromorphic Computing

    Neuromorphic hardware platforms, such as Intel's Loihi chip, support the implementation of Spiking Neural Networks (SNNs) as an energy-efficient alternative to Artificial Neural Networks (ANNs). SNNs are networks of neurons with internal analogue dynamics that communicate by means of binary time series. In this work, a probabilistic model is introduced for a generalized set-up in which the synaptic time series can take values in an arbitrary alphabet and are characterized by both causal and instantaneous statistical dependencies. The model, which can be considered an extension of exponential family harmoniums to time series, is introduced by means of a hybrid directed-undirected graphical representation. Furthermore, distributed learning rules are derived for Maximum Likelihood and Bayesian criteria under the assumption of fully observed time series in the training set. Comment: Published in IEEE ICASSP 2019. Author's Accepted Manuscript
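
    As a rough picture of what "an extension of exponential family harmoniums to time series" with causal and lateral couplings can look like, one may write the model's conditionals in exponential-family form. The factorization below is our illustrative guess at the structure, not the paper's exact notation:

        % Directed (causal) factorization over time, with an undirected
        % (lateral/instantaneous) coupling inside each time step:
        \[
          p(x_{1:T}) \;=\; \prod_{t=1}^{T} p\big(x_t \mid x_{t-\tau:t-1}\big),
        \]
        \[
          p\big(x_t \mid x_{t-\tau:t-1}\big) \;\propto\;
          \exp\!\Big( \theta^{\top} f(x_t)
            \;+\; \sum_{s=1}^{\tau} g(x_{t-s})^{\top} W_s\, f(x_t)
            \;+\; \tfrac{1}{2}\, f(x_t)^{\top} L\, f(x_t) \Big),
        \]

    where f and g are sufficient statistics over an arbitrary alphabet (as in the abstract), the matrices W_s carry the causal dependencies on a past window of length \tau, and the symmetric matrix L carries the instantaneous dependencies between units. Gradient-based Maximum Likelihood rules then follow from the standard exponential-family identity \partial \log Z / \partial \theta = E[f].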

    Learning Hybrid System Models for Supervisory Decoding of Discrete State, with applications to the Parietal Reach Region

    Based on Gibbs sampling, a novel method to identify mathematical models of neural activity in response to temporal changes of behavioral or cognitive state is presented. This work is motivated by the developing field of neural prosthetics, where a supervisory controller is required to classify activity of a brain region into suitable discrete modes. Here, neural activity in each discrete mode is modeled with nonstationary point processes, and transitions between modes are modeled as hidden Markov models. The effectiveness of this framework is first demonstrated on a simulated example. The identification algorithm is then applied to extracellular neural activity recorded from multi-electrode arrays in the parietal reach region of a rhesus monkey, and the results demonstrate the ability to decode discrete changes even from small data sets.
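
    The supervisory decoding step the abstract describes, with discrete modes governed by a hidden Markov model and point-process activity inside each mode, can be sketched in a few lines once a model is in hand. The snippet below is our illustration with made-up rates (the paper's identification step via Gibbs sampling is not shown): it decodes the most likely mode sequence from binned spike counts using Poisson emissions and the Viterbi recursion.

        import numpy as np
        from scipy.stats import poisson

        def viterbi_poisson(counts, rates, log_A, log_pi):
            """Most likely mode sequence for binned spike counts.

            counts : (T,) spike counts per time bin
            rates  : (K,) expected count per bin in each discrete mode
            log_A  : (K, K) log transition probabilities between modes
            log_pi : (K,) log initial mode probabilities
            """
            T, K = len(counts), len(rates)
            log_emis = poisson.logpmf(np.asarray(counts)[:, None], rates)  # (T, K)
            delta = log_pi + log_emis[0]
            back = np.zeros((T, K), dtype=int)
            for t in range(1, T):
                scores = delta[:, None] + log_A      # (K, K): from -> to
                back[t] = scores.argmax(axis=0)
                delta = scores.max(axis=0) + log_emis[t]
            path = [int(delta.argmax())]
            for t in range(T - 1, 0, -1):
                path.append(int(back[t, path[-1]]))
            return path[::-1]

        # Two hypothetical modes: "baseline" (2 spikes/bin) vs "reach plan" (8 spikes/bin)
        counts = [1, 3, 2, 9, 7, 8, 2, 1]
        modes = viterbi_poisson(counts,
                                rates=np.array([2.0, 8.0]),
                                log_A=np.log([[0.9, 0.1], [0.1, 0.9]]),
                                log_pi=np.log([0.5, 0.5]))
        print(modes)   # [0, 0, 0, 1, 1, 1, 0, 0]

    The paper's nonstationary point processes are richer than the homogeneous Poisson emissions assumed here; the sketch only shows how the HMM layer turns per-mode likelihoods into a supervisory classification.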

    Neural Information Processing: between synchrony and chaos

    The brain performs many different processing tasks, ranging from elaborate processes such as pattern recognition, memory, or decision-making to simpler functionalities such as the linear filtering used in image processing. Understanding the mechanisms by which the brain produces such a wide range of cortical operations remains a fundamental problem in neuroscience. Recent empirical and theoretical results support the notion that the brain is naturally poised between ordered and chaotic states; because the largest number of metastable states exists near the transition point, the brain there has access to its largest repertoire of behaviours. It is therefore of great interest to know which types of processing are associated with ordered and which with disordered states. Here we explain which processes are related to chaotic and to synchronized states, based on the study of in-silico implementations of biologically plausible neural systems. The measurements obtained reveal that synchronized cells (which can be understood as ordered states of the brain) are related to non-linear computations, while uncorrelated neural ensembles are excellent information transmission systems able to implement linear transformations (such as the realization of convolution products) and to parallelize neural processes. From these results we propose a plausible interpretation of Hebbian and non-Hebbian learning rules as the biophysical mechanisms by which the brain creates ordered or chaotic ensembles, depending on the desired functionality. The measurements we obtain from the hardware implementation of different neural systems support the view that the brain works with two different states, ordered and chaotic, with complementary functionalities: non-linear processing (synchronized states) and information transmission and convolution (chaotic states).
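
    The linear/non-linear distinction the abstract draws can be illustrated directly. The toy snippet below (ours, not the paper's measurements) contrasts a linear transformation realized as a convolution product with a simple non-linear, all-or-none computation, and checks that superposition holds only for the former:

        import numpy as np

        rng = np.random.default_rng(0)
        signal = rng.random(16)                 # stand-in for an input rate signal
        kernel = np.array([0.25, 0.5, 0.25])    # smoothing filter

        # Linear processing ("realization of convolution products"):
        linear_out = np.convolve(signal, kernel, mode="same")

        # Non-linear processing, e.g. a thresholded (all-or-none) response:
        nonlinear_out = (signal > 0.5).astype(float)

        # Superposition: conv(a + b) == conv(a) + conv(b); the thresholded
        # response has no such property.
        a, b = rng.random(16), rng.random(16)
        assert np.allclose(np.convolve(a + b, kernel, mode="same"),
                           np.convolve(a, kernel, mode="same")
                           + np.convolve(b, kernel, mode="same"))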