
Asymptotic Description of Neural Networks with Correlated Synaptic Weights

By Olivier Faugeras and James MacLaurin

Abstract

We study the asymptotic law of a network of interacting neurons as the number of neurons becomes infinite. Given a completely connected network in which the synaptic weights are Gaussian correlated random variables, we describe the asymptotic law of the network when the number of neurons goes to infinity. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network, and the averaged law (with respect to the synaptic weights) of these trajectories. The main result of this article is that the image law under the empirical measure satisfies a large deviation principle with a good rate function, which is shown to have a unique global minimum. Our analysis of the rate function also allows us to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories.
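For orientation, the two central objects named in the abstract can be sketched in generic notation; the symbols $\hat{\mu}_N$, $Q_N$ and $H$ below are ours, not taken from the paper. If $x^1, \dots, x^N$ denote the trajectories of the $N$ neurons, the empirical measure is

\[
\hat{\mu}_N \;=\; \frac{1}{N}\sum_{i=1}^{N} \delta_{x^i},
\]

and a large deviation principle with good rate function $H$ for the laws $Q_N$ of $\hat{\mu}_N$ means, roughly,

\[
\frac{1}{N}\,\log Q_N\big(\hat{\mu}_N \in A\big) \;\approx\; -\inf_{\mu \in A} H(\mu)
\qquad (N \to \infty),
\]

where "good" means that $H$ has compact level sets. The unique global minimum of $H$ then identifies the limit measure described in the abstract.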

Topics: large deviations, good rate function, stationary Gaussian processes, stationary measures, spectral representations, neural networks, firing rate neurons, correlated synaptic weights, Science, Astrophysics, Physics
Publisher: MDPI AG
Year: 2015
DOI identifier: 10.3390/e17074701
OAI identifier: oai:doaj.org/article:e7d1cec045784b8ab9a382928e497c20
Journal: Entropy
Download PDF:
Sorry, we are unable to provide the full text, but you may find it at the following location(s):
  • https://doaj.org/toc/1099-4300 (external link)
  • http://www.mdpi.com/1099-4300/... (external link)
  • https://doaj.org/article/e7d1c... (external link)