Response of electrically coupled spiking neurons: a cellular automaton approach
Experimental data suggest that some classes of spiking neurons in the first
layers of sensory systems are electrically coupled via gap junctions or
ephaptic interactions. When the electrical coupling is removed, the response
function (firing rate vs. stimulus intensity) of the uncoupled neurons
typically shows a decrease in dynamic range and sensitivity. In order to assess
the effect of electrical coupling in the sensory periphery, we calculate the
response to a Poisson stimulus of a chain of excitable neurons modeled by
$n$-state Greenberg-Hastings cellular automata at two levels of approximation. The
single-site mean field approximation is shown to give poor results, failing to
predict the absorbing state of the lattice, while the results for the pair
approximation are in good agreement with computer simulations in the whole
stimulus range. In particular, the dynamic range is substantially enlarged due
to the propagation of excitable waves, which suggests a functional role for
lateral electrical coupling. For probabilistic spike propagation the Hill
exponent of the response function is $\alpha = 1$, while for deterministic spike
propagation we obtain $\alpha = 1/2$, which is close to the experimental values
of the psychophysical Stevens exponents for odor and light intensities. Our
calculations are in qualitative agreement with experimental response functions
of ganglion cells in the mammalian retina.
Comment: 11 pages, 8 figures, to appear in Phys. Rev. E
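
As a quick illustration of the model class, the following minimal sketch simulates a ring of $n$-state Greenberg-Hastings cells driven by a Poisson stimulus and measures the firing rate as a function of stimulus intensity. All names and parameter values here are illustrative choices of ours; the paper's geometry, approximation schemes and parameter ranges are not reproduced.

import numpy as np

def gh_response(rate, n=3, L=500, T=5000, p_spread=1.0, seed=0):
    # State 0 = quiescent, 1 = excited (spiking), 2..n-1 = refractory.
    rng = np.random.default_rng(seed)
    s = np.zeros(L, dtype=int)
    p_ext = 1.0 - np.exp(-rate)            # Poisson drive per time step
    spikes_total = 0
    for _ in range(T):
        excited = (s == 1)
        # number of excited nearest neighbours on a ring
        n_exc = np.roll(excited, 1).astype(int) + np.roll(excited, -1).astype(int)
        # combine independent external and neighbour excitation channels
        p_fire = 1.0 - (1.0 - p_ext) * (1.0 - p_spread) ** n_exc
        s_next = np.where(s > 0, (s + 1) % n, 0)   # refractory clock
        fire = (s == 0) & (rng.random(L) < p_fire)
        s = s_next
        s[fire] = 1
        spikes_total += fire.sum()
    return spikes_total / (L * T)          # mean firing rate per site

# response curve: firing rate vs stimulus intensity
for r in [1e-4, 1e-3, 1e-2, 1e-1, 1e0]:
    print(f"stimulus rate {r:.0e}  ->  F = {gh_response(r):.4f}")

With p_spread = 1.0 the spike propagation between neighbours is deterministic; values below 1 give the probabilistic case.
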
Physics of Psychophysics: Stevens and Weber-Fechner laws are transfer functions of excitable media
Sensory arrays made of coupled excitable elements can improve both their
input sensitivity and dynamic range due to collective non-linear wave
properties. This mechanism is studied in a neural network of electrically
coupled (e.g. via gap junctions) elements subject to a Poisson signal process.
The network response interpolates between a Weber-Fechner logarithmic law and a
Stevens power law depending on the relative refractory period of the cell.
Therefore, these non-linear transformations of the input level could be
performed in the sensory periphery simply due to a basic property: the transfer
function of excitable media.
Comment: 4 pages, 5 figures
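
For reference, the two transfer functions discussed above, together with the dynamic-range measure standard in this literature, can be written as follows (the notation is ours and may differ from the paper's):

\[
F_{\mathrm{Stevens}}(S) \propto S^{m},
\qquad
F_{\mathrm{WF}}(S) \propto \log(S/S_{0}),
\]
\[
\Delta = 10\,\log_{10}\!\left(\frac{S_{0.9}}{S_{0.1}}\right),
\qquad
F(S_{x}) = F_{\min} + x\,(F_{\max} - F_{\min}),
\]

where the dynamic range $\Delta$ is the decibel width of the stimulus interval mapped onto the central 80% of the response range.
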
Scaling law for the transient behavior of type-II neuron models
We study the transient regime of type-II biophysical neuron models and
determine the scaling behavior of relaxation times near but below the
repetitive firing critical current $I_c$. For both
the Hodgkin-Huxley and Morris-Lecar models we find that the critical exponent
is independent of the numerical integration time step and that both systems
belong to the same universality class. For appropriately
chosen parameters, the FitzHugh-Nagumo model presents the same generic
transient behavior, but the critical region is significantly smaller. We
propose an experiment that may reveal nontrivial critical exponents in the
squid axon.
Comment: 6 pages, 9 figures, accepted for publication in Phys. Rev.
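
A hedged sketch of the kind of measurement involved: for the FitzHugh-Nagumo model with textbook parameters, the linear relaxation time toward the resting state grows as the injected current approaches the onset of instability from below. This only probes the linearised dynamics at the fixed point; the paper's transient protocol and fitted exponents are not reproduced here.

import numpy as np
from scipy.optimize import brentq

a, b, eps = 0.7, 0.8, 0.08   # classic FitzHugh-Nagumo parameters

def v_star(I):
    # unique rest point: v - v^3/3 - (v + a)/b + I = 0 (monotone in v)
    return brentq(lambda v: v - v**3 / 3 - (v + a) / b + I, -3.0, 3.0)

def relaxation_time(I):
    v = v_star(I)
    J = np.array([[1 - v**2, -1.0],        # Jacobian of the FHN flow
                  [eps, -eps * b]])
    return -1.0 / np.real(np.linalg.eigvals(J)).max()

# the slowest decay time grows as I approaches the Hopf onset (near 0.33)
for I in [0.20, 0.28, 0.31, 0.32, 0.325]:
    print(f"I = {I:.3f}: tau = {relaxation_time(I):7.1f}")
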
Functional Optimisation of Online Algorithms in Multilayer Neural Networks
We study the online dynamics of learning in fully connected soft committee
machines in the student-teacher scenario. The locally optimal modulation
function, which determines the learning algorithm, is obtained from a
variational argument in such a manner as to maximise the average generalisation
error decay per example. Simulation results for the resulting algorithm are
presented for a few cases. The symmetric phase plateaux are found to be vastly
reduced in comparison to those found when online backpropagation algorithms are
used. A discussion of the implementation of these ideas as practical algorithms
is given.
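
For context, here is a minimal sketch of the baseline setting: online gradient descent (plain backpropagation) in a student-teacher soft committee machine, whose student-teacher overlaps R linger on the symmetric plateau before the hidden units specialise. The variationally optimised modulation function of the paper is not implemented below; the modulation used is the plain gradient-descent one, and all sizes and rates are illustrative.

import numpy as np
from scipy.special import erf

rng = np.random.default_rng(1)
N, K, eta, steps = 500, 2, 1.0, 300_000

g = lambda h: erf(h / np.sqrt(2.0))                  # hidden-unit activation
gp = lambda h: np.sqrt(2.0 / np.pi) * np.exp(-h * h / 2.0)

B = rng.normal(size=(K, N))                          # teacher weight vectors
B *= np.sqrt(N) / np.linalg.norm(B, axis=1, keepdims=True)
J = 0.01 * rng.normal(size=(K, N))                   # student starts near zero

for t in range(steps + 1):
    x = rng.normal(size=N)
    h = J @ x / np.sqrt(N)                           # student local fields
    y = g(B @ x / np.sqrt(N)).sum()                  # teacher output
    delta = y - g(h).sum()
    # modulation function of plain backprop: F_k = delta * g'(h_k)
    J += (eta / np.sqrt(N)) * np.outer(delta * gp(h), x)
    if t % 50_000 == 0:
        R = J @ B.T / N                              # student-teacher overlaps
        print(t, np.round(R, 3).tolist())

The nearly equal rows of R over a long stretch of examples are the symmetric plateau that the optimised modulation function shortens.
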
Storage capacity of correlated perceptrons
We consider an ensemble of single-layer perceptrons exposed to random
inputs and investigate the conditions under which the couplings of these
perceptrons can be chosen such that prescribed correlations between the outputs
occur. A general formalism is introduced using a multi-perceptron cost function
that allows one to determine the maximal number of random inputs as a function
of the desired values of the correlations. Replica-symmetric results are
compared with properties of two-layer networks of tree structure and
fixed Boolean function between hidden units and output. The results show which
correlations in the hidden layer of multi-layer neural networks are crucial for
the value of the storage capacity.
Comment: 16 pages, LaTeX2e
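
As a toy illustration of the storage-capacity notion (not of the paper's correlated multi-perceptron formalism), the sketch below estimates how often a single perceptron can store P random binary patterns with random targets; the success probability collapses near alpha = P/N = 2, the classical capacity.

import numpy as np

rng = np.random.default_rng(0)

def storable(N, P, max_updates=5000):
    # Can a single perceptron realise random targets on P random patterns?
    X = rng.choice([-1.0, 1.0], size=(P, N))
    y = rng.choice([-1.0, 1.0], size=P)
    w = np.zeros(N)
    for _ in range(max_updates):
        wrong = np.flatnonzero(y * (X @ w) <= 0)
        if wrong.size == 0:
            return True
        i = wrong[0]
        w += y[i] * X[i] / N           # classic perceptron update
    return False                       # update cap makes this conservative
                                       # near the transition

N, trials = 40, 10
for alpha in [1.0, 1.5, 2.0, 2.5]:
    P = int(alpha * N)
    stored = sum(storable(N, P) for _ in range(trials))
    print(f"alpha = {alpha:.1f}: stored {stored}/{trials}")
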
Functional Optimization in Complex Excitable Networks
We study the effect of varying wiring in excitable random networks in which
connection weights change with activity to mold local resistance or
facilitation due to fatigue. Dynamic attractors, corresponding to patterns of
activity, are then easily destabilized according to three main modes, including
one in which the activity shows chaotic hopping among the patterns. We describe
phase transitions to this regime, and show a monotonic dependence of critical
parameters on the heterogeneity of the wiring distribution. Such correlation
between topology and functionality implies, in particular, that tasks which
require unstable behavior --such as pattern recognition, family discrimination
and categorization-- can be most efficiently performed on highly heterogeneous
networks. A possible explanation also follows for the abundance in nature of
scale-free network topologies.
Comment: 7 pages, 3 figures
Phase-Induced (In)-Stability in Coupled Parametric Oscillators
We report results on a model of two coupled oscillators that undergo periodic
parametric modulations with a phase difference $\phi$. Being to a large extent
analytically solvable, the model reveals a rich $\phi$-dependence of the regions
of parametric resonance. In particular, the intuitive notion that
anti-phase modulations are less prone to parametric resonance is confirmed for
sufficiently large coupling and damping. We also compare our results to a
recently reported mean field model of collective parametric instability,
showing that the two-oscillator model can capture much of the qualitative
behavior of the infinite system.
Comment: 19 pages, 8 figures; a version with better-quality figures can be
found at http://hypatia.ucsd.edu/~mauro/English/publications.htm
Retarded Learning: Rigorous Results from Statistical Mechanics
We study learning of probability distributions characterized by an unknown
symmetry direction. Based on an entropic performance measure and the
variational method of statistical mechanics we develop exact upper and lower
bounds on the scaled critical number of examples below which learning of the
direction is impossible. The asymptotic tightness of the bounds suggests an
asymptotically optimal method for learning nonsmooth distributions.
Comment: 8 pages, 1 figure
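
A standard toy model exhibiting this kind of retarded-learning transition (related to, but not the same as, the paper's entropic-bound construction): data with a rank-one "spike" along a hidden direction B. Below a critical number of examples the leading eigenvector of the sample covariance is uncorrelated with B; above it, the overlap rises sharply.

import numpy as np

rng = np.random.default_rng(3)
N, lam = 500, 0.5               # predicted transition at alpha = 1/lam^2 = 4
B = rng.normal(size=N)
B /= np.linalg.norm(B)          # hidden symmetry direction

for alpha in [1, 2, 4, 8, 16]:
    P = alpha * N
    z = rng.normal(size=(P, 1))                      # hidden latent coordinate
    X = rng.normal(size=(P, N)) + np.sqrt(lam) * z * B
    C = X.T @ X / P                                  # sample covariance
    evals, evecs = np.linalg.eigh(C)
    print(f"alpha = {alpha:2d}: |overlap with B| = {abs(evecs[:, -1] @ B):.3f}")
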
Fast relational learning using bottom clause propositionalization with artificial neural networks
Relational learning can be described as the task of learning first-order logic rules from examples. It has enabled a number of new machine learning applications, e.g. graph mining and link analysis. Inductive Logic Programming (ILP) performs relational learning either directly, by manipulating first-order rules, or through propositionalization, which translates the relational task into an attribute-value learning task by representing subsets of relations as features. In this paper, we introduce a fast method and system for relational learning based on a novel propositionalization called Bottom Clause Propositionalization (BCP). Bottom clauses are boundaries of the hypothesis search space used by the ILP systems Progol and Aleph. Bottom clauses carry semantic meaning and can be mapped directly onto numerical vectors, simplifying the feature extraction process. We have integrated BCP with a well-known neural-symbolic system, C-IL2P, to perform learning from numerical vectors. C-IL2P uses background knowledge in the form of propositional logic programs to build a neural network. The integrated system, which we call CILP++, handles first-order logic knowledge and is available for download from Sourceforge. We have evaluated CILP++ on seven ILP datasets, comparing results with Aleph and a well-known propositionalization method, RSD. The results show that CILP++ can achieve accuracy comparable to Aleph's while being generally faster. BCP achieved a statistically significant improvement in accuracy over RSD when running with a neural network, but BCP and RSD perform similarly when running with C4.5. We have also extended CILP++ to include a statistical feature selection method, mRMR, with preliminary results indicating that a reduction of more than 90% of the features can be achieved with a small loss of accuracy.
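
A toy sketch of the propositionalization idea, with hand-made clauses standing in for real bottom clauses: each example's body literals are mapped onto a binary vector over the union of all literals, and a small neural network is trained on the result. Actual BCP constructs bottom clauses via Progol/Aleph mode declarations, and CILP++ builds its network from background knowledge via C-IL2P; neither step is reproduced here, and the literals and labels below are hypothetical.

import numpy as np

# Hand-made "bottom clauses": (body literals, label). Hypothetical data.
examples = [
    (["mother(A,B)"], 1.0),
    (["father(A,B)"], 1.0),
    (["mother(B,A)"], 0.0),
    (["father(B,A)", "married(A,C)"], 0.0),
]
features = sorted({lit for body, _ in examples for lit in body})
index = {lit: i for i, lit in enumerate(features)}

X = np.zeros((len(examples), len(features)))
for r, (body, _) in enumerate(examples):
    for lit in body:
        X[r, index[lit]] = 1.0         # one binary feature per literal
y = np.array([label for _, label in examples])

# a small one-hidden-layer network trained by plain gradient descent
rng = np.random.default_rng(4)
W1 = 0.5 * rng.normal(size=(len(features), 4))
W2 = 0.5 * rng.normal(size=4)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))
for _ in range(2000):
    H = sig(X @ W1)                    # hidden layer
    out = sig(H @ W2)                  # network prediction
    d_out = (out - y) * out * (1 - out)
    W2 -= 0.5 * H.T @ d_out
    W1 -= 0.5 * X.T @ (np.outer(d_out, W2) * H * (1 - H))
print("predictions:", np.round(out, 2), "targets:", y)
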
