104 research outputs found
On the Critical Capacity of the Hopfield Model
We estimate the critical capacity of the zero-temperature Hopfield model using a novel and rigorous method. For a large number of neurons, the probability of having a stable fixed point is one when the storage load is below this critical value. This result improves on all rigorous results in the literature, and the relationship between the capacity and the retrieval errors obtained here at small load coincides with replica calculation results.
Comment: LaTeX, 36 pages, macros: http://www.springer.de/author/tex/help-journals.htm
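As a rough, self-contained illustration of the quantity the abstract refers to (not of the authors' rigorous method), the sketch below builds Hebbian couplings for p random patterns on N neurons and measures the fraction of stored patterns that are exact fixed points of the zero-temperature dynamics; all parameter values are arbitrary.

```python
import numpy as np

def hopfield_fixed_point_fraction(N=500, alpha=0.1, seed=0):
    """Fraction of stored patterns that are exact fixed points of the
    zero-temperature (sign) dynamics of a Hopfield network with Hebbian
    couplings. Illustrative sketch only; parameters are arbitrary."""
    rng = np.random.default_rng(seed)
    p = int(alpha * N)
    xi = rng.choice([-1, 1], size=(p, N))      # p random binary patterns
    J = (xi.T @ xi) / N                        # Hebb rule
    np.fill_diagonal(J, 0.0)                   # no self-coupling
    stable = 0
    for mu in range(p):
        h = J @ xi[mu]                         # local fields at pattern mu
        if np.all(np.sign(h) == xi[mu]):       # pattern unchanged by one update?
            stable += 1
    return stable / p

for alpha in (0.05, 0.10, 0.15):
    print(alpha, hopfield_fixed_point_fraction(alpha=alpha))
```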
Identifying short motifs by means of extreme value analysis
The problem of detecting a binding site -- a substring of DNA where
transcription factors attach -- on a long DNA sequence requires the recognition
of a small pattern in a large background. For short binding sites, the matching
probability can display large fluctuations from one putative binding site to
another. Here we use a self-consistent statistical procedure that accounts
correctly for the large deviations of the matching probability to predict the
location of short binding sites. We apply it in two distinct situations: (a)
the detection of the binding sites for three specific transcription factors on
a set of 134 estrogen-regulated genes; (b) the identification, in a set of 138
possible transcription factors, of the ones binding a specific set of nine
genes. In both instances, experimental findings are reproduced (when available)
and the number of false positives is significantly reduced with respect to the
other methods commonly employed.
Comment: 6 pages, 5 figures
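The paper's self-consistent large-deviation procedure is not reproduced here; as a hedged sketch of the general idea of scoring putative sites against an extreme-value null (the scoring function, the motif, and all names below are invented for illustration), one could write:

```python
import numpy as np

def match_scores(seq, motif):
    """Number of matching characters between a short motif and every
    window of the sequence (a crude stand-in for a binding-site score)."""
    L, m = len(seq), len(motif)
    return np.array([sum(a == b for a, b in zip(seq[i:i + m], motif))
                     for i in range(L - m + 1)])

def extreme_value_pvalue(seq, motif, n_shuffles=200, seed=0):
    """P-value of the best match in `seq`, estimated from the null
    distribution of the *maximum* score over shuffled sequences, i.e. an
    extreme-value rather than a per-window null."""
    rng = np.random.default_rng(seed)
    observed = match_scores(seq, motif).max()
    chars = list(seq)
    null_max = []
    for _ in range(n_shuffles):
        rng.shuffle(chars)
        null_max.append(match_scores("".join(chars), motif).max())
    null_max = np.array(null_max)
    return (np.sum(null_max >= observed) + 1) / (n_shuffles + 1)

# toy sequence with a planted site; real binding sites and scores differ
seq = "ACGT" * 50 + "GGTCACGC" + "ACGT" * 50
print(extreme_value_pvalue(seq, "GGTCACGC"))
```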
Central limit theorem for fluctuations of linear eigenvalue statistics of large random graphs
We consider the adjacency matrix of a large random graph and study the fluctuations of a linear eigenvalue statistic of this matrix. We prove that, in the limit of large graphs, the suitably normalized moments of the fluctuations satisfy the Wick relations for Gaussian random variables. This allows us to prove a central limit theorem for this statistic and then to extend the result to the linear eigenvalue statistics of any test function which increases, together with its first two derivatives, at infinity not faster than an exponential.
Comment: 22 pages
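As a purely numerical illustration of the statement (not of the proof), the sketch below samples Erdos-Renyi adjacency matrices, evaluates a linear eigenvalue statistic for a smooth test function, and collects its centred fluctuations; the normalisation and the class of test functions in the paper may differ.

```python
import numpy as np

def linear_statistic_fluctuations(n=300, p_edge=0.05, f=np.tanh,
                                  n_samples=200, seed=0):
    """Centred samples of the linear eigenvalue statistic sum_i f(lambda_i)
    for Erdos-Renyi adjacency matrices; a numerical illustration of the CLT,
    not the paper's normalisation or proof."""
    rng = np.random.default_rng(seed)
    stats = []
    for _ in range(n_samples):
        upper = np.triu(rng.random((n, n)) < p_edge, 1).astype(float)
        A = upper + upper.T                    # symmetric 0/1 adjacency matrix
        eigvals = np.linalg.eigvalsh(A)
        stats.append(f(eigvals).sum())
    stats = np.array(stats)
    return stats - stats.mean()                # fluctuations around the mean

fluct = linear_statistic_fluctuations()
print("variance of the fluctuations:", fluct.var())
```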
Linear and nonlinear post-processing of numerically forecasted surface temperature
In this paper we test different approaches to the statistical post-processing of gridded numerical surface air temperatures (provided by the European Centre for Medium-Range Weather Forecasts) onto the temperature measured at surface weather stations located in the Italian region of Puglia. We consider simple post-processing techniques, such as correction for altitude, linear regression from different input parameters, and Kalman filtering, as well as a neural network training procedure, stabilised (i.e. driven into the absolute minimum of the error function over the learning set) by means of a Simulated Annealing method. A comparative analysis of the results shows that the neural network performs best, which is encouraging for its systematic use in meteorological forecast-analysis service operations.
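A minimal sketch of the simplest scheme in the comparison, linear regression with an altitude term, is given below; the variable names, the lapse-rate toy data, and all numbers are invented for illustration and are not taken from the paper.

```python
import numpy as np

def fit_postprocessing(t_model, z_model, z_station, t_obs):
    """Fit a linear correction  t_obs ~ a*t_model + b*(z_station - z_model) + c
    by least squares: a simple regression-plus-altitude post-processing scheme
    (illustrative only)."""
    X = np.column_stack([t_model, z_station - z_model, np.ones_like(t_model)])
    coef, *_ = np.linalg.lstsq(X, t_obs, rcond=None)
    return coef

def apply_postprocessing(coef, t_model, z_model, z_station):
    a, b, c = coef
    return a * t_model + b * (z_station - z_model) + c

# toy data: the model runs warm and ignores a 300 m altitude difference
rng = np.random.default_rng(1)
t_model = 15 + 5 * rng.standard_normal(400)
z_model, z_station = 50.0, 350.0
t_obs = t_model - 0.0065 * (z_station - z_model) - 1.0 + 0.5 * rng.standard_normal(400)

coef = fit_postprocessing(t_model, np.full(400, z_model), np.full(400, z_station), t_obs)
print("corrected forecasts:", apply_postprocessing(coef, t_model[:3], z_model, z_station))
```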
Transition from regular to complex behaviour in a discrete deterministic asymmetric neural network model
We study the long-time behaviour of the transient before the collapse onto the periodic attractors of a discrete deterministic asymmetric neural network model. The system has a finite number of possible states, so it is not possible to use the term chaos in the usual sense of sensitive dependence on the initial condition. Nevertheless, as the asymmetry parameter is varied, one observes a transition from ordered motion (i.e. short transients and short periods on the attractors) to a ``complex'' temporal behaviour. This transition takes place at the same value of the asymmetry parameter at which the mean transient length changes from a power law in the system size N to an exponential law in N. The ``complex'' behaviour during the transient shows strong analogies with chaotic behaviour: decay of temporal correlations, positive Shannon entropy, and non-constant Renyi entropies of different orders. Moreover, the transition is very similar to the intermittency transition in chaotic systems: a scaling law for the Shannon entropy and strong fluctuations of the ``effective Shannon entropy'' along the transient near the transition.
Comment: 18 pages + 6 figures, TeX dialect: Plain TeX + IOP macros (included)
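As a hedged illustration of the kind of measurement the abstract describes, the sketch below runs synchronous sign dynamics on a small network whose couplings interpolate between symmetric and antisymmetric parts (one possible parametrisation of the asymmetry; the paper's definition may differ) and records the transient length and the period of the attractor reached.

```python
import numpy as np

def transient_and_period(N=16, asym=0.5, seed=0):
    """Parallel sign dynamics of a small deterministic network with couplings
    J = J_sym + asym * J_antisym. Returns (transient length, attractor period).
    The finite state space guarantees the dynamics eventually revisits a state."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((N, N))
    J = (A + A.T) / 2 + asym * (A - A.T) / 2
    np.fill_diagonal(J, 0.0)
    s = rng.choice([-1, 1], size=N)
    seen = {}
    t = 0
    while True:
        key = tuple(s)
        if key in seen:                          # state revisited: cycle closed
            return seen[key], t - seen[key]      # (transient length, period)
        seen[key] = t
        s = np.where(J @ s >= 0, 1, -1)          # synchronous update
        t += 1

for asym in (0.0, 0.5, 1.0):
    print(asym, transient_and_period(asym=asym))
```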
Chaos in neural networks with a nonmonotonic transfer function
Time evolution of diluted neural networks with a nonmonotonic transfer function is analytically described by flow equations for macroscopic variables.
The macroscopic dynamics shows a rich variety of behaviours: fixed-point,
periodicity and chaos. We examine in detail the structure of the strange
attractor and in particular we study the main features of the stable and
unstable manifolds, the hyperbolicity of the attractor and the existence of
homoclinic intersections. We also discuss the problem of the robustness of the
chaos and we prove that in the present model chaotic behaviour is fragile
(chaotic regions are densely intercalated with periodicity windows), according
to a recently discussed conjecture. Finally we perform an analysis of the
microscopic behaviour and in particular we examine the occurrence of damage
spreading by studying the time evolution of two almost identical initial
configurations. We show that for any choice of the parameters the two initial
states remain microscopically distinct.
Comment: 12 pages, 11 figures. Accepted for publication in Physical Review E. Originally submitted to the neuro-sys archive, which was never publicly announced (was 9905001).
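A minimal sketch of the damage-spreading measurement mentioned at the end of the abstract is given below; it assumes a diluted network with K random inputs per neuron and a reflecting nonmonotonic transfer function F(h) = sign(h) for |h| <= theta and -sign(h) otherwise, which is one common choice and not necessarily the paper's.

```python
import numpy as np

def damage_spreading(N=1000, K=20, theta=1.0, t_max=100, seed=0):
    """Hamming distance between two configurations differing in one neuron,
    evolved in parallel with the same diluted couplings and a nonmonotonic
    transfer function (assumed form; parameters are arbitrary)."""
    rng = np.random.default_rng(seed)
    # diluted couplings: each neuron receives K random inputs
    J = np.zeros((N, N))
    for i in range(N):
        inputs = rng.choice(N, size=K, replace=False)
        J[i, inputs] = rng.standard_normal(K) / np.sqrt(K)

    def F(h):
        # reflecting nonmonotonic transfer function (assumed, not the paper's)
        return np.where(np.abs(h) <= theta, np.sign(h), -np.sign(h))

    s1 = rng.choice([-1, 1], size=N).astype(float)
    s2 = s1.copy()
    s2[0] *= -1                                  # flip a single neuron
    distances = []
    for _ in range(t_max):
        s1 = F(J @ s1)
        s2 = F(J @ s2)
        distances.append(np.mean(s1 != s2))      # Hamming distance per neuron
    return distances

print("final damage:", damage_spreading()[-1])
```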