    On the Critical Capacity of the Hopfield Model

    We estimate the critical capacity of the zero-temperature Hopfield model by using a novel and rigorous method. The probability of having a stable fixed point is one when $\alpha \le 0.113$ for a large number of neurons. This result is an advance on all rigorous results in the literature, and the relationship between the capacity $\alpha$ and the retrieval errors obtained here for small $\alpha$ coincides with replica calculation results. Comment: LaTeX, 36 pages; macros: http://www.springer.de/author/tex/help-journals.htm
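    The quantity in question is the probability that a stored pattern is a fixed point of the zero-temperature dynamics at load $\alpha = P/N$. The following minimal sketch (a naive Monte Carlo check with arbitrary sizes, not the paper's rigorous method) estimates that probability for the standard Hebbian couplings.

```python
# Naive Monte Carlo sketch, not the paper's rigorous method: estimate how often a
# stored pattern is a stable fixed point of zero-temperature Hopfield dynamics
# at load alpha = P/N, using the standard Hebbian coupling matrix.
import numpy as np

def pattern_is_stable(N, alpha, rng):
    P = max(1, int(alpha * N))
    xi = rng.choice([-1, 1], size=(P, N))   # P random binary patterns
    J = (xi.T @ xi) / N                     # Hebbian couplings J_ij
    np.fill_diagonal(J, 0.0)                # no self-coupling
    h = J @ xi[0]                           # local fields acting on pattern 1
    return np.all(xi[0] * h > 0)            # stable iff no spin wants to flip

rng = np.random.default_rng(1)
for alpha in (0.10, 0.15):
    stable = sum(pattern_is_stable(500, alpha, rng) for _ in range(50))
    print(f"alpha={alpha}: pattern stable in {stable}/50 samples")
```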

    Identifying short motifs by means of extreme value analysis

    The problem of detecting a binding site -- a substring of DNA where transcription factors attach -- on a long DNA sequence requires the recognition of a small pattern in a large background. For short binding sites, the matching probability can display large fluctuations from one putative binding site to another. Here we use a self-consistent statistical procedure that accounts correctly for the large deviations of the matching probability to predict the location of short binding sites. We apply it in two distinct situations: (a) the detection of the binding sites for three specific transcription factors on a set of 134 estrogen-regulated genes; (b) the identification, in a set of 138 possible transcription factors, of the ones binding a specific set of nine genes. In both instances, experimental findings are reproduced (when available) and the number of false positives is significantly reduced with respect to other commonly employed methods. Comment: 6 pages, 5 figures
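    As a toy illustration of why the matching probability fluctuates for short motifs, the sketch below (an assumed i.i.d. background model and an arbitrary example word, not the paper's self-consistent procedure) evaluates the chance-match probability of one short word under the local base composition of two different windows.

```python
# Toy illustration, not the paper's self-consistent procedure: the probability of a
# chance match for a short word depends strongly on the local base composition of the
# window, so it fluctuates from one putative binding site to another.
import numpy as np

def match_prob(word, window):
    """P(word occurs at a fixed position) under an i.i.d. model fit to the window."""
    freq = {b: (window.count(b) + 1) / (len(window) + 4) for b in "ACGT"}  # add-one smoothing
    p = 1.0
    for b in word:
        p *= freq[b]
    return p

rng = np.random.default_rng(0)
word = "TGACC"                                        # arbitrary short example word
gc_rich = "".join(rng.choice(list("GGCCAT"), 200))    # GC-biased background window
at_rich = "".join(rng.choice(list("AATTGC"), 200))    # AT-biased background window
print("match probability in GC-rich window:", match_prob(word, gc_rich))
print("match probability in AT-rich window:", match_prob(word, at_rich))
```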

    Central limit theorem for fluctuations of linear eigenvalue statistics of large random graphs

    We consider the adjacency matrix $A$ of a large random graph and study fluctuations of the function $f_n(z,u)=\frac{1}{n}\sum_{k=1}^n\exp\{-uG_{kk}(z)\}$ with $G(z)=(z-iA)^{-1}$. We prove that the moments of the fluctuations, normalized by $n^{-1/2}$, satisfy the Wick relations for Gaussian random variables in the limit $n\to\infty$. This allows us to prove the central limit theorem for $\mathrm{Tr}\,G(z)$ and then extend the result to the linear eigenvalue statistics $\mathrm{Tr}\,\phi(A)$ of any function $\phi:\mathbb{R}\to\mathbb{R}$ which increases, together with its first two derivatives, at infinity not faster than an exponential. Comment: 22 pages
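    A quick numerical look at the scaling claimed above can be obtained by sampling $f_n(z,u)$ for random adjacency matrices and watching $\sqrt{n}\,\mathrm{std}(f_n)$ stabilize; in the sketch below the Erdős–Rényi graph, the average degree and the values $z=1$, $u=1$ are arbitrary illustrative choices.

```python
# Illustrative numerical check (Erdos-Renyi adjacency matrix, arbitrary z and u): compute
# f_n(z,u) = (1/n) * sum_k exp(-u * G_kk(z)) with G(z) = (z - iA)^{-1} and observe
# that the fluctuations of f_n shrink roughly like n^{-1/2}.
import numpy as np

def f_n(A, z=1.0, u=1.0):
    n = A.shape[0]
    G = np.linalg.inv(z * np.eye(n) - 1j * A)     # the matrix G(z) from the abstract
    return np.mean(np.exp(-u * np.diag(G)))

def std_of_f(n, avg_degree=10.0, reps=30, seed=0):
    rng = np.random.default_rng(seed)
    samples = []
    for _ in range(reps):
        edges = np.triu(rng.random((n, n)) < avg_degree / n, 1)
        A = (edges + edges.T).astype(float)       # symmetric 0/1 adjacency matrix
        samples.append(f_n(A))
    return np.std(samples)

for n in (200, 400):
    print(f"n={n}: sqrt(n) * std(f_n) ~ {np.sqrt(n) * std_of_f(n):.4f}")
```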

    Linear and nonlinear post-processing of numerically forecasted surface temperature

    In this paper we test different approaches to the statistical post-processing of gridded numerical surface air temperatures (provided by the European Centre for Medium-Range Weather Forecasts) onto the temperature measured at surface weather stations located in the Italian region of Puglia. We consider simple post-processing techniques, like correction for altitude, linear regression from different input parameters and Kalman filtering, as well as a neural network training procedure, stabilised (i.e. driven into the absolute minimum of the error function over the learning set) by means of a Simulated Annealing method. A comparative analysis of the results shows that the neural network performs best, which is encouraging for its systematic use in operational meteorological forecast-analysis services.
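    A minimal sketch of the two simplest corrections mentioned above (synthetic data, an assumed mean lapse rate and arbitrary coefficients; not the operational configuration used in the paper) is given below: an altitude correction of the gridded forecast followed by a least-squares linear regression onto the station observation.

```python
# Minimal sketch with synthetic data (assumed lapse rate and arbitrary coefficients,
# not the paper's operational setup): altitude-correct the gridded forecast, then fit
# a linear regression from the corrected forecast to the station observation.
import numpy as np

rng = np.random.default_rng(0)
LAPSE_RATE = 0.0065                               # assumed mean lapse rate, K per metre
z_grid, z_station = 350.0, 120.0                  # grid-point and station altitude, metres

# Synthetic "truth": station temperature loosely related to the gridded forecast
t_model = rng.normal(15.0, 5.0, 300)              # gridded forecast, degrees C
t_obs = 0.9 * t_model + LAPSE_RATE * (z_grid - z_station) + 1.2 + rng.normal(0, 0.8, 300)

# Step 1: altitude correction of the gridded forecast
t_corr = t_model + LAPSE_RATE * (z_grid - z_station)

# Step 2: least-squares regression t_obs ~ a * t_corr + b
X = np.column_stack([t_corr, np.ones_like(t_corr)])
a, b = np.linalg.lstsq(X, t_obs, rcond=None)[0]
print(f"post-processing rule: T_station = {a:.2f} * T_corrected + {b:.2f}")
```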

    Transition from regular to complex behaviour in a discrete deterministic asymmetric neural network model

    We study the long time behaviour of the transient before the collapse on the periodic attractors of a discrete deterministic asymmetric neural network model. The system has a finite number of possible states, so it is not possible to use the term chaos in the usual sense of sensitive dependence on the initial condition. Nevertheless, as the asymmetry parameter $k$ is varied, one observes a transition from ordered motion (i.e. short transients and short periods on the attractors) to a ``complex'' temporal behaviour. This transition takes place at the same value $k_{\rm c}$ at which the mean transient length changes from a power law in the size of the system $N$ to an exponential law in $N$. The ``complex'' behaviour during the transient shows strong analogies with chaotic behaviour: decay of temporal correlations, positive Shannon entropy, non-constant Renyi entropies of different orders. Moreover, the transition is very similar to the intermittent transition in chaotic systems: scaling law for the Shannon entropy and strong fluctuations of the ``effective Shannon entropy'' along the transient, for $k > k_{\rm c}$. Comment: 18 pages + 6 figures, TeX dialect: Plain TeX + IOP macros (included)
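    The kind of transient-length measurement described above can be mimicked with a toy experiment; in the sketch below the parametrization of the asymmetry, $J = J_{\rm sym} + k\,J_{\rm antisym}$, the network size and the update rule are assumptions for illustration, not the paper's exact definitions.

```python
# Toy experiment (the coupling parametrization J = J_sym + k * J_antisym, the sizes and
# the update rule are illustrative assumptions, not the paper's exact model): run the
# deterministic parallel dynamics s(t+1) = sign(J s(t)) and measure the transient
# length before the trajectory falls onto a periodic attractor.
import numpy as np

def transient_length(N=64, k=0.5, seed=0, t_max=20000):
    rng = np.random.default_rng(seed)
    M = rng.normal(size=(N, N))
    J = (M + M.T) / 2 + k * (M - M.T) / 2         # symmetric plus k times antisymmetric part
    s = np.where(rng.random(N) < 0.5, 1, -1)      # random initial configuration of +/-1 spins
    first_seen = {}
    for t in range(t_max):
        key = s.tobytes()
        if key in first_seen:                     # state revisited: we are on the attractor
            return first_seen[key]                # steps spent outside the cycle
        first_seen[key] = t
        s = np.where(J @ s >= 0, 1, -1)           # deterministic parallel update
    return t_max                                  # no recurrence found within t_max

for k in (0.2, 0.8):
    mean_t = np.mean([transient_length(k=k, seed=i) for i in range(10)])
    print(f"k={k}: mean transient length ~ {mean_t:.1f}")
```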

    Chaos in neural networks with a nonmonotonic transfer function

    Time evolution of diluted neural networks with a nonmonotonic transfer function is analytically described by flow equations for macroscopic variables. The macroscopic dynamics shows a rich variety of behaviours: fixed points, periodicity and chaos. We examine in detail the structure of the strange attractor and in particular we study the main features of the stable and unstable manifolds, the hyperbolicity of the attractor and the existence of homoclinic intersections. We also discuss the problem of the robustness of the chaos and we prove that in the present model chaotic behaviour is fragile (chaotic regions are densely intercalated with periodicity windows), in agreement with a recently discussed conjecture. Finally we perform an analysis of the microscopic behaviour and in particular we examine the occurrence of damage spreading by studying the time evolution of two almost identical initial configurations. We show that for any choice of the parameters the two initial states remain microscopically distinct. Comment: 12 pages, 11 figures. Accepted for publication in Physical Review E. Originally submitted to the neuro-sys archive, which was never publicly announced (was 9905001).
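    The damage-spreading measurement mentioned at the end can be sketched as follows; the particular nonmonotonic transfer function, the coupling statistics and all parameters below are illustrative assumptions rather than the paper's model.

```python
# Damage-spreading sketch (the nonmonotonic transfer function, coupling statistics and
# parameters are illustrative assumptions, not the paper's model): evolve two configurations
# differing in a single neuron under identical couplings and track their Hamming distance.
import numpy as np

def nonmonotonic_sign(h, theta=1.0):
    """+1/-1 output that reverses sign when the local field exceeds theta (toy choice)."""
    out = np.where(h >= 0, 1, -1)
    return np.where(np.abs(h) > theta, -out, out)

rng = np.random.default_rng(0)
N = 200
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))   # random asymmetric couplings
np.fill_diagonal(J, 0.0)

s1 = np.where(rng.random(N) < 0.5, 1, -1)
s2 = s1.copy()
s2[0] *= -1                                           # flip one neuron: the initial "damage"

for _ in range(50):
    s1 = nonmonotonic_sign(J @ s1)
    s2 = nonmonotonic_sign(J @ s2)

print("Hamming distance after 50 steps:", np.mean(s1 != s2))   # > 0: the damage persisted
```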