    Inverse modeling of time-delayed interactions via the dynamic-entropy formalism

    Even though instantaneous interactions are unphysical, a large variety of maximum-entropy statistical-inference methods match the equal-time correlation functions of the inferred model to the empirically measured ones. This constraint is justified when the interaction timescale is much faster than that of the interacting units, as, e.g., in starling flocks (where birds see each other via the electromagnetic field), but it fails in a number of counterexamples, e.g., leukocyte coordination (where signalling proteins diffuse between two cells). Here, relying upon the Akaike Information Criterion, we relax this assumption and develop a dynamical maximum-entropy framework that copes with delays in signalling. Our method infers not only the strength of couplings and fields, but also the time the couplings require to propagate among the units. We demonstrate the validity of our approach on synthetic datasets generated by the Heisenberg-Kuramoto and Vicsek models, where it yields excellent results. As a proof of concept, we also apply the method to experiments on dendritic migration, showing that matching equal-time correlations alone results in a significant loss of information.
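    The delay-selection step can be illustrated with a small numerical sketch (not the authors' code: the one-dimensional toy model, the linear-Gaussian likelihood, and all names are assumptions made here). For each candidate lag d it fits a single delayed coupling B[t] ≈ J·A[t − d] by maximum likelihood and scores the fit with AIC = 2k − 2 ln L; the smallest AIC selects the lag.

```python
# Hypothetical toy example: unit B copies unit A after a lag `true_delay`, plus noise.
# For each candidate delay d we fit a one-parameter linear (Gaussian maximum-entropy)
# model B[t] ~ J * A[t - d] and pick the delay that minimizes the AIC.
import numpy as np

rng = np.random.default_rng(0)
T, true_delay, noise = 5000, 7, 0.3

A = rng.standard_normal(T)
B = np.empty(T)
B[:true_delay] = rng.standard_normal(true_delay)
B[true_delay:] = A[:-true_delay] + noise * rng.standard_normal(T - true_delay)

def aic_for_delay(d):
    x, y = A[: T - d], B[d:]                # A at time t, B at time t + d
    J = (x @ y) / (x @ x)                   # maximum-likelihood coupling
    sigma2 = (y - J * x).var()              # maximum-likelihood noise variance
    k = 2                                   # fitted parameters: J and sigma2
    loglik = -0.5 * len(y) * (np.log(2 * np.pi * sigma2) + 1.0)
    return 2 * k - 2 * loglik

delays = np.arange(0, 20)
best = min(delays, key=aic_for_delay)
print("inferred delay:", best)              # expected: 7
```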

    Dense Hebbian neural networks: a replica symmetric picture of supervised learning

    We consider dense associative neural networks trained by a teacher (i.e., with supervision) and we investigate their computational capabilities analytically, via the statistical mechanics of spin glasses, and numerically, via Monte Carlo simulations. In particular, we obtain a phase diagram summarizing their performance as a function of control parameters such as the quality and quantity of the training dataset, the network storage, and the noise, valid in the limit of large network size and structureless datasets: these networks may work in an ultra-storage regime (where they can handle a huge number of patterns compared with shallow neural networks) or in an ultra-detection regime (where they can perform pattern recognition at signal-to-noise ratios that are prohibitive for shallow neural networks). Guided by the random theory as a reference framework, we also numerically test the learning, storing, and retrieval capabilities of these networks on structured datasets such as MNIST and Fashion-MNIST. As technical remarks, on the analytic side, we implement large-deviation and stability analyses within Guerra's interpolation to tackle the non-Gaussian distributions involved in the post-synaptic potentials, while, on the computational side, we insert the Plefka approximation in the Monte Carlo scheme to speed up the evaluation of the synaptic tensors, overall obtaining a novel and broad approach to investigating supervised learning in neural networks beyond the shallow limit.
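    A minimal numerical sketch of the dense Hebbian setting with supervision follows, under simplifying assumptions of our own (toy sizes, a degree-4 overlap-based energy E(s) = −N Σ_μ m_μ⁴, and class averages of the labeled examples playing the role of learned archetypes); it is an illustration of the technique, not the paper's implementation.

```python
# Minimal sketch of a dense (p = 4) Hebbian network trained with supervision.
import numpy as np

rng = np.random.default_rng(1)
N, K, M, p = 200, 3, 40, 4       # neurons, classes, examples per class, interaction degree
flip = 0.2                       # probability that an example pixel is flipped

xi = rng.choice([-1, 1], size=(K, N))                 # ground-truth archetypes
noise = np.where(rng.random((K, M, N)) < flip, -1, 1)
examples = xi[:, None, :] * noise                     # labeled noisy examples

# Supervised step: labels group the examples, so each class is summarized
# by its empirical average, which acts as a learned archetype.
learned = examples.mean(axis=1)                       # shape (K, N)

def local_field(s):
    m = learned @ s / N                               # overlaps with learned archetypes
    return (p * m ** (p - 1)) @ learned               # -dE/ds up to a positive constant

# Retrieval test: start from a corrupted copy of archetype 0 and iterate sign updates.
s = xi[0] * np.where(rng.random(N) < 0.35, -1, 1)
for _ in range(20):
    s = np.where(local_field(s) >= 0, 1, -1)
print("overlap with archetype 0:", xi[0] @ s / N)     # close to 1 on successful retrieval
```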

    Dense Hebbian neural networks: a replica symmetric picture of unsupervised learning

    We consider dense associative neural networks trained with no supervision and we investigate their computational capabilities analytically, via a statistical-mechanics approach, and numerically, via Monte Carlo simulations. In particular, we obtain a phase diagram summarizing their performance as a function of control parameters such as the quality and quantity of the training dataset and the network storage, valid in the limit of large network size and structureless datasets. Moreover, we establish a bridge between macroscopic observables standardly used in statistical mechanics and loss functions typically used in machine learning. As technical remarks, on the analytic side, we implement large-deviation and stability analyses within Guerra's interpolation to tackle the non-Gaussian distributions involved in the post-synaptic potentials, while, on the computational side, we insert the Plefka approximation in the Monte Carlo scheme to speed up the evaluation of the synaptic tensors, overall obtaining a novel and broad approach to investigating neural networks in general.
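    For contrast with the supervised sketch above, here is the unsupervised counterpart under the same toy assumptions (again an illustration, not the paper's code): with no labels, every noisy example enters the dense Hebbian sum on its own, and the archetype can still be retrieved because same-class examples are correlated.

```python
# Minimal sketch of a dense (p = 4) Hebbian network trained without labels.
import numpy as np

rng = np.random.default_rng(2)
N, K, M, p = 200, 3, 40, 4       # neurons, classes, examples per class, interaction degree
flip = 0.2

xi = rng.choice([-1, 1], size=(K, N))                  # ground-truth archetypes
noise = np.where(rng.random((K, M, N)) < flip, -1, 1)
examples = (xi[:, None, :] * noise).reshape(K * M, N)  # unlabeled example pool

def local_field(s):
    m = examples @ s / N                  # overlap with every stored example
    return (p * m ** (p - 1)) @ examples  # overall scale is irrelevant for sign updates

s = xi[0] * np.where(rng.random(N) < 0.35, -1, 1)      # corrupted cue for archetype 0
for _ in range(20):
    s = np.where(local_field(s) >= 0, 1, -1)
print("overlap with archetype 0:", xi[0] @ s / N)
```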