Emerging heterogeneities in Italian customs and comparison with nearby countries
In this work we apply techniques and modus operandi typical of Statistical
Mechanics to a large dataset of key social quantifiers and compare the
resulting behaviours of five European nations, namely France, Germany, Italy,
Spain and Switzerland. The social quantifiers considered are the evolution
of the number of autochthonous marriages (i.e. between two natives) within a
given territorial district and the evolution of the number of mixed
marriages (i.e. between a native and an immigrant) within a given territorial
district. Our investigations are twofold. From a theoretical perspective, we
develop novel techniques, complementary to classical methods (e.g. time-series
analysis and logistic regression), in order to detect possible collective
features underlying the empirical behaviours; from an experimental perspective,
we identify a clear pattern in the evolution of the social quantifiers
considered. The agreement between experimental results and theoretical
predictions is excellent and allows us to speculate that France, Italy and
Spain display a certain degree of {\em internal heterogeneity}, which is not
found in Germany and Switzerland; this heterogeneity, quite mild in France and
Spain, is not negligible in Italy and highlights quantitative differences in
the customs of Northern and Southern regions. These findings may suggest the
persistence of two culturally distinct communities, long-lasting heritages of
different and well-established cultures.
Comment: in PLoS One (2015)
A walk in the statistical mechanical formulation of neural networks
Neural networks are nowadays both powerful operational tools (e.g., for
pattern recognition, data mining, error correction codes) and complex
theoretical models at the focus of scientific investigation. On the research
side, neural networks are handled and studied by psychologists,
neurobiologists, engineers, mathematicians and theoretical physicists. In
particular, in theoretical physics, the key instrument for the quantitative
analysis of neural networks is statistical mechanics. From this perspective,
here, we first review attractor networks: starting from ferromagnets and
spin-glass models, we discuss the underlying philosophy and we recover the
path paved by Hopfield and by Amit, Gutfreund and Sompolinsky. Going one step further, we
highlight the structural equivalence between Hopfield networks (modeling
retrieval) and Boltzmann machines (modeling learning), hence realizing a deep
bridge linking two inseparable aspects of biological and robotic spontaneous
cognition. As a sideline, in this walk we derive two alternative (with respect
to the original Hebb proposal) ways to recover the Hebbian paradigm, stemming
from ferromagnets and from spin-glasses, respectively. Further, as these notes
are intended for an Engineering audience, we also highlight the mappings
between ferromagnets and operational amplifiers and between antiferromagnets
and flip-flops (as neural networks, built from op-amps and flip-flops, are
particular spin-glasses, and the latter are indeed combinations of ferromagnets
and antiferromagnets), hoping that such a bridge serves as a concrete
prescription for capturing the beauty of robotics from the statistical
mechanical perspective.
Comment: Contribution to the proceedings of the conference NCTA 2014. Contains
12 pages, 7 figures
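As an aside for readers who prefer code, the retrieval dynamics of a Hopfield network with Hebbian couplings can be sketched in a few lines. This is a minimal illustration with arbitrary parameters (network size, storage load, corruption level), not an implementation taken from the notes themselves:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 3                          # neurons and stored patterns (low storage)
xi = rng.choice([-1, 1], size=(P, N))  # random binary patterns

# Hebb prescription: J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-coupling
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# start from pattern 0 corrupted on 10% of its sites
sigma = xi[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
sigma[flip] *= -1

# zero-temperature asynchronous dynamics (attractor retrieval)
for _ in range(10):
    for i in rng.permutation(N):
        sigma[i] = 1 if J[i] @ sigma >= 0 else -1

overlap = abs(xi[0] @ sigma) / N       # Mattis overlap with the stored pattern
print(f"overlap after retrieval: {overlap:.2f}")
```

Well below the storage capacity, the dynamics relax back onto the stored pattern and the overlap approaches one.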
Meta-stable states in the hierarchical Dyson model drive parallel processing in the hierarchical Hopfield network
In this paper we introduce and investigate the statistical mechanics of
hierarchical neural networks: first, we approach these systems \`a la Mattis,
by regarding the Dyson model as a single-pattern hierarchical neural network,
and we discuss the stability of different retrievable states as predicted by
the related self-consistency equations obtained from a mean-field bound and from a
bound that bypasses the mean-field limitation. The latter is worked out by
properly reabsorbing fluctuations of the magnetization related to higher levels
of the hierarchy into effective fields for the lower levels. Remarkably, mixing
Amit's ansatz technique (to select candidate retrievable states) with the
interpolation procedure (to solve for the free energy of these states), we prove
that (due to gauge symmetry) the Dyson model accomplishes both serial and
parallel processing. Going one step further, we extend this scenario to
multiple stored patterns by implementing the Hebb prescription for learning
within the couplings. This results in a Hopfield-like network constrained to a
hierarchical topology, for which, restricting to the low-storage regime (where
the number of patterns grows at most logarithmically with the number of
neurons), we prove the existence of the thermodynamic limit for the free energy
and give an explicit expression of its mean-field bound and of the related
improved bound.
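To make the hierarchical construction concrete, here is a minimal sketch of a Dyson-like coupling matrix, in which the interaction between two spins accumulates a contribution from every level of the binary tree containing them both. The decay exponent and normalization below are one common convention, assumed for illustration and not necessarily the paper's:

```python
import numpy as np

def dyson_couplings(k, sigma=0.8, J0=1.0):
    """Coupling matrix for 2**k spins on a Dyson-like hierarchical tree.

    Level l groups spins into blocks of size 2**l; every pair inside a
    level-l block receives an extra coupling J0 * 4**(-sigma * l)
    (assumed convention; exponents and normalizations vary in the literature).
    """
    N = 2 ** k
    J = np.zeros((N, N))
    for l in range(1, k + 1):
        block = 2 ** l
        for start in range(0, N, block):
            J[start:start + block, start:start + block] += J0 * 4.0 ** (-sigma * l)
    np.fill_diagonal(J, 0.0)
    return J

Jmat = dyson_couplings(4)
# couplings decay with hierarchical distance: neighbours vs opposite branches
print(Jmat[0, 1], Jmat[0, -1])
```

Nearby spins share all k levels and interact strongly, while spins in opposite branches share only the top level, which is the non-mean-field feature the analysis exploits.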
Topological properties of hierarchical networks
Hierarchical networks are attracting renewed interest for modelling the
organization of a number of biological systems and for tackling the complexity
of statistical mechanical models beyond mean-field limitations. Here we
consider the Dyson hierarchical construction for ferromagnets, neural networks
and spin-glasses, recently analyzed from a statistical-mechanics perspective,
and we focus on the topological properties of the underlying structures. In
particular, we find that such structures are weighted graphs exhibiting a high
degree of clustering and modularity, with a small spectral gap; the robustness
of such features with respect to link removal is also studied. These outcomes
are then discussed and related to the statistical mechanics scenario in full
consistency. Lastly, we look at these weighted graphs as Markov chains and we
show that in the limit of infinite size, the emergence of ergodicity breakdown
for the stochastic process mirrors the emergence of meta-stabilities in the
corresponding statistical mechanical analysis.
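The link between the graph spectrum and ergodicity breakdown can be probed numerically: viewing the weighted hierarchical graph as a random walk, the spectral gap of its transition matrix shrinks as the hierarchy deepens. The construction below is a plausible sketch under an assumed decay exponent, not the paper's exact ensemble:

```python
import numpy as np

def hierarchical_weights(k, sigma=0.8):
    """Weighted adjacency of a 2**k-node Dyson-like hierarchical graph
    (assumed convention: level-l blocks add weight 4**(-sigma*l) per pair)."""
    N = 2 ** k
    W = np.zeros((N, N))
    for l in range(1, k + 1):
        block = 2 ** l
        for s in range(0, N, block):
            W[s:s + block, s:s + block] += 4.0 ** (-sigma * l)
    np.fill_diagonal(W, 0.0)
    return W

def spectral_gap(W):
    """Gap 1 - lambda_2 of the random-walk transition matrix P = D^-1 W."""
    P = W / W.sum(axis=1, keepdims=True)
    ev = np.sort(np.real(np.linalg.eigvals(P)))[::-1]
    return 1.0 - ev[1]

# the gap closes as the hierarchy deepens: the slow mode (flipping between the
# two main branches) heralds the ergodicity breakdown seen in the infinite limit
gaps = {k: spectral_gap(hierarchical_weights(k)) for k in (3, 5, 7)}
for k, g in gaps.items():
    print(f"k={k}: spectral gap {g:.4f}")
```

The slowest mode is the one separating the two top-level branches, so the vanishing gap mirrors the meta-stabilities of the statistical mechanical treatment.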
Hierarchical neural networks perform both serial and parallel processing
In this work we study a Hebbian neural network, where neurons are arranged
according to a hierarchical architecture such that their couplings scale with
their reciprocal distance. As a full statistical mechanics solution is not yet
available, after a streamlined introduction to the state of the art along that
route, the problem is consistently approached through the signal-to-noise
technique and extensive numerical simulations. Focusing on the low-storage
regime, where the number of stored patterns grows at most logarithmically with
the system size, we prove that these non-mean-field Hopfield-like networks
display a richer phase diagram than their classical counterparts. In
particular, these networks are able to perform serial processing (i.e. retrieve
one pattern at a time through a complete rearrangement of the whole ensemble of
neurons) as well as parallel processing (i.e. retrieve several patterns
simultaneously, delegating the management of different patterns to the diverse
communities that build up the network). The tuning between the two regimes is
set by the rate of coupling decay and by the level of noise affecting the
system. The price to pay for these remarkable capabilities is a network
capacity smaller than that of the mean-field counterpart, thus yielding a new
budget principle: the wider the multitasking capabilities, the lower the
network load, and vice versa. This may have important implications for our
understanding of biological complexity.
From Dyson to Hopfield: Processing on hierarchical networks
We consider statistical-mechanical models for spin systems built on
hierarchical structures, which provide a simple example of a non-mean-field
framework. We show that the coupling decay with spin distance can give rise to
peculiar features and phase diagrams much richer than their mean-field
counterparts. In particular, we consider the Dyson model, mimicking
ferromagnetism in lattices, and we prove the existence of a number of
meta-stabilities, beyond the ordered state, which become stable in the
thermodynamic limit. Such a feature is retained when the hierarchical structure
is coupled with the Hebb rule for learning, hence mimicking the modular
architecture of neurons, and gives rise to an associative network able to
perform both as a serial and as a parallel processor, depending
crucially on the external stimuli and on the rate of interaction decay with
distance; however, those emergent multitasking features reduce the network
capacity with respect to the mean-field counterpart. The analysis is
accomplished through statistical mechanics, graph theory, signal-to-noise
technique and numerical simulations in full consistency. Our results shed light
on the biological complexity shown by real networks, and suggest future
directions for understanding more realistic models.
Multitasking network with fast noise
We consider the multitasking associative network in the low-storage limit and
we study its phase diagram with respect to the noise level and the degree
of dilution in pattern entries. We find that the system is characterized by
a rich variety of stable states, among which are pure states, parallel
retrieval states, hierarchically organized states and symmetric mixtures
(remarkably,
both even and odd), whose complexity increases as the number of patterns
grows. The analysis is performed both analytically and numerically: exploiting
techniques based on partial differential equations allows us to derive the
self-consistency equations for the order parameters. These equations are then
solved, and the solutions are further checked through stability theory in order
to catalog them in the phase diagram, which is completely outlined at the end.
This is a further step toward the understanding of spontaneous parallel
processing in associative networks.
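A zero-noise toy version of this parallel retrieval can be simulated directly: with diluted patterns, a single spin configuration can carry a macroscopic Mattis overlap with several patterns at once. The parameters below (size, dilution, number of patterns) are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, d = 400, 2, 0.5              # neurons, patterns, dilution P(xi = 0) = d

# diluted patterns: each entry is 0 with probability d, otherwise +/-1
xi = rng.choice([-1, 0, 1], size=(P, N), p=[(1 - d) / 2, d, (1 - d) / 2])

# Hebbian couplings, no self-interaction
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# start from the signed superposition of the two patterns
sigma = np.sign(xi.sum(axis=0)).astype(float)
zeros = sigma == 0
sigma[zeros] = rng.choice([-1.0, 1.0], size=zeros.sum())

# zero-temperature asynchronous dynamics
for _ in range(10):
    for i in rng.permutation(N):
        sigma[i] = 1.0 if J[i] @ sigma >= 0 else -1.0

m = xi @ sigma / N                 # Mattis overlaps with the two patterns
print(f"overlaps: m1 = {m[0]:.2f}, m2 = {m[1]:.2f}")
```

Because each pattern is blank on part of the network, the blanks of one pattern leave room for the other: both overlaps remain macroscopic at the fixed point, which is the signature of a parallel retrieval state.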