809 research outputs found
What if you know it all? Quantifying human behavior from a virtual world
We use a massive multiplayer online game to study human interactions and social behaviour. We have complete information on every action carried out by each of the 480,000 players in the game. This complete information on a human society, in particular its time-varying social networks of several types, allows us to quantify how humans form social bonds, how they organise, how behaviour is gender-specific, and how the wealth of players is related to their positions in their social multiplex networks.
How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems
The maximum entropy principle (MEP) is a method for obtaining the most likely
distribution functions of observables from statistical systems, by maximizing
entropy under constraints. The MEP has found hundreds of applications in
ergodic and Markovian systems in statistical mechanics, information theory, and
statistics. For several decades there has been an ongoing controversy over
whether the notion of the maximum entropy principle can be extended in a
meaningful way to non-extensive, non-ergodic, and complex statistical systems
and processes. In
this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related
to multiplicities of independent random processes. We then show how the
relaxation of independence naturally leads to the most general entropies that
are compatible with the first three Shannon-Khinchin axioms, the
(c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept
for non-ergodic and complex statistical systems if their relative entropy can
be factored into a generalized multiplicity and a constraint term. The problem
of finding such a factorization reduces to finding an appropriate
representation of relative entropy in a linear basis. In a particular example
we show that path-dependent random processes with memory naturally require
specific generalized entropies. The example is the first exact derivation of a
generalized entropy from the microscopic properties of a path-dependent random
process. Comment: 6 pages, 1 figure. To appear in PNA
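The constrained maximization that the abstract builds on can be sketched for the classical Boltzmann-Gibbs-Shannon case. The snippet below is a generic textbook illustration, not code from the paper (all function names are ours): it maximizes Shannon entropy on a finite state space subject to a mean-value constraint, which by Lagrange multipliers yields the exponential family p_i ∝ exp(−λ x_i), with λ fixed by the constraint.

```python
import math

def maxent_distribution(xs, mu, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximize Shannon entropy over distributions p on states xs subject to
    sum(p) = 1 and sum(p * x) = mu. Lagrange multipliers give the exponential
    family p_i ~ exp(-lam * x_i); lam is found by bisection."""
    def mean(lam):
        w = [math.exp(-lam * x) for x in xs]
        z = sum(w)
        return sum(wi * x for wi, x in zip(w, xs)) / z

    # mean(lam) decreases monotonically in lam, so bisection applies
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean(mid) > mu:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

def shannon_entropy(p):
    """Boltzmann-Gibbs-Shannon entropy -sum p_i ln p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)
```

For example, on states {0, 1, 2, 3} with constrained mean 1.0, the result is a geometric-like distribution whose entropy is below that of the uniform distribution (ln 4), as any additional constraint requires.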
Scaling-violation phenomena and fractality in the human posture control systems
By analyzing the movements of quiet standing persons by means of wavelet
statistics, we observe multiple scaling regions in the underlying body
dynamics. The use of the wavelet-variance function opens the possibility to
relate scaling violations to different modes of posture control. We show that
scaling behavior becomes close to perfect, when correctional movements are
dominated by the vestibular system. Comment: 12 pages, 4 figures, to appear in Phys. Rev.
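The wavelet-variance function used above can be illustrated with a minimal Haar-type estimator (a generic sketch, not the authors' code): at scale s, the detail coefficient of a window of length 2s is the difference between the means of its two halves, and scaling regions appear as power-law segments of the variance of these details as a function of s.

```python
import math
import random

def haar_wavelet_variance(x, scales):
    """Haar-type wavelet variance of a time series x. At each scale s the
    detail coefficient of a non-overlapping window of length 2*s is the mean
    of the right half minus the mean of the left half; the function returns
    the variance of these details per scale."""
    result = {}
    for s in scales:
        details = []
        for i in range(0, len(x) - 2 * s + 1, 2 * s):
            left = sum(x[i:i + s]) / s
            right = sum(x[i + s:i + 2 * s]) / s
            details.append(right - left)
        m = sum(details) / len(details)
        result[s] = sum((d - m) ** 2 for d in details) / len(details)
    return result
```

As a sanity check on synthetic data: for white noise the wavelet variance decays with scale, while for a random walk (integrated noise) it grows, so the two show opposite slopes in a log-log plot of variance versus scale.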
On the robustness of q-expectation values and Renyi entropy
We study the robustness of functionals of probability distributions, such as
the Rényi and nonadditive S_q entropies as well as the q-expectation values,
under small variations of the distributions. We focus on three important types
of distribution functions, namely (i) continuous bounded, (ii) discrete with a
finite number of states, and (iii) discrete with an infinite number of states. The
physical concept of robustness is contrasted with the mathematically stronger
condition of stability and Lesche-stability for functionals. We explicitly
demonstrate that, in the case of continuous distributions, once unbounded
distributions and those leading to negative entropy are excluded, both the Rényi
and nonadditive S_q entropies as well as the q-expectation values are robust.
For the discrete finite case, the Rényi and nonadditive S_q entropies and the
q-expectation values are robust. For the infinite discrete case, where the
Rényi entropy and the q-expectations are known to violate Lesche-stability and
stability, respectively, we show that one can nevertheless state conditions
which guarantee physical robustness. Comment: 6 pages, to appear in Euro Phys Let
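The functionals under study are simple enough to state in code. The sketch below is our own illustration, not from the paper: it computes the Rényi entropy, the nonadditive S_q (Tsallis) entropy, and the escort q-expectation value for a discrete distribution, so that their change under a small perturbation of p can be checked numerically.

```python
import math

def renyi_entropy(p, q):
    """Rényi entropy S^R_q = ln(sum p_i^q) / (1 - q), for q != 1."""
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def tsallis_entropy(p, q):
    """Nonadditive (Tsallis) entropy S_q = (1 - sum p_i^q) / (q - 1)."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def q_expectation(p, xs, q):
    """Escort q-expectation value <x>_q = sum p_i^q x_i / sum p_i^q."""
    norm = sum(pi ** q for pi in p)
    return sum((pi ** q) * x for pi, x in zip(p, xs)) / norm
```

For a finite discrete distribution, a perturbation of order epsilon changes all three quantities by an amount of the same order, which is the robustness property the abstract refers to for case (ii).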
Opinion Formation in Laggard Societies
We introduce a statistical physics model for opinion dynamics on random
networks where agents adopt the opinion held by the majority of their direct
neighbors only if the fraction of these neighbors exceeds a certain threshold,
p_u. We find a transition from total final consensus to a mixed phase where
opinions coexist amongst the agents. The relevant parameters are the relative
sizes in the initial opinion distribution within the population and the
connectivity of the underlying network. As the order parameter we define the
asymptotic state of opinions. In the phase diagram we find regions of total
consensus and a mixed phase. As the 'laggard parameter' p_u increases, the
regions of consensus shrink. In addition we introduce rewiring of the
underlying network during the opinion formation process and discuss the
resulting consequences in the phase diagram. Comment: 5 pages, eps fig
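A minimal sketch of such a threshold ('laggard') dynamics, assuming an Erdős-Rényi random network; only p_u comes from the abstract, and the parameter names n, k_avg, and f_plus are illustrative, not the paper's:

```python
import random

def simulate_laggard_opinions(n=200, k_avg=8, p_u=0.6, f_plus=0.7,
                              steps=50_000, seed=1):
    """Threshold majority model: an agent adopts opinion +1 only if the
    fraction of +1 neighbours exceeds p_u (and -1 only if it falls below
    1 - p_u); otherwise it keeps its current opinion. Returns the final
    fraction of +1 opinions."""
    rng = random.Random(seed)
    p_edge = k_avg / (n - 1)
    nbrs = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p_edge:
                nbrs[i].append(j)
                nbrs[j].append(i)
    # initial opinions: a fraction f_plus of agents holds +1, the rest -1
    opinion = [1 if rng.random() < f_plus else -1 for _ in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)
        if not nbrs[i]:
            continue
        plus = sum(1 for j in nbrs[i] if opinion[j] == 1)
        frac_plus = plus / len(nbrs[i])
        if frac_plus > p_u:
            opinion[i] = 1
        elif frac_plus < 1.0 - p_u:
            opinion[i] = -1
    return sum(1 for o in opinion if o == 1) / n
```

Varying p_u reproduces the qualitative picture of the abstract: at moderate thresholds a large initial majority drives the system toward consensus, while for p_u close to 1 flips require near-unanimous neighbourhoods, so mixed opinion states persist.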
Basel III capital surcharges for G-SIBs are far less effective in managing systemic risk in comparison to network-based, systemic risk-dependent financial transaction taxes
In addition to constraining bilateral exposures of financial institutions, there exist essentially two options for future financial regulation of systemic risk: First, regulation could attempt to reduce the financial fragility of global or domestic systemically important financial institutions (G-SIBs or D-SIBs), as for instance proposed by Basel III. Second, it could focus on strengthening the financial system as a whole by reducing the probability of large-scale cascading events. This can be achieved by re-shaping the topology of financial networks. We use an agent-based model of a financial system and the real economy to study and compare the consequences of these two options. By conducting three computer experiments with the agent-based model we find that re-shaping financial networks is more effective and efficient than reducing financial fragility. Capital surcharges for G-SIBs could reduce systemic risk, but they would have to be substantially larger than those specified in the current Basel III proposal in order to have a measurable impact. This would cause a loss of efficiency.
Topology without cooling: instantons and monopoles near to deconfinement
In an attempt to describe the change of the topological structure of pure SU(2)
gauge theory near deconfinement, a renormalization-group-inspired method is
tested. Instead of cooling, blocking and subsequent inverse blocking is applied
to Monte Carlo configurations to capture topological features at a well-defined
scale. We check that this procedure largely conserves long-range physics such as
the string tension. UV fluctuations and lattice artefacts, which would otherwise
spoil the topological charge density and Abelian monopole currents, are removed. We
report the behaviour of topological susceptibility and monopole current
densities across the deconfinement transition and relate the two faces of
topology to each other. First results of a cluster analysis are described.
Comment: 6 pages, 8 figures, LaTeX with espcrc2.sty. Talk and poster presented
at Lattice97, Edinburgh, 22-26 July 1997, to appear in Nucl. Phys. B
(Proc. Suppl.)
- …