How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems
The maximum entropy principle (MEP) is a method for obtaining the most likely
distribution functions of observables from statistical systems, by maximizing
entropy under constraints. The MEP has found hundreds of applications in
ergodic and Markovian systems in statistical mechanics, information theory, and
statistics. For several decades there has been an ongoing controversy over
whether the notion of the maximum entropy principle can be extended in a
meaningful way to non-extensive, non-ergodic, and complex statistical systems
and processes. In
this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related
to multiplicities of independent random processes. We then show how the
relaxation of independence naturally leads to the most general entropies that
are compatible with the first three Shannon-Khinchin axioms, the
(c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept
for non-ergodic and complex statistical systems if their relative entropy can
be factored into a generalized multiplicity and a constraint term. The problem
of finding such a factorization reduces to finding an appropriate
representation of relative entropy in a linear basis. In a particular example
we show that path-dependent random processes with memory naturally require
specific generalized entropies. The example is the first exact derivation of a
generalized entropy from the microscopic properties of a path-dependent random
process.
Comment: 6 pages, 1 figure. To appear in PNA
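As a minimal illustration of the MEP in the simple ergodic case, the following sketch maximizes Shannon entropy over a three-state system subject to a mean-energy constraint. The Lagrange-multiplier solution p_i ∝ exp(-λE_i) is recovered by solving for λ numerically; the energies and constraint value are illustrative, not taken from the paper.

```python
import math

def max_entropy_distribution(energies, mean_energy, tol=1e-12):
    """Maximize Shannon entropy subject to a fixed mean energy.

    The Lagrange-multiplier solution is p_i proportional to exp(-lam*E_i);
    we solve for lam by bisection so that <E> matches the constraint.
    """
    def mean_for(lam):
        w = [math.exp(-lam * e) for e in energies]
        z = sum(w)
        return sum(e * wi for e, wi in zip(energies, w)) / z

    lo, hi = -50.0, 50.0
    # mean_for is decreasing in lam, so bisection brackets the root
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > mean_energy:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w]

# illustrative three-state system with <E> constrained to 0.5
p = max_entropy_distribution([0.0, 1.0, 2.0], mean_energy=0.5)
```

For equally spaced energies the resulting probabilities form a geometric sequence, which is exactly the Boltzmann form the multiplier method predicts.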
Generalized entropies and the transformation group of superstatistics
Superstatistics describes statistical systems that behave like superpositions
of different inverse temperatures β, so that the probability distribution
is p(ε) ∝ ∫₀^∞ f(β) e^(-βε) dβ, where the `kernel' f(β) is nonnegative and
normalized (∫₀^∞ f(β) dβ = 1). We discuss the relation between this
distribution and the generalized entropic form S = Σ_i s(p_i). The first three
Shannon-Khinchin axioms are assumed to hold. It then turns out that for a given
distribution there are two different ways to construct the entropy. One
approach uses escort probabilities and the other does not; the question of
which to use must be decided empirically. The two approaches are related by a
duality. The thermodynamic properties of the system can be quite different for
the two approaches. In that connection we present the transformation laws for
the superstatistical distributions under macroscopic state changes. The
transformation group is the Euclidean group in one dimension.
Comment: 5 pages, no figures
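The superposition integral can be checked numerically. The sketch below assumes a Gamma-distributed kernel f(β), a standard choice in superstatistics (not specified in this abstract), for which the integral has the closed form (1 + bε)^(-c):

```python
import math

def gamma_kernel(beta, b, c):
    # Gamma distribution f(beta) with scale b and shape c
    return beta ** (c - 1) * math.exp(-beta / b) / (b ** c * math.gamma(c))

def superstat_prob(eps, b, c, upper=40.0, n=200_000):
    # numerically integrate  p(eps) = int_0^inf f(beta) exp(-beta*eps) dbeta
    # (trapezoid rule; the integrand vanishes at both endpoints here)
    h = upper / n
    total = 0.0
    for i in range(1, n):
        beta = i * h
        total += gamma_kernel(beta, b, c) * math.exp(-beta * eps)
    return total * h

b, c = 0.5, 4.0                      # illustrative kernel parameters
closed = (1 + b * 1.3) ** (-c)       # Laplace transform of the Gamma kernel
numeric = superstat_prob(1.3, b, c)
```

The closed form is the well-known Laplace transform of the Gamma density; the numerical integral should agree with it to within the quadrature error.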
Parkinson's Law Quantified: Three Investigations on Bureaucratic Inefficiency
We recast three famous descriptive essays of C. N. Parkinson on
bureaucratic inefficiency in a quantifiable and dynamical socio-physical
framework. In the first model we show how recent opinion-formation models
for small groups can be used to understand Parkinson's observation that
decision making bodies such as cabinets or boards become highly inefficient
once their size exceeds a critical 'Coefficient of Inefficiency', typically
around 20. A second observation of Parkinson - which is sometimes referred to
as Parkinson's Law - is that the growth of a bureaucratic or administrative
body usually goes hand in hand with a drastic decrease of its overall
efficiency. In our second model we view a bureaucratic body as a system with a
flow of workers, who enter, are promoted to various internal levels within
the system over time, and leave the system after having served for a certain
time. Promotion is usually associated with an increase in the number of
subordinates. Within
the proposed model it becomes possible to work out a phase diagram showing
under which conditions bureaucratic growth can be contained. In our last model we
assign individual efficiency curves to workers throughout their life in
administration, and compute the optimum time to send them to old-age pension,
in order to ensure maximum efficiency within the body - in Parkinson's
words, we compute the 'Pension Point'.
Comment: 15 pages, 5 figures
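A 'Pension Point' calculation of this flavor can be sketched as follows. The single-worker efficiency curve t·exp(-t/τ) and the assumption of a steady-state body with service ages uniform up to the retirement age are illustrative stand-ins, not the paper's model:

```python
import math

TAU = 10.0  # hypothetical time scale of the efficiency curve (years)

def efficiency(t):
    # illustrative single-worker efficiency: rises, peaks, then declines
    return t * math.exp(-t / TAU)

def body_efficiency(retire_at, n=1000):
    # steady-state mean efficiency of a body whose members retire at age
    # retire_at, with service ages uniform on [0, retire_at];
    # simple midpoint integration of the efficiency curve
    h = retire_at / n
    return sum(efficiency((i + 0.5) * h) for i in range(n)) * h / retire_at

# scan candidate pension points and pick the maximizer
candidates = [r / 10 for r in range(10, 601)]
pension_point = max(candidates, key=body_efficiency)
```

Retiring workers too early discards their productive peak, while retiring too late dilutes the body with the declining tail of the curve; the optimum sits between the two regimes.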
The Effect of Active Video Games on the Heart Rate of Older Adults
Background: Heart rate is used as a health biomarker. The aim of this study was to investigate the effect of playing active video games on the heart rate of older adults, in comparison to the heart rate after a common table recreational activity.
Methods: An experimental study with 40 participants was conducted: a control group (n=20) participated in common Pokeno® card games; an experimental group (n=20) played Wii™ bowling. The participants' pre- and post-activity heart rates were measured and compared between and within groups using t-tests.
Results: The findings showed an 11.9% increase (p
Conclusions: The inclusion of active video games in older adults' recreational activities can increase their daily activity level and bring long-term health benefits.
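The between-group comparison can be reproduced in outline with a hand-rolled Welch t statistic. The heart-rate numbers below are invented for illustration and are not the study's data:

```python
from statistics import mean, variance

def welch_t(a, b):
    # Welch's two-sample t statistic (does not assume equal variances)
    va, vb = variance(a), variance(b)   # sample variances
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / ((va / na + vb / nb) ** 0.5)

# hypothetical post-activity heart rates in bpm (NOT the study's data)
wii = [88, 92, 95, 90, 93, 91]
cards = [80, 82, 79, 83, 81, 80]
t = welch_t(wii, cards)
```

A large positive t here would point in the same direction as the reported increase; the actual study would also need the degrees of freedom and a p-value for a complete test.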
Schumpeterian economic dynamics as a quantifiable minimum model of evolution
We propose a simple quantitative model of Schumpeterian economic dynamics.
New goods and services are endogenously produced through combinations of
existing goods. As soon as new goods enter the market they may compete against
already existing goods; in other words, new products can have destructive
effects on existing goods. As a result of this competition mechanism, existing
goods may be driven out from the market - often causing cascades of secondary
defects (Schumpeterian gales of destruction). The model leads to a generic
dynamics characterized by phases of relative economic stability followed by
phases of massive restructuring of markets - which could be interpreted as
Schumpeterian business `cycles'. Model time series of product diversity and
productivity reproduce several stylized facts of economic time series on long
timescales, such as GDP or business failures, including non-Gaussian fat-tailed
distributions, volatility clustering, etc. The model is phrased in an open,
non-equilibrium setup which can be understood as a self-organized critical
system. Its diversity dynamics can be understood through the time-varying
topology of the active production networks.
Comment: 21 pages, 11 figures
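A minimal sketch of such production/destruction dynamics, with random rules standing in for the paper's combination and suppression mechanisms (the rule counts, number of goods, and initial market are all illustrative assumptions):

```python
import random

def simulate(n_goods=100, n_prod=300, n_destr=150, steps=200, seed=1):
    rng = random.Random(seed)
    # random production rules (i, j) -> k: goods i and j combine into k
    prod = [(rng.randrange(n_goods), rng.randrange(n_goods),
             rng.randrange(n_goods)) for _ in range(n_prod)]
    # random destruction rules (k, d): an active good k suppresses good d
    destr = [(rng.randrange(n_goods), rng.randrange(n_goods))
             for _ in range(n_destr)]
    active = set(range(10))            # small initial market
    diversity = []
    for _ in range(steps):
        # endogenous production: combinations of existing goods
        new = {k for i, j, k in prod if i in active and j in active}
        active |= new
        # destructive competition: active goods drive others out
        dead = {d for k, d in destr if k in active}
        active -= dead
        diversity.append(len(active))
    return diversity

div = simulate()
```

Even this toy version shows the interplay the abstract describes: production grows the active set, destruction prunes it, and the diversity time series alternates between quiet stretches and restructuring events.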
Is the Tsallis entropy stable?
The question of whether the Tsallis entropy is Lesche-stable is revisited. It
is argued that when physical averages are computed with the escort
probabilities, the correct application of the concept of Lesche-stability
requires use of the escort probabilities. As a consequence, as shown here, the
Tsallis entropy is unstable but the thermodynamic averages are stable. We
further show that Lesche stability as well as thermodynamic stability can be
obtained if the homogeneous entropy is used as the basis of the formulation of
non-extensive thermodynamics. In this approach, the escort distribution arises
naturally as a secondary structure.
Comment: 6 pages
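The objects under discussion are easy to state concretely. A minimal sketch of the Tsallis entropy S_q and the escort distribution, using the standard textbook definitions (not anything specific to this paper's argument):

```python
import math

def tsallis_entropy(p, q):
    # S_q = (1 - sum_i p_i^q) / (q - 1); recovers Shannon entropy as q -> 1
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def escort(p, q):
    # escort distribution P_i = p_i^q / sum_j p_j^q
    z = sum(pi ** q for pi in p)
    return [pi ** q / z for pi in p]

p = [0.5, 0.25, 0.25]                       # illustrative distribution
shannon = -sum(pi * math.log(pi) for pi in p)  # q -> 1 limit, in nats
```

For q > 1 the escort distribution sharpens the original one toward its most probable states, which is why averages taken with escorts can behave so differently from ordinary ones.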
The phase transition in random catalytic sets
The notion of (auto)catalytic networks has become a cornerstone in
understanding the possibility of a sudden dramatic increase of diversity in
biological evolution, as well as in the evolution of social and economic
systems. Here we study random catalytic networks with respect to the final
outcome diversity of products. We show that an analytical treatment of this
longstanding problem is possible by mapping the problem onto a set of
non-linear recurrence equations. The solution of these equations shows a crucial
dependence of the final number of products on the initial number of products
and the density of catalytic production rules. For a fixed density of rules we
can demonstrate the existence of a phase transition from a practically
unpopulated regime to a fully populated and diverse one. The order parameter is
the number of final products. We are able to further understand the origin of
this phase transition as a crossover from one set of solutions of a quadratic
equation to the other.
Comment: 7 pages, ugly eps files due to arxiv restriction
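The final-diversity computation can be sketched as an iteration to a fixed point. The random rule set, universe size, and seed products below are illustrative, not the paper's parameters:

```python
import random

def final_diversity(n, rules, initial):
    """Iterate the catalytic map to its fixed point.

    rules: set of (i, j, k) triples meaning substrates i and j
    catalytically produce product k; initial: the starting product set.
    """
    products = set(initial)
    while True:
        new = {k for i, j, k in rules
               if i in products and j in products} - products
        if not new:               # fixed point reached: no new products
            return products
        products |= new

rng = random.Random(7)
n = 50                            # size of the product universe
# a fairly dense random rule set (high density of catalytic rules)
dense = {(rng.randrange(n), rng.randrange(n), rng.randrange(n))
         for _ in range(400)}
seed_set = set(range(5))          # small initial number of products
out = final_diversity(n, dense, seed_set)
```

Sweeping the rule density (here fixed at 400 rules) and measuring len(out) is the numerical analogue of the order parameter in the abstract: below a critical density the closure barely grows beyond the seed set, above it the universe becomes almost fully populated.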
Generalized information entropies depending only on the probability distribution
Systems with a long-term stationary state that possess a spatio-temporally
fluctuating quantity β can be described by a superposition of several
statistics, a "superstatistics". We consider first the Gamma, log-normal and
F-distributions of β. It is assumed that they depend only on p_i, the
probability associated with the microscopic configuration of the system. For
each of the three distributions we calculate the Boltzmann factors B(E) and
show that they coincide for small variance of the fluctuations. For the Gamma
distribution it is possible to calculate the entropy in a closed form,
depending on p_i, and to obtain then an equation relating β with p_i. We also
propose, as other examples, new entropies closely related to the Kaniadakis
entropy and two possible Sharma-Mittal entropies. The entropies presented in
this work do not depend on a constant parameter but on p_i. For the Gamma
distribution, its corresponding Boltzmann factor, and the associated entropy,
we show the validity of the saddle-point approximation. We also briefly
discuss the generalization of one of the four Khinchin axioms to obtain this
proposed entropy.
Comment: 13 pages, 3 figures
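For the Gamma case the generalized Boltzmann factor has a closed form, and the small-variance coincidence with the ordinary factor e^(-β₀E) can be checked directly (the parameter values are illustrative):

```python
import math

def gamma_boltzmann_factor(E, beta0, var):
    # Laplace transform of a Gamma kernel with mean beta0 and variance var:
    # B(E) = (1 + b*E)^(-c),  with  b = var/beta0  and  c = beta0**2/var
    b = var / beta0
    c = beta0 ** 2 / var
    return (1 + b * E) ** (-c)

beta0, E = 2.0, 1.5
wide = gamma_boltzmann_factor(E, beta0, var=1.0)     # broad fluctuations
narrow = gamma_boltzmann_factor(E, beta0, var=1e-6)  # tiny fluctuations
ordinary = math.exp(-beta0 * E)                      # no fluctuations
```

As the variance of the β-fluctuations shrinks, (1 + bE)^(-c) → e^(-β₀E), which is exactly the coincidence of Boltzmann factors claimed in the abstract.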
Peer-review in a world with rational scientists: Toward selection of the average
One of the virtues of peer review is that it provides a self-regulating
selection mechanism for scientific work, papers and projects. Peer review as a
selection mechanism is hard to evaluate in terms of its efficiency. Serious
efforts to understand its strengths and weaknesses have not yet led to clear
answers. In theory peer review works if the involved parties (editors and
referees) conform to a set of requirements, such as a love for high-quality
science, objectivity, and the absence of biases, nepotism, friend and clique
networks, selfishness, etc. If these requirements are violated, what is the
effect on the selection of high quality work? We study this question with a
simple agent-based model. In particular we are interested in the effects of
rational referees, who may have no incentive to see any high-quality work
published or promoted other than their own. We find that a small fraction of
incorrect (selfish or rational) referees can drastically reduce the quality of
the published (accepted) scientific standard. We quantify the fraction for
which peer review will no longer select better than pure chance. Decline of
quality of accepted scientific work is shown as a function of the fraction of
rational and unqualified referees. We show how a simple quality-increasing
policy of, e.g., a journal can lead to a loss in overall scientific quality, and
how mutual support networks of authors and referees degrade the system.
Comment: 5 pages, 4 figures
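A toy version of such an agent-based model, with uniform paper qualities and coin-flipping "rational" referees standing in for the paper's setup (the acceptance threshold, quality distribution, and referee behavior are all assumptions):

```python
import random

def review_round(n_papers, frac_rational, threshold=0.5, seed=3):
    """Toy peer-review model: papers have uniform quality in [0, 1].

    A 'correct' referee accepts iff quality > threshold; a 'rational'
    referee ignores quality and decides by coin flip (pure chance).
    Returns the mean quality of the accepted papers.
    """
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_papers):
        q = rng.random()
        if rng.random() < frac_rational:
            ok = rng.random() < 0.5     # rational referee: random verdict
        else:
            ok = q > threshold          # correct referee: quality check
        if ok:
            accepted.append(q)
    return sum(accepted) / len(accepted)

honest = review_round(20_000, frac_rational=0.0)
broken = review_round(20_000, frac_rational=1.0)
```

With only correct referees, the accepted pool is quality-selected (mean well above the threshold); with only rational referees, acceptance is pure chance and the accepted mean collapses to the population mean, which is the no-selection baseline the abstract quantifies.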