Modelling linguistic taxonomic dynamics
This paper presents the results of the application of a bit-string model of
languages (Schulze and Stauffer 2005) to problems of taxonomic patterns. The
questions addressed include the following: (1) Which parameters are minimally
needed for the development of a taxonomic dynamics leading to the type of
distribution of language family sizes currently attested (as measured in the
number of languages per family), which appears to be a power-law? (2) How may
such a model be coupled with one of the dynamics of speaker populations leading
to the type of language size seen today, which appears to follow a log-normal
distribution?
Comment: 18 pages including 9 figures
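The first question can be illustrated with a minimal sketch. The toy simulation below is not the Schulze-Stauffer bit-string model itself, only a hypothetical Yule-type branching process showing how a heavy-tailed family-size distribution can emerge: each new language either branches from a randomly chosen existing language (joining its family) or, with small probability, founds a new family. Function name and parameter values are illustrative.

```python
import random

def family_sizes(n_languages=2000, p_new_family=0.05, seed=3):
    """Toy Yule-type branching sketch (NOT the Schulze-Stauffer model):
    grow a set of language families one language at a time."""
    rng = random.Random(seed)
    families = [1]  # start with one family containing one language
    for _ in range(n_languages - 1):
        if rng.random() < p_new_family:
            families.append(1)  # a brand-new family is founded
        else:
            # choose an existing language uniformly at random, so a family
            # is chosen with probability proportional to its current size
            r = rng.randrange(sum(families))
            acc = 0
            for idx, size in enumerate(families):
                acc += size
                if r < acc:
                    families[idx] += 1
                    break
    return families
```

Because large families are proportionally more likely to gain new members, the size distribution develops a heavy, power-law-like tail, which is the qualitative pattern the abstract refers to.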
Dynamical systems theory for music dynamics
We show that, when music pieces are cast in the form of time series of pitch
variations, the concepts and tools of dynamical systems theory can be applied
to the analysis of {\it temporal dynamics} in music. (i) Phase space portraits
are constructed from the time series wherefrom the dimensionality is evaluated
as a measure of the {\it global} dynamics of each piece. (ii) Spectral
analysis of the time series yields power spectra close to
{\it red noise} in the low frequency range. (iii) We define an
information entropy which provides a measure of the {\it local} dynamics in
the musical piece; the entropy can be interpreted as an evaluation of the
degree of {\it complexity} in the music, but there is no evidence of an
analytical relation between local and global dynamics. These findings are based
on computations performed on eighty sequences sampled in the music literature
from the 18th to the 20th century.Comment: To appear in CHAOS. Figures and Tables (not included) can be obtained
from [email protected]
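Two of the quantities mentioned, delay-embedded phase-space points and an information entropy over pitches, have short generic reconstructions. The sketch below follows the standard textbook definitions, not necessarily the authors' exact procedure; the function names are ours.

```python
import math

def delay_embed(series, dim, tau):
    """Time-delay embedding: map a scalar time series into dim-dimensional
    phase-space points (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    n = len(series) - (dim - 1) * tau
    return [tuple(series[i + k * tau] for k in range(dim)) for i in range(n)]

def pitch_entropy(pitches):
    """Shannon entropy (in bits) of the empirical pitch distribution,
    a rough proxy for the 'local dynamics' measure described above."""
    counts = {}
    for p in pitches:
        counts[p] = counts.get(p, 0) + 1
    n = len(pitches)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A uniform pitch distribution maximises the entropy (a monotone scale would score as maximally "complex" locally), which is one reason the abstract distinguishes this local measure from the global, phase-space-based one.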
Social interaction as a heuristic for combinatorial optimization problems
We investigate the performance of a variant of Axelrod's model for
dissemination of culture - the Adaptive Culture Heuristic (ACH) - on solving an
NP-Complete optimization problem, namely, the classification of binary input
patterns of size by a Boolean Binary Perceptron. In this heuristic,
agents, characterized by binary strings of length which represent possible
solutions to the optimization problem, are fixed at the sites of a square
lattice and interact with their nearest neighbors only. The interactions are
such that the agents' strings (or cultures) become more similar to the low-cost
strings of their neighbors resulting in the dissemination of these strings
across the lattice. Eventually the dynamics freezes into a homogeneous
absorbing configuration in which all agents exhibit identical solutions to the
optimization problem. We find through extensive simulations that the
probability of finding the optimal solution is a function of the reduced
variable so that the number of agents must increase with the fourth
power of the problem size to guarantee a fixed probability
of success. In this case, we find that the relaxation time to reach an
absorbing configuration scales with which can be interpreted as the
overall computational cost of the ACH to find an optimal set of weights for a
Boolean Binary Perceptron, given a fixed probability of success.
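The imitation dynamics described above can be sketched in a few lines. This is a minimal toy version: the cost function here is the Hamming distance to a hidden random target string, an illustrative stand-in for the Boolean-perceptron classification cost used in the paper, and all names and parameters are ours.

```python
import random

def adaptive_culture_heuristic(L=6, F=8, sweeps=300, seed=1):
    """Toy ACH on an L x L periodic square lattice of agents, each holding
    a binary string of length F (a candidate solution).  ASSUMPTION: cost
    is Hamming distance to a hidden target, not the perceptron cost."""
    rng = random.Random(seed)
    target = [rng.randrange(2) for _ in range(F)]

    def cost(s):
        return sum(a != b for a, b in zip(s, target))

    lattice = [[[rng.randrange(2) for _ in range(F)] for _ in range(L)]
               for _ in range(L)]
    initial_best = min(cost(lattice[i][j]) for i in range(L) for j in range(L))
    for _ in range(sweeps):
        for i in range(L):
            for j in range(L):
                # pick a random nearest neighbour on the periodic lattice
                ni, nj = rng.choice([((i + 1) % L, j), ((i - 1) % L, j),
                                     (i, (j + 1) % L), (i, (j - 1) % L)])
                me, nb = lattice[i][j], lattice[ni][nj]
                if cost(nb) < cost(me):
                    # imitate: copy one randomly chosen differing bit
                    diff = [k for k in range(F) if me[k] != nb[k]]
                    k = rng.choice(diff)
                    me[k] = nb[k]
    final_best = min(cost(lattice[i][j]) for i in range(L) for j in range(L))
    return initial_best, final_best
```

Since an agent only imitates strictly lower-cost neighbours, the best cost on the lattice never increases, and the dynamics eventually freezes into the homogeneous absorbing configuration described in the abstract.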
Growth Kinetics in Systems with Local Symmetry
The phase transition kinetics of Ising gauge models are investigated. Despite
the absence of a local order parameter, relevant topological excitations that
control the ordering kinetics can be identified. Dynamical scaling holds in the
approach to equilibrium, and the growth of the typical length scale is
characteristic of a new universality class. We suggest that the asymptotic kinetics of the 2D Ising gauge
model is dual to that of the 2D annihilating random walks, a process also known
as the diffusion-reaction process.
Comment: 10 pages in Tex, 2 Postscript figures appended, NSF-ITP-93-4
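The annihilating-random-walk process invoked in the duality argument is easy to simulate. The sketch below is a 1D illustrative version on a ring (the paper's setting is 2D): every particle hops to a random neighbouring site each step, and any site receiving two particles loses both.

```python
import random

def annihilating_rw(L=1000, density=0.5, steps=200, seed=0):
    """Minimal 1D annihilating random walk on a ring of L sites.
    Returns the particle count after each step (index 0 = initial count).
    Illustrative only; the Ising-gauge duality discussed above is 2D."""
    rng = random.Random(seed)
    occ = [rng.random() < density for _ in range(L)]
    counts = [sum(occ)]
    for _ in range(steps):
        arrivals = [0] * L
        for i in range(L):
            if occ[i]:
                j = (i + rng.choice((-1, 1))) % L  # hop left or right
                arrivals[j] += 1
        # at most two particles can land on a site; a pair annihilates
        occ = [c % 2 == 1 for c in arrivals]
        counts.append(sum(occ))
    return counts
```

The particle count is non-increasing and its parity is conserved, and the density decay of this process is what sets the coarsening law in the dual picture.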
Minority Game of price promotions in fast moving consumer goods markets
A variation of the Minority Game has been applied to study the timing of
promotional actions at retailers in the fast moving consumer goods market. The
underlying hypotheses for this work are that price promotions are more
effective when fewer than average competitors do a promotion, and that a
promotion strategy can be based on past sales data. The first assumption has
been checked by analysing 1467 promotional actions for three products on the
Dutch market (ketchup, mayonnaise and curry sauce) over a 120-week period, both
on an aggregated level and on retailer chain level.
The second assumption was tested by analysing past sales data with the
Minority Game. This revealed that high or low competitor promotional pressure
for the actual ketchup, mayonnaise, curry sauce and barbecue sauce markets is to
some extent predictable up to a forecast horizon of about 10 weeks. Whereas a random
guess would be right 50% of the time, a single-agent game can predict the
market with a success rate of 56% for a 6 to 9 week forecast. This number is
the same for all four mentioned fast moving consumer markets. For a multi-agent
game a larger variability in the success rate is obtained, but predictability
can be as high as 65%.
Contrary to expectation, the actual market does the opposite of what game
theory would predict. This points at a systematic oscillation in the market.
Even though this result is not fully understood, merely observing that this
trend is present in the data could lead to exploitable trading benefits. As a
check, random history strings were generated from which the statistical
variation in the game prediction was studied. This shows that the odds are
1:1,000,000 that the observed pattern in the market is based on coincidence.
Comment: 19 pages, 10 figures, accepted for publication in Physica
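The single-agent game used for prediction above can be sketched with the standard Minority Game ingredients: agents hold fixed random strategies (lookup tables from the recent minority history to an action), play the strategy with the best track record, and the minority side wins each round. This is a generic textbook implementation, not the authors' calibrated model; all parameters are illustrative.

```python
import random

def minority_game(n_agents=101, memory=3, strategies=2, rounds=200, seed=0):
    """Basic Minority Game: returns the attendance (number of agents
    choosing action 1) in each round."""
    rng = random.Random(seed)
    n_hist = 2 ** memory
    # each strategy is a lookup table: history index -> action in {0, 1}
    agents = [[[rng.randrange(2) for _ in range(n_hist)]
               for _ in range(strategies)] for _ in range(n_agents)]
    scores = [[0] * strategies for _ in range(n_agents)]
    history = rng.randrange(n_hist)
    attendance = []
    for _ in range(rounds):
        actions = []
        for a in range(n_agents):
            best = max(range(strategies), key=lambda s: scores[a][s])
            actions.append(agents[a][best][history])
        total = sum(actions)                      # how many chose action 1
        minority = 0 if total > n_agents / 2 else 1
        # reward every strategy that would have picked the minority side
        for a in range(n_agents):
            for s in range(strategies):
                if agents[a][s][history] == minority:
                    scores[a][s] += 1
        history = ((history << 1) | minority) % n_hist
        attendance.append(total)
    return attendance
```

In the application described above, the "attendance" would be the number of competitors running a promotion in a given week, and the prediction is which side of the average that number falls on.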
Von Neumann Normalisation of a Quantum Random Number Generator
In this paper we study von Neumann un-biasing normalisation for ideal and
real quantum random number generators, operating on finite strings or infinite
bit sequences. In the ideal cases one can obtain the desired un-biasing. This
relies critically on the independence of the source, a notion we rigorously
define for our model. In real cases, affected by imperfections in measurement
and hardware, one cannot achieve a true un-biasing, but, if the bias "drifts
sufficiently slowly", the result can be arbitrarily close to un-biasing. For
infinite sequences, normalisation can either increase or decrease the
(algorithmic) randomness of the generated sequences. A successful application
of von Neumann normalisation---in fact, any un-biasing transformation---does
exactly what it promises, un-biasing, one (among infinitely many) symptoms of
randomness; it will not produce "true" randomness.
Comment: 27 pages, 2 figures. Updated to published version
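The classical transformation studied here has a very short implementation. The sketch below is the standard von Neumann procedure for finite bit strings; as the abstract stresses, its un-biasing guarantee rests entirely on the independence of successive source bits.

```python
def von_neumann_extract(bits):
    """Von Neumann un-biasing: scan non-overlapping pairs of input bits;
    emit 0 for the pair (0, 1), 1 for (1, 0), and discard (0, 0) and
    (1, 1).  For independent identically biased bits, both surviving
    pairs have the same probability, so the output is exactly unbiased,
    at the price of a variable (on average shorter) output length."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out
```

If the bias drifts slowly rather than being constant, the two surviving pairs are only approximately equiprobable, which is the "arbitrarily close to un-biasing" regime described above.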