The Apparent Fractal Conjecture
This short communication advances the hypothesis that the observed fractal
structure of the large-scale distribution of galaxies is due to a geometrical
effect, which arises when observational quantities relevant for the
characterization of a cosmological fractal structure are calculated along the
past light cone. If this hypothesis proves, even partially, correct, most, if
not all, objections raised against fractals in cosmology may be solved. For
instance, under this view the standard cosmology has zero average density, as
predicted by an infinite fractal structure, with, at the same time, the
cosmological principle remaining valid. The theoretical results which suggest
this conjecture are reviewed, as well as possible ways of checking its
validity.
Comment: 6 pages, LaTeX. Text unchanged. Two references corrected. Contributed
paper presented at the "South Africa Relativistic Cosmology Conference in
Honour of George F. R. Ellis 60th Birthday"; University of Cape Town,
February 1-5, 1999
The Quest for Bandwidth Estimation Techniques for large-scale Distributed Systems
In recent years the research community has developed many techniques to estimate the end-to-end available bandwidth of an Internet path. This important metric has been proposed for use in several distributed systems and, more recently, has even been considered to improve the congestion control mechanism of TCP. Thus, it has been suggested that some existing estimation techniques could be used for this purpose. However, existing tools were not designed for large-scale deployments and were mostly validated in controlled settings, considering only one measurement running at a time. In this paper, we argue that current tools, while offering good estimates when used alone, might not work in large-scale systems where several estimations severely interfere with each other. We analyze the properties of the measurement paradigms employed today, discuss their functioning, study their overhead and analyze their interference. Our testbed results show that current techniques are insufficient as they stand. Finally, we discuss and propose some principles that should be taken into account when including available bandwidth measurements in large-scale distributed systems.
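The packet-pair paradigm underlying many of the estimation techniques surveyed above can be sketched in a few lines. This is an illustrative toy, not any of the actual tools discussed: two packets sent back-to-back are spread apart by the bottleneck link, and the receive-side dispersion reveals the bottleneck capacity.

```python
# Minimal sketch of packet-pair capacity estimation (illustrative only).
# Two packets sent back-to-back leave the bottleneck link separated by the
# time it takes to transmit one packet, so the receive-side gap encodes
# the bottleneck capacity.
def packet_pair_capacity(packet_size_bytes, recv_gap_seconds):
    """Estimate bottleneck capacity in bits per second from the
    inter-arrival gap (dispersion) of a back-to-back packet pair."""
    if recv_gap_seconds <= 0:
        raise ValueError("dispersion must be positive")
    return packet_size_bytes * 8 / recv_gap_seconds
```

For example, 1500-byte packets arriving 120 microseconds apart imply a 100 Mbit/s bottleneck. Real tools repeat the measurement many times and filter the samples, which is exactly where the cross-traffic and mutual-interference issues raised in the paper come into play.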
Dirac points merging and wandering in a model Chern insulator
We present a model for a Chern insulator on the square lattice with complex
first- and second-neighbor hoppings and a sublattice potential, which displays
unexpectedly rich physics. Similarly to the celebrated Haldane model, the
proposed Chern insulator has two topologically non-trivial phases with nonzero
Chern numbers. As a distinctive feature of the present model, phase
transitions are associated with Dirac points that can move, merge and split in
momentum space, at odds with Haldane's Chern insulator, where Dirac points are
bound to the corners of the hexagonal Brillouin zone. Additionally, the
obtained phase diagram reveals a peculiar phase-transition line between two
distinct topological phases, in contrast to the Haldane model, where such a
transition reduces to a single point at zero sublattice potential. The model is
amenable to simulation in optical lattices, facilitating the study of phase
transitions between two distinct topological phases and the experimental
analysis of Dirac points merging and wandering.
Mesonic states in the generalised Nambu-Jona-Lasinio theories
For any Nambu-Jona-Lasinio model of QCD with arbitrary nonlocal,
instantaneous, quark current-current confining kernels, we use a generalised
Bogoliubov technique to go beyond BCS level (in the large-Nc limit) so as to
explicitly build quark-antiquark compound operators for creating/annihilating
mesons. In the Hamiltonian approach, the mesonic bound-state equations appear
(from the generalised Bogoliubov transformation) as mass-gap-like equations
which, in turn, ensure the absence, in the Hamiltonian, of mesonic Bogoliubov
anomalous terms. We go further to demonstrate the one-to-one correspondence
between Hamiltonian and Bethe-Salpeter approaches to non-local NJL-type models
for QCD and give the corresponding "dictionary" necessary to "translate" the
amplitudes built using the graphical Feynman rules to the terms of the
Hamiltonian, and vice versa. We comment on the problem of the existence of
multiple vacua in such models and argue that mesonic states in the theory
should carry an extra index: the index of the replica in which they are
created. The completely diagonalised Hamiltonian should then contain a sum
over this new index. The method is proved to be general and valid for any
instantaneous quark kernel.
Comment: LaTeX2e, uses aipproc class, Talk given at the conference "Quark
Confinement and the Hadron Spectrum VI", 21-25 September 2004, Sardinia,
Italy, to appear in the Proceedings
Quantum field theory approach to the vacuum replica in QCD
Quantum field theory is used to describe the contribution of possible new QCD
vacuum replica to hadronic processes. This new sigma-like state has recently
been shown to be likely to appear for any realistic four-quark interaction
kernel as a consequence of chiral symmetry. The local operator creating the
replica vacuum state is constructed explicitly. Applications to physical
processes are outlined.
Comment: LaTeX2e, 2 EPS figures, uses ws-procs9x6 (included) and epsfig
classes, Talk given at the conference "Quark Confinement and the Hadron
Spectrum V", 10-14 September 2002, Gargnano, Italy, to appear in the
Proceedings
History of art paintings through the lens of entropy and complexity
Art is the ultimate expression of human creativity that is deeply influenced
by the philosophy and culture of the corresponding historical epoch. The
quantitative analysis of art is therefore essential for better understanding
human cultural evolution. Here we present a large-scale quantitative analysis
of almost 140 thousand paintings, spanning nearly a millennium of art history.
Based on the local spatial patterns in the images of these paintings, we
estimate the permutation entropy and the statistical complexity of each
painting. These measures map the degree of visual order of artworks into a
scale of order-disorder and simplicity-complexity that locally reflects
qualitative categories proposed by art historians. The dynamical behavior of
these measures reveals a clear temporal evolution of art, marked by transitions
that agree with the main historical periods of art. Our research shows that
different artistic styles have a distinct average degree of entropy and
complexity, thus allowing a hierarchical organization and clustering of styles
according to these metrics. We have further verified that the identified groups
correspond well with the textual content used to qualitatively describe the
styles, and that the employed complexity-entropy measures can be used for an
effective classification of artworks.
Comment: 10 two-column pages, 5 figures; accepted for publication in PNAS
[supplementary information available at
http://www.pnas.org/highwire/filestream/824089/field_highwire_adjunct_files/0/pnas.1800083115.sapp.pdf]
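The complexity-entropy measures described in the abstract above follow the Bandt-Pompe ordinal approach applied to images. A minimal sketch (not the authors' code; the patch size and the unnormalized Jensen-Shannon disequilibrium are simplifying assumptions) ranks the pixels of sliding 2x2 patches, builds the distribution of the resulting ordinal patterns, and computes the normalized permutation entropy H and a complexity C = H * D, where D measures the distance of that distribution from the uniform one:

```python
# Sketch of the complexity-entropy plane for images (assumed 2x2 patches).
import itertools
import math

def ordinal_distribution(image):
    """Probability of each of the 24 ordinal patterns over sliding 2x2
    patches of a 2D grayscale image (list of lists of values)."""
    patterns = {p: 0 for p in itertools.permutations(range(4))}
    rows, cols = len(image), len(image[0])
    total = 0
    for i in range(rows - 1):
        for j in range(cols - 1):
            patch = (image[i][j], image[i][j + 1],
                     image[i + 1][j], image[i + 1][j + 1])
            # argsort of the patch gives its ordinal pattern
            pattern = tuple(sorted(range(4), key=lambda k: patch[k]))
            patterns[pattern] += 1
            total += 1
    return [count / total for count in patterns.values()]

def entropy_complexity(probs):
    """Normalized permutation entropy H and a disequilibrium-based
    complexity C = H * D, with D the (unnormalized) Jensen-Shannon
    divergence between `probs` and the uniform distribution."""
    n = len(probs)
    def shannon(ps):
        return -sum(p * math.log(p) for p in ps if p > 0)
    h = shannon(probs) / math.log(n)
    uniform = [1.0 / n] * n
    mixed = [(p + u) / 2 for p, u in zip(probs, uniform)]
    d = shannon(mixed) - (shannon(probs) + shannon(uniform)) / 2
    return h, h * d
```

A perfectly ordered image (e.g. a monotone gradient, where every patch has the same ordinal pattern) yields H = 0 and C = 0, while visually disordered images move toward H = 1; styles then cluster by where their artworks fall in the (H, C) plane.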
The Advantage of Playing Home in NBA: Microscopic, Team-Specific and Evolving Features
The idea that the success rate of a team increases when playing home is
broadly accepted and documented for a wide variety of sports. Investigations of
the so-called home-advantage phenomenon date back to the 1970s and have ever
since attracted the attention of scholars and sports enthusiasts. These studies
have been mainly focused on identifying the phenomenon and trying to correlate
it with external factors such as crowd noise and referee bias. Much less is
known about the effects of home advantage in the microscopic dynamics of the
game (within the game), or about possible team-specific and evolving features
of this phenomenon. Here we present a detailed study of these features in the
National Basketball Association (NBA). By analyzing play-by-play events of more
than sixteen thousand games that span thirteen NBA seasons, we have found that
home advantage affects the microscopic dynamics of the game by increasing the
scoring rates and decreasing the time intervals between scores of teams playing
home. We verified that these two features are different among the NBA teams,
for instance, the scoring rate of the Cleveland Cavaliers increases by 0.16
points per minute (on average over the seasons 2004-05 to 2013-14) when playing
home, whereas for the New Jersey Nets (now the Brooklyn Nets) this rate
increases by only 0.04 points per minute. We further observed that these
microscopic features have evolved over time in a non-trivial manner when
analyzing the results team-by-team. However, after averaging over all teams
some regularities emerge; in particular, we noticed that the average
differences in the scoring rates and in the characteristic times (related to
the time intervals between scores) have slightly decreased over time,
suggesting a weakening of the phenomenon.
Comment: Accepted for publication in PLoS ONE
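The home/away scoring-rate comparison described above boils down to points per minute aggregated per team and venue. A minimal sketch (the record layout with keys 'team', 'venue', 'points', 'minutes' is hypothetical, not the authors' play-by-play format):

```python
# Sketch of the per-team home-advantage statistic: points per minute at
# home vs away, and their difference (hypothetical input layout).
from collections import defaultdict

def scoring_rates(games):
    """`games`: list of dicts with keys 'team', 'venue' ('home'/'away'),
    'points', 'minutes'. Returns {team: (home_rate, away_rate, diff)}
    in points per minute."""
    totals = defaultdict(lambda: {'home': [0, 0.0], 'away': [0, 0.0]})
    for g in games:
        acc = totals[g['team']][g['venue']]
        acc[0] += g['points']    # points scored
        acc[1] += g['minutes']   # minutes played
    rates = {}
    for team, venues in totals.items():
        home = venues['home'][0] / venues['home'][1] if venues['home'][1] else 0.0
        away = venues['away'][0] / venues['away'][1] if venues['away'][1] else 0.0
        rates[team] = (home, away, home - away)
    return rates
```

Averaged over a season, a positive difference of the kind reported in the abstract (e.g. 0.16 points per minute for Cleveland) is the microscopic signature of home advantage.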