Hide and seek on complex networks
Signaling pathways and networks determine the ability to communicate in
systems ranging from living cells to human society. We investigate how the
network structure constrains communication in social, man-made, and biological
networks. We find that human networks of governance and collaboration are
predictable at the tête-à-tête level, reflecting well-defined pathways, but
globally inefficient. In contrast, the Internet tends to have better overall
communication abilities, more alternative pathways, and is therefore more
robust. Between these extremes, the molecular network of Saccharomyces
cerevisiae is more similar to the simpler social systems, whereas the pattern
of interactions in the more complex Drosophila melanogaster resembles the
robust Internet. Comment: 5 pages, 5 figures
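As a rough illustration of the kind of comparison made above, the sketch below contrasts a tree-like "governance" hierarchy, where any two nodes are joined by a single predictable path, with a random graph of the same size that offers many alternative routes. It uses networkx's global efficiency as a crude proxy for overall communication ability; the graph choices, sizes, and this proxy (rather than the paper's own search-information measure) are illustrative assumptions.

```python
# A minimal sketch, assuming networkx is available: compare a hierarchy
# (unique, predictable paths) with a same-size random graph (many alternative
# paths) using global efficiency, the mean inverse shortest-path length.
import networkx as nx

n = 255                                              # hypothetical network size
hierarchy = nx.balanced_tree(r=2, h=7)               # 255-node tree, one path per pair
random_net = nx.gnm_random_graph(n, 2 * n, seed=1)   # same order, more edges and loops

for name, g in [("hierarchy", hierarchy), ("random graph", random_net)]:
    eff = nx.global_efficiency(g)
    print(f"{name}: {g.number_of_nodes()} nodes, "
          f"{g.number_of_edges()} edges, global efficiency {eff:.3f}")
```

The tree typically scores markedly lower, matching the abstract's contrast between well-defined but globally inefficient pathways and the more redundant, robust Internet-like topology.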
Simple observations concerning black holes and probability
It is argued that black holes and the limit distributions of probability
theory share several properties when their entropy and information content are
compared. In particular the no-hair theorem, the entropy maximization and
holographic bound, and the quantization of entropy of black holes have their
respective analogues for stable limit distributions. This observation suggests
that the central limit theorem can play a fundamental role in black hole
statistical mechanics and in a possibly emergent nature of gravity. Comment: 6
pages LaTeX, final version. Essay awarded "Honorable Mention" in the Gravity
Research Foundation 2009 Essay Competition
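For orientation, the two standard facts behind the analogy can be recalled side by side; the pairing below is only a reading aid, not the essay's own derivation.

```latex
% Bekenstein-Hawking entropy: proportional to the horizon area A, while the
% stationary hole itself carries only mass, charge and angular momentum (no hair):
S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar}
% Entropy-maximization side of the analogy: among all distributions with fixed
% variance \sigma^2, the Gaussian (the generic central-limit attractor)
% maximizes the differential entropy,
H[p] = -\int p(x)\,\ln p(x)\,dx \;\le\; \tfrac{1}{2}\ln\!\left(2\pi e\,\sigma^2\right).
```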
High-pressure synthesis of rock salt LiMeO2-ZnO (Me = Fe3+, Ti3+) solid solutions
Metastable LiMeO2-ZnO (Me = Fe3+, Ti3+) solid solutions with rock salt
crystal structure have been synthesized by solid state reaction of ZnO with
LiMeO2 complex oxides at 7.7 GPa and 1350-1450 K. Structure, phase composition,
thermal stability and thermal expansion of the recovered samples have been
studied by X-ray diffraction with synchrotron radiation. At ambient pressure
rock salt LiMeO2-ZnO solid solutions are kinetically stable up to 670-800 K
depending on the composition. Comment: 11 pages, 3 figures, 1 table
A Classical Bound on Quantum Entropy
A classical upper bound for quantum entropy is identified and illustrated,
involving the variance in phase space of the classical limit distribution of a
given system. A fortiori, this further bounds the corresponding
information-theoretical generalizations of the quantum entropy proposed by
Rényi. Comment: LaTeX2e, 7 pages, publication version
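For reference, the Rényi generalizations mentioned above are the standard one-parameter family recalled below; only the definition is given here, not the paper's bound.

```latex
% Renyi entropies of a density matrix \rho, reducing to the von Neumann
% entropy S = -Tr(\rho \ln \rho) in the limit \alpha -> 1:
S_\alpha(\rho) = \frac{1}{1-\alpha}\,\ln \mathrm{Tr}\,\rho^{\alpha},
\qquad \alpha > 0,\ \alpha \neq 1 .
```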
Heat transport in Bi_{2+x}Sr_{2-x}CuO_{6+\delta}: departure from the Wiedemann-Franz law in the vicinity of the metal-insulator transition
We present a study of heat transport in the cuprate superconductor
Bi_{2+x}Sr_{2-x}CuO_{6+\delta} at subkelvin temperatures and in magnetic fields
as high as 25T. In several samples with different doping levels close to
optimal, the linear-temperature term of thermal conductivity was measured both
at zero field and in the presence of a magnetic field strong enough to quench
superconductivity. The zero-field data yield a superconducting gap of
reasonable magnitude displaying a doping dependence similar to the one reported
in other families of cuprates. The normal-state data, together with the results
of the resistivity measurements, allow us to test the Wiedemann-Franz (WF) law,
the validity of which was confirmed in an overdoped sample, in agreement with
previous studies. In contrast, a systematic deviation from the WF law was
resolved for samples displaying either a lower doping content or higher
disorder. Thus, in the vicinity of the metal-insulator crossover, heat
conduction in the zero-temperature limit appears to become significantly larger
than predicted by the WF law. Possible origins of this observation are
discussed. Comment: 9 pages including 7 figures, submitted to Phys. Rev.
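The test described above amounts to comparing the measured Lorenz ratio with the Sommerfeld value; the sketch below just evaluates that textbook expression. The residual resistivity and thermal-conductivity values are placeholders, not data from the paper.

```python
# A minimal sketch of a Wiedemann-Franz check: the law states that
# kappa/(sigma*T) -> L0 = (pi^2/3)(k_B/e)^2 as T -> 0. Sample values below
# are placeholders chosen only to show the arithmetic.
from math import pi

k_B = 1.380649e-23      # J/K
e = 1.602176634e-19     # C
L0 = (pi**2 / 3) * (k_B / e)**2
print(f"Sommerfeld value L0 = {L0:.3e} W Ohm / K^2")   # ~2.44e-8

rho_0 = 100e-8          # placeholder residual resistivity, Ohm*m (100 uOhm*cm)
kappa_over_T = 0.030    # placeholder measured kappa_0/T, W/(K^2 m)

lorenz_ratio = kappa_over_T * rho_0 / L0   # L/L0; the WF law predicts 1
print(f"L/L0 = {lorenz_ratio:.2f}  (a value above 1 would signal excess heat conduction)")
```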
A typical reconstruction limit of compressed sensing based on Lp-norm minimization
We consider the problem of reconstructing an $N$-dimensional continuous
vector $\bx$ from $P$ constraints which are generated by its linear
transformation under the assumption that the number of non-zero elements of
$\bx$ is typically limited to $\rho N$ ($0 \le \rho \le 1$). Problems of this
type can be solved by minimizing a cost function with respect to the $L_p$-norm
$||\bx||_p=\lim_{\epsilon \to +0}\sum_{i=1}^N |x_i|^{p+\epsilon}$, subject to
the constraints under an appropriate condition. For several $p$, we assess a
typical case limit $\alpha_c(\rho)$, which represents a critical relation
between $\alpha = P/N$ and $\rho$ for successfully reconstructing the original
vector by $L_p$-norm minimization for typical situations in the limit
$N, P \to \infty$ while keeping $\alpha$ finite, utilizing the replica method.
For $p = 1$, $\alpha_c(\rho)$ is considerably smaller than its worst-case
counterpart, which has been rigorously derived in the existing
information-theory literature. Comment: 12 pages, 2 figures
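As a concrete illustration of the $p = 1$ case (basis pursuit), the sketch below reconstructs a sparse vector from random linear constraints by recasting the $L_1$ minimization as a linear program; the dimensions and the use of scipy's linprog are illustrative choices, not the replica analysis of the paper.

```python
# A minimal sketch of the p = 1 case: recover a sparse x from y = A x by
# minimizing ||x||_1, written as a linear program over (x, t) with |x_i| <= t_i.
# Problem sizes are arbitrary illustrative choices (alpha = P/N = 0.5, rho = 0.1).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, P, K = 100, 50, 10                   # dimension, constraints, non-zero entries
A = rng.standard_normal((P, N))
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = A @ x_true

# Variables z = [x, t]; minimize sum(t) subject to A x = y and -t <= x <= t.
c = np.concatenate([np.zeros(N), np.ones(N)])
A_eq = np.hstack([A, np.zeros((P, N))])
I = np.eye(N)
A_ub = np.vstack([np.hstack([I, -I]),    #  x - t <= 0
                  np.hstack([-I, -I])])  # -x - t <= 0
res = linprog(c, A_ub=A_ub, b_ub=np.zeros(2 * N), A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * N + [(0, None)] * N)
x_hat = res.x[:N]
print("solved:", res.success, " max reconstruction error:",
      np.abs(x_hat - x_true).max())
```

At this $\alpha$ and $\rho$ the reconstruction typically succeeds, consistent with the point that the typical-case threshold is far more favorable than the worst-case one.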
H-theorem for classical matter around a black hole
We propose a classical solution for the kinetic description of matter falling
into a black hole, which makes it possible to evaluate both the kinetic entropy
and the entropy production rate of classical infalling matter at the event
horizon. The formulation is based on a relativistic kinetic description for
classical particles in the presence of an event horizon. An H-theorem is
established which holds for arbitrary models of black holes and is valid also
in the presence of contracting event horizons.
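For orientation, the standard kinetic entropy functional and the statement an H-theorem makes about it are recalled below; this is the textbook non-relativistic form, not the relativistic horizon construction of the paper.

```latex
% Boltzmann kinetic entropy of a one-particle distribution f(x, v, t):
S(t) = -k_B \int f(\mathbf{x}, \mathbf{v}, t)\,
       \ln f(\mathbf{x}, \mathbf{v}, t)\; d^3x\, d^3v ,
% and the content of an H-theorem: the entropy so defined never decreases,
\frac{dS}{dt} \ge 0 .
```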
The elusive source of quantum effectiveness
We discuss two qualities of quantum systems: various correlations existing
between their subsystems and the distinguishability of different quantum states.
This is then applied to analysing quantum information processing. While quantum
correlations, or entanglement, are clearly of paramount importance for
efficient pure state manipulations, mixed states present a much richer arena
and reveal a more subtle interplay between correlations and distinguishability.
The current work explores a number of issues related to identifying the
important ingredients needed for quantum information processing. We discuss the
Deutsch-Jozsa algorithm, the Shor algorithm, the Grover algorithm and the power
of a single qubit class of algorithms. One section is dedicated to cluster
states where entanglement is crucial, but its precise role is highly
counter-intuitive. Here we see that distinguishability becomes a more useful
concept. Comment: 8 pages, no figures
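As a small concrete companion to the algorithms named above, the sketch below simulates Deutsch's two-qubit special case of the Deutsch-Jozsa algorithm with plain state vectors; the oracle construction and variable names are illustrative assumptions, not the paper's analysis.

```python
# A minimal state-vector sketch of Deutsch's algorithm (the one-bit special case
# of Deutsch-Jozsa): a single oracle query decides whether f: {0,1} -> {0,1} is
# constant or balanced. Qubit order is |input, ancilla>, basis index = 2*x + y.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)

def oracle(f):
    """U_f |x, y> = |x, y XOR f(x)> as a 4x4 permutation matrix."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    state = np.kron([1, 0], [0, 1]).astype(float)   # |0>|1>
    state = np.kron(H, H) @ state                    # Hadamards on both qubits
    state = oracle(f) @ state                        # single oracle query
    state = np.kron(H, I) @ state                    # Hadamard on the input qubit
    p_one = np.sum(np.abs(state[2:]) ** 2)           # P(input qubit measures 1)
    return "balanced" if p_one > 0.5 else "constant"

for name, f in [("f(x)=0", lambda x: 0), ("f(x)=1", lambda x: 1),
                ("f(x)=x", lambda x: x), ("f(x)=1-x", lambda x: 1 - x)]:
    print(name, "->", deutsch(f))
```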
Exploring the randomness of Directed Acyclic Networks
The feed-forward relationship naturally observed in time-dependent processes
and in a diverse number of real systems, such as some food webs and electronic
and neural wiring, can be described in terms of so-called directed acyclic
graphs (DAGs). An important ingredient of the analysis of such networks is a
proper comparison of their observed architecture against an ensemble of
randomized graphs, thereby quantifying the {\em randomness} of the real systems
with respect to suitable null models. This approach is particularly
relevant when the finite size and/or large connectivity of real systems make
a comparison with the predictions of the so-called {\em configuration model}
inadequate. In this paper we analyze four methods of DAG randomization,
defined by the combination of topological invariants (directed and undirected
degree sequences and component distributions) that they preserve. A highly
ordered DAG, called a \textit{snake} graph, and an Erd\H{o}s-R\'enyi DAG were
used to validate the performance of the algorithms. Finally, three real case
studies, namely the \textit{C. elegans} cell lineage network, a PhD
student-advisor network and Milgram's citation network, were analyzed using
each randomization method. Results show how the interpretation of degree-degree
relations in DAGs with respect to their randomized ensembles depends on the
topological invariants imposed. In general, real DAGs yield disorder values
lower than expected by chance when the directedness of the links is not
preserved in the randomization process. Conversely, if the direction of the
links is conserved throughout the randomization process, disorder indicators
are close to those obtained from the null-model ensemble, although some
deviations are observed. Comment: 13 pages, 5 figures and 5 tables
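To make the kind of null model discussed above concrete, the sketch below randomizes a DAG with directed double-edge swaps that preserve every node's in- and out-degree and rejects any swap that would introduce a cycle; it corresponds only loosely to one of the four methods analyzed in the paper, and the example graph and parameters are illustrative assumptions.

```python
# A minimal sketch of degree-preserving DAG randomization: swap pairs of
# directed edges (a->b, c->d) into (a->d, c->b), rejecting swaps that create
# duplicate edges, self-loops, or cycles so the result remains a DAG.
import random
import networkx as nx

def randomize_dag(dag, n_swaps, seed=0):
    rng = random.Random(seed)
    g = dag.copy()
    for _ in range(n_swaps):
        (a, b), (c, d) = rng.sample(list(g.edges()), 2)
        if len({a, b, c, d}) < 4 or g.has_edge(a, d) or g.has_edge(c, b):
            continue
        g.remove_edges_from([(a, b), (c, d)])
        g.add_edges_from([(a, d), (c, b)])
        if not nx.is_directed_acyclic_graph(g):      # reject cycle-creating swaps
            g.remove_edges_from([(a, d), (c, b)])
            g.add_edges_from([(a, b), (c, d)])
    return g

dag = nx.gn_graph(50, seed=1).reverse()              # a small illustrative DAG
rand = randomize_dag(dag, n_swaps=500)
assert sorted(d for _, d in dag.in_degree()) == sorted(d for _, d in rand.in_degree())
assert nx.is_directed_acyclic_graph(rand)
print(dag.number_of_edges(), rand.number_of_edges())
```

Whether the direction of the links is preserved, as it is here, is exactly the design choice the abstract identifies as decisive for how degree-degree relations should be interpreted.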
A Theory of Change for One-on-One Peer Support for Older Adolescents and Young Adults
Peer support has become increasingly available as a formal mental health
service. However, high quality research and implementation of peer support has
been hampered over the years by the lack of theory that clarifies peer support
roles and explains exactly how these roles foster positive outcomes for peer
support users. Observers have noted that theory is particularly sparse in
regard to peer support for older adolescents and young adults, and they have
called for theory that not only clarifies roles and mechanisms of impact, but
also identifies how peer support for young people might differ from peer
support for older adults. This qualitative study brought young people with
experience providing and using peer support together in small group discussions
focused on understanding the activities and outcomes of peer support. This
information was used to develop a theory of change that outlines key activities
that constitute a one-on-one peer support role for young people, and describes
how and why carrying out these activities should lead to positive outcomes. The
theory highlights the characteristics of a successful “peerness-based
relationship,” and proposes that the development of this kind of relationship
mediates other positive outcomes from peer support. The article concludes with
a discussion of how this theory can usefully inform the development and
specification of peer support roles, training and supervision, and other
organizational supports.