Random copying in space
Random copying is a simple model for population dynamics in the absence of
selection, and has been applied to both biological and cultural evolution. In
this work, we investigate the effect that spatial structure has on the
dynamics. We focus in particular on how a measure of the diversity in the
population changes over time. We show that even when the vast majority of a
population's history may be well-described by a spatially-unstructured model,
spatial structure may nevertheless affect the expected level of diversity seen
at a local scale. We demonstrate this phenomenon explicitly by examining the
random copying process on small-world networks, and use our results to comment
on the use of simple random-copying models in an empirical context.
Comment: 26 pages, 11 figures. Based on an invited talk at the AHRC CECD Conference on "Cultural Evolution in Spatially Structured Populations" at UCL, September 2010. To appear in ACS - Advances in Complex Systems
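The neutral copying dynamics described above can be sketched in a few lines. The following is a minimal, well-mixed (non-spatial) sketch of random copying with innovation, not the paper's spatially structured model; the parameter names (`mu` for the innovation rate, and so on) are illustrative:

```python
import random
from collections import Counter

def random_copying(n=200, mu=0.01, steps=2000, seed=1):
    """Neutral random copying: at each step one individual either
    innovates (probability mu) or copies the variant of a randomly
    chosen member of the population. Returns the final Simpson
    diversity: the probability that two random individuals differ."""
    rng = random.Random(seed)
    pop = list(range(n))          # start with all-distinct variants
    next_label = n
    for _ in range(steps):
        i = rng.randrange(n)
        if rng.random() < mu:     # innovation: a brand-new variant
            pop[i] = next_label
            next_label += 1
        else:                     # copying: adopt another's variant
            pop[i] = pop[rng.randrange(n)]
    counts = Counter(pop)
    return 1.0 - sum(c * c for c in counts.values()) / (n * n)
```

Restricting the copied individual to a node's neighbours on a small-world graph, as the paper does, would be a one-line change to the copying branch.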
Spatially self-organized resilient networks by a distributed cooperative mechanism
The robustness of connectivity and the efficiency of paths are incompatible
in many real networks. We propose a self-organization mechanism for
incrementally generating onion-like networks with positive degree-degree
correlations whose robustness is nearly optimal. As a spatial extension of the generation model based on cooperative copying and the addition of shortcuts, we show that the growing networks become more robust and efficient as the onion-like topological structure is enhanced in space. The reasonable constraint of locating nodes on the perimeter, as in typical self-propagating surface growth, does not affect the attack tolerance or the path length. Moreover, robustness can be recovered in randomly grown networks damaged by persistent sequential attacks, even without any remedial measures.
Comment: 34 pages, 12 figures, 2 tables
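A rough sketch of such incremental growth, assuming a simplified rule (each new node attaches by copying a neighbour, then adds one shortcut to a node of similar degree as a stand-in for the paper's cooperative mechanism; the spatial embedding is ignored):

```python
import random

def grow_copying_network(n=300, seed=0):
    """Grow a network by copying plus shortcuts: each new node attaches
    to a random existing node, copies one of that node's neighbours,
    then adds a shortcut to the non-neighbour whose degree is closest
    to its own (an illustrative rule, not the paper's exact one)."""
    rng = random.Random(seed)
    adj = {0: {1}, 1: {0}}
    for v in range(2, n):
        t = rng.randrange(v)               # attach to a random node
        c = rng.choice(list(adj[t]))       # copy one of its neighbours
        adj[v] = {t, c}
        adj[t].add(v); adj[c].add(v)
        # shortcut to the node of most similar degree (if any remain)
        deg_v = len(adj[v])
        s = min((u for u in adj if u != v and u not in adj[v]),
                key=lambda u: abs(len(adj[u]) - deg_v), default=None)
        if s is not None:
            adj[v].add(s); adj[s].add(v)
    return adj

def assortativity(adj):
    """Pearson correlation of degrees at edge endpoints; onion-like
    networks have a positive value (positive degree-degree correlation)."""
    xs, ys = [], []
    for u, nbrs in adj.items():
        for v in nbrs:
            xs.append(len(adj[u])); ys.append(len(adj[v]))
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

The assortativity coefficient gives a quick check of whether a given growth rule pushes the topology toward the onion-like regime.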
Randomness and Complexity in Networks
I start by reviewing some basic properties of random graphs. I then consider
the role of random walks in complex networks and show how they may be used to
explain why so many long tailed distributions are found in real data sets. The
key idea is that in many cases the process involves copying the properties of near neighbours in the network; this is a type of short random walk, which in turn produces a natural preferential attachment mechanism. Applying this to networks of fixed size, I show that copying and innovation are processes with special mathematical properties, which include the ability to solve a simple model exactly for any parameter values and at any time. I finish by looking at variations of this basic model.
Comment: Survey paper based on a talk given at the workshop on "Stochastic Networks and Internet Technology", Centro di Ricerca Matematica Ennio De Giorgi, Matematica nelle Scienze Naturali e Sociali, Pisa, 17th-21st September 2007. To appear in proceedings
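The copying-as-short-random-walk idea can be sketched as follows: each new node picks a uniformly random existing node and then links to one of its neighbours. The one-step walk is biased toward high-degree targets, so an effective preferential attachment emerges with no global degree bookkeeping. This is a generic sketch of the mechanism, not the talk's exact model:

```python
import random

def copying_graph(n=2000, seed=2):
    """Grow a tree by copying: each new node starts at a uniformly
    random node, takes one random-walk step to a neighbour, and links
    there. Returns the resulting degree sequence."""
    rng = random.Random(seed)
    adj = [[1], [0]]                  # start from a single edge
    for v in range(2, n):
        u = rng.randrange(v)          # uniform starting node
        w = rng.choice(adj[u])        # one-step walk: degree-biased target
        adj.append([w])
        adj[w].append(v)
    return [len(a) for a in adj]
```

The resulting degree sequence is heavily skewed: a few hub nodes accumulate far more links than the average, the signature of the long-tailed distributions the survey discusses.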
Quasispecies distribution of Eigen model
We study the sharp peak landscape (SPL) of the Eigen model from a new perspective: how the quasispecies are distributed in sequence space. To analyze the distribution more carefully, we introduce two tools. The first is the variance of the Hamming distance of the sequences at a given generation. It not only offers a different avenue for accurately locating the error threshold and illustrates how the configuration of the distribution varies with copying fidelity in sequence space, but also divides the copying fidelity into three distinct regimes. The second is the similarity network at a given Hamming distance, from which we obtain a visual and in-depth picture of how the sequences are distributed. We find several local optima around the center (the global optimum) in the distribution of the sequences reproduced near the threshold. Furthermore, it is interesting that the clustering coefficient follows a lognormal distribution, and the curve of the network's clustering coefficient appears linear near the threshold.
Comment: 13 pages, 6 figures
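The quantities involved can be illustrated with a small Wright-Fisher-style sketch of the Eigen model on a sharp-peak landscape (an assumption: the paper's exact dynamics and parameters may differ):

```python
import random

def eigen_sharp_peak(L=10, A=10.0, q=0.98, n=300, gens=100, seed=3):
    """Sharp-peak landscape: binary sequences of length L; the master
    sequence (all zeros) has fitness A, every other sequence fitness 1;
    q is the per-site copying fidelity. Returns the mean and variance
    of the Hamming distance to the master after `gens` generations."""
    rng = random.Random(seed)
    pop = [[0] * L for _ in range(n)]
    for _ in range(gens):
        fits = [A if not any(s) else 1.0 for s in pop]
        parents = rng.choices(pop, weights=fits, k=n)   # selection
        # error-prone copying: each site flips with probability 1 - q
        pop = [[b if rng.random() < q else 1 - b for b in s]
               for s in parents]
    dists = [sum(s) for s in pop]      # Hamming distance to the master
    mean = sum(dists) / n
    var = sum((d - mean) ** 2 for d in dists) / n
    return mean, var
```

Above the error threshold (roughly q**L > 1/A) the population stays localized near the master sequence and the distance variance is small; sweeping q downward through the threshold makes the variance jump as the distribution delocalizes, which is how the first tool locates the threshold.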
Adiabatic quantum algorithm for search engine ranking
We propose an adiabatic quantum algorithm for generating a quantum pure state
encoding of the PageRank vector, the most widely used tool in ranking the
relative importance of internet pages. We present extensive numerical
simulations which provide evidence that this algorithm can prepare the quantum
PageRank state in a time which, on average, scales polylogarithmically in the
number of webpages. We argue that the main topological feature of the
underlying web graph allowing for such a scaling is the out-degree
distribution. The top ranked entries of the quantum PageRank state
can then be estimated with a polynomial quantum speedup. Moreover, the quantum
PageRank state can be used in "q-sampling" protocols for testing properties of
distributions, which require exponentially fewer measurements than all
classical schemes designed for the same task. This can be used to decide
whether to run a classical update of the PageRank.
Comment: 7 pages, 5 figures; closer to published version
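For reference, the classical PageRank vector that the quantum state encodes is computed by standard power iteration (a textbook sketch; `links` maps each page to its out-links, and every linked page must appear as a key):

```python
def pagerank(links, d=0.85, tol=1e-10):
    """Classical PageRank by power iteration with damping factor d.
    Dangling pages (no out-links) spread their rank uniformly."""
    pages = sorted(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    while True:
        new = {p: (1.0 - d) / n for p in pages}
        for p in pages:
            out = links[p]
            if out:
                share = d * rank[p] / len(out)
                for q in out:
                    new[q] += share
            else:                         # dangling page
                for q in pages:
                    new[q] += d * rank[p] / n
        if sum(abs(new[p] - rank[p]) for p in pages) < tol:
            return new
        rank = new
```

Each iteration costs time proportional to the number of links; the abstract's claimed polylogarithmic scaling for preparing the quantum PageRank state is what would distinguish the adiabatic algorithm from this classical baseline.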
The Degree Distribution of Random k-Trees
A power law degree distribution is established for a graph evolution model
based on the graph class of k-trees. This k-tree-based graph process can be
viewed as an idealized model that captures some characteristics of the
preferential attachment and copying mechanisms that existing evolving graph
processes fail to model due to technical obstacles. The result also serves as a further cautionary note reinforcing the view that a power-law degree distribution should not be regarded as the only important characteristic of a complex network, as has been previously argued.
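The k-tree evolution process can be sketched directly (a generic construction of random k-trees, not tied to the paper's analysis):

```python
import random
from collections import Counter

def random_k_tree(n=1000, k=2, seed=4):
    """Random k-tree process: start from a (k+1)-clique; each new
    vertex is joined to a k-clique chosen uniformly from all existing
    ones, which creates k new k-cliques through the new vertex.
    Returns the degree of every vertex."""
    rng = random.Random(seed)
    deg = Counter()
    base = tuple(range(k + 1))
    for u in base:
        deg[u] = k                   # initial clique: all degree k
    # all k-subsets of the initial (k+1)-clique are k-cliques
    cliques = [tuple(u for u in base if u != i) for i in base]
    for v in range(k + 1, n):
        chosen = rng.choice(cliques)
        for u in chosen:
            deg[u] += 1
        deg[v] = k
        for i in range(k):           # k new k-cliques containing v
            cliques.append(chosen[:i] + chosen[i + 1:] + (v,))
    return deg
```

Because a vertex's chance of being extended grows with the number of k-cliques it sits in, early vertices accumulate high degree, and the degree sequence exhibits the heavy tail the paper establishes rigorously.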
Quantum copying can increase the practically available information
While it is known that copying a quantum system does not increase the amount
of information obtainable about the originals, it may increase the amount
available in practice, when one is restricted to imperfect measurements. We present a detection scheme which, using imperfect detectors and possibly noisy quantum copying machines (that entangle the copies), allows one to extract more information from an incoming signal than with the imperfect detectors alone.
The case of single-photon detection with noisy, inefficient detectors and
copiers (single controlled-NOT gates in this case) is investigated in detail.
The improvement in distinguishability between a photon and vacuum is found to
occur for a wide range of parameters, and to be quite robust to random noise.
The properties that a quantum copying device must have to be useful in this
scheme are investigated.
Comment: 10 pages, 6 figures, accepted PR
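The intuition can be illustrated with a toy probability model, assuming ideal CNOT copies and detectors characterized only by an efficiency and a dark-count probability (a deliberate simplification of the paper's noisy-copier analysis; parameter names are illustrative):

```python
def distinguishability(eta, p_dark, m):
    """Toy model: a single photon is copied onto m modes by ideal CNOT
    gates, each mode read by a detector with efficiency eta and
    dark-count probability p_dark. Returns the gap between the click
    probability for a photon and for vacuum."""
    p_photon = 1.0 - (1.0 - eta) ** m      # at least one detector fires
    p_vacuum = 1.0 - (1.0 - p_dark) ** m   # at least one dark count
    return p_photon - p_vacuum
```

With eta = 0.5 and p_dark = 0.01, two copies already beat a single detector (gap about 0.73 versus 0.49), matching the abstract's point that copying can increase the information available in practice even though it cannot increase the information in principle.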