795 research outputs found
Natural genetic engineering: intelligence & design in evolution?
There are many things that I like about James Shapiro's new book "Evolution: A View from the 21st Century" (FT Press Science, 2011). He begins the book by saying that it is the creation of novelty, and not selection, that is important in the history of life. In the presence of heritable traits that vary, selection drives the evolution of a population towards an optimal composition of those traits. But selection can only act on changes - and where does this variation come from? Historically, the creation of novelty has been assumed to be the result of random chance or accident. And yet, organisms seem 'designed'. When one examines the data from sequenced genomes, the changes appear NOT to be random or accidental; rather, one observes that whole chunks of the genome come and go. These 'chunks' often contain functional units, encoding sets of genes that together can perform some specific function. Shapiro argues that what we see in genomes is 'Natural Genetic Engineering', or designed evolution: "Thinking about genomes from an informatics perspective, it is apparent that systems engineering is a better metaphor for the evolutionary process than the conventional view of evolution as a selection-biased random walk through the limitless space of possible DNA configurations" (page 6).
Quantum enhanced positioning and clock synchronization
A wide variety of positioning and ranging procedures are based on repeatedly
sending electromagnetic pulses through space and measuring their time of
arrival. This paper shows that quantum entanglement and squeezing can be
employed to overcome the classical power/bandwidth limits on these procedures,
enhancing their accuracy. Frequency entangled pulses could be used to construct
quantum positioning systems (QPS), to perform clock synchronization, or to do
ranging (quantum radar): all of these techniques exhibit a similar enhancement
compared with analogous protocols that use classical light. Quantum
entanglement and squeezing have been exploited in the context of
interferometry, frequency measurements, lithography, and algorithms. Here, the
problem of positioning a party (say Alice) with respect to a fixed array of
reference points will be analyzed.
Comment: 4 pages, 2 figures. Accepted for publication by Nature.
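The claimed enhancement can be illustrated with a back-of-the-envelope scaling sketch: M frequency-entangled pulses behave like a single pulse of M times the bandwidth, gaining an extra sqrt(M) in time-of-arrival accuracy beyond classical averaging. The pulse count and bandwidth below are invented for illustration.

```python
import math

def classical_accuracy(M, bandwidth):
    """Time-of-arrival spread for M independent classical pulses:
    averaging M measurements gains the usual 1/sqrt(M)."""
    return 1.0 / (bandwidth * math.sqrt(M))

def entangled_accuracy(M, bandwidth):
    """M frequency-entangled pulses act like one pulse of effective
    bandwidth M*bandwidth, an extra sqrt(M) beyond the classical case."""
    return 1.0 / (bandwidth * M)

M, bw = 16, 1e9   # 16 pulses of 1 GHz bandwidth (illustrative numbers)
dt_c = classical_accuracy(M, bw)
dt_e = entangled_accuracy(M, bw)
print(dt_c / dt_e)  # enhancement factor = sqrt(M) = 4.0
```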
Theory of Photon Blockade by an Optical Cavity with One Trapped Atom
In our recent paper [1], we reported observations of photon blockade by one
atom strongly coupled to an optical cavity. In support of these measurements,
here we provide an expanded discussion of the general phenomenology of photon
blockade as well as of the theoretical model and results that were presented in
Ref. [1]. We describe the general condition for photon blockade in terms of the
transmission coefficients for photon number states. For the atom-cavity system
of Ref. [1], we present the model Hamiltonian and examine the relationship of
the eigenvalues to the predicted intensity correlation function. We explore the
effect of different driving mechanisms on the photon statistics. We also
present additional corrections to the model to describe cavity birefringence
and ac-Stark shifts. [1] K. M. Birnbaum, A. Boca, R. Miller, A. D. Boozer, T.
E. Northup, and H. J. Kimble, Nature 436, 87 (2005).
Comment: 10 pages, 6 figures.
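The blockade condition rests on the anharmonic dressed-state ladder of the atom-cavity system. A minimal numerical sketch of the resonant Jaynes-Cummings eigenvalues (ignoring the cavity-birefringence and ac-Stark corrections the paper discusses; the coupling value is illustrative, not taken from the experiment):

```python
import numpy as np

def jc_manifold_energies(n, omega0, g):
    """Eigenenergies of the n-excitation manifold of the resonant
    Jaynes-Cummings Hamiltonian (hbar = 1): E = n*omega0 +/- g*sqrt(n)."""
    H = np.array([[n * omega0,     g * np.sqrt(n)],
                  [g * np.sqrt(n), n * omega0    ]])
    return np.sort(np.linalg.eigvalsh(H))

omega0, g = 0.0, 1.0   # rotating frame; illustrative coupling strength
E1 = jc_manifold_energies(1, omega0, g)   # omega0 -/+ g
E2 = jc_manifold_energies(2, omega0, g)   # 2*omega0 -/+ sqrt(2)*g

# Drive on resonance with the lower n=1 dressed state (omega0 - g).
# Absorbing a second identical photon would require reaching 2*(omega0 - g),
# but the nearest n=2 level lies at 2*omega0 - sqrt(2)*g:
detuning = E2[0] - 2 * E1[0]
print(detuning)  # (2 - sqrt(2))*g ~ 0.586: the second photon is blockaded
```

The nonzero detuning for the second excitation step is exactly what suppresses two-photon transmission and yields sub-Poissonian output light.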
Quantum catastrophe of slow light
Catastrophes are at the heart of many fascinating optical phenomena. The
rainbow, for example, is a ray catastrophe where light rays become infinitely
intense. The wave nature of light resolves the infinities of ray catastrophes
while drawing delicate interference patterns such as the supernumerary arcs of
the rainbow. Black holes cause wave singularities. Waves oscillate with
infinitely small wave lengths at the event horizon where time stands still. The
quantum nature of light avoids this higher level of catastrophic behaviour
while producing a quantum phenomenon known as Hawking radiation. As this letter
describes, light brought to a standstill in laboratory experiments can suffer a
similar wave singularity caused by a parabolic profile of the group velocity.
In turn, the quantum vacuum is forced to create photon pairs with a
characteristic spectrum. The idea may initiate a theory of quantum
catastrophes, in addition to classical catastrophe theory, and the proposed
experiment may lead to the first direct observation of a phenomenon related to
Hawking radiation.
Comment: Published as "A laboratory analogue of the event horizon using slow light in an atomic medium".
Bayesian astrostatistics: a backward look to the future
This perspective chapter briefly surveys: (1) past growth in the use of
Bayesian methods in astrophysics; (2) current misconceptions about both
frequentist and Bayesian statistical inference that hinder wider adoption of
Bayesian methods by astronomers; and (3) multilevel (hierarchical) Bayesian
modeling as a major future direction for research in Bayesian astrostatistics,
exemplified in part by presentations at the first ISI invited session on
astrostatistics, commemorated in this volume. It closes with an intentionally
provocative recommendation for astronomical survey data reporting, motivated by
the multilevel Bayesian perspective on modeling cosmic populations: that
astronomers cease producing catalogs of estimated fluxes and other source
properties from surveys. Instead, summaries of likelihood functions (or
marginal likelihood functions) for source properties should be reported (not
posterior probability density functions), including nontrivial summaries (not
simply upper limits) for candidate objects that do not pass traditional
detection thresholds.
Comment: 27 pp, 4 figures. A lightly revised version of a chapter in "Astrostatistical Challenges for the New Astronomy" (Joseph M. Hilbe, ed., Springer, New York, forthcoming in 2012), the inaugural volume for the Springer Series in Astrostatistics. Version 2 has minor clarifications and an additional reference.
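The recommendation to report likelihood summaries rather than catalog point estimates can be made concrete with a toy Poisson source-detection model. Everything below (counts, background, exposure factor, grid summary) is invented for illustration and is not from the chapter.

```python
import numpy as np

# Hypothetical aperture photometry: n observed counts, expected background b,
# effective exposure factor a converting flux to counts (all values invented).
n, b, a = 7, 3.0, 2.0

def log_likelihood(flux):
    """Poisson log-likelihood of the counts for a given source flux,
    up to a flux-independent constant."""
    mu = b + a * flux
    return n * np.log(mu) - mu

# Report a summary of the likelihood function itself (here: its values on a
# grid plus the location of its maximum) rather than a single catalog flux.
grid = np.linspace(0.0, 10.0, 2001)
ll = log_likelihood(grid)
flux_ml = grid[np.argmax(ll)]
print(flux_ml)  # maximum at (n - b) / a = 2.0
```

Reporting the `(grid, ll)` pair, even for sources below a detection threshold, lets later hierarchical analyses combine sources self-consistently, which a thresholded point estimate cannot do.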
Cracking in asphalt materials
This chapter provides a comprehensive review of both laboratory characterization and modelling of bulk material fracture in asphalt mixtures. For the purpose of organization, this chapter is divided into a section on laboratory tests and a section on models. The laboratory characterization section is further subdivided on the basis of predominant loading conditions (monotonic vs. cyclic). The section on constitutive models is subdivided into two sections, the first one containing fracture mechanics based models for crack initiation and propagation that do not include material degradation due to cyclic loading conditions. The second section discusses phenomenological models that have been developed for crack growth through the use of dissipated energy and damage accumulation concepts. These latter models have the capability to simulate degradation of material capacity upon exceeding a threshold number of loading cycles.
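Fracture-mechanics crack-propagation models of the kind the chapter reviews typically build on cycle-by-cycle growth relations such as the Paris law. A generic sketch of that integration scheme, with entirely invented parameters (this is an illustration of the model class, not a model from the chapter):

```python
import math

def paris_law_crack_growth(a0, cycles, A, m, delta_sigma, Y=1.12):
    """Integrate the Paris law da/dN = A * (dK)^m cycle by cycle, with the
    stress-intensity range dK = Y * delta_sigma * sqrt(pi * a).
    All parameter values used below are illustrative and not calibrated
    to any real asphalt mixture."""
    a = a0
    for _ in range(cycles):
        dK = Y * delta_sigma * math.sqrt(math.pi * a)
        a += A * dK ** m
    return a

# hypothetical case: a 5 mm initial flaw under 100,000 load cycles
a_final = paris_law_crack_growth(a0=0.005, cycles=100_000,
                                 A=1e-11, m=3.0, delta_sigma=2.0)
print(a_final > 0.005)  # prints True: the crack length grows monotonically
```

Dissipated-energy and damage-accumulation models replace `dK` with an energy- or damage-based driving quantity, but the cycle-by-cycle accumulation structure is the same.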
Bayesian Methods for Exoplanet Science
Exoplanet research is carried out at the limits of the capabilities of
current telescopes and instruments. The studied signals are weak, and often
embedded in complex systematics from instrumental, telluric, and astrophysical
sources. Combining repeated observations of periodic events, simultaneous
observations with multiple telescopes, different observation techniques, and
existing information from theory and prior research can help to disentangle the
systematics from the planetary signals, and offers synergistic advantages over
analysing observations separately. Bayesian inference provides a
self-consistent statistical framework that addresses both the necessity for
complex systematics models, and the need to combine prior information and
heterogeneous observations. This chapter offers a brief introduction to
Bayesian inference in the context of exoplanet research, with focus on time
series analysis, and finishes with an overview of a set of freely available
programming libraries.
Comment: Invited review.
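A minimal sketch of the combination idea: two hypothetical Gaussian transit-depth measurements from different instruments are combined by multiplying their likelihoods under a flat prior. The measurement values, noise levels, and grid are all invented; nothing here comes from the chapter itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical transit-depth measurements of the same planet from
# different instruments, each with its own (known) Gaussian noise level.
true_depth = 0.01
obs = [(true_depth + rng.normal(0.0, 0.002), 0.002),
       (true_depth + rng.normal(0.0, 0.004), 0.004)]

depth = np.linspace(0.0, 0.02, 2001)   # parameter grid, flat prior
log_post = np.zeros_like(depth)
for d, sigma in obs:                   # combine datasets: multiply likelihoods
    log_post += -0.5 * ((d - depth) / sigma) ** 2

post = np.exp(log_post - log_post.max())
post /= post.sum()                     # normalize on the grid
mean = float((depth * post).sum())
print(abs(mean - true_depth) < 0.002)  # posterior mean lands near the truth
```

The posterior mean is automatically precision-weighted towards the more accurate instrument, which is the synergistic advantage over analysing each dataset separately.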