
    The Covariant Entropy Bound, Brane Cosmology, and the Null Energy Condition

    In discussions of Bousso's Covariant Entropy Bound, the Null Energy Condition is always assumed, as a sufficient (but not necessary) condition which helps to ensure that the entropy on any lightsheet is finite. The spectacular failure of the Strong Energy Condition in cosmology has, however, led many astrophysicists and cosmologists to consider models of dark energy which violate all of the energy conditions, and indeed the current data do not completely rule out such models. The NEC also has a questionable status in brane cosmology: it is probably necessary to violate the NEC in the bulk in order to obtain a "self-tuning" theory of the cosmological constant. In order to investigate these proposals, we modify the Karch-Randall model by introducing NEC-violating matter into $AdS_5$ in such a way that the brane cosmological constant relaxes to zero. The entropy on lightsheets remains finite. However, we still find that the spacetime is fundamentally incompatible with the Covariant Entropy Bound machinery, in the sense that it fails the Bousso-Randall consistency condition. We argue that holography probably forbids all cosmological violations of the NEC, and that holography is in fact the fundamental physical principle underlying the cosmological version of the NEC.
    Comment: 21 pages, 3 figures; version 2: corrected and greatly improved discussion of the Bousso-Randall consistency check, references added; version 3: more references added, JHEP version
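
    For reference, the two ingredients at issue take their standard forms (a sketch in the usual conventions, where $k^\mu$ is an arbitrary null vector, $L$ a lightsheet, and $A(B)$ the area of the surface $B$ generating it):

        T_{\mu\nu} k^\mu k^\nu \ge 0 \quad \text{(NEC)}, \qquad S(L) \le \frac{A(B)}{4 G \hbar} \quad \text{(covariant entropy bound)}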

    BF models, Duality and Bosonization on higher genus surfaces

    The generating functional of two-dimensional BF field theories coupled to fermionic fields and conserved currents is computed in the general case when the base manifold is a genus $g$ compact Riemann surface. The Lagrangian density $L = dB \wedge A$ is written in terms of a globally defined 1-form $A$ and a multi-valued scalar field $B$. Consistency conditions on the periods of $dB$ have to be imposed. It is shown that there exists a non-trivial dependence of the generating functional on the topological restrictions imposed on $B$. In particular, if the periods of the $B$ field are constrained to take values $4\pi n$, with $n$ any integer, then the partition function is independent of the chosen spin structure and may be written as a sum over all the spin structures associated to the fermions, even when one started with a fixed spin structure. These results are then applied to the functional bosonization of fermionic fields on higher genus surfaces. A bosonized form of the partition function which takes care of the chosen spin structure is obtained.
    Comment: 17 pages
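
    In the notation of the abstract, the action and the period constraint can be summarized as follows (our transcription; $\Sigma$ is the genus-$g$ surface and $\gamma_i$ a basis of its homology cycles):

        S = \int_\Sigma dB \wedge A, \qquad \oint_{\gamma_i} dB = 4\pi n_i, \quad n_i \in \mathbb{Z}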

    Physics, Topology, Logic and Computation: A Rosetta Stone

    In physics, Feynman diagrams are used to reason about quantum processes. In the 1980s, it became clear that underlying these diagrams is a powerful analogy between quantum physics and topology: namely, a linear operator behaves very much like a "cobordism". Similar diagrams can be used to reason about logic, where they represent proofs, and computation, where they represent programs. With the rise of interest in quantum cryptography and quantum computation, it became clear that there is an extensive network of analogies between physics, topology, logic and computation. In this expository paper, we make some of these analogies precise using the concept of "closed symmetric monoidal category". We assume no prior knowledge of category theory, proof theory or computer science.
    Comment: 73 pages, 8 encapsulated postscript figures
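
    A minimal code sketch of the analogy (our illustration, not from the paper): in the category of finite-dimensional Hilbert spaces, sequential composition of processes is matrix multiplication and parallel composition is the Kronecker (tensor) product, precisely the two combinators of a symmetric monoidal category.

        import numpy as np

        # Two "processes" (linear operators) on a qubit: morphisms in FdHilb.
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
        X = np.array([[0, 1], [1, 0]])                  # bit flip

        # Sequential composition (apply H, then X): ordinary matrix product.
        seq = X @ H

        # Parallel composition (H on one qubit, X on another): tensor product.
        par = np.kron(H, X)

        print(seq.shape)  # (2, 2): still a one-qubit process
        print(par.shape)  # (4, 4): a two-qubit process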

    Risk factors for primary congenital glaucoma in the National Birth Defects Prevention Study

    Primary congenital glaucoma (PCG) is a rare but serious birth defect. Genetic mutations have been implicated in the development of PCG, but little is known about nongenetic risk factors. This study investigates potential risk factors for PCG in the National Birth Defects Prevention Study (NBDPS), a large population-based case–control study of major birth defects in the United States. The analysis includes case infants with PCG (N = 107) and control infants without birth defects (N = 10,084) enrolled in NBDPS from birth years 2000–2011. Pregnancy/infant clinical characteristics, demographics, and parental health history were collected through maternal interview. Adjusted odds ratios (aORs) and 95% confidence intervals (CIs) were computed to examine associations with all PCG cases and with isolated PCG cases without other major malformations. Associations with all cases included term low birth weight (<2,500 g; aOR = 2.80, CI 1.59–4.94), non-Hispanic black maternal race/ethnicity (aOR = 2.42, CI 1.42–4.13), maternal history of seizure (aOR = 2.73, CI 1.25–5.97), maternal antihypertensive use (aOR = 3.60, CI 1.52–8.53), and maternal sexually transmitted infection (aOR = 2.75, CI 1.17–6.44). These factors were also associated with isolated PCG, as was maternal use of nonsteroidal anti-inflammatory drugs (aOR = 2.70, CI 1.15–6.34). This study is among the first to examine a wide array of potential risk factors for PCG in a population-based sample.
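
    For readers unfamiliar with the methodology, here is a generic logistic-regression sketch of how such adjusted odds ratios and confidence intervals are commonly computed (the file and column names are hypothetical placeholders, not the NBDPS analysis code):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # One row per infant; 'case' is 1 for PCG, 0 for control.
        # File and covariate names below are illustrative placeholders.
        df = pd.read_csv("nbdps_analysis_file.csv")

        model = smf.logit(
            "case ~ term_low_bw + maternal_seizure + antihypertensive_use",
            data=df,
        ).fit()

        aor = np.exp(model.params)      # adjusted odds ratios
        ci = np.exp(model.conf_int())   # 95% confidence intervals
        print(pd.concat([aor, ci], axis=1))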

    Towards Machine Wald

    The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with limited information. Although computers have made possible the numerical evaluation of sophisticated statistical models, these models are still designed by humans because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to think as humans do when faced with uncertainty is challenging in several major ways: (1) the search for optimal statistical models has yet to be formulated as a well-posed problem when information on the system of interest is incomplete and comes in the form of a complex combination of sample data, partial knowledge of constitutive relations, and a limited description of the distribution of input random variables; (2) the space of admissible scenarios, along with the space of relevant information, assumptions, and/or beliefs, tends to be infinite-dimensional, whereas calculus on a computer is necessarily discrete and finite. To this end, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification, and Information-Based Complexity.
    Comment: 37 pages
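
    A toy numerical rendering of the Wald-style framing discussed here (our sketch, assuming finite sets of candidate estimators and admissible scenarios): choose the estimator that minimizes the worst-case expected loss.

        import numpy as np

        rng = np.random.default_rng(0)

        # Rows: candidate estimators/models; columns: admissible scenarios.
        # risk[i, j] = expected loss of estimator i if scenario j is the truth.
        risk = rng.uniform(size=(5, 8))

        worst_case = risk.max(axis=1)      # each estimator's worst scenario
        best = int(np.argmin(worst_case))  # minimax-optimal choice
        print(f"estimator {best}, worst-case risk {worst_case[best]:.3f}")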

    Gamma-Ray Bursts: The Underlying Model

    A pedagogical derivation is presented of the "fireball" model of gamma-ray bursts, according to which the observable effects are due to the dissipation of the kinetic energy of a relativistically expanding wind, a "fireball." The main open questions are emphasized, and key afterglow observations that support this model are briefly discussed. The relativistic outflow is, most likely, driven by the accretion of a fraction of a solar mass onto a newly born black hole of a few solar masses. The observed radiation is produced once the plasma has expanded to a scale much larger than that of the underlying "engine," and is therefore largely independent of the details of the progenitor, whose gravitational collapse leads to fireball formation. Several progenitor scenarios, and the prospects for discriminating among them using future observations, are discussed. The production of high-energy protons and neutrinos in gamma-ray burst fireballs, and the implications of burst neutrino detection by the kilometer-scale telescopes under construction, are briefly discussed.
    Comment: In "Supernovae and Gamma Ray Bursters", ed. K. W. Weiler, Lecture Notes in Physics, Springer-Verlag (in press); 26 pages, 2 figures
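
    A back-of-the-envelope check of the quoted energy source (our numbers; the accreted mass and the conversion efficiency are assumptions for illustration only):

        M_SUN_C2 = 1.8e54   # rest-mass energy of one solar mass, in erg
        accreted = 0.5      # solar masses accreted (assumed)
        efficiency = 0.1    # fraction converted to outflow energy (assumed)

        E = efficiency * accreted * M_SUN_C2
        print(f"E ~ {E:.1e} erg")  # ~9e52 erg of available energy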

    Toward an internally consistent astronomical distance scale

    Accurate astronomical distance determination is crucial for all fields in astrophysics, from Galactic to cosmological scales. Despite, or perhaps because of, significant efforts to determine accurate distances, using a wide range of methods, tracers, and techniques, an internally consistent astronomical distance framework has not yet been established. We review current efforts to homogenize the Local Group's distance framework, with particular emphasis on the potential of RR Lyrae stars as distance indicators, and attempt to extend this in an internally consistent manner to cosmological distances. Calibration based on Type Ia supernovae and distance determinations based on gravitational lensing represent particularly promising approaches. We provide a positive outlook on the improvements to the status quo expected from future surveys, missions, and facilities. Astronomical distance determination has clearly reached maturity and near-consistency.
    Comment: Review article, 59 pages (4 figures); Space Science Reviews, in press (chapter 8 of a special collection resulting from the May 2016 ISSI-BJ workshop on Astronomical Distance Determination in the Space Age)
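
    As one concrete piece of the standard-candle machinery behind such distance frameworks, the distance modulus $\mu = m - M = 5\log_{10}(d/10\,\mathrm{pc})$ can be inverted for the distance (a generic sketch with illustrative RR Lyrae numbers, not taken from the review):

        def distance_pc(m, M):
            """Distance in parsecs from apparent (m) and absolute (M) magnitude."""
            mu = m - M                 # distance modulus
            return 10 ** (mu / 5 + 1)  # inverts mu = 5*log10(d) - 5

        # Illustrative numbers: an RR Lyrae with m = 19.0 and assumed M_V = +0.6.
        print(f"{distance_pc(19.0, 0.6):.2e} pc")  # ~4.8e4 pc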

    Cosmological distance indicators

    We review three distance measurement techniques beyond the local universe: (1) gravitational lens time delays, (2) baryon acoustic oscillations (BAO), and (3) HI intensity mapping. We describe the principles and theory behind each method, the ingredients needed for measuring such distances, the current observational results, and future prospects. Time delays from strongly lensed quasars currently provide constraints on $H_0$ with < 4% uncertainty, with 1% within reach from ongoing surveys and efforts. Recent exciting discoveries of strongly lensed supernovae hold great promise for time-delay cosmography. BAO features have been detected in redshift surveys up to $z \lesssim 0.8$ with galaxies and at $z \sim 2$ with the Ly-$\alpha$ forest, providing precise distance measurements and $H_0$ with < 2% uncertainty in flat $\Lambda$CDM. Future BAO surveys will probe the distance scale with percent-level precision. HI intensity mapping has great potential to map BAO distances at $z \sim 0.8$ and beyond with precisions of a few percent. The years ahead will be exciting as various cosmological probes reach 1% uncertainty in determining $H_0$, allowing an assessment of the current tension in $H_0$ measurements that could indicate new physics.
    Comment: Review article accepted for publication in Space Science Reviews (Springer), 45 pages, 10 figures. Chapter of a special collection resulting from the May 2016 ISSI-BJ workshop on Astronomical Distance Determination in the Space Age
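
    A minimal flat-$\Lambda$CDM sketch (ours, not from the review) of the time-delay distance that anchors the quoted $H_0$ constraints, $D_{\Delta t} = (1 + z_l) D_l D_s / D_{ls}$; every distance scales as $1/H_0$, so $D_{\Delta t}$ does too, which is why measured delays constrain the Hubble constant.

        import numpy as np
        from scipy.integrate import quad

        C = 299792.458  # speed of light, km/s

        def D_C(z, H0=70.0, Om=0.3):
            """Comoving distance in Mpc, flat LambdaCDM (assumed fiducial values)."""
            integrand = lambda zp: 1.0 / np.sqrt(Om * (1 + zp) ** 3 + (1 - Om))
            return (C / H0) * quad(integrand, 0.0, z)[0]

        def D_A(z1, z2, **kw):
            """Angular diameter distance between z1 < z2 in a flat universe."""
            return (D_C(z2, **kw) - D_C(z1, **kw)) / (1 + z2)

        zl, zs = 0.5, 2.0  # illustrative lens and source redshifts
        D_dt = (1 + zl) * D_A(0, zl) * D_A(0, zs) / D_A(zl, zs)
        print(f"D_dt ~ {D_dt:.0f} Mpc")  # scales as 1/H0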

    Transport Properties of the Quark-Gluon Plasma -- A Lattice QCD Perspective

    Transport properties of a thermal medium determine how its conserved charge densities (for instance the electric charge, energy, or momentum) evolve as a function of time and eventually relax back to their equilibrium values. Here the transport properties of the quark-gluon plasma are reviewed from a theoretical perspective. These properties play a key role in the description of heavy-ion collisions and are an important ingredient in constraining particle production processes in the early universe. We place particular emphasis on lattice QCD calculations of conserved current correlators. These Euclidean correlators are related by an integral transform to spectral functions, whose small-frequency form determines the transport properties via Kubo formulae. The universal hydrodynamic predictions for the small-frequency pole structure of spectral functions are summarized. The viability of a quasiparticle description implies the presence of additional characteristic features in the spectral functions. These features are in stark contrast with the functional form found in strongly coupled plasmas via the gauge/gravity duality. A central goal is therefore to determine which of these dynamical regimes the quark-gluon plasma is qualitatively closer to as a function of temperature. We review the analysis of lattice correlators in relation to transport properties, and tentatively estimate the computational effort required to make decisive progress in this field.
    Comment: 54 pages, 37 figures; review written for EPJA and APPN; one paragraph added at the end of section 3.4 and one at the end of section 3.2.2; some references added and other minor changes
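
    The integral transform mentioned above can be sketched directly (our illustration; normalization conventions for $\rho$ vary in the literature): a Euclidean correlator at inverse temperature $\beta = 1/T$ follows from a spectral function via the standard thermal kernel, $G(\tau) = \int_0^\infty \frac{d\omega}{2\pi}\, \rho(\omega)\, \frac{\cosh\omega(\tau - \beta/2)}{\sinh(\omega\beta/2)}$.

        import numpy as np
        from scipy.integrate import quad

        BETA = 1.0  # inverse temperature, 1/T (units set by the lattice scale)

        def rho(omega, eta=0.2, chi=1.0):
            """Toy transport-peak spectral function: Lorentzian centered at omega = 0."""
            return 2.0 * chi * omega * eta / (omega**2 + eta**2)

        def G(tau):
            """Euclidean correlator from rho via the thermal kernel."""
            kernel = lambda w: (rho(w) / (2 * np.pi)
                                * np.cosh(w * (tau - BETA / 2))
                                / np.sinh(w * BETA / 2))
            return quad(kernel, 0.0, np.inf)[0]

        for tau in (0.1, 0.25, 0.5):
            print(f"G({tau}) = {G(tau):.4f}")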