
    A Century of Cosmology

    In the century since Einstein's annus mirabilis of 1905, our concept of the Universe has expanded from Kapteyn's flattened disk of stars only 10 kpc across to an observed horizon about 30 Gpc across that is only a tiny fraction of an immensely large inflated bubble. The expansion of our knowledge about the Universe, both in the types of data and in the sheer quantity of data, has been just as dramatic. This talk will summarize this century of progress and our current understanding of the cosmos. Comment: Talk presented at the "Relativistic Astrophysics and Cosmology - Einstein's Legacy" meeting in Munich, Nov 2005. Proceedings will be published in the Springer-Verlag "ESO Astrophysics Symposia" series. 10 pages LaTeX with 2 figures

    Nucleosynthesis Constraints on Scalar-Tensor Theories of Gravity

    We study the cosmological evolution of massless single-field scalar-tensor theories of gravitation from the time before the onset of e^+e^- annihilation and nucleosynthesis up to the present. The cosmological evolution together with the observational bounds on the abundances of the lightest elements (those mostly produced in the early universe) place constraints on the coefficients of the Taylor series expansion of a(\phi), which specifies the coupling of the scalar field to matter and is the only free function in the theory. In the case when a(\phi) has a minimum (i.e., when the theory evolves towards general relativity) these constraints translate into a stronger limit on the post-Newtonian parameters \gamma and \beta than any other observational test. Moreover, our bounds imply that, even at the epoch of annihilation and nucleosynthesis, the evolution of the universe must be very close to that predicted by general relativity if we do not want to over- or underproduce ^{4}He. Thus the amount of scalar-field contribution to gravity is very small even at such an early epoch. Comment: 15 pages, 2 figures, ReVTeX 3.1, submitted to Phys. Rev. D1
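
    For orientation, the coefficients constrained above are usually defined through the standard expansion of the matter coupling around the present value of the field; the parametrization and post-Newtonian relations below follow the usual Damour-Esposito-Farese conventions and are quoted here as background, not verbatim from this paper:

        a(\phi) = a(\phi_0) + \alpha_0 (\phi - \phi_0) + \tfrac{1}{2} \beta_0 (\phi - \phi_0)^2 + \dots , \qquad \alpha(\phi) \equiv \partial a / \partial \phi ,

        \gamma - 1 = -\frac{2 \alpha_0^2}{1 + \alpha_0^2} , \qquad \beta - 1 = \frac{1}{2} \frac{\beta_0 \alpha_0^2}{(1 + \alpha_0^2)^2} ,

    so a theory in which a(\phi) sits near a minimum (\alpha_0 -> 0) is driven towards \gamma = \beta = 1, i.e. towards general relativity, which is why a minimum in a(\phi) yields the strong limits described above.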

    How much will we learn from the CMB?

    The purpose of this article is to give a brief account of what we hope to learn from the future CMB experiments, essentially from the point of view of primordial cosmology. After recalling what we have already learnt, the principles of parameter extraction from the data are summarized. The discussion is then devoted to the information we could gain about the early universe, in the framework of the inflationary scenario, or in more exotic scenarios like brane cosmology. Comment: Invited Talk at "The Early Universe and Cosmological Observations: a Critical Review", UCT, Cape Town, July 2001; to appear in Class. Quant. Grav.
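
    As an illustration of the parameter-extraction principles mentioned above, the snippet below sketches a toy Fisher-matrix forecast for two parameters of a simple angular power spectrum; the model, noise level, and multipole range are invented for illustration and are not taken from the article.

        # Toy Fisher-matrix forecast: how well could an idealized CMB-like experiment
        # constrain two parameters of a simple angular power spectrum model?
        # All numbers here are illustrative, not from the article.
        import numpy as np

        ells = np.arange(2, 1500)

        def cl_model(amplitude, tilt, ell0=200.0):
            """Toy angular power spectrum: a power law in multipole."""
            return amplitude * (ells / ell0) ** (tilt - 1.0)

        def fisher(params, noise=1e-4, eps=1e-6):
            """Fisher matrix from numerical derivatives of C_ell w.r.t. the parameters."""
            cl0 = cl_model(*params)
            derivs = []
            for i in range(len(params)):
                p = list(params)
                p[i] += eps
                derivs.append((cl_model(*p) - cl0) / eps)
            # Per-multipole variance: cosmic variance plus a flat noise term.
            cov = 2.0 * (cl0 + noise) ** 2 / (2.0 * ells + 1.0)
            return np.array([[np.sum(di * dj / cov) for dj in derivs] for di in derivs])

        F = fisher([1.0, 0.96])
        sigmas = np.sqrt(np.diag(np.linalg.inv(F)))
        print("forecast 1-sigma errors on (amplitude, tilt):", sigmas)

    The inverse of the Fisher matrix gives the best achievable parameter covariance for the assumed experiment, which is the sense in which forecasts of "how much we will learn" are usually made.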

    The clustering of galaxies in the SDSS-III Baryon Oscillation Spectroscopic Survey: constraints on the time variation of fundamental constants from the large-scale two-point correlation function

    We obtain constraints on the variation of the fundamental constants from the full shape of the redshift-space correlation function of a sample of luminous galaxies drawn from the Data Release 9 of the Baryon Oscillation Spectroscopic Survey. We combine this information with data from recent CMB, BAO and H_0 measurements. We focus on possible variations of the fine structure constant \alpha and the electron mass m_e in the early universe, and study the degeneracies between these constants and other cosmological parameters, such as the dark energy equation of state parameter w_DE, the massive neutrino fraction f_\nu, the effective number of relativistic species N_eff, and the primordial helium abundance Y_He. When only one of the fundamental constants is varied, our final bounds are \alpha / \alpha_0 = 0.9957_{-0.0042}^{+0.0041} and m_e /(m_e)_0 = 1.006_{-0.013}^{+0.014}. For their joint variation, our results are \alpha / \alpha_0 = 0.9901_{-0.0054}^{+0.0055} and m_e /(m_e)_0 = 1.028 +/- 0.019. Although when m_e is allowed to vary our constraints on w_DE are consistent with a cosmological constant, when \alpha is treated as a free parameter we find w_DE = -1.20 +/- 0.13, more than 1 \sigma away from its standard value. When f_\nu and \alpha are allowed to vary simultaneously, we find f_\nu < 0.043 (95% CL), implying a limit of \sum m_\nu < 0.46 eV (95% CL), while for m_e variation, we obtain f_\nu < 0.086 (95% CL), which implies \sum m_\nu < 1.1 eV (95% CL). When N_eff or Y_He are considered as free parameters, their simultaneous variation with \alpha provides constraints close to their standard values (when the H_0 prior is not included in the analysis), while when m_e is allowed to vary, their preferred values are significantly higher. In all cases, our results are consistent with no variation of \alpha or m_e at the 1 or 2 \sigma level. Comment: 18 pages, 16 figures. Submitted to MNRAS
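
    As a rough check of the neutrino-mass numbers quoted above, a bound on the neutrino fraction f_\nu = \Omega_\nu / \Omega_m can be converted to a bound on \sum m_\nu with the standard relation \Omega_\nu h^2 \simeq \sum m_\nu / 93 eV. The sketch below assumes an illustrative value of \Omega_m h^2 (not a value quoted in the abstract):

        # Convert an upper bound on the neutrino mass fraction f_nu = Omega_nu / Omega_m
        # into a bound on the summed neutrino mass, using Omega_nu * h^2 ~= sum(m_nu) / 93 eV.
        # The value of Omega_m * h^2 below is an illustrative assumption, not from the paper.

        def sum_mnu_bound(f_nu_max, omega_m_h2=0.12):
            """Return the implied upper bound on sum(m_nu) in eV."""
            return 93.0 * f_nu_max * omega_m_h2

        for f_nu_max in (0.043, 0.086):
            print(f"f_nu < {f_nu_max:.3f}  ->  sum(m_nu) < {sum_mnu_bound(f_nu_max):.2f} eV")

    With this assumed \Omega_m h^2, f_\nu < 0.043 gives roughly 0.5 eV and f_\nu < 0.086 roughly 1.0 eV, in line with the 0.46 eV and 1.1 eV limits quoted in the abstract; the exact numbers depend on the \Omega_m h^2 preferred by the fit.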

    The Beginning and Evolution of the Universe

    We review the current standard model for the evolution of the Universe from an early inflationary epoch to the complex hierarchy of structure seen today. We summarize and provide key references for the following topics: observations of the expanding Universe; the hot early Universe and nucleosynthesis; theory and observations of the cosmic microwave background; Big Bang cosmology; inflation; dark matter and dark energy; theory of structure formation; the cold dark matter model; galaxy formation; cosmological simulations; observations of galaxies, clusters, and quasars; statistical measures of large-scale structure; and measurement of cosmological parameters. We conclude with discussion of some open questions in cosmology. This review is designed to provide a graduate student or other new worker in the field an introduction to the cosmological literature. Comment: 69 pages. Invited review article for Publications of the Astronomical Society of the Pacific. Supplementary references, tables, and more concise PDF file at http://www.physics.drexel.edu/univers

    Can the Universe Create Itself?

    The question of first cause has troubled philosophers and cosmologists alike. Now that it is apparent that our universe began in a Big Bang explosion, the question of what happened before the Big Bang arises. Inflation seems like a very promising answer, but as Borde and Vilenkin have shown, the inflationary state preceding the Big Bang must have had a beginning also. Ultimately, the difficult question seems to be how to make something out of nothing. This paper explores the idea that this is the wrong question --- that that is not how the Universe got here. Instead, we explore whether there is anything in the laws of physics that would prevent the Universe from creating itself. Because spacetimes can be curved and multiply connected, general relativity allows for the possibility of closed timelike curves (CTCs). Thus, tracing backwards in time through the original inflationary state, we may eventually encounter a region of CTCs, giving no first cause. This region of CTCs may well be over by now (being bounded toward the future by a Cauchy horizon). We illustrate that such models --- with CTCs --- are not necessarily inconsistent by demonstrating self-consistent vacua for Misner space and a multiply connected de Sitter space in which the renormalized energy-momentum tensor does not diverge as one approaches the Cauchy horizon and solves Einstein's equations. We show such a Universe can be classically stable and self-consistent if and only if the potentials are retarded, giving a natural explanation of the arrow of time. Some specific scenarios (out of many possible ones) for this type of model are described. For example: an inflationary universe gives rise to baby universes, one of which turns out to be itself. Interestingly, the laws of physics may allow the Universe to be its own mother. Comment: 48 pages, 8 figures

    The microwave background temperature at the redshift of 2.33771

    The Cosmic Microwave Background radiation is a fundamental prediction of Hot Big Bang cosmology. The temperature of its black-body spectrum has been measured at the present time, T_{\rm CMBR,0} = 2.726 \pm 0.010 K, and is predicted to have been higher in the past. At earlier times, the temperature can be measured, in principle, using the excitation of atomic fine-structure levels by the radiation field. All previous measurements however give only upper limits, as they assume that no other significant source of excitation is present. Here we report the detection of absorption from the first and second fine-structure levels of neutral carbon atoms in an isolated remote cloud at a redshift of 2.33771. In addition, the unusual detection of molecular hydrogen in several rotational levels and the presence of ionized carbon in its excited fine-structure level make the absorption system unique for constraining, directly from observation, the different excitation processes at play. It is shown for the first time that the cosmic radiation was warmer in the past. We find 6.0 K < T_{\rm CMBR} < 14 K at z = 2.33771, when 9.1 K is expected in Hot Big Bang cosmology. Comment: 20 pages, 5 figures, accepted for publication in Nature, press embargo until 1900 hrs London time (GMT) on 20 Dec 2000
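
    The expected value quoted above follows directly from the adiabatic scaling of the black-body temperature with redshift; as a one-line check using the present-day temperature given in the abstract,

        T_{\rm CMBR}(z) = T_{\rm CMBR,0} (1 + z) = 2.726 \ {\rm K} \times (1 + 2.33771) \simeq 9.1 \ {\rm K},

    which indeed lies inside the reported range 6.0 K < T_{\rm CMBR} < 14 K.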

    “Excellence R Us”: university research and the fetishisation of excellence

    The rhetoric of “excellence” is pervasive across the academy. It is used to refer to research outputs as well as researchers, theory and education, individuals and organisations, from art history to zoology. But does “excellence” actually mean anything? Does this pervasive narrative of “excellence” do any good? Drawing on a range of sources, we interrogate “excellence” as a concept and find that it has no intrinsic meaning in academia. Rather it functions as a linguistic interchange mechanism. To investigate whether this linguistic function is useful, we examine how the rhetoric of excellence combines with narratives of scarcity and competition to show that the hypercompetition that arises from the performance of “excellence” is completely at odds with the qualities of good research. We trace the roots of issues in reproducibility, fraud, and homophily to this rhetoric. But we also show that this rhetoric is an internal, and not primarily an external, imposition. We conclude by proposing an alternative rhetoric based on soundness and capacity-building. In the final analysis, it turns out that “excellence” is not excellent. Used in its current unqualified form, it is a pernicious and dangerous rhetoric that undermines the very foundations of good research and scholarship.

    Stellar structure and compact objects before 1940: Towards relativistic astrophysics

    Since the mid-1920s, different strands of research have used stars as "physics laboratories" for investigating the nature of matter under extreme densities and pressures, impossible to realize on Earth. To trace this process, this paper follows the evolution of the concept of a dense core in stars, which was important both for an understanding of stellar evolution and as a testing ground for the fast-evolving field of nuclear physics. In spite of the divide between physicists and astrophysicists, some key actors working in the cross-fertilized soil of overlapping but different scientific cultures formulated models and tentative theories that gradually evolved into more realistic and structured astrophysical objects. These investigations culminated in the first contact with general relativity in 1939, when J. Robert Oppenheimer and his students George Volkoff and Hartland Snyder systematically applied the theory to the dense core of a collapsing neutron star. This pioneering application of Einstein's theory to an astrophysical compact object can be regarded as a milestone on the path that eventually led to the emergence of relativistic astrophysics in the early 1960s. Comment: 83 pages, 4 figures, submitted to the European Physical Journal