
    Cosmic distance-duality as probe of exotic physics and acceleration

    In cosmology, distances based on standard candles (e.g. supernovae) and standard rulers (e.g. baryon oscillations) agree as long as three conditions are met: (1) photon number is conserved, (2) gravity is described by a metric theory, and (3) photons travel on unique null geodesics. This is the content of distance-duality (the reciprocity relation), which can be violated by exotic physics. Here we analyse the implications of the latest cosmological data sets for distance-duality. While the data sets are broadly in agreement and confirm acceleration, we find a 2-sigma violation caused by excess brightening of SN-Ia at z > 0.5, perhaps due to lensing magnification bias. This brightening has been interpreted as evidence for a late-time transition in the dark energy, but because it is not seen in the d_A data we argue against such an interpretation. Our results do, however, rule out significant SN-Ia evolution and extinction: the "replenishing" grey-dust model with no cosmic acceleration is excluded at more than 4-sigma despite being the best fit to the SN-Ia data alone, thereby illustrating the power of distance-duality even with current data sets. Comment: 6 pages, 4 colour figures. Version accepted as a Rapid Communication in PR
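
    For reference, the duality test rests on the reciprocity relation; a standard way to write it, together with a commonly used violation parameter (this parametrisation is given for orientation and is not necessarily the one adopted in the paper), is

        d_L(z) = (1+z)^2 \, d_A(z), \qquad
        \eta(z) \equiv \frac{d_L(z)}{(1+z)^2 \, d_A(z)},

    so that $\eta(z) = 1$ whenever photon number is conserved and photons travel on unique null geodesics of a metric theory; standard-candle ($d_L$) and standard-ruler ($d_A$) data then probe departures of $\eta$ from unity.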

    Theory of Stellar Oscillations

    In recent years, astronomers have witnessed major progress in the field of stellar physics. This was made possible by the combination of a solid theoretical understanding of the phenomenon of stellar pulsations and the availability of a tremendous amount of exquisite space-based asteroseismic data. In this context, this chapter reviews the basic theory of stellar pulsations, considering small, adiabatic perturbations to a static, spherically symmetric equilibrium. It starts with a brief discussion of the solar oscillation spectrum, followed by the setting of the theoretical problem, including the presentation of the equations of hydrodynamics, their perturbation, and a discussion of the functional form of the solutions. Emphasis is placed on the physical properties of the different types of modes, in particular acoustic (p-) and gravity (g-) modes and their propagation cavities. The surface (f-) mode solutions are also discussed. While not attempting to be comprehensive, it is hoped that the summary presented in this chapter addresses the most important theoretical aspects required for a solid start in stellar pulsations research. Comment: Lecture presented at the IVth Azores International Advanced School in Space Sciences on "Asteroseismology and Exoplanets: Listening to the Stars and Searching for New Worlds" (arXiv:1709.00645), which took place in Horta, Azores Islands, Portugal in July 201
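
    As a pointer to the kind of result such a treatment yields, the propagation cavities of p- and g-modes follow from two characteristic frequencies (standard definitions from linear adiabatic pulsation theory, quoted here for orientation rather than taken from this chapter):

        S_l^2 = \frac{l(l+1)\, c_s^2}{r^2}, \qquad
        N^2 = g\left(\frac{1}{\Gamma_1}\frac{\mathrm{d}\ln p}{\mathrm{d}r} - \frac{\mathrm{d}\ln\rho}{\mathrm{d}r}\right),

    with modes propagating as acoustic (p-) waves where $\omega^2 > S_l^2, N^2$, as gravity (g-) waves where $\omega^2 < S_l^2, N^2$, and being evanescent elsewhere.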

    Influence of indomethacin on lens regeneration in the newt notophthalmus viridescens

    Following lentectomy, newts were injected with indomethacin in a variety of carrier solutions at doses ranging from 1.2 to 120 mg/kg body weight every other day for 15–17 days. The results show that injection of this drug according to the regimen used has no significant effect on regeneration of the lens. The data suggest, but do not prove, that prostaglandins may not play a major role in the early phases of lens regeneration in the newt. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/47503/1/427_2004_Article_BF00848434.pd

    Understanding Galaxy Formation and Evolution

    The old dream of integrating the study of the microcosmos and the macrocosmos into one is now a reality. Cosmology, astrophysics, and particle physics intersect in a scenario (but still not a theory) of cosmic structure formation and evolution called the Lambda Cold Dark Matter (LCDM) model. This scenario emerged mainly to explain the origin of galaxies. In these lecture notes, I first present a review of the main galaxy properties, highlighting the questions that any theory of galaxy formation should explain. Then, the cosmological framework and the main aspects of primordial perturbation generation and evolution are laid out pedagogically. Next, I focus on the "dark side" of galaxy formation, presenting a review of LCDM halo assembly and properties, and of the main candidates for non-baryonic dark matter. It is shown how the nature of the elementary particles can influence the features of galaxies and their systems. Finally, the complex processes of baryon dissipation inside the non-linearly evolving CDM halos, the formation of disks and spheroids, and the transformation of gas into stars are briefly described, remarking on the possibility that a few driving factors and parameters may be able to explain the main body of galaxy properties. A summary and a discussion of some of the issues and open problems of the LCDM paradigm are given in the final part of these notes. Comment: 50 pages, 10 low-resolution figures (for normal-resolution, DOWNLOAD THE PAPER (PDF, 1.9 Mb) FROM http://www.astroscu.unam.mx/~avila/avila.pdf). Lectures given at the IV Mexican School of Astrophysics, July 18-25, 2005 (submitted to the Editors on March 15, 2006
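
    As a small, hedged illustration of the linear perturbation evolution mentioned above (not code from the lecture notes), the following sketch evaluates the standard integral solution for the linear growth factor in a flat LCDM background; the parameter values are placeholders:

        # Hedged sketch: linear growth factor D(a) in flat LCDM (matter + Lambda only),
        # using the standard integral solution; Omega_m = 0.3 is illustrative.
        import numpy as np
        from scipy.integrate import quad

        def E(a, Om):
            """Dimensionless Hubble rate H(a)/H0 for flat LCDM, ignoring radiation."""
            return np.sqrt(Om / a**3 + (1.0 - Om))

        def growth_factor(a, Om=0.3):
            """Unnormalized D(a) = (5 Om / 2) E(a) * int_0^a da' / (a' E(a'))^3."""
            integral, _ = quad(lambda ap: 1.0 / (ap * E(ap, Om))**3, 1e-8, a)
            return 2.5 * Om * E(a, Om) * integral

        # Normalize so D(a=1) = 1 and show the late-time suppression of growth
        D0 = growth_factor(1.0)
        for a in (0.1, 0.5, 1.0):
            print(f"a = {a:.1f}: D/D0 = {growth_factor(a) / D0:.3f}")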

    New constraints on H_0 and Omega_M from SZE/X-RAY data and Baryon Acoustic Oscillations

    The Hubble constant, $H_0$, sets the scale of the size and age of the Universe, and its determination from independent methods is still worth investigating. In this article, by using the Sunyaev-Zel'dovich effect and X-ray surface brightness data from 38 galaxy clusters observed by Bonamente et al. (2006), we obtain a new estimate of $H_0$ in the context of a flat $\Lambda$CDM model. There is a degeneracy with the mass density parameter ($\Omega_m$) which is broken by applying a joint analysis involving the baryon acoustic oscillations (BAO) as given by the Sloan Digital Sky Survey (SDSS). This happens because the BAO signature does not depend on $H_0$. Our basic finding is that a joint analysis involving these tests yields $H_0 = 0.765^{+0.035}_{-0.033}$ km s$^{-1}$ Mpc$^{-1}$ and $\Omega_m = 0.27^{+0.03}_{-0.02}$. Since the hypothesis of spherical geometry assumed by Bonamente et al. is questionable, we have also compared the above results to a recent work where a sample of triaxial galaxy clusters has been considered. Comment: 8 pages, 4 figures, 1 table, accepted version in the general relativity and gravitatio
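
    A minimal sketch of how such a joint analysis breaks the ($H_0$, $\Omega_m$) degeneracy is given below; the cluster distances, errors, and the BAO prior are invented placeholders, not the Bonamente et al. (2006) sample or the SDSS measurement used in the paper:

        # Hypothetical sketch: combining SZE/X-ray angular diameter distances with an
        # H0-independent BAO prior on Omega_m over a simple chi-square grid.
        import numpy as np
        from scipy.integrate import quad

        C = 299792.458  # speed of light in km/s

        def d_A(z, H0, Om):
            """Angular diameter distance (Mpc) in a flat LCDM model."""
            E = lambda zp: np.sqrt(Om * (1 + zp)**3 + (1 - Om))
            comoving, _ = quad(lambda zp: 1.0 / E(zp), 0.0, z)
            return (C / H0) * comoving / (1 + z)

        # Placeholder cluster data: redshifts, SZE/X-ray distances and errors (Mpc)
        z_cl   = np.array([0.14, 0.30, 0.45])
        dA_obs = np.array([500.0, 900.0, 1150.0])
        dA_err = np.array([60.0, 100.0, 130.0])

        def chi2_sze(H0, Om):
            model = np.array([d_A(z, H0, Om) for z in z_cl])
            return np.sum(((model - dA_obs) / dA_err) ** 2)

        def chi2_bao(Om):
            # Illustrative BAO-style prior on Omega_m, independent of H0
            return ((Om - 0.27) / 0.03) ** 2

        # Joint grid: the BAO term pins Omega_m, so the SZE/X-ray term constrains H0
        H0_grid = np.linspace(60, 90, 61)
        Om_grid = np.linspace(0.1, 0.5, 41)
        chi2 = np.array([[chi2_sze(h, om) + chi2_bao(om) for om in Om_grid] for h in H0_grid])
        ih, iom = np.unravel_index(np.argmin(chi2), chi2.shape)
        print(f"best fit: H0 ~ {H0_grid[ih]:.1f} km/s/Mpc, Omega_m ~ {Om_grid[iom]:.2f}")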

    Testing dark energy beyond the cosmological constant barrier

    Although well motivated by theoretical arguments, the cosmological constant barrier, i.e., the requirement that the equation-of-state parameter of dark energy ($\omega_x \equiv p_x/\rho_x$) be $\geq -1$, seems to introduce bias in parameter determinations from statistical analyses of observational data. In this regard, phantom dark energy, or superquintessence, has been proposed, in which the usual imposition $\omega \geq -1$ is relaxed. Here, we study possible observational limits to the phantom behavior of dark energy from recent distance estimates of galaxy clusters obtained from interferometric measurements of the Sunyaev-Zel'dovich effect/X-ray observations, Type Ia supernova data and CMB measurements. We find that there is considerable observationally acceptable parameter space beyond the $\Lambda$ barrier, thus opening up the possibility that more exotic forms of energy exist in the Universe. Comment: 5 pages, 5 figures, to appear in Phys. Rev.
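
    For orientation, in a flat universe with a constant equation of state the dark-energy component modifies the expansion rate as (standard expression; the paper's parametrisation may differ in detail)

        H^2(z) = H_0^2\left[\Omega_m (1+z)^3 + (1-\Omega_m)\,(1+z)^{3(1+\omega_x)}\right],

    so relaxing the prior to allow $\omega_x < -1$ (the phantom regime) changes the predicted distances that enter the SZE/X-ray, SN Ia and CMB fits.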

    Constraints on cosmological models from strong gravitational lensing systems

    Strong lensing has developed into an important astrophysical tool for probing both cosmology and galaxies (their structure, formation, and evolution). Using gravitational lensing theory and cluster mass distribution models, we attempt to assemble a relatively complete set of observational data on the Hubble-constant-independent ratio between two angular diameter distances, $D_{ds}/D_s$, from various large systematic gravitational lens surveys and from lensing by galaxy clusters combined with X-ray observations, and we assess the possibility of using it in the future as a complement to other cosmological probes. On one hand, strongly gravitationally lensed quasar-galaxy systems offer such a new opportunity by combining stellar kinematics (central velocity dispersion measurements) with lensing geometry (Einstein radius determination from the positions of images). We apply such a method to a combined gravitational lens data set including 70 data points from the Sloan Lens ACS (SLACS) and Lens Structure and Dynamics (LSD) surveys. On the other hand, a new sample of 10 lensing galaxy clusters with redshifts ranging from 0.1 to 0.6, carefully selected from strong gravitational lensing systems with both X-ray satellite observations and optical giant luminous arcs, is also used to constrain three dark energy models ($\Lambda$CDM, constant $w$ and CPL) under a flat universe assumption. For the full sample ($n=80$) and the restricted sample ($n=46$) including 36 two-image lenses and 10 strong lensing arcs, we obtain relatively good fits for the basic cosmological parameters, which generally agree with results already known in the literature. This result encourages further development of the method and its use on larger samples obtained in the future. Comment: 22 pages, 5 figures, 2 tables; accepted by JCA
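
    A worked form of the observable used in this kind of analysis, assuming a singular isothermal sphere for the lens (a common simplification; the paper's lens models may be more elaborate), is

        \theta_E = 4\pi\,\frac{\sigma_{SIS}^2}{c^2}\,\frac{D_{ds}}{D_s}
        \quad\Longrightarrow\quad
        \left(\frac{D_{ds}}{D_s}\right)^{\rm obs} = \frac{c^2\,\theta_E}{4\pi\,\sigma_{SIS}^2},

    while in a flat FRW model the predicted ratio, $D_{ds}/D_s = \int_{z_d}^{z_s} dz/E(z) \big/ \int_{0}^{z_s} dz/E(z)$, depends on the dark-energy parameters but not on $H_0$, which is what makes the ratio a useful cosmological probe.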

    A method for comparing multiple imputation techniques: A case study on the U.S. national COVID cohort collaborative

    Healthcare datasets obtained from Electronic Health Records have proven to be extremely useful for assessing associations between patients’ predictors and outcomes of interest. However, these datasets often suffer from missing values in a high proportion of cases, whose removal may introduce severe bias. Several multiple imputation algorithms have been proposed to attempt to recover the missing information under an assumed missingness mechanism. Each algorithm presents strengths and weaknesses, and there is currently no consensus on which multiple imputation algorithm works best in a given scenario. Furthermore, the selection of each algorithm's parameters and of data-related modeling choices is also both crucial and challenging. In this paper we propose a novel framework to numerically evaluate strategies for handling missing data in the context of statistical analysis, with a particular focus on multiple imputation techniques. We demonstrate the feasibility of our approach on a large cohort of type-2 diabetes patients provided by the National COVID Cohort Collaborative (N3C) Enclave, where we explored the influence of various patient characteristics on outcomes related to COVID-19. Our analysis included classic multiple imputation techniques as well as simple complete-case Inverse Probability Weighted models. Extensive experiments show that our approach can effectively highlight the most promising and performant missing-data handling strategy for our case study. Moreover, our methodology allowed a better understanding of the behavior of the different models and of how this behavior changed as we modified their parameters. Our method is general and can be applied to different research fields and to datasets containing heterogeneous types
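
    A minimal sketch of the kind of evaluation loop described above (not the paper's pipeline; the strategies, toy data, and target estimand are hypothetical stand-ins) could look like this:

        # Hedged sketch: score several missing-data strategies by injecting extra
        # missingness, imputing, and comparing a downstream coefficient with the
        # complete-data reference (smaller difference = better recovery).
        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import SimpleImputer, IterativeImputer
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)

        def downstream_coef(X, y):
            """Coefficient of the first predictor in a logistic model (target estimand)."""
            return LogisticRegression(max_iter=1000).fit(X, y).coef_[0, 0]

        def evaluate(X_complete, y, missing_frac=0.2):
            ref = downstream_coef(X_complete, y)
            # Inject missingness completely at random (MCAR) into a copy of the data
            X_missing = X_complete.copy()
            X_missing[rng.random(X_missing.shape) < missing_frac] = np.nan

            strategies = {
                "mean": SimpleImputer(strategy="mean"),
                "iterative": IterativeImputer(random_state=0),
            }
            scores = {}
            for name, imputer in strategies.items():
                X_imp = imputer.fit_transform(X_missing)
                scores[name] = abs(downstream_coef(X_imp, y) - ref)
            # Complete-case analysis: drop any row with a missing value
            keep = ~np.isnan(X_missing).any(axis=1)
            scores["complete-case"] = abs(downstream_coef(X_missing[keep], y[keep]) - ref)
            return scores

        # Toy data standing in for patient covariates and a binary outcome
        X = rng.normal(size=(500, 4))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500) > 0).astype(int)
        print(evaluate(X, y))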
