
    Bounds from Primordial Black Holes with a Near Critical Collapse Initial Mass Function

    Recent numerical evidence suggests that a mass spectrum of primordial black holes (PBHs) is produced as a consequence of near-critical gravitational collapse. Assuming that these holes formed from the initial density perturbations seeded by inflation, we calculate model-independent upper bounds on the mass variance at the reheating temperature by requiring that the mass density not exceed the critical density and that the photon emission not exceed current diffuse gamma-ray measurements. We then translate these results into bounds on the spectral index n by utilizing the COBE data to normalize the mass variance at large scales, assuming a constant power law, then scaling this result to the reheating temperature. We find that our bounds on n differ substantially (δn > 0.05) from those calculated using initial mass functions derived under the assumption that the black hole mass is proportional to the horizon mass at the collapse epoch. We also find a change in the shape of the diffuse gamma-ray spectrum which results from the Hawking radiation. Finally, we study the impact of a nonzero cosmological constant and find that the bounds on n are strengthened considerably if the universe is indeed vacuum-energy dominated today. Comment: 24 pages, REVTeX, 5 figures; minor typos fixed, two refs added, version to be published in PR
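For orientation, the near-critical collapse mass spectrum referred to above follows the well-known Choptuik-type scaling law; the exponent given here is the standard radiation-era value from the critical-collapse literature, not a number taken from this abstract:

```latex
M \;=\; K \, M_H \, (\delta - \delta_c)^{\gamma}, \qquad \gamma \simeq 0.36,
```

where $M_H$ is the horizon mass at the collapse epoch, $\delta$ the density contrast, $\delta_c$ the collapse threshold, and $K$ an order-unity constant. It is this power-law dependence on $(\delta - \delta_c)$, rather than the assumption $M \propto M_H$, that broadens the initial mass function and shifts the bounds on n.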

    First Stars. I. Evolution without mass loss

    The first generation of stars was formed from primordial gas. Numerical simulations suggest that the first stars were predominantly very massive, with typical masses M > 100 M⊙. These stars were responsible for the reionization of the universe, the initial enrichment of the intergalactic medium with heavy elements, and other cosmological consequences. In this work, we study the structure of Zero Age Main Sequence stars for a wide mass and metallicity range and the evolution of 100, 150, 200, 250 and 300 M⊙ galactic and pregalactic Pop III very massive stars without mass loss, with metallicities Z = 10^-6 and 10^-9, respectively. Using a stellar evolution code, a system of 10 equations together with boundary conditions is solved simultaneously. For the change of chemical composition, which determines the evolution of a star, a diffusion treatment for convection and semiconvection is used. A set of 30 nuclear reactions is solved simultaneously with the stellar structure and evolution equations. Several results on the main sequence, and during the hydrogen and helium burning phases, are described. Low-metallicity massive stars are hotter, more compact and more luminous than their metal-enriched counterparts. Due to their high temperatures, pregalactic stars activate the triple-alpha reaction sooner, self-producing their own heavy elements. Both galactic and pregalactic stars are radiation-pressure dominated and evolve below the Eddington luminosity limit with short lifetimes. The physical characteristics of the first stars have an important influence on predictions of the ionizing photon yields from the first luminous objects; they also develop large convective cores with substantial helium core masses, which are important for explosion calculations. Comment: 17 pages, 24 figures, 2 tables

    Observational Constraints of Modified Chaplygin Gas in Loop Quantum Cosmology

    We have considered the FRW universe in a loop quantum cosmology (LQC) model filled with dark matter (a perfect fluid with negligible pressure) and modified Chaplygin gas (MCG) type dark energy. We present the Hubble parameter in terms of the observable parameters Ω_m0, Ω_x0 and H_0, the redshift z, and the other parameters A, B, C and α. From the Stern data set (12 points), we have obtained bounds on the arbitrary parameters by minimizing the χ² test. The best-fit values of the parameters are obtained at the 66%, 90% and 99% confidence levels. Next, from a joint analysis with BAO and CMB observations, we have also obtained bounds on the parameters (B, C) by fixing the other parameters α and A. From the best fit of the distance modulus μ(z) for our theoretical MCG model in LQC, we conclude that our model is in agreement with the Union2 sample data. Comment: 14 pages, 10 figures, accepted in EPJC. arXiv admin note: text overlap with arXiv:astro-ph/0311622 by other author
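The χ² minimization over H(z) data described above can be sketched in a few lines. This is a toy illustration only: it fits a simple flat-ΛCDM form for H(z) (standing in for the MCG-in-LQC expression, which the abstract does not spell out) to synthetic mock points rather than the actual 12-point Stern set.

```python
import math

def hubble(z, h0, om):
    # Flat-LCDM Hubble parameter, used here as a stand-in for the MCG model.
    return h0 * math.sqrt(om * (1.0 + z) ** 3 + (1.0 - om))

# Mock H(z) points: (z, H [km/s/Mpc], sigma_H). Synthetic, not the Stern set.
data = [(z, hubble(z, 70.0, 0.3), 5.0) for z in (0.1, 0.4, 0.9, 1.3, 1.75)]

def chi2(h0, om):
    # Standard chi-square: sum of squared, error-weighted residuals.
    return sum(((h - hubble(z, h0, om)) / s) ** 2 for z, h, s in data)

# Brute-force grid minimisation of chi^2 over (H0, Omega_m0).
best_chi2, best_h0, best_om = min(
    (chi2(60.0 + 0.1 * i, 0.1 + 0.002 * j), 60.0 + 0.1 * i, 0.1 + 0.002 * j)
    for i in range(201) for j in range(201)
)
```

Because the mock data are noiseless, the grid minimum recovers the input parameters (H0 = 70, Ω_m0 = 0.3); with real data one would also trace out the 66%, 90% and 99% confidence contours from Δχ² around this minimum.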

    Regulatory objectivity in action: Mild cognitive impairment and the collective production of uncertainty

    In this paper, we investigate recent changes in the definition of and approach to Alzheimer's disease brought about by growing clinical, therapeutic and regulatory interest in the prodromal or preclinical aspects of this condition. In the last decade, there has been increased interest in the biomolecular and epidemiological characterization of pre-clinical dementia. It is argued that early diagnosis of dementia, and particularly of Alzheimer's disease, will facilitate the prevention of dementing processes and lower the prevalence of the condition in the general population. The search for a diagnostic category or biomarker that would serve this purpose is an ongoing but problematic endeavour for research and clinical communities in this area. In this paper, we explore how clinical and research actors, in collaboration with regulatory institutions and pharmaceutical companies, come to frame these domains as uncertainties and how they re-deploy uncertainty in the 'collective production' of new diagnostic conventions and bioclinical standards. While drawing as background on ethnographic, documentary and interview data, the paper proposes an in-depth, contextual analysis of the proceedings of an international meeting organized by the Peripheral and Central Nervous System Drug Advisory Committee of the US Food and Drug Administration to discuss whether or not a particular diagnostic convention — mild cognitive impairment — exists and how best it ought to be studied. Based on this analysis, we argue that the deployment of uncertainty is reflexively implicated in bioclinical collectives' search for rules and conventions, and furthermore that the collective production of uncertainty is central to the 'knowledge machinery' of regulatory objectivity.

    Important marine areas for the conservation of northern rockhopper penguins within the Tristan da Cunha Exclusive Economic Zone

    The designation of Marine Protected Areas has become an important approach to conserving marine ecosystems that relies on robust information on the spatial distribution of biodiversity. We used GPS tracking data to identify marine Important Bird and Biodiversity Areas (IBAs) for the endangered northern rockhopper penguin Eudyptes moseleyi within the Exclusive Economic Zone (EEZ) of Tristan da Cunha in the South Atlantic. Penguins were tracked throughout their breeding season from 3 of the 4 main islands in the Tristan da Cunha group. Foraging trips remained largely within the EEZ, with the exception of those from Gough Island during the incubation stage. We found substantial variability in trip duration and foraging range among breeding stages and islands, consistent use of areas among years and spatial segregation of the areas used by neighbouring islands. For colonies with no or insufficient tracking data, we defined marine IBAs based on the mean maximum foraging range and merged the areas identified to propose IBAs around the Tristan da Cunha archipelago and Gough Island. The 2 proposed marine IBAs encompass 2% of Tristan da Cunha's EEZ, and are used by all northern rockhopper penguins breeding in the Tristan da Cunha group, representing ~90% of the global population. Currently, the main threat to northern rockhopper penguins within the Tristan da Cunha EEZ is marine pollution from shipping, and the risk of this would be reduced by declaring waters within 50 nautical miles of the coast as 'Areas To Be Avoided'.
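The proposed 'Area To Be Avoided' is a simple geometric criterion: waters within 50 nautical miles of the coast. A minimal sketch of flagging a ship position against such a radial buffer follows; the island coordinates are rough illustrative placeholders, not surveyed colony locations, and a real delineation would buffer the coastline rather than point locations.

```python
import math

EARTH_RADIUS_NM = 3440.065  # mean Earth radius in nautical miles

def great_circle_nm(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in nautical miles."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2.0 * EARTH_RADIUS_NM * math.asin(math.sqrt(a))

def inside_atba(lat, lon, islands, radius_nm=50.0):
    """True if (lat, lon) lies within radius_nm of any island centre."""
    return any(great_circle_nm(lat, lon, i_lat, i_lon) <= radius_nm
               for i_lat, i_lon in islands)

# Approximate island positions (illustrative only).
islands = [(-37.1, -12.3),   # Tristan da Cunha
           (-40.3, -9.9)]    # Gough Island
```

A vessel-monitoring check then reduces to one call per position report, e.g. `inside_atba(-37.2, -12.4, islands)` for a ship just off Tristan da Cunha.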

    The Atacama Cosmology Telescope: A Catalog of >4000 Sunyaev–Zel’dovich Galaxy Clusters

    We present a catalog of 4195 optically confirmed Sunyaev–Zel'dovich (SZ) selected galaxy clusters detected with signal-to-noise ratio >4 in 13,211 deg^2 of sky surveyed by the Atacama Cosmology Telescope (ACT). Cluster candidates were selected by applying a multifrequency matched filter to 98 and 150 GHz maps constructed from ACT observations obtained from 2008 to 2018 and confirmed using deep, wide-area optical surveys. The clusters span a broad redshift range from z = 0.04 to beyond z = 1; the catalog includes z > 1 clusters, and a total of 868 systems are new discoveries. Assuming an SZ signal versus mass scaling relation calibrated from X-ray observations, the sample has a 90% completeness mass limit of M500c > 3.8 × 10^14 M⊙, evaluated at z = 0.5, for clusters detected at signal-to-noise ratio >5 in maps filtered at an angular scale of 2.4 arcmin. The survey has a large overlap with deep optical weak-lensing surveys that are being used to calibrate the SZ signal–mass scaling relation, such as the Dark Energy Survey (4566 deg^2), the Hyper Suprime-Cam Subaru Strategic Program (469 deg^2), and the Kilo Degree Survey (825 deg^2). We highlight some noteworthy objects in the sample, including potentially projected systems, clusters with strong lensing features, clusters with active central galaxies or star formation, and systems of multiple clusters that may be physically associated. The cluster catalog will be a useful resource for future cosmological analyses and for studying the evolution of the intracluster medium and galaxies in massive clusters over the past 10 Gyr.

    Cosmology with clusters of galaxies

    In this Chapter I review the role that galaxy clusters play as tools to constrain cosmological parameters. I concentrate mostly on the application of the mass function of galaxy clusters, while other methods, such as that based on the baryon fraction, are covered by other Chapters of the book. Since most of the cosmological applications of galaxy clusters rely on precise measurements of their masses, a substantial part of my Lectures concentrates on the different methods that have been applied so far to weigh galaxy clusters. I provide in Section 2 a short introduction to the basics of cosmic structure formation. In Section 3 I describe the Press–Schechter (PS) formalism to derive the cosmological mass function, then discuss extensions of the PS approach and the most recent calibrations from N-body simulations. In Section 4 I review the methods used to build samples of galaxy clusters at different wavelengths. Section 5 is devoted to the discussion of different methods to derive cluster masses. In Section 6 I describe the cosmological constraints obtained so far by tracing the cluster mass function with a variety of methods. Finally, I describe in Section 7 the future perspectives for cosmology with galaxy clusters and the challenges for clusters to keep playing an important role in the era of precision cosmology. Comment: 49 pages, 19 figures, Lectures for the 2005 Guillermo Haro Summer School on Clusters, to appear in "Lecture Notes in Physics" (Springer)
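The Press–Schechter mass function at the heart of these lectures has the closed form dn/dlnM = sqrt(2/π) (ρ̄/M) (δ_c/σ) |dlnσ/dlnM| exp(−δ_c²/2σ²). A minimal numerical sketch follows, using an assumed power-law mass variance σ(M); the normalisation constants here are toy round numbers chosen for readability, not calibrated values.

```python
import math

DELTA_C = 1.686      # linearly extrapolated spherical-collapse threshold
RHO_MEAN = 4.1e10    # mean comoving matter density [Msun/Mpc^3] (toy value)
ALPHA = 0.3          # assumed logarithmic slope of sigma(M)

def sigma(m, m_pivot=6e14, sigma_pivot=0.8):
    """Toy power-law mass variance sigma(M) = sigma_pivot * (M/M_pivot)^-ALPHA."""
    return sigma_pivot * (m / m_pivot) ** (-ALPHA)

def ps_mass_function(m):
    """Press-Schechter dn/dlnM [Mpc^-3] for the toy sigma(M) above.

    For a pure power law, |dln(sigma)/dln(M)| is exactly ALPHA.
    """
    nu = DELTA_C / sigma(m)
    return (math.sqrt(2.0 / math.pi) * (RHO_MEAN / m)
            * nu * ALPha_or(ALPHA) * math.exp(-0.5 * nu * nu))

def ALPha_or(a):
    # trivial helper kept explicit so the |dln sigma/dln M| factor is visible
    return abs(a)
```

The exponential cutoff in ν = δ_c/σ(M) is what makes the abundance of the most massive clusters so sensitive to cosmology, which is the leverage exploited in Section 6.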

    TRY plant trait database – enhanced coverage and open access

    Plant traits—the morphological, anatomical, physiological, biochemical and phenological characteristics of plants—determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research spanning from evolutionary biology, community and functional ecology, to biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait‐based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. Best species coverage is achieved for categorical traits—almost complete coverage for ‘plant growth form’. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait–environmental relationships. These traits have to be measured on individual plants in their respective environment. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many aspects. We, therefore, conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements. This can only be achieved in collaboration with other initiatives.
