
    Bayesian power-spectrum inference with foreground and target contamination treatment

    This work presents a joint and self-consistent Bayesian treatment of various foreground and target contaminations when inferring cosmological power-spectra and three dimensional density fields from galaxy redshift surveys. This is achieved by introducing additional block sampling procedures for unknown coefficients of foreground and target contamination templates to the previously presented ARES framework for Bayesian large scale structure analyses. As a result, the method infers jointly and fully self-consistently three dimensional density fields, cosmological power-spectra, luminosity dependent galaxy biases, noise levels of respective galaxy distributions and coefficients for a set of a priori specified foreground templates. In addition, this fully Bayesian approach permits detailed quantification of correlated uncertainties amongst all inferred quantities and correctly marginalizes over observational systematic effects. We demonstrate the validity and efficiency of our approach in obtaining unbiased estimates of power-spectra via applications to realistic mock galaxy observations subject to stellar contamination and dust extinction. While simultaneously accounting for galaxy biases and unknown noise levels, our method reliably and robustly infers three dimensional density fields and corresponding cosmological power-spectra from deep galaxy surveys. Further, our approach correctly accounts for joint and correlated uncertainties between unknown coefficients of foreground templates and the amplitudes of the power-spectrum, an effect amounting to up to 10 percent correlations and anti-correlations across large ranges in Fourier space. Comment: 15 pages, 11 figures
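
    As a flavor of how such a block-sampling step can look, here is a minimal Python sketch assuming a simplified linear contamination model with Gaussian pixel noise; the function names and the toy data model are illustrative, not the ARES implementation. Under that model the conditional posterior of the template coefficients is Gaussian, so one block of the sampler is an exact multivariate draw.

        import numpy as np

        def sample_template_coefficients(residual, templates, noise_var, prior_var=1e4, rng=None):
            # Conditional Gibbs draw of foreground-template coefficients, assuming
            # data = signal + templates.T @ alpha + Gaussian noise. Given the
            # residual (data minus current signal), the conditional posterior of
            # alpha is Gaussian and can be sampled exactly.
            rng = np.random.default_rng() if rng is None else rng
            F = np.atleast_2d(templates)                    # (n_templates, n_pixels)
            Ninv = 1.0 / noise_var                          # inverse (diagonal) noise variance
            precision = (F * Ninv) @ F.T + np.eye(len(F)) / prior_var
            cov = np.linalg.inv(precision)
            mean = cov @ ((F * Ninv) @ residual)
            return rng.multivariate_normal(mean, cov)

        # Toy check: two templates with true coefficients 0.8 and -0.3.
        rng = np.random.default_rng(42)
        npix = 5000
        F = np.vstack([np.linspace(0.0, 1.0, npix), np.sin(np.linspace(0.0, 20.0, npix))])
        residual = np.array([0.8, -0.3]) @ F + rng.normal(0.0, 0.1, npix)
        print(sample_template_coefficients(residual, F, noise_var=0.01, rng=rng))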

    Bayesian inference from photometric redshift surveys

    We show how to enhance the redshift accuracy of surveys consisting of tracers with highly uncertain positions along the line of sight. Photometric surveys with redshift uncertainty delta_z ~ 0.03 can yield final redshift uncertainties of delta_z_f ~ 0.003 in high density regions. This increased redshift precision is achieved by imposing an isotropy and 2-point correlation prior in a Bayesian analysis and is completely independent of the process that estimates the photometric redshift. As a byproduct, the method also infers the three dimensional density field, essentially super-resolving high density regions in redshift space. Our method fully takes into account the survey mask and selection function. It uses a simplified Poissonian picture of galaxy formation, relating preferred locations of galaxies to regions of higher density in the matter field. The method quantifies the remaining uncertainties in the three dimensional density field and the true radial locations of galaxies by generating samples that are constrained by the survey data. The exploration of this high dimensional, non-Gaussian joint posterior is made feasible using multiple-block Metropolis-Hastings sampling. We demonstrate the performance of our implementation on a simulation containing 2.0 x 10^7 galaxies. These results bear out the promise of Bayesian analysis for upcoming photometric large scale structure surveys with tens of millions of galaxies. Comment: 17 pages, 12 figures
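
    A minimal Python sketch of one such block update, under a deliberately simplified model where each galaxy is its own block and the toy one-dimensional prior stands in for the paper's isotropy-plus-correlation prior (which couples galaxies through the density field): each redshift is proposed from its photometric likelihood, so the Metropolis-Hastings acceptance ratio reduces to the ratio of the density prior at the proposed versus current position.

        import numpy as np

        def update_redshifts(z, z_phot, sigma_phot, density_prior, rng):
            # One Metropolis-Hastings sweep: propose each redshift from the
            # photometric likelihood N(z_phot, sigma_phot). Since the target is
            # likelihood * prior and the proposal cancels the likelihood, the
            # acceptance ratio is the density-prior ratio alone.
            z_new = rng.normal(z_phot, sigma_phot)
            accept = rng.random(len(z)) < density_prior(z_new) / density_prior(z)
            return np.where(accept, z_new, z)

        # Toy prior: a strong overdensity (galaxy-rich shell) at z = 0.5.
        prior = lambda z: 1.0 + 50.0 * np.exp(-0.5 * ((z - 0.5) / 0.005) ** 2)

        rng = np.random.default_rng(1)
        z_phot = np.full(100_000, 0.5)
        z = rng.normal(z_phot, 0.03)
        for _ in range(200):
            z = update_redshifts(z, z_phot, 0.03, prior, rng)
        print("posterior scatter:", z.std())   # noticeably below the 0.03 photometric error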

    Methods for Bayesian power spectrum inference with galaxy surveys

    We derive and implement a full Bayesian large scale structure inference method aiming at precision recovery of the cosmological power spectrum from galaxy redshift surveys. Our approach improves over previous Bayesian methods by performing a joint inference of the three dimensional density field, the cosmological power spectrum, luminosity dependent galaxy biases and corresponding normalizations. We account for all joint and correlated uncertainties between all inferred quantities. Classes of galaxies with different biases are treated as separate sub-samples. The method therefore also allows the combined analysis of more than one galaxy survey. In particular, it solves the problem of inferring the power spectrum from galaxy surveys with non-trivial survey geometries by exploring the joint posterior distribution with efficient implementations of multiple block Markov chain and Hybrid Monte Carlo methods. Our Markov sampler achieves high statistical efficiency in low signal-to-noise regimes by using a deterministic reversible jump algorithm. We test our method on an artificial mock galaxy survey, emulating characteristic features of the Sloan Digital Sky Survey data release 7, such as its survey geometry and luminosity dependent biases. These tests demonstrate the numerical feasibility of our large scale Bayesian inference framework when the parameter space has millions of dimensions. The method reveals and correctly treats the anti-correlation between bias amplitudes and power spectrum, which is not taken into account in current approaches to power spectrum estimation, a 20 percent effect across large ranges in k-space. In addition, the method results in constrained realizations of density fields obtained without assuming the power spectrum or bias parameters in advance.
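
    To illustrate the flavor of the power-spectrum block, here is a minimal Python sketch of the standard conditional draw given a density-field realization: with a Jeffreys prior, the power in each k-bin has an inverse-Gamma conditional posterior. This is a textbook Gibbs step, not the paper's full machinery; the deterministic reversible-jump acceleration and the joint bias and noise blocks are omitted.

        import numpy as np

        def sample_power_spectrum(delta_k, k_bins, rng):
            # Conditional Gibbs draw of the binned power spectrum given Gaussian
            # Fourier modes delta_k. With a Jeffreys prior, the power in each bin
            # is inverse-Gamma distributed and is drawn exactly via a Gamma variate.
            power = np.empty(len(k_bins))
            for i, modes in enumerate(k_bins):
                S = np.sum(np.abs(delta_k[modes]) ** 2)        # summed mode power in the bin
                n = len(modes)                                 # number of complex modes
                power[i] = S / rng.gamma(shape=n, scale=1.0)   # InvGamma(n, S) draw
            return power

        # Toy check: modes drawn with true power 2.0; the draws scatter around 2.0.
        rng = np.random.default_rng(0)
        delta_k = (rng.normal(size=4096) + 1j * rng.normal(size=4096)) * np.sqrt(2.0 / 2)
        bins = [np.arange(i * 512, (i + 1) * 512) for i in range(8)]
        print(sample_power_spectrum(delta_k, bins, rng))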

    Past and present cosmic structure in the SDSS DR7 main sample

    We present a chrono-cosmography project, aiming at the inference of the four dimensional formation history of the observed large scale structure from its origin to the present epoch. To do so, we perform a full-scale Bayesian analysis of the northern galactic cap of the Sloan Digital Sky Survey (SDSS) Data Release 7 main galaxy sample, relying on a fully probabilistic, physical model of the non-linearly evolved density field. Besides inferring initial conditions from observations, our methodology naturally and accurately reconstructs non-linear features at the present epoch, such as walls and filaments, corresponding to high-order correlation functions generated by late-time structure formation. Our inference framework self-consistently accounts for typical observational systematic and statistical uncertainties such as noise, survey geometry and selection effects. We further account for luminosity dependent galaxy biases and automatic noise calibration within a fully Bayesian approach. As a result, this analysis provides highly detailed and accurate reconstructions of the present density field on scales larger than ~3 Mpc/h, constrained by SDSS observations. This approach also leads to the first quantitative inference of plausible formation histories of the dynamic large scale structure underlying the observed galaxy distribution. The results described in this work constitute the first full Bayesian non-linear analysis of the cosmic large scale structure with the demonstrated capability of uncertainty quantification. Some of these results will be made publicly available along with this work. The level of detail of inferred results and the high degree of control on observational uncertainties pave the path towards high precision chrono-cosmography, the subject of simultaneously studying the dynamics and the morphology of the inhomogeneous Universe. Comment: 27 pages, 9 figures
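
    The forward-modelling idea at the core of such an analysis can be illustrated with a toy version. The Python sketch below evolves a set of initial conditions with the Zel'dovich (first-order) approximation, mapping an initial density field to displaced particle positions; it is purely illustrative and far simpler than the physical structure-formation model actually used in the paper.

        import numpy as np

        def zeldovich_evolve(delta_ic, box, growth=1.0):
            # First-order (Zel'dovich) forward model: compute the displacement
            # field psi with psi_k = i k delta_k / k^2 and move a uniform particle
            # grid by growth * psi. A toy stand-in for the non-linear model that
            # maps inferred initial conditions to an evolved density field.
            n = delta_ic.shape[0]
            k = 2.0 * np.pi * np.fft.fftfreq(n, d=box / n)
            kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
            k2 = kx**2 + ky**2 + kz**2
            k2[0, 0, 0] = 1.0                                  # avoid division by zero
            dk = np.fft.fftn(delta_ic)
            dk[0, 0, 0] = 0.0                                  # remove the mean mode
            psi = np.stack([np.fft.ifftn(1j * kk * dk / k2).real for kk in (kx, ky, kz)])
            q = np.stack(np.meshgrid(*(np.arange(n) * box / n,) * 3, indexing="ij"))
            return (q + growth * psi) % box                    # evolved particle positions

        # Toy usage on a small Gaussian random field.
        rng = np.random.default_rng(7)
        positions = zeldovich_evolve(rng.normal(0.0, 0.1, (32, 32, 32)), box=100.0)
        print(positions.shape)                                 # (3, 32, 32, 32)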

    Comparing cosmic web classifiers using information theory

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-web, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available. Comment: 20 pages, 8 figures, 6 tables. Matches JCAP published version. Public data available from the first author's website (currently http://icg.port.ac.uk/~leclercq/)
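
    The paper's decision scheme weighs full posterior utilities; as a minimal illustration of the information-theoretic ingredient alone, the Python sketch below computes the mutual information between a classifier's discrete web-type labels and a binned galaxy property. All inputs here are synthetic and purely illustrative.

        import numpy as np

        def mutual_information(labels_a, labels_b):
            # Mutual information (in bits) between two discrete label arrays:
            # how much knowing the web-type of a galaxy's environment reduces
            # uncertainty about, e.g., its binned color.
            joint = np.zeros((labels_a.max() + 1, labels_b.max() + 1))
            np.add.at(joint, (labels_a, labels_b), 1.0)
            p = joint / joint.sum()
            pa = p.sum(axis=1, keepdims=True)
            pb = p.sum(axis=0, keepdims=True)
            mask = p > 0
            return np.sum(p[mask] * np.log2(p[mask] / (pa * pb)[mask]))

        # Toy comparison: classifier A tracks the color-relevant environment
        # better than classifier B, so it carries more bits about galaxy color.
        rng = np.random.default_rng(5)
        env = rng.integers(0, 4, 100_000)                      # "true" web-type, 4 classes
        color = (env + rng.integers(0, 2, env.size)) % 4       # color bin correlated with env
        noisy = lambda frac: np.where(rng.random(env.size) < frac,
                                      rng.integers(0, 4, env.size), env)
        print("A:", mutual_information(noisy(0.1), color))
        print("B:", mutual_information(noisy(0.5), color))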

    Reconstructing the gravitational field of the local universe

    Tests of gravity at the galaxy scale are in their infancy. As a first step to systematically uncovering the gravitational significance of galaxies, we map three fundamental gravitational variables -- the Newtonian potential, acceleration and curvature -- over the galaxy environments of the local universe to a distance of approximately 200 Mpc. Our method combines the contributions from galaxies in an all-sky redshift survey, halos from an N-body simulation hosting low-luminosity objects, and linear and quasi-linear modes of the density field. We use the ranges of these variables to determine the extent to which galaxies expand the scope of generic tests of gravity and are capable of constraining specific classes of model for which they have special significance. Finally, we investigate the improvements afforded by upcoming galaxy surveys. Comment: 12 pages, 4 figures; revised to match MNRAS accepted version
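
    A minimal Python sketch of the density-to-field mapping, assuming a periodic density-contrast grid and units where 4 pi G rho_bar = 1; the survey-specific ingredients of the paper (galaxy, halo and quasi-linear contributions) are omitted. It solves the Poisson equation spectrally and differentiates to get the acceleration.

        import numpy as np

        def potential_and_acceleration(delta, box):
            # Solve the Poisson equation in Fourier space, phi_k = -delta_k / k^2
            # (units with 4 pi G rho_bar = 1), then obtain the acceleration
            # a = -grad(phi) by spectral differentiation with i*k.
            n = delta.shape[0]
            k = 2.0 * np.pi * np.fft.fftfreq(n, d=box / n)
            kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
            k2 = kx**2 + ky**2 + kz**2
            k2[0, 0, 0] = 1.0                                  # avoid division by zero
            phi_k = -np.fft.fftn(delta) / k2
            phi_k[0, 0, 0] = 0.0                               # zero-mean potential
            phi = np.fft.ifftn(phi_k).real
            acc = np.stack([np.fft.ifftn(-1j * kk * phi_k).real for kk in (kx, ky, kz)])
            return phi, acc

        # Toy usage: ranges of potential and acceleration over a random field.
        rng = np.random.default_rng(2)
        phi, acc = potential_and_acceleration(rng.normal(0.0, 1.0, (32, 32, 32)), box=200.0)
        print(phi.min(), phi.max(), np.abs(acc).max())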

    Dark matter voids in the SDSS galaxy survey

    What do we know about voids in the dark matter distribution given the Sloan Digital Sky Survey (SDSS) and assuming the ΛCDM model? Recent application of the Bayesian inference algorithm BORG to the SDSS Data Release 7 main galaxy sample has generated detailed Eulerian and Lagrangian representations of the large-scale structure as well as the possibility to accurately quantify corresponding uncertainties. Building upon these results, we present constrained catalogs of voids in the Sloan volume, aiming at a physical representation of dark matter underdensities and at the alleviation of the problems due to sparsity and biasing on galaxy void catalogs. To do so, we generate data-constrained reconstructions of the presently observed large-scale structure using a fully non-linear gravitational model. We then find and analyze void candidates using the VIDE toolkit. Our methodology therefore predicts the properties of voids based on fusing prior information from simulations and data constraints. For usual void statistics (number function, ellipticity distribution and radial density profile), all the results obtained are in agreement with dark matter simulations. Our dark matter void candidates probe a deeper void hierarchy than voids directly based on the observed galaxies alone. The use of our catalogs therefore opens the way to high-precision void cosmology at the level of the dark matter field. We will make the void catalogs used in this work available at http://www.cosmicvoids.net. Comment: 15 pages, 6 figures, matches JCAP published version, void catalogs publicly available at http://www.cosmicvoids.net
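
    As an illustration of one of the void statistics mentioned, here is a minimal Python sketch of a stacked radial density profile, rescaled by each void's radius. It is a simplified, VIDE-flavored summary over a toy periodic box; names and conventions are illustrative, not the toolkit's API.

        import numpy as np

        def stacked_void_profile(particles, centers, radii, box, n_shells=20, r_max=2.0):
            # Stack the density in spherical shells around each void center, with
            # radii rescaled by the void radius, normalized by the mean density.
            edges = np.linspace(0.0, r_max, n_shells + 1)
            counts = np.zeros(n_shells)
            volumes = np.zeros(n_shells)
            nbar = len(particles) / box**3
            for c, rv in zip(centers, radii):
                d = particles - c
                d -= box * np.round(d / box)                   # periodic minimum image
                r = np.linalg.norm(d, axis=1) / rv
                counts += np.histogram(r, bins=edges)[0]
                volumes += 4.0 / 3.0 * np.pi * np.diff(edges**3) * rv**3
            return edges, counts / (volumes * nbar)            # rho(r) / rho_bar

        # Toy usage: uniform particles give a flat profile near 1.
        rng = np.random.default_rng(4)
        parts = rng.uniform(0.0, 100.0, (200_000, 3))
        centers = rng.uniform(0.0, 100.0, (10, 3))
        edges, profile = stacked_void_profile(parts, centers, np.full(10, 8.0), box=100.0)
        print(profile)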

    Matched filter optimization of kSZ measurements with a reconstructed cosmological flow field

    We develop and test a new statistical method to measure the kinematic Sunyaev-Zel'dovich (kSZ) effect. A sample of independently detected clusters is combined with the cosmic flow field predicted from a galaxy redshift survey in order to derive a matched filter that optimally weights the kSZ signal for the sample as a whole given the noise involved in the problem. We apply this formalism to realistic mock microwave skies based on cosmological N-body simulations, and demonstrate its robustness and performance. In particular, we carefully assess the various sources of uncertainty: cosmic microwave background primary fluctuations, instrumental noise, uncertainties in the determination of the velocity field, and effects introduced by miscentring of clusters and by uncertainties of the mass-observable relation (normalization and scatter). We show that available data (Planck maps and the MaxBCG catalogue) should deliver a 7.7σ detection of the kSZ. A similar cluster catalogue with broader sky coverage should increase the detection significance to ~13σ. We point out that such measurements could be binned in order to study the properties of the cosmic gas and velocity fields, or combined into a single measurement to constrain cosmological parameters or deviations of the law of gravity from General Relativity. Comment: 17 pages, 10 figures, 3 tables. Submitted to MNRAS. Comments are welcome.
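
    A minimal Python sketch of the matched-filter amplitude estimate: for a known template t in noise with covariance C, the minimum-variance unbiased estimate is alpha = (t^T C^-1 d) / (t^T C^-1 t) with variance 1 / (t^T C^-1 t). In the paper's setting the template would be the predicted per-cluster kSZ signal from the reconstructed flow field; the toy numbers below are purely illustrative.

        import numpy as np

        def matched_filter_estimate(data, template, noise_cov):
            # Minimum-variance unbiased amplitude of a known template in
            # correlated noise, plus its 1-sigma uncertainty.
            Cinv_t = np.linalg.solve(noise_cov, template)
            norm = template @ Cinv_t
            return (data @ Cinv_t) / norm, norm ** -0.5

        # Toy usage: 70 clusters, unit true amplitude, roughly 8 sigma recovery.
        rng = np.random.default_rng(3)
        t = rng.normal(size=70)                                # predicted kSZ per cluster
        C = np.diag(rng.uniform(0.5, 2.0, 70))                 # per-cluster noise covariance
        d = t + rng.multivariate_normal(np.zeros(70), C)
        alpha, sigma = matched_filter_estimate(d, t, C)
        print(f"alpha = {alpha:.2f} +/- {sigma:.2f} ({alpha / sigma:.1f} sigma)")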

    Bayesian Methods for Analyzing the Large Scale Structure of the Universe

    The cosmic large scale structure is of special relevance for testing current cosmological theories about the origin and evolution of the Universe. Throughout cosmic history, it evolved from tiny quantum fluctuations, generated during the early epoch of inflation, to the filamentary cosmic web presently observed by our telescopes. Observations and analyses of this large scale structure hence test this picture, provide valuable information on the processes of cosmic structure formation, and reveal the cosmological parameters governing the dynamics of the Universe. Besides measurements of the cosmic microwave background, galaxy observations are of particular interest to modern precision cosmology. They are complementary to many other sources of information, such as cosmic microwave background experiments, since they probe a different epoch. Galaxies report the cosmic evolution over an enormous period, ranging from the end of the epoch of reionization, when luminous objects first appeared, until today. For this reason, galaxy surveys are excellent probes of the dynamics and evolution of the Universe. The Sloan Digital Sky Survey in particular is one of the most ambitious surveys in the history of astronomy. It provides measurements of 930,000 galaxy spectra, along with the corresponding angular and redshift positions of galaxies, over an area covering more than a quarter of the sky. This enormous amount of precise data allows for unprecedented access to the three dimensional cosmic matter distribution and its evolution.

    However, observables such as positions and properties of galaxies provide only an inaccurate picture of the cosmic large scale structure, due to a variety of statistical and systematic observational uncertainties. In particular, the continuous cosmic density field is only traced by a set of discrete galaxies, introducing statistical uncertainties in the form of Poisson-distributed noise. Further, galaxy surveys are subject to a variety of complications, such as instrumental limitations or the nature of the observation itself. Characterizing the large scale structure in the Universe therefore requires a statistical approach.

    The main theme of this PhD thesis is the development of new Bayesian data analysis methods which provide a complete statistical characterization and a detailed cosmographic description of the large scale structure in our Universe. The required epistemological concepts, the mathematical framework of Bayesian statistics, and numerical considerations are thoroughly discussed. On this basis, two Bayesian data analysis computer algorithms are developed. The first, ARES (Algorithm for REconstruction and Sampling), aims at the joint inference of the three dimensional density field and its power-spectrum from galaxy observations. The ARES algorithm accurately treats many observational systematics and statistical uncertainties, such as the survey geometry, galaxy selection effects, blurring effects and noise. Further, ARES provides a full statistical characterization of the three dimensional density field, the power-spectrum and their joint uncertainties by exploring the high dimensional space of their joint posterior via a very efficient Gibbs sampling scheme. The posterior is the probability of the model given the observations and all other available information. As a result, ARES provides a sampled representation of the joint posterior, which conclusively characterizes many of the statistical properties of the large scale structure. This probability distribution allows for a variety of scientific applications, such as reporting any desired statistical summary, or testing cosmological models via Bayesian model comparison or Bayesian odds factors.

    The second computer algorithm, HADES (Hamiltonian Density Estimation and Sampling), is specifically designed to infer the fully evolved cosmic density field deep into the non-linear regime. In particular, HADES accurately treats the non-linear relationship between the observed galaxy distribution and the underlying continuous density field by correctly accounting for the Poissonian nature of the observables. This allows for very precise recovery of the density field even in sparsely sampled regions. HADES also provides a complete statistical description of the non-linear cosmic density field in the form of a sampled representation of a cosmic density posterior. Besides the possibility of reporting any desired statistical summary of the density field or power-spectrum, such representations of the corresponding posterior distributions also allow for simple non-linear and non-Gaussian error propagation to any quantity inferred from the analysis results. The application of HADES to the latest Sloan Digital Sky Survey data constitutes the first fully Bayesian non-linear density inference conducted so far. The results obtained from this procedure represent the filamentary structure of our cosmic neighborhood with unprecedented accuracy.
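
    A minimal Python sketch of the two ingredients such a Hamiltonian sampler needs: a Poissonian log-posterior with its gradient (here with intensity lambda = nbar * exp(delta) and a simple uncorrelated Gaussian prior standing in for the thesis's actual data model and correlated prior), and one leapfrog-integrated Hamiltonian Monte Carlo step built on top of them.

        import numpy as np

        def neg_log_posterior_and_grad(delta, counts, nbar, prior_var):
            # Poisson likelihood for galaxy counts with intensity nbar * exp(delta),
            # plus an independent Gaussian prior on delta. Returns the negative
            # log-posterior (up to a constant) and its gradient, the two
            # quantities a Hamiltonian Monte Carlo sampler needs.
            lam = nbar * np.exp(delta)
            U = np.sum(lam - counts * np.log(lam)) + np.sum(delta**2) / (2 * prior_var)
            grad = (lam - counts) + delta / prior_var
            return U, grad

        def hmc_step(delta, counts, nbar, prior_var, eps=0.005, n_leap=20, rng=None):
            # One leapfrog-integrated HMC step with a Metropolis accept/reject.
            rng = np.random.default_rng() if rng is None else rng
            p = rng.normal(size=delta.shape)
            U0, g = neg_log_posterior_and_grad(delta, counts, nbar, prior_var)
            H0 = U0 + 0.5 * p @ p
            d = delta.copy()
            p = p - 0.5 * eps * g                              # initial half kick
            for _ in range(n_leap):
                d = d + eps * p                                # drift
                U, g = neg_log_posterior_and_grad(d, counts, nbar, prior_var)
                p = p - eps * g                                # kick
            p = p + 0.5 * eps * g                              # trim last kick to a half
            H1 = U + 0.5 * p @ p
            return d if rng.random() < np.exp(H0 - H1) else delta

        # Toy usage: 1000 cells, mean 5 galaxies per cell.
        rng = np.random.default_rng(6)
        truth = rng.normal(0.0, 0.3, 1000)
        counts = rng.poisson(5.0 * np.exp(truth))
        delta = np.zeros(1000)
        for _ in range(500):
            delta = hmc_step(delta, counts, 5.0, 0.09, rng=rng)
        print("correlation with truth:", np.corrcoef(delta, truth)[0, 1])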