
    Cosmological Parameters from Observations of Galaxy Clusters

    Studies of galaxy clusters have proved crucial in helping to establish the standard model of cosmology, with a universe dominated by dark matter and dark energy. A theoretical basis that describes clusters as massive, multi-component, quasi-equilibrium systems is growing in its capability to interpret multi-wavelength observations of expanding scope and sensitivity. We review current cosmological results, including contributions to fundamental physics, obtained from observations of galaxy clusters. These results are consistent with and complementary to those from other methods. We highlight several areas of opportunity for the next few years, and emphasize the need for accurate modeling of survey selection and sources of systematic error. Capitalizing on these opportunities will require a multi-wavelength approach and the application of rigorous statistical frameworks, utilizing the combined strengths of observers, simulators and theorists.
    Comment: 53 pages, 21 figures. To appear in Annual Review of Astronomy & Astrophysics.

    A Multi-Code Analysis Toolkit for Astrophysical Simulation Data

    The analysis of complex multiphysics astrophysical simulations presents a unique and rapidly growing set of challenges: reproducibility, parallelization, and vast increases in data size and complexity chief among them. In order to meet these challenges, and in order to open up new avenues for collaboration between users of multiple simulation platforms, we present yt (available at http://yt.enzotools.org/), an open source, community-developed astrophysical analysis and visualization toolkit. Analysis and visualization with yt are oriented around physically relevant quantities rather than quantities native to astrophysical simulation codes. While originally designed for handling Enzo's structured adaptive mesh refinement (AMR) data, yt has been extended to work with several different simulation methods and simulation codes including Orion, RAMSES, and FLASH. We report on its methods for reading, handling, and visualizing data, including projections, multivariate volume rendering, multi-dimensional histograms, halo finding, light cone generation and topologically-connected isocontour identification. Furthermore, we discuss the underlying algorithms yt uses for processing and visualizing data, and its mechanisms for parallelization of analysis tasks.
    Comment: 18 pages, 6 figures, emulateapj format. Resubmitted to Astrophysical Journal Supplement Series with revisions from referee. yt can be found at http://yt.enzotools.org
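    Since analysis in yt is phrased in terms of physical fields rather than code-native quantities, a minimal usage sketch may help. This uses yt's current public API, which postdates the version described in the paper (then hosted at yt.enzotools.org); the dataset path is a placeholder.

```python
import yt

# Load a simulation output; yt infers the frontend (Enzo, FLASH,
# RAMSES, ...) from the file itself. The path is a placeholder.
ds = yt.load("DD0010/DD0010")

# Analysis is phrased in physical fields, not code-native arrays:
# project gas density along the z-axis and save an image.
p = yt.ProjectionPlot(ds, "z", ("gas", "density"))
p.save("density_projection.png")

# Derived quantities follow the same field-oriented pattern, e.g. the
# total gas mass in a 100 kpc sphere around the domain center.
sp = ds.sphere("c", (100.0, "kpc"))
print(sp.quantities.total_quantity(("gas", "mass")))
```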

    Constraining the growth rate of structure with phase correlations

    We show that correlations between the phases of the galaxy density field in redshift space provide additional information about the growth rate of large-scale structure that is complementary to the power spectrum multipoles. In particular, we consider the multipoles of the line correlation function (LCF), which correlates phases between three collinear points, and use the Fisher forecasting method to show that the LCF multipoles can break the degeneracy between the measurement of the growth rate of structure $f$ and the amplitude of perturbations $\sigma_8$ that is present in the power spectrum multipoles at large scales. This leads to an improvement in the measurement of $f$ and $\sigma_8$ by up to 220 per cent for $k_{\rm max} = 0.15\,h\,\mathrm{Mpc}^{-1}$ and up to 50 per cent for $k_{\rm max} = 0.30\,h\,\mathrm{Mpc}^{-1}$ at redshift $z=0.25$, with respect to power spectrum measurements alone for the upcoming generation of galaxy surveys like DESI and Euclid. The average improvements in the constraints on $f$ and $\sigma_8$ for $k_{\rm max} = 0.15\,h\,\mathrm{Mpc}^{-1}$ are $\sim 90$ per cent for the DESI BGS sample with mean redshift $\overline{z}=0.25$, $\sim 40$ per cent for the DESI ELG sample with $\overline{z}=1.25$, and $\sim 40$ per cent for the Euclid H$\alpha$ galaxies with $\overline{z}=1.3$. For $k_{\rm max} = 0.30\,h\,\mathrm{Mpc}^{-1}$, the average improvements are $\sim 40$ per cent for the DESI BGS sample and $\sim 20$ per cent for both the DESI ELG and Euclid H$\alpha$ galaxies.
    Comment: 28 pages, 13 figures, 2 tables. v2 has additional discussion on model-independence of the forecasts. v3 matches the MNRAS accepted version.
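    The degeneracy-breaking claim rests on standard Fisher matrix algebra: each probe contributes $F_{ij} = \partial_i \mu^{\mathsf T} \mathsf{C}^{-1} \partial_j \mu$, independent probes add, and marginalized errors come from the inverse of the summed matrix. A toy sketch of that logic, with purely illustrative derivatives standing in for the actual power spectrum and LCF multipole models:

```python
import numpy as np

# Toy Fisher forecast in (f, sigma_8):
#   F_ij = d(mu)/d(theta_i) . C^{-1} . d(mu)/d(theta_j)
# The derivative vectors below are illustrative only.

def fisher(dmu_df, dmu_ds8, cov):
    d = np.vstack([dmu_df, dmu_ds8])        # 2 x N derivative matrix
    return d @ np.linalg.inv(cov) @ d.T     # 2 x 2 Fisher matrix

n = 20
x = np.linspace(0.0, 1.0, n)
cov = np.eye(n)                             # unit-variance, uncorrelated data

# Large-scale power spectrum multipoles mostly constrain the product
# f*sigma_8: nearly (not exactly) parallel derivatives -> degeneracy.
F_P = fisher(1.0 + 0.02 * x, 0.98 - 0.02 * x, cov)

# Phase information responds differently to f and sigma_8, so the LCF
# derivatives point in another direction; independent probes add.
F_PL = F_P + fisher(0.2 + 0.8 * x, 1.0 - 0.8 * x, cov)

for name, F in (("P only ", F_P), ("P + LCF", F_PL)):
    err = np.sqrt(np.diag(np.linalg.inv(F)))  # marginalized 1-sigma errors
    print(f"{name}: sigma_f = {err[0]:.3f}, sigma_sigma8 = {err[1]:.3f}")
```

    Combining the two matrices shrinks the marginalized errors far more than either probe alone, which is the effect the forecast quantifies for DESI and Euclid.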

    Interactive Visual Analytics for Large-scale Particle Simulations

    Particle-based model simulations are widely used in scientific visualization. In cosmology, particles are used to simulate the evolution of dark matter in the universe. Clusters of particles (that have special statistical properties) are called halos. From a visualization point of view, halos are clusters of particles, each having a position, mass and velocity in three-dimensional space, and they can be represented as point clouds that contain various structures of geometric interest such as filaments, membranes, satellites of points, clusters, and clusters of clusters. This thesis investigates methods for interacting with large-scale data sets represented as point clouds. The work mostly aims at the interactive visualization of cosmological simulations based on large particle systems. The study consists of three components: a) two human-factors experiments into the perceptual factors that make it possible to see features in point clouds; b) the design and implementation of a user interface making it possible to rapidly navigate through and visualize features in the point cloud; and c) software development and integration to support visualization.
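    As a concrete illustration of the representation described above (and not of the thesis software itself), halo particles can be held as a structured array of positions, masses and velocities and drawn as a point cloud; synthetic data and a static matplotlib scatter stand in for the interactive viewer.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic stand-in for halo particles: position, mass, velocity.
rng = np.random.default_rng(0)
n = 5000
particles = np.zeros(n, dtype=[("pos", "f8", 3), ("mass", "f8"), ("vel", "f8", 3)])
particles["pos"] = rng.normal(size=(n, 3))
particles["mass"] = rng.lognormal(size=n)
particles["vel"] = rng.normal(size=(n, 3))

# Static 3-D scatter as a stand-in for the interactive viewer:
# point size encodes mass, color encodes speed.
speed = np.linalg.norm(particles["vel"], axis=1)
ax = plt.figure().add_subplot(projection="3d")
ax.scatter(*particles["pos"].T, s=2 * particles["mass"], c=speed, cmap="viridis")
plt.savefig("point_cloud.png")
```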

    Including parameter dependence in the data and covariance for cosmological inference

    The final step of most large-scale structure analyses involves the comparison of power spectra or correlation functions to theoretical models. It is clear that the theoretical models have parameter dependence, but frequently the measurements and the covariance matrix depend upon some of the parameters as well. We show that a very simple interpolation scheme from an unstructured mesh allows for an efficient way to include this parameter dependence self-consistently in the analysis at modest computational expense. We describe two schemes for covariance matrices. The scheme which uses the geometric structure of such matrices performs roughly twice as well as the simplest scheme, though both perform very well.
    Comment: 17 pages, 4 figures, matches version published in JCAP.
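    A minimal sketch of the idea, on toy covariances: precompute matrices at scattered parameter points (the unstructured mesh), then interpolate linearly over the Delaunay triangulation of those points. Interpolating Cholesky factors and re-squaring is shown as one way to use the geometric structure of covariance matrices, since it keeps the result positive definite; that choice is an illustrative assumption, not necessarily the paper's exact construction.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# Covariance matrices C(theta) precomputed at scattered parameter points;
# LinearNDInterpolator triangulates the points (Delaunay) and
# interpolates linearly within each simplex.
rng = np.random.default_rng(1)
npts, ndim, ndata = 40, 2, 5
thetas = rng.uniform(size=(npts, ndim))          # parameter-space samples

# Toy covariances with smooth parameter dependence.
covs = np.array([(1.0 + t.sum()) * np.eye(ndata) for t in thetas])

# Simplest scheme: interpolate the matrix entries directly.
interp_C = LinearNDInterpolator(thetas, covs.reshape(npts, -1))

# Geometric scheme (illustrative assumption, not necessarily the
# paper's construction): interpolate Cholesky factors and re-square,
# which keeps the interpolated matrix symmetric positive definite.
chols = np.array([np.linalg.cholesky(C).ravel() for C in covs])
interp_L = LinearNDInterpolator(thetas, chols)

theta_new = np.array([0.4, 0.6])                 # inside the sampled region
C_direct = interp_C(theta_new).reshape(ndata, ndata)
L = interp_L(theta_new).reshape(ndata, ndata)
C_chol = L @ L.T
print(np.all(np.linalg.eigvalsh(C_chol) > 0))    # positive definite: True
```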

    Doctor of Philosophy

    A broad range of applications capture dynamic data at an unprecedented scale. Independent of the application area, finding intuitive ways to understand the dynamic aspects of these increasingly large data sets remains an interesting and, to some extent, unsolved research problem. Generically, dynamic data sets can be described by some, often hierarchical, notion of feature of interest that exists at each moment in time, and those features evolve across time. Consequently, exploring the evolution of these features is considered to be one natural way of studying these data sets. Usually, this process entails the ability to: 1) define and extract features from each time step in the data set; 2) find their correspondences over time; and 3) analyze their evolution across time. However, due to the large data sizes, visualizing the evolution of features in a comprehensible manner and performing interactive changes are challenging. Furthermore, feature evolution details are often unmanageably large and complex, making it difficult to identify the temporal trends in the underlying data. Additionally, many existing approaches develop these components in a specialized and standalone manner, thus failing to address the general task of understanding feature evolution across time. This dissertation demonstrates that interactive exploration of feature evolution can be achieved in a non-domain-specific manner so that it can be applied across a wide variety of application domains. In particular, a novel generic visualization and analysis environment that couples a multiresolution unified spatiotemporal representation of features with progressive layout and visualization strategies for studying the feature evolution across time is introduced. This flexible framework enables on-the-fly changes to feature definitions, their correspondences, and other arbitrary attributes while providing an interactive view of the resulting feature evolution details. Furthermore, to reduce the visual complexity within the feature evolution details, several subselection-based and localized, per-feature parameter value-based strategies are also enabled. The utility and generality of this framework are demonstrated by using several large-scale dynamic data sets.
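    The three-step loop described above can be sketched on a toy 2-D scalar field: thresholded connected components stand in for features, spatial overlap for correspondence, and a printed lineage for the evolution analysis. scipy.ndimage is an illustrative choice here, not the dissertation's implementation.

```python
import numpy as np
from scipy import ndimage

# Toy time-dependent scalar field with coherent evolution: each step
# adds a small smooth perturbation to the previous one.
rng = np.random.default_rng(2)
field = ndimage.gaussian_filter(rng.normal(size=(64, 64)), 4)
steps = []
for _ in range(5):
    field = field + 0.3 * ndimage.gaussian_filter(rng.normal(size=(64, 64)), 4)
    steps.append(field)

def extract(f, thresh=0.05):
    """Step 1: features = connected components above a threshold."""
    labels, n = ndimage.label(f > thresh)
    return labels, n

prev_labels, _ = extract(steps[0])
for t, f in enumerate(steps[1:], start=1):
    labels, n = extract(f)
    for fid in range(1, n + 1):
        mask = labels == fid
        # Step 2: correspondence = feature at t-1 with maximal overlap.
        overlap = prev_labels[mask]
        overlap = overlap[overlap > 0]
        parent = np.bincount(overlap).argmax() if overlap.size else 0
        # Step 3: record the evolution (lineage and feature size here).
        print(f"t={t}: feature {fid} (size {mask.sum()}) <- parent {parent}")
    prev_labels = labels
```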

    The Persistence of Large Scale Structures I: Primordial non-Gaussianity

    We develop an analysis pipeline for characterizing the topology of large scale structure and extracting cosmological constraints based on persistent homology. Persistent homology is a technique from topological data analysis that quantifies the multiscale topology of a data set, in our context unifying the contributions of clusters, filament loops, and cosmic voids to cosmological constraints. We describe how this method captures the imprint of primordial local non-Gaussianity on the late-time distribution of dark matter halos, using a set of N-body simulations as a proxy for real data analysis. For our best single statistic, running the pipeline on several cubic volumes of size $40~(\mathrm{Gpc}/h)^{3}$, we detect $f_{\rm NL}^{\rm loc}=10$ at $97.5\%$ confidence on $\sim 85\%$ of the volumes. Additionally we test our ability to resolve degeneracies between the topological signature of $f_{\rm NL}^{\rm loc}$ and variation of $\sigma_8$ and argue that correctly identifying nonzero $f_{\rm NL}^{\rm loc}$ in this case is possible via an optimal template method. Our method relies on information living at $\mathcal{O}(10)$ Mpc/$h$, a complementary scale with respect to commonly used methods such as the scale-dependent bias in the halo/galaxy power spectrum. Therefore, while still requiring a large volume, our method does not require sampling long-wavelength modes to constrain primordial non-Gaussianity. Moreover, our statistics are interpretable: we are able to reproduce previous results in certain limits and we make new predictions for unexplored observables, such as filament loops formed by dark matter halos in a simulation box.
    Comment: 33+11 pages, 19 figures, code available at https://gitlab.com/mbiagetti/persistent_homology_ls
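    The multiscale topology the pipeline quantifies can be illustrated with a generic persistent-homology computation: build a Vietoris-Rips filtration over a (here random) halo catalog and read off features by dimension, with $H_0$, $H_1$ and $H_2$ corresponding to the clusters, filament loops and voids named above. GUDHI is an illustrative library choice; the paper's own pipeline lives at the GitLab link in the Comment.

```python
import numpy as np
import gudhi

# Toy halo catalog: random points standing in for halo positions.
rng = np.random.default_rng(3)
points = rng.uniform(0.0, 100.0, size=(300, 3))   # box side in Mpc/h, say

# Vietoris-Rips filtration: grow balls around each point and track when
# topological features appear (birth) and fill in (death).
rips = gudhi.RipsComplex(points=points, max_edge_length=20.0)
st = rips.create_simplex_tree(max_dimension=3)
st.persistence()   # computes all intervals; must run before the queries

# H0 = clusters, H1 = filament loops, H2 = voids; summary statistics
# (e.g. counts of long-lived features) are built from these intervals.
for dim in (0, 1, 2):
    pairs = st.persistence_intervals_in_dimension(dim)
    print(f"H{dim}: {len(pairs)} features")
```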