Optimization and Quality Assessment of Baryon Pasting for Intracluster Gas using the Borg Cube Simulation
Synthetic datasets generated from large-volume gravity-only simulations are
an important tool in the calibration of cosmological analyses. Their creation
often requires accurate inference of baryonic observables from the dark matter
field. We explore the effectiveness of a baryon pasting algorithm in providing
precise estimations of three-dimensional gas thermodynamic properties based on
gravity-only simulations. We use the Borg Cube, a pair of simulations
originating from identical initial conditions, with one run evolved as a
gravity-only simulation, and the other incorporating non-radiative
hydrodynamics. Matching halos in both simulations enables comparisons of gas
properties on an individual halo basis. This comparative analysis allows us to
fit for the model parameters that yield the closest agreement between the gas
properties in both runs. To capture the redshift evolution of these parameters,
we perform the analysis at five distinct redshift steps. We find that the
investigated algorithm, using information solely from the gravity-only
simulation, achieves few-percent accuracy in reproducing the median
intracluster gas pressure and density, albeit with a scatter of approximately
20% for cluster-scale objects. We measure the scaling relation between the
integrated Compton parameter and cluster mass, and find that the imprecision
of baryon pasting adds less than 5% to the intrinsic scatter measured in the
hydrodynamic simulation.
We provide best-fitting parameter values and their redshift evolution, and
discuss future investigations that will be undertaken to extend this work.
Comment: 14 pages, 8 figures, 3 tables; accepted in the Open Journal of Astrophysics
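The per-halo comparison at the heart of this approach can be sketched in a few lines. The following is a minimal, hypothetical illustration — matching halos between the paired runs by position and fitting a single multiplicative normalization to a gas property — and is not the paper's actual pipeline; the function names, the matching tolerance, and the single-parameter model are all illustrative assumptions.

```python
import numpy as np

def match_halos(pos_grav, pos_hydro, max_sep=0.5):
    """Match halos between paired runs by nearest 3D position (e.g. Mpc/h).

    Brute-force nearest neighbour; fine for a sketch, a k-d tree
    would be used at scale. Returns paired index arrays.
    """
    d = np.linalg.norm(pos_grav[:, None, :] - pos_hydro[None, :, :], axis=-1)
    idx = d.argmin(axis=1)
    ok = d[np.arange(len(pos_grav)), idx] < max_sep
    return np.flatnonzero(ok), idx[ok]

def fit_pressure_norm(p_pasted, p_hydro):
    """Best-fit multiplicative normalization between matched gas properties.

    Minimizes the median log offset; the std of the log ratio is the
    lognormal scatter (~20% per the abstract).
    """
    log_ratio = np.log(p_hydro) - np.log(p_pasted)
    norm = np.exp(np.median(log_ratio))
    scatter = np.std(log_ratio)
    return norm, scatter
```

In the paper this comparison is done per halo and per redshift step, which is what allows the model parameters and their redshift evolution to be fit directly.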
ASCR/HEP Exascale Requirements Review Report
This draft report summarizes and details the findings, results, and
recommendations derived from the ASCR/HEP Exascale Requirements Review meeting
held in June 2015. The main conclusions are as follows. 1) Larger, more
capable computing and data facilities are needed to support HEP science goals
in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of
the demand on the 2025 timescale is at least two orders of magnitude greater
than what is currently available, and in some cases more. 2) The growth rate of data
produced by simulations is overwhelming the ability of both facilities
and researchers to store and analyze it. Additional resources and new
techniques for data analysis are urgently needed. 3) Data rates and volumes
from HEP experimental facilities are also straining the ability to store and
analyze these large, complex datasets. Appropriately configured
leadership-class facilities can play a transformational role in enabling
scientific discovery from these datasets. 4) A close integration of HPC
simulation and data analysis will aid greatly in interpreting results from HEP
experiments. Such an integration will minimize data movement and facilitate
interdependent workflows. 5) Long-range planning between HEP and ASCR will be
required to meet HEP's research needs. To best use ASCR HPC resources, the
experimental HEP program needs: a) an established long-term plan for access to
ASCR computational and data resources; b) the ability to map workflows onto
HPC resources; c) ASCR facilities that can accommodate workflows run by
collaborations with thousands of individual members; d) a path to transition
codes to the next-generation HPC platforms that will be available at ASCR
facilities; and e) a workforce trained to develop and use simulations and
analysis in support of HEP scientific research on next-generation systems.
Comment: 77 pages, 13 figures; draft report, subject to further revision
Simulating Hydrodynamics in Cosmology with CRK-HACC
We introduce CRK-HACC, an extension of the Hardware/Hybrid Accelerated Cosmology Code (HACC), to resolve gas hydrodynamics in large-scale structure formation simulations of the universe. The new framework couples the HACC gravitational N-body solver with a modern smoothed-particle hydrodynamics (SPH) approach called conservative reproducing kernel SPH (CRKSPH). CRKSPH utilizes smoothing functions that exactly interpolate linear fields while manifestly preserving conservation laws (momentum, mass, and energy). The CRKSPH method has been incorporated to accurately model baryonic effects in cosmology simulations—an important addition targeting the generation of precise synthetic sky predictions for upcoming observational surveys. CRK-HACC inherits the codesign strategies of the HACC solver and is built to run on modern GPU-accelerated supercomputers. In this work, we summarize the primary solver components and present a number of standard validation tests to demonstrate code accuracy, including idealized hydrodynamic and cosmological setups, as well as self-similarity measurements. © 2023. The Author(s). Published by the American Astronomical Society.
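The defining property named in the abstract — smoothing functions corrected so that linear fields are interpolated exactly — can be illustrated in one dimension. The sketch below is the generic linear reproducing-kernel correction from the RK/CRKSPH literature, not code from CRK-HACC; the cubic-spline kernel, particle spacing, and smoothing length are illustrative choices. The correction coefficients A and B are solved per evaluation point from the local kernel moments so that constants and linear ramps are reproduced to machine precision, regardless of particle disorder.

```python
import numpy as np

def cubic_spline(q):
    """Unnormalized 1D cubic-spline kernel shape with support |q| < 2.

    Overall normalization is irrelevant here: the reproducing-kernel
    correction absorbs it.
    """
    q = np.abs(q)
    return np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
           np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))

def crk_interpolate(x_eval, x_p, f_p, V_p, h):
    """Linearly-reproducing kernel interpolation of f at x_eval.

    Corrected kernel W^R = (A + B*dx) * W, with A, B chosen so that
    sum_j V_j W^R = 1 and sum_j V_j dx_j W^R = 0, which makes the
    interpolant exact for any linear field.
    """
    dx = x_p - x_eval
    W = cubic_spline(dx / h) / h
    m0 = np.sum(V_p * W)            # zeroth moment
    m1 = np.sum(V_p * dx * W)       # first moment
    m2 = np.sum(V_p * dx**2 * W)    # second moment
    det = m0 * m2 - m1**2
    A = m2 / det
    B = -m1 / det
    WR = (A + B * dx) * W
    return np.sum(V_p * f_p * WR)
```

Evaluating this on a linear field f(x) = 2x + 1 returns the exact value at any interior point, which is the "exactly interpolate linear fields" property the abstract refers to; standard (uncorrected) SPH interpolation only achieves this in the limit of ideal particle arrangements.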
CosmoDC2: A Synthetic Sky Catalog for Dark Energy Science with LSST
This paper introduces cosmoDC2, a large synthetic galaxy catalog designed to support precision dark energy science with the Large Synoptic Survey Telescope (LSST). CosmoDC2 is the starting point for the second data challenge (DC2) carried out by the LSST Dark Energy Science Collaboration (LSST DESC). The catalog is based on a trillion-particle, (4.225 Gpc)³ box cosmological N-body simulation, the Outer Rim run. It covers 440 deg² of sky area to a redshift of z = 3 and matches expected number densities from contemporary surveys to a magnitude depth of 28 in the r band. Each galaxy is characterized by a multitude of galaxy properties including stellar mass, morphology, spectral energy distributions, broadband filter magnitudes, host halo information, and weak lensing shear. The size and complexity of cosmoDC2 require an efficient catalog generation methodology; our approach is based on a new hybrid technique that combines data-based empirical approaches with semianalytic galaxy modeling. A wide range of observation-based validation tests has been implemented to ensure that cosmoDC2 enables the science goals of the planned LSST DESC DC2 analyses. This paper also represents the official release of the cosmoDC2 data set, including an efficient reader that facilitates interaction with the data.
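The "data-based empirical" half of the hybrid technique — assigning galaxy properties by drawing from observationally calibrated distributions conditioned on halo properties — can be caricatured in a few lines. This is a toy sketch only: the double-power-law stellar-to-halo-mass shape, its coefficients, and the 0.2 dex lognormal scatter below are illustrative assumptions, not cosmoDC2's actual calibration.

```python
import numpy as np

def assign_stellar_mass(m_halo, rng, scatter_dex=0.2):
    """Toy empirical step: median stellar-to-halo-mass relation + lognormal scatter.

    Illustrative double-power-law SHMR shape peaking near 1e12 Msun
    (coefficients are made up for this sketch, not fit to data).
    """
    x = m_halo / 1e12
    m_star_med = 2e-2 * m_halo / (x**-1.0 + x**0.6)
    # Lognormal scatter about the median relation, in dex.
    return m_star_med * 10.0 ** (scatter_dex * rng.standard_normal(m_halo.shape))
```

In a full pipeline of this kind, steps like this populate halos with empirically anchored properties, while quantities with no direct observational proxy (e.g. detailed spectral energy distributions) come from the semianalytic model, and the combined catalog is then checked against the observation-based validation suite.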