
    Modeling the Galaxy Distribution in Clusters using Halo Cores

    The galaxy distribution in dark matter-dominated halos is expected to approximately trace the details of the underlying dark matter substructure. In this paper we introduce halo 'core-tracking' as a way to efficiently follow the small-scale substructure in cosmological simulations and apply the technique to model the galaxy distribution in observed clusters. The method relies on explicitly tracking the set of particles identified as belonging to a halo's central density core, once a halo has attained a certain threshold mass. The halo cores are then followed throughout the entire evolution of the simulation. The aim of core-tracking is to simplify substructure analysis tasks by avoiding the use of subhalos and, at the same time, to more easily account for the so-called "orphan" galaxies, which have lost substantial dark mass due to tidal stripping. We show that simple models based on halo cores can reproduce the number and spatial distribution of galaxies found in optically selected clusters in the Sloan Digital Sky Survey. We also discuss future applications of the core-tracking methodology in studying the galaxy-halo connection.

    Comment: 17 pages, 20 figures, 1 Appendix; version accepted by OJ
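    To make the described procedure concrete, here is a minimal sketch of the core-tracking idea under stated assumptions: the threshold mass, core size, data layout, and all names are illustrative choices, not the paper's actual implementation.

    ```python
    # Minimal sketch of core-tracking: once a halo first exceeds a threshold
    # mass, tag its most-bound particles as the "core" and follow those
    # particle IDs through every later snapshot. Data layout, names, and
    # parameter values are illustrative assumptions.
    from dataclasses import dataclass

    import numpy as np

    THRESHOLD_MASS = 1e12  # assumed registration threshold (arbitrary units)
    CORE_SIZE = 20         # assumed number of most-bound particles per core

    @dataclass
    class Halo:
        halo_id: int
        mass: float
        particle_ids: np.ndarray      # IDs of the halo's member particles
        binding_energies: np.ndarray  # per particle; lower = more bound

    @dataclass
    class Snapshot:
        number: int
        particle_ids: np.ndarray  # IDs of all particles in the snapshot
        positions: np.ndarray     # (N, 3) comoving positions
        halos: list

    def select_core(halo: Halo) -> np.ndarray:
        """Return the IDs of the CORE_SIZE most-bound particles."""
        order = np.argsort(halo.binding_energies)
        return halo.particle_ids[order[:CORE_SIZE]]

    def track_cores(snapshots: list) -> dict:
        """Map (snapshot number, halo id) to the mean core position."""
        cores = {}    # halo_id -> core particle IDs, fixed at registration
        centers = {}
        for snap in snapshots:
            # Register a core for every halo newly above the threshold.
            for halo in snap.halos:
                if halo.mass >= THRESHOLD_MASS and halo.halo_id not in cores:
                    cores[halo.halo_id] = select_core(halo)
            # Locate each registered core in this snapshot by particle ID.
            index = {pid: i for i, pid in enumerate(snap.particle_ids)}
            for halo_id, pids in cores.items():
                rows = [index[p] for p in pids if p in index]
                if rows:
                    centers[(snap.number, halo_id)] = \
                        snap.positions[rows].mean(axis=0)
        return centers
    ```

    Because core membership is frozen at registration, the particles remain locatable by ID even after a subhalo finder would lose the object to tidal stripping, which is what lets the approach stand in for "orphan" galaxies.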

    Marine Fisheries Stock Assessment Improvement Plan: report of the National Marine Fisheries Service National Task Force for Improving Fish Stock Assessments

    This report argues for greatly increased resources in terms of data collection facilities and staff to collect, process, and analyze the data, and to communicate the results, in order for NMFS to fulfill its mandate to conserve and manage marine resources. In fact, the authors of this report had great difficulty defining the "ideal" situation to which fisheries stock assessments and management should aspire. One of the primary objectives of fisheries management is to develop sustainable harvest policies that minimize the risks of overfishing both target species and associated species. This can be achieved in a wide spectrum of ways, ranging between the following two extremes. The first is to implement only simple management measures with correspondingly simple assessment demands, which will usually mean setting fishing mortality targets at relatively low levels in order to reduce the risk of unknowingly overfishing or driving ecosystems towards undesirable system states. The second is to expand existing data collection and analysis programs to provide an adequate knowledge base that can support higher fishing mortality targets while still ensuring low risk to target and associated species and ecosystems. However, defining "adequate" is difficult, especially when scientists have not even identified all marine species, and information on catches, abundances, and life histories of many target species, and most associated species, is sparse. Increasing calls from the public, stakeholders, and the scientific community to implement ecosystem-based stock assessment and management make it even more difficult to define "adequate," especially when "ecosystem-based management" is itself not well-defined. In attempting to describe the data collection and assessment needs for the latter, the authors took a pragmatic approach, rather than trying to estimate the resources required to develop a knowledge base about the fine-scale detailed distributions, abundances, and associations of all marine species. Thus, the specified resource requirements will not meet the expectations of some stakeholders. In addition, the Stock Assessment Improvement Plan is designed to be complementary to other related plans, and therefore does not duplicate the resource requirements detailed in those plans, except as otherwise noted.

    CosmoDC2: A Synthetic Sky Catalog for Dark Energy Science with LSST

    This paper introduces cosmoDC2, a large synthetic galaxy catalog designed to support precision dark energy science with the Large Synoptic Survey Telescope (LSST). CosmoDC2 is the starting point for the second data challenge (DC2) carried out by the LSST Dark Energy Science Collaboration (LSST DESC). The catalog is based on a trillion-particle, (4.225 Gpc)^3 box cosmological N-body simulation, the 'Outer Rim' run. It covers 440 deg^2 of sky area to a redshift of z=3 and is complete to a magnitude depth of 28 in the r-band. Each galaxy is characterized by a multitude of properties including stellar mass, morphology, spectral energy distributions, broadband filter magnitudes, host halo information, and weak lensing shear. The size and complexity of cosmoDC2 require an efficient catalog generation methodology; our approach is based on a new hybrid technique that combines data-driven empirical approaches with semi-analytic galaxy modeling. A wide range of observation-based validation tests has been implemented to ensure that cosmoDC2 enables the science goals of the planned LSST DESC DC2 analyses. This paper also represents the official release of the cosmoDC2 data set, including an efficient reader that facilitates interaction with the data.
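    The abstract mentions an efficient reader released with the data set; below is a minimal sketch of loading the catalog, assuming the public GCRCatalogs package. The catalog version string, quantity names, and magnitude cut are assumptions and may differ from the installed configuration.

    ```python
    # Sketch of reading cosmoDC2 with the GCRCatalogs package.
    # The version string and quantity names below are assumptions.
    import GCRCatalogs

    catalog = GCRCatalogs.load_catalog('cosmoDC2_v1.1.4_small')  # assumed name
    data = catalog.get_quantities(
        ['ra', 'dec', 'redshift', 'mag_r'],  # assumed quantity names
        filters=['mag_r < 26'],              # keep bright galaxies only
    )
    print(len(data['ra']), 'galaxies selected')
    ```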

    DESC DC2 Data Release Note

    In preparation for cosmological analyses of the Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST), the LSST Dark Energy Science Collaboration (LSST DESC) has created a 300 deg^2 simulated survey as part of an effort called Data Challenge 2 (DC2). The DC2 simulated sky survey, in six optical bands with observations following a reference LSST observing cadence, was processed with the LSST Science Pipelines (version 19.0.0). In this Note, we describe the public data release of the resulting object catalogs for the coadded images of five years of simulated observations along with associated truth catalogs. We include a brief description of the major features of the available data sets. To enable convenient access to the data products, we have developed a web portal connected to Globus data services. We describe how to access the data and provide example Jupyter Notebooks in Python to aid first interactions with the data. We welcome feedback and questions about the data release via a GitHub repository.
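    Since the note points readers to example Jupyter Notebooks for first interactions with the data, here is a minimal sketch of inspecting one downloaded object-catalog file. The file name is a placeholder; the actual file names and column set are documented with the release.

    ```python
    # Sketch of a first look at a downloaded DC2 object-catalog file.
    # The file name is a placeholder, not an actual release file name.
    import pandas as pd

    df = pd.read_parquet('dc2_object_tract_3830.parquet')  # placeholder name
    print(df.columns.tolist()[:10])  # inspect the available columns first
    print(len(df), 'objects in this file')
    ```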

    The LSST DESC DC2 Simulated Sky Survey

    We describe the simulated sky survey underlying the second data challenge (DC2) carried out in preparation for analysis of the Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST) by the LSST Dark Energy Science Collaboration (LSST DESC). Significant connections across multiple science domains will be a hallmark of LSST; the DC2 program represents a unique modeling effort that stresses this interconnectivity in a way that has not been attempted before. This effort encompasses a full end-to-end approach: starting from a large N-body simulation, through setting up LSST-like observations including realistic cadences, through image simulations, and finally processing with Rubin's LSST Science Pipelines. This last step ensures that we generate data products resembling those to be delivered by the Rubin Observatory as closely as is currently possible. The simulated DC2 sky survey covers six optical bands in a wide-fast-deep area of approximately 300 deg^2, as well as a deep drilling field of approximately 1 deg^2. We simulate 5 yr of the planned 10 yr survey. The DC2 sky survey has multiple purposes. First, the LSST DESC working groups can use the data set to develop a range of DESC analysis pipelines to prepare for the advent of actual data. Second, it serves as a realistic test bed for the image processing software under development for LSST by the Rubin Observatory. In particular, simulated data provide a controlled way to investigate certain image-level systematic effects. Finally, the DC2 sky survey enables the exploration of new scientific ideas in both static and time domain cosmology.