
    Halo assembly bias and the tidal anisotropy of the local halo environment

    We study the role of the local tidal environment in determining the assembly bias of dark matter haloes. Previous results suggest that the anisotropy of a halo's environment (i.e., whether it lies in a filament or in a more isotropic region) can play a significant role in determining the eventual mass and age of the halo. We statistically isolate this effect using correlations between the large-scale and small-scale environments of simulated haloes at $z=0$ with masses in the range $10^{11.6} \lesssim (m/h^{-1}M_{\odot}) \lesssim 10^{14.9}$. We probe the large-scale environment using a novel halo-by-halo estimator of linear bias. For the small-scale environment, we identify a variable $\alpha_R$ that captures the tidal anisotropy in a region of radius $R = 4R_{\textrm{200b}}$ around the halo and correlates strongly with halo bias at fixed mass. Segregating haloes by $\alpha_R$ reveals two distinct populations. Haloes in highly isotropic local environments ($\alpha_R \lesssim 0.2$) behave as expected from the simplest, spherically averaged analytical models of structure formation, showing a negative correlation between their concentration and large-scale bias at all masses. In contrast, haloes in anisotropic, filament-like environments ($\alpha_R \gtrsim 0.5$) tend to show a positive correlation between bias and concentration at any mass. Our multi-scale analysis cleanly demonstrates how the overall assembly bias trend across halo mass emerges as an average over these different halo populations, and provides valuable insights towards building analytical models that correctly incorporate assembly bias. We also discuss potential implications for the nature and detectability of galaxy assembly bias.
    Comment: 19 pages, 15 figures; v2: revised in response to referee comments, added references and discussion, conclusions unchanged. Accepted in MNRAS.
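    The abstract quotes the thresholds $\alpha_R \lesssim 0.2$ and $\alpha_R \gtrsim 0.5$ but not the estimator itself. As a rough illustration only, the sketch below computes a tidal-anisotropy field of the kind used in this literature from a gridded overdensity field: the tidal tensor is assembled in Fourier space, its eigenvalues yield a tidal shear $q$, and the anisotropy is taken as $q/(1+\delta)$. The function name, the Gaussian smoothing, and the exact normalization are assumptions here, not necessarily the paper's definition.

        # Illustrative sketch (not the paper's code): tidal anisotropy from an
        # overdensity grid via eigenvalues of the tidal tensor
        # T_ij(k) = (k_i k_j / k^2) delta(k); the normalization is an assumption.
        import numpy as np

        def tidal_anisotropy(delta, box_size, R):
            """Per-cell anisotropy of the overdensity grid `delta` (shape n^3),
            Gaussian-smoothed on scale R (same units as box_size)."""
            n = delta.shape[0]
            k = 2.0 * np.pi * np.fft.fftfreq(n, d=box_size / n)
            kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
            k2 = kx**2 + ky**2 + kz**2
            k2[0, 0, 0] = 1.0  # avoid dividing by zero at the k=0 mode
            dk = np.fft.fftn(delta) * np.exp(-0.5 * k2 * R**2)  # smooth on R
            kvec = (kx, ky, kz)
            T = np.empty((3, 3) + delta.shape)
            for i in range(3):
                for j in range(3):
                    T[i, j] = np.fft.ifftn(kvec[i] * kvec[j] / k2 * dk).real
            lam = np.linalg.eigvalsh(np.moveaxis(T, (0, 1), (-2, -1)))
            l1, l2, l3 = lam[..., 2], lam[..., 1], lam[..., 0]
            delta_R = l1 + l2 + l3  # trace recovers the smoothed overdensity
            q2 = 0.5 * ((l1 - l2)**2 + (l2 - l3)**2 + (l3 - l1)**2)  # shear
            return np.sqrt(q2) / (1.0 + delta_R)  # ~0 in isotropic regions

    Evaluated at halo positions with the smoothing scale tied to each halo's $4R_{\textrm{200b}}$, such a field would separate the two populations the abstract describes.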

    The Dark Energy Survey Data Management System

    The Dark Energy Survey collaboration will study cosmic acceleration with a 5000 deg$^2$ grizY survey in the southern sky over 525 nights during 2011-2016. The DES data management (DESDM) system will be used to process and archive these data and the resulting science-ready data products. The DESDM system consists of an integrated archive, a processing framework, an ensemble of astronomy codes and a data access framework. We are developing the DESDM system for operation in the high performance computing (HPC) environments at NCSA and Fermilab. Operating the DESDM system in an HPC environment offers both speed and flexibility. We will employ it for our regular nightly processing needs, and for more compute-intensive tasks such as large-scale image coaddition campaigns, extraction of weak lensing shear from the full survey dataset, and massive seasonal reprocessing of the DES data. Data products will be available to the Collaboration and later to the public through a virtual-observatory-compatible web portal. Our approach leverages investments in publicly available HPC systems, greatly reducing hardware and maintenance costs to the project, which must deploy and maintain only the storage, database platforms, and orchestration and web portal nodes that are specific to DESDM. In Fall 2007, we tested the current DESDM system on both simulated and real survey data. We used TeraGrid to process 10 simulated DES nights (3 TB of raw data), ingesting and calibrating approximately 250 million objects into the DES Archive database. We also used DESDM to process and calibrate over 50 nights of survey data acquired with the Mosaic2 camera. Comparison to truth tables in the case of the simulated data, and internal crosschecks in the case of the real data, indicate that astrometric and photometric data quality is excellent.
    Comment: To be published in the proceedings of the SPIE conference on Astronomical Instrumentation (held in Marseille in June 2008). This preprint is made available with the permission of SPIE. Further information, together with a preprint containing full-quality images, is available at http://desweb.cosmology.uiuc.edu/wik
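    The abstract sketches the DESDM architecture (integrated archive, processing framework, ensemble of astronomy codes, data access framework) without implementation detail. The toy skeleton below uses entirely hypothetical stage and function names with no relation to the real DESDM codebase; it only illustrates the general shape of such a processing framework: an ordered ensemble of stages applied to each exposure, with the products ingested into an archive at the end.

        # Hypothetical pipeline skeleton; stage names are illustrative only.
        from dataclasses import dataclass, field
        from typing import Callable, Dict, List

        @dataclass
        class Exposure:
            path: str
            products: Dict[str, str] = field(default_factory=dict)

        def detrend(exp: Exposure) -> None:
            exp.products["detrended"] = exp.path + ".detrended"

        def astrometry(exp: Exposure) -> None:
            exp.products["wcs"] = exp.path + ".wcs"  # astrometric solution

        def photometry(exp: Exposure) -> None:
            exp.products["zeropoint"] = "30.0 mag"   # photometric calibration

        def ingest(exp: Exposure) -> None:
            # A real system would write catalogs into the archive database.
            print("ingesting", exp.path, sorted(exp.products))

        PIPELINE: List[Callable[[Exposure], None]] = [
            detrend, astrometry, photometry, ingest,
        ]

        def process_night(paths: List[str]) -> None:
            for exp in (Exposure(p) for p in paths):
                for stage in PIPELINE:  # run the stage ensemble in order
                    stage(exp)

        process_night(["night1/exposure_0001.fits", "night1/exposure_0002.fits"])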

    VM-MAD: a cloud/cluster software for service-oriented academic environments

    The availability of powerful computing hardware in IaaS clouds makes cloud computing attractive also for computational workloads that were until now run almost exclusively on HPC clusters. In this paper we present the VM-MAD Orchestrator software: an open-source framework for cloudbursting Linux-based HPC clusters into IaaS clouds and computational grids. The Orchestrator is completely modular, allowing flexible configurations of cloudbursting policies. It can be used with any batch system or cloud infrastructure, dynamically extending the cluster when needed. A distinctive feature of our framework is that the policies can be tested and tuned in a simulation mode based on historical or synthetic cluster accounting data. In the paper we also describe how the VM-MAD Orchestrator was used in a production environment at the FGCZ to speed up the analysis of mass-spectrometry-based protein data by cloudbursting to Amazon EC2. The advantages of this hybrid system are shown with a large evaluation run using about one hundred large EC2 nodes.
    Comment: 16 pages, 5 figures. Accepted at the International Supercomputing Conference ISC13, June 17-20, Leipzig, Germany.
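    As a flavor of what a pluggable cloudbursting policy can look like, here is a minimal sketch; the class, attribute, and parameter names are assumptions for illustration, not the actual VM-MAD API. The same function can be replayed against historical accounting records to tune its thresholds offline, which is the idea behind the simulation mode.

        # Minimal sketch of a cloudbursting policy (hypothetical API).
        from dataclasses import dataclass

        @dataclass
        class ClusterState:
            pending_jobs: int  # jobs waiting in the batch queue
            idle_vms: int      # cloud nodes currently running no jobs
            total_vms: int     # cloud nodes currently provisioned

        def burst_decision(state: ClusterState,
                           jobs_per_vm: int = 4,
                           max_vms: int = 100) -> int:
            """Return +n to start n VMs, -n to stop n idle VMs, 0 otherwise."""
            if state.pending_jobs > 0:
                wanted = -(-state.pending_jobs // jobs_per_vm)  # ceil division
                return max(0, min(wanted, max_vms - state.total_vms))
            return -state.idle_vms  # queue empty: release idle cloud nodes

        # Queue of 37 jobs, 5 nodes up: asks for 10 more (at 4 jobs per VM).
        print(burst_decision(ClusterState(pending_jobs=37, idle_vms=0, total_vms=5)))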

    SKIRT: hybrid parallelization of radiative transfer simulations

    We describe the design, implementation and performance of the new hybrid parallelization scheme in our Monte Carlo radiative transfer code SKIRT, which has been used extensively for modeling the continuum radiation of dusty astrophysical systems, including late-type galaxies and dusty tori. The hybrid scheme combines distributed-memory parallelization, using the standard Message Passing Interface (MPI) to communicate between processes, with shared-memory parallelization, providing multiple execution threads within each process to avoid duplication of data structures. The synchronization between multiple threads is accomplished through atomic operations without high-level locking (also called lock-free programming). This improves the scaling behavior of the code and substantially simplifies the implementation of the hybrid scheme. The result is an extremely flexible solution that adjusts to the number of available nodes, processors and memory, and consequently performs well on a wide variety of computing architectures.
    Comment: 21 pages, 20 figures
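    The pattern described here (MPI between processes, threads inside each process, no locks on the hot path) can be sketched in a few lines. Python has no portable lock-free atomics, so this sketch substitutes private per-thread tallies that are merged afterwards, which serves the same goal of lock-free accumulation; SKIRT itself achieves it with C++ atomic operations. Names and sizes are illustrative.

        # Hybrid MPI + threads sketch (run with e.g. `mpiexec -n 4 python hybrid.py`).
        import threading
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        N_PHOTONS, N_THREADS, N_BINS = 1_000_000, 4, 64

        def shoot(seed: int, tally: np.ndarray) -> None:
            """One thread's Monte Carlo loop: deposit photon packets into `tally`."""
            rng = np.random.default_rng(seed)
            bins = rng.integers(0, N_BINS, N_PHOTONS // (comm.size * N_THREADS))
            np.add.at(tally, bins, 1.0)  # private tally: no synchronization needed

        local = np.zeros((N_THREADS, N_BINS))
        threads = [threading.Thread(target=shoot,
                                    args=(comm.rank * N_THREADS + t, local[t]))
                   for t in range(N_THREADS)]
        for th in threads:
            th.start()
        for th in threads:
            th.join()

        # Shared-memory merge within the process, then MPI reduction across nodes.
        total = np.zeros(N_BINS)
        comm.Reduce(local.sum(axis=0), total, op=MPI.SUM, root=0)
        if comm.rank == 0:
            print("packets deposited:", int(total.sum()))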

    Hydrogen Epoch of Reionization Array (HERA)

    The Hydrogen Epoch of Reionization Array (HERA) is a staged experiment to measure 21 cm emission from the primordial intergalactic medium (IGM) throughout cosmic reionization ($z = 6$-$12$), and to explore earlier epochs of our Cosmic Dawn ($z \sim 30$). During these epochs, early stars and black holes heated and ionized the IGM, introducing fluctuations in 21 cm emission. HERA is designed to characterize the evolution of the 21 cm power spectrum to constrain the timing and morphology of reionization, the properties of the first galaxies, the evolution of large-scale structure, and the early sources of heating. The full HERA instrument will be a 350-element interferometer in South Africa consisting of 14-m parabolic dishes observing from 50 to 250 MHz. Currently, 19 dishes have been deployed on site and the next 18 are under construction. HERA has been designated as an SKA Precursor instrument. In this paper, we summarize HERA's scientific context and provide forecasts for its key science results. After reviewing the current state of the art in foreground mitigation, we use the delay-spectrum technique to motivate high-level performance requirements for the HERA instrument. Next, we present the HERA instrument design, along with the subsystem specifications that ensure that HERA meets its performance requirements. Finally, we summarize the schedule and status of the project. We conclude by suggesting that, given the realities of foreground contamination, current-generation 21 cm instruments are approaching their sensitivity limits. HERA is designed to bring both the sensitivity and the precision to deliver its primary science on the basis of proven foreground filtering techniques, while developing new subtraction techniques to unlock new capabilities. The result will be a major step toward realizing the widely recognized scientific potential of 21 cm cosmology.
    Comment: 26 pages, 24 figures, 2 tables
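    The delay-spectrum technique invoked here amounts to Fourier-transforming each baseline's visibility spectrum along frequency, so that spectrally smooth foregrounds stay confined to delays below the baseline's horizon delay $|b|/c$ while the 21 cm signal extends beyond it. A toy sketch, with assumed numbers and taper choice:

        # Toy delay spectrum of one baseline's visibility; numbers are assumed.
        import numpy as np

        C = 299_792_458.0                        # speed of light [m/s]
        freqs = np.linspace(100e6, 200e6, 1024)  # band subset [Hz]
        baseline_m = 14.6                        # a short HERA-like baseline [m]

        # Toy visibility: smooth power-law "foreground" plus weak noisy "signal".
        rng = np.random.default_rng(0)
        vis = 1e3 * (freqs / 150e6) ** -2.5 + 0.1 * rng.standard_normal(freqs.size)

        taper = np.blackman(freqs.size)          # suppress spectral leakage
        delays = np.fft.fftshift(np.fft.fftfreq(freqs.size, d=freqs[1] - freqs[0]))
        power = np.abs(np.fft.fftshift(np.fft.fft(vis * taper))) ** 2

        horizon = baseline_m / C                 # foregrounds live below this delay
        print(f"horizon delay: {horizon * 1e9:.1f} ns")
        print("mean power beyond horizon:", power[np.abs(delays) > horizon].mean())

    Because the foreground term is smooth in frequency, its power collapses toward low delays; keeping it there is what the abstract's foreground-motivated performance requirements are about.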