
    High Performance P3M N-body code: CUBEP3M

    This paper presents CUBEP3M, a publicly available high-performance cosmological N-body code, and describes many utilities and extensions that have been added to the standard package. These include a memory-light runtime SO halo finder, a non-Gaussian initial conditions generator, and a system of unique particle identification. CUBEP3M is fast, its accuracy is tuneable to optimize speed or memory, and it has been run on more than 27,000 cores, achieving within a factor of two of ideal weak scaling even at this problem size. The code can be run in an extra-lean mode where the peak memory imprint for large runs is as low as 37 bytes per particle, which is almost two times leaner than other widely used N-body codes. However, load imbalances can increase this requirement by a factor of two, such that fast configurations with all the utilities enabled and load imbalances factored in require between 70 and 120 bytes per particle. CUBEP3M is well designed to study large-scale cosmological systems, where imbalances are not too large and adaptive time-stepping is not essential. It has already been used for a broad range of science applications that require either large samples of non-linear realizations or very large dark matter N-body simulations, including cosmological reionization, halo formation, baryonic acoustic oscillations, weak lensing, and non-Gaussian statistics. We discuss the structure, the accuracy, known systematic effects, and the scaling performance of the code and its utilities, where applicable.
    Comment: 20 pages, 17 figures; added halo profiles, updated to match the MNRAS accepted version
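
    The P3M approach in the code's name splits gravity into a smooth long-range piece solved on a mesh (via FFT) and an exact short-range pairwise correction. A minimal sketch of one standard way to perform this split, using the erf/erfc decomposition of the 1/r potential (CUBEP3M's actual short-range kernel may differ):

```python
import math

def split_force(r, a):
    """Split the inverse-square force 1/r^2 into a short-range part
    (summed particle-particle, decays fast) and a long-range part
    (smooth, suitable for an FFT mesh), using the erf/erfc split of
    the 1/r potential with splitting scale a."""
    g = 2.0 * math.exp(-(r / a) ** 2) / (a * math.sqrt(math.pi) * r)
    f_short = math.erfc(r / a) / r**2 + g   # negligible beyond a few a
    f_long = math.erf(r / a) / r**2 - g     # smooth everywhere
    return f_short, f_long

fs, fl = split_force(3.0, 1.0)
assert abs(fs + fl - 1.0 / 3.0**2) < 1e-12  # parts recombine exactly
```

    Because the short-range part is effectively zero beyond a few splitting scales, the pairwise sum only needs to visit nearby particles, which is what keeps communication local in a distributed run.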

    Simulating Cosmic Reionization

    The Cosmic Dark Ages and the Epoch of Reionization constitute a crucial missing link in our understanding of the evolution of the intergalactic medium and of the formation and evolution of galaxies. Due to the complex nature of this global process, it is best studied through large-scale numerical simulations, which presents considerable computational challenges. The dominant contributors of ionizing radiation were dwarf galaxies. These tiny galaxies must be resolved in very large cosmological volumes in order to derive their clustering properties and the corresponding observational signatures correctly, which makes this one of the most challenging problems of numerical cosmology. We have recently performed the largest and most detailed simulations of the formation of early cosmological large-scale structure and its radiative feedback leading to cosmic reionization. This was achieved by running extremely large (up to 29 billion-particle) N-body simulations of the formation of the Cosmic Web, with enough particles and sufficient force resolution to resolve all the galactic halos with total masses larger than 10^8 solar masses in computational volumes of up to (163 Mpc)^3. These results were then post-processed by propagating the ionizing radiation from all sources using a fast and accurate ray-tracing radiative transfer method. Both of our codes are parallelized using a combination of MPI and OpenMP and to date have been run efficiently on up to 2048 cores (N-body) and up to 10,000 cores (radiative transfer) on the newly deployed Sun Constellation Linux Cluster at the Texas Advanced Computing Center. In this paper we describe our codes, parallelization strategies, scaling, and some preliminary scientific results. (abridged)
    Comment: Accepted refereed contribution to the TeraGrid08 proceedings
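
    The quoted numbers fix the mass resolution directly: the mean matter density times the box volume, divided by the particle count, gives the particle mass. A back-of-the-envelope check, assuming WMAP-era cosmological parameters (Omega_m ≈ 0.27, h ≈ 0.7; these values are an assumption, not stated in the abstract):

```python
# Rough N-body mass-resolution estimate from the abstract's numbers.
OMEGA_M = 0.27                 # matter density parameter (assumed)
H = 0.7                        # dimensionless Hubble parameter (assumed)
RHO_CRIT = 2.775e11 * H**2     # critical density, M_sun per Mpc^3

box_mpc = 163.0                # comoving box side, Mpc (from abstract)
n_particles = 29e9             # particle count (from abstract)

m_box = OMEGA_M * RHO_CRIT * box_mpc**3   # total mass in the box, M_sun
m_particle = m_box / n_particles          # ~5e6 M_sun per particle
per_halo = 1e8 / m_particle               # ~20 particles per minimum halo
```

    A 10^8 solar-mass halo is thus resolved by only a few tens of particles, which is why the halo-finding and source-assignment steps are so demanding at this volume.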

    Towards optimal parallel PM N-body codes: PMFAST

    We present a new parallel PM N-body code named PMFAST that is freely available to the public. PMFAST is based on a two-level mesh gravity solver in which the gravitational forces are separated into long- and short-range components. The decomposition scheme minimizes communication costs and tolerates slow networks. The code approaches optimality in several dimensions: the force computations are local and exploit highly optimized vendor FFT libraries, and the memory overhead is minimal, with the particle positions and velocities being the main cost. The code supports distributed- and shared-memory parallelization through the use of MPI and OpenMP, respectively. The current release version uses two grid levels on a slab decomposition, with periodic boundary conditions for cosmological applications. Open boundary conditions could be added with little computational overhead. We present timing information and results from a recent cosmological production run of the code using a 3712^3 mesh with 6.4 x 10^9 particles. PMFAST is cost-effective, memory-efficient, and publicly available.
    Comment: 18 pages, 11 figures
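
    The production-run numbers are internally consistent: 6.4 x 10^9 particles is (3712/2)^3, i.e. one particle per 2^3 fine-mesh cells. A rough footprint estimate for the dominant memory cost, assuming single-precision storage of positions and velocities (the precision is an assumption):

```python
# Memory sketch for the quoted PMFAST production run.
n_mesh = 3712          # fine-mesh cells per side (from abstract)
n_particles = 6.4e9    # particle count (from abstract)

# particle grid is half the mesh resolution: (3712/2)^3 ~ 6.4e9
assert abs((n_mesh / 2) ** 3 - n_particles) / n_particles < 0.01

bytes_per_particle = 6 * 4      # x, y, z, vx, vy, vz in single precision
total_gib = n_particles * bytes_per_particle / 2**30   # ~143 GiB aggregate
```

    At 24 bytes per particle the phase-space data alone is roughly 143 GiB across the cluster, which is why keeping mesh and buffer overheads small matters.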

    Unnatural selection of salmon life histories in a modified riverscape

    Altered river flows and fragmented habitats often simplify riverine communities and favor non‐native fishes, but their influence on life‐history expression and survival is less clear. Here, we quantified the expression and ultimate success of diverse salmon emigration behaviors in an anthropogenically altered California river system. We analyzed two decades of Chinook salmon monitoring data to explore the influence of regulated flows on juvenile emigration phenology, abundance, and recruitment. We then followed seven cohorts into adulthood using otolith (ear stone) chemical archives to identify patterns in time‐ and size‐selective mortality along the migratory corridor. Suppressed winter flow cues were associated with delayed emigration timing, particularly in warm, dry years, which was also when selection against late migrants was the most extreme. Lower, less variable flows were also associated with reduced juvenile and adult production, highlighting the importance of streamflow for cohort success in these southernmost populations. While most juveniles emigrated from the natal stream as fry or smolts, the survivors were dominated by the rare few that left at intermediate sizes and times, coinciding with managed flows released before extreme summer temperatures. The consistent selection against early (small) and late (large) migrants counters prevailing ecological theory that predicts different traits to be favored under varying environmental conditions. Yet, even with this weakened portfolio, maintaining a broad distribution in migration traits still increased adult production and reduced variance. In years exhibiting large fry pulses, even marginal increases in their survival would have significantly boosted recruitment. However, management actions favoring any single phenotype could have negative evolutionary and demographic consequences, potentially reducing adaptability and population stability. 
To recover fish populations and support viable fisheries in a warming and increasingly unpredictable climate, coordinating flow and habitat management within and among watersheds will be critical to balance trait optimization against diversification.
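
    The variance-damping ("portfolio") effect invoked above, where a broad distribution of migration traits reduces year-to-year swings in adult production, can be illustrated with a toy model. All numbers below are hypothetical and only show the statistical mechanism, not the study's data:

```python
import random
import statistics

random.seed(1)
YEARS = 2000

def production(weights):
    """Simulated annual adult production for a cohort split across three
    emigration phenotypes (e.g. fry / intermediate / smolt) whose survival
    varies independently between years (hypothetical lognormal draws)."""
    totals = []
    for _ in range(YEARS):
        survival = [random.lognormvariate(0.0, 1.0) for _ in range(3)]
        totals.append(sum(w * s for w, s in zip(weights, survival)))
    return totals

single = production([1.0, 0.0, 0.0])       # all fish in one phenotype
diverse = production([1/3, 1/3, 1/3])      # broad trait distribution

cv = lambda xs: statistics.pstdev(xs) / statistics.fmean(xs)
assert cv(diverse) < cv(single)  # diversification damps relative variance
```

    With uncorrelated good and bad years across phenotypes, spreading the cohort lowers the coefficient of variation of total production, which is the sense in which the paper's "weakened portfolio" still stabilizes recruitment.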

    Mitochondrial physiology

    As the knowledge base and importance of mitochondrial physiology to evolution, health and disease expands, the necessity for harmonizing the terminology concerning mitochondrial respiratory states and rates has become increasingly apparent. The chemiosmotic theory establishes the mechanism of energy transformation and coupling in oxidative phosphorylation. The unifying concept of the protonmotive force provides the framework for developing a consistent theoretical foundation of mitochondrial physiology and bioenergetics. We follow the latest SI guidelines and those of the International Union of Pure and Applied Chemistry (IUPAC) on terminology in physical chemistry, extended by considerations of open systems and thermodynamics of irreversible processes. The concept-driven constructive terminology incorporates the meaning of each quantity and aligns concepts and symbols with the nomenclature of classical bioenergetics. We endeavour to provide a balanced view of mitochondrial respiratory control and a critical discussion on reporting data of mitochondrial respiration in terms of metabolic flows and fluxes. Uniform standards for evaluation of respiratory states and rates will ultimately contribute to reproducibility between laboratories and thus support the development of data repositories of mitochondrial respiratory function in species, tissues, and cells. Clarity of concept and consistency of nomenclature facilitate effective transdisciplinary communication, education, and ultimately further discovery.
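
    The protonmotive force at the heart of the chemiosmotic theory combines an electrical term (the membrane potential) and a chemical term (the transmembrane pH difference). A worked textbook example with typical illustrative values (the specific numbers below are assumptions, not taken from the paper):

```python
# Textbook chemiosmotic relation: pmf = electrical part + chemical part.
R = 8.314        # gas constant, J mol^-1 K^-1
F = 96485.0      # Faraday constant, C mol^-1
T = 310.0        # temperature, K (37 C)

# conversion factor from pH units to millivolts, ~61.5 mV at 37 C
Z = 2.303 * R * T / F * 1000.0

delta_psi = 150.0   # membrane potential contribution, mV (typical value)
delta_pH = 0.5      # pH units, matrix more alkaline than the IMS (typical)

delta_p = delta_psi + Z * delta_pH   # protonmotive force, mV
print(f"pmf = {delta_p:.0f} mV")     # on the order of 180 mV
```

    The same framework distinguishes metabolic flows (per system, e.g. per million cells) from fluxes (normalized per mass or volume), which is the normalization issue the terminology effort addresses.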

    A new role for psychographics in media selection

    Matching between media audience and target market membership in media selection often involves the use of demographics to define the target markets. This is an indirect matching process, and it has been argued that media selection directly matched to the target markets may be more appropriate. A possible method for better indirect matching is the use of psychographics (self-concept and buying style) when conditions necessary for direct matching are not present. The efficiency of psychographic indirect matching is tested using 20 randomly selected product usage categories as target markets. Data were gathered from the 1975 Target Group Index report. The research followed a three-step approach: (1) direct matching was used to select the 25 most efficient media vehicles for each target market; (2) indirect matching using psychographics was used to select 25 different media vehicles for each market; (3) the direct and indirect media selections were compared. Results show that the change from direct to indirect matching causes a 65% loss of matching efficiency. Psychographics do not appear to perform any better than demographics as target market surrogates. The fact that they perform as well as demographics indicates that psychographics can be used to supplement demographics when indirect matching must be used.
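
    The notion of matching efficiency can be made concrete: rank vehicles by the surrogate (psychographic) measure, then ask how much of the directly measured target-market reach those picks capture relative to the best possible picks. All reach values below are invented for illustration; the study's 65% figure is not reproduced here:

```python
# Toy matching-efficiency calculation (hypothetical reach data).
# vehicle -> reach within the actual target market (direct measurement)
target_reach = {"A": 0.30, "B": 0.25, "C": 0.20, "D": 0.10, "E": 0.05}
# vehicle -> reach within a psychographic segment used as a surrogate
segment_reach = {"A": 0.10, "B": 0.30, "C": 0.05, "D": 0.25, "E": 0.20}

k = 2  # pick the top-k vehicles under each rule
direct = sorted(target_reach, key=target_reach.get, reverse=True)[:k]
indirect = sorted(segment_reach, key=segment_reach.get, reverse=True)[:k]

best = sum(target_reach[v] for v in direct)        # reach of optimal picks
achieved = sum(target_reach[v] for v in indirect)  # reach of surrogate picks
efficiency = achieved / best                       # fraction retained
```

    Any divergence between the surrogate ranking and the true ranking shows up as efficiency below 1.0, which is the quantity the paper reports a 65% loss in.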

    The Theory and Simulation of the 21-cm Background from the Epoch of Reionization.

    The redshifted 21-cm line of distant neutral H atoms provides a probe of the cosmic "dark ages" and the epoch of reionization ("EOR") which ended them, within the first billion years of cosmic time. The radio continuum produced by this redshifted line can be seen in absorption or emission against the cosmic microwave background ("CMB") at meter wavelengths, yielding information about the thermal and ionization history of the universe and the primordial density perturbation spectrum that led to galaxy and large-scale structure formation. Observing this 21-cm background is a great challenge, as it is necessary to detect a diffuse signal at a brightness temperature that differs from that of the CMB at millikelvin levels and distinguish this from foreground continuum sources. A new generation of low-frequency radio arrays is currently under development to search for this background. Accurate theoretical predictions of the spectrum and anisotropy of this background, necessary to guide and interpret future observations, are also quite challenging. Toward this end, it is necessary to model the inhomogeneous reionization of the intergalactic medium and determine the spin temperature of the 21-cm transition and its variations in time and space as it decouples from the temperature of the CMB. In my talk, I summarized some of the theoretical progress in this area. Here, I will focus on just a few of the predictions for the 21-cm background from the EOR, based on our newest, large-scale simulations of patchy reionization. These simulations are the first with enough N-body particles (from 5 to 29 billion) and radiative transfer rays to resolve the formation of and trace the ionizing radiation from each of the millions of dwarf galaxies believed responsible for reionization, down to 10^8 solar masses, in a cubic volume large enough (90 and 163 comoving Mpc on a side) to make meaningful statistical predictions of the fluctuating 21-cm background.
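
    The millikelvin-level signal quoted above follows from the standard analytic approximation for the differential 21-cm brightness temperature against the CMB (peculiar-velocity term omitted). The cosmological parameter defaults below are typical values, not taken from this paper:

```python
import math

def delta_Tb(x_HI, delta, z, T_S, omega_m_h2=0.15, omega_b_h2=0.023):
    """Differential 21-cm brightness temperature in mK: positive in
    emission when the spin temperature T_S exceeds the CMB temperature,
    negative in absorption when it falls below it."""
    T_cmb = 2.725 * (1.0 + z)   # CMB temperature at redshift z, K
    return (27.0 * x_HI * (1.0 + delta)
            * math.sqrt((1.0 + z) / 10.0)
            * math.sqrt(0.15 / omega_m_h2)
            * (omega_b_h2 / 0.023)
            * (1.0 - T_cmb / T_S))

# fully neutral gas at mean density, z = 9, spin temperature >> T_CMB:
print(round(delta_Tb(1.0, 0.0, 9.0, 1e4), 1))  # → 26.9 (mK)
```

    Patchy reionization makes x_HI (and hence the signal) fluctuate on the scales the simulations resolve, which is what the predicted anisotropy maps encode.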

    MEO dynamics and GNSS disposal strategies

    In recent years, significant advances were achieved in the knowledge of the dynamics of MEO satellites. Much work was done on the analysis of viable disposal strategies and technologies for the spacecraft belonging to the Global Navigation Satellite Systems (GNSS), with particular emphasis on the European Galileo system. In the framework of an ESA-ESOC contract, extensive numerical simulations of different long-term evolution scenarios, implementing different disposal strategies, were performed. A detailed analysis of the collision risk and manoeuvre needs related to the different scenarios was carried out. In terms of the long-term evolution, the scenarios where the orbital instabilities are exploited to remove the objects from the operational regions seem favoured. That is, if the focus is on the long-term sustainability of the space environment, the possibility to dilute the collision risk and to aim at the atmospheric re-entry of a subset of the disposed GNSS spacecraft is the most attractive. The most "problematic" constellations are Glonass and Beidou. This conclusion is driven by the future launch traffic hypothesized for these constellations and, in the case of Glonass, by past practices that have already left a significant number of large uncontrolled spacecraft in the constellation orbital zone. On the other hand, the Galileo constellation is well detached from the others and faces the lowest collision risks. The Stable scenarios seem to minimize the interactions (crossings) with the operational constellations and might therefore be preferred for operational reasons; in particular, in the Stable scenarios the inter-constellation interaction is negligible. Particular care should be devoted to the efficiency and reliability of the disposal manoeuvres. A significant share of the collision risk faced by the operational satellites in every simulated scenario can be traced back to the "failed" satellites (the success rate of the disposal manoeuvres was assumed to be 90% for all the constellations).
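
    The 90% disposal success rate assumed in the simulations translates directly into an expected population of uncontrolled objects left near the operational zone. A toy Monte Carlo sketch (the number of retired spacecraft is hypothetical, chosen only for illustration):

```python
import random

random.seed(0)
SUCCESS_RATE = 0.90   # disposal-manoeuvre success rate assumed in the study
N_RETIRED = 120       # hypothetical number of retired spacecraft
TRIALS = 10000

# count how many disposals fail (leaving an uncontrolled object) per trial
failed_counts = [
    sum(random.random() > SUCCESS_RATE for _ in range(N_RETIRED))
    for _ in range(TRIALS)
]
mean_failed = sum(failed_counts) / TRIALS   # expectation ~ 0.10 * 120 = 12
```

    Even a modest failure rate thus leaves a steady stream of derelicts, which is why the abstract attributes a significant share of the residual collision risk to "failed" satellites.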