
    High-Performance, Multi-Node File Copies and Checksums for Clustered File Systems

    Modern parallel file systems achieve high performance using a variety of techniques, such as striping files across multiple disks to increase aggregate I/O bandwidth and spreading disks across multiple servers to increase aggregate interconnect bandwidth. To achieve peak performance from such systems, it is typically necessary to use multiple concurrent readers/writers on multiple systems to overcome single-system limitations such as processor count and network bandwidth. The standard cp and md5sum tools of GNU coreutils found on every modern Unix/Linux system, however, use a single execution thread on a single CPU core of a single system, and hence cannot take full advantage of the increased performance of clustered file systems. Mcp and msum are drop-in replacements for the standard cp and md5sum programs that exploit multiple types of parallelism and other optimizations to achieve maximum copy and checksum performance on clustered file systems. Multi-threading keeps each node as busy as possible. Read/write parallelism allows the individual operations of a single copy to be overlapped using asynchronous I/O. Multi-node cooperation allows different nodes to take part in the same copy/checksum. Split-file processing allows multiple threads to operate concurrently on the same file. Finally, hash trees allow inherently serial checksums to be computed in parallel. The total speed-ups from all improvements are significant: mcp improves cp performance by over 27x, msum improves md5sum performance by almost 19x, and the combination of mcp and msum speeds up verified copies via cp and md5sum by almost 22x. These improvements come as drop-in replacements for cp and md5sum, so they are easy to adopt, and are available for download as open-source software at http://mutil.sourceforge.net
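    The hash-tree idea in this abstract can be sketched in a few lines of Python: split a buffer into fixed-size chunks, hash each chunk independently (the step that can be farmed out to threads or nodes), then hash the concatenation of the leaf digests. This is a minimal illustration under assumed parameters (chunk size, two-level tree), not mutil's actual on-disk layout:

    ```python
    import hashlib
    from concurrent.futures import ThreadPoolExecutor

    def chunk_md5(chunk: bytes) -> bytes:
        # Leaf hash: one independent MD5 per fixed-size chunk.
        return hashlib.md5(chunk).digest()

    def tree_md5(data: bytes, chunk_size: int = 1 << 20) -> str:
        # Split the buffer into fixed-size chunks; each leaf digest is
        # independent, so the inherently serial MD5 becomes a parallel
        # map followed by one short root hash over the leaf digests.
        chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
        with ThreadPoolExecutor() as pool:
            leaves = list(pool.map(chunk_md5, chunks))
        return hashlib.md5(b"".join(leaves)).hexdigest()
    ```

    Note that the root digest differs from a plain md5sum of the same file; like msum, both sides of a verified copy must use the same tree parameters for digests to compare.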

    The Formation of Spheroids in Early-Type Spirals: Clues From Their Globular Clusters

    We use deep Hubble Space Telescope images taken with the Advanced Camera for Surveys (ACS) in the F475W and F814W filters to investigate the globular cluster systems in four edge-on Sa spiral galaxies covering a factor of 4 in luminosity. The specific frequencies of the blue globular clusters in the galaxies in our sample fall in the range 0.34 -- 0.84, similar to typical values found for later-type spirals. The number of red globular clusters associated with the bulges generally increases with the bulge luminosity, similar to what is observed for elliptical galaxies, although the specific frequency of bulge clusters is a factor of 2-3 lower for the lowest luminosity bulges than for the higher luminosity bulges. We present a new empirical relation between the fraction of red globular clusters and total bulge luminosity based on the elliptical galaxies studied by ACSVCS (ACS Virgo Cluster Survey), and discuss how this diagram can be used to assess the importance that dissipative processes played in building spiral bulges. Our results suggest a picture where dissipative processes, which are expected during gas-rich major mergers, were more important for building luminous bulges of Sa galaxies, whereas secular evolution may have played a larger role in building lower-luminosity bulges in spirals. Comment: accepted for publication in Ap
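    For reference, the specific frequency quoted above is the standard luminosity-normalized cluster count of Harris & van den Bergh; the abstract does not restate it, so the conventional definition is given here:

    ```latex
    S_N = N_{\mathrm{GC}} \times 10^{\,0.4\,(M_V + 15)}
    ```

    where $N_{\mathrm{GC}}$ is the number of globular clusters and $M_V$ is the absolute V-band magnitude of the host galaxy (or, for the bulge specific frequency, of the bulge alone).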

    The Fundamental Scaling Relations of Elliptical Galaxies

    (ABRIDGED) We examine the fundamental scaling relations of elliptical galaxies formed through mergers. Using hundreds of simulations to judge the impact of progenitor galaxy properties on merger remnants, we find that gas dissipation provides an important contribution to tilt in the Fundamental Plane relation. Dissipationless mergers of disks produce remnants that occupy the virial plane. As the gas content of disk galaxies is increased, the tilt of the Fundamental Plane relation increases and the slope of the Re-M_* relation steepens. For gas fractions fgas > 30%, the simulated Fundamental Plane scalings approach those observed in the K-band. In our simulations, feedback from supermassive black hole growth has only a minor influence on the stellar-mass scaling relations of spheroidal galaxies, but may play a role in maintaining the observed Fundamental Plane tilt at optical wavelengths by suppressing residual star formation in merger remnants. We estimate that \approx 40-100% of the Fundamental Plane tilt induced by structural properties owes to trends in the central total-to-stellar mass ratio M_total/M_* produced by dissipation. Lower mass systems obtain greater phase-space densities than higher mass systems, producing a galaxy mass-dependent central M_total/M_* and a corresponding tilt in the Fundamental Plane. Comment: Version accepted by ApJ, 20 pages, 18 figures, resolution reduced for siz
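    The "tilt" discussed here is the departure of the observed Fundamental Plane from the virial expectation. In the usual power-law form (the observed coefficients below are typical literature values, not taken from this abstract):

    ```latex
    R_e \propto \sigma^{a}\, I_e^{\,b},
    \qquad \text{virial:}\ a = 2,\ b = -1,
    \qquad \text{observed (optical):}\ a \approx 1.2,\ b \approx -0.8,
    ```

    where $R_e$ is the effective radius, $\sigma$ the central velocity dispersion, and $I_e$ the mean surface brightness within $R_e$. A constant central $M_{\mathrm{total}}/M_*$ would reproduce the virial coefficients; the mass-dependent ratio produced by dissipation tilts the plane toward the observed values.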

    CIV Emission and the Ultraviolet through X-ray Spectral Energy Distribution of Radio-Quiet Quasars

    In the restframe UV, two of the parameters that best characterize the range of emission-line properties in quasar broad emission-line regions are the equivalent width and the blueshift of the CIV line relative to the quasar rest frame. We explore the connection between these emission-line properties and the UV through X-ray spectral energy distribution (SED) for radio-quiet (RQ) quasars. Our sample consists of a heterogeneous compilation of 406 quasars from the Sloan Digital Sky Survey and Palomar-Green survey that have well-measured CIV emission-line and X-ray properties (including 164 objects with measured Gamma). We find that RQ quasars with both strong CIV emission and small CIV blueshifts can be classified as "hard-spectrum" sources that are (relatively) strong in the X-ray as compared to the UV. On the other hand, RQ quasars with both weak CIV emission and large CIV blueshifts are instead "soft-spectrum" sources that are (relatively) weak in the X-ray as compared to the UV. This work helps to further bridge optical/soft X-ray "Eigenvector 1" relationships to the UV and hard X-ray. Based on these findings, we argue that future work should consider systematic errors in bolometric corrections (and thus accretion rates) that are derived from a single mean SED. Detailed analysis of the CIV emission line may allow for SED-dependent corrections to these quantities. Comment: AJ, in press; 39 pages, 11 figures, 3 table
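    The relative X-ray-to-UV strength invoked here is conventionally quantified by the two-point spectral index $\alpha_{\mathrm{ox}}$; this is the standard definition, not restated in the abstract:

    ```latex
    \alpha_{\mathrm{ox}} = 0.3838 \,
    \log\!\left( \frac{L_{2\,\mathrm{keV}}}{L_{2500\,\mathrm{\text{\AA}}}} \right),
    ```

    where the constant $0.3838 = 1/\log_{10}(\nu_{2\,\mathrm{keV}}/\nu_{2500\,\mathrm{\text{\AA}}})$ and the $L_\nu$ are monochromatic luminosities. "Hard-spectrum" (X-ray bright) quasars in this terminology have less negative $\alpha_{\mathrm{ox}}$.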

    Report on IOCCG Workshop Phytoplankton Composition from Space: towards a validation strategy for satellite algorithms

    The IOCCG-supported workshop “Phytoplankton Composition from Space: towards a validation strategy for satellite algorithms” was organized as a follow-up to the Phytoplankton Functional Types from Space splinter session, held at the International Ocean Colour Science Meeting (Germany, 2013). The specific goals of the workshop were to:
    1. Provide a summary of the status of activities from relevant IOCCG working groups, the 2nd PFT intercomparison working group, PFT validation data sets and other research developments.
    2. Provide a PFT validation strategy that considers the different applications of PFT products, and seek community consensus on datasets and analysis protocols.
    3. Discuss possibilities for sustaining ongoing PFT algorithm validation and intercomparison activities.
    The workshop included 15 talks, breakout sessions and plenary discussions. Talks covered community algorithm intercomparison activity updates, reviews of established and novel methods for PFT validation, validation activities for specific applications, and space-agency requirements for PFT products and validation. These were followed by general discussions on:
    (a) major recommendations for the global intercomparison initiative with respect to validation, intercomparison and the user's guide;
    (b) developing a community consensus on which validation data sets are optimal, and which measurement and analysis protocols should be followed to support sustained validation of PFT products across different applications;
    (c) the status of different validation databases and measurement protocols for different PFT applications; and
    (d) engagement of the various user communities for PFT algorithms in developing PFT product specifications.
    From these discussions, two breakout groups provided in-depth discussion and recommendations on (1) validation of current algorithms and (2) a work plan to prepare for validation of future missions.
    Breakout group 1 provided an action list for progressing the current international community validation and intercomparison activity. Breakout group 2 provided the following recommendations towards developing a future validation strategy for satellite PFT products:
    1. Establish a number of validation sites that maintain measurements of a key set of variables.
    2. This set of variables should include:
    • Phytoplankton pigments from HPLC, and phycobilins from spectrofluorometry
    • Phytoplankton cell counts and identification, volume/carbon estimation, and imaging (e.g. from flow cytometry, FlowCam and FlowCytobot-type technologies)
    • Inherent optical properties (e.g. absorption, backscattering, VSF)
    • Hyperspectral radiometry (both above- and in-water)
    • Particle size distribution
    • Size-fractionated measurements of pigments and absorption
    • Genetic/-omics data
    3. Undertake an intercomparison of methods and instruments over several years at a few sites to understand our capability to fully characterize the phytoplankton community.
    4. Organise workshops to address the following topics:
    • Techniques for particle analysis, characterization and classification
    • Engagement with modellers and understanding end-user requirements
    • Data storage and management, standards for data contributors, and data challenges
    In conclusion, the workshop was assessed to have fulfilled its goals. A follow-on meeting will be organized during the International Ocean Colour Science Meeting 2015 in San Francisco. Specific follow-on actions are listed at the end of the report

    The Millennium Galaxy Catalogue: Bulge/Disc Decomposition of 10095 Nearby Galaxies

    We have modelled the light distribution in 10095 galaxies from the Millennium Galaxy Catalogue (MGC), providing publicly available structural catalogues for a large, representative sample of galaxies in the local Universe. Three different models were used: (1) a single Sersic function for the whole galaxy, (2) a bulge-disc decomposition model using a de Vaucouleurs (R^{1/4}) bulge plus exponential disc, (3) a bulge-disc decomposition model using a Sersic (R^{1/n}) bulge plus exponential disc. Repeat observations for 700 galaxies demonstrate that stable measurements can be obtained for object components with a half-light radius comparable to, or larger than, the seeing half-width at half maximum. We show that with careful quality control, robust measurements can be obtained for large samples such as the MGC. We use the catalogues to show that the galaxy colour bimodality is due to the two-component nature of galaxies (i.e. bulges and discs) and not to two distinct galaxy populations. We conclude that understanding galaxy evolution demands the routine bulge-disc decomposition of the giant galaxy population at all redshifts. Comment: Accepted for publication in MNRAS. 23 pages, 20 figure
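    The Sersic profile underlying models (1) and (3) has the standard form (the approximation for $b_n$ is a widely used fit, not restated in the abstract):

    ```latex
    I(R) = I_e \exp\!\left\{ -b_n \left[ \left( \frac{R}{R_e} \right)^{1/n} - 1 \right] \right\},
    \qquad b_n \approx 1.9992\,n - 0.3271,
    ```

    where $I_e$ is the intensity at the half-light radius $R_e$ and $b_n$ is chosen so that $R_e$ encloses half the total light. The exponential disc is the $n = 1$ case and the de Vaucouleurs bulge of model (2) is the $n = 4$ case.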

    Tools for Assessment of Country Preparedness for Public Health Emergencies: A Critical Review

    Recent international communicable disease crises have highlighted the need for countries to assure their preparedness to respond effectively to public health emergencies. The objective of this study was to critically review existing tools to support a country’s assessment of its health emergency preparedness. We developed a framework to analyze the expected effectiveness and utility of these tools. Through mixed search strategies, we identified 12 tools with relevance to public health emergencies. There was considerable consensus concerning the critical preparedness system elements to be assessed, although their relative emphasis and means of assessment and measurement varied considerably. Several tools identified appeared to have reporting requirements as their primary aim, rather than primary utility for system self-assessment by the countries and states using the tool. Few tools attempted to give an account of their underlying evidence base. Only some tools were available in a user-friendly electronic modality or included quantitative measures to support the monitoring of system preparedness over time. We conclude there is still a need for improvement in tools available for assessment of country preparedness for public health emergencies, and for applied research to increase identification of system measures that are valid indicators of system response capability. Peer Reviewed