555 research outputs found

    A new parameter space study of cosmological microlensing

    Cosmological gravitational microlensing is a useful technique for understanding the structure of the inner parts of a quasar, especially the accretion disk and the central supermassive black hole. So far, most cosmological microlensing studies have focused on single objects from the ~90 currently known lensed quasars. However, present and planned all-sky surveys are expected to discover thousands of new lensed systems. Using a graphics processing unit (GPU) accelerated ray-shooting code, we have generated 2550 magnification maps uniformly across the convergence (κ) and shear (γ) parameter space of interest to microlensing. We examine the effect of random realizations of the microlens positions on map properties such as the magnification probability distribution (MPD). It is shown that for most of the parameter space a single map is representative of the average behaviour. All of the simulations were carried out on the GPU-Supercomputer for Theoretical Astrophysics Research (gSTAR).
    Comment: 16 pages, 10 figures, accepted for publication in MNRAS
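    The inverse ray-shooting technique behind such magnification maps can be sketched in a few lines: shoot a regular grid of image-plane rays through randomly placed point-mass microlenses (plus smooth convergence and external shear) and bin the deflected rays on the source plane. The following is a minimal CPU sketch under that assumption; all parameter names and defaults are illustrative, and the paper's GPU code is far more elaborate.

    ```python
    import numpy as np

    def magnification_map(kappa, gamma, n_lenses=40, n_rays=300,
                          half_size=8.0, n_pix=40, seed=1):
        """Toy inverse ray-shooting: bin deflected image-plane rays on the
        source plane; counts per pixel are proportional to magnification."""
        rng = np.random.default_rng(seed)
        # random microlens positions -- a different seed gives a different
        # random realization, the quantity studied in the abstract above
        lx = rng.uniform(-half_size, half_size, n_lenses)
        ly = rng.uniform(-half_size, half_size, n_lenses)
        # regular grid of rays on the image plane
        x = np.linspace(-half_size, half_size, n_rays)
        X, Y = np.meshgrid(x, x)
        # macro part of the lens equation with convergence kappa, shear gamma
        bx = (1.0 - kappa - gamma) * X
        by = (1.0 - kappa + gamma) * Y
        # subtract each unit-mass point deflection: (r - r_i) / |r - r_i|^2
        for cx, cy in zip(lx, ly):
            dx, dy = X - cx, Y - cy
            r2 = dx * dx + dy * dy + 1e-10  # softened to avoid 0-division
            bx -= dx / r2
            by -= dy / r2
        # bin deflected rays on the source plane
        counts, _, _ = np.histogram2d(bx.ravel(), by.ravel(), bins=n_pix,
                                      range=[[-half_size, half_size]] * 2)
        # normalise by the mean count so 1.0 is the average magnification
        return counts / counts.mean()
    ```

    A histogram of the resulting pixel values (e.g. `np.histogram(np.log10(m[m > 0]))`) gives the magnification probability distribution (MPD) whose stability across random lens realizations the paper examines.
    
    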

    Data Compression in the Petascale Astronomy Era: a GERLUMPH case study

    As the volume of data grows, astronomers are increasingly faced with choices about what data to keep -- and what to throw away. Recent work evaluating the JPEG2000 (ISO/IEC 15444) standards as a future data format standard in astronomy has shown promising results on observational data. However, there is still a need to evaluate its potential on other types of astronomical data, such as output from numerical simulations. GERLUMPH (the GPU-Enabled High Resolution cosmological MicroLensing parameter survey) represents an example of a data-intensive project in theoretical astrophysics. In the next phase of processing, the ~27 terabyte GERLUMPH dataset is set to grow by a factor of 100 -- well beyond the current storage capabilities of the supercomputing facility on which it resides. In order to minimise bandwidth usage, file transfer time, and storage space, this work evaluates several data compression techniques. Specifically, we investigate off-the-shelf and custom lossless compression algorithms as well as the lossy JPEG2000 compression format. Results of lossless compression algorithms on GERLUMPH data products show small compression ratios (1.35:1 to 4.69:1 of input file size) varying with the nature of the input data. Our results suggest that JPEG2000 could be suitable for other numerical datasets stored as gridded or volumetric data. When approaching lossy data compression, one should keep in mind the intended purposes of the data to be compressed, and evaluate the effect of the loss on future analysis. In our case study, lossy compression and a high compression ratio do not significantly compromise the intended use of the data for constraining quasar source profiles from cosmological microlensing.
    Comment: 15 pages, 9 figures, 5 tables. Published in the Special Issue of Astronomy & Computing on the future of astronomical data formats
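    The kind of lossless comparison described above can be sketched with Python's standard-library codecs; these are not necessarily the exact algorithms the study benchmarked, and the arrays below are stand-ins for real GERLUMPH map data. The ratio reported is input size over compressed size, matching the 1.35:1-to-4.69:1 convention in the abstract.

    ```python
    import bz2
    import lzma
    import zlib

    import numpy as np

    def lossless_ratios(arr):
        """Compression ratio (input_size / compressed_size) of a gridded
        array under three stdlib lossless codecs."""
        raw = arr.tobytes()
        codecs = {"zlib": zlib.compress(raw, 9),
                  "bz2": bz2.compress(raw),
                  "lzma": lzma.compress(raw)}
        return {name: len(raw) / len(buf) for name, buf in codecs.items()}

    # Highly structured data (like the large flat regions of a magnification
    # map) compresses far better than noise-like data -- the dependence on
    # "the nature of the input data" noted in the abstract.
    flat = np.zeros(50_000, dtype=np.float32)
    noisy = np.random.default_rng(0).random(50_000).astype(np.float32)
    r_flat = lossless_ratios(flat)
    r_noisy = lossless_ratios(noisy)
    ```

    For noise-dominated floating-point data the lossless ratio stays close to 1:1, which is why the study also evaluates lossy JPEG2000 compression against the data's intended scientific use.
    
    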

    Characterizing Patterns in E. coli Levels in Rio Grande River Water and Riverbed Sediments near Albuquerque, NM

    In this work I examined how Fecal Indicator Bacteria (FIB) behave in a large environmental system (a ~60 km reach of the Rio Grande near Albuquerque). I addressed the question: how do FIB levels in river water and riverbed sediments of this reach change with distance along the river and throughout one year? I conducted year-round sampling of river water and sediments for concentrations of E. coli bacteria, a persistent contaminant in the area. I found that over the year, E. coli loading in river water increased along the 60 km reach, and E. coli in the sediments increased mainly near the Albuquerque urban area. Site by site along the reach, relative fluctuations in E. coli loadings and sediment concentrations were seasonally coupled. This study found that high E. coli sediment concentrations during summer and fall co-occur with higher summer and fall loadings, and that higher E. coli sediment concentrations downstream may be related to more frequent exceedances of the Total Maximum Daily Load (TMDL) in the downstream section. However, the net direction of E. coli transfer (river water to sediment or sediment to river water) is unknown at any point, and the physical interactions between river water and sediment causing transfer of E. coli cells are not well understood at the reach scale.

    Advanced Architectures for Astrophysical Supercomputing

    Astronomers have come to rely on the increasing performance of computers to reduce, analyze, simulate and visualize their data. In this environment, faster computation can mean more science outcomes or the opening up of new parameter spaces for investigation. If we are to avoid major issues when implementing codes on advanced architectures, it is important that we have a solid understanding of our algorithms. A recent addition to the high-performance computing scene that highlights this point is the graphics processing unit (GPU). The hardware originally designed for speeding up graphics rendering in video games is now achieving speed-ups of O(100×) in general-purpose computation -- performance that cannot be ignored. We are using a generalized approach, based on the analysis of astronomy algorithms, to identify the optimal problem types and techniques for taking advantage of both current GPU hardware and future developments in computing architectures.
    Comment: 4 pages, 1 figure, to appear in the proceedings of ADASS XIX, Oct 4-8 2009, Sapporo, Japan (ASP Conf. Series)
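    The algorithm-analysis point above hinges on structure: computations whose work units are independent map well onto wide, data-parallel hardware. A toy illustration (not the authors' code) is the per-ray deflection sum from microlensing, written once as a scalar loop and once as a single array operation in which every ray is independent -- the same structure a GPU exploits across thousands of threads.

    ```python
    import numpy as np

    def deflection_loop(px, py, lx, ly):
        """Scalar loop over rays: one deflection sum per ray, the way a
        naive serial CPU implementation is typically structured."""
        out = np.empty_like(px)
        for i in range(px.size):
            dx = px[i] - lx
            dy = py[i] - ly
            out[i] = np.sum(dx / (dx * dx + dy * dy))
        return out

    def deflection_batch(px, py, lx, ly):
        """Identical arithmetic expressed as one data-parallel operation
        over a (rays, lenses) grid: each output element is independent,
        which is exactly the problem type that suits GPU execution."""
        dx = px[:, None] - lx[None, :]
        dy = py[:, None] - ly[None, :]
        return np.sum(dx / (dx * dx + dy * dy), axis=1)
    ```

    Both functions compute the same result; recognizing which of a code's inner loops can be rewritten in the second, independent form is the kind of algorithm analysis the abstract advocates before committing to new architectures.
    
    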