Cosmological Simulations Using Special Purpose Computers: Implementing P3M on GRAPE
An adaptation of the Particle-Particle/Particle-Mesh (P3M) code to the
special purpose hardware GRAPE is presented. The short range force is
calculated by a four chip GRAPE-3A board, while the rest of the calculation is
performed on a Sun Sparc 10/51 workstation. The limited precision of the GRAPE
hardware and algorithm constraints introduce stochastic errors of the order of
a few percent in the gravitational forces. Tests of this new P3MG3A code show
that it is a robust tool for cosmological simulations. The code currently
achieves a peak efficiency of one third the speed of the vectorized P3M code on
a Cray C-90, and significant improvements are planned in the near future.
Special purpose computers like GRAPE are therefore an attractive alternative to
supercomputers for numerical cosmology.
Comment: 9 pages (ApJS style); uuencoded compressed PostScript file (371 kb). Also available by anonymous ftp to astro.Princeton.EDU [128.112.24.45] in summers/grape/p3mg3a.ps (668 kb) and on the WWW at http://astro.Princeton.EDU/~library/prep.html (as POPe-600). Send all comments, questions, requests, etc. to: [email protected]
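The force split at the heart of P3M is straightforward: each particle's gravity divides into a short-range pair sum evaluated directly out to a cutoff radius (the part offloaded to the GRAPE board) and a long-range contribution computed on a mesh by the host. Below is a minimal Python sketch of the short-range kernel only; `r_cut` and the Plummer softening `eps` are illustrative parameters, and the mesh-correction shaping function that a full P3M code folds into this sum is omitted.

```python
import numpy as np

def pp_short_range(pos, mass, r_cut, eps):
    """Direct-sum short-range force: the piece P3M offloads to GRAPE.

    A minimal O(N^2) sketch (illustrative only); the hardware pipelines
    this per-particle sum. `r_cut` is the PP/PM hand-off radius and
    `eps` a Plummer softening length. The shaping function that cancels
    the mesh force inside r_cut is omitted for brevity. Units: G = 1.
    """
    n = len(mass)
    acc = np.zeros((n, 3))
    for i in range(n):
        d = pos - pos[i]                      # displacements to all particles
        r2 = (d ** 2).sum(axis=1) + eps ** 2  # softened squared distances
        near = r2 < r_cut ** 2
        near[i] = False                       # exclude self-interaction
        acc[i] = (mass[near, None] * d[near] / r2[near, None] ** 1.5).sum(axis=0)
    return acc
```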
A Search for Sub-Millisecond Pulsars
We have conducted a search of 19 southern Galactic globular clusters for
sub-millisecond pulsars at 660 MHz with the Parkes 64-m radio telescope. To
minimize dispersion smearing we used the CPSR baseband recorder, which samples
the 20 MHz observing band at the Nyquist rate. By possessing a complete
description of the signal we could synthesize an optimal filterbank in
software, and in the case of globular clusters of known dispersion measure,
much of the dispersion could be removed using coherent techniques. This allowed
for very high time resolution (25.6 µs in most cases), making our searches in
general sensitive to sub-millisecond pulsars with flux densities greater than
about 3 mJy at 50 cm. No new pulsars were discovered, placing important
constraints on the proportion of pulsars with very short spin periods in these
clusters.
Comment: 8 pages, 3 figures, to appear in Ap
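The "optimal filterbank in software" rests on coherent dedispersion: because the Nyquist-sampled voltages retain full phase information, interstellar dispersion can be undone exactly by dividing out the cold-plasma transfer function in the frequency domain. A minimal Python sketch, assuming the standard Hankins-style chirp (sign convention and block-edge handling are glossed over):

```python
import numpy as np

def coherent_dedisperse(baseband, dm, f0, bw):
    """Coherently remove interstellar dispersion from complex baseband.

    `dm` is the dispersion measure (pc cm^-3); `f0` is the band-centre
    sky frequency and `bw` the bandwidth, both in MHz. A sketch only:
    real codes process overlapping blocks to handle wrap-around smearing.
    """
    n = len(baseband)
    spec = np.fft.fft(baseband)
    f = np.fft.fftfreq(n, d=1.0 / bw)  # bin offsets from band centre, MHz
    # Phase of the inverse dispersion kernel, in radians; the numerical
    # factor is the standard dispersion constant for frequencies in MHz.
    phase = 2.0 * np.pi * 4.148808e9 * dm * f**2 / (f0**2 * (f0 + f))
    return np.fft.ifft(spec * np.exp(1j * phase))
```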
Enabling a High Throughput Real Time Data Pipeline for a Large Radio Telescope Array with GPUs
The Murchison Widefield Array (MWA) is a next-generation radio telescope
currently under construction in the remote Western Australia Outback. Raw data
will be generated continuously at 5 GiB/s, grouped into 8 s cadences. This high
throughput motivates the development of on-site, real time processing and
reduction in preference to archiving, transport and off-line processing. Each
batch of 8 s data must be completely reduced before the next batch arrives.
Maintaining real time operation will require a sustained performance of around
2.5 TFLOP/s (including convolutions, FFTs, interpolations and matrix
multiplications). We describe a scalable heterogeneous computing pipeline
implementation, exploiting both the high computing density and FLOP-per-Watt
ratio of modern GPUs. The architecture is highly parallel within and across
nodes, with all major processing elements performed by GPUs. Necessary
scatter-gather operations along the pipeline are loosely synchronized between
the nodes hosting the GPUs. The MWA will be a frontier scientific instrument
and a pathfinder for planned peta- and exascale facilities.
Comment: Version accepted by Comp. Phys. Com
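The real-time requirement pins down the compute budget per batch, and with it the arithmetic intensity the GPU pipeline must sustain. A quick back-of-envelope in Python from the figures quoted above:

```python
# Real-time budget implied by the quoted figures (a sketch; actual MWA
# stage counts and FLOP tallies are not given in this abstract).
ingest_rate = 5 * 2**30           # raw data rate, bytes/s (5 GiB/s)
cadence = 8                       # seconds per batch
sustained = 2.5e12                # required FLOP/s to keep real time

batch_bytes = ingest_rate * cadence        # 40 GiB arrives per cadence
flop_per_batch = sustained * cadence       # 2e13 FLOP must finish in 8 s
intensity = flop_per_batch / batch_bytes   # ~466 FLOP per raw byte

print(f"{batch_bytes / 2**30:.0f} GiB/batch, "
      f"{flop_per_batch:.1e} FLOP/batch, {intensity:.0f} FLOP/byte")
```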
Plans for the first balloon flight of the gamma-ray polarimeter experiment (GRAPE)
We have developed a design for a hard X-ray polarimeter operating in the energy range from 50 to 500 keV. This modular design, known as GRAPE (Gamma-Ray Polarimeter Experiment), has been successfully demonstrated in the lab using partially polarized gamma-ray sources and using fully polarized photon beams at Argonne National Laboratory. In June of 2007, a GRAPE engineering model, consisting of a single detector module, was flown on a high altitude balloon flight to further demonstrate the design and to collect background data. We are currently preparing a much larger balloon payload for a flight in the fall of 2011. Using a large (16-element) array of detector modules, this payload is being designed to search for polarization from known point sources of radiation, namely the Crab and Cygnus X-1. This first flight will not only provide a scientific demonstration of the GRAPE design (by measuring polarization from the Crab nebula), it will also lay the foundation for subsequent long-duration balloon flights that will be designed for studying polarization from gamma-ray bursts and solar flares. Here we shall present data from calibration of the first flight module detectors, review the latest payload design and update the predicted polarization sensitivity for both the initial continental US balloon flight and the subsequent long-duration balloon flights.
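The measurement principle behind GRAPE is the azimuthal asymmetry of Compton scattering: photons scatter preferentially at right angles to their polarization vector, imprinting a cos 2φ modulation on the distribution of scatter azimuths. A Python sketch of extracting that modulation from binned counts, using synthetic data; `mu_100`, the modulation for a fully polarized beam that converts the fit into a polarization fraction, would come from calibrations like those described above:

```python
import numpy as np
from scipy.optimize import curve_fit

def modulation(phi, a, mu, phi0):
    # Azimuthal scatter distribution for partially polarized photons
    return a * (1.0 + mu * np.cos(2.0 * (phi - phi0)))

# Synthetic per-bin azimuth counts standing in for a module's scatter pairs
phi = np.linspace(0.0, 2.0 * np.pi, 24, endpoint=False)
counts = 1000.0 * (1.0 + 0.2 * np.cos(2.0 * (phi - 0.5)))
counts += np.random.normal(0.0, 20.0, phi.size)

(a, mu, phi0), _ = curve_fit(modulation, phi, counts,
                             p0=[counts.mean(), 0.1, 0.0])
# Polarization fraction = mu / mu_100, where mu_100 (the response to a
# 100%-polarized beam) comes from calibration or simulation.
print(f"fitted modulation mu = {mu:.3f}")
```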
Viewpoints: A high-performance high-dimensional exploratory data analysis tool
Scientific data sets continue to increase in both size and complexity. In the
past, dedicated graphics systems at supercomputing centers were required to
visualize large data sets, but as the price of commodity graphics hardware has
dropped and its capability has increased, it is now possible, in principle, to
view large complex data sets on a single workstation. To do this in practice,
an investigator will need software that is written to take advantage of the
relevant graphics hardware. The Viewpoints visualization package described
herein is an example of such software. Viewpoints is an interactive tool for
exploratory visual analysis of large, high-dimensional (multivariate) data. It
leverages the capabilities of modern graphics boards (GPUs) to run on a single
workstation or laptop. Viewpoints is minimalist: it attempts to do a small set
of useful things very well (or at least very quickly) in comparison with
similar packages today. Its basic feature set includes linked scatter plots
with brushing, dynamic histograms, normalization and outlier detection/removal.
Viewpoints was originally designed for astrophysicists, but it has since been
used in fields ranging from astronomy, quantum chemistry, fluid dynamics,
machine learning, bioinformatics, and finance to information technology
server log mining. In this article, we describe the Viewpoints
package and show examples of its usage.
Comment: 18 pages, 3 figures, PASP in press; this version corresponds more closely to that to be published.
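Linked brushing, the interaction at the core of Viewpoints, amounts to sharing one boolean selection mask across every projection of the data, so a rectangle dragged in any scatter plot highlights the same rows everywhere. A toy Python sketch of the idea (Viewpoints itself evaluates this on the GPU; the class and method names are illustrative):

```python
import numpy as np

class LinkedBrush:
    """Toy linked-brushing model: one shared mask over all plots."""

    def __init__(self, data):
        self.data = np.asarray(data)               # (n_rows, n_dims)
        self.mask = np.zeros(len(self.data), bool)

    def brush(self, xdim, ydim, xlo, xhi, ylo, yhi):
        # Select rows inside a rectangle in one 2-D projection; every
        # linked plot then re-colours exactly these rows.
        x, y = self.data[:, xdim], self.data[:, ydim]
        self.mask = (x >= xlo) & (x <= xhi) & (y >= ylo) & (y <= yhi)
        return self.mask
```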
Imaging solar neutrons below 10 MeV in the inner heliosphere
Inner heliosphere measurements of the Sun can be conducted with the proposed Solar Sentinel spacecraft and mission. One of the key measurements that can be made inside the orbit of the Earth is that of lower energy neutrons that arise in flares from nuclear reactions. Solar flare neutrons below 10 MeV suffer heavy weak-decay losses before reaching 1 AU. For heliocentric radii as close as 0.3 AU, the number of surviving neutrons from a solar event is dramatically greater. Neutrons from 1-10 MeV provide a new measure of heavy ion interactions at low energies, where the vast majority of energetic ions reside. Such measurements are difficult because of locally generated background neutrons. An instrument to make these measurements must be compact, lightweight and efficient. We describe our progress in developing a low-energy neutron telescope that can operate and measure neutrons in the inner heliosphere and take a brief look at other possible applications for this detector.
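The weak-decay losses are easy to quantify: a free neutron decays with a mean rest-frame lifetime of roughly 880 s, so the surviving fraction over a flight of distance d is exp(-d / (βγcτ)). A short Python sketch, which reproduces why 0.3 AU is so much more favorable than 1 AU for a 5 MeV neutron:

```python
import numpy as np

M_N = 939.565   # neutron rest mass, MeV
TAU = 880.0     # mean free-neutron lifetime, s (approximate)
AU = 1.496e11   # metres
C = 2.998e8     # m/s

def survival(e_kin_mev, r_au):
    """Fraction of neutrons of the given kinetic energy surviving
    beta decay over a flight of r_au astronomical units."""
    gamma = 1.0 + e_kin_mev / M_N
    beta = np.sqrt(1.0 - 1.0 / gamma**2)
    t = r_au * AU / (beta * C)          # lab-frame flight time
    return np.exp(-t / (gamma * TAU))   # decay clock runs in the rest frame

# For a 5 MeV neutron: ~0.4% survive to 1 AU but ~19% to 0.3 AU,
# roughly 47x more counts for an inner-heliosphere platform.
print(survival(5.0, 1.0), survival(5.0, 0.3))
```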
The Dark Energy Survey Data Management System
The Dark Energy Survey collaboration will study cosmic acceleration with a
5000 deg^2 grizY survey in the southern sky over 525 nights from 2011 to 2016. The
DES data management (DESDM) system will be used to process and archive these
data and the resulting science ready data products. The DESDM system consists
of an integrated archive, a processing framework, an ensemble of astronomy
codes and a data access framework. We are developing the DESDM system for
operation in the high performance computing (HPC) environments at NCSA and
Fermilab. Operating the DESDM system in an HPC environment offers both speed
and flexibility. We will employ it for our regular nightly processing needs,
and for more compute-intensive tasks such as large scale image coaddition
campaigns, extraction of weak lensing shear from the full survey dataset, and
massive seasonal reprocessing of the DES data. Data products will be available
to the Collaboration and later to the public through a virtual-observatory
compatible web portal. Our approach leverages investments in publicly available
HPC systems, greatly reducing hardware and maintenance costs to the project,
which must deploy and maintain only the storage, database platforms and
orchestration and web portal nodes that are specific to DESDM. In Fall 2007, we
tested the current DESDM system on both simulated and real survey data. We used
TeraGrid to process 10 simulated DES nights (3 TB of raw data), ingesting and
calibrating approximately 250 million objects into the DES Archive database. We
also used DESDM to process and calibrate over 50 nights of survey data acquired
with the Mosaic2 camera. Comparison to truth tables in the case of the
simulated data and internal crosschecks in the case of the real data indicate
that astrometric and photometric data quality is excellent.
Comment: To be published in the proceedings of the SPIE conference on Astronomical Instrumentation (held in Marseille in June 2008). This preprint is made available with the permission of SPIE. Further information, together with a preprint containing full-quality images, is available at http://desweb.cosmology.uiuc.edu/wik
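The four DESDM components map naturally onto a staged pipeline: the processing framework chains the ensemble of astronomy codes over each night's data and ingests the products into the archive. A toy Python sketch of that orchestration pattern; the stage names and interfaces here are hypothetical, not the actual DESDM ones:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Stage:
    """One step in a nightly pipeline (hypothetical interface)."""
    name: str
    run: Callable[[str], str]  # takes and returns a data-product path

def process_night(raw_path, stages, archive):
    # Chain the codes over one night's data, then record the final
    # product: a stand-in for ingestion into the DES Archive database.
    path = raw_path
    for stage in stages:
        path = stage.run(path)
    archive[raw_path] = path

stages = [
    Stage("detrend", lambda p: p + ".detrended"),
    Stage("astrometric-calibration", lambda p: p + ".wcs"),
    Stage("photometric-calibration", lambda p: p + ".cal"),
    Stage("catalog-ingest", lambda p: p + ".cat"),
]
archive = {}
process_night("/data/night_001", stages, archive)  # hypothetical path
```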