Data Compression in the Petascale Astronomy Era: a GERLUMPH case study
As the volume of data grows, astronomers are increasingly faced with choices
on what data to keep -- and what to throw away. Recent work evaluating the
JPEG2000 (ISO/IEC 15444) standard as a future data format in
astronomy has shown promising results on observational data. However, there is
still a need to evaluate its potential on other types of astronomical data, such
as from numerical simulations. GERLUMPH (the GPU-Enabled High Resolution
cosmological MicroLensing parameter survey) represents an example of a data
intensive project in theoretical astrophysics. In the next phase of processing,
the ~27 terabyte GERLUMPH dataset is set to grow by a factor of 100 -- well
beyond the current storage capabilities of the supercomputing facility on which
it resides. In order to minimise bandwidth usage, file transfer time, and
storage space, this work evaluates several data compression techniques.
Specifically, we investigate off-the-shelf and custom lossless compression
algorithms as well as the lossy JPEG2000 compression format. Results of
lossless compression algorithms on GERLUMPH data products show modest
compression ratios (1.35:1 to 4.69:1), varying with the nature of the
input data. Our results suggest that JPEG2000 could be suitable
for other numerical datasets stored as gridded data or volumetric data. When
approaching lossy data compression, one should keep in mind the intended
purposes of the data to be compressed, and evaluate the effect of the loss on
future analysis. In our case study, lossy compression and a high compression
ratio do not significantly compromise the intended use of the data for
constraining quasar source profiles from cosmological microlensing.
Comment: 15 pages, 9 figures, 5 tables. Published in the Special Issue of Astronomy & Computing on The future of astronomical data formats.
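As an illustration of the lossless comparison described in the abstract above, the following minimal sketch measures compression ratios of off-the-shelf codecs on a synthetic gridded array. It is not the paper's actual pipeline: the array shape, dtype, synthetic structure, and codec settings are all illustrative assumptions standing in for a GERLUMPH magnification map.

    # Minimal sketch: compare off-the-shelf lossless codecs on gridded data.
    # The synthetic float32 grid is an illustrative stand-in for a GERLUMPH
    # data product; shapes, dtypes, and codec settings are assumptions.
    import bz2
    import lzma
    import zlib

    import numpy as np

    rng = np.random.default_rng(42)
    # Real magnification maps have smooth structure plus noise; mimic that,
    # since compression ratios depend strongly on redundancy in the input.
    x = np.linspace(0.0, 1.0, 2048, dtype=np.float32)
    structure = np.outer(np.sin(4 * np.pi * x), np.sin(4 * np.pi * x))
    noise = rng.normal(scale=0.01, size=structure.shape)
    grid = (structure + noise).astype(np.float32)
    raw = grid.tobytes()

    codecs = {
        "zlib (DEFLATE)": lambda b: zlib.compress(b, 9),
        "bz2 (BWT)": lambda b: bz2.compress(b, compresslevel=9),
        "lzma (LZMA2)": lambda b: lzma.compress(b, preset=6),
    }

    for name, compress in codecs.items():
        ratio = len(raw) / len(compress(raw))
        print(f"{name:15s} ratio = {ratio:.2f}:1")

The lossy JPEG2000 side would typically be exercised through an OpenJPEG binding (for example, the Glymur package), where the target compression ratio is an encoder parameter; that step is omitted here to keep the sketch to the standard library plus NumPy.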
Materials and methods for large-area solar cells Final report, 17 Dec. 1964 - 16 Dec. 1965
Growth and evaluation of gallium arsenide-indium arsenide-aluminum foil structures in construction of thin film large area solar cells for satellite
Thin film GaAs photovoltaic solar energy cells
Fabrication process for thin film gallium arsenide photovoltaic solar energy cell
Paul Vohl to James H. Meredith (3 October 1962)
New approaches to Volume and Velocity challenges of Modern Astronomy
Fundamental problems facing modern astronomy relate to the processing, visualization, analysis of, and remote access to data. As the volume and velocity at which data are generated and stored increase, new approaches, methods, and analytical tools are required to let us fully explore the information hidden within our data. In this talk, I discuss ways to enhance and streamline analysis tasks in surveys by adopting a set of informatics tricks -- including concepts like display ecology, visual and immersive analytics, 'single instruction, multiple visualizations', graphics shaders, and data compression -- to alleviate a number of bottlenecks and offer a better experience for individuals and research teams. A small sketch of the 'single instruction, multiple visualizations' idea follows.
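The sketch below is one reading of the 'single instruction, multiple visualizations' phrase, not the speaker's implementation: a single rendering function fanned out over several stretches and colormaps of one dataset, so that alternative views are produced by one command. The dataset, scalings, and colormaps are illustrative assumptions.

    # Hypothetical sketch of 'single instruction, multiple visualizations':
    # one rendering command applied across many views of the same dataset.
    import matplotlib.pyplot as plt
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.gamma(shape=2.0, scale=1.0, size=(256, 256))

    # The "single instruction": one rendering function...
    def render(ax, image, scale, cmap):
        ax.imshow(image, cmap=cmap, origin="lower")
        ax.set_title(f"{scale} / {cmap}")
        ax.axis("off")

    # ...fanned out over multiple visualizations at once.
    views = [
        ("linear", "viridis", data),
        ("log", "viridis", np.log10(data + 1e-3)),
        ("sqrt", "magma", np.sqrt(data)),
    ]
    fig, axes = plt.subplots(1, len(views), figsize=(12, 4))
    for ax, (scale, cmap, image) in zip(axes, views):
        render(ax, image, scale, cmap)
    plt.show()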
Unearthing Defendants in Toxic Waste Litigation: Problems of Liability and Identification
This Comment examines two significant barriers to obtaining compensation from waste generators posed by traditional proof requirements, and proposes possible approaches for alleviating them. The author evaluates the plausibility of integrating strict products liability and risk share liability, an adaptation of market share liability, into a unified approach that may provide tort victims with greater access to legal remedies. The author suggests that courts adopt a more progressive analytical framework, analogous to the theory articulated in Sindell v. Abbott Laboratories, and argues for an allocation of liability based on the relative risk share of generators in a toxic waste disposal market, composed of the aggregate threat of harm to a plaintiff created by a dumpsite.
