Invest to Save: Report and Recommendations of the NSF-DELOS Working Group on Digital Archiving and Preservation
Digital archiving and preservation are important areas for research and development, but there is no agreed-upon set of priorities or coherent plan for research in this area. Research projects tend to be small and driven by particular institutional problems or concerns. As a consequence, proposed solutions from experimental projects and prototypes tend not to scale to millions of digital objects, nor do the results from disparate projects readily build on one another. It is also unclear whether it is worthwhile to seek general solutions or whether different strategies are needed for different types of digital objects and collections. The lack of coordination in both research and development means that in some areas researchers are reinventing the wheel while other areas are neglected.
Digital archiving and preservation will benefit from an exercise in analysis, priority setting, and planning for future research. The WG aims to survey current research activities, identify gaps, and develop a white paper proposing future research directions in the area of digital preservation. Potential areas for research include repository architectures and interoperability among digital archives; automated tools for capture, ingest, and normalization of digital objects; and harmonization of preservation formats and metadata. There may also be opportunities for development of commercial products in the areas of mass storage systems, repositories and repository management systems, and data management software and tools.
Biological intrusion of low-level-waste trench covers
The long-term integrity of low-level waste shallow land burial sites depends on the interaction of physical, chemical, and biological factors that modify the waste containment system. Past research on low-level waste shallow land burial methods has emphasized physical (e.g., water infiltration, soil erosion) and chemical (radionuclide leaching) processes that can cause waste site failure and subsequent radionuclide transport. The purpose of this paper is to demonstrate that biological processes can also be important in reducing the integrity of waste burial site cover treatments. Plants and animals not only transport radionuclides to the ground surface, via root systems and soil excavated from the cover profile by burrowing, but also modify physical and chemical processes within the cover profile by changing water infiltration rates, soil erosion rates, and the chemical composition of the soil. One approach to limiting biological intrusion through the waste cover is to place a barrier within the profile that limits root and animal penetration with depth. Experiments at the Los Alamos Experimental Engineered Test Facility were initiated to develop and evaluate biological barriers that are effective in minimizing intrusion into waste trenches. The experiments described employ four candidate barrier materials of geologic origin. In addition to barrier type, the experimental variables evaluated are barrier depth and soil overburden depth. The rate of biological intrusion through the various barrier materials is being evaluated through the use of activatable stable tracers.
Mortandad Canyon: Elemental concentrations in vegetation, streambank soils, and stream sediments - 1979
In 1979, stream sediments, streambank soils, and streambank vegetation were sampled at 100 m intervals downstream of the outfall of the TA-50 radioactive liquid waste treatment facility in Mortandad Canyon. Sampling was discontinued at a distance of 3260 m, at the location of the sediment traps in the canyon. The purpose of the sampling was to investigate the effect of the residual contaminants in the waste treatment facility effluent on elemental concentrations in various environmental media.
The Whole is Greater than the Sum of the Parts: Optimizing the Joint Science Return from LSST, Euclid and WFIRST
The focus of this report is on the opportunities enabled by the combination
of LSST, Euclid and WFIRST, the optical surveys that will be an essential part
of the next decade's astronomy. The sum of these surveys has the potential to
be significantly greater than the contributions of the individual parts. As is
detailed in this report, the combination of these surveys should give us
multi-wavelength high-resolution images of galaxies and broadband data covering
much of the stellar energy spectrum. These stellar and galactic data have the
potential of yielding new insights into topics ranging from the formation
history of the Milky Way to the mass of the neutrino. However, enabling the
astronomy community to fully exploit this multi-instrument data set is a
challenging technical task: for much of the science, we will need to combine
the photometry across multiple wavelengths with varying spectral and spatial
resolution. We identify some of the key science enabled by the combined surveys
and the key technical challenges in achieving these synergies.
Comment: Whitepaper developed at June 2014 U. Penn Workshop; 28 pages, 3 figures
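A core technical step behind combining such surveys is cross-matching source catalogs observed at different wavelengths and resolutions. The following is a minimal, hypothetical sketch of a nearest-neighbor positional cross-match (flat-sky approximation, synthetic catalogs; real pipelines use probabilistic matching and spatial indexing):

```python
import numpy as np

# Toy sketch (not any survey's actual pipeline): cross-match two
# catalogs by angular position using a simple nearest-neighbor search.
rng = np.random.default_rng(0)

# "Survey A" sources, and a "Survey B" catalog containing the same
# sources with small astrometric scatter plus some unmatched extras.
ra_a = rng.uniform(0.0, 1.0, 50)
dec_a = rng.uniform(0.0, 1.0, 50)
scatter = 0.1 / 3600.0  # 0.1 arcsec astrometric noise, in degrees
ra_b = np.concatenate([ra_a + rng.normal(0, scatter, 50),
                       rng.uniform(2.0, 3.0, 20)])  # 20 unrelated sources
dec_b = np.concatenate([dec_a + rng.normal(0, scatter, 50),
                        rng.uniform(2.0, 3.0, 20)])

def crossmatch(ra1, dec1, ra2, dec2, tol_deg=1.0 / 3600.0):
    """For each catalog-1 source, return the index of the nearest
    catalog-2 source within tol_deg, or -1 if none (flat-sky approx.)."""
    matches = []
    for r, d in zip(ra1, dec1):
        dist2 = (ra2 - r) ** 2 * np.cos(np.radians(d)) ** 2 + (dec2 - d) ** 2
        j = int(np.argmin(dist2))
        matches.append(j if dist2[j] <= tol_deg ** 2 else -1)
    return np.array(matches)

matches = crossmatch(ra_a, dec_a, ra_b, dec_b)
print("matched:", int((matches >= 0).sum()), "of", len(ra_a))
```

The O(N*M) loop is only for illustration; at LSST catalog sizes one would use a k-d tree or HEALPix-indexed search instead.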
Reducing Zero-point Systematics in Dark Energy Supernova Experiments
We study the effect of filter zero-point uncertainties on future supernova
dark energy missions. Fitting for calibration parameters using simultaneous
analysis of all Type Ia supernova standard candles achieves a significant
improvement over more traditional fit methods. This conclusion is robust under
diverse experimental configurations (number of observed supernovae, maximum
survey redshift, inclusion of additional systematics). This approach to
supernova fitting considerably eases otherwise stringent mission calibration
requirements. As an example we simulate a space-based mission based on the
proposed JDEM satellite; however the method and conclusions are general and
valid for any future supernova dark energy mission, ground- or space-based.
Comment: 30 pages, 8 figures, 5 tables; one reference added; submitted to Astroparticle Physics
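The idea of fitting calibration parameters simultaneously with the science parameters can be illustrated with a toy linear model (this is an illustrative sketch, not the paper's analysis; the "cosmology" here is just a slope in redshift):

```python
import numpy as np

# Toy illustration: fit per-filter zero-point offsets *simultaneously*
# with a stand-in cosmological parameter, rather than calibrating each
# filter separately beforehand. All numbers below are made up.
rng = np.random.default_rng(1)

n_sn, n_band = 300, 3
z = rng.uniform(0.1, 1.0, n_sn)
band = rng.integers(0, n_band, n_sn)       # which filter observed each SN
true_zp = np.array([0.02, -0.03, 0.01])    # unknown zero-point offsets (mag)
true_slope = 5.0                           # stand-in cosmological parameter
m = true_slope * z + true_zp[band] + rng.normal(0, 0.01, n_sn)

# Linear model m_i = slope * z_i + zp[band_i]; solve all parameters at once.
X = np.zeros((n_sn, n_band + 1))
X[np.arange(n_sn), band] = 1.0  # one-hot zero-point columns
X[:, -1] = z                    # "cosmology" column
theta, *_ = np.linalg.lstsq(X, m, rcond=None)
zp_hat, slope_hat = theta[:n_band], theta[-1]
print("slope:", round(float(slope_hat), 3),
      "zero-points:", np.round(zp_hat, 3))
```

Because the zero-point columns and the redshift column are fit jointly, the covariance between calibration and cosmology is propagated automatically, which is the mechanism that relaxes the standalone calibration requirements.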
Weak Lensing from Space I: Instrumentation and Survey Strategy
A wide field space-based imaging telescope is necessary to fully exploit the
technique of observing dark matter via weak gravitational lensing. This first
paper in a three part series outlines the survey strategies and relevant
instrumental parameters for such a mission. As a concrete example of hardware
design, we consider the proposed Supernova/Acceleration Probe (SNAP). Using
SNAP engineering models, we quantify the major contributions to this
telescope's Point Spread Function (PSF). These PSF contributions are relevant
to any similar wide field space telescope. We further show that the PSF of SNAP
or a similar telescope will be smaller than current ground-based PSFs, and more
isotropic and stable over time than the PSF of the Hubble Space Telescope. We
outline survey strategies for two different regimes: a "wide" 300 square degree survey and a "deep" 15 square degree survey that will accomplish various weak lensing goals, including statistical studies and dark matter mapping.
Comment: 25 pages, 8 figures, 1 table; replaced with published version
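The PSF properties the abstract emphasizes are typically quantified through second moments of the PSF image. Below is a minimal sketch of the standard unweighted-moments ellipticity measurement (textbook definition, not SNAP-specific code):

```python
import numpy as np

# Minimal sketch: PSF ellipticity from unweighted second moments of a
# pixelized image, the basic quantity weak-lensing surveys must control.
def ellipticity(img):
    y, x = np.indices(img.shape, dtype=float)
    tot = img.sum()
    xb, yb = (img * x).sum() / tot, (img * y).sum() / tot
    q11 = (img * (x - xb) ** 2).sum() / tot   # quadrupole moments
    q22 = (img * (y - yb) ** 2).sum() / tot
    q12 = (img * (x - xb) * (y - yb)).sum() / tot
    e1 = (q11 - q22) / (q11 + q22)
    e2 = 2.0 * q12 / (q11 + q22)
    return e1, e2

# Elliptical Gaussian PSF with sigma_x = 4 px, sigma_y = 3 px:
# e1 should come out close to (16 - 9) / (16 + 9) = 0.28.
y, x = np.indices((64, 64), dtype=float)
psf = np.exp(-0.5 * (((x - 32) / 4.0) ** 2 + ((y - 32) / 3.0) ** 2))
e1, e2 = ellipticity(psf)
print(round(e1, 3), round(e2, 3))
```

Real pipelines use weighted moments or model fits to suppress noise, but the isotropy and stability claims in the abstract are statements about exactly these e1, e2 quantities across the focal plane and over time.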
Supernova / Acceleration Probe: A Satellite Experiment to Study the Nature of the Dark Energy
The Supernova / Acceleration Probe (SNAP) is a proposed space-based
experiment designed to study the dark energy and alternative explanations of
the acceleration of the Universe's expansion by performing a series of
complementary systematics-controlled measurements. We describe a
self-consistent reference mission design for building a Type Ia supernova
Hubble diagram and for performing a wide-area weak gravitational lensing study.
A 2-m wide-field telescope feeds a focal plane consisting of a 0.7
square-degree imager tiled with equal areas of optical CCDs and near infrared
sensors, and a high-efficiency low-resolution integral field spectrograph. The
SNAP mission will obtain high-signal-to-noise calibrated light-curves and
spectra for several thousand supernovae at redshifts between z=0.1 and 1.7. A
wide-field survey covering one thousand square degrees resolves ~100 galaxies
per square arcminute. If we assume we live in a cosmological-constant-dominated
Universe, the matter density, dark energy density, and flatness of space can
all be measured with SNAP supernova and weak-lensing measurements to a
systematics-limited accuracy of 1%. For a flat universe, the dark-energy equation-of-state ratio w = p/ρ can be similarly measured to 5% for the present value w0 and ~0.1 for the time variation w'. The large survey area,
depth, spatial resolution, time-sampling, and nine-band optical to NIR
photometry will support additional independent and/or complementary dark-energy
measurement approaches as well as a broad range of auxiliary science programs.
(Abridged)
Comment: 40 pages, 18 figures, submitted to PASP, http://snap.lbl.go
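The Hubble-diagram measurement described above rests on the standard distance-modulus relation for a flat universe with constant equation of state w0. A short sketch of that textbook calculation (generic cosmology formulas, not SNAP code; parameter values are illustrative):

```python
import numpy as np

C_KM_S = 299792.458  # speed of light, km/s

def distance_modulus(z, h0=70.0, om=0.3, w0=-1.0):
    """mu = 5 log10(d_L / 10 pc) for a flat universe with matter density
    om and constant dark-energy equation of state w0."""
    zz = np.linspace(0.0, z, 2049)
    ez = np.sqrt(om * (1 + zz) ** 3
                 + (1 - om) * (1 + zz) ** (3 * (1 + w0)))
    f = 1.0 / ez
    # trapezoidal integration of dz / E(z) on the uniform grid
    dc = (C_KM_S / h0) * np.sum((f[:-1] + f[1:]) / 2.0) * (zz[1] - zz[0])
    dl = (1 + z) * dc                       # luminosity distance, Mpc
    return 5.0 * np.log10(dl * 1e6 / 10.0)  # Mpc -> units of 10 pc

for z in (0.1, 0.5, 1.0, 1.7):
    print(f"z={z}: mu={distance_modulus(z):.2f}")
```

Constraining w0 and w' amounts to distinguishing small differences in this mu(z) curve across the survey's z = 0.1 to 1.7 range, which is why percent-level calibrated photometry matters.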
Search for charged Higgs decays of the top quark using hadronic tau decays
We present the result of a search for charged Higgs decays of the top quark,
produced in p̄p collisions at √s = 1.8 TeV. When the charged
Higgs is heavy and decays to a tau lepton, which subsequently decays
hadronically, the resulting events have a unique signature: large missing
transverse energy and the low-charged-multiplicity tau. Data collected in the
period 1992-1993 at the Collider Detector at Fermilab, corresponding to
18.7 ± 0.7 pb⁻¹, exclude new regions of combined top quark and charged Higgs mass, in extensions to the standard model with two Higgs doublets.
Comment: uuencoded, gzipped tar file of LaTeX and 6 Postscript figures; 11 pp; submitted to Phys. Rev.
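The signature described above reduces to a simple event selection: large missing transverse energy plus a low-track-multiplicity tau candidate. A toy sketch of that logic (illustrative thresholds and event records, not CDF's actual cuts or data format):

```python
# Toy event selection for the hadronic-tau signature: hypothetical
# events with missing transverse energy (GeV) and the charged-track
# multiplicity of the tau candidate.
events = [
    {"met": 45.0, "tau_tracks": 1},   # 1-prong tau candidate, high MET
    {"met": 60.0, "tau_tracks": 3},   # 3-prong tau candidate, high MET
    {"met": 12.0, "tau_tracks": 1},   # low MET: rejected
    {"met": 50.0, "tau_tracks": 7},   # high multiplicity: QCD-jet-like
]

def passes(ev, met_cut=30.0, max_tracks=3):
    # Hadronic taus are dominated by 1- and 3-prong decays, so a low
    # charged-track multiplicity separates them from generic jets.
    return ev["met"] > met_cut and ev["tau_tracks"] <= max_tracks

selected = [ev for ev in events if passes(ev)]
print(len(selected))  # first two events pass
```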