Adapting SAM for CDF
The CDF and D0 experiments probe the high-energy frontier and, in doing so, have accumulated hundreds of terabytes of data, on the way to petabytes over the next two years. The experiments have committed to using the developing Grid, based on the SAM system, to handle these data. The D0 SAM has been extended for use in CDF as common design patterns emerged to meet the similar requirements of the two experiments. The process by which the merger was achieved is explained, with particular emphasis on lessons learned concerning the database design patterns and the realization of the use cases.
Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics conference (CHEP03), La Jolla, CA, USA, March 2003; 4 pages, PDF format, TUAT00
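The abstract does not describe SAM's internals, but its central idea is a metadata catalogue in which datasets are defined as declarative queries over file metadata rather than as fixed file lists. The sketch below illustrates that pattern only; the class names, fields, and methods are hypothetical and are not the actual SAM schema or API.

```python
# Illustrative sketch of a SAM-style file catalogue: datasets are resolved from
# metadata queries instead of hard-coded file lists. All names are hypothetical.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class FileRecord:
    name: str
    data_tier: str      # e.g. "raw" or "reconstructed" (assumed example tiers)
    run_number: int
    size_bytes: int

class Catalogue:
    def __init__(self) -> None:
        self.files: list[FileRecord] = []

    def declare(self, record: FileRecord) -> None:
        """Register a file and its metadata with the catalogue."""
        self.files.append(record)

    def dataset(self, data_tier: str, run_range: range) -> list[FileRecord]:
        """Resolve a dataset definition (a metadata query) to concrete files."""
        return [f for f in self.files
                if f.data_tier == data_tier and f.run_number in run_range]

catalogue = Catalogue()
catalogue.declare(FileRecord("d0_raw_000123_001.dat", "raw", 123, 2_000_000_000))
catalogue.declare(FileRecord("cdf_raw_000456_001.dat", "raw", 456, 1_500_000_000))
print([f.name for f in catalogue.dataset("raw", range(100, 200))])
```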
ASCR/HEP Exascale Requirements Review Report
This draft report summarizes and details the findings, results, and recommendations derived from the ASCR/HEP Exascale Requirements Review meeting held in June 2015. The main conclusions are as follows. 1) Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand at the 2025 timescale is at least two orders of magnitude greater than what is currently available, and in some cases more. 2) The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. 3) Data rates and volumes from HEP experimental facilities are also straining the ability to store and analyze these large and complex datasets. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. 4) A close integration of HPC simulation and data analysis will aid greatly in interpreting results from HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. 5) Long-range planning between HEP and ASCR will be required to meet HEP's research needs. To make the best use of ASCR HPC resources, the experimental HEP program needs a) an established long-term plan for access to ASCR computational and data resources, b) the ability to map workflows onto HPC resources, c) the ability of ASCR facilities to accommodate workflows run by collaborations that can have thousands of individual members, d) the means to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, and e) a workforce built up and trained to develop and use simulations and analysis in support of HEP scientific research on next-generation systems.
Comment: 77 pages, 13 figures; draft report, subject to further revision
Exabyte helical scan devices at Fermilab
Exabyte 8mm helical scan storage devices are in use at Fermilab in a number of applications. These devices offer the functionality of magnetic tape but use media that are much more economical and much denser than conventional 9-track tape. 6 refs., 3 figs.
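To give a rough sense of the density advantage the abstract alludes to, the snippet below compares typical per-volume capacities of the era. The capacity figures are assumptions for illustration only (roughly 2.3 GB for an early 8mm Exabyte cartridge versus roughly 170 MB for a full 2400-ft 9-track reel at 6250 bpi); they are not taken from the abstract.

```python
# Back-of-the-envelope media comparison. NOTE: both capacities are assumed
# typical figures for illustration, not values stated in the abstract.
EXABYTE_8MM_GB = 2.3    # assumed capacity of an early 8mm Exabyte cartridge
NINE_TRACK_GB = 0.17    # assumed capacity of a full 2400-ft 9-track reel

reels_per_cartridge = EXABYTE_8MM_GB / NINE_TRACK_GB
print(f"One 8mm cartridge holds roughly {reels_per_cartridge:.0f} 9-track reels of data")
```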
Data acquisition systems for the Sloan Digital Sky Survey
The Sloan Digital Sky Survey (SDSS) will image π steradians about the north galactic cap in five filters, and acquire one million spectra, using a dedicated 2.5 meter telescope at the Apache Point Observatory in New Mexico. The authors describe the data acquisition system for the survey's three main detectors: an imaging camera, mounting 54 Tektronix charge-coupled devices (CCDs); a pair of spectrographs, each mounting a pair of CCDs; and a smaller monitor telescope camera. The authors describe the system's hardware and software architecture, and relate it to the survey's special requirements for high reliability and its need to understand its instrumentation in order to produce a consistent survey over a five-year period.
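As a small, purely illustrative sketch of the detector inventory this data acquisition system must serve, the snippet below encodes the three subsystems as enumerated in the abstract. The dictionary layout and names are hypothetical, not the SDSS software's actual interfaces, and the monitor telescope's CCD count is an assumption since the abstract does not state it.

```python
# Illustrative only: DAQ subsystems and CCD counts as enumerated in the abstract.
# Names and structure are hypothetical, not the actual SDSS data acquisition code.
DETECTORS = {
    "imaging_camera":    {"ccds": 54},  # 54 Tektronix CCDs
    "spectrograph_1":    {"ccds": 2},   # each spectrograph mounts a pair of CCDs
    "spectrograph_2":    {"ccds": 2},
    "monitor_telescope": {"ccds": 1},   # assumed; CCD count not stated in the abstract
}

total_ccds = sum(d["ccds"] for d in DETECTORS.values())
print(f"CCDs read out by the DAQ: {total_ccds}")
```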