
    Effectiveness and predictability of in-network storage cache for scientific workflows

    Large scientific collaborations often have multiple scientists accessing the same set of files for different analyses, which creates repeated accesses to large amounts of shared data located far away. These data accesses have long latency due to distance and occupy the limited bandwidth available over the wide-area network. To reduce the wide-area network traffic and the data access latency, regional data storage caches have been installed as a new networking service. To study the effectiveness of such a cache system in scientific applications, we examine the Southern California Petabyte Scale Cache for a high-energy physics experiment. By examining about 3TB of operational logs, we show that this cache removed 67.6% of file requests from the wide-area network and reduced the traffic volume on the wide-area network by 12.3TB (or 35.4%) on an average day. The reduction in traffic volume (35.4%) is smaller than the reduction in file counts (67.6%) because larger files are less likely to be reused. Due to this difference in data access patterns, the cache system has implemented a policy to avoid evicting smaller files when processing larger files. We also build a machine learning model to study the predictability of the cache behavior. Tests show that this model is able to accurately predict the cache accesses, cache misses, and network throughput, making the model useful for future studies on resource provisioning and planning.
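    The gap between the request-level reduction (67.6%) and the byte-level reduction (35.4%) follows directly from how the two rates are computed when large files are rarely reused. The sketch below is illustrative only and is not the paper's code; the log format, field names, and toy numbers are assumptions.

```python
# Illustrative sketch (not the paper's code): why the byte-level reduction can be
# much smaller than the request-level reduction when large files are reused less
# often. Log format and field names are hypothetical.
from collections import namedtuple

Access = namedtuple("Access", ["filename", "size_bytes", "cache_hit"])

def hit_rates(accesses):
    """Return (request hit rate, byte hit rate) for a list of cache accesses."""
    total_reqs = len(accesses)
    hit_reqs = sum(1 for a in accesses if a.cache_hit)
    total_bytes = sum(a.size_bytes for a in accesses)
    hit_bytes = sum(a.size_bytes for a in accesses if a.cache_hit)
    return hit_reqs / total_reqs, hit_bytes / total_bytes

# Toy workload: many re-reads of a small file, large files read only once.
log = (
    [Access("small.root", 100e6, True)] * 6 +   # small file, served from cache
    [Access("small.root", 100e6, False)] * 1 +  # first (cold) read of the small file
    [Access("big.root", 5e9, False)] * 2        # large files, never reused
)
req_rate, byte_rate = hit_rates(log)
print(f"request hit rate: {req_rate:.1%}, byte hit rate: {byte_rate:.1%}")
```

    In this toy workload most requests hit the cache, but most bytes belong to large files that are read only once, so the byte hit rate is far below the request hit rate -- the same pattern the study reports.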

    ASCR/HEP Exascale Requirements Review Report

    This draft report summarizes and details the findings, results, and recommendations derived from the ASCR/HEP Exascale Requirements Review meeting held in June 2015. The main conclusions are as follows. 1) Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of demand at the 2025 timescale is at least two orders of magnitude greater -- and in some cases more -- than what is available currently. 2) The growth rate of data produced by simulations is overwhelming the current ability, of both facilities and researchers, to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. 3) Data rates and volumes from HEP experimental facilities are also straining the ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. 4) A close integration of HPC simulation and data analysis will aid greatly in interpreting results from HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. 5) Long-range planning between HEP and ASCR will be required to meet HEP's research needs. To best use ASCR HPC resources, the experimental HEP program needs a) an established long-term plan for access to ASCR computational and data resources, b) an ability to map workflows onto HPC resources, c) the ability for ASCR facilities to accommodate workflows run by collaborations that can have thousands of individual members, d) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, and e) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems. Comment: 77 pages, 13 figures; draft report, subject to further revision.

    B Physics at the Tevatron: Run II and Beyond

    This report provides a comprehensive overview of the prospects for B physics at the Tevatron. The work was carried out during a series of workshops starting in September 1999. There were four working groups: 1) CP Violation, 2) Rare and Semileptonic Decays, 3) Mixing and Lifetimes, 4) Production, Fragmentation and Spectroscopy. The report also includes introductory chapters on theoretical and experimental tools emphasizing aspects of B physics specific to hadron colliders, as well as overviews of the CDF, D0, and BTeV detectors, and a Summary. Comment: 583 pages. Further information on the workshops, including transparencies, can be found at the workshops' homepage: http://www-theory.lbl.gov/Brun2/. The report is also available in 2-up format at http://www-theory.lbl.gov/Brun2/report/report2.ps.gz or chapter-by-chapter at http://www-theory.lbl.gov/Brun2/report

    Pharmacokinetic/pharmacodynamic modelling approaches in paediatric infectious diseases and immunology.

    Pharmacokinetic/pharmacodynamic (PKPD) modelling is used to describe and quantify dose-concentration-effect relationships. Within paediatric studies in infectious diseases and immunology, these methods are often applied to develop guidance on appropriate dosing. In this paper, an introduction to the field of PKPD modelling is given, followed by a review of the PKPD studies that have been undertaken in paediatric infectious diseases and immunology. The main focus is on identifying the methodological approaches used to define the PKPD relationship in these studies. The major findings were that most studies of infectious diseases have developed a PK model and then used simulations to define a dose recommendation based on a pre-defined PD target, which may have been defined in adults or in vitro. For immunological studies much of the modelling has focused on either PK or PD, and since multiple drugs are usually used, delineating the relative contributions of each is challenging. Dynamical modelling of in vitro antibacterial studies, and paediatric HIV mechanistic PD models linked with the PK of all drugs, are emerging methods that should enhance PKPD-based recommendations in the future.
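    As an illustration of the workflow described above (a PK model plus simulation against a pre-defined PD target), the sketch below simulates a one-compartment oral PK model and checks a time-above-MIC target. It is not taken from any reviewed study; the model structure, parameter values, dose, and MIC are all hypothetical.

```python
# Minimal illustration (not from the paper) of the described workflow: assume a PK
# model, simulate concentrations for a candidate dose, and check a pre-defined PD
# target. All parameter values here are hypothetical.
import math

def concentration(t_h, dose_mg, ka, ke, vd_l):
    """One-compartment model with first-order absorption (ka) and elimination (ke)."""
    if math.isclose(ka, ke):
        raise ValueError("ka and ke must differ for this closed-form solution")
    return (dose_mg * ka) / (vd_l * (ka - ke)) * (math.exp(-ke * t_h) - math.exp(-ka * t_h))

def fraction_time_above_mic(dose_mg, mic_mg_l, ka, ke, vd_l, tau_h=12, dt=0.05):
    """PD target: fraction of the dosing interval with concentration above the MIC."""
    steps = int(tau_h / dt)
    above = sum(1 for i in range(steps)
                if concentration(i * dt, dose_mg, ka, ke, vd_l) > mic_mg_l)
    return above / steps

# Example: does a 250 mg dose keep concentration above the MIC for >= 50% of the interval?
frac = fraction_time_above_mic(dose_mg=250, mic_mg_l=2.0, ka=1.5, ke=0.2, vd_l=20)
print(f"fraction of interval above MIC: {frac:.0%}  (target: >= 50%)")
```

    In practice the PK parameters would come from a population model fitted to paediatric data, the simulation would account for between-subject variability, and the PD target (e.g. time above MIC, AUC/MIC, or Cmax/MIC) would be chosen per drug class.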

    Study of the Decays B0 --> D(*)+D(*)-

    The decays B0 --> D*+D*-, B0 --> D*+D- and B0 --> D+D- are studied in 9.7 million Y(4S) --> BBbar decays accumulated with the CLEO detector. We determine Br(B0 --> D*+D*-) = (9.9 +4.2/-3.3 +- 1.2) x 10^-4 and limit Br(B0 --> D*+D-) < 6.3 x 10^-4 and Br(B0 --> D+D-) < 9.4 x 10^-4 at 90% confidence level (CL). We also perform the first angular analysis of the B0 --> D*+D*- decay and determine that the CP-even fraction of the final state is greater than 0.11 at 90% CL. Future measurements of the time dependence of these decays may be useful for the investigation of CP violation in neutral B meson decays. Comment: 21 pages, 5 figures, submitted to Phys. Rev.

    Improved Measurement of the Pseudoscalar Decay Constant f_Ds

    We present a new determination of the Ds decay constant, f_Ds, using 5 million continuum charm events obtained with the CLEO II detector. Our value is derived from our new measured ratio of widths for Ds -> mu nu / Ds -> phi pi of 0.173 +/- 0.021 +/- 0.031. Taking the branching ratio for Ds -> phi pi as (3.6 +/- 0.9)% from the PDG, we extract f_Ds = (280 +/- 17 +/- 25 +/- 34) MeV. We compare this result with various model calculations. Comment: 23 page postscript file, postscript file also available through http://w4.lns.cornell.edu/public/CLN
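    As a rough illustration of how f_Ds follows from the quoted numbers: the partial width Gamma(Ds -> mu nu) is obtained from the measured ratio, the assumed Br(Ds -> phi pi), and the Ds lifetime, and the standard-model leptonic-width formula then gives f_Ds as the square root of that width divided by known factors. The sketch below is a back-of-envelope check, not the paper's analysis; the lifetime, masses, and |V_cs| are approximate values assumed for illustration only.

```python
# Back-of-envelope sketch (not the paper's analysis): Gamma(Ds -> mu nu) is
# proportional to f_Ds^2 via the standard leptonic-decay formula. Constants below
# are approximate values assumed for illustration.
import math

G_F    = 1.166e-5     # Fermi constant, GeV^-2
m_mu   = 0.1057       # muon mass, GeV
m_Ds   = 1.968        # Ds mass, GeV
V_cs   = 0.974        # CKM matrix element (approximate)
tau_Ds = 0.50e-12     # Ds lifetime, s (approximate)
hbar   = 6.582e-25    # GeV*s

ratio     = 0.173     # measured Gamma(Ds -> mu nu) / Gamma(Ds -> phi pi)
br_phi_pi = 0.036     # assumed Br(Ds -> phi pi) from the PDG

# Partial width Gamma(Ds -> mu nu) in GeV
gamma_mu_nu = ratio * br_phi_pi * hbar / tau_Ds

# Leptonic width: Gamma = G_F^2/(8*pi) * f^2 * m_mu^2 * m_Ds * (1 - m_mu^2/m_Ds^2)^2 * |V_cs|^2
prefactor = (G_F**2 / (8 * math.pi)) * m_mu**2 * m_Ds * (1 - m_mu**2 / m_Ds**2)**2 * V_cs**2
f_Ds = math.sqrt(gamma_mu_nu / prefactor)
print(f"f_Ds ~ {f_Ds * 1000:.0f} MeV")
```

    With these approximate inputs the script returns roughly 270 MeV, consistent within the quoted uncertainties with the published value of 280 MeV.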