Mercy Medical Center: Reducing Readmissions Through Clinical Excellence, Palliative Care, and Collaboration
Outlines strategies and practices behind low readmission rates for heart attack, heart failure, and pneumonia patients, such as investing in advanced practice nurses who help incorporate evidence-based standards into patient care. Lists lessons learned
Rebuild Iowa Office Quarterly Performance Report 3rd Quarter, April 2009
As the anniversaries of the 2008 tornadoes and floods approach, the Rebuild Iowa Office vision of a safer, stronger, and smarter Iowa is coming into sharper focus. While much more remains to be done, hundreds of displaced Iowans and businesses are on the road to recovery, and the building blocks for rebuilding communities are coming together. Recovery is a marathon, not a sprint, and the work done so far could not have been accomplished without an extensive recovery planning effort and an unprecedented level of cooperation among local, state, and federal governments, private citizens, businesses, and non-profit organizations. A rebirth and recovery is underway in Iowa
Preliminary basic performance analysis of the Cedar multiprocessor memory system
Preliminary results on the performance of the Cedar multiprocessor memory system are presented. Empirical measurements are used to calibrate a memory system simulator, which is then used to discuss the scalability of the system
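The abstract above describes calibrating a simulator against empirical measurements. The sketch below is purely illustrative (not the actual Cedar simulator): a toy memory-bank contention model with one free parameter, the per-access service time, is calibrated by grid search against hypothetical measured throughputs; all numbers and function names are invented for this example.

```python
# Illustrative calibration sketch (not the Cedar simulator itself):
# fit a toy memory-bank model's single free parameter to measurements.

def model_throughput(n_procs, n_banks, service_time):
    # Toy model: throughput saturates once processors outnumber banks.
    effective = min(n_procs, n_banks)
    return effective / service_time  # accesses per time unit

def calibrate(measurements, candidates):
    """Pick the service time minimizing squared error vs. measurements.

    measurements: list of (n_procs, n_banks, observed_throughput) tuples.
    candidates: service-time values to try (a simple grid search).
    """
    def error(s):
        return sum((model_throughput(p, b, s) - obs) ** 2
                   for p, b, obs in measurements)
    return min(candidates, key=error)

# Hypothetical measurements for a 16-bank system (made-up numbers).
data = [(4, 16, 20.0), (8, 16, 40.0), (16, 16, 80.0), (32, 16, 80.0)]
best = calibrate(data, [0.1, 0.2, 0.25, 0.5])
print(best)  # → 0.2, the service time that reproduces the measurements
```

Once calibrated, such a model can be queried at configurations larger than any measured system, which is the scalability-discussion step the abstract mentions.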
Monte Carlo event generator validation and tuning for the LHC
We summarise the motivation for, and the status of, the tools developed by CEDAR/MCnet for validating and tuning Monte Carlo event generators for the LHC against data from previous colliders. We then present selected preliminary results from studies of event shapes and hadronisation observables from e+e- colliders, and of minimum bias and underlying event observables from the Tevatron, and comment on the approach needed with early LHC data to best exploit the potential for new physics discoveries at the LHC in the next few years. Comment: Prepared for Proceedings of XII Advanced Computing and Analysis Techniques in Physics Research, November 3-7 2008, Erice, Italy
Evaluation of the Cedar memory system: Configuration of 16 by 16
Basic results on the performance of the Cedar multiprocessor system are presented. Empirical results on the 16-processor, 16-memory-bank configuration show the behavior of the Cedar system under different modes of operation
CEDAR: tools for event generator tuning
I describe the work of the CEDAR collaboration in developing tools for tuning and validating Monte Carlo event generator programs. The core CEDAR task is to interface the Durham HepData database of experimental measurements to event generator validation tools such as the UCL JetWeb system; this has necessitated the migration of HepData to a new relational database system and a Java-based interaction model. The "number crunching" part of JetWeb is also being upgraded, from the Fortran HZTool library to the new C++ Rivet system and a generator interfacing layer named RivetGun. Finally, I describe how Rivet is already being used as a central part of a new generator tuning system, and summarise two other CEDAR activities, HepML and HepForge. Comment: 13 pages, prepared for XI International Workshop on Advanced Computing and Analysis Techniques in Physics Research, Amsterdam, April 23-27 2007
HepData and JetWeb: HEP data archiving and model validation
The CEDAR collaboration is extending and combining the JetWeb and HepData
systems to provide a single service for tuning and validating models of
high-energy physics processes. The centrepiece of this activity is the fitting
by JetWeb of observables computed from Monte Carlo event generator events
against their experimentally determined distributions, as stored in HepData.
Caching the results of the JetWeb simulation and comparison stages provides a
single cumulative database of event generator tunings, fitted against a wide
range of experimental quantities. An important feature of this integration is a
family of XML data formats, called HepML.Comment: 4 pages, 0 figures. To be published in proceedings of CHEP0
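The core idea of fitting generator-computed observables against stored experimental distributions can be sketched as follows. This is only an illustration of the kind of comparison JetWeb performs, not its actual implementation: a chi-square statistic scores each generator tune's histogram against a measured distribution with uncertainties, and all bin contents, errors, and tune names here are invented.

```python
# Illustrative sketch of histogram-vs-data scoring for generator tuning
# (not JetWeb's actual code); all numbers below are made up.

def chi2(mc_bins, data_bins, data_errs):
    """Chi-square between an MC prediction and data, bin by bin."""
    assert len(mc_bins) == len(data_bins) == len(data_errs)
    return sum((m - d) ** 2 / e ** 2
               for m, d, e in zip(mc_bins, data_bins, data_errs))

# Hypothetical binned observable: measured data vs. two generator tunes.
data = [10.0, 25.0, 40.0, 20.0, 5.0]
errs = [1.0, 2.0, 2.5, 2.0, 1.0]
tunes = {
    "tune_a": [11.0, 24.0, 41.0, 19.0, 5.5],   # close to the data
    "tune_b": [15.0, 30.0, 30.0, 15.0, 10.0],  # clearly off
}

best = min(tunes.items(), key=lambda kv: chi2(kv[1], data, errs))
print(best[0])  # → tune_a, the tune with the lower chi-square
```

Caching such scores per (tune, observable) pair is what lets a cumulative tuning database grow without rerunning the generator for every comparison.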
Practical Evaluation of Lempel-Ziv-78 and Lempel-Ziv-Welch Tries
We present the first thorough practical study of Lempel-Ziv-78 and Lempel-Ziv-Welch computation based on trie data structures. With a careful selection of trie representations, we can beat well-tuned popular trie data structures such as Judy, m-Bonsai, or Cedar
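To make the connection between LZW and tries concrete, here is a minimal LZW compressor whose phrase dictionary is an explicit dictionary-of-dicts trie, plus the matching decompressor. This is a plain illustration of the algorithm the study benchmarks; the paper's tuned trie representations are far more compact than Python dicts.

```python
# Minimal LZW with an explicit trie as the phrase dictionary
# (an illustration of the algorithm, not the paper's tuned structures).

def lzw_compress(data: bytes) -> list:
    """LZW compression; each trie node maps a byte to (code, children)."""
    root = {b: (b, {}) for b in range(256)}  # codes 0-255: single bytes
    next_code = 256
    out, node, code = [], root, None
    for b in data:
        if b in node:
            code, node = node[b]       # extend the current phrase
        else:
            out.append(code)           # emit code of the longest match
            node[b] = (next_code, {})  # insert the new phrase as a child
            next_code += 1
            code, node = root[b]       # restart matching at this byte
    if code is not None:
        out.append(code)
    return out

def lzw_decompress(codes: list) -> bytes:
    """Inverse of lzw_compress, rebuilding the phrase table on the fly."""
    table = {i: bytes([i]) for i in range(256)}
    next_code = 256
    prev = table[codes[0]]
    out = [prev]
    for c in codes[1:]:
        # c may reference the phrase being defined right now (cScSc case).
        entry = table[c] if c in table else prev + prev[:1]
        out.append(entry)
        table[next_code] = prev + entry[:1]
        next_code += 1
        prev = entry
    return b"".join(out)

codes = lzw_compress(b"TOBEORNOTTOBEORTOBEORNOT")
assert lzw_decompress(codes) == b"TOBEORNOTTOBEORTOBEORNOT"
print(len(codes))  # fewer output codes than input bytes
```

Every compression step is either one child lookup or one child insertion in the trie, which is why the choice of trie representation (Judy, m-Bonsai, Cedar, or the paper's alternatives) dominates LZ78/LZW performance.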
Design of testbed and emulation tools
The research summarized here concerned the design of testbed and emulation tools suitable for projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, whether as users or as designers of such systems. While a range of alternatives was considered, a software-based set of hierarchical tools was chosen to provide maximum flexibility, to ease migration to new computers as technology improves, and to take advantage of the inherent reliability and availability of commercially available computing systems
The role of hydrograph indices in parameter estimation of rainfall-runoff models
Reliable prediction with hydrologic models requires, among other things, a set of plausible parameters that corresponds with the physiographic properties of the basin. This study proposes a parameter estimation approach based on extracting, through hydrograph diagnosis, information in the form of indices that carry intrinsic properties of a basin. The concept is demonstrated by introducing two indices that describe the shape of a streamflow hydrograph in an integrated manner. Nineteen mid-size (223-4790 km²) perennial headwater basins with long records of streamflow data were selected to evaluate the ability of these indices to capture basin response characteristics. The utility of the proposed indices in parameter estimation is examined for a five-parameter hydrologic model using data from the Leaf River basin near Collins, Mississippi. It is shown that constraining the parameter estimation to parameters whose model output maintains the indices found in the historical data can improve the reliability of model predictions. These improvements were manifested in (a) improved prediction of low and high flows, (b) reduced overall bias, and (c) maintenance of the hydrograph's shape for both long-term and short-term predictions. Copyright © 2005 John Wiley & Sons, Ltd
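The constraint strategy described above, keeping only parameters whose simulated hydrograph reproduces shape indices of the observed flow, can be sketched in miniature. This is not the paper's five-parameter model or its actual indices: the sketch uses a hypothetical one-parameter linear reservoir and two invented shape indices (peakedness and fraction of recession steps) purely to show the filtering idea.

```python
# Illustrative sketch (not the paper's model or indices): filter candidate
# parameters of a toy rainfall-runoff model by hydrograph shape indices.

def simulate(rain, k):
    """One-parameter linear reservoir: storage drains at rate k per step."""
    s, flow = 0.0, []
    for r in rain:
        s += r
        q = k * s
        s -= q
        flow.append(q)
    return flow

def indices(flow):
    """Two invented shape indices: peakedness and recession fraction."""
    total = sum(flow)
    peak_ratio = max(flow) / total  # how peaked the hydrograph is
    falling = sum(1 for a, b in zip(flow, flow[1:]) if b < a)
    return peak_ratio, falling / (len(flow) - 1)

def constrain(rain, observed, candidates, tol=0.05):
    """Keep parameter values whose simulated indices match the observed."""
    target = indices(observed)
    return [k for k in candidates
            if all(abs(s - t) <= tol
                   for s, t in zip(indices(simulate(rain, k)), target))]

# Hypothetical storm: "observed" flow generated with k = 0.3, then recovered
# by rejecting candidates whose hydrograph shape disagrees.
rain = [0, 5, 10, 3, 0, 0, 0, 0]
observed = simulate(rain, 0.3)
print(constrain(rain, observed, [0.1, 0.2, 0.3, 0.6]))  # → [0.3]
```

The point of the filter is that a parameter can fit total volume yet produce the wrong hydrograph shape; index constraints reject such parameters before any further calibration.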