    Experimental study exploring the interaction of structural and leakage dynamics

    Strategies for managing leakage from water distribution systems require the ability to evaluate real losses effectively, through an understanding of the behavior of individual leaks, including their response to changes in pressure regime caused by demand or management strategies. This paper presents the results of an innovative experimental investigation into the response of longitudinal slits in pressurized viscoelastic pipes, specifically considering the interaction between structural and leakage dynamics. For the first time, leakage flow rate, pressure, leak area, and material strain were recorded simultaneously, providing new insight into the complex interaction of these factors. The paper shows that strain and leak area are directly related; hence strain, interpreted through a calibrated viscoelastic model, can be employed as a predictor of leak area. Using this approach, leakage flow rates under a range of quasi-static pressures were accurately predicted and validated. Overall, the paper demonstrates that the orifice equation, with a constant coefficient of discharge, is suitable for accurately estimating dynamic leakage flow rates from longitudinal slits, provided that the leak area is suitably incorporated.
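
    To make the estimation approach concrete, here is a minimal Python sketch of the orifice equation Q = Cd * A * sqrt(2 g h) with a pressure-dependent leak area. The linear head-area relationship and all numeric values below are illustrative assumptions standing in for the paper's calibrated viscoelastic, strain-based area model.

        import math

        def orifice_flow(cd, area_m2, head_m, g=9.81):
            """Leakage flow rate (m^3/s) from the orifice equation
            Q = Cd * A * sqrt(2 * g * h)."""
            return cd * area_m2 * math.sqrt(2 * g * head_m)

        def slit_area(base_area_m2, head_m, slope_m2_per_m):
            """Hypothetical linear head-area relationship; a stand-in for
            the paper's calibrated viscoelastic (strain-based) area model."""
            return base_area_m2 + slope_m2_per_m * head_m

        # Example: a longitudinal slit whose area grows with pressure head.
        for h in (10.0, 20.0, 30.0):  # pressure head in metres
            a = slit_area(2e-5, h, 1e-7)
            q = orifice_flow(0.6, a, h)
            print(f"h = {h:5.1f} m  area = {a:.2e} m^2  Q = {q * 1000:.3f} L/s")

    With the discharge coefficient held constant, all pressure dependence beyond the sqrt(h) term enters through the area, which is exactly where a strain-based area model plugs in.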

    Approaches to the evaluation of outbreak detection methods

    BACKGROUND: An increasing number of methods are being developed for the early detection of infectious disease outbreaks, whether naturally occurring or the result of bioterrorism; however, no standardised framework for examining the usefulness of the various outbreak detection methods exists. To promote comparability between studies, it is essential that standardised methods are developed for the evaluation of outbreak detection methods. METHODS: This analysis aims to review approaches used to evaluate outbreak detection methods and to provide a conceptual framework upon which recommendations for standardised evaluation methods can be based. We reviewed the recently published literature for reports that evaluated methods for the detection of infectious disease outbreaks in public health surveillance data. Evaluation methods identified in the recent literature were categorised according to the presence of common features, to provide a conceptual basis within which to understand current approaches to evaluation. RESULTS: There was considerable variation in the approaches used to evaluate methods for the detection of outbreaks in public health surveillance data, and there appeared to be no single approach of choice. Four main approaches were used to evaluate performance; these were labelled the Descriptive, Derived, Epidemiological and Simulation approaches. Based on the approaches identified, we propose a basic framework for evaluation and recommend the use of multiple approaches to enable a comprehensive and contextualised description of outbreak detection performance. CONCLUSION: The varied nature of performance evaluation demonstrated in this review supports the need for further development of evaluation methods to improve comparability between studies. Our findings indicate that no single approach can fulfil all evaluation requirements. We propose that the cornerstone approaches to evaluation identified here provide key contributions to internal and external validity and to the comparability of study findings, and suggest that they be incorporated into future recommendations for performance assessment.
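
    As a concrete illustration of the "Derived" style of evaluation, which summarises alarm output against known outbreak periods, the Python sketch below computes day-level sensitivity, specificity, and detection delay. The data, the day-level metric definitions, and the single-outbreak assumption are illustrative, not taken from the review.

        def evaluate_alarms(alarms, outbreak_days):
            """Derived performance metrics for a daily alarm series.

            alarms        -- list of 0/1 alarm indicators, one per day
            outbreak_days -- list of 0/1 flags marking true outbreak days
            Returns (sensitivity, specificity, detection delay in days from
            outbreak onset to the first in-outbreak alarm, None if missed).
            """
            tp = sum(1 for a, o in zip(alarms, outbreak_days) if a and o)
            tn = sum(1 for a, o in zip(alarms, outbreak_days) if not a and not o)
            fp = sum(1 for a, o in zip(alarms, outbreak_days) if a and not o)
            fn = sum(1 for a, o in zip(alarms, outbreak_days) if not a and o)
            sensitivity = tp / (tp + fn) if tp + fn else float("nan")
            specificity = tn / (tn + fp) if tn + fp else float("nan")
            delay = None
            if 1 in outbreak_days:
                onset = outbreak_days.index(1)
                for day in range(onset, len(alarms)):
                    if alarms[day] and outbreak_days[day]:
                        delay = day - onset
                        break
            return sensitivity, specificity, delay

        # Ten days of surveillance: outbreak on days 5-8, alarms on days 6-8.
        alarms = [0, 0, 0, 0, 0, 0, 1, 1, 1, 0]
        truth  = [0, 0, 0, 0, 0, 1, 1, 1, 1, 0]
        print(evaluate_alarms(alarms, truth))  # -> (0.75, 1.0, 1)

    A Simulation-style evaluation would run such metrics across many synthetic outbreak scenarios, in the spirit of the GIS-based generator described in the next entry.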

    Using GIS to create synthetic disease outbreaks

    BACKGROUND: The ability to detect disease outbreaks in their early stages is a key component of efficient disease control and prevention. With the increased availability of electronic health-care data and spatio-temporal analysis techniques, there is great potential to develop algorithms that enable more effective disease surveillance. However, to ensure that such algorithms are effective they need to be evaluated. The objective of this research was to develop a transparent, user-friendly method to simulate spatio-temporal disease outbreak data for the evaluation of outbreak detection algorithms. A state-transition model, which simulates disease outbreaks in daily time steps using specified disease-specific parameters, was developed to model the spread of infectious diseases transmitted by person-to-person contact. The software was developed in the MapBasic programming language for the MapInfo Professional geographic information system environment. RESULTS: The simulation model is generalised and flexible: it utilises the underlying distribution of the population and incorporates patterns of disease spread that can be customised to represent a range of infectious diseases and geographic locations. It provides a means to explore the ability of outbreak detection algorithms to detect a variety of events across a large number of stochastic replications in which the influence of uncertainty can be controlled. The software also allows historical data that are free from known outbreaks to be combined with simulated outbreak data to produce files for algorithm performance assessment. CONCLUSION: This simulation model provides a flexible method of generating data that may be useful for the evaluation and comparison of outbreak detection algorithm performance.
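
    The core mechanism, a state-transition model advanced in daily time steps, can be sketched in a few lines of Python. The chain-binomial formulation below is a deliberately non-spatial simplification (the paper's model runs inside a GIS and uses the spatial distribution of the population); the state structure, parameter names, and values are illustrative assumptions.

        import random

        def simulate_outbreak(population, r0, infectious_days, n_days, seed=None):
            """Daily time-step state-transition (chain-binomial) outbreak model.

            Each person is susceptible, infectious, or recovered. Each day every
            susceptible individual is independently exposed to an infection
            probability that scales with the current number of infectious people.
            """
            rng = random.Random(seed)
            beta = r0 / (infectious_days * population)   # per-pair daily hazard
            susceptible, infectious = population - 1, 1  # seed one initial case
            daily_cases = []
            for _ in range(n_days):
                p_infect = 1.0 - (1.0 - beta) ** infectious
                new_cases = sum(rng.random() < p_infect for _ in range(susceptible))
                recoveries = sum(rng.random() < 1.0 / infectious_days
                                 for _ in range(infectious))
                susceptible -= new_cases
                infectious += new_cases - recoveries
                daily_cases.append(new_cases)
            return daily_cases

        # One stochastic replication: 1000 people, R0 = 2, 4-day infectious period.
        print(simulate_outbreak(population=1000, r0=2.0, infectious_days=4,
                                n_days=60, seed=42))

    Re-running with different seeds yields the stochastic replications the paper describes; each simulated series could then be superimposed on outbreak-free historical baseline data for detector evaluation.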

    An international effort towards developing standards for best practices in analysis, interpretation and reporting of clinical genome sequencing results in the CLARITY Challenge

    BACKGROUND: There is tremendous potential for genome sequencing to improve clinical diagnosis and care once it becomes routinely accessible, but this will require formalizing research methods into clinical best practices in the areas of sequence data generation, analysis, interpretation and reporting. The CLARITY Challenge was designed to spur convergence in methods for diagnosing genetic disease starting from clinical case history and genome sequencing data. DNA samples were obtained from three families with heritable genetic disorders, and genomic sequence data were donated by sequencing platform vendors. The challenge was to analyze and interpret these data with the goals of identifying disease-causing variants and reporting the findings in a clinically useful format. Participating contestant groups were solicited broadly, and an independent panel of judges evaluated their performance. RESULTS: A total of 30 international groups were engaged. The entries reveal a general convergence of practices on most elements of the analysis and interpretation process. However, even given this commonality of approach, only two groups identified the consensus candidate variants in all disease cases, demonstrating a need for consistent fine-tuning of the generally accepted methods. There was greater diversity in the final clinical report content and in the patient consenting process, demonstrating that these areas require additional exploration and standardization. CONCLUSIONS: The CLARITY Challenge provides a comprehensive assessment of current practices for using genome sequencing to diagnose and report genetic diseases. There is remarkable convergence in bioinformatic techniques, but medical interpretation and reporting are areas that require further development by many groups.

    Mortality of emergency abdominal surgery in high-, middle- and low-income countries

    Background: Surgical mortality data are collected routinely in high-income countries, yet virtually no low- or middle-income countries have outcome surveillance in place. The aim was to prospectively collect worldwide mortality data following emergency abdominal surgery, comparing findings across countries with a low, middle or high Human Development Index (HDI). Methods: This was a prospective, multicentre cohort study. Self-selected hospitals performing emergency surgery submitted prespecified data for consecutive patients from at least one 2-week interval during July to December 2014. Postoperative mortality was analysed by hierarchical multivariable logistic regression. Results: Data were obtained for 10 745 patients from 357 centres in 58 countries; 6538 were from high-, 2889 from middle- and 1318 from low-HDI settings. The overall mortality rate was 1.6 per cent at 24 h (high 1.1 per cent, middle 1.9 per cent, low 3.4 per cent; P < 0.001), increasing to 5.4 per cent by 30 days (high 4.5 per cent, middle 6.0 per cent, low 8.6 per cent; P < 0.001). Of the 578 patients who died, 404 (69.9 per cent) did so between 24 h and 30 days after surgery (high 74.2 per cent, middle 68.8 per cent, low 60.5 per cent). After adjustment, 30-day mortality remained higher in middle-income (odds ratio (OR) 2.78, 95 per cent c.i. 1.84 to 4.20) and low-income (OR 2.97, 1.84 to 4.81) countries. Surgical safety checklist use was less frequent in low- and middle-income countries, but when used it was associated with reduced mortality at 30 days. Conclusion: Mortality is three times higher in low- than in high-HDI countries, even after adjustment for prognostic factors. Patient safety factors may have an important role. Registration number: NCT02179112 (http://www.clinicaltrials.gov).
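
    For readers unfamiliar with the modelling step, the sketch below fits a hierarchical (mixed-effects) logistic regression, fixed effects plus a random intercept per centre, on synthetic data. The variable names, covariates, effect sizes, and the choice of statsmodels' variational-Bayes fitter are all assumptions for illustration; this is not the study's actual model specification.

        import numpy as np
        import pandas as pd
        from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

        # Synthetic stand-in for the study data: a 30-day death indicator for
        # patients nested within centres, with an HDI-group fixed effect.
        rng = np.random.default_rng(0)
        n = 2000
        df = pd.DataFrame({
            "centre": rng.integers(0, 40, n),              # 40 hypothetical centres
            "hdi": rng.choice(["high", "middle", "low"], n),
            "age": rng.normal(55, 15, n),
        })
        centre_effect = rng.normal(0.0, 0.5, 40)[df["centre"]]
        linpred = (-3.0 + 0.02 * (df["age"] - 55)
                   + df["hdi"].map({"high": 0.0, "middle": 1.0, "low": 1.1})
                   + centre_effect)
        df["died_30d"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-linpred))).astype(int)

        # Fixed effects for age and HDI group; a random intercept per centre.
        model = BinomialBayesMixedGLM.from_formula(
            "died_30d ~ age + C(hdi, Treatment(reference='high'))",
            {"centre": "0 + C(centre)"},
            df,
        )
        result = model.fit_vb()   # variational Bayes fit
        print(result.summary())

    The random intercept absorbs between-centre variation, so the HDI-group odds ratios are adjusted for clustering of patients within hospitals, the kind of adjustment the abstract reports.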

    Financial statements and reports 2002

    Contains the brochure "Financial statements. Key figures". Available from: Consiglio Nazionale delle Ricerche - Biblioteca Centrale - P.le Aldo Moro, 7, Rome / CNR - Consiglio Nazionale delle Ricerche. SIGLE record; country: Italy; language: Italian.