
    Increasing the reliability of fully automated surveillance for central line–associated bloodstream infections

    OBJECTIVE: To increase the reliability of the algorithm used in our fully automated electronic surveillance system by adding rules to better identify bloodstream infections secondary to other hospital-acquired infections. METHODS: Intensive care unit (ICU) patients with positive blood cultures were reviewed. Central line–associated bloodstream infection (CLABSI) determinations were based on 2 sources: routine surveillance by infection preventionists, and fully automated surveillance. Discrepancies between the 2 sources were evaluated to determine root causes. Secondary infection sites were identified in most discrepant cases. New rules to identify secondary sites were added to the algorithm and applied to this ICU population and a non-ICU population. Sensitivity, specificity, predictive values, and kappa were calculated for the new models. RESULTS: Of 643 positive ICU blood cultures reviewed, 68 (10.6%) were identified as CLABSIs by fully automated electronic surveillance, whereas 38 (5.9%) were confirmed by routine surveillance. New rules were tested to classify a positive blood culture as a CLABSI only if it did not meet one, or a combination, of the following: (I) matching organisms (by genus and species) cultured from any other site; (II) any organisms cultured from a sterile site; (III) any organisms cultured from skin/wound; (IV) any organisms cultured from the respiratory tract. The best-fit model included new rules I and II when applied to positive blood cultures in an ICU population. However, these rules did not improve the performance of the algorithm when applied to positive blood cultures in a non-ICU population. CONCLUSION: Electronic surveillance system algorithms may need adjustment for specific populations. Infect. Control Hosp. Epidemiol. 2015;36(12):1396–1400
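
    As an illustration of how rules I and II from the best-fit ICU model might be encoded, the sketch below flags a positive blood culture as a secondary bloodstream infection when either rule matches and otherwise returns a CLABSI classification. The function and field names (classify_positive_blood_culture, other_cultures) are hypothetical and not the authors' implementation.

```python
# Hypothetical sketch of rules I and II from the best-fit ICU model.
# Function and field names are illustrative, not the authors' implementation.

def classify_positive_blood_culture(blood_organism, other_cultures):
    """Classify a positive blood culture as CLABSI or secondary infection.

    blood_organism: (genus, species) tuple from the positive blood culture.
    other_cultures: list of dicts with keys 'organism' ((genus, species) tuple)
                    and 'site' (e.g. 'sterile', 'skin/wound', 'respiratory').
    """
    for culture in other_cultures:
        # Rule I: matching organism (genus and species) cultured from any other site.
        if culture["organism"] == blood_organism:
            return "secondary infection"
        # Rule II: any organism cultured from a sterile site.
        if culture["site"] == "sterile":
            return "secondary infection"
    return "CLABSI"

# Example: the same organism recovered from a wound culture reclassifies the
# bloodstream infection as secondary rather than central line-associated.
print(classify_positive_blood_culture(
    ("Staphylococcus", "aureus"),
    [{"organism": ("Staphylococcus", "aureus"), "site": "skin/wound"}]))
```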

    The ocean carbon sink – impacts, vulnerabilities and challenges

    Carbon dioxide (CO2) is, next to water vapour, considered to be the most important natural greenhouse gas on Earth. Rapidly rising atmospheric CO2 concentrations caused by human actions such as fossil fuel burning, land-use change or cement production over the past 250 years have given cause for concern that changes in Earth’s climate system may progress at a much faster pace and to a larger extent than during the past 20 000 years. Investigating global carbon cycle pathways and finding suitable adaptation and mitigation strategies has, therefore, become a major concern in many research fields. The oceans have a key role in regulating atmospheric CO2 concentrations and currently take up about 25% of annual anthropogenic carbon emissions to the atmosphere. Questions that still need to be answered are what the carbon uptake kinetics of the oceans will be in the future and how the increase in the oceanic carbon inventory will affect its ecosystems and their services. This requires comprehensive investigations, including high-quality ocean carbon measurements on different spatial and temporal scales, the management of data in sophisticated databases, the application of Earth system models to provide future projections for given emission scenarios, as well as a global synthesis and outreach to policy makers. In this paper, the current understanding of the ocean as an important carbon sink is reviewed with respect to these topics. Emphasis is placed on the complex interplay of different physical, chemical and biological processes that yield both positive and negative air–sea flux values for natural and anthropogenic CO2, as well as on increased CO2 uptake as the regulating force of the radiative warming of the atmosphere and the gradual acidification of the oceans. Major future ocean carbon challenges in the fields of ocean observations, modelling and process research, as well as the relevance of other biogeochemical cycles and greenhouse gases, are discussed.
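
    The air–sea exchange referred to above is commonly described by a bulk flux formula, F = k · K0 · (pCO2,sea − pCO2,air), where k is the gas transfer velocity and K0 the CO2 solubility. The sketch below simply evaluates that standard formula; the numerical values for k, K0 and the partial pressures are illustrative assumptions, not data from the paper.

```python
# Minimal sketch of the standard bulk air-sea CO2 flux parameterization:
#   F = k * K0 * (pCO2_sea - pCO2_air)
# Positive F means outgassing to the atmosphere, negative F means ocean uptake.
# All numerical values below are illustrative only.

def air_sea_co2_flux(k_cm_per_hr, solubility_mol_per_l_atm,
                     pco2_sea_uatm, pco2_air_uatm):
    """Return the CO2 flux in mol C m^-2 yr^-1 (positive = outgassing)."""
    k_m_per_yr = k_cm_per_hr * 1e-2 * 24 * 365                   # cm/hr -> m/yr
    solubility_mol_per_m3_atm = solubility_mol_per_l_atm * 1e3   # mol/(L atm) -> mol/(m^3 atm)
    delta_pco2_atm = (pco2_sea_uatm - pco2_air_uatm) * 1e-6      # microatm -> atm
    return k_m_per_yr * solubility_mol_per_m3_atm * delta_pco2_atm

# Surface water undersaturated by 20 microatm acts as a sink (negative flux),
# on the order of -1 mol C per square metre per year for these example values.
print(air_sea_co2_flux(k_cm_per_hr=15.0, solubility_mol_per_l_atm=0.035,
                       pco2_sea_uatm=380.0, pco2_air_uatm=400.0))
```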

    REACH implementation costs in the Belgian food industry: a semi-qualitative study

    In this paper we discuss how companies in the Belgian food industry are affected by the REACH legislation and whether their competitiveness is weakened as a result. The study was carried out through an extensive literature study, an electronic survey, in-depth interviews and a case study. No indication is observed of REACH compliance significantly hampering the competitive position of the Belgian food industry. The overall cost burden seems to be relatively low. In contrast with the chemical industry, large food companies bear the highest costs, whereas the financial impact on small and medium-sized food companies remains limited.

    Psychometric properties of the Zephyr bioharness device: A systematic review

    © 2018 The Author(s). Background: Technological development and improvements in Wearable Physiological Monitoring devices have facilitated the wireless and continuous field-based monitoring/capturing of physiologic measures in healthy, clinical or athletic populations. These devices have many applications for prevention and rehabilitation of musculoskeletal disorders, assuming reliable and valid data are collected. The purpose of this study was to appraise the quality of, and synthesize findings from, published studies on the psychometric properties of heart rate measurements taken with the Zephyr Bioharness device. Methods: We searched the Embase, Medline, PsycInfo, PubMed and Google Scholar databases to identify articles. Articles were appraised for quality using a structured clinical-measurement-specific appraisal tool. Two raters evaluated the quality and conducted data extraction. We extracted data on reliability measures (intra-class correlation coefficients and standard error of measurement) and validity measures (Pearson/Spearman correlation coefficients), along with mean differences. Agreement parameters were summarised by the average biases and 95% limits of agreement. Results: A total of ten studies were included; quality ratings ranged from 54 to 92%. The intra-class correlation coefficients reported ranged from 0.85 to 0.98. The construct validity coefficients, compared against gold-standard calibrations or other commercially used devices, ranged from 0.74 to 0.99 and 0.67 to 0.98, respectively. Zephyr Bioharness agreement error ranged from −4.81 (under-estimation) to 3.00 (over-estimation) beats per minute, with varying 95% limits of agreement, when compared with gold-standard measures. Conclusion: Good to excellent quality evidence from ten studies suggested that the Zephyr Bioharness device can provide reliable and valid measurements of heart rate across multiple contexts, and that it displayed good agreement with gold-standard comparators, supporting criterion validity.
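
    The agreement statistics summarised above (average bias with 95% limits of agreement) follow the usual Bland–Altman approach. A minimal sketch of that computation is shown below; the heart rate readings are made up for illustration and are not data from any included study.

```python
import numpy as np

# Minimal Bland-Altman sketch for device-vs-gold-standard agreement.
# The heart rate values below are made up for illustration only.
device    = np.array([72, 85, 90, 110, 130, 150, 165])   # wearable readings (bpm)
criterion = np.array([70, 86, 92, 108, 133, 151, 168])   # gold-standard ECG readings (bpm)

differences = device - criterion
bias = differences.mean()                       # average over/under-estimation
sd = differences.std(ddof=1)                    # sample standard deviation of differences
loa_lower = bias - 1.96 * sd                    # lower 95% limit of agreement
loa_upper = bias + 1.96 * sd                    # upper 95% limit of agreement

print(f"bias = {bias:.2f} bpm, 95% LoA = [{loa_lower:.2f}, {loa_upper:.2f}] bpm")
```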

    Provably scale-covariant networks from oriented quasi quadrature measures in cascade

    This article presents a continuous model for hierarchical networks based on a combination of mathematically derived models of receptive fields and biologically inspired computations. Based on a functional model of complex cells in terms of an oriented quasi quadrature combination of first- and second-order directional Gaussian derivatives, we couple such primitive computations in cascade over combinatorial expansions over image orientations. Scale-space properties of the computational primitives are analysed, and it is shown that the resulting representation allows for provable scale and rotation covariance. A prototype application to texture analysis is developed, and it is demonstrated that a simplified mean-reduced representation of the resulting QuasiQuadNet leads to promising experimental results on three texture datasets.
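
    To make the primitive computation concrete, the sketch below evaluates an oriented quasi quadrature measure of the form Q_phi = sqrt(L_phi^2 + C · L_phiphi^2) from first- and second-order directional Gaussian derivatives. The constant C, the scale normalization (gamma = 1) and the orientation bank are simplifying assumptions for illustration, not the exact formulation or parameters used in the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Sketch of an oriented quasi quadrature measure built from first- and
# second-order directional Gaussian derivatives. The constant C and the
# scale normalization below are simplified assumptions.

def quasi_quadrature(image, sigma, phi, C=0.5):
    """Oriented quasi quadrature response at scale sigma and orientation phi (radians)."""
    # Gaussian derivatives (axis order: rows = y, columns = x).
    Lx  = gaussian_filter(image, sigma, order=(0, 1))
    Ly  = gaussian_filter(image, sigma, order=(1, 0))
    Lxx = gaussian_filter(image, sigma, order=(0, 2))
    Lxy = gaussian_filter(image, sigma, order=(1, 1))
    Lyy = gaussian_filter(image, sigma, order=(2, 0))

    c, s = np.cos(phi), np.sin(phi)
    L_phi = c * Lx + s * Ly                                   # first-order directional derivative
    L_phiphi = c * c * Lxx + 2 * c * s * Lxy + s * s * Lyy    # second-order directional derivative

    # Scale-normalized combination (gamma = 1 normalization assumed).
    return np.sqrt((sigma * L_phi) ** 2 + C * (sigma ** 2 * L_phiphi) ** 2)

# Example: responses over a small bank of orientations on a random test image.
image = np.random.rand(64, 64)
responses = [quasi_quadrature(image, sigma=2.0, phi=k * np.pi / 4) for k in range(4)]
```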

    Geometric reconstruction methods for electron tomography

    Electron tomography is becoming an increasingly important tool in materials science for studying the three-dimensional morphologies and chemical compositions of nanostructures. The image quality obtained by many current algorithms is seriously affected by the problems of missing wedge artefacts and nonlinear projection intensities due to diffraction effects. The former refers to the fact that data cannot be acquired over the full 180° tilt range; the latter implies that for some orientations, crystalline structures can show strong contrast changes. To overcome these problems we introduce and discuss several algorithms from the mathematical fields of geometric and discrete tomography. The algorithms incorporate geometric prior knowledge (mainly convexity and homogeneity), which, in principle, also considerably reduces the number of tilt angles required. Results are discussed for the reconstruction of an InAs nanowire.
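
    The sketch below illustrates the missing-wedge setting on a simple homogeneous, convex phantom: projections are simulated over a truncated tilt range, reconstructed by filtered back-projection, and then segmented to the two known grey levels as a stand-in for a homogeneity prior. This is an illustration of the problem setting only, not the geometric or discrete reconstruction algorithms developed in the paper; the tilt range and threshold are assumed values.

```python
import numpy as np
from skimage.transform import radon, iradon

# Homogeneous, convex test object (a filled disc) with two grey levels {0, 1}.
size = 128
yy, xx = np.mgrid[:size, :size]
phantom = ((xx - size / 2) ** 2 + (yy - size / 2) ** 2 < (size / 4) ** 2).astype(float)

# Projections only over a limited tilt range (+/- 70 degrees), mimicking the missing wedge.
theta = np.arange(-70.0, 71.0, 1.0) + 90.0   # skimage expects angles in degrees
sinogram = radon(phantom, theta=theta)

# Standard filtered back-projection suffers from wedge artefacts...
fbp = iradon(sinogram, theta=theta)

# ...which a homogeneity prior can partly repair: snap to the two known grey levels.
discrete = (fbp > 0.5).astype(float)

print("FBP mean absolute error:        ", np.abs(fbp - phantom).mean())
print("After homogeneity step, error:  ", np.abs(discrete - phantom).mean())
```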