35 research outputs found

    Comparison of clinical outcomes over time of inpatients with healthcare-associated or community-acquired coronavirus disease 2019 (COVID-19): A multicenter, prospective cohort study.

    OBJECTIVE To compare clinical outcomes over time of inpatients with healthcare-associated coronavirus disease 2019 (HA-COVID-19) versus community-acquired COVID-19 (CA-COVID-19). DESIGN We conducted a multicenter, prospective observational cohort study of inpatients with COVID-19. SETTING The study was conducted across 16 acute-care hospitals in Switzerland. PARTICIPANTS AND METHODS We compared HA-COVID-19 cases, defined as patients with a positive severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) test > 5 days after hospital admission, with hospitalized CA-COVID-19 cases, defined as those who tested positive within 5 days of admission. The composite primary outcome was patient transfer to an intensive care unit (ICU) or an intermediate care unit (IMCU) and/or all-cause in-hospital mortality. We used cause-specific Cox regression and Fine-Gray regression to model the time to the composite clinical outcome, adjusting for confounders and accounting for the competing event of discharge from hospital. We compared our results to those from a conventional approach using an adjusted logistic regression model in which time-varying effects and competing risks were ignored. RESULTS Between February 19, 2020, and December 31, 2020, we included 1,337 HA-COVID-19 cases and 9,068 CA-COVID-19 cases. HA-COVID-19 patients were significantly older: median age, 80 years (interquartile range [IQR], 71-87) versus 70 years (IQR, 57-80) (P < .001). A greater proportion of HA-COVID-19 patients had a Charlson comorbidity index ≥ 5 than did CA-COVID-19 patients (79% vs 55%; P < .001). In time-varying analyses, between day 0 and day 8, HA-COVID-19 cases had a decreased risk of death or ICU or IMCU transfer compared to CA-COVID-19 cases (cause-specific hazard ratio [csHR], 0.43; 95% confidence interval [CI], 0.33-0.56).
In contrast, from day 8 to day 30, HA-COVID-19 cases had an increased risk of death or ICU or IMCU transfer (csHR, 1.49; 95% CI, 1.20-1.85), with no significant effect on the rate of discharge (csHR, 0.83; 95% CI, 0.61-1.14). In the conventional logistic regression model, HA-COVID-19 was protective against transfer to an ICU or IMCU and/or all-cause in-hospital mortality (adjusted odds ratio [aOR], 0.79; 95% CI, 0.67-0.93). CONCLUSIONS The risk of adverse clinical outcomes for HA-COVID-19 cases increased substantially over time in hospital and exceeded that for CA-COVID-19. Using approaches that do not account for time-varying effects or competing events may not fully capture the true risk of HA-COVID-19 compared to CA-COVID-19.
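The modelling point this abstract makes, that treating the competing event of discharge as censoring distorts risk estimates, can be illustrated with a small self-contained sketch. The data are synthetic and the estimator is deliberately simplified; the study itself used cause-specific Cox and Fine-Gray regression, not this code.

```python
# Illustrative sketch (not the study's code): why treating the competing
# event of discharge as censoring inflates the apparent risk of the outcome
# of interest. All numbers below are synthetic.
# Each tuple is (day, event), where event is "outcome" (ICU/IMCU transfer
# or death), "discharge" (competing event), or "censored".

def cumulative_incidence(data, horizon, event_of_interest="outcome",
                         treat_competing_as_censoring=False):
    """Aalen-Johansen cumulative incidence at `horizon`; optionally the
    naive 1-KM estimate that (incorrectly) censors the competing event."""
    times = sorted({t for t, _ in data if t <= horizon})
    surv = 1.0    # probability of still being event-free just before t
    cuminc = 0.0  # cumulative incidence of the event of interest
    for t in times:
        d_int = sum(1 for tt, e in data if tt == t and e == event_of_interest)
        d_comp = sum(1 for tt, e in data
                     if tt == t and e not in (event_of_interest, "censored"))
        n = sum(1 for tt, _ in data if tt >= t)  # at risk just before t
        if n == 0:
            break
        cuminc += surv * d_int / n
        if treat_competing_as_censoring:
            surv *= 1 - d_int / n               # naive Kaplan-Meier
        else:
            surv *= 1 - (d_int + d_comp) / n    # event-free of ANY event
    return cuminc

patients = ([(3, "discharge")] * 40 + [(5, "outcome")] * 10 +
            [(10, "outcome")] * 5 + [(14, "discharge")] * 40 +
            [(30, "censored")] * 5)

naive = cumulative_incidence(patients, 30, treat_competing_as_censoring=True)
proper = cumulative_incidence(patients, 30)
print(f"naive 1-KM estimate:     {naive:.3f}")   # overstated risk
print(f"Aalen-Johansen estimate: {proper:.3f}")  # accounts for discharge
```

Because early discharges remove patients who could never have the hospital outcome, the naive estimate is systematically higher than the competing-risks estimate on the same data.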

    Systematic scoping review of automated systems for the surveillance of healthcare-associated bloodstream infections related to intravascular catheters

    Introduction Intravascular catheters are crucial devices in medical practice, but they increase the risk of healthcare-associated infections (HAIs) and related adverse health-economic outcomes. This scoping review aims to provide a comprehensive overview of published automated algorithms for surveillance of catheter-related bloodstream infections (CRBSI) and central line-associated bloodstream infections (CLABSI). Methods We performed a scoping review based on a systematic search of the literature in PubMed and EMBASE from 1 January 2000 to 31 December 2021. Studies were included if they evaluated the predictive performance of automated surveillance algorithms for CLABSI/CRBSI detection and used manually collected surveillance data as the reference. We assessed the design of the automated systems, including the definitions used to develop the algorithms (CLABSI versus CRBSI), the datasets and denominators used, and the algorithms evaluated in each of the studies. Results We screened 586 studies based on title and abstract, and 99 were assessed based on full text. Nine studies were included in the scoping review. Most studies were monocentric (n = 5), and most identified CLABSI (n = 7) as the outcome. All nine studies used administrative and microbiological data, and five studies included the presence of a central line in their automated system. Six studies specified the denominator they selected, five of which chose central-line days. The most common rules and steps used in the algorithms were categorized as hospital-acquired rules, infection rules (infection versus contamination), deduplication, episode grouping, secondary BSI rules (secondary versus primary BSI), and catheter-associated rules. Conclusion The automated surveillance systems that we identified were heterogeneous in terms of definitions, datasets and denominators used, with a combination of rules in each algorithm.
Further guidelines and studies are needed to develop and implement algorithms to detect CLABSI/CRBSI, with standardized definitions, appropriate data sources and suitable denominators.
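As a rough illustration of the rule categories listed above (hospital-acquired, infection-vs-contamination, deduplication/episode grouping, secondary-BSI, catheter-associated), a minimal rule chain might look like the sketch below. Every threshold, field name, and the commensal list are illustrative assumptions, not part of any study's validated algorithm.

```python
# Hypothetical minimal sketch of a CLABSI detection rule chain; real
# surveillance algorithms are far more elaborate. The 48 h hospital-acquired
# cutoff and 14-day deduplication window are assumptions for illustration.
from datetime import date, datetime, timedelta

COMMON_COMMENSALS = {"Staphylococcus epidermidis", "Corynebacterium spp."}

def detect_clabsi_episodes(admission, line_days, cultures):
    """cultures: list of dicts with 'time', 'organism', 'other_site_match'
    (same organism grown at another infection site, suggesting a secondary
    rather than catheter-associated BSI)."""
    episodes = []
    for c in sorted(cultures, key=lambda c: c["time"]):
        # Hospital-acquired rule: culture drawn > 48 h after admission.
        if c["time"] - admission <= timedelta(hours=48):
            continue
        # Catheter-associated rule: a central line in place on that day.
        if c["time"].date() not in line_days:
            continue
        # Infection-vs-contamination rule: a single common-commensal
        # isolate counts as contamination unless confirmed by a second set.
        if c["organism"] in COMMON_COMMENSALS and not c.get("confirmed"):
            continue
        # Secondary-BSI rule: same organism at another site -> not CLABSI.
        if c["other_site_match"]:
            continue
        # Deduplication / episode grouping: same organism within 14 days
        # belongs to the existing episode.
        if any(e["organism"] == c["organism"] and
               c["time"] - e["time"] <= timedelta(days=14)
               for e in episodes):
            continue
        episodes.append(c)
    return episodes

admission = datetime(2021, 1, 1, 8, 0)
line_days = {date(2021, 1, d) for d in range(2, 28)}
cultures = [
    {"time": datetime(2021, 1, 1, 20, 0), "organism": "Escherichia coli",
     "other_site_match": False},   # <= 48 h after admission: excluded
    {"time": datetime(2021, 1, 5, 9, 0), "organism": "Escherichia coli",
     "other_site_match": False},   # counted as a CLABSI episode
    {"time": datetime(2021, 1, 10, 9, 0), "organism": "Escherichia coli",
     "other_site_match": False},   # same organism within 14 d: deduplicated
    {"time": datetime(2021, 1, 12, 9, 0),
     "organism": "Staphylococcus epidermidis",
     "other_site_match": False},   # single commensal: contamination
    {"time": datetime(2021, 1, 20, 9, 0), "organism": "Klebsiella pneumoniae",
     "other_site_match": True},    # secondary BSI: excluded
]
episodes = detect_clabsi_episodes(admission, line_days, cultures)
print(len(episodes))  # → 1
```

The heterogeneity the review describes comes precisely from how each system orders these rules and sets these thresholds.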

    Digitalizing Clinical Guidelines: Experiences in the Development of Clinical Decision Support Algorithms for Management of Childhood Illness in Resource-Constrained Settings.

    Clinical decision support systems (CDSSs) can strengthen the quality of integrated management of childhood illness (IMCI) in resource-constrained settings. Several IMCI-related CDSSs have been developed and implemented in recent years. Yet, despite having a shared starting point, the IMCI-related CDSSs vary markedly, owing to the need for interpretation when translating narrative guidelines into decision logic, combined with considerations of context and design choices. Between October 2019 and April 2021, we conducted a comparative analysis of 4 IMCI-related CDSSs. The extent of adaptations to IMCI varied, but common themes emerged. Scope was extended to cover a broader range of conditions. Content was added or modified to enhance precision, align with new evidence, and support rational resource use. Structure was modified to increase efficiency, improve usability, and prioritize care for severely ill children. The multistakeholder development processes involved syntheses of recommendations from existing guidelines and literature; creation and validation of clinical algorithms; and iterative development, implementation, and evaluation. The common themes surrounding adaptations of IMCI guidance highlight the complexities of digitalizing evidence-based recommendations and reinforce the rationale for leveraging standards for CDSS development, such as the World Health Organization's SMART Guidelines. Implementation through multistakeholder dialogue is critical to ensure CDSSs can effectively and equitably improve quality of care for children in resource-constrained settings.

    Predictive performance of automated surveillance algorithms for intravascular catheter bloodstream infections: a systematic review and meta-analysis.

    BACKGROUND Intravascular catheter infections are associated with adverse clinical outcomes. However, a significant proportion of these infections are preventable. Evaluations of the performance of automated surveillance systems for adequate monitoring of central line-associated bloodstream infection (CLABSI) or catheter-related bloodstream infection (CRBSI) are limited. OBJECTIVES We evaluated the predictive performance of automated algorithms for CLABSI/CRBSI detection, and investigated which parameters included in automated algorithms provide the greatest accuracy for CLABSI/CRBSI detection. METHODS We performed a meta-analysis based on a systematic search of published studies in PubMed and EMBASE from 1 January 2000 to 31 December 2021. We included studies that evaluated the predictive performance of automated surveillance algorithms for CLABSI/CRBSI detection and used manually collected surveillance data as the reference. We estimated the pooled sensitivity and specificity of the algorithms and performed a univariable meta-regression of the different parameters used across algorithms. RESULTS The search identified five full-text studies, and 32 different algorithms or study populations were included in the meta-analysis. All studies analysed central venous catheters and identified CLABSI or CRBSI as an outcome. Pooled sensitivity and specificity of the automated surveillance algorithms were 0.88 [95% CI 0.84-0.91] and 0.86 [95% CI 0.79-0.92], with significant heterogeneity (I2 = 91.9, p < 0.001 and I2 = 99.2, p < 0.001, respectively). In meta-regression, algorithms that include results of microbiological cultures from specific specimens (respiratory, urine and wound) to exclude non-CRBSI had higher specificity estimates (0.92, 95% CI 0.88-0.96) than algorithms that include results of microbiological cultures from any other body sites (0.88, 95% CI 0.81-0.95).
The addition of clinical signs as a predictor did not improve the performance of these algorithms, with similar specificity estimates (0.92, 95% CI 0.88-0.96). CONCLUSIONS The performance of automated algorithms for detection of intravascular catheter infections, in comparison with manual surveillance, seems encouraging. The development of automated algorithms should consider including results of microbiological cultures from specific specimens to exclude non-CRBSI, while the inclusion of clinical data may not add value. Trial Registration Prospectively registered with the International Prospective Register of Systematic Reviews (PROSPERO ID CRD42022299641; January 21, 2022). https://www.crd.york.ac.uk/prospero/display_record.php?ID=CRD42022299641
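As background for how pooled estimates like those above are typically obtained, here is a minimal sketch of random-effects pooling of per-study sensitivities on the logit scale (DerSimonian-Laird). The study counts below are invented, and the review's actual model may well differ; this only shows the general mechanics.

```python
# Sketch of DerSimonian-Laird random-effects pooling of sensitivities on
# the logit scale. The (true_positive, false_negative) counts are made up,
# not the review's data.
import math

def pooled_sensitivity(studies):
    """studies: list of (true_positives, false_negatives) per study."""
    # Logit-transform each study's sensitivity with its approximate variance.
    y, v = [], []
    for tp, fn in studies:
        sens = tp / (tp + fn)
        y.append(math.log(sens / (1 - sens)))
        v.append(1 / tp + 1 / fn)  # approx. variance of the logit
    w = [1 / vi for vi in v]       # fixed-effect (inverse-variance) weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    # DerSimonian-Laird between-study variance tau^2.
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)
    # Random-effects weights, pooled logit, back-transformed to a proportion.
    w_re = [1 / (vi + tau2) for vi in v]
    pooled_logit = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    return 1 / (1 + math.exp(-pooled_logit))

studies = [(45, 5), (80, 12), (60, 10), (30, 2)]
print(f"pooled sensitivity: {pooled_sensitivity(studies):.3f}")
```

The high I² values reported in the abstract correspond to a large tau² in this framework, i.e. most of the spread between algorithms is genuine between-study variation rather than sampling noise.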

    Study protocol for an international, multicentre stepped-wedge cluster randomised trial to evaluate the impact of a digital antimicrobial stewardship smartphone application

    Introduction With the widespread use of electronic health records and handheld electronic devices in hospitals, informatics-based antimicrobial stewardship interventions hold great promise as tools to promote appropriate antimicrobial drug prescribing. However, more research is needed to evaluate their optimal design and their impact on the quantity and quality of antimicrobial prescribing. Methods and analysis Use of smartphone-based digital stewardship applications (apps) supporting local guideline-directed empirical antimicrobial use by physicians will be compared with usual antimicrobial prescribing (primary outcome) in three hospitals in the Netherlands, Sweden and Switzerland. Secondary outcomes will incl

    Computer-based assistance for better antimicrobial prescribing: myth or reality?

    Antimicrobial stewardship programmes aim to optimise antimicrobial prescribing, with the ultimate aim of improving patient care while limiting the emergence and spread of resistant bacteria. Two main categories of digital tools are currently available in this area: stand-alone mobile applications and tools directly integrated into electronic health records. The former are easy to implement and less costly, but offer limited support because they do not take individual patient data into account; their impact depends on clinicians' willingness to use them regularly. Integrated systems are based on more sophisticated, individualised algorithms and can intervene through a variety of techniques (restriction, reassessment, feedback, alerts), sometimes before the prescription occurs; however, they are costly and complex to implement and require an appropriate IT infrastructure. For both types of system, as in other areas of digital medicine, the level of evidence regarding clinical impact is low. In this review we examine the two types of tools, the benefits and challenges associated with each, and the available data on effectiveness.

    Managing the emergence of multidrug-resistant tuberculosis at the Hospices Civils de Lyon from 2007 to 2012 (a study of twenty adult cases and of the paediatric follow-up around the cases)

    The emergence and spread of multidrug-resistant tuberculosis (MDR-TB) strains is a concern, including in countries with a low incidence of tuberculosis. In the first part of this work, we retrospectively studied 20 patients with MDR-TB managed in our institution between January 2007 and June 2012. Ninety percent of the patients were of foreign origin, 60% of them from Eastern Europe or Central Asia. Most strains belonged to the M. tuberculosis Beijing genotype. The rate of resistance to ethionamide was particularly high (78%). An intensive phase including an injectable agent for 8 months, as recommended by the WHO, was tolerated by only one patient. A 12-month continuation phase including 3 active agents was completed by 8 patients. Amikacin and linezolid were the most toxic agents. Six patients received treatment with bedaquiline (TMC207), with a good clinical and biological tolerance profile. Pulmonary resection surgery was performed in 5 patients. No patient experienced treatment failure or relapse. In the second part of this work, we studied the management of child contacts of the 20 adult cases described above. Paediatric MDR-TB cases are a sentinel event indicating recent circulation of strains within a community. Forty-six child contacts were identified. During follow-up, the number of children lost to follow-up was high: one quarter at 3 months and three quarters at 12 months, illustrating the difficulty of following a young reservoir population (87% of foreign origin). Three children showed tuberculin skin test conversion. All chest radiographs were read as normal. For 1 (2%) child, a diagnosis of TB was made on the basis of tuberculin conversion associated with abnormalities on chest CT. For 2 (6%) children, a diagnosis of latent TB infection (LTBI) was made on the basis of tuberculin conversion with a normal chest CT. Two children received treatment adapted to the index case's strain. In this context of emerging MDR-TB, replacing the tuberculin skin test with interferon-gamma release assays (IGRAs), which are as sensitive and more specific (independent of BCG vaccination), seems relevant. Likewise, chest CT improves the detection of lesions not visible on radiography. The strategy of treating LTBI after exposure to MDR-TB is debated, in favour of the WHO strategy of prolonged surveillance with curative treatment of active TB should it occur.

    In search of lost time: A timing evaluation of antimicrobial prescribing with and without a computerized decision support system using clinical vignettes

    Background: We implemented a computerized decision support system (CDSS) integrated in the in-house computerized physician order entry (CPOE) system to assist physicians with antimicrobial prescribing decisions in the context of the multicenter cluster-randomized COMPASS trial (NCT03120975). Some physicians in the intervention wards complained about the perceived extra time associated with using the CDSS compared with routine prescribing through the CPOE. The aim of this study was to compare the time needed to prescribe antimicrobials with and without the CDSS. Methods: Physicians with and without previous experience with the COMPASS CDSS working at our hospital in Geneva, Switzerland, were recruited to prescribe antimicrobials using clinical vignettes. Physicians without experience received a brief explanation of the CDSS. Each physician received 2 groups of 5-7 clinical vignettes randomly selected from a pool of 28. Each group of vignettes included prescriptions with different levels of complexity (empiric versus targeted or pre-defined treatment, dose adjustment for renal function, oral switch, and treatments for which COMPASS does not provide recommendations or where a deviation was necessary). Prescriptions were completed using the standard CPOE (first set), then using COMPASS (second set). A print version of the local antimicrobial guidelines was available for consultation. The time to complete each prescription was recorded, including time spent consulting the paper guidelines. The Mann-Whitney test was used for comparisons. Consultation of the guidelines booklet and concordance with local guidelines were also assessed. Results: Twenty-five physicians were recruited. Thirteen (52%) had previously used COMPASS. Among them, 11 (85%) estimated the extra time to be more than 1 minute. We evaluated a total of 296 vignettes. Overall, the median time to complete a prescription was 55.5 s (IQR 38-86) using COMPASS and 50 s (IQR 31-88) using the standard CPOE (p = 0.24).
Concordance of prescriptions with local guidelines was similar with the 2 systems (127/148, 85.8% for both), but consultation of the paper guidelines was more frequent when prescribing without the CDSS (49.3% [73/148] vs 22.3% [33/148]). Conclusions: The extra time required for prescribing using COMPASS is overestimated by end-users. Information collected in the study will be used to streamline the prescribing process via COMPASS and increase acceptance.
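The Mann-Whitney comparison of prescribing times described above can be sketched in a few lines. In practice `scipy.stats.mannwhitneyu` is the standard tool; the stdlib version below just shows the mechanics, and the timing values are invented, not the study's data.

```python
# Sketch of a two-sided Mann-Whitney U test with normal approximation
# (no tie correction; adequate as an illustration with few ties).
import math

def mann_whitney_u(a, b):
    """Return (U, two-sided p) for samples a and b."""
    # U counts, over all pairs, how often a-values exceed b-values
    # (ties count half); equivalent to the rank-sum formulation.
    u = sum((x > y) + 0.5 * (x == y) for x in a for y in b)
    n1, n2 = len(a), len(b)
    mu = n1 * n2 / 2                                   # mean of U under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)    # sd of U under H0
    z = (u - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))               # two-sided p-value
    return u, p

# Invented per-vignette prescription times in seconds (not the study's data).
cdss_times = [55, 60, 48, 72, 38, 90, 41]
cpoe_times = [50, 45, 31, 88, 62, 35, 47]
u, p = mann_whitney_u(cdss_times, cpoe_times)
print(f"U = {u}, p = {p:.3f}")
```

With samples of this size and overlap, the test (like the study's p = 0.24) gives no evidence of a difference in medians, which is the point the abstract makes about perceived versus measured extra time.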

    Methicillin-resistant Staphylococcus aureus: an update on prevention and control in acute care settings

    Methicillin-resistant Staphylococcus aureus (MRSA) is a leading cause of healthcare-associated infections. Controversies regarding the effectiveness of various control strategies have contributed to varying approaches to MRSA control. However, new evidence from large-scale studies has emerged, particularly concerning screening and decolonization. Importantly, the implementation and outcomes of control measures in practice are influenced not only by scientific evidence but also by economic, administrative, and political factors, as demonstrated by decreasing MRSA rates in a number of countries after concerted and coordinated efforts at a national level. Flexibility to adapt measures based on local epidemiology and resources is essential for successful MRSA control.