Risk factors for recurrence in patients with Clostridium difficile infection due to 027 and non-027 ribotypes
Objectives: Our objective was to evaluate factors associated with recurrence in patients with 027+ and 027– Clostridium difficile infection (CDI). Methods: Patients with CDI observed between January and December 2014 in six hospitals were consecutively included in the study. The 027 ribotype was deduced from the presence of the tcdB and cdt genes and the deletion Δ117 in tcdC (Xpert® C. difficile/Epi). Recurrence was defined as a positive laboratory test result for C. difficile more than 14 days but within 8 weeks after the initial diagnosis date, with reappearance of symptoms. To identify factors associated with recurrence in 027+ and 027– CDI, a multivariate analysis was performed in each patient group. Subdistribution hazard ratios (sHRs) and 95% confidence intervals (95%CIs) were calculated. Results: Overall, 238 patients with 027+ CDI and 267 with 027– CDI were analysed. On multivariate analysis, metronidazole monotherapy (sHR 2.380, 95%CI 1.549–3.60, p <0.001) and immunosuppressive treatment (sHR 3.116, 95%CI 1.906–5.090, p <0.001) were factors associated with recurrence in patients with 027+ CDI. In this patient group, metronidazole monotherapy was independently associated with recurrence in both mild/moderate (sHR 1.894, 95%CI 1.051–3.410, p 0.033) and severe CDI (sHR 2.476, 95%CI 1.281–4.790, p 0.007). Conversely, non-severe disease (sHR 3.704, 95%CI 1.437–9.524, p 0.007) and absence of chronic renal failure (sHR 16.129, 95%CI 2.155–125.000, p 0.007) were associated with recurrence in 027– CDI. Conclusions: Compared with vancomycin, metronidazole monotherapy appears less effective in curing CDI without relapse in the 027+ patient group, independently of disease severity.
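The recurrence window used in the study (a positive test more than 14 days but within 8 weeks of the initial diagnosis) can be sketched as a simple date check; the function name and example dates below are illustrative, and the symptom-reappearance criterion is not modeled:

```python
from datetime import date, timedelta

def is_recurrence(initial_diagnosis: date, positive_test: date) -> bool:
    """Study window for recurrence: a positive C. difficile test more than
    14 days but within 8 weeks (56 days) of the initial diagnosis date.
    (Symptom reappearance, also required by the definition, is not checked.)"""
    delta = positive_test - initial_diagnosis
    return timedelta(days=14) < delta <= timedelta(weeks=8)

print(is_recurrence(date(2014, 1, 1), date(2014, 1, 10)))  # False: within 14 days
print(is_recurrence(date(2014, 1, 1), date(2014, 2, 1)))   # True: day 31
print(is_recurrence(date(2014, 1, 1), date(2014, 3, 15)))  # False: beyond 8 weeks
```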
Core components for effective infection prevention and control programmes: new WHO evidence-based recommendations
Abstract
Health care-associated infections (HAI) are a major public health problem with a significant impact on morbidity, mortality and quality of life. They also represent an important economic burden to health systems worldwide. However, a large proportion of HAI are preventable through effective infection prevention and control (IPC) measures. Improvements in IPC at the national and facility level are critical for the successful containment of antimicrobial resistance and the prevention of HAI, including outbreaks of highly transmissible diseases, through high-quality care within the context of universal health coverage. Given the limited availability of IPC evidence-based guidance and standards, the World Health Organization (WHO) decided to prioritize the development of global recommendations on the core components of effective IPC programmes at both the national and acute health care facility level, based on systematic literature reviews and expert consensus. The aim of the guideline development process was to identify the evidence and evaluate its quality, and to consider patient values and preferences, resource implications, and the feasibility and acceptability of the recommendations. As a result, 11 recommendations and three good practice statements are presented here, including a summary of the supporting evidence, and form the substance of a new WHO IPC guideline.
Detection of Viral RNA in Tissues following Plasma Clearance from an Ebola Virus Infected Patient
An unprecedented Ebola virus (EBOV) epidemic occurred in 2013–2016 in West Africa. During this period the epidemic grew exponentially and spread to Europe and North America, with several imported cases and many health care workers (HCWs) infected. A better understanding of EBOV infection patterns in different body compartments is mandatory to develop new countermeasures, as well as to fully comprehend the pathways of human-to-human transmission. We longitudinally explored the persistence of EBOV-specific negative sense genomic RNA (neg-RNA) and the presence of positive sense RNA (pos-RNA), including both replication intermediate (antigenomic-RNA) and messenger RNA (mRNA) molecules, in the upper and lower respiratory tract, as compared to plasma, in an HCW infected with EBOV in Sierra Leone, who was hospitalized in the high isolation facility of the National Institute for Infectious Diseases “Lazzaro Spallanzani” (INMI), Rome, Italy. We observed persistence of pos-RNA and neg-RNA in longitudinally collected specimens of the lower respiratory tract, even after viral clearance from plasma, suggesting possible local replication. The purpose of the present study is to enhance knowledge of the biological features of EBOV that can contribute to human-to-human transmissibility and to develop effective intervention strategies. However, further investigation is needed to better understand the clinical meaning of viral replication and shedding in the respiratory tract.
Variability in testing policies and impact on reported Clostridium difficile infection rates: results from the pilot Longitudinal European Clostridium difficile Infection Diagnosis surveillance study (LuCID)
Lack of standardised Clostridium difficile testing is a potential confounder when comparing infection rates. We used an observational, systematic, prospective large-scale sampling approach to investigate variability in C. difficile sampling to understand C. difficile infection (CDI) incidence rates. In-patient and institutional data were gathered from 60 European hospitals (across three countries). Testing methodology, testing/CDI rates and case profiles were compared between countries and institution types. The mean annual CDI rate per hospital was lowest in the UK and highest in Italy (1.5 vs. 4.7 cases/10,000 patient bed days [pbds], p < 0.001). The testing rate was highest in the UK compared with Italy and France (50.7/10,000 pbds vs. 31.5 and 30.3, respectively, p < 0.001). Only 58.4% of diarrhoeal samples were tested for CDI across all countries. Overall, only 64% of hospitals used recommended testing algorithms for laboratory testing. Small hospitals were significantly more likely to use standalone toxin tests (SATTs). There was an inverse correlation between hospital size and CDI testing rate. Hospitals using SATT or assays not detecting toxin reported significantly higher CDI rates than those using recommended methods, despite similar testing frequencies. These data are consistent with higher false-positive rates in such (non-recommended) testing scenarios. Cases in Italy, and those diagnosed by SATT or by methods not detecting toxin, were significantly older. Testing occurred significantly earlier in the UK. Assessment of testing practice is paramount to the accurate interpretation and comparison of CDI rates.
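The denominator used throughout the comparison, cases per 10,000 patient bed days (pbds), is a straightforward normalisation; a minimal sketch, with hypothetical hospital figures rather than study data:

```python
def rate_per_10000_pbd(cases: int, patient_bed_days: float) -> float:
    """Express an incidence count per 10,000 patient bed days (pbds),
    the denominator used when comparing CDI rates between hospitals."""
    return cases / patient_bed_days * 10_000

# Hypothetical figures (not from the study): 47 cases over 100,000 pbds
print(round(rate_per_10000_pbd(47, 100_000), 1))  # 4.7
```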
Seroprevalence of five neglected parasitic diseases among immigrants accessing five infectious and tropical diseases units in Italy: a cross-sectional study.
Objectives: This multicentre cross-sectional study aims to estimate the prevalence of five neglected tropical diseases (Chagas disease, filariasis, schistosomiasis, strongyloidiasis, toxocariasis) among immigrants accessing health care facilities in five Italian cities (Bologna, Brescia, Florence, Rome, Verona). Methods: Individuals underwent different sets of serological tests, according to country of origin and presence of eosinophilia. Seropositive patients were treated and followed up. Results: A total of 930 adult immigrants were enrolled: 477 men (51.3%), 445 women (47.9%), 8 transgender (0.8%); median age was 37.81 years (range 18-80). Most came from the African continent (405/930, 43.5%), the rest from East Europe, South America and Asia. Overall, 9.6% (89/930) were diagnosed with at least one of the infections under study. Seroprevalence of each specific infection varied from 3.9% (7/180) for Chagas disease to 9.7% (11/113) for toxocariasis. Seropositive people were more likely to be 35- to 40-year-old males and to come from South East Asia, Sub-Saharan Africa or South America. Conclusions: The results of our study confirm that neglected tropical diseases represent a substantial health problem among immigrants and highlight the need for addressing this emerging public health issue.
Mitochondrial complex 1 activity measured by spectrophotometry is reduced across all brain regions in ageing and more specifically in neurodegeneration
Mitochondrial function, in particular complex 1 of the electron transport chain (ETC), has been shown to decrease during normal ageing and in neurodegenerative disease. However, there is some debate concerning which area of the brain has the greatest complex 1 activity. It is important to identify the pattern of activity in order to be able to gauge the effect of age- or disease-related changes. We determined complex 1 activity spectrophotometrically in the cortex, brainstem and cerebellum of middle-aged mice (70–71 weeks), a cerebellar ataxic neurodegeneration model (pcd5J) and young wild type controls. We share our updated protocol for the measurement of complex 1 activity and find that mitochondrial fractions isolated from frozen tissues yield robust activity measurements. We show that complex 1 activity is clearly highest in the cortex when compared with brainstem and cerebellum (p<0.003). Cerebellum and brainstem mitochondria exhibit similar levels of complex 1 activity in wild type brains. In the aged brain we see similar levels of complex 1 activity in all three brain regions. The specific activity of complex 1 measured in the aged cortex is significantly decreased when compared with controls (p<0.0001). Both the cerebellum and brainstem mitochondria also show significantly reduced activity with ageing (p<0.05). The mouse model of ataxia predictably has lower complex 1 activity in the cerebellum, and although reductions are measured in the cortex and brainstem, the remaining activity is higher than in the aged brains. We present clear evidence that complex 1 activity decreases across the brain with age and much more specifically in the cerebellum of the pcd5J mouse. Mitochondrial impairment can be a region-specific phenomenon in disease, but in ageing appears to affect the entire brain, abolishing the pattern of higher activity in cortical regions.
Disturbance patterns in a socio-ecological system at multiple scales
Ecological systems with hierarchical organization and non-equilibrium dynamics require multiple-scale analyses to comprehend how a system is structured and to formulate hypotheses about regulatory mechanisms. Characteristic scales in real landscapes are determined by, or at least reflect, the spatial patterns and scales of constraining human interactions with the biophysical environment. If the patterns or scales of human actions change, then the constraints change, and the structure and dynamics of the entire socio-ecological system (SES) can change accordingly. Understanding biodiversity in an SES requires understanding how the actions of humans as a keystone species shape the environment across a range of scales. We address this problem by investigating the spatial patterns of human disturbances at multiple scales in an SES in southern Italy. We describe an operational framework to identify multi-scale profiles of short-term anthropogenic disturbances using a moving window algorithm to measure the amount and configuration of disturbance as detected by satellite imagery. Prevailing land uses were found to contribute in different ways to the disturbance gradient at multiple scales, as land uses resulted from other types of biophysical and social controls shaping the region. The resulting profiles were then interpreted with respect to defining critical support regions and scale-dependent models for the assessment and management of disturbances, and for indicating system fragility and resilience of socio-ecological systems in the region. The results suggest support regions and scale intervals where past disturbance has been most likely and clumped, i.e. where fragility is highest and resilience is lowest. We discuss the potential for planning and managing landscape disturbances with a predictable effect on ecological processes.
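The moving window measurement of disturbance amount can be sketched as a sliding-window mean over a binary disturbance map. This is a generic illustration of the technique, not the authors' implementation; window size and the toy grid are arbitrary:

```python
import numpy as np

def disturbance_profile(disturbed: np.ndarray, window: int) -> np.ndarray:
    """Proportion of disturbed cells within a square moving window,
    computed for each interior pixel of a binary disturbance map
    (1 = disturbed, 0 = undisturbed). Edge pixels, where the window
    would fall off the map, are left as NaN. `window` must be odd."""
    h, w = disturbed.shape
    r = window // 2
    out = np.full((h, w), np.nan)
    for i in range(r, h - r):
        for j in range(r, w - r):
            out[i, j] = disturbed[i - r:i + r + 1, j - r:j + r + 1].mean()
    return out

grid = np.zeros((5, 5), dtype=float)
grid[2, 2] = 1.0  # a single disturbed cell in the centre
profile = disturbance_profile(grid, window=3)
print(profile[2, 2])  # 1/9 of the 3x3 window is disturbed
```

Repeating the computation over a range of window sizes yields the kind of multi-scale disturbance profile the framework describes.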
Cardiovascular system and human immunodeficiency virus infection
Although early clinical observations suggested that human immunodeficiency virus (HIV) spared the heart, subsequent experience has shown that cardiovascular diseases in the course of HIV infection are becoming more frequent.
The frequency of these diseases is influenced by different variables, such as survival prolongation in HIV-infected patients, advances in antiretroviral treatment, reduction of immunosuppression and the consequent decrease in opportunistic infections, and adverse effects of some drugs. Cardiac abnormalities in patients with HIV infection may include myocarditis, dilated cardiomyopathy, endocarditis, pericardial effusion and pericarditis, AIDS-related heart tumors (Kaposi's sarcoma and malignant lymphomas), and pulmonary hypertension. The introduction of highly active antiretroviral therapy (HAART) regimens has greatly altered the cardiovascular manifestations of HIV. On one hand, HAART has significantly modified the course of HIV disease, lengthened survival, and improved the quality of life of HIV-infected patients. On the other hand, HAART is associated with acceleration of atherosclerotic arterial disease, both peripheral and coronary. Therefore, periodic detailed cardiovascular monitoring is warranted for all HIV-infected patients, especially those with other known underlying cardiovascular risk factors, for early identification and appropriate treatment of HIV-related cardiovascular diseases. This article reviews clinical aspects of the principal HIV-associated cardiovascular diseases, with an emphasis on new knowledge about the pathogenesis and treatment of such conditions.
Screening for carriage of carbapenem-resistant Enterobacteriaceae in settings of high endemicity: A position paper from an Italian working group on CRE infections
A variety of national and international guidelines exist for the management of carbapenem-resistant Enterobacteriaceae (CRE), but some are several years old, do not reflect current epidemiology, and do not necessarily give pragmatic advice on active surveillance of CRE in countries with a high burden of cases and limited resources. This paper aims to provide a best-practice position to guide active surveillance in a variety of scenarios in these settings, and discusses which patients should be screened, what methods could be used for screening, and how results might influence infection prevention interventions.
