475 research outputs found

    A Fully Integrated Real-Time Detection, Diagnosis, and Control of Community Diarrheal Disease Clusters and Outbreaks (the INTEGRATE Project): Protocol for an Enhanced Surveillance System

    BACKGROUND: Diarrheal disease, which affects 1 in 4 people in the United Kingdom annually, is the most common cause of outbreaks in community and health care settings. Traditional surveillance methods tend to detect point-source outbreaks of diarrhea and vomiting; they are less effective at identifying low-level and intermittent food supply contamination. Furthermore, it can take up to 9 weeks for infections to be confirmed, delaying the recognition of slow-burn outbreaks that can affect hundreds or thousands of people over wide geographical areas. There is a need to address fundamental problems in traditional diarrheal disease surveillance: underreporting by patients and general practitioners (GPs), which leaves many infections unconfirmed; varying submission practices and selective testing of samples in laboratories; limitations in traditional microbiological diagnostics, meaning that sample testing is slow and the etiology of most cases remains unknown; and poorly integrated human and animal surveillance systems, meaning that identification of zoonoses is delayed or missed. OBJECTIVE: This study aims to detect anomalous patterns in the incidence of gastrointestinal disease in the (human) community; to target sampling; to test traditional diagnostic methods against rapid, modern, and sensitive molecular and genomic microbiology methods that identify and characterize the responsible pathogens more quickly and more completely; and to determine the cost-effectiveness of these methods. METHODS: Syndromic surveillance will be used to aid identification of anomalous patterns in microbiological events based on temporal associations, demographic similarities among patients and animals, and changes in trends in acute gastroenteritis cases, using a point process statistical model. Stool samples will be obtained from patients consulting GPs, to improve the timeliness of cluster detection and characterize the pathogens responsible, allowing health protection professionals to investigate and control outbreaks quickly, limiting their size and impact. The cost-effectiveness of the proposed system will be examined using formal cost-utility analysis to inform decisions on national implementation. RESULTS: The project commenced on April 1, 2013. A favorable opinion was obtained from the Research Ethics Committee on June 15, 2015, and the first patient was recruited on October 13, 2015; as of March 2017, 1407 patients had been recruited and their samples processed using traditional laboratory techniques. CONCLUSIONS: The overall aim of this study is to create a new One Health paradigm for detecting and investigating diarrhea and vomiting in the community in near-real time, shifting from passive human surveillance and management of laboratory-confirmed infection toward an integrated, interdisciplinary enhanced surveillance system that includes management of people with symptoms. INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID): DERR1-10.2196/13941
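
    The statistical detection component is only summarized above ("a point process statistical model"). As a rough illustration of the general idea of flagging days whose case counts are improbably high relative to a recent baseline, the sketch below applies a simple Poisson exceedance test to daily counts; it is not the INTEGRATE model, and the trailing-window length and significance threshold are arbitrary assumptions.

    ```python
    # Illustrative only: a simple Poisson exceedance check on daily gastroenteritis
    # counts, standing in for the more sophisticated point process model mentioned
    # in the abstract. Window length and alpha are arbitrary assumptions.
    from math import exp, factorial

    def poisson_tail(k: int, lam: float) -> float:
        """P(X >= k) for X ~ Poisson(lam)."""
        return 1.0 - sum(exp(-lam) * lam ** i / factorial(i) for i in range(k))

    def flag_anomalies(daily_counts: list[int], window: int = 28, alpha: float = 0.01) -> list[int]:
        """Return day indices whose count is improbably high versus the trailing-window mean."""
        flags = []
        for day in range(window, len(daily_counts)):
            baseline = sum(daily_counts[day - window:day]) / window
            if poisson_tail(daily_counts[day], baseline) < alpha:
                flags.append(day)
        return flags

    # Example: four quiet weeks followed by a sudden cluster on the final day.
    counts = [2, 3, 1, 2] * 7 + [12]
    print(flag_anomalies(counts))  # -> [28]
    ```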

    Extracorporeal life support in mitral papillary muscle rupture: Outcome of multicenter study

    Background: Post-acute myocardial infarction papillary muscle rupture (post-AMI PMR) may present with variable clinical scenarios and degrees of emergency as a result of cardiogenic shock. Veno-arterial extracorporeal life support (V-A ECLS) has been proposed to improve extremely poor pre- or postoperative conditions, but information in this respect is scarce. Methods: From the CAUTION (meChanical complicAtion of acUte myocardial infarcTion: an InternatiOnal multiceNter cohort study) database (16 centers, data from 2001 to 2018), we extracted adult patients who were surgically treated for post-AMI PMR and received pre- and/or postoperative V-A ECLS support. The end points of this study were in-hospital survival and ECLS complications. Results: Of 214 post-AMI PMR patients who underwent surgery, V-A ECLS was instituted in 23 (11%). The median age was 61.7 years (range 46-81 years). ECLS was commenced preoperatively in 10 patients (43.5%) and intra- or postoperatively in the remaining 13. The most common V-A ECLS indication was post-cardiotomy shock, followed by preoperative cardiogenic shock and cardiac arrest. The median duration of V-A ECLS was 4 days. V-A ECLS complications occurred in more than half of the patients. Overall in-hospital mortality was 39.2% (9/23), compared with 22% (42/219) in the non-ECLS group. Conclusions: In post-AMI PMR patients, V-A ECLS was used in about 10% of cases, either as a bridge to surgery or as postoperative support. Given the still high perioperative mortality, further investigation is required to better evaluate the potential for wider use of V-A ECLS in this setting and its effects.

    Global patient outcomes after elective surgery: prospective cohort study in 27 low-, middle- and high-income countries.

    BACKGROUND: As global initiatives increase patient access to surgical treatments, there remains a need to understand the adverse effects of surgery and define appropriate levels of perioperative care. METHODS: We designed a prospective international 7-day cohort study of outcomes following elective adult inpatient surgery in 27 countries. The primary outcome was in-hospital complications. Secondary outcomes were death following a complication (failure to rescue) and death in hospital. Process measures were admission to critical care immediately after surgery or to treat a complication and duration of hospital stay. A single definition of critical care was used for all countries. RESULTS: A total of 474 hospitals in 19 high-, 7 middle- and 1 low-income country were included in the primary analysis. Data included 44 814 patients with a median hospital stay of 4 (range 2-7) days. A total of 7508 patients (16.8%) developed one or more postoperative complications and 207 died (0.5%). The overall mortality among patients who developed complications was 2.8%. Mortality following complications ranged from 2.4% for pulmonary embolism to 43.9% for cardiac arrest. A total of 4360 (9.7%) patients were admitted to a critical care unit as routine immediately after surgery, of whom 2198 (50.4%) developed a complication, with 105 (2.4%) deaths. A total of 1233 patients (16.4%) were admitted to a critical care unit to treat complications, with 119 (9.7%) deaths. Despite lower baseline risk, outcomes were similar in low- and middle-income compared with high-income countries. CONCLUSIONS: Poor patient outcomes are common after inpatient surgery. Global initiatives to increase access to surgical treatments should also address the need for safe perioperative care. STUDY REGISTRATION: ISRCTN5181700
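
    As a rough consistency check on the figures above (and on the hedged assumption, implied by the failure-to-rescue definition, that the in-hospital deaths occurred among the patients who developed a complication), the reported percentages follow from the raw counts:

    \[ \text{mortality after a complication} \approx \frac{207}{7508} \approx 2.8\%, \qquad \text{in-hospital mortality} = \frac{207}{44\,814} \approx 0.5\%. \]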

    ϒ production in p–Pb collisions at √sNN = 8.16 TeV

    ϒ production in p–Pb interactions is studied at the centre-of-mass energy per nucleon–nucleon collision √sNN = 8.16 TeV with the ALICE detector at the CERN LHC. The measurement is performed reconstructing bottomonium resonances via their dimuon decay channel, in the centre-of-mass rapidity intervals 2.03 < ycms < 3.53 and −4.46 < ycms < −2.96, down to zero transverse momentum. In this work, results on the ϒ(1S) production cross section as a function of rapidity and transverse momentum are presented. The corresponding nuclear modification factor shows a suppression of the ϒ(1S) yields with respect to pp collisions, both at forward and backward rapidity. This suppression is stronger in the low transverse momentum region and shows no significant dependence on the centrality of the interactions. Furthermore, the ϒ(2S) nuclear modification factor is evaluated, suggesting a suppression similar to that of the ϒ(1S). A first measurement of the ϒ(3S) has also been performed. Finally, results are compared with previous ALICE measurements in p–Pb collisions at √sNN = 5.02 TeV and with theoretical calculations.
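
    For context, the nuclear modification factor discussed above is conventionally defined as the production cross section in p–Pb collisions divided by the pp cross section scaled by the lead mass number; the precise pp reference and normalization used by ALICE are detailed in the paper. A value below unity, as reported here for the ϒ(1S), indicates suppression relative to the pp expectation:

    \[ R_{\mathrm{pPb}}(y, p_{\mathrm{T}}) = \frac{\mathrm{d}^{2}\sigma_{\mathrm{pPb}} / \mathrm{d}y\, \mathrm{d}p_{\mathrm{T}}}{A_{\mathrm{Pb}}\, \mathrm{d}^{2}\sigma_{pp} / \mathrm{d}y\, \mathrm{d}p_{\mathrm{T}}}, \qquad A_{\mathrm{Pb}} = 208. \]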

    Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART treated individuals with HIV-1 infection: A case control study

    Introduction: The use of anti-retroviral therapy (ART) has dramatically reduced HIV-1 associated morbidity and mortality. However, HIV-1 infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1 infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs are reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs in predicting mortality in HIV-1 infected, ART treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular, and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6, and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular, or malignancy mortality. The levels of three miRNAs (miR-21, miR-122, and miR-200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miR-31, miR-150, and miR-223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found for any circulating miRNA studied. These results cast doubt on the effectiveness of circulating miRNAs as early predictors of mortality or of the major underlying diseases that contribute to mortality in participants treated for HIV-1 infection.

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as a confirmed (>3 mo apart) eGFR <= 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with correspondingly higher risks in the medium and high risk groups (risk score >= 5; 505 events). Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection with direct clinical relevance, allowing patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.
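
    The abstract describes the score construction only at a high level: scaled adjusted incidence rate ratios for nine predictors are summed per patient. The sketch below shows the shape of such a points-based score. The predictor names follow the abstract, but the point values are hypothetical placeholders rather than the published D:A:D weights, and the real model also assigns negative points to some reference categories (hence the median baseline score of -2).

    ```python
    # Hypothetical sketch of a points-based CKD risk score of the kind described
    # above: per-predictor points (derived in the real model from scaled adjusted
    # incidence rate ratios) are summed for each patient. The values below are
    # placeholders, NOT the published D:A:D weights.
    HYPOTHETICAL_POINTS = {
        "older_age": 4,
        "intravenous_drug_use": 2,
        "hepatitis_c_coinfection": 1,
        "lower_baseline_egfr": 4,
        "female_gender": 1,
        "lower_cd4_nadir": 1,
        "hypertension": 1,
        "diabetes": 2,
        "cardiovascular_disease": 2,
    }

    def ckd_risk_score(patient: dict) -> int:
        """Sum the points for every risk factor flagged True in `patient`."""
        return sum(pts for factor, pts in HYPOTHETICAL_POINTS.items()
                   if patient.get(factor, False))

    # Example: a patient with hypertension and diabetes only scores 1 + 2 = 3.
    print(ckd_risk_score({"hypertension": True, "diabetes": True}))
    ```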

    A recognition advantage for members of higher-status racial groups.


    Generalized Ingroup Recognition Advantage
