
    Contacts of Retreatment Tuberculosis Cases with a Prior Poor Treatment Outcome are at Increased Risk of Latent Tuberculosis Infection

    Objectives: To estimate the prevalence of and risk factors for latent tuberculosis infection (LTBI) among contacts of index patients with tuberculosis (TB) with a prior history of active TB disease and TB treatment (retreatment cases). Methods: A cross-sectional population-based study was conducted using data from the national TB contact surveillance program in the country of Georgia. Contacts of retreatment cases were investigated and tuberculin skin testing was offered. Bivariate and multivariable analyses were performed to calculate odds ratios (OR) and 95% confidence intervals for risk of LTBI among contacts. Results: The prevalence of LTBI was significantly higher among contacts whose index TB patient had had a prior unfavorable treatment outcome compared to those who had had a favorable outcome (OR 3.14). Contacts whose index TB case had previously failed therapy (OR 6.43), was lost to follow-up (OR 5.63), or had completed treatment (OR 3.33) had a significantly higher prevalence of LTBI compared to contacts of previously cured TB cases. Conclusions: Among contacts of active TB retreatment cases, the risk of LTBI was related to the outcome of the index case’s previous TB treatment. Efforts aimed at reducing loss to follow-up during treatment should be emphasized to strengthen TB control and may also decrease LTBI and active TB among contacts.

    Tobacco Smoking and Tuberculosis Treatment Outcomes: A Prospective Cohort Study in Georgia

    Objective To assess the effect of tobacco smoking on the outcome of tuberculosis treatment in Tbilisi, Georgia. Methods We conducted a prospective cohort study of adults with laboratory-confirmed tuberculosis from May 2011 to November 2013. History of tobacco smoking was collected using a standardized questionnaire adapted from the global adult tobacco survey. We considered tuberculosis therapy to have a poor outcome if participants defaulted, failed treatment or died. We used multivariable regressions to estimate the risk of a poor treatment outcome. Findings Of the 591 tuberculosis patients enrolled, 188 (31.8%) were past smokers and 271 (45.9%) were current smokers. Ninety (33.2%) of the current smokers and 24 (18.2%) of the participants who had never smoked had previously been treated for tuberculosis (P < 0.01). Treatment outcome data were available for 524 of the participants, of whom 128 (24.4%) – including 80 (32.9%) of the 243 current smokers and 21 (17.2%) of the 122 individuals who had never smoked – had a poor treatment outcome. Compared with those who had never smoked, current smokers had an increased risk of poor treatment outcome (adjusted relative risk, aRR: 1.70; 95% confidence interval, CI: 1.00–2.90). Those who had ceased smoking more than two months before enrolment did not have such an increased risk (aRR: 1.01; 95% CI: 0.51–1.99). Conclusion There is a high prevalence of smoking among patients with tuberculosis in Georgia and smoking increases the risk of a poor treatment outcome.
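
    A crude (unadjusted) relative risk can be recomputed directly from the counts reported in this abstract. The short sketch below shows that arithmetic only; the published adjusted figure of 1.70 is lower because the multivariable model controls for confounders.

```python
# Crude relative risk of a poor treatment outcome, current smokers versus
# never smokers, using the counts reported in the abstract above.
poor_current, n_current = 80, 243   # current smokers: poor outcomes / total with outcome data
poor_never, n_never = 21, 122       # never smokers: poor outcomes / total with outcome data

risk_current = poor_current / n_current   # ~0.33
risk_never = poor_never / n_never         # ~0.17
crude_rr = risk_current / risk_never

print(f"Crude RR: {crude_rr:.2f}")  # ~1.91; the adjusted RR reported is 1.70
```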

    Characteristics and Antibiotic Use Associated With Short-Term Risk of Clostridium difficile Infection Among Hospitalized Patients

    Objectives—Polymerase chain reaction (PCR) has been shown to have an excellent sensitivity and specificity for the detection of Clostridium difficile infection (CDI). Little is known about risk factors for CDI within 14 days of an initial negative test. We sought to determine the characteristics of hospitalized patients associated with risk of short-term acquisition of CDI. Methods—A case-control study was conducted. Cases were patients who converted from PCR negative to positive within 14 days. Each case was matched with three controls. Conditional logistic regression was used to estimate the association between patient characteristics and CDI. Results—Of the 30 patients in our study who had a positive PCR within 14 days of a first negative PCR (cases), 15 (50%) converted within 7 days of the initial test. Cases had a higher proportion of intravenous vancomycin use in the previous 8 weeks (odds ratio [OR], 3.38; 95% confidence interval [CI], 1.34-8.49) and were less likely to have recent antiviral agent use (OR, 0.30; 95% CI, 0.11-0.83) compared with controls. Conclusions—In hospitalized patients, treatment with intravenous vancomycin in the 8 weeks before a first negative PCR test for C difficile is a risk factor for short-term acquisition of hospital-acquired CDI. Repeat testing guidelines for C difficile PCR should take into consideration patients who may be at high risk for short-term acquisition of CDI.

    International criteria for electrocardiographic interpretation in athletes: Consensus statement.

    Sudden cardiac death (SCD) is the leading cause of mortality in athletes during sport. A variety of mostly hereditary, structural or electrical cardiac disorders are associated with SCD in young athletes, the majority of which can be identified or suggested by abnormalities on a resting 12-lead electrocardiogram (ECG). Whether the ECG is used for diagnostic or screening purposes, physicians responsible for the cardiovascular care of athletes should be knowledgeable and competent in ECG interpretation in athletes. However, in most countries a shortage of physician expertise limits wider application of the ECG in the care of the athlete. A critical need exists for physician education in modern ECG interpretation that distinguishes normal physiological adaptations in athletes from distinctly abnormal findings suggestive of underlying pathology. Since the original 2010 European Society of Cardiology recommendations for ECG interpretation in athletes, ECG standards have evolved quickly, advanced by a growing body of scientific data and investigations that both examine proposed criteria sets and establish new evidence to guide refinements. On 26-27 February 2015, an international group of experts in sports cardiology, inherited cardiac disease, and sports medicine convened in Seattle, Washington (USA), to update contemporary standards for ECG interpretation in athletes. The objective of the meeting was to define and revise ECG interpretation standards based on new and emerging research and to develop a clear guide to the proper evaluation of ECG abnormalities in athletes. This statement represents an international consensus for ECG interpretation in athletes and provides expert opinion-based recommendations linking specific ECG abnormalities and the secondary evaluation for conditions associated with SCD.

    Italian guidelines for primary headaches: 2012 revised version

    The first edition of the Italian diagnostic and therapeutic guidelines for primary headaches in adults was published in J Headache Pain 2(Suppl. 1):105–190 (2001). Ten years later, the guideline committee of the Italian Society for the Study of Headaches (SISC) decided it was time to update the therapeutic guidelines. A literature search was carried out on the Medline database, and all articles on primary headache treatments in English, German, French and Italian published from February 2001 to December 2011 were taken into account. Only randomized controlled trials (RCT) and meta-analyses were analysed for each drug. If RCT were lacking, open studies and case series were also examined. As in the previous edition, four levels of recommendation were defined on the basis of levels of evidence, scientific strength of evidence and clinical effectiveness. Recommendations for symptomatic and prophylactic treatment of migraine and cluster headache were therefore revised with respect to the previous 2001 guidelines, and a section was dedicated to non-pharmacological treatment. This article reports a summary of the revised version published in extenso in Italian.

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors, and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013, p < 0.001), with the probability of an operation lasting > 90 min increasing more than eightfold, from 5.1 to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
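
    The two methodological steps described above, converting a multivariable logistic regression model into a points-based risk score and then checking its discrimination in a separate cohort with a ROC curve, can be sketched roughly as follows. The predictor names, coefficients and validation data are hypothetical and are not taken from the CholeS model.

```python
# Sketch: turn logistic-regression coefficients into an integer risk score and
# check discrimination on a validation cohort with the area under the ROC curve.
# All coefficients and data below are invented for illustration.
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical log-odds coefficients from a "long operation" (> 90 min) model
coefficients = {"asa_3_plus": 0.65, "bmi_over_30": 0.48, "thick_gallbladder_wall": 0.90}

# A common scoring-tool device: divide by the smallest coefficient and round,
# so each predictor contributes a small integer number of points.
smallest = min(coefficients.values())
points = {name: round(beta / smallest) for name, beta in coefficients.items()}

def risk_score(patient):
    """Sum the points for every risk factor the patient has."""
    return sum(points[name] for name, present in patient.items() if present)

# Hypothetical external validation cohort: predictor flags and observed outcome
validation = [
    ({"asa_3_plus": 1, "bmi_over_30": 1, "thick_gallbladder_wall": 0}, 1),
    ({"asa_3_plus": 0, "bmi_over_30": 0, "thick_gallbladder_wall": 0}, 0),
    ({"asa_3_plus": 0, "bmi_over_30": 1, "thick_gallbladder_wall": 1}, 1),
    ({"asa_3_plus": 0, "bmi_over_30": 0, "thick_gallbladder_wall": 1}, 0),
]
scores = np.array([risk_score(p) for p, _ in validation])
outcomes = np.array([y for _, y in validation])
print("AUC on the validation cohort:", roc_auc_score(outcomes, scores))
```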

    Global economic burden of unmet surgical need for appendicitis

    Background: There is a substantial gap in provision of adequate surgical care in many low- and middle-income countries. This study aimed to identify the economic burden of unmet surgical need for the common condition of appendicitis. Methods: Data on the incidence of appendicitis from 170 countries and two different approaches were used to estimate numbers of patients who do not receive surgery: as a fixed proportion of the total unmet surgical need per country (approach 1); and based on country income status (approach 2). Indirect costs with current levels of access and local quality, and those if quality were at the standards of high-income countries, were estimated. A human capital approach was applied, focusing on the economic burden resulting from premature death and absenteeism. Results: Excess mortality was 4185 per 100 000 cases of appendicitis using approach 1 and 3448 per 100 000 using approach 2. The economic burden of continuing current levels of access and local quality was US $92 492 million using approach 1 and US $73 141 million using approach 2. The economic burden of not providing surgical care to the standards of high-income countries was US $95 004 million using approach 1 and US $75 666 million using approach 2. The largest share of these costs resulted from premature death (97.7 per cent) and lack of access (97.0 per cent) in contrast to lack of quality. Conclusion: For a comparatively non-complex emergency condition such as appendicitis, increasing access to care should be prioritized. Although improving quality of care should not be neglected, increasing provision of care at current standards could reduce societal costs substantially.
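
    The human capital approach mentioned above values premature death as the discounted stream of future earnings it removes from the economy. The toy calculation below illustrates that arithmetic only; every figure in it is invented and none comes from the study.

```python
# Toy human-capital calculation: value excess deaths from untreated appendicitis
# as the present value of lost future earnings. All numbers are hypothetical.
cases_without_surgery = 100_000     # untreated appendicitis cases
excess_deaths_per_100k = 4_000      # excess deaths per 100 000 untreated cases
annual_earnings_usd = 5_000         # average annual earnings per worker
working_years_lost = 30             # remaining productive years per death
discount_rate = 0.03                # annual discount rate

excess_deaths = cases_without_surgery * excess_deaths_per_100k / 100_000

# Present value of one lost earnings stream (standard annuity formula)
pv_per_death = annual_earnings_usd * (1 - (1 + discount_rate) ** -working_years_lost) / discount_rate

burden_usd = excess_deaths * pv_per_death
print(f"Excess deaths: {excess_deaths:,.0f}")
print(f"Burden from premature death: ${burden_usd / 1e6:,.1f} million")
```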

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
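
    The abstract refers to multivariable logistic regression with bootstrapped simulation. The minimal sketch below shows only the bootstrap idea, a percentile confidence interval around a single unadjusted odds ratio on simulated data, so it is a schematic of the resampling step rather than the study's adjusted analysis.

```python
# Sketch: unadjusted odds ratio for 30-day mortality by checklist use,
# with a percentile bootstrap confidence interval. Data are simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000
checklist = rng.integers(0, 2, n).astype(bool)   # True = checklist used
# Simulate lower mortality when the checklist is used (true OR roughly 0.6)
p_death = np.where(checklist, 0.06, 0.10)
died = rng.random(n) < p_death

def odds_ratio(used, dead):
    a = np.sum(used & dead)      # checklist, died
    b = np.sum(used & ~dead)     # checklist, survived
    c = np.sum(~used & dead)     # no checklist, died
    d = np.sum(~used & ~dead)    # no checklist, survived
    return (a * d) / (b * c)

point = odds_ratio(checklist, died)
boot = []
for _ in range(2_000):
    idx = rng.integers(0, n, n)              # resample patients with replacement
    boot.append(odds_ratio(checklist[idx], died[idx]))
low, high = np.percentile(boot, [2.5, 97.5])
print(f"OR {point:.2f} (95% CI {low:.2f} to {high:.2f})")
```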

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.
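
    A crude odds ratio for end colostomy in low- versus high-HDI settings can be recomputed from the proportions reported above; the short sketch shows that arithmetic. The abstract's OR of 3·20 comes from a risk-adjusted multilevel model, so it is not directly comparable with this crude figure.

```python
# Crude odds ratio for end colostomy, low- versus high-HDI settings,
# recomputed from the proportions reported in the abstract above.
p_low, p_high = 0.522, 0.189   # end colostomy proportions in low- and high-HDI groups

odds_low = p_low / (1 - p_low)
odds_high = p_high / (1 - p_high)
crude_or = odds_low / odds_high

print(f"Crude OR (low vs high HDI): {crude_or:.2f}")  # ~4.7; risk-adjusted OR reported: 3.20
```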