
    Observational study to estimate the changes in the effectiveness of bacillus Calmette-Guérin (BCG) vaccination with time since vaccination for preventing tuberculosis in the UK.

    BACKGROUND: Until recently, evidence that protection from bacillus Calmette-Guérin (BCG) vaccination lasted beyond 10 years was limited. In the past few years, studies in Brazil and in Native Americans in the USA have suggested that protection from BCG vaccination against tuberculosis (TB) in childhood can last for several decades. The UK's universal school-age BCG vaccination programme was stopped in 2005, and the programme of selective vaccination of high-risk (usually minority ethnic) infants was enhanced. OBJECTIVE: To assess the duration of protection of infant and school-age BCG vaccination against TB in the UK. METHODS: Two case-control studies of the duration of protection of BCG vaccination were conducted: the first in minority ethnic groups who were eligible for infant BCG vaccination 0-19 years earlier, and the second in white subjects eligible for school-age BCG vaccination 10-29 years earlier. TB cases were selected from notifications to the UK national Enhanced Tuberculosis Surveillance system from 2003 to 2012. Population-based control subjects, frequency matched for age, were recruited. BCG vaccination status was established from BCG records, scar reading and BCG history. Information on potential confounders was collected using computer-assisted interviews. Vaccine effectiveness was estimated as a function of time since vaccination, using a case-cohort analysis based on Cox regression. RESULTS: In the infant BCG study, vaccination status was determined using vaccination records, as recall was poor and concordance between records and scar reading was limited. A protective effect was seen up to 10 years after infant vaccination [< 5 years since vaccination: vaccine effectiveness (VE) 66%, 95% confidence interval (CI) 17% to 86%; 5-10 years since vaccination: VE 75%, 95% CI 43% to 89%], but there was only weak evidence of an effect 10-15 years after vaccination (VE 36%, 95% CI negative to 77%; p = 0.396). The analyses of the protective effect of infant BCG vaccination were adjusted for confounders, including birth cohort and ethnicity. For school-age BCG vaccination, VE was 51% (95% CI 21% to 69%) 10-15 years after vaccination and 57% (95% CI 33% to 72%) 15-20 years after vaccination, beyond which time protection appeared to wane. Ascertainment of vaccination status was based on self-reported history and scar reading. LIMITATIONS: The difficulty of examining vaccination sites in older women in the high-risk minority ethnic study population, and the sparsity of vaccine record data in the later time periods, precluded robust assessment of protection from infant BCG vaccination > 10 years after vaccination. CONCLUSIONS: Infant BCG vaccination in a population at high risk for TB was shown to provide protection for at least 10 years, whereas in the white population school-age vaccination was shown to provide protection for at least 20 years. This evidence may inform TB vaccination programmes (e.g. the timing of administration of improved TB vaccines, if they become available) and cost-effectiveness studies. Methods to deal with missing record data in the infant study could be explored, including the use of scar reading. FUNDING: The National Institute for Health Research (NIHR) Health Technology Assessment programme. During the conduct of the study, Jonathan Sterne, Ibrahim Abubakar and Laura C Rodrigues received other funding from the NIHR; Ibrahim Abubakar and Laura C Rodrigues have also received funding from the Medical Research Council. Punam Mangtani received funding from the Biotechnology and Biological Sciences Research Council.
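In the case-cohort Cox analysis described above, vaccine effectiveness is the complement of the adjusted hazard ratio, VE = (1 − HR) × 100%. A minimal sketch of that conversion (the HR of 0.34 below is back-calculated from the reported VE of 66%, not taken from the study's data):

```python
# Sketch: converting a Cox-model hazard ratio into vaccine effectiveness,
# VE = (1 - HR) x 100%. The input numbers are illustrative, back-calculated
# from the reported estimates rather than taken from the study's raw data.

def vaccine_effectiveness(hr, ci_low, ci_high):
    """Return VE and its 95% CI (all in %) from a hazard ratio and its CI.

    The CI bounds swap because VE is a decreasing function of HR.
    """
    ve = (1 - hr) * 100
    return ve, (1 - ci_high) * 100, (1 - ci_low) * 100

# An HR of 0.34 (95% CI 0.14 to 0.83) corresponds to VE 66% (17% to 86%),
# matching the < 5-years-since-vaccination estimate quoted above.
ve, lo, hi = vaccine_effectiveness(0.34, 0.14, 0.83)
print(f"VE = {ve:.0f}% (95% CI {lo:.0f}% to {hi:.0f}%)")
```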

    Time of Day and its Association with Risk of Death and Chance of Discharge in Critically Ill Patients: A Retrospective Study.

    Outcomes following admission to intensive care units (ICUs) may vary with the time of day. This study investigated associations between time of day and the risk of ICU mortality and chance of ICU discharge in acute ICU admissions. Adult patients (age ≥ 18 years) admitted to ICUs participating in the Austrian intensive care database for medical or surgical urgencies and emergencies between January 2012 and December 2016 were included in this retrospective study. Readmissions were excluded. Statistical analysis was conducted using the Fine-and-Gray proportional subdistribution hazards model for ICU mortality and ICU discharge within 30 days, adjusted for SAPS 3 score. In total, 110,628 admissions were analysed. ICU admission during the late night and early morning was associated with an increased hazard of ICU mortality (HR 1.17, 95% CI 1.08-1.28 for 00:00-03:59; HR 1.16, 95% CI 1.05-1.29 for 04:00-07:59). Risk of death in the ICU decreased over the day (lowest HR 0.475, 95% CI 0.432-0.522 for 00:00-03:59). Hazards for discharge from the ICU dropped sharply after 16:00 (lowest HR 0.024, 95% CI 0.019-0.029 for 00:00-03:59). We conclude that there are "time effects" in ICUs. These findings may spark further quality improvement efforts.
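The Fine-and-Gray model used here treats ICU discharge as a competing risk for ICU death (and vice versa). Its nonparametric counterpart is the cumulative incidence function; the sketch below estimates it with the Aalen-Johansen approach on invented data, not the study's:

```python
# Sketch: a nonparametric cumulative incidence function (Aalen-Johansen
# estimator) for two competing events, e.g. ICU death (event 1) vs ICU
# discharge (event 2). The data are invented for illustration; the study
# fits a Fine-and-Gray subdistribution hazards model on top of this idea.

def cumulative_incidence(times, events, event_of_interest):
    """times: event/censoring times; events: 0 = censored, 1 or 2 = cause.
    Returns a list of (t, CIF(t)) for the event of interest."""
    data = sorted(zip(times, events))
    n = len(data)
    surv = 1.0   # overall event-free survival just before the current time
    cif = 0.0
    out = []
    at_risk = n
    i = 0
    while i < n:
        t = data[i][0]
        d_interest = d_any = 0
        j = i
        while j < n and data[j][0] == t:   # pool tied times
            ev = data[j][1]
            if ev == event_of_interest:
                d_interest += 1
            if ev != 0:
                d_any += 1
            j += 1
        cif += surv * d_interest / at_risk       # cause-specific increment
        surv *= 1 - d_any / at_risk              # all-cause survival update
        at_risk -= (j - i)                       # remove events and censorings
        out.append((t, cif))
        i = j
    return out

times  = [1, 2, 2, 3, 4, 5, 6]
events = [2, 1, 2, 0, 1, 2, 2]   # mostly discharges, two deaths, one censored
print(cumulative_incidence(times, events, event_of_interest=1))
```

By construction the two cause-specific curves and the event-free survival partition the cohort, which is a useful sanity check on any competing-risks code.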

    Weekends affect mortality risk and chance of discharge in critically ill patients: a retrospective study in the Austrian registry for intensive care.

    BACKGROUND: In this study, we primarily investigated whether ICU admission or ICU stay at weekends (Saturday and Sunday) is associated with a different risk of ICU mortality or chance of ICU discharge than ICU admission or ICU stay on weekdays (Monday to Friday). Secondarily, we analysed whether weekend ICU admission or ICU stay influences the risk of hospital mortality or chance of hospital discharge. METHODS: A retrospective study was performed for all adult patients admitted to 119 ICUs participating in the benchmarking project of the Austrian Centre for Documentation and Quality Assurance in Intensive Care (ASDI) between 2012 and 2015. Readmissions to the ICU during the same hospital stay were excluded. RESULTS: In a multivariable competing risk analysis, a strong weekend effect was observed. Patients admitted to ICUs on Saturday or Sunday had a higher mortality risk after adjustment for severity of illness by Simplified Acute Physiology Score (SAPS) 3, year, month of the year, type of admission, ICU, and weekday of death or discharge. Hazard ratios (95% confidence intervals) for death in the ICU following admission on a Saturday or Sunday, compared with Wednesday, were 1.15 (1.08-1.23) and 1.11 (1.03-1.18), respectively. Lower hazard ratios were observed for dying on a Saturday (0.93 (0.87-1.00)) or Sunday (0.85 (0.80-0.91)) compared with Wednesday. This is probably related to the reduced chance of being discharged from the ICU at the weekend (0.63 (0.62-0.64) for Saturday and 0.56 (0.55-0.57) for Sunday). Similar results were found for hospital mortality and hospital discharge following ICU admission. CONCLUSIONS: Patients admitted to ICUs at weekends are at increased risk of death in both the ICU and the hospital, even after rigorous adjustment for severity of illness. Conversely, death in the ICU and discharge from the ICU are significantly less likely at weekends.

    Non-Adherence in Patients on Peritoneal Dialysis: A Systematic Review

    Background: It has been increasingly recognized that non-adherence is an important factor that determines the outcome of peritoneal dialysis (PD) therapy. There is therefore a need to establish the levels of non-adherence to different aspects of the PD regimen (dialysis procedures, medications, and dietary/fluid restrictions). Methods: A systematic review of peer-reviewed literature was performed in the PubMed, PsycINFO and CINAHL databases using PRISMA guidelines in May 2013. Publications on non-adherence in PD were selected by two reviewers independently according to predefined inclusion and exclusion criteria. Relevant data on patient characteristics, measures, rates and factors associated with non-adherence were extracted. The quality of studies was also evaluated independently by two reviewers according to a revised version of the Effective Public Health Practice Project assessment tool. Results: The search retrieved 204 studies, of which 25 met the inclusion criteria. Reported rates of non-adherence varied across studies: 2.6-53% for dialysis exchanges, 3.9-85% for medication, and 14.4-67% for diet/fluid restrictions. Methodological differences in the measurement and definition of non-adherence underlie the observed variation. Factors associated with non-adherence that showed a degree of consistency were mostly socio-demographic, such as age, employment status, ethnicity, sex, and time on PD treatment. Conclusion: Non-adherence to different dimensions of the dialysis regimen appears to be prevalent in PD patients. There is a need for further, high-quality research to explore these factors in more detail, with the aim of informing intervention designs to facilitate adherence in this patient population.

    The spatial distribution of leprosy cases during 15 years of a leprosy control program in Bangladesh: An observational study

    BACKGROUND: An uneven spatial distribution of leprosy can be caused by the influence of geography on the distribution of risk factors over the area, or by population characteristics that are heterogeneously distributed over the area. We studied the distribution of leprosy cases detected by a control program to identify spatial and spatio-temporal patterns of occurrence and to search for environmental risk factors for leprosy. METHODS: The houses of 11,060 leprosy cases registered in the control area during a 15-year period (1989-2003) were traced back, added to a geographic information system (GIS) database, and plotted on digital maps. We looked for clusters of cases in space and time. Furthermore, relationships with the proximity to geographic features, such as town centers, roads, rivers, and clinics, were studied. RESULTS: Several spatio-temporal clusters were observed for voluntarily reported cases. The cases within and outside clusters did not differ in age at detection, percentage with multibacillary leprosy, or sex ratio. There was no indication of spread from one point to other parts of the district, indicating a spatially stable endemic situation during the study period. The overall risk of leprosy in the district was not associated with roads, rivers, or leprosy clinics. The risk was highest within 1 kilometer of town centers and decreased with distance from town centers. CONCLUSION: The association of leprosy risk with proximity to towns indicates that rural towns may play an important role in the epidemiology of leprosy in this district. Further research on the role of towns, particularly in rural areas, is warranted.
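The proximity analyses above rest on distance-to-feature computations in the GIS. A minimal sketch of one such computation, the great-circle distance from a case's house to the nearest town center, using hypothetical coordinates rather than the study's data:

```python
# Sketch: the kind of distance-to-feature computation behind a GIS proximity
# analysis: great-circle (haversine) distance from a case's house to the
# nearest town center. All coordinates are hypothetical, not from the study.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_town_km(house, towns):
    """Distance from a house to the closest town center in the list."""
    return min(haversine_km(*house, *town) for town in towns)

towns = [(23.90, 89.12), (23.95, 89.20)]   # hypothetical town centers
house = (23.91, 89.13)
print(f"{nearest_town_km(house, towns):.2f} km")   # well under 2 km here
```

Binning such distances (e.g. within 1 km of a town center versus farther away) is what turns this into the risk-by-proximity comparison reported above.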

    Improving antibiotic prescribing for adults with community acquired pneumonia: Does a computerised decision support system achieve more than academic detailing alone? – a time series analysis

    BACKGROUND: The ideal method to encourage the uptake of clinical guidelines in hospitals is not known, and several strategies have been suggested. This study evaluates the impact of academic detailing and a computerised decision support system (CDSS) on clinicians' prescribing behaviour for patients with community acquired pneumonia (CAP). METHODS: The management of all patients presenting to the emergency department over three successive time periods was evaluated: the baseline, academic detailing and CDSS periods. The rate of empiric antibiotic prescribing that was concordant with recommendations was studied over time, comparing pre- and post-intervention periods and using an interrupted time series analysis. RESULTS: The odds ratio for concordant therapy in the academic detailing period compared with the baseline period, after adjustment for age, illness severity and suspicion of aspiration, was OR = 2.79 [1.88, 4.14], p < 0.01; for the CDSS period compared with the academic detailing period it was OR = 1.99 [1.07, 3.69], p = 0.02. During the first months of the CDSS period, an improvement in the appropriateness of antibiotic prescribing was demonstrated that was greater than that expected to have occurred with time and academic detailing alone, based on predictions from a binary logistic model. CONCLUSION: Deployment of a computerised decision support system was associated with an early improvement in antibiotic prescribing practices that was greater than the changes seen with academic detailing. The sustainability of this intervention requires further evaluation.
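The odds ratios above come from a fitted logistic model, but the unadjusted version of such an estimate is just the cross-product of a 2x2 table with a Wald confidence interval on the log-odds scale. A sketch with invented counts (these do not reproduce the study's adjusted ORs):

```python
# Sketch: an unadjusted odds ratio with a Wald 95% CI from a 2x2 table of
# concordant vs non-concordant prescribing in two periods. The counts are
# invented for illustration and do not reproduce the study's estimates.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b: concordant/non-concordant in period 2; c, d: same in period 1.

    Returns (OR, lower, upper) using the standard error of log(OR):
    SE = sqrt(1/a + 1/b + 1/c + 1/d).
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(80, 20, 60, 40)   # hypothetical counts
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

The study's reported ORs additionally adjust for age, illness severity and suspicion of aspiration, which this unadjusted calculation cannot do.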

    Metabolite Cross-Feeding Enhances Virulence in a Model Polymicrobial Infection

    Microbes within polymicrobial infections often display synergistic interactions resulting in enhanced pathogenesis; however, the molecular mechanisms governing these interactions are not well understood. Development of model systems that allow detailed mechanistic studies of polymicrobial synergy is a critical step towards a comprehensive understanding of these infections in vivo. In this study, we used a model polymicrobial infection comprising the opportunistic pathogen Aggregatibacter actinomycetemcomitans and the commensal Streptococcus gordonii to examine the importance of metabolite cross-feeding for establishing co-culture infections. Our results reveal that co-culture with S. gordonii enhances the pathogenesis of A. actinomycetemcomitans in a murine abscess model of infection. Interestingly, the ability of A. actinomycetemcomitans to utilize L-lactate as an energy source is essential for these co-culture benefits. Surprisingly, inactivation of L-lactate catabolism had no impact on mono-culture growth in vitro or in vivo, suggesting that A. actinomycetemcomitans L-lactate catabolism is only critical for establishing co-culture infections. These results demonstrate that metabolite cross-feeding is critical for A. actinomycetemcomitans to persist in a polymicrobial infection with S. gordonii, supporting the idea that the metabolic properties of commensal bacteria alter the course of pathogenesis in polymicrobial communities.

    Subcutaneous dissociative conscious sedation (sDCS) an alternative method for airway regional blocks: a new approach

    BACKGROUND: Predicted difficult airway is a definite indication for awake intubation with spontaneous ventilation. Airway regional blocks, which are commonly used to facilitate awake intubation, are sometimes impossible or contraindicated. On the other hand, deep sedation can be life-threatening in the case of a compromised airway. The aim of this study was to evaluate "subcutaneous dissociative conscious sedation" (sDCS) as an alternative to airway regional blocks for awake intubation. METHODS: In this prospective, non-randomized study, 30 patients with predicted difficult airway (laryngeal tumors) who were scheduled for direct laryngoscopic biopsy (DLB) underwent sDCS, achieved with intravenous fentanyl 3-4 µg/kg and subcutaneous ketamine 0.6-0.7 mg/kg. The tongue and pharynx were anesthetized with lidocaine spray (4%). Ten minutes after the subcutaneous injection of ketamine, direct laryngoscopy was performed. Extra doses of fentanyl 50-100 µg were administered if the patient was not cooperative enough for laryngoscopy. Patients were evaluated for hemodynamic stability (heart rate and blood pressure), oxygen saturation (SpO2), patient cooperation (obeying the request to open the mouth for laryngoscopy, and the number of laryngoscopy attempts), patient comfort (remaining motionless), hallucination, nystagmus and salivation (need for suction before laryngoscopy). RESULTS: Direct laryngoscopy was performed successfully in all patients. One patient needed extra fentanyl, after which laryngoscopy was performed successfully on the second attempt. All patients were cooperative enough during laryngoscopy. Hemodynamic changes of more than 20% occurred in just one patient. Oxygen desaturation (SpO2 < 90%) did not occur in any patient. CONCLUSIONS: Subcutaneous dissociative conscious sedation (sDCS), as a new approach to the airway, is an acceptable and safe method for awake intubation and can be suggested as a noninvasive substitute, with a low complication rate, for regional airway blocks. Registration ID in IRCT: IRCT201012075333N1

    Attitudes towards terminal sedation: an empirical survey among experts in the field of medical ethics

    BACKGROUND: "Terminal sedation", understood as the use of sedation in (pre-)terminal patients with treatment-refractory symptoms, is discussed controversially, and not only within palliative medicine. While supporters consider terminal sedation an indispensable option in palliative medical treatment, opponents disapprove of it as "slow euthanasia". Against this background, we surveyed medical ethics experts by questionnaire on the term and the moral acceptability of terminal sedation, in order to find out how they think about this topic. We were especially interested in whether experts with a professional medical or nursing background think differently about the topic than experts without such a background. METHODS: The survey was carried out by questionnaire; besides the predefined answer options, free-text comments were possible. As respondents we chose the 477 members of the German Academy for Ethics in Medicine, an interdisciplinary society for medical ethics. RESULTS: 281 completed questionnaires were returned (response rate 59%). The majority of respondents without a medical background regarded "terminal sedation" as the intentional elimination of consciousness until the patient's death occurs; respondents with a medical background generally had a broader understanding of the term, including light or intermittent forms of sedation. 98% of the respondents regarded terminal sedation in dying patients with treatment-refractory physical symptoms as acceptable. Situations in which the dying process has not yet started, in which untreatable mental symptoms are the indication for terminal sedation, or in which life-sustaining measures are withdrawn during sedation were evaluated as morally difficult. CONCLUSION: The survey reveals a great need for research and discussion on the medical indication as well as on the moral evaluation of terminal sedation. A prerequisite for this is a more precise terminology describing the circumstances of the sedation.