
    Evaluating duration of antimicrobial therapy for community-acquired pneumonia in clinically stable patients

    "In the United States, community-acquired pneumonia (CAP) results in an estimated 2 to 3 million diagnoses each year, 10 million physician visits, and 600,000 hospitalizations, at a total cost of over 20 billion dollars annually. Common causative organisms of CAP include Streptococcus pneumoniae, Haemophilus influenzae, Mycoplasma pneumoniae, Chlamydia pneumoniae, and Legionella pneumophila. Identifying the etiologic organism helps guide therapeutic decisions; however, the pathogen remains unknown in about 50 percent of cases. Therefore, optimal empiric therapy relies on a physician's experience and clinical judgment. The Infectious Diseases Society of America (IDSA) guidelines for the treatment of CAP recommend a minimum 5-day course of antibiotics for patients who achieve clinical stability within 48 to 72 hours of initiation of appropriate therapy. A multicenter cohort study of 686 patients hospitalized with CAP found that most were treated for 7 to 10 days despite a median time to clinical stability of 3 days, indicating that a shorter duration of therapy is often not favored by clinicians despite guideline recommendations. Moreover, although many patients receive active antimicrobial therapy while hospitalized, additional courses of antimicrobials are often prescribed upon discharge, resulting in excessive antibiotic use. While many patients are given prolonged courses of therapy for CAP, shorter durations of antibiotics in eligible patients offer a number of advantages, such as minimizing the emergence and selection of resistant organisms, increasing patient compliance, and reducing the risk of medication adverse effects.
The objective of this study was to assess the percentage of hospitalized patients diagnosed with uncomplicated CAP receiving antimicrobial therapy in excess of the guideline-recommended duration, evaluate subsequent thirty-day all-cause readmission rates, and determine if select co-morbidities influenced the length of antimicrobial therapy prescribed."--Introduction.
    Lucy Hahn (Parkland Health and Hospital System, Dallas, Texas), Anita Hegde (University of Texas Southwestern Medical Center, Dallas, Texas), Norman Mang (Parkland Health and Hospital System, Dallas, Texas; University of Texas Southwestern Medical Center, Dallas, Texas), Jessica K. Ortwine (Parkland Health and Hospital System, Dallas, Texas; University of Texas Southwestern Medical Center, Dallas, Texas), Wenjing Wei (Parkland Health and Hospital System, Dallas, Texas; University of Texas Southwestern Medical Center, Dallas, Texas), Bonnie Chase Prokesch (University of Texas Southwestern Medical Center, Dallas, Texas)
    Includes bibliographical reference

    Mycobacterium chimaera


    The impact of interoperability of electronic health records on ambulatory physician practices: a discrete-event simulation study

    Background: The effect of health information technology (HIT) on efficiency and workload among clinical and nonclinical staff has been debated, with conflicting evidence about whether electronic health records (EHRs) increase or decrease effort. No study to date, however, has examined the effect of interoperability quantitatively using discrete-event simulation techniques. Objective: To estimate the impact of EHR systems with various levels of interoperability on the day-to-day tasks and operations of ambulatory physician offices. Methods: Interviews and observations were used to collect workflow data from 12 adult primary and specialty practices. A discrete-event simulation model was constructed to represent patient flows and the clinical and administrative tasks of physicians and staff members. Results: High levels of EHR interoperability were associated with reduced time spent by providers on four tasks: preparing lab reports, requesting lab orders, prescribing medications, and writing referrals. The implementation of an EHR was associated with less time spent by administrators but more time spent by physicians, compared with time spent at paper-based practices. In addition, neither the presence of EHRs nor interoperability significantly affected the time usage of registered nurses or the total visit time and waiting time of patients. Conclusion: This study suggests that the impact of HIT on the work efficiency of clinical and nonclinical staff varies; overall, however, it appears to improve time efficiency more for administrators than for physicians and nurses
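The discrete-event approach described in the abstract can be illustrated with a minimal sketch: a single physician serving patients whose arrivals and task durations are drawn at random, with tasks such as lab reporting shortened under an interoperable EHR. This is not the authors' model; the task names and mean durations below are hypothetical, chosen only to show how interoperability-driven reductions in task time propagate to waiting time and physician workload.

```python
import heapq
import random

def simulate(n_patients, task_min, seed=0):
    """Minimal discrete-event simulation of a one-physician clinic.

    task_min: dict mapping task name -> mean duration in minutes
    (hypothetical values). Patients arrive at random and are served
    in arrival order; returns (mean wait, total physician busy time).
    """
    rng = random.Random(seed)
    events = []                       # event queue of (time, patient_id)
    t = 0.0
    for pid in range(n_patients):
        t += rng.expovariate(1 / 15)  # mean inter-arrival time: 15 min
        heapq.heappush(events, (t, pid))
    physician_free = 0.0
    total_wait = busy = 0.0
    while events:
        arrive, pid = heapq.heappop(events)
        start = max(arrive, physician_free)   # wait if physician is busy
        service = sum(rng.expovariate(1 / m) for m in task_min.values())
        physician_free = start + service
        total_wait += start - arrive
        busy += service
    return total_wait / n_patients, busy

# Hypothetical task times: lab reports, prescriptions, and referrals
# take less physician time when the EHR is interoperable.
paper = {"exam": 10, "lab_report": 5, "prescribe": 3, "referral": 4}
interop = {"exam": 10, "lab_report": 2, "prescribe": 1, "referral": 2}
for label, tasks in [("paper-based", paper), ("interoperable EHR", interop)]:
    wait, busy = simulate(200, tasks)
    print(f"{label}: mean wait {wait:.1f} min, physician busy {busy:.0f} min")
```

Because both runs share the same random seed, the comparison is paired: each patient's underlying random draws are identical, and only the mean task times differ.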

    Evolutionary History of Rabies in Ghana

    Rabies virus (RABV) is enzootic throughout Africa, with the domestic dog (Canis familiaris) being the principal vector. Dog rabies is estimated to cause 24,000 human deaths per year in Africa; however, this estimate is still considered conservative. Two sub-Saharan African RABV lineages have been detected in West Africa: the Africa 2 lineage is present throughout West Africa, whereas the Africa 1a lineage dominates in northern and eastern Africa but has been detected in Nigeria and Gabon, and the Africa 1b lineage was previously absent from West Africa. We confirmed the presence of RABV in a cohort of 76 brain samples obtained from rabid animals in Ghana collected over an eighteen-month period (2007–2009). Phylogenetic analysis of the sequences obtained confirmed all viruses to be RABV belonging to lineages previously detected in sub-Saharan Africa. However, unlike earlier studies that suggested a single lineage (Africa 2) circulates in West Africa, we identified viruses belonging to the Africa 2 lineage and to both Africa 1 (a and b) sub-lineages. Phylogeographic Bayesian Markov chain Monte Carlo analysis of a 405 bp fragment of the RABV nucleoprotein gene from the 76 new Ghanaian sequences suggests that three clades co-circulate within the Africa 2 lineage, with origins in other West African countries; that Africa 1a is probably a western extension of a clade circulating in central Africa; and that the Africa 1b virus is probably a recent introduction from eastern Africa. We also developed and tested a novel reverse-transcription loop-mediated isothermal amplification (RT-LAMP) assay for the detection of RABV in African laboratories. The RT-LAMP assay detected both Africa 1 and Africa 2 viruses and was adapted to a lateral flow device format for product visualization. These data suggest that RABV epidemiology in West Africa is more complex than previously thought and that there have been repeated introductions of RABV into Ghana. 
This analysis highlights the potential problems of individual developing nations implementing rabies control programmes in the absence of a regional programme

    Whole-genome sequencing for prediction of Mycobacterium tuberculosis drug susceptibility and resistance: a retrospective cohort study

    BACKGROUND: Diagnosing drug resistance remains an obstacle to the elimination of tuberculosis. Phenotypic drug-susceptibility testing is slow and expensive, and commercial genotypic assays screen only common resistance-determining mutations. We used whole-genome sequencing to characterise common and rare mutations predicting drug resistance, or consistency with susceptibility, for all first-line and second-line drugs for tuberculosis. METHODS: Between Sept 1, 2010, and Dec 1, 2013, we sequenced a training set of 2099 Mycobacterium tuberculosis genomes. For 23 candidate genes identified from the drug-resistance scientific literature, we algorithmically characterised genetic mutations as not conferring resistance (benign), resistance determinants, or uncharacterised. We then assessed the ability of these characterisations to predict phenotypic drug-susceptibility testing for an independent validation set of 1552 genomes. To account for residual phenotypic resistance, we sought mutations outside the candidate genes under selection pressure similar to that on the characterised resistance determinants. FINDINGS: We characterised 120 training-set mutations as resistance determining, and 772 as benign. With these mutations, we could predict 89·2% of the validation-set phenotypes with a mean 92·3% sensitivity (95% CI 90·7–93·7) and 98·4% specificity (98·1–98·7). 10·8% of validation-set phenotypes could not be predicted because uncharacterised mutations were present. In an in-silico comparison, the characterised resistance determinants had higher sensitivity than the mutations covered by three line-probe assays (85·1% vs 81·6%). No additional resistance determinants were identified among mutations under selection pressure in non-candidate genes.
INTERPRETATION: A broad catalogue of genetic mutations enables data from whole-genome sequencing to be used clinically to predict drug resistance or drug susceptibility, or to identify phenotypes that cannot yet be genetically predicted. This approach could be integrated into routine diagnostic workflows, phasing out phenotypic drug-susceptibility testing while reporting drug resistance early.
Wellcome Trust, National Institute of Health Research, Medical Research Council, and the European Union.
http://www.thelancet.com/infectionhb201
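The prediction logic the abstract describes can be sketched as a catalogue lookup per drug: any resistance-determining mutation yields a resistant call; if every observed mutation is catalogued as benign, the genome is called susceptible; any uncharacterised mutation (with no resistance determinant present) blocks a prediction. The sketch below is a simplification, and the catalogue entries are hypothetical examples, not the study's actual 120 resistance and 772 benign mutations.

```python
# Phenotype calls used in the sketch
RESISTANT, SUSCEPTIBLE, UNPREDICTABLE = "R", "S", "U"

# Hypothetical catalogue: (gene, mutation) -> per-drug classification.
# The real study characterised mutations in 23 candidate genes.
CATALOGUE = {
    ("rpoB", "S450L"): {"rifampicin": RESISTANT},
    ("katG", "S315T"): {"isoniazid": RESISTANT},
    ("gyrA", "E21Q"):  {"rifampicin": "benign", "isoniazid": "benign"},
}

def predict(mutations, drug):
    """Predict one drug's phenotype from a genome's candidate-gene mutations."""
    saw_uncharacterised = False
    for mut in mutations:
        cls = CATALOGUE.get(mut, {}).get(drug)
        if cls == RESISTANT:
            return RESISTANT            # any resistance determinant -> resistant
        if cls is None:
            saw_uncharacterised = True  # unknown mutation blocks a clean call
    return UNPREDICTABLE if saw_uncharacterised else SUSCEPTIBLE

print(predict([("rpoB", "S450L")], "rifampicin"))  # R
print(predict([("gyrA", "E21Q")], "isoniazid"))    # S (only benign mutations)
print(predict([("embB", "M306V")], "isoniazid"))   # U (not in catalogue)
```

The "U" category corresponds to the 10·8% of validation-set phenotypes that could not be predicted because uncharacterised mutations were present.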

    Hedge, Jessica


    Molecular epidemiology and evolution of the 2009 H1N1 influenza A pandemic virus

    The swine-origin H1N1 influenza A pandemic virus (A(H1N1)pdm09) was detected in the human population in March 2009. Due to its antigenic novelty, the majority of individuals were susceptible to the virus and the pandemic quickly disseminated around the globe. Rapid characterization of the epidemic was required in order to help inform interventions and determine the risk posed to public health. Widespread sampling and sequencing of virus isolates enabled early characterization of the virus using phylogenetic analysis and continued surveillance over the subsequent three years of global circulation. Throughout this thesis, Bayesian phylogenetic methods are employed to investigate how quickly evolutionary parameters can be accurately and precisely estimated from pandemic genome sequence data and explore how selection has acted across the A(H1N1)pdm09 genome over its period of transition to a seasonal influenza lineage. It is shown that accurate estimates of the evolutionary rate, date of emergence and initial exponential growth rate of the virus can be obtained with high precision from analysis of 100 genome sequences, thereby helping to characterize the virus just 2 months after the first cases were reported. In order to account for variation in growth rates of influenza epidemics between localized outbreaks around the globe, a hierarchical phylogenetic model is employed for analysis of pandemic and seasonal influenza data. The results suggest that the A(H1N1)pdm09 lineage spread more easily and with greater variation between populations during its first pandemic wave than either seasonal influenza lineage in previous seasons. The birth-death epidemiology model has been shown to provide more precise estimates of the basic reproductive number than the coalescent in analysis of HIV epidemic data. 
Analysis of pandemic influenza data carried out here suggests that the model assumptions are less applicable to influenza and, in fact, the birth-death epidemiology model loses accuracy more rapidly than coalescent models as data accumulated during the pandemic. The effects of an increasingly immune global host population over the pandemic and subsequent influenza seasons were investigated using robust counting of substitutions across the genome. Results suggest that antigenic genes were under greater selective pressure to evolve than internally expressed genes and that the rate of non-synonymous substitution was highest across all segments immediately after emergence in the human population. Bayesian phylogenetics is increasingly employed as an important tool for rapid characterization of novel infectious disease epidemics. As such, the work carried out here aims to determine the accuracy and applicability of existing evolutionary models with pandemic sequence data sampled over a range of temporal and spatial scales, to help better inform similar analyses of future epidemics

    Data from: Real-time characterization of the molecular epidemiology of an influenza pandemic

    Early characterization of the epidemiology and evolution of a pandemic is essential for determining the most appropriate interventions. During the 2009 H1N1 influenza A pandemic, public databases facilitated widespread sharing of genetic sequence data from the outset. We employ Bayesian phylogenetics to simulate real-time estimation of the evolutionary rate, date of emergence and intrinsic growth rate (r0) of the pandemic from whole-genome sequences. We investigate the effects of temporal range of sampling and dataset size on the precision and accuracy of parameter estimation. Parameters can be accurately estimated as early as two months after the first reported case, from 100 genomes. Early deleterious mutations were purged from the population during the second pandemic wave and the choice of growth model is important for accurate estimation of r0. This demonstrates the utility of simple coalescent models to rapidly inform intervention strategies during a pandemic
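To illustrate what an estimate of the intrinsic growth rate feeds into once obtained, the sketch below converts an exponential growth rate into a doubling time and, under a simple SIR-type assumption (R0 = 1 + r·D, with D the mean infectious period), a basic reproductive number. This is a standard back-of-envelope relationship, not the abstract's phylogenetic estimation procedure, and the numeric values are hypothetical rather than estimates from the study.

```python
import math

def doubling_time(r):
    """Doubling time (days) of an epidemic growing exponentially at rate r per day."""
    return math.log(2) / r

def reproductive_number(r, infectious_days):
    """Basic reproductive number under a simple SIR assumption: R0 = 1 + r*D."""
    return 1 + r * infectious_days

# Hypothetical values for illustration only:
r = 0.2   # intrinsic exponential growth rate, per day
D = 3.0   # mean infectious period, days
print(f"doubling time: {doubling_time(r):.1f} days")
print(f"R0: {reproductive_number(r, D):.2f}")
```

The choice of growth model matters here in the same way the abstract notes: the conversion from r to R0 depends on the assumed generation-time or infectious-period distribution, so different model assumptions yield different R0 values from the same growth-rate estimate.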