
    Leveraging a Rapid, Round-the-Clock HIV Testing System to Screen for Acute HIV Infection in a Large Urban Public Medical Center

    Methods: The hospital laboratory performed round-the-clock rapid HIV antibody testing on venipuncture specimens from patients undergoing HIV testing in hospital and community clinics, inpatient settings, and the emergency department. For patients with negative results, a public health laboratory conducted pooled HIV RNA testing for acute HIV infection. The laboratories communicated positive results from the hospital campus to a linkage team. Linkage was defined as one outpatient HIV-related visit. Results: Among 7,927 patients, 8,550 rapid tests identified 137 cases of HIV infection (1.7%, 95% CI 1.5%–2.0%), of whom 46 were new HIV diagnoses (0.58%, 95% CI 0.43%–0.77%). Pooled HIV RNA testing of 6,704 specimens (78.4%) identified 3 cases of acute HIV infection (0.05%, 95% CI 0.01%–0.14%) and increased HIV case detection by 3.5%. Half of the new HIV diagnoses and two-thirds of the acute infections were detected in the emergency department and urgent care clinic. Rapid test sensitivity was 98.9% (95% CI 93.8%–99.8%); specificity was 99.9% (95% CI 99.7%–99.9%). Over 95% of newly diagnosed and out-of-care HIV-infected patients were linked to care. Conclusions: Patients undergoing HIV testing in emergency departments and urgent care clinics may benefit from simultaneous screening for acute HIV infection.
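    The confidence intervals reported above can be approximated from the raw counts; a minimal sketch using a Wilson score interval for the new-diagnosis yield (46 of 7,927), with the caveat that the abstract does not state which interval method the authors used:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - margin, center + margin

# New HIV diagnoses: 46 of 7,927 patients (abstract: 0.58%, 95% CI 0.43%-0.77%)
lo, hi = wilson_ci(46, 7927)
print(f"{46/7927:.2%} (95% CI {lo:.2%}-{hi:.2%})")
```

    The Wilson lower bound lands slightly above the reported 0.43%, which hints the authors may have used an exact (Clopper–Pearson) interval instead; both methods agree to within rounding here.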

    Electronic Medical Record Inaccuracies: Multicenter Analysis of Challenges with Modified Lung Cancer Screening Criteria.

    The National Comprehensive Cancer Network expanded its lung cancer screening (LCS) criteria to include an additional clinical risk factor, such as chronic obstructive pulmonary disease (COPD). The electronic medical record (EMR) is a source of clinical information that could identify high-risk populations for LCS, including those with a COPD diagnosis; however, an unsubstantiated COPD diagnosis in the EMR may lead to inappropriate LCS referrals. We aimed to determine the prevalence of unsubstantiated COPD diagnoses in the EMRs of LCS referrals, and thereby the suitability of the EMR as a population-based eligibility screening trigger under the modified clinical criteria. We performed a multicenter review of all individuals referred to three LCS programs from 2012 to 2015. Each individual's EMR was searched for COPD diagnostic terms and the presence of a diagnostic pulmonary function test (PFT). An unsubstantiated COPD diagnosis was defined as an EMR containing a COPD term with no PFTs present, or with PFTs showing no evidence of obstruction. A total of 2834 referred individuals were identified, of whom 30% (840/2834) had a COPD term present in their EMR. Of these, 68% (571/840) were considered unsubstantiated diagnoses: 86% (489/571) due to absent PFTs and 14% (82/571) due to PFTs demonstrating no evidence of post-bronchodilation obstruction. A large proportion of individuals referred for LCS may have an unsubstantiated COPD diagnosis in their EMR. Thus, using the EMR as a population-based eligibility screening tool under the expanded criteria may lead to inappropriate LCS referrals.
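    The study's definition of an unsubstantiated diagnosis reduces to a simple decision rule; a minimal sketch with hypothetical record fields (the 0.70 FEV1/FVC cutoff follows the common GOLD definition of obstruction, which the paper may or may not have used):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Record:
    copd_term_present: bool  # any COPD diagnostic term found in the EMR
    fev1_fvc_postbronchodilator: Optional[float] = None  # None = no PFTs on file

def unsubstantiated_copd(rec: Record, obstruction_threshold: float = 0.70) -> bool:
    """A COPD diagnosis is unsubstantiated if the EMR carries a COPD term but
    either no PFTs are present, or PFTs show no post-bronchodilator obstruction
    (FEV1/FVC at or above the threshold)."""
    if not rec.copd_term_present:
        return False  # no COPD diagnosis to substantiate
    if rec.fev1_fvc_postbronchodilator is None:
        return True   # COPD term with no PFTs: the larger category in the study
    return rec.fev1_fvc_postbronchodilator >= obstruction_threshold
```

    A rule like this is only a screening-trigger filter; any individual flagged here would still need clinical review before an LCS referral decision.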

    Evaluating the accuracy of a functional SNP annotation system

    Many common and chronic diseases are influenced at some level by genetic variation. Research in population genetics, specifically on single nucleotide polymorphisms (SNPs), is critical to understanding human genetic variation. A key element in assessing the role of a given SNP is determining whether the variation is likely to result in a change in function. The SNP Integration Tool (SNPit) is a comprehensive tool that integrates diverse existing predictors of SNP functionality, providing the user with information for improved association study analysis. To evaluate the SNPit system, we developed an alternative gold standard against which to measure accuracy in terms of sensitivity and specificity. Our evaluation demonstrated that this alternative gold standard produced encouraging results.
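    Measuring accuracy against a gold standard reduces to a confusion-matrix computation; a minimal sketch (the labels and predictions below are illustrative toy data, not SNPit output):

```python
def sensitivity_specificity(gold, predicted):
    """gold/predicted: parallel lists of booleans (True = functional SNP)."""
    tp = sum(g and p for g, p in zip(gold, predicted))
    tn = sum(not g and not p for g, p in zip(gold, predicted))
    fp = sum(not g and p for g, p in zip(gold, predicted))
    fn = sum(g and not p for g, p in zip(gold, predicted))
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    return sensitivity, specificity

# Illustrative toy data: 3 functional and 3 non-functional SNPs
gold      = [True, True, True, False, False, False]
predicted = [True, True, False, False, False, True]
print(sensitivity_specificity(gold, predicted))  # sensitivity 2/3, specificity 2/3
```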

    Relayed nuclear Overhauser enhancement sensitivity to membrane Cho phospholipids

    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/155956/1/mrm28258_am.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/155956/2/mrm28258.pd

    SER-109: An Oral Investigational Microbiome Therapeutic for Patients with Recurrent Clostridioides difficile Infection (rCDI)

    Clostridioides difficile infection (CDI) is classified as an urgent health threat by the Centers for Disease Control and Prevention (CDC), and affects nearly 500,000 Americans annually. Approximately 20–25% of patients with a primary infection experience a recurrence, and the risk of recurrence increases with subsequent episodes to greater than 40%. The leading risk factor for CDI is broad-spectrum antibiotic use, which leads to a loss of microbial diversity and impaired colonization resistance. Current FDA-approved CDI treatment strategies target the toxin or toxin-producing bacteria, but do not address microbiome disruption, which is key to the pathogenesis of recurrent CDI. Fecal microbiota transplantation (FMT) reduces the risk of recurrent CDI through the restoration of microbial diversity. However, FDA safety alerts describing hospitalizations and deaths related to pathogen transmission have raised safety concerns with the use of unregulated and unstandardized donor-derived products. SER-109 is an investigational oral microbiome therapeutic composed of purified spore-forming Firmicutes. SER-109 was superior to placebo in reducing CDI recurrence at Week 8 (12% vs. 40%, respectively; p < 0.001) in adults with a history of recurrent CDI, with a favorable observed safety profile. Here, we discuss the role of the microbiome in CDI pathogenesis and the clinical development of SER-109, including its rigorous manufacturing process, which mitigates the risk of pathogen transmission. Additionally, we discuss compositional and functional changes in the gastrointestinal microbiome of patients with recurrent CDI following treatment with SER-109 that are critical to a sustained clinical response.

    Nitrogen Increases Early-Stage and Slows Late-Stage Decomposition Across Diverse Grasslands

    To evaluate how increased anthropogenic nutrient inputs alter carbon cycling in grasslands, we conducted a litter decomposition study across 20 temperate grasslands on three continents within the Nutrient Network, a globally distributed nutrient enrichment experiment. We determined the effects of experimental nitrogen (N), phosphorus (P), and potassium plus micronutrient (Kμ) addition on the decomposition of a common tree leaf litter in a long-term study (maximum of 7 years; the exact deployment period varied across sites). The use of higher-order decomposition models allowed us to distinguish between the effects of nutrients on early- versus late-stage decomposition. Across continents, the addition of N (but not of the other nutrients) accelerated early-stage decomposition and slowed late-stage decomposition, increasing the slowly decomposing fraction by 28% and the overall litter mean residence time by 58%. Synthesis: Using a novel, long-term cross-site experiment, we found widespread evidence that N enhances the early stages of above-ground plant litter decomposition across diverse and widespread temperate grassland sites but slows late-stage decomposition. These findings were corroborated by fitting the data to multiple decomposition models and have implications for N effects on soil organic matter formation. For example, following N enrichment, increased microbial processing of litter substrates early in decomposition could promote the production and transfer of low-molecular-weight compounds to soils and potentially enhance the stabilization of mineral-associated organic matter. By contrast, by slowing late-stage decomposition, N enrichment could promote particulate organic matter (POM) accumulation. Such hypotheses deserve further testing.
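    Higher-order decomposition models of this kind are commonly double-exponential (two-pool) decay curves; a minimal sketch with entirely hypothetical rate constants, showing how a larger slowly decomposing fraction lengthens mean residence time:

```python
import math

def two_pool_mass(t, slow_fraction, k_slow, k_fast):
    """Litter mass remaining at time t under a two-pool exponential decay model."""
    fast_fraction = 1 - slow_fraction
    return slow_fraction * math.exp(-k_slow * t) + fast_fraction * math.exp(-k_fast * t)

def mean_residence_time(slow_fraction, k_slow, k_fast):
    """MRT = integral of mass remaining = f_slow/k_slow + f_fast/k_fast."""
    return slow_fraction / k_slow + (1 - slow_fraction) / k_fast

# Hypothetical parameters (per year), not fitted values from the study
control = mean_residence_time(slow_fraction=0.30, k_slow=0.05, k_fast=1.5)
n_added = mean_residence_time(slow_fraction=0.30 * 1.28, k_slow=0.05, k_fast=1.5)  # +28% slow fraction
print(control, n_added)
```

    Because the slow pool dominates the MRT integral, even a modest enlargement of the slow fraction raises residence time substantially; in the study, N enrichment also changed the rate constants themselves, which is how the reported 58% MRT increase can exceed the 28% change in the slow fraction alone.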

    Performance of Risk-Based Criteria for Targeting Acute HIV Screening in San Francisco

    Federal guidelines now recommend supplemental HIV RNA testing for persons at high risk for acute HIV infection. However, many rapid HIV testing sites do not include HIV RNA or p24 antigen testing due to concerns about cost, the need for results follow-up, and the impact of expanded venipuncture on clinic flow. We developed criteria to identify patients in a municipal STD clinic in San Francisco who are asymptomatic but may still be likely to have acute infection. Data were from patients tested with serial HIV antibody and HIV RNA tests to identify acute HIV infection. BED-CEIA results were used to classify non-acute cases as recent or longstanding. Demographics and self-reported risk behaviors were collected at the time of testing. Multivariate models were developed and preliminarily evaluated using predictors associated with recent infection in bivariate analyses as a proxy for acute HIV infection. Multivariate models demonstrating ≥70% sensitivity for recent infection while testing ≤60% of patients in this development dataset were then validated by determining their performance in identifying acute infections. From 2004-2007, 137 of 12,622 testers had recent and 36 had acute infections. A model limiting acute HIV screening to MSM plus any one of a series of other predictors yielded a sensitivity of 83.3% with only 47.6% of patients requiring testing. A single-factor model testing only patients reporting any receptive anal intercourse yielded 88.9% sensitivity with only 55.2% of patients requiring testing. In similar high-risk HIV testing sites, acute screening using "supplemental" HIV p24 antigen or RNA tests can be rationally targeted to testers who report particular HIV risk behaviors. By improving the efficiency of acute HIV testing, such criteria could facilitate expanded acute case identification.
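    Each candidate criterion trades sensitivity for acute cases against the fraction of all testers who need supplemental testing; a minimal sketch of that evaluation with a hypothetical record format (the records below are illustrative, not the San Francisco data):

```python
def evaluate_criterion(records, criterion):
    """records: list of dicts with risk-behavior fields plus 'acute' (bool).
    criterion: predicate selecting who receives supplemental RNA/p24 testing.
    Returns (sensitivity for acute cases, fraction of all testers tested)."""
    tested = [r for r in records if criterion(r)]
    acute_total = sum(r["acute"] for r in records)
    acute_caught = sum(r["acute"] for r in tested)
    sensitivity = acute_caught / acute_total if acute_total else float("nan")
    fraction_tested = len(tested) / len(records)
    return sensitivity, fraction_tested

# Illustrative records only
records = [
    {"msm": True,  "receptive_ai": True,  "acute": True},
    {"msm": True,  "receptive_ai": False, "acute": False},
    {"msm": False, "receptive_ai": True,  "acute": True},
    {"msm": False, "receptive_ai": False, "acute": False},
]
print(evaluate_criterion(records, lambda r: r["receptive_ai"]))  # (1.0, 0.5)
```

    Sweeping a set of candidate predicates through a function like this is essentially how the development-dataset screen (≥70% sensitivity while testing ≤60% of patients) would be applied.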

    Complex Consequences of Herbivory and Interplant Cues in Three Annual Plants

    Information exchange (or signaling) between plants following herbivore damage has recently been shown to affect plant responses to herbivory in relatively simple natural systems. In a large, manipulative field study using three annual plant species (Achyrachaena mollis, Lupinus nanus, and Sinapis arvensis), we tested whether experimental damage to a neighboring conspecific affected a plant's lifetime fitness and interactions with herbivores. By manipulating relatedness between plants, we assessed whether the genetic relatedness of neighboring individuals influenced the outcome of having a damaged neighbor. Additionally, in laboratory feeding assays, we assessed whether damage to a neighboring plant specifically affected palatability to a generalist herbivore and, for S. arvensis, a specialist herbivore. Our study suggested a high level of contingency in the outcomes of plant signaling. For example, in the field, damaging a neighbor resulted in greater herbivory to A. mollis, but only when the damaged neighbor was a close relative. Similarly, in laboratory trials, the palatability of S. arvensis to a generalist herbivore increased after the plant was exposed to a damaged neighbor, while palatability to a specialist herbivore decreased. Across all species, damage to a neighbor resulted in decreased lifetime fitness, but only if neighbors were closely related. These results suggest that the outcomes of plant signaling within multi-species neighborhoods may be far more context-specific than has previously been shown. In particular, our study shows that herbivore interactions and signaling between plants are contingent on the genetic relationship between neighboring plants. Many factors affect the outcomes of plant signaling, and studies that clarify these factors will be necessary to assess the role of plant information exchange about herbivory in natural systems.

    Predicting Hemolytic Uremic Syndrome and Renal Replacement Therapy in Shiga Toxin-producing Escherichia coli-infected Children.

    BACKGROUND: Shiga toxin-producing Escherichia coli (STEC) infections are leading causes of pediatric acute renal failure. Identifying hemolytic uremic syndrome (HUS) risk factors is needed to guide care. METHODS: We conducted a multicenter, historical cohort study to identify features associated with development of HUS (primary outcome) and need for renal replacement therapy (RRT) (secondary outcome) in STEC-infected children without HUS at initial presentation. Children aged […] were eligible. RESULTS: Of 927 STEC-infected children, 41 (4.4%) had HUS at presentation; of the remaining 886, 126 (14.2%) developed HUS. Predictors (all shown as odds ratio [OR] with 95% confidence interval [CI]) of HUS included younger age (0.77 [.69-.85] per year), leukocyte count ≥13.0 × 10³/μL (2.54 [1.42-4.54]), higher hematocrit (1.83 [1.21-2.77] per 5% increase) and serum creatinine (10.82 [1.49-78.69] per 1 mg/dL increase), platelet count <250 × 10³/μL (1.92 [1.02-3.60]), lower serum sodium (1.12 [1.02-1.23] per 1 mmol/L decrease), and intravenous fluid administration initiated ≥4 days following diarrhea onset (2.50 [1.14-5.46]). A longer interval from diarrhea onset to index visit was associated with reduced HUS risk (OR, 0.70 [95% CI, .54-.90]). RRT predictors (all shown as OR [95% CI]) included female sex (2.27 [1.14-4.50]), younger age (0.83 [.74-.92] per year), lower serum sodium (1.15 [1.04-1.27] per 1 mmol/L decrease), higher leukocyte count (≥13.0 × 10³/μL; 2.35 [1.17-4.72]) and creatinine (7.75 [1.20-50.16] per 1 mg/dL increase) concentrations, and initial intravenous fluid administration ≥4 days following diarrhea onset (2.71 [1.18-6.21]). CONCLUSIONS: The complex nature of STEC infection renders predicting its course a challenge. The risk factors we identified highlight the importance of avoiding dehydration and of close clinical and laboratory monitoring.
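    Per-unit odds ratios like those above come from exponentiating logistic-regression coefficients; a minimal sketch of how a coefficient rescales to an OR over an arbitrary increment (the coefficient here is back-derived from the reported per-year age OR purely for illustration):

```python
import math

def odds_ratio(beta, increment=1.0):
    """OR for an `increment`-unit change in a predictor with logistic coefficient beta."""
    return math.exp(beta * increment)

# Reported OR per 1-year increase in age is 0.77, implying beta = ln(0.77)
beta_age = math.log(0.77)
print(odds_ratio(beta_age, 1))  # recovers 0.77 per year
print(odds_ratio(beta_age, 5))  # the same model's implied OR across a 5-year age difference
```

    The same rescaling explains entries such as "1.83 per 5% increase" in hematocrit: the model's per-unit coefficient is simply multiplied by the chosen increment before exponentiating.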