
    New York's All-Payer Database: A New Lens for Consumer Transparency


    Potential climate change impacts on temperate forest ecosystem processes

    Large changes in atmospheric CO2, temperature, and precipitation are predicted by 2100, yet the long-term consequences for carbon (C), water, and nitrogen (N) cycling in forests are poorly understood. We applied the PnET-CN ecosystem model to compare the long-term effects of changing climate and atmospheric CO2 on productivity, evapotranspiration, runoff, and net nitrogen mineralization in current Great Lakes forest types. We used two statistically downscaled climate projections, PCM B1 (warmer and wetter) and GFDL A1FI (hotter and drier), to represent two potential future climate and atmospheric CO2 scenarios. To separate the effects of climate and CO2, we ran PnET-CN including and excluding the CO2 routine. Our results suggest that, with rising CO2 and without changes in forest type, average regional productivity could increase by 67% to 142%, changes in evapotranspiration could range from –3% to +6%, runoff could increase by 2% to 22%, and net N mineralization could increase by 10% to 12%. Ecosystem responses varied geographically and by forest type. Increased productivity was almost entirely driven by CO2 fertilization effects, rather than by temperature or precipitation (model runs holding CO2 constant showed stable or declining productivity). The relative importance of edaphic and climatic spatial drivers of productivity varied over time, suggesting that productivity in Great Lakes forests may switch from being temperature- to water-limited by the end of the century.
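    The percentage ranges above come from comparing scenario runs against a baseline run; a minimal sketch of that bookkeeping is below. The scenario names, baseline value, and paired with/without-CO2 runs are illustrative assumptions, and PnET-CN itself is not reproduced here.

```python
# Minimal sketch: percent change of a modeled quantity relative to a baseline run,
# and isolating the CO2 contribution by differencing runs with and without the
# CO2 routine. All numbers are illustrative placeholders, not model output.

def pct_change(scenario_value, baseline_value):
    """Percent change of a scenario run relative to the baseline run."""
    return 100.0 * (scenario_value - baseline_value) / baseline_value

baseline = 520.0                      # hypothetical historical-climate productivity
runs = {
    "PCM_B1_with_CO2": 870.0,         # warmer/wetter, CO2 routine on
    "PCM_B1_no_CO2": 510.0,           # same climate, CO2 held constant
    "GFDL_A1FI_with_CO2": 1250.0,     # hotter/drier, CO2 routine on
    "GFDL_A1FI_no_CO2": 495.0,        # same climate, CO2 held constant
}

for name, value in runs.items():
    print(f"{name}: {pct_change(value, baseline):+.0f}% vs. baseline")

# CO2 effect isolated as the difference between paired runs
co2_effect_pcm = (pct_change(runs["PCM_B1_with_CO2"], baseline)
                  - pct_change(runs["PCM_B1_no_CO2"], baseline))
print(f"CO2-attributable change (PCM B1): {co2_effect_pcm:+.0f} percentage points")
```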

    Learning to detect and understand drug discontinuation events from clinical narratives

    OBJECTIVE: Identifying drug discontinuation (DDC) events and understanding their reasons are important for medication management and drug safety surveillance. Structured data resources are often incomplete and lack reason information. In this article, we assessed the ability of natural language processing (NLP) systems to unlock DDC information from clinical narratives automatically. MATERIALS AND METHODS: We collected 1867 de-identified providers' notes from the University of Massachusetts Medical School hospital electronic health record system. Two human experts then chart-reviewed those clinical notes to annotate DDC events and their reasons. Using the annotated data, we developed and evaluated NLP systems to automatically identify drug discontinuations and reasons at the sentence level, using a novel semantic enrichment-based vector representation (SEVR) method for enhanced feature representation. RESULTS: Our SEVR-based NLP system achieved the best performance of 0.785 (AUC-ROC) for detecting discontinuation events and 0.745 (AUC-ROC) for identifying reasons when tested on these highly imbalanced data, outperforming 2 state-of-the-art non-SEVR-based models. Compared with a rule-based baseline system for discontinuation detection, our system improved sensitivity significantly (57.75% vs 18.31%, absolute value) while retaining a high specificity of 99.25%, leading to a significant improvement in AUC-ROC of 32.83% (absolute value). CONCLUSION: Experiments have shown that a high-performance NLP system can be developed to automatically identify DDCs and their reasons from providers' notes. The SEVR model effectively improved system performance, showing better generalization and robustness on unseen test data. Our work is an important step toward identifying reasons for drug discontinuation that will inform drug safety surveillance and pharmacovigilance.
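    As a rough illustration of the sentence-level evaluation described above, the sketch below scores a binary drug-discontinuation classifier with AUC-ROC, sensitivity, and specificity on a small imbalanced label set. Plain TF-IDF with logistic regression stands in for the SEVR representation, which is not reproduced here, and all sentences and labels are invented.

```python
# Minimal sketch: sentence-level detection of drug-discontinuation events,
# evaluated with AUC-ROC. TF-IDF + logistic regression is a stand-in for the
# paper's SEVR representation; the data are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, confusion_matrix

train_sents = [
    "Discontinue lisinopril due to persistent cough.",
    "Stopped metformin because of GI upset.",
    "Continue atorvastatin 40 mg nightly.",
    "Patient tolerating sertraline well, no changes.",
]
train_labels = [1, 1, 0, 0]           # 1 = discontinuation event in sentence

test_sents = [
    "Hold warfarin prior to procedure.",
    "Refill albuterol inhaler as needed.",
]
test_labels = [1, 0]

vec = TfidfVectorizer(ngram_range=(1, 2))
X_train = vec.fit_transform(train_sents)
X_test = vec.transform(test_sents)

clf = LogisticRegression().fit(X_train, train_labels)
scores = clf.predict_proba(X_test)[:, 1]

print("AUC-ROC:", roc_auc_score(test_labels, scores))
tn, fp, fn, tp = confusion_matrix(test_labels, (scores >= 0.5).astype(int)).ravel()
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```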

    Developmental Delays in Executive Function from 3 to 5 Years of Age Predict Kindergarten Academic Readiness

    Substantial evidence has established that individual differences in executive function (EF) in early childhood are uniquely predictive of children’s academic readiness at school entry. The current study tested whether growth trajectories of EF across the early childhood period could be used to identify a subset of children who were at pronounced risk for academic impairment in kindergarten. Using data that were collected at the age 3, 4, and 5 home assessments in the Family Life Project (N = 1,120), growth mixture models were used to identify 9% of children who exhibited impaired EF performance (i.e., persistently low levels of EF that did not show expected improvements across time). Compared to children who exhibited typical trajectories of EF, the delayed group exhibited substantial impairments in multiple indicators of academic readiness in kindergarten (Cohen’s ds = 0.9–2.7; odds ratios = 9.8–23.8). Although reduced in magnitude following control for a range of socioeconomic and cognitive (general intelligence screener, receptive vocabulary) covariates, moderate-sized group differences remained (Cohen’s ds = 0.2–2.4; odds ratios = 3.9–5.4). Results are discussed with respect to the use of repeated measures of EF as a method of early identification, as well as the resulting translational implications of doing so.
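    For readers unfamiliar with the effect-size metrics quoted above, the short sketch below computes Cohen's d (pooled standard deviation) and an odds ratio from summary statistics. All numbers are invented placeholders, not Family Life Project data.

```python
# Minimal sketch: Cohen's d (pooled SD) and an odds ratio from a 2x2 table.
# All numbers are invented placeholders, not Family Life Project data.
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Odds of the outcome in the exposed group relative to the unexposed group."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Hypothetical kindergarten readiness scores: typical vs. delayed-EF group
print("Cohen's d:", round(cohens_d(102.0, 14.0, 1019, 88.0, 15.0, 101), 2))

# Hypothetical counts of academic impairment by EF trajectory group
print("Odds ratio:", round(odds_ratio(40, 61, 65, 954), 1))
```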

    Paradoxical reversal learning enhancement by stress or prefrontal cortical damage: rescue with BDNF.

    Stress affects various forms of cognition. We found that moderate stress enhanced late reversal learning in a mouse touchscreen-based choice task. Ventromedial prefrontal cortex (vmPFC) lesions mimicked the effect of stress, whereas orbitofrontal and dorsolateral striatal lesions impaired reversal. Stress facilitation of reversal was prevented by BDNF infusion into the vmPFC. These findings suggest a mechanism by which stress-induced vmPFC dysfunction disinhibits learning by alternate (for example, striatal) systems.

    Screening Yield of HIV Antigen/Antibody Combination and Pooled HIV RNA Testing for Acute HIV Infection in a High-Prevalence Population

    Importance: Although acute HIV infection contributes disproportionately to onward HIV transmission, HIV testing has not routinely included screening for acute HIV infection. Objective: To evaluate the performance of an HIV antigen/antibody (Ag/Ab) combination assay to detect acute HIV infection compared with pooled HIV RNA testing. Design, Setting, and Participants: Multisite, prospective, within-individual comparison study conducted between September 2011 and October 2013 in 7 sexually transmitted infection clinics and 5 community-based programs in New York, California, and North Carolina. Participants were 12 years or older and seeking HIV testing, without known HIV infection. All participants with a negative rapid HIV test result were screened for acute HIV infection with an HIV Ag/Ab combination assay (index test) and pooled human immunodeficiency virus 1 (HIV-1) RNA testing. HIV RNA testing was the reference standard, with a positive reference standard result defined as detectable HIV-1 RNA on an individual RNA test. Main Outcomes and Measures: Number and proportion with acute HIV infections detected. Results: Among 86,836 participants with complete test results (median age, 29 years; 75.0% men; 51.8% men who have sex with men), established HIV infection was diagnosed in 1158 participants (1.33%) and acute HIV infection was diagnosed in 168 participants (0.19%). Acute HIV infection was detected in 134 participants with HIV Ag/Ab combination testing (0.15% [95% CI, 0.13%-0.18%]; sensitivity, 79.8% [95% CI, 72.9%-85.6%]; specificity, 99.9% [95% CI, 99.9%-99.9%]; positive predictive value, 59.0% [95% CI, 52.3%-65.5%]) and in 164 participants with pooled HIV RNA testing (0.19% [95% CI, 0.16%-0.22%]; sensitivity, 97.6% [95% CI, 94.0%-99.4%]; specificity, 100% [95% CI, 100%-100%]; positive predictive value, 96.5% [95% CI, 92.5%-98.7%]; sensitivity comparison, P < .001). Overall, HIV Ag/Ab combination testing detected 82% of acute HIV infections detectable by pooled HIV RNA testing. Compared with rapid HIV testing alone, HIV Ag/Ab combination testing increased the relative HIV diagnostic yield (both established and acute HIV infections) by 10.4% (95% CI, 8.8%-12.2%) and pooled HIV RNA testing increased the relative HIV diagnostic yield by 12.4% (95% CI, 10.7%-14.3%). Conclusions and Relevance: In a high-prevalence population, HIV screening using an HIV Ag/Ab combination assay following a negative rapid test detected 82% of acute HIV infections detectable by pooled HIV RNA testing, with a positive predictive value of 59%. Further research is needed to evaluate this strategy in lower-prevalence populations and in persons using preexposure prophylaxis for HIV prevention.
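    The screening-performance figures above follow from the standard 2x2 definitions; a brief sketch of that arithmetic is below. The true-positive and false-negative counts come from the reported 134 of 168 acute infections detected by Ag/Ab testing, while the false-positive and true-negative counts are illustrative placeholders chosen only to complete the table.

```python
# Minimal sketch: sensitivity, specificity, and PPV from a 2x2 screening table.
# TP and FN reflect the counts reported above (134 of 168 acute infections
# detected by the Ag/Ab assay); FP and TN are illustrative placeholders.
tp = 134          # acute infections detected by the Ag/Ab assay
fn = 168 - 134    # acute infections missed by the Ag/Ab assay
fp = 93           # placeholder: reactive Ag/Ab results without acute infection
tn = 85_000       # placeholder: non-reactive results without acute infection

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)

print(f"sensitivity = {sensitivity:.1%}")   # ~79.8%, matching the abstract
print(f"specificity = {specificity:.1%}")
print(f"PPV         = {ppv:.1%}")
```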

    Imaging of Glial Cell Activation and White Matter Integrity in Brains of Active and Recently Retired National Football League Players

    Importance: Microglia, the resident immune cells of the central nervous system, play an important role in the brain's response to injury and neurodegenerative processes. It has been proposed that prolonged microglial activation occurs after single and repeated traumatic brain injury, possibly through sports-related concussive and subconcussive injuries. Limited in vivo brain imaging studies months to years after individuals experience a single moderate to severe traumatic brain injury suggest widespread persistent microglial activation, but there has been little study of persistent glial cell activity in the brains of athletes with sports-related traumatic brain injury. Objective: To measure translocator protein 18 kDa (TSPO), a marker of activated glial cell response, in a cohort of National Football League (NFL) players and control participants, and to report measures of white matter integrity. Design, Setting, and Participants: This cross-sectional, case-control study included young active (n = 4) or former (n = 10) NFL players recruited from across the United States, and 16 age-, sex-, highest educational level-, and body mass index-matched control participants. This study was conducted at an academic research institution in Baltimore, Maryland, from January 29, 2015, to February 18, 2016. Main Outcomes and Measures: Positron emission tomography-based regional measures of TSPO using [11C]DPA-713, diffusion tensor imaging measures of regional white matter integrity, regional volumes on structural magnetic resonance imaging, and neuropsychological performance. Results: The mean (SD) ages of the 14 NFL participants and 16 control participants were 31.3 (6.1) years and 27.6 (4.9) years, respectively. Players reported a mean (SD) of 7.0 (6.4) years (range, 1-21 years) since the last self-reported concussion. Using [11C]DPA-713 positron emission tomographic data from 12 active or former NFL players and 11 matched control participants, the NFL players showed higher total distribution volume in 8 of the 12 brain regions examined (P < .004). We also observed limited change in white matter fractional anisotropy and mean diffusivity in 13 players compared with 15 control participants. In contrast, these young players did not differ from control participants in regional brain volumes or in neuropsychological performance. Conclusions and Relevance: The results suggest that localized brain injury and repair, indicated by higher TSPO signal and white matter changes, may be associated with NFL play. Further study is needed to confirm these findings and to determine whether TSPO signal and white matter changes in young NFL athletes are related to later onset of neuropsychiatric symptoms.

    The Critical Juncture Concept’s Evolving Capacity to Explain Policy Change

    This article examines the evolution of our understanding of the critical junctures concept. The concept finds its origins in historical institutionalism, where it was employed in the context of path dependence to account for sudden and jarring institutional or policy changes. We argue that the concept and the literature surrounding it—now incorporating ideas, discourse, and agency—have gradually become more comprehensive and nuanced as historical institutionalism was followed by ideational historical institutionalism and constructivist and discursive institutionalism. The prime position of contingency has been supplanted by the role of ideas and agency in explaining critical junctures and other instances of less than transformative change. Consequently, the concept is now capable of providing more comprehensive explanations for policy change.

    Detection of Acute HIV Infection in Two Evaluations of a New HIV Diagnostic Testing Algorithm — United States, 2011–2013

    The highly infectious phase of acute human immunodeficiency virus (HIV) infection, defined as the interval between the appearance of HIV RNA in plasma and the detection of HIV-1-specific antibodies, contributes disproportionately to HIV transmission. The current HIV diagnostic algorithm consists of a repeatedly reactive immunoassay (IA), followed by a supplemental test, such as the Western blot (WB) or indirect immunofluorescence assay (IFA). Because current laboratory IAs detect HIV infection earlier than supplemental tests, reactive IA results and negative supplemental test results very early in the course of HIV infection have been erroneously interpreted as negative. To address this problem, CDC has been evaluating a new HIV diagnostic algorithm. This report describes two evaluations of this algorithm. An HIV screening program at a Phoenix, Arizona, emergency department (ED) identified 37 undiagnosed HIV infections during July 2011-February 2013. Of these, 12 (32.4%) were acute HIV infections. An ongoing HIV testing study at three sites identified 99 cases with reactive IA and negative supplemental test results; 55 (55.6%) had acute HIV infection. CDC and many health departments recognize that confirmatory supplemental tests can give false-negative results early in the course of HIV infection. This problem can be resolved by testing for HIV RNA after a reactive IA result and a negative supplemental test result.
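    The testing algorithm described above is essentially a short decision procedure; a minimal sketch of that logic follows. The function name and result labels are illustrative, not CDC terminology.

```python
# Minimal sketch of the diagnostic algorithm described above: a reactive
# immunoassay (IA) followed by a supplemental test, with HIV RNA testing used
# to resolve reactive-IA / negative-supplemental discordance. Function and
# label names are illustrative, not CDC terminology.

def classify_hiv_status(ia_reactive: bool,
                        supplemental_positive: bool,
                        rna_detected: bool) -> str:
    if not ia_reactive:
        return "negative"                      # non-reactive screening IA
    if supplemental_positive:
        return "established HIV infection"     # IA and supplemental both positive
    # Reactive IA but negative supplemental test: do NOT report as negative;
    # HIV RNA testing resolves possible acute infection.
    return "acute HIV infection" if rna_detected else "negative (false-reactive IA)"

print(classify_hiv_status(ia_reactive=True, supplemental_positive=False, rna_detected=True))
```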

    Interactions between folate intake and genetic predictors of gene expression levels associated with colorectal cancer risk

    Observational studies have shown higher folate consumption to be associated with lower risk of colorectal cancer (CRC). Understanding whether and how genetic risk factors interact with folate could further elucidate the underlying mechanism. Aggregating functionally relevant genetic variants in set-based variant testing has higher power to detect gene-environment (G x E) interactions and may provide information on the underlying biological pathway. We investigated interactions between folate consumption and predicted gene expression on colorectal cancer risk across the genome. We used variant weights from the PrediXcan models of colon tissue-specific gene expression as a priori variant information for a set-based G x E approach. We harmonized total folate intake (mcg/day) based on dietary intake and supplement use across cohort and case-control studies and calculated sex- and study-specific quantiles. Analyses were performed using mixed-effects score tests for interactions between folate and the genetically predicted expression of 4839 genes with available prediction models. We pooled results across 23 studies for a total of 13,498 cases with colorectal tumors and 13,918 controls of European ancestry. We used a false discovery rate of 0.2 to identify genes with suggestive evidence of an interaction. We found suggestive evidence of interaction with folate intake on CRC risk for genes including Glutathione S-Transferase Alpha 1 (GSTA1; p = 4.3E-4), Tonsoku Like, DNA Repair Protein (TONSL; p = 4.3E-4), and Aspartylglucosaminidase (AGA; p = 4.5E-4). We identified three genes involved in preventing or repairing DNA damage that may interact with folate consumption to alter CRC risk. Glutathione is an antioxidant that prevents cellular damage; it is a downstream metabolite of homocysteine and is metabolized by GSTA1. TONSL is part of a complex that functions in the recovery of double-strand breaks, and AGA plays a role in the lysosomal breakdown of glycoproteins.
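    The "genetically predicted expression" used above is, in PrediXcan-style models, a weighted sum of variant dosages, and the 0.2 false discovery rate is a multiple-testing filter; a minimal sketch of both pieces follows. The variant IDs, weights, dosages, and p-values are invented, a Benjamini-Hochberg procedure is assumed for the FDR step, and the mixed-effects score test itself is not reproduced here.

```python
# Minimal sketch: (1) PrediXcan-style predicted expression as a weighted sum of
# variant dosages, and (2) a Benjamini-Hochberg-style FDR filter at 0.2.
# All values are invented; the folate x expression score test is not shown.

# (1) predicted expression for one gene in one individual
weights = {"rs111": 0.12, "rs222": -0.30, "rs333": 0.05}   # model weights (invented)
dosages = {"rs111": 2, "rs222": 1, "rs333": 0}             # 0/1/2 allele counts (invented)
predicted_expr = sum(w * dosages[snp] for snp, w in weights.items())
print("predicted expression:", predicted_expr)

# (2) Benjamini-Hochberg filter: flag genes with suggestive interaction evidence
def bh_reject(pvalues, fdr=0.2):
    """Return indices of p-values rejected at the given FDR (Benjamini-Hochberg)."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    threshold_rank = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * fdr:
            threshold_rank = rank
    return sorted(order[:threshold_rank])

gene_pvalues = [4.3e-4, 4.3e-4, 4.5e-4, 0.03, 0.40]   # invented gene-level p-values
print("suggestive genes (indices):", bh_reject(gene_pvalues))
```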