
    Descriptive Analysis of a Baseline Concussion Battery Among U.S. Service Academy Members: Results from the Concussion Assessment, Research, and Education (CARE) Consortium

    Introduction The prevalence and possible long-term consequences of concussion remain an increasing concern to the U.S. military, particularly as it pertains to maintaining a medically ready force. Baseline testing is used in both the civilian and military domains to assess concussion injury and recovery. Accurate interpretation of these baseline assessments requires one to consider other influencing factors not related to concussion. To date, there is limited understanding, especially within the military, of what factors influence normative test performance. Given the significant physical and mental demands placed on service academy members (SAMs), and their relatively high risk for concussion, it is important to describe the demographics and normative profile of SAMs. Furthermore, the absence of available baseline normative data on female and non-varsity SAMs makes interpretation of post-injury assessments challenging. Understanding how individuals perform at baseline, given their unique individual characteristics (e.g., concussion history, sex, competition level), will inform post-concussion assessment and management. Thus, the primary aim of this manuscript is to characterize the SAM population and determine normative values on a concussion baseline testing battery. Materials and Methods All data were collected as part of the Concussion Assessment, Research and Education (CARE) Consortium. The baseline test battery included a post-concussion symptom checklist (Sport Concussion Assessment Tool (SCAT)), a psychological health screening inventory (Brief Symptom Inventory (BSI-18)), a neurocognitive evaluation (ImPACT), the Balance Error Scoring System (BESS), and the Standardized Assessment of Concussion (SAC). Linear regression models were used to examine differences across sexes, competition levels, and varsity contact levels while controlling for academy, freshman status, race, and previous concussion.
Zero-inflated negative binomial models estimated symptom scores due to the high frequency of zero scores. Results Significant, but small, sex effects were observed on the ImPACT visual memory task. While females performed worse than males (p < 0.0001, ηp² = 0.01), these differences were small and not larger than the effects of the covariates. A similar pattern was observed for competition level on the SAC. There was a small, but significant, difference across competition levels: SAMs participating in varsity athletics did significantly worse on the SAC than SAMs participating in club or intramural athletics (all p's < 0.001, η² = 0.01). When examining symptom reporting, males were more than two times as likely to report zero symptoms on the SCAT or BSI-18. Intramural SAMs had the highest symptom number and severity compared to varsity SAMs (p < 0.0001, Cohen's d < 0.2). Contact level was not associated with SCAT or BSI-18 symptoms among varsity SAMs. Notably, the significant differences across competition levels on the SCAT and BSI-18 were sub-clinical and had small effect sizes. Conclusion The current analyses provide the first baseline concussion battery normative data among SAMs. While statistically significant differences may be observed on baseline tests, the effect sizes for competition and contact levels are very small, indicating that differences are likely not clinically meaningful at baseline. Identifying baseline differences and significant covariates is important for future concussion-related analyses and will inform concussion evaluations for all athlete levels.
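    Because the abstract judges clinical meaning by effect sizes (Cohen's d, η²) rather than p-values alone, a minimal sketch of the pooled-SD Cohen's d computation may be useful; the symptom scores below are invented for illustration, not CARE data.

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Standardized mean difference using the pooled sample SD."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Hypothetical symptom-severity scores for two competition levels
varsity = [2, 0, 1, 3, 0, 2, 1, 0]
intramural = [3, 1, 2, 4, 1, 3, 2, 1]
d = cohens_d(intramural, varsity)
print(round(d, 2))
```

On real baseline data the paper reports d < 0.2, i.e. well below the conventional "small" threshold, which is why the group differences are described as sub-clinical.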

    Impact of type 2 diabetes and the metabolic syndrome on myocardial structure and microvasculature of men with coronary artery disease

    Background: Type 2 diabetes and the metabolic syndrome are associated with impaired diastolic function and increased heart failure risk. Animal models and autopsy studies of diabetic patients implicate myocardial fibrosis, cardiomyocyte hypertrophy, altered myocardial microvascular structure and advanced glycation end-products (AGEs) in the pathogenesis of diabetic cardiomyopathy. We investigated whether type 2 diabetes and the metabolic syndrome are associated with altered myocardial structure, microvasculature, and expression of AGEs and receptor for AGEs (RAGE) in men with coronary artery disease. Methods: We performed histological analysis of left ventricular biopsies from 13 control, 10 diabetic and 23 metabolic syndrome men undergoing coronary artery bypass graft surgery who did not have heart failure or atrial fibrillation, had not received loop diuretic therapy, and did not have evidence of previous myocardial infarction. Results: All three patient groups had a similar extent of coronary artery disease and similar clinical characteristics, apart from differences in metabolic parameters. Diabetic and metabolic syndrome patients had higher pulmonary capillary wedge pressure than controls, and diabetic patients had reduced mitral diastolic peak velocity of the septal mitral annulus (E'), consistent with impaired diastolic function. Neither diabetic nor metabolic syndrome patients had increased myocardial interstitial fibrosis (picrosirius red), or increased immunostaining for collagen I and III, the AGE Nε-(carboxymethyl)lysine, or RAGE. Cardiomyocyte width, capillary length density, diffusion radius, and arteriolar dimensions did not differ between the three patient groups, whereas diabetic and metabolic syndrome patients had reduced perivascular fibrosis. Conclusions: Impaired diastolic function of type 2 diabetic and metabolic syndrome patients was not dependent on increased myocardial fibrosis, cardiomyocyte hypertrophy, alteration of the myocardial microvascular structure, or increased myocardial expression of Nε-(carboxymethyl)lysine or RAGE. These findings suggest that the increased myocardial fibrosis and AGE expression, cardiomyocyte hypertrophy, and altered microvasculature structure described in diabetic heart disease were a consequence, rather than an initiating cause, of cardiac dysfunction.
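    The biopsy measurements include both capillary length density and diffusion radius; one common stereological way to relate the two (not necessarily the authors' exact method) treats each capillary as supplying a Krogh-type tissue cylinder. A sketch with an invented length density:

```python
import math

def diffusion_radius(capillary_length_density_per_mm2):
    """Krogh-cylinder estimate: if L is capillary length per unit
    tissue volume (mm of capillary per mm^3, i.e. mm^-2), each
    capillary supplies a cylinder of radius r with pi * r^2 * L = 1."""
    return math.sqrt(1.0 / (math.pi * capillary_length_density_per_mm2))

# Hypothetical length density of 2500 mm/mm^3
r_mm = diffusion_radius(2500)
print(f"{r_mm * 1000:.1f} um")
```

Under this assumption, a higher capillary length density shrinks the diffusion radius, which is why the two quantities are typically reported together.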

    Association of Blood Biomarkers With Acute Sport-Related Concussion in Collegiate Athletes: Findings From the NCAA and Department of Defense CARE Consortium

    Importance: There is potential scientific and clinical value in validation of objective biomarkers for sport-related concussion (SRC). Objective: To investigate the association of acute-phase blood biomarker levels with SRC in collegiate athletes. Design, Setting, and Participants: This multicenter, prospective, case-control study was conducted by the National Collegiate Athletic Association (NCAA) and the US Department of Defense Concussion Assessment, Research, and Education (CARE) Consortium from February 20, 2015, to May 31, 2018, at 6 CARE Advanced Research Core sites. A total of 504 collegiate athletes with concussion, contact sport control athletes, and non-contact sport control athletes completed clinical testing and blood collection at preseason baseline, the acute postinjury period, 24 to 48 hours after injury, the point of reporting being asymptomatic, and 7 days after return to play. Data analysis was conducted from March 1 to November 30, 2019. Main Outcomes and Measures: Glial fibrillary acidic protein (GFAP), ubiquitin C-terminal hydrolase-L1 (UCH-L1), neurofilament light chain, and tau were quantified using the Quanterix Simoa multiplex assay. Clinical outcome measures included the Sport Concussion Assessment Tool-Third Edition (SCAT-3) symptom evaluation, Standardized Assessment of Concussion, Balance Error Scoring System, and Brief Symptom Inventory 18. Results: A total of 264 athletes with concussion (mean [SD] age, 19.08 [1.24] years; 211 [79.9%] male), 138 contact sport controls (mean [SD] age, 19.03 [1.27] years; 107 [77.5%] male), and 102 non-contact sport controls (mean [SD] age, 19.39 [1.25] years; 82 [80.4%] male) were included in the study. 
Athletes with concussion had significant elevation in GFAP (mean difference, 0.430 pg/mL; 95% CI, 0.339-0.521 pg/mL; P < .001), UCH-L1 (mean difference, 0.449 pg/mL; 95% CI, 0.167-0.732 pg/mL; P < .001), and tau levels (mean difference, 0.221 pg/mL; 95% CI, 0.046-0.396 pg/mL; P = .004) at the acute postinjury time point compared with preseason baseline. Longitudinally, a significant interaction (group × visit) was found for GFAP (F(7, 1507.36) = 16.18, P < .001), UCH-L1 (F(7, 1153.09) = 5.71, P < .001), and tau (F(7, 1480.55) = 6.81, P < .001); the interaction for neurofilament light chain was not significant (F(7, 1506.90) = 1.33, P = .23). The area under the curve for the combination of GFAP and UCH-L1 in differentiating athletes with concussion from contact sport controls at the acute postinjury period was 0.71 (95% CI, 0.64-0.78; P < .001); the acute postinjury area under the curve for all 4 biomarkers combined was 0.72 (95% CI, 0.65-0.79; P < .001). Beyond the SCAT-3 symptom score, GFAP at the acute postinjury time point was associated with the classification of athletes with concussion versus contact sport controls (β = 12.298; 95% CI, 2.776-54.481; P = .001) and versus non-contact sport controls (β = 5.438; 95% CI, 1.676-17.645; P = .005). Athletes with concussion with loss of consciousness or posttraumatic amnesia had significantly higher levels of GFAP than athletes with concussion with neither loss of consciousness nor posttraumatic amnesia at the acute postinjury time point (mean difference, 0.583 pg/mL; 95% CI, 0.369-0.797 pg/mL; P < .001). Conclusions and Relevance: The results suggest that blood biomarkers can be used as research tools to inform the underlying pathophysiological mechanism of concussion and provide additional support for future studies to optimize and validate biomarkers for potential clinical use in SRC.
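    The reported areas under the curve (e.g., 0.71 for GFAP plus UCH-L1) have a useful Mann-Whitney interpretation: the AUC is the probability that a randomly chosen concussed athlete's biomarker level exceeds that of a randomly chosen control. A self-contained sketch with invented GFAP values:

```python
def auc_from_scores(positives, negatives):
    """AUC via the Mann-Whitney U statistic: fraction of
    (positive, negative) pairs the positive wins; ties count 0.5."""
    wins = 0.0
    for p in positives:
        for n in negatives:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(positives) * len(negatives))

# Hypothetical acute GFAP levels (pg/mL), not CARE data
concussed = [0.9, 1.4, 0.7, 1.1, 0.6]
controls  = [0.4, 0.6, 0.5, 0.8, 0.3]
print(round(auc_from_scores(concussed, controls), 2))
```

An AUC of 0.5 would mean the biomarker carries no discriminative information; values around 0.7, as in the study, indicate moderate separation.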

    Bridging Alone: Religious Conservatism, Marital Homogamy, and Voluntary Association Membership

    This study characterizes the social insularity of religiously conservative American married couples by examining patterns of voluntary association membership. Constructing a dataset of 3938 marital dyads from the second wave of the National Survey of Families and Households, the author investigates whether conservative religious homogamy encourages membership in religious voluntary groups and discourages membership in secular voluntary groups. Results indicate that couples' shared affiliation with conservative denominations, paired with beliefs in biblical authority and inerrancy, increases the likelihood of religious group membership for husbands and wives and reduces the likelihood of secular group membership for wives, but not for husbands. The social insularity of conservative religious groups appears to be reinforced by homogamy, particularly by wives who share their husbands' faith.

    A cohort study to identify and evaluate concussion risk factors across multiple injury settings: findings from the CARE Consortium

    BACKGROUND: Concussion, or mild traumatic brain injury, is a major public health concern affecting 42 million individuals globally each year. However, little is known about concussion risk factors across all injury settings, as most concussion research has focused only on sport-related or military-related concussive injuries. METHODS: The current study is part of the Concussion Assessment, Research, and Education (CARE) Consortium, a multi-site investigation of the natural history of concussion. Cadets at three participating service academies completed annual baseline assessments, which included demographics, medical history, and concussion history, along with the Sport Concussion Assessment Tool (SCAT) symptom checklist and the Brief Symptom Inventory (BSI-18). Clinical and research staff recorded the date and injury setting at the time of concussion. Generalized mixed models estimated concussion risk with service academy as a random effect. Since concussion was a rare event, odds ratios were assumed to approximate relative risk. RESULTS: Beginning in 2014, 10,604 cadets (n = 2421, 22.83% female) enrolled over 3 years. A total of 738 (6.96%) cadets experienced a concussion; 301 (2.84%) of the concussed cadets were female. Female sex and previous concussion were the most consistent estimators of concussion risk across all injury settings. Compared to males, females had 2.02 (95% CI: 1.70-2.40) times the risk of a concussion regardless of injury setting, and a greater relative risk when the concussion occurred during sport (Odds Ratio (OR): 1.38, 95% CI: 1.07-1.78). Previous concussion was associated with 1.98 (95% CI: 1.65-2.37) times increased risk for any incident concussion, and the magnitude was relatively stable across all injury settings (OR: 1.73 to 2.01). Freshman status was also associated with increased overall concussion risk, but this was driven by increased risk for academy training-related concussions (OR: 8.17, 95% CI: 5.87-11.37).
A medical history of headaches in the past 3 months, diagnosed ADD/ADHD, and BSI-18 somatization symptoms increased overall concussion risk. CONCLUSIONS: Various demographic and medical history factors are associated with increased concussion risk. While certain factors (e.g., sex and previous concussion) are consistently associated with increased concussion risk regardless of injury setting, other factors significantly influence concussion risk within specific injury settings. Further research is required to determine whether these risk factors may aid in concussion risk reduction or prevention.
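    The methods note that, because concussion was a rare event, odds ratios were taken to approximate relative risk. A small sketch with invented 2x2 counts (roughly matching the ~7% incidence above) shows how close the two measures are in that regime:

```python
def odds_ratio(a, b, c, d):
    """2x2 table: a=exposed cases, b=exposed non-cases,
    c=unexposed cases, d=unexposed non-cases."""
    return (a / b) / (c / d)

def relative_risk(a, b, c, d):
    """Risk in the exposed group divided by risk in the unexposed."""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts: 200/2200 previously concussed cadets injured
# versus 500/8400 cadets without a prior concussion
a, b, c, d = 200, 2000, 500, 7900
print(round(odds_ratio(a, b, c, d), 2), round(relative_risk(a, b, c, d), 2))
```

When the outcome is rare, b ≈ a + b and d ≈ c + d, so the two ratios nearly coincide; with common outcomes the OR overstates the RR.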

    A Delphi study and ranking exercise to support commissioning services: future delivery of thrombectomy services in England

    Background: Intra-arterial thrombectomy is the gold-standard treatment for large artery occlusive stroke. However, the evidence of its benefits is almost entirely based on trials delivered by experienced neurointerventionists working in established teams in neuroscience centres. Those responsible for the design and prospective reconfiguration of services need access to a comprehensive and complementary array of information on which to base their decisions. This will help to ensure that the effects demonstrated in trials can be realised in practice, accounting for regional/local variations in resources and skill-sets. One approach to elucidating the implementation preferences and considerations of key experts is a Delphi survey. To support commissioning decisions, we aimed to use an electronic Delphi survey to establish consensus on the options for future organisation of thrombectomy services among physicians with clinical experience in managing large artery occlusive stroke. Methods: A Delphi survey was developed with 12 options for future organisation of thrombectomy services in England. A purposive sampling strategy established an expert panel of stroke physicians from the British Association of Stroke Physicians (BASP) Clinical Standards and/or Executive Membership who deliver 24/7 intravenous thrombolysis. Options with aggregate scores falling within the lowest quartile were removed from the subsequent Delphi round. Options reaching consensus after the two Delphi rounds were then ranked in a final exercise by both the wider BASP membership and the British Society of Neuroradiologists (BSNR). Results: Eleven stroke physicians from BASP completed the initial two Delphi rounds.
Three options achieved consensus, with the wider BASP (97%, n=43) and BSNR (86%, n=21) members subsequently assigning the highest approval rankings in the final exercise to transferring large artery occlusive stroke patients to the nearest neuroscience centre for thrombectomy based on local CT/CT angiography. Conclusions: The initial Delphi rounds ensured optimal reduction of options by an expert panel of stroke physicians, while the subsequent ranking exercise allowed the remaining options to be ranked by a wider group of stroke experts to reach consensus. The preferred implementation option for thrombectomy is conveying suspected stroke patients for CT/CT angiography and secondary transfer of large artery occlusive stroke patients to the nearest neuroscience centre.
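    The Delphi rounds dropped options whose aggregate scores fell in the lowest quartile. A toy sketch of that elimination rule, with invented option names and scores (the real panel scored 12 options, but not these):

```python
def eliminate_lowest_quartile(option_scores):
    """Drop options whose aggregate score falls in the lowest quartile
    of all scores (simple rank-based cutoff, one Delphi round)."""
    ranked = sorted(option_scores.values())
    cutoff = ranked[len(ranked) // 4]  # first-quartile boundary score
    return {opt: s for opt, s in option_scores.items() if s >= cutoff}

# Hypothetical aggregate panel scores for 12 options
scores = {f"option_{i}": s for i, s in enumerate(
    [72, 65, 40, 88, 55, 30, 91, 60, 47, 78, 35, 82])}
surviving = eliminate_lowest_quartile(scores)
print(len(scores) - len(surviving))  # options removed this round
```

Iterating this rule across rounds, as the study did, progressively concentrates panel attention on the stronger options before the final ranking exercise.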

    WASH for WORMS: a cluster-randomized controlled trial of the impact of a community integrated water, sanitation, and hygiene and deworming intervention on soil-transmitted helminth infections

    Water, sanitation, and hygiene (WASH) interventions have been proposed as an important complement to deworming programs for sustainable control of soil-transmitted helminth (STH) infections. We aimed to determine whether a community-based WASH program had additional benefits in reducing STH infections compared with community deworming alone. We conducted the WASH for WORMS cluster-randomized controlled trial in 18 rural communities in Timor-Leste. Intervention communities received a WASH intervention that provided access to an improved water source, promoted improved household sanitation, and encouraged handwashing with soap. All eligible community members in intervention and control arms received albendazole every 6 months for 2 years. The primary outcomes were infection with each STH, measured using multiplex real-time quantitative polymerase chain reaction. We compared outcomes between study arms using generalized linear mixed models, accounting for clustering at community, household, and individual levels. At study completion, the integrated WASH and deworming intervention did not have an effect on infection with Ascaris spp. (relative risk [RR] 2.87, 95% confidence interval [CI]: 0.66-12.48, P = 0.159) or Necator americanus (RR 0.99, 95% CI: 0.52-1.89, P = 0.987), compared with deworming alone. At the last follow-up, open defecation was practiced by 66.1% (95% CI: 54.2-80.2) of respondents in the control arm versus 40.2% (95% CI: 25.3-52.6) of respondents in the intervention arm (P = 0.005). We found no evidence that the WASH intervention resulted in additional reductions in STH infections beyond that achieved with deworming alone over the 2-year trial period. The effect of WASH on STH infections over a longer time period, and in the absence of deworming, remains to be determined.
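    The trial reports relative risks with 95% confidence intervals. A standard way to compute such an interval by hand is a Wald CI on the log-RR scale; the counts below are invented for illustration and are not the trial's GLMM-adjusted estimates:

```python
import math

def relative_risk_ci(a, n1, c, n0, z=1.96):
    """Relative risk with a Wald 95% CI on the log scale.
    a/n1 = events/total in the intervention arm, c/n0 in the control arm."""
    rr = (a / n1) / (c / n0)
    se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    lo = rr * math.exp(-z * se_log_rr)
    hi = rr * math.exp(z * se_log_rr)
    return rr, lo, hi

# Hypothetical infection counts in the two arms
rr, lo, hi = relative_risk_ci(30, 300, 30, 310)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI that spans 1.0, as in this toy example and in both of the trial's primary outcomes, is consistent with no detectable intervention effect.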

    Predicting Maximum Tree Heights and Other Traits from Allometric Scaling and Resource Limitations

    Terrestrial vegetation plays a central role in regulating the carbon and water cycles and in adjusting planetary albedo. As such, a clear understanding and accurate characterization of vegetation dynamics is critical to understanding and modeling the broader climate system. Maximum tree height is an important feature of forest vegetation because it is directly related to the overall scale of many ecological and environmental quantities and is an important indicator for understanding several properties of plant communities, including total standing biomass and resource use. We present a model that predicts local maximal tree height across the entire continental United States, in good agreement with data. The model combines scaling laws, which encode the average, baseline behavior of many tree characteristics, with energy budgets constrained by local resource limitations, such as precipitation, temperature, and solar radiation. In addition to predicting maximum tree height in an environment, our framework can be extended to predict how other tree traits, such as stomatal density, depend on these resource constraints. Furthermore, it offers predictions for the relationship between height and whole-canopy albedo, which is important for understanding the Earth's radiative budget, a critical component of the climate system. Because our model focuses on dominant features, which are represented by a small set of mechanisms, it can be easily integrated into more complicated ecological or climate models. Funding: National Science Foundation (U.S.) (Research Experience for Undergraduates stipend); Gordon and Betty Moore Foundation; National Science Foundation (U.S.) (Graduate Research Fellowship Program); Massachusetts Institute of Technology Presidential Fellowship; Eugene V. and Clare Thaw Charitable Trust; Engineering and Physical Sciences Research Council; National Science Foundation (U.S.) (PHY0202180); Colorado College (Venture Grant Program).
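    The model's core idea, combining an allometric scaling baseline with a local resource cap, can be caricatured in a few lines. The constants, the D^(2/3) height-diameter form, and the linear water-demand rule below are purely illustrative assumptions, not the paper's fitted model:

```python
def allometric_height(trunk_diameter_m, k=34.0, exponent=2 / 3):
    """Illustrative height-diameter scaling law, H = k * D^(2/3).
    k and the exponent are toy constants, not fitted values."""
    return k * trunk_diameter_m ** exponent

def resource_limited_height(precip_mm, demand_mm_per_m):
    """Toy resource cap: height is sustainable only while assumed
    annual water demand (proportional to height) stays within
    annual precipitation supply."""
    return precip_mm / demand_mm_per_m

# Predicted maximum height is the lower of the two constraints
h = min(allometric_height(1.0), resource_limited_height(1200, 20))
print(h)
```

In wet environments the allometric baseline binds (as here), while in dry ones the resource cap takes over, which is the qualitative behavior the paper's far richer energy-budget model formalizes.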

    Pipeline for Large-Scale Microdroplet Bisulfite PCR-Based Sequencing Allows the Tracking of Hepitype Evolution in Tumors

    Cytosine methylation provides an epigenetic level of cellular plasticity that is important for development, differentiation, and carcinogenesis. We applied microdroplet PCR to bisulfite-treated target DNA in combination with second-generation sequencing to simultaneously assess DNA sequence and methylation. We show measurement of methylation status in a wide range of target sequences (34 kb in total) with an average coverage of 95% (median 100%) and good correlation to the opposite strand (ρ = 0.96) and to pyrosequencing (ρ = 0.87). Data from lymphoma and colorectal cancer samples for SNRPN (an imprinted gene), FGF6 (demethylated in the cancer samples), and HS3ST2 (methylated in the cancer samples) serve as a proof of principle, showing the integration of SNP data and phased DNA-methylation information into "hepitypes" and thus the analysis of DNA methylation phylogeny in the somatic evolution of cancer.
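    Bisulfite treatment leaves methylated cytosines as C while converting unmethylated cytosines so they read as T, so per-site methylation can be estimated from read counts. A minimal sketch on toy reads (a real pipeline such as the one described must also handle strand, SNPs, and conversion efficiency):

```python
def methylation_level(reads, position):
    """Fraction of reads with C (methylated) out of reads with C or T
    (informative bases) at a cytosine position after bisulfite
    conversion; returns None if the site has no coverage."""
    c = sum(1 for r in reads if r[position] == "C")
    t = sum(1 for r in reads if r[position] == "T")
    return c / (c + t) if c + t else None

# Toy reads covering one CpG site at index 2
reads = ["ATCGA", "ATTGA", "ATCGA", "ATCGA"]
print(methylation_level(reads, 2))
```

Combining such per-site calls with the SNP alleles observed on the same reads is what allows phased "hepitypes" to be assembled.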