153 research outputs found

    Convective influx/glymphatic system: tracers injected into the CSF enter and leave the brain along separate periarterial basement membrane pathways

    Tracers injected into CSF pass into the brain alongside arteries and out again. This pathway has recently been termed the "glymphatic system", a model which proposes that tracers enter the brain along periarterial "spaces" and leave the brain along the walls of veins. The object of the present study was to test the hypotheses that (1) tracers from the CSF enter the cerebral cortex along pial-glial basement membranes, as there are no perivascular "spaces" around cortical arteries, and (2) tracers leave the brain along smooth muscle cell basement membranes that form the Intramural Peri-Arterial Drainage (IPAD) pathways for the elimination of interstitial fluid and solutes from the brain. Soluble, fluorescent, fixable amyloid β (Aβ; 2 μL of 100 μM) was injected into the CSF of the cisterna magna of 6-10 and 24-30 month-old male mice, and their brains were examined 5 and 30 min later. At 5 min, immunocytochemistry and confocal microscopy revealed Aβ on the outer aspects of cortical arteries, colocalized with α-2 laminin in the pial-glial basement membranes. At 30 min, Aβ was colocalized with collagen IV in smooth muscle cell basement membranes in the walls of cortical arteries, corresponding to the IPAD pathways. No evidence of drainage along the walls of veins was found. Measurements of the depth of penetration of tracer were taken from 11 regions of the brain; maximum depths were achieved in the pons and caudoputamen. We conclude that tracers injected into the CSF enter and leave the brain along separate periarterial basement membrane pathways. The exit route is along the IPAD pathways in which Aβ accumulates in cerebral amyloid angiopathy (CAA) in Alzheimer's disease. These results suggest that the CSF may be a suitable route for delivery of therapies for neurological diseases, including CAA.

    Synthesis and separation in the history of ‘nature’ and ‘nurture.’

    For much of the 20th century, scientific psychology treated the relative contributions of nature and nurture to the development of phenotypes as the result of two quite separate sources of influence. One, nature, was linked to biological perspectives, often manifest as "instinct", while the other, nurture, was taken to reflect psychological influences. We argue that this separation was contingent on historical circumstance. Prior to about 1920, several perspectives in biology and psychology promoted the synthesis of nature and nurture. But between 1930 and 1980 that synthetic consensus was lost in America as numerous influences converged to promote a view that identified psychological and biological aspects of mind and behavior as inherently separate. Around 1960, during the hegemony of behaviorism, Daniel Lehrman, Gilbert Gottlieb, and other pioneers of developmental psychobiology developed probabilistic epigenesis to reject predeterminist notions of instinct and restore a synthesis. We describe the earlier and later periods of synthesis and discuss several influences that led to the separation of nature and nurture in the middle of the 20th century.

    Validation of Floating Node Method Using Three-Point Bend Doubler Under Quasi-Static Loading

    The NASA Advanced Composite Project (ACP), an industry/government/university partnership, has embarked upon the task of developing technology that can aid in reducing the timeline for structural certification of aircraft composite parts using a combination of technologies, one of which is high fidelity damage progression computational methods. Phase II of this project included a task for validating an approach based on the Floating Node Method combined with Directional Cohesive Elements (FNM-DCZE). This paper compares predicted damage onset and growth in a three-point bend doubler specimen against experimental results. The sensitivity of the simulations to mesh refinement, as well as to key material properties and thermal effects, is studied and reported. Overall, the qualitative results suggest the main aspects of the damage progression have been captured, with the simulated damage morphology and sequence of events closely resembling what was observed experimentally. Quantitatively, the first load peak is predicted. However, the re-loading observed in the experiments after the first load peak is not captured numerically, suggesting further investigation may be worth pursuing.

    A Synergistic Approach for Evaluating Climate Model Output for Ecological Applications

    Increasing concern about the impacts of climate change on ecosystems is prompting ecologists and ecosystem managers to seek reliable projections of physical drivers of change. The use of global climate models in ecology is growing, although drawing ecologically meaningful conclusions can be problematic. The expertise required to access and interpret output from climate and earth system models is hampering progress in utilizing them most effectively to determine the wider implications of climate change. To address this issue, we present a joint approach between climate scientists and ecologists that explores key challenges and opportunities for progress. As an exemplar, our focus is the Southern Ocean, notable for significant change with global implications, and on sea ice, given its crucial role in this dynamic ecosystem. We combined perspectives to evaluate the representation of sea ice in global climate models. With an emphasis on ecologically relevant criteria (sea ice extent and seasonality), we selected a subset of eight models that reliably reproduce extant sea ice distributions. While the model subset shows a similar mean change to the full ensemble in sea ice extent (approximately 50% decline in winter and 30% decline in summer), there is a marked reduction in the range. This improved the precision of projected future sea ice distributions by approximately one third, making them more amenable to ecological interpretation. We conclude that careful multidisciplinary evaluation of climate models, in conjunction with ongoing modeling advances, should form an integral part of utilizing model output.
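
    As a minimal illustration of the screening step described above, the sketch below filters a hypothetical model ensemble by how well each member reproduces observed winter and summer sea ice extent, then compares the spread of projected change before and after screening. All model names, numbers, and the 20% tolerance are illustrative assumptions, not the study's actual criteria or data.

    ```python
    # Sketch: screen a climate-model ensemble by ecologically relevant sea ice
    # criteria (winter/summer extent), then compare projection spread before and
    # after screening. All numbers and model names below are illustrative only.
    import statistics

    # Hypothetical historical-period sea ice extents (10^6 km^2) per model.
    ensemble = {
        "ModelA": {"winter_extent": 18.1, "summer_extent": 3.0, "projected_change": -48.0},
        "ModelB": {"winter_extent": 12.4, "summer_extent": 0.9, "projected_change": -71.0},
        "ModelC": {"winter_extent": 18.9, "summer_extent": 3.4, "projected_change": -52.0},
        "ModelD": {"winter_extent": 25.2, "summer_extent": 6.8, "projected_change": -33.0},
    }
    OBS = {"winter_extent": 18.5, "summer_extent": 3.1}  # observed values (illustrative)
    TOLERANCE = 0.20                                     # accept models within +/-20% of obs

    def acceptable(model):
        """True if the model matches observations on every screening criterion."""
        return all(abs(model[k] - OBS[k]) / OBS[k] <= TOLERANCE for k in OBS)

    subset = {name: m for name, m in ensemble.items() if acceptable(m)}

    for label, pool in (("full ensemble", ensemble), ("screened subset", subset)):
        changes = [m["projected_change"] for m in pool.values()]
        print(f"{label}: mean change {statistics.mean(changes):.1f}%, "
              f"range {max(changes) - min(changes):.1f} points")
    ```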

    Enhancing Interdisciplinary Instruction in General and Special Education: Thematic Units and Technology

    This article discusses interdisciplinary thematic units in the context of special and general education curricula and focuses on ways technology can be used to enhance interdisciplinary thematic units. Examples of curriculum integration activities enhanced by technology are provided in the context of productivity tools, presentation and multimedia tools, contextual themed software, and Web-based activities.

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups. Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
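
    The trial allocated donors to groups (1:1:1) with a computer-based algorithm. The sketch below shows one standard way such an allocation can be implemented, permuted-block randomisation; the block size, seed, and group labels here are illustrative assumptions rather than the trial's published algorithm.

    ```python
    # Sketch: permuted-block 1:1:1 allocation to inter-donation-interval groups.
    # Block size of 6 and the fixed seed are illustrative assumptions; the
    # INTERVAL trial's actual computer-based algorithm is not described here.
    import random

    MEN_GROUPS = ["12-week", "10-week", "8-week"]  # standard vs shorter intervals

    def allocate(n_participants, groups, block_size=6, seed=42):
        """Yield one group label per participant using permuted blocks."""
        assert block_size % len(groups) == 0, "block must balance the groups"
        rng = random.Random(seed)
        allocations = []
        while len(allocations) < n_participants:
            block = groups * (block_size // len(groups))
            rng.shuffle(block)  # randomise order within each block
            allocations.extend(block)
        return allocations[:n_participants]

    arms = allocate(12, MEN_GROUPS)
    print(arms)  # balanced 4/4/4 split across the three interval groups
    ```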

    Facing Aggression: Cues Differ for Female versus Male Faces

    The facial width-to-height ratio (face ratio) is a sexually dimorphic metric associated with actual aggression in men and with observers' judgements of aggression in male faces. Here, we sought to determine if observers' judgements of aggression were associated with the face ratio in female faces. In three studies, participants rated photographs of female and male faces on aggression, femininity, masculinity, attractiveness, and nurturing. In Studies 1 and 2, for female and male faces, judgements of aggression were associated with the face ratio even when other cues in the face related to masculinity were controlled statistically. Nevertheless, correlations between the face ratio and judgements of aggression were smaller for female than for male faces (F(1,36) = 7.43, p = 0.01). In Study 1, there was no significant relationship between judgements of femininity and of aggression in female faces. In Study 2, the association between judgements of masculinity and aggression was weaker for female faces than it was for male faces in Study 1. The weaker association in female faces may be because aggression and masculinity are stereotypically male traits. Thus, in Study 3, observers rated faces on nurturing (a stereotypically female trait) and on femininity. Judgements of nurturing were associated with femininity (positively) and masculinity (negatively) ratings in both female and male faces. In summary, the perception of aggression differs in female versus male faces. The sex difference was not simply because aggression is a gendered construct; the relationships between masculinity/femininity and nurturing were similar for male and female faces even though nurturing is also a gendered construct. Masculinity and femininity ratings are associated neither with aggression ratings nor with the face ratio for female faces. In contrast, all four variables are highly inter-correlated in male faces, likely because these cues in male faces serve as "honest signals".
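
    The facial width-to-height ratio is conventionally computed as bizygomatic width divided by upper-face height (brow to upper lip). The sketch below computes it from a set of hypothetical landmark coordinates; the landmark names and pixel values are assumptions for illustration only.

    ```python
    # Sketch: compute the facial width-to-height ratio (face ratio) from facial
    # landmarks, using the conventional definition of bizygomatic width divided
    # by upper-face height. All landmark positions (x, y, in pixels) are invented.
    landmarks = {
        "left_zygion":  (112, 240),  # leftmost cheekbone point
        "right_zygion": (388, 242),  # rightmost cheekbone point
        "brow":         (250, 180),  # midpoint of the brow line
        "upper_lip":    (252, 360),  # top of the upper lip
    }

    def face_ratio(lm):
        width = abs(lm["right_zygion"][0] - lm["left_zygion"][0])  # bizygomatic width
        height = abs(lm["upper_lip"][1] - lm["brow"][1])           # upper-face height
        return width / height

    print(f"face ratio = {face_ratio(landmarks):.2f}")  # larger values = wider faces
    ```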

    High-levels of acquired drug resistance in adult patients failing first-line antiretroviral therapy in a rural HIV treatment programme in KwaZulu-Natal, South Africa.

    OBJECTIVE: To determine the frequency and patterns of acquired antiretroviral drug resistance in a rural primary health care programme in South Africa. DESIGN: Cross-sectional study nested within an HIV treatment programme. METHODS: Adult (≥18 years) HIV-infected individuals initially treated with a first-line stavudine- or zidovudine-based antiretroviral therapy (ART) regimen and with evidence of virological failure (one viral load >1000 copies/ml) were enrolled from 17 rural primary health care clinics. Genotypic resistance testing was performed using the in-house SATuRN/Life Technologies system. Sequences were analysed, and genotypic susceptibility scores (GSS) for standard second-line regimens were calculated using the Stanford HIVDB 6.0.5 algorithms. RESULTS: A total of 222 adults were successfully genotyped for HIV drug resistance between December 2010 and March 2012. The most common regimens at time of genotype were stavudine, lamivudine and efavirenz (51%); and stavudine, lamivudine and nevirapine (24%). Median duration of ART was 42 months (interquartile range (IQR) 32-53) and median duration of antiretroviral failure was 27 months (IQR 17-40). One hundred and ninety-one (86%) had at least one drug resistance mutation. For 34 individuals (15%), the GSS for the standard second-line regimen was <2, suggesting a significantly compromised regimen. In univariate analysis, individuals with a prior nucleoside reverse-transcriptase inhibitor (NRTI) substitution were more likely to have a GSS <2 than those on the same NRTIs throughout (odds ratio (OR) 5.70, 95% confidence interval (CI) 2.60-12.49). CONCLUSIONS: There are high levels of drug resistance in adults with failure of first-line antiretroviral therapy in this rural primary health care programme. Standard second-line regimens could potentially have had reduced efficacy in about one in seven adults.
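
    A regimen-level genotypic susceptibility score is typically obtained by summing per-drug activity scores (1 for a fully active drug, a fraction for partial activity, 0 for resistance). The sketch below illustrates that arithmetic and the GSS < 2 cut-off used above; the drug names and scores are invented examples, and the Stanford HIVDB 6.0.5 algorithm itself is not reproduced.

    ```python
    # Sketch: compute a genotypic susceptibility score (GSS) for a regimen by
    # summing per-drug activity scores. The resistance-level-to-score mapping
    # and the example values are illustrative assumptions only.

    # Hypothetical per-drug scores: 1.0 = fully active, 0.5 = partially active,
    # 0.0 = resistant (derived from the genotype in the real pipeline).
    drug_scores = {"zidovudine": 0.5, "tenofovir": 1.0, "lamivudine": 0.0,
                   "lopinavir/r": 1.0}

    second_line_regimen = ["tenofovir", "lamivudine", "lopinavir/r"]

    gss = sum(drug_scores[d] for d in second_line_regimen)
    print(f"regimen GSS = {gss:.1f}")
    if gss < 2:
        print("GSS < 2: regimen likely significantly compromised")
    else:
        print("GSS >= 2: at least two fully active drug-equivalents")
    ```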

    Surveillance of Transmitted Antiretroviral Drug Resistance among HIV-1 Infected Women Attending Antenatal Clinics in Chitungwiza, Zimbabwe

    The rapid scale-up of highly active antiretroviral therapy (HAART) and use of single dose Nevirapine (SD NVP) for prevention of mother-to-child transmission (pMTCT) have raised fears about the emergence of resistance to the first line antiretroviral drug regimens. A cross-sectional study was conducted to determine the prevalence of primary drug resistance (PDR) in a cohort of young (<25 yrs) HAART-naïve HIV pregnant women attending antenatal clinics in Chitungwiza, Zimbabwe. Whole blood was collected in EDTA for CD4 counts, viral load, serological estimation of duration of infection using the BED Calypte assay, and genotyping for drug resistance. Four hundred and seventy-one women (mean age 21 years; SD 2.1) were enrolled into the study between 2006 and 2007. Their median CD4 count was 371 cells/µL (IQR 255–511 cells/µL). Two hundred and thirty-six samples were genotyped for drug resistance. Based on the BED assay, 27% were recently infected (RI) whilst 73% had long-term infection (LTI). Median CD4 count was higher (p<0.05) in RI than in women with LTI. Only two women had drug resistance mutations: protease I85V and reverse transcriptase Y181C. The prevalence of PDR in Chitungwiza, 4 years after commencement of the national ART program, remained below the WHO threshold limit (5%). The frequency of recent infection detected by BED testing is consistent with high HIV acquisition during pregnancy. With the scale-up of long-term ART programs, maintenance of proper prescribing practices, continuous monitoring of patients, and reinforcement of adherence may prevent the acquisition and transmission of PDR.
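
    With 2 resistant genotypes among 236 sequenced, the point prevalence and a confidence interval can be checked against the WHO 5% threshold. The sketch below uses a Wilson score interval, which is an assumption about method; the paper's own interval calculation may differ.

    ```python
    # Sketch: point prevalence of primary drug resistance and a Wilson 95% CI,
    # compared against the WHO 5% threshold. The choice of the Wilson interval
    # is an illustrative assumption.
    from math import sqrt

    def wilson_ci(successes, n, z=1.96):
        """Wilson score interval for a binomial proportion."""
        p = successes / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return centre - half, centre + half

    resistant, genotyped = 2, 236
    p = resistant / genotyped
    lo, hi = wilson_ci(resistant, genotyped)
    print(f"PDR prevalence = {p:.2%} (95% CI {lo:.2%}-{hi:.2%})")
    print("below WHO 5% threshold" if hi < 0.05 else "CI crosses the 5% threshold")
    ```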

    Poorer White Matter Microstructure Predicts Slower and More Variable Reaction Time Performance: Evidence for a Neural Noise Hypothesis in a Large Lifespan Cohort

    Most prior research has focused on characterizing averages in cognition, brain characteristics, or behavior, and on attempting to predict differences in these averages among individuals. However, this overwhelming focus on mean levels may leave us with an incomplete picture of what drives individual differences in behavioral phenotypes by ignoring the variability of behavior around an individual's mean. In particular, enhanced white matter (WM) microstructure has been hypothesized to support consistent behavioral performance by decreasing Gaussian noise in signal transfer. Conversely, lower indices of WM microstructure are associated with greater within-subject variance in the ability to deploy performance-related resources, especially in clinical populations. We tested a mechanistic account of the "neural noise" hypothesis in a large adult lifespan cohort (Cambridge Centre for Ageing and Neuroscience) with over 2500 adults (ages 18-102; 1508 female; 1173 male; 2681 behavioral sessions; 708 MRI scans), using WM fractional anisotropy to predict mean levels and variability in reaction time performance on a simple behavioral task with a dynamic structural equation model. By modeling robust and reliable individual differences in within-person variability, we found support for a neural noise hypothesis (Kail, 1997), with lower fractional anisotropy predicting individual differences in separable components of behavioral performance, including slower mean responses and increased variability. These effects remained when including age, suggesting consistent effects of WM microstructure across the adult lifespan, distinct from concurrent effects of aging. Crucially, we show that variability can be reliably separated from mean performance using advanced modeling tools, enabling tests of distinct hypotheses for each component of performance.
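
    A full dynamic structural equation model is beyond a short example, but the core idea, separating each person's mean reaction time from their within-person variability and relating both to fractional anisotropy (FA), can be approximated in two stages. The sketch below does this on simulated data; the two-stage summary and all numbers are stand-in assumptions, not the study's analysis.

    ```python
    # Sketch: a two-stage stand-in for the dynamic structural equation model
    # described above. Stage 1 summarises each person's trial-level reaction
    # times (RTs) into a mean and a within-person SD; stage 2 correlates both
    # with white matter fractional anisotropy (FA). Data are simulated; the
    # real analysis estimates these quantities jointly, not in two stages.
    import random
    import statistics

    rng = random.Random(0)
    n_people, n_trials = 200, 50

    rows = []
    for _ in range(n_people):
        fa = rng.uniform(0.35, 0.55)  # simulated FA value
        # Lower FA -> slower mean RT and larger trial noise ("neural noise").
        mu = 600 - 400 * (fa - 0.45)
        sigma = 80 - 100 * (fa - 0.45)
        rts = [rng.gauss(mu, sigma) for _ in range(n_trials)]
        rows.append((fa, statistics.mean(rts), statistics.stdev(rts)))

    def pearson(xs, ys):
        """Pearson correlation between two equal-length sequences."""
        mx, my = statistics.mean(xs), statistics.mean(ys)
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
        return num / den

    fas, means, sds = zip(*rows)
    print(f"r(FA, mean RT)        = {pearson(fas, means):+.2f}  # expected negative")
    print(f"r(FA, RT variability) = {pearson(fas, sds):+.2f}  # expected negative")
    ```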