
    Are tangles as toxic as they look?

    Neurofibrillary tangles are intracellular accumulations of hyperphosphorylated and misfolded tau protein characteristic of Alzheimer's disease and other tauopathies. Classic cross-sectional studies of Alzheimer patient brains showed associations of tangle accumulation with neuronal loss, synapse loss, and dementia, which led to the supposition that tangles are toxic to neurons. More recent advances in imaging techniques and mouse models have allowed the direct exploration of the question of toxicity of aggregated versus soluble tau and have surprisingly challenged the view of tangles as toxic species in the brain. Here, we review these recent experiments on the nature of the toxicity of tau with particular emphasis on our experiments imaging tangles in the intact brain through a cranial window, which allows observation of tangle formation and longitudinal imaging of the fate of tangle-bearing neurons.
    Neurofibrillary tangles (NFT) were first described in 1906 by Alois Alzheimer based on Bielschowsky silver staining of the brain of his demented patient Auguste D (Alzheimer 1907; Goedert and Spillantini 2006). These intraneuronal aggregates have subsequently been found to be composed primarily of hyperphosphorylated tau protein and are definitive pathological lesions not only in Alzheimer's disease but also in a class of neurodegenerative tauopathies (Goedert et al. 1988; Spires-Jones et al. 2009). NFT pathology in Alzheimer's disease (AD) correlates closely with cognitive decline and synapse and neuronal loss (Braak and Braak 1997; Bretteville and Planel 2008; Congdon and Duff 2008; Mocanu et al. 2008b; Spires-Jones et al. 2009). As a result, NFT have long been considered indicative of impending neuronal cell death. More recent evidence, however, opposes this classical view. Here we review evidence addressing the question of whether NFT cause structural or functional neuronal damage.

    Intraocular Lens Unfurling Time Exponentially Decays with Increased Solution Temperature

    Erick E Rocher,1 Rishima Mukherjee,1 James Pitingolo,1 Eli Levenshus,1 Gwyneth Alexander,1 Minyoung Park,1 Rupsa Acharya,1 Sarah Khan,1 Jordan Shuff,1 Andres Aguirre,1 Shababa Matin,2 Keith Walter,3 Allen O Eghrari4
    1Center for Bioengineering Innovation and Design, Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD, USA; 2Rice 360 Institute for Global Health Technologies, Rice University, Houston, TX, USA; 3Department of Ophthalmology, Wake Forest Baptist Health, Winston-Salem, NC, USA; 4Department of Ophthalmology, Wilmer Eye Institute, Johns Hopkins University School of Medicine, Baltimore, MD, USA
    Correspondence: Allen O Eghrari, Department of Ophthalmology, Wilmer Eye Institute, Johns Hopkins University School of Medicine, 400 N Broadway, Smith 5013, Baltimore, MD, 21231, USA, Email [email protected]
    Intraocular lens (IOL) unfurling can be a rate-limiting step in cataract surgery, limiting operative efficiency. Furthermore, inefficient unfurling has important implications for clinical outcomes. We examine the effects of solution temperature on IOL unfurling time using three in vitro models of the ocular environment.
    Methods: IOLs were injected into a 6-well plate filled with balanced salt solution (BSS), dispersive ophthalmic viscoelastic device (OVD), or cohesive OVD. Experiments were also performed in a plastic eye filled with dispersive or cohesive OVD. IOL unfurling time was recorded against the temperature of the respective solution.
    Results: IOL unfurling time decayed exponentially as solution temperature increased in all experiments, including the BSS-filled 6-well plate, the OVD-filled 6-well plate, and the OVD-filled plastic eye. IOLs failed to unfurl within 10 min at 10°C, below the glass transition temperature of the tested IOLs. Increasing solution temperature from 20°C to 30°C decreased IOL unfurling time by more than 2 min. Further heating to 40°C did not significantly decrease IOL unfurling time.
    Conclusion: Increased solution temperature rapidly decreases IOL unfurling time in vitro. IOLs do not unfurl within a clinically acceptable timeframe at or below their glass transition temperature. Increased BSS and/or OVD temperature may be a potential method to decrease IOL unfurling time in cataract surgery. However, future research is needed to elucidate potential consequences of warmed BSS and/or OVD on post-operative outcomes. This study demonstrates the potential for temperature regulation to decrease cataract surgery operative time and provides preliminary evidence to justify future clinical validation of this relationship.
    Plain Language Summary: During cataract surgery, a prosthetic intraocular lens (IOL) is inserted into the eye once the clouded lens is removed. The IOL must then unfurl before the procedure can proceed. When IOLs fail to unfurl or unfurl slowly, this can delay the operation and may even cause post-operative complications. We therefore studied the effect of temperature on IOL unfurling time to optimize this segment of the operation. We injected IOLs into solutions of saline (balanced salt solution) or ophthalmic viscoelastic device (OVD), two fluids injected into the eye during surgery. In both a well plate and a plastic eye, we found that increasing the temperature of the solution significantly affected IOL unfurling time. Specifically, heating the solution from refrigeration to room temperature decreased unfurling time from over 10 min to less than 4 min. Heating to physiological temperature further decreased unfurling time to less than a minute. Our results show promise for potentially using heated BSS and/or OVD to accelerate IOL unfurling and decrease cataract surgery operative time.
    Keywords: cataract surgery, balanced salt solution, ophthalmic viscoelastic device
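    To illustrate the kind of exponential-decay relationship described above, the sketch below fits a model of the form t(T) = A·exp(−k·T) + c to hypothetical unfurling-time measurements. The data points, starting parameters, and fitted values are illustrative assumptions, not the study's measurements or its analysis code.

```python
# Hypothetical sketch: fitting an exponential decay of IOL unfurling time
# versus solution temperature, t(T) = A*exp(-k*T) + c.
# The (temperature, time) pairs below are illustrative placeholders,
# not measurements from the study.
import numpy as np
from scipy.optimize import curve_fit

def unfurl_time(T, A, k, c):
    """Exponential-decay model of unfurling time (s) as a function of temperature (deg C)."""
    return A * np.exp(-k * T) + c

# Placeholder observations loosely shaped like "minutes at room temperature,
# under a minute near physiological temperature".
temps_c = np.array([15.0, 20.0, 25.0, 30.0, 35.0, 40.0])
times_s = np.array([420.0, 230.0, 130.0, 80.0, 55.0, 45.0])

# Initial guess (A, k, c) is a rough assumption to help the fit converge.
params, _ = curve_fit(unfurl_time, temps_c, times_s, p0=(3000.0, 0.15, 30.0))
A, k, c = params
print(f"Fitted model: t(T) = {A:.0f}*exp(-{k:.3f}*T) + {c:.0f} seconds")
print(f"Predicted unfurling time at 37 deg C: {unfurl_time(37.0, *params):.0f} s")
```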

    Adaptation to bipolar disorder and perceived risk to children: a survey of parents with bipolar disorder

    BACKGROUND: Bipolar disorder (BPD) is a common condition associated with significant morbidity and reduced quality of life. In addition to challenges caused by their mood symptoms, parents affected with BPD harbor concerns about the mental health of their children. Among adult parents who perceive themselves to have BPD, this study aims to examine participants' coping methods; identify predictors of adaptation; assess parental perceptions of risks for mood disorders among their children; and describe the relationships among illness appraisals, coping, adaptation to one's own illness, and perceived risk to one's children.
    METHODS: Parents who self-identified as having BPD completed a web-based survey that assessed dispositional optimism, coping, perceived illness severity, perceived etiology of BPD, perceived risk to offspring, and adaptation to BPD. Participants had at least one unaffected child who was 30 years of age or younger.
    RESULTS: 266 parents were included in the analysis. 87% of parents endorsed a "somewhat greater" or "much greater" risk for mood disorders in their child(ren) than for someone without a family history. Endorsing a genetic/familial etiology of BPD was positively correlated with perceived risk for mood disorders in children (r_s = .3, p < 0.01) and with active coping with BPD (r = .2, p < 0.01). Increased active coping (β = 0.4, p < 0.001) and dispositional optimism (β = 0.3, p < 0.001) were positively associated with better adaptation, while denial coping was negatively associated with adaptation (β = −0.3, p < 0.001). Together these variables explained 55.2% of the variance in adaptation (F = 73.2, p < 0.001). Coping mediated the effect of perceived illness severity on adaptation.
    CONCLUSIONS: These data inform studies of interventions that extend beyond symptom management and aim to improve the psychological wellbeing of parents with BPD. Interventions targeted at illness perceptions and those aimed at enhancing coping should be studied for positive effects on adaptation. Parents with BPD may benefit from genetic counseling to promote active coping with their condition and to manage worry about perceived risk to their children.
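    As a rough sketch of the kind of multiple-regression analysis reported above (adaptation regressed on active coping, denial coping, and dispositional optimism), the example below uses simulated stand-in data; the variable names, simulated values, and resulting estimates are assumptions for illustration, not the survey data or the authors' analysis.

```python
# Hypothetical sketch of a multiple regression of adaptation on coping and optimism.
# The simulated data and resulting coefficients are stand-ins, not the survey data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 266  # sample size reported in the abstract

df = pd.DataFrame({
    "active_coping": rng.normal(size=n),
    "denial_coping": rng.normal(size=n),
    "optimism": rng.normal(size=n),
})
# Simulated outcome roughly mirroring the reported directions of association.
df["adaptation"] = (0.4 * df["active_coping"]
                    - 0.3 * df["denial_coping"]
                    + 0.3 * df["optimism"]
                    + rng.normal(scale=0.8, size=n))

X = sm.add_constant(df[["active_coping", "denial_coping", "optimism"]])
model = sm.OLS(df["adaptation"], X).fit()
print(model.summary())  # reports coefficients, R-squared, and the overall F statistic
```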

    Determination of ploidy level and nuclear DNA content in blueberry by flow cytometry

    The technique of DNA flow cytometry was used to study variation in DNA content among different ploidy levels, as well as among diploid species, of Vaccinium section Cyanococcus. In a sample of plants of varying ploidy level, the relative fluorescence intensity (RFI) of nuclei stained with propidium iodide was a function of the number of chromosome sets (x), as represented by the linear equation RFI = 3.7x - 2.3 (r² = 95%). The data indicated that DNA flow cytometry could be useful for the determination of ploidy level at the seedling stage in blueberry. They also suggest that "conventional polyploid evolution" has occurred in this section of the genus Vaccinium, with an increase in nuclear DNA content concurrent with the increase in chromosome number. The nuclear DNA content of diploid species of Vaccinium section Cyanococcus was estimated from the relationship of the observed RFI to an internal known DNA standard (trout red blood cells). A nested analysis of variance indicated significant variation among species, as well as among populations within species, in nuclear DNA content, although this variation was small compared to the variation among ploidy levels. The variation in nuclear DNA content corresponded to the phylogenetic relationships among species determined from previous studies.
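    As a rough illustration of how the reported calibration could be applied, the sketch below inverts the published regression RFI = 3.7x - 2.3 to estimate ploidy level from a measured relative fluorescence intensity. The function name and the example RFI readings are hypothetical; only the regression coefficients come from the abstract.

```python
# Hypothetical sketch: estimating blueberry ploidy level from relative
# fluorescence intensity (RFI) using the published calibration
# RFI = 3.7x - 2.3, where x is the number of chromosome sets.
# Everything except the regression coefficients is illustrative only.

def estimate_ploidy(rfi: float) -> int:
    """Invert RFI = 3.7x - 2.3 and round to the nearest whole chromosome-set count."""
    x = (rfi + 2.3) / 3.7
    return round(x)

if __name__ == "__main__":
    # Made-up RFI readings roughly matching diploid, tetraploid, and hexaploid plants.
    for rfi in (5.1, 12.6, 19.9):
        print(f"RFI = {rfi:5.1f}  ->  estimated ploidy level: {estimate_ploidy(rfi)}x")
```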