Mitochondrial complex 1 activity measured by spectrophotometry is reduced across all brain regions in ageing and more specifically in neurodegeneration
Mitochondrial function, in particular complex 1 of the electron transport chain (ETC), has been shown to decrease during normal ageing and in neurodegenerative disease. However, there is some debate concerning which area of the brain has the greatest complex 1 activity. It is important to identify the pattern of activity in order to gauge the effect of age- or disease-related changes. We determined complex 1 activity spectrophotometrically in the cortex, brainstem and cerebellum of middle-aged mice (70–71 weeks), a cerebellar ataxic neurodegeneration model (pcd5J) and young wild type controls. We share our updated protocol for measuring complex 1 activity and find that mitochondrial fractions isolated from frozen tissue retain robust, measurable activity. We show that complex 1 activity is clearly highest in the cortex when compared with brainstem and cerebellum (p<0.003). Cerebellum and brainstem mitochondria exhibit similar levels of complex 1 activity in wild type brains. In the aged brain we see similar levels of complex 1 activity in all three brain regions. The specific activity of complex 1 measured in the aged cortex is significantly decreased when compared with controls (p<0.0001). Both the cerebellum and brainstem mitochondria also show significantly reduced activity with ageing (p<0.05). The mouse model of ataxia predictably has a lower complex 1 activity in the cerebellum, and although reductions are measured in the cortex and brainstem, the remaining activity is higher than in the aged brains. We present clear evidence that complex 1 activity decreases across the brain with age and much more specifically in the cerebellum of the pcd5J mouse. Mitochondrial impairment can be a region-specific phenomenon in disease, but in ageing appears to affect the entire brain, abolishing the pattern of higher activity in cortical regions.
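As a rough illustration of how a spectrophotometric complex 1 assay is typically quantified (this is a generic sketch, not the authors' published protocol, and the readings below are invented): activity is commonly followed as the rotenone-sensitive rate of NADH oxidation at 340 nm, converted to a specific activity via the Beer–Lambert law using the NADH extinction coefficient of 6.22 mM⁻¹ cm⁻¹.

```python
# Generic spectrophotometric rate calculation for an NADH-linked assay.
# Illustrative only; numbers are hypothetical, not data from the study.

NADH_EXTINCTION = 6.22   # mM^-1 cm^-1 for NADH at 340 nm (Beer-Lambert)
PATH_LENGTH_CM = 1.0     # standard 1 cm cuvette

def specific_activity(delta_a340_per_min, protein_mg_per_ml,
                      rotenone_rate_per_min=0.0):
    """Return specific activity in nmol NADH min^-1 (mg protein)^-1.

    delta_a340_per_min    -- total rate of A340 decrease
    rotenone_rate_per_min -- residual rate with rotenone present
                             (non-complex-1 NADH oxidation, subtracted)
    """
    sensitive_rate = delta_a340_per_min - rotenone_rate_per_min
    # mM NADH oxidised per min equals umol/mL/min; x1000 gives nmol/mL/min
    nmol_per_ml_min = sensitive_rate / (NADH_EXTINCTION * PATH_LENGTH_CM) * 1000
    return nmol_per_ml_min / protein_mg_per_ml

# hypothetical readings: faster total rate, small rotenone-insensitive rate
print(round(specific_activity(0.062, 0.5, 0.012), 1))  # 16.1
```

The rotenone subtraction is the key step: only the inhibitor-sensitive portion of the rate is attributed to complex 1.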
Bioavailability in soils
The consumption of locally-produced vegetables by humans may be an important exposure pathway for soil contaminants in many urban settings and for agricultural land use. Hence, prediction of metal and metalloid uptake by vegetables from contaminated soils is an important part of the Human Health Risk Assessment procedure. The behaviour of metals (cadmium, chromium, cobalt, copper, mercury, molybdenum, nickel, lead and zinc) and metalloids (arsenic, boron and selenium) in contaminated soils depends to a large extent on the intrinsic charge, valence and speciation of the contaminant ion, and on soil properties such as pH, redox status and contents of clay and/or organic matter. However, the chemistry and behaviour of the contaminant in soil alone cannot predict soil-to-plant transfer. Root uptake, root selectivity, ion interactions, rhizosphere processes, leaf uptake from the atmosphere, and plant partitioning are important processes that ultimately govern the accumulation of metals and metalloids in edible vegetable tissues. Mechanistic models to accurately describe all these processes have not yet been developed, let alone validated under field conditions. Hence, to estimate risks from vegetable consumption, empirical models have been used to correlate concentrations of metals and metalloids in contaminated soils, soil physico-chemical characteristics, and concentrations of elements in vegetable tissues. These models should only be used within the bounds of their calibration, and often need to be re-calibrated or validated using local soil and environmental conditions on a regional or site-specific basis.
Mike J. McLaughlin, Erik Smolders, Fien Degryse, and Rene Rietr
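The empirical transfer models described above are often log-linear regressions of plant tissue concentration on soil concentration and soil properties such as pH. The sketch below shows the general shape of such a model; all data and fitted coefficients are invented for illustration and are not calibrated values from any study.

```python
# A minimal sketch of a log-linear soil-to-plant transfer model of the form
#   log10(Cd_plant) = a + b*log10(Cd_soil) + c*pH
# Calibration data here are hypothetical, for illustration only.
import numpy as np

# hypothetical data: soil Cd (mg/kg), soil pH, vegetable Cd (mg/kg dry weight)
soil_cd  = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 1.0, 2.0, 4.0])
ph       = np.array([5.0, 5.0, 5.0, 5.0, 5.0, 7.0, 7.0, 7.0])
plant_cd = np.array([0.40, 0.70, 1.20, 2.10, 3.60, 0.25, 0.45, 0.80])

# ordinary least squares on the log-transformed model
X = np.column_stack([np.ones_like(ph), np.log10(soil_cd), ph])
a, b, c = np.linalg.lstsq(X, np.log10(plant_cd), rcond=None)[0]

def predict_plant_cd(cd_soil_mg_kg, soil_ph):
    """Predicted vegetable Cd (mg/kg DW); valid only inside calibration range."""
    return 10 ** (a + b * np.log10(cd_soil_mg_kg) + c * soil_ph)

print(predict_plant_cd(2.0, 6.0))  # interpolation within the fitted range
```

The fitted pH coefficient is negative here (higher pH, lower uptake), matching the general behaviour described for cationic metals; the review's caveat applies directly, since extrapolating such a fit outside its calibration range is not defensible.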
The Familial Intracranial Aneurysm (FIA) study protocol
BACKGROUND: Subarachnoid hemorrhage (SAH) due to ruptured intracranial aneurysms (IAs) occurs in about 20,000 people per year in the U.S., and nearly half of the affected persons are dead within the first 30 days. Survivors of ruptured IAs are often left with substantial disability. Thus, primary prevention of aneurysm formation and rupture is of paramount importance. Prior studies indicate that genetic factors are important in the formation and rupture of IAs. The long-term goal of the Familial Intracranial Aneurysm (FIA) Study is to identify genes that underlie the development and rupture of intracranial aneurysms (IAs). METHODS/DESIGN: The FIA Study includes 26 clinical centers which have extensive experience in the clinical management and imaging of intracranial aneurysms. 475 families with affected sib pairs or with multiple affected relatives will be enrolled through retrospective and prospective screening of potential subjects with an IA. After giving informed consent, the proband or their spokesperson invites other family members to participate. Each participant is interviewed using a standardized questionnaire which covers medical history, social history and demographic information. In addition, blood is drawn from each participant for DNA isolation and immortalization of lymphocytes. High-risk family members without a previously diagnosed IA undergo magnetic resonance angiography (MRA) to identify asymptomatic unruptured aneurysms. A 10 cM genome screen will be performed to identify FIA susceptibility loci. Due to the significant mortality of affected individuals, novel approaches are employed to reconstruct the genotype of critical deceased individuals. These include the intensive recruitment of the spouses and children of deceased, affected individuals. DISCUSSION: A successful, adequately-powered genetic linkage study of IA is challenging given the very high, early mortality of ruptured IA.
Design features in the FIA Study that address this challenge include recruitment at a large number of highly active clinical centers, comprehensive screening and recruitment techniques, non-invasive vascular imaging of high-risk subjects, genome reconstruction of deceased affected individuals using marker data from closely related family members, and inclusion of environmental covariates in the statistical analysis.
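The logic behind reconstructing a deceased parent's genotype from spouse and offspring marker data can be illustrated with a toy example (this is a deliberately simplified sketch, not the FIA Study's statistical method): at a given marker, each child inherits exactly one allele from each parent, so any child allele that the surviving spouse could not have contributed must have come from the deceased parent.

```python
# Toy illustration of obligate-allele inference at a single marker.
# Not the study's actual reconstruction method, which is statistical and
# works over many linked markers.

def deceased_obligate_alleles(spouse, children):
    """spouse: tuple of 2 alleles; children: list of 2-allele tuples.
    Returns the set of alleles the deceased parent must have carried."""
    proven = set()
    for a1, a2 in children:
        if a1 == a2:
            proven.add(a1)          # homozygote: one copy from each parent
            continue
        non_spousal = {a for a in (a1, a2) if a not in spouse}
        if len(non_spousal) == 2:
            # spouse could have given neither allele: pedigree inconsistent
            raise ValueError("child genotype inconsistent with spouse")
        proven |= non_spousal       # spouse must have given the other allele
    if len(proven) > 2:
        raise ValueError("more than two obligate alleles: pedigree error")
    return proven

# spouse is 1/2; children typed 1/3 and 2/4 prove the deceased carried 3 and 4
print(deceased_obligate_alleles((1, 2), [(1, 3), (2, 4)]))  # {3, 4}
```

With enough typed children, both of the deceased parent's alleles can often be pinned down at a marker; the remaining ambiguity is what the study's likelihood-based linkage analysis must integrate over.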
Comparison of Therapeutic Effects between Pulsed and Continuous Wave 810-nm Wavelength Laser Irradiation for Traumatic Brain Injury in Mice
Background and Objective
Transcranial low-level laser therapy (LLLT) using near-infrared light can efficiently penetrate through the scalp and skull and could allow non-invasive treatment for traumatic brain injury (TBI). In the present study, we compared the therapeutic effect using 810-nm wavelength laser light in continuous and pulsed wave modes in a mouse model of TBI.
Study Design/Materials and Methods
TBI was induced by a controlled cortical-impact device. At 4 hours post-TBI, one group received a sham treatment and three groups received a single exposure to transcranial LLLT, either continuous wave or pulsed at 10 Hz or 100 Hz with a 50% duty cycle. An 810-nm Ga-Al-As diode laser delivered a spot with a diameter of 1 cm onto the injured head with a power density of 50 mW/cm2 for 12 minutes, giving a fluence of 36 J/cm2. Neurological severity score (NSS) and body weight were measured up to 4 weeks. Mice were sacrificed at 2, 15 and 28 days post-TBI and the lesion size was histologically analyzed. The quantity of ATP production in the brain tissue was determined immediately after laser irradiation. We examined the role of LLLT on the psychological state of the mice at 1 day and 4 weeks after TBI using the tail suspension test and forced swim test.
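The reported dose parameters are internally consistent, which a quick dimensional check confirms: fluence is simply power density integrated over exposure time.

```python
# Dimensional check of the reported LLLT dose:
# 50 mW/cm2 delivered for 12 minutes should give the stated 36 J/cm2.
power_density_w_per_cm2 = 0.050      # 50 mW/cm2
exposure_s = 12 * 60                 # 12 minutes in seconds

fluence_j_per_cm2 = power_density_w_per_cm2 * exposure_s
print(fluence_j_per_cm2)  # 36.0
```

Note that with a 50% duty cycle, the pulsed groups receive the same average power density (and hence the same fluence) only if the peak power is doubled; the abstract quotes average values.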
Results
The 810-nm laser pulsed at 10 Hz was the most effective, judged by improvement in NSS and body weight, although the other laser regimens were also effective. The brain lesion volume of mice treated with 10-Hz pulsed-laser irradiation was significantly lower than that of the control group at 15 days and 4 weeks post-TBI. Moreover, we found an antidepressant effect of LLLT at 4 weeks as shown by the forced swim and tail suspension tests.
Conclusion
The therapeutic effect of LLLT for TBI with an 810-nm laser was greater at a 10-Hz pulse frequency than in continuous wave mode or at 100 Hz. This finding may provide a new insight into the biological mechanisms of LLLT.
National Institutes of Health (U.S.) (NIH grant R01AI050875); Center for Integration of Medicine and Innovative Technology (DAMD17-02-2-0006); United States. Dept. of Defense. Congressionally Directed Medical Research Programs (W81XWH-09-1-0514); United States. Air Force Office of Scientific Research (Military Photomedicine Program (FA9950-04-1-0079)); Japan. Ministry of Education, Culture, Sports, Science and Technology; Japan Society for the Promotion of Science
Rhizobium leguminosarum bv. trifolii in soils amended with heavy metal contaminated sewage sludges.
Soils from a well-controlled field experiment were screened for the presence and number of cells of Rhizobium leguminosarum bv. trifolii capable of effectively nodulating the host plant, white clover (Trifolium repens). Soils had been amended with anaerobically-digested or undigested sewage sludge at rates of 0, 100 and 300 m3 ha−1 yr−1 on plots of differing pH since 1980 and up to the present. Applications of anaerobically-digested sludge included additions with or without heavy metal salts. Rhizobium was present in all of the treatments, apart from the most metal-contaminated treatment in the soil of lower pH, despite the absence of the host plant from the field sward. Lack of nodulation and nitrogen fixation (acetylene reduction activity) for T. repens growing in soils was, in some cases, probably caused by the high concentrations of extractable nitrate present, as plants subsequently grown in N-free media were effectively nodulated. Important effects on the size of the effective rhizobial population were apparent in relation to the soil pH, sludge type and addition rates, and the concentration of heavy metals present.