
    TOM40 Mediates Mitochondrial Dysfunction Induced by α-Synuclein Accumulation in Parkinson's Disease.

    Alpha-synuclein (α-Syn) accumulation/aggregation and mitochondrial dysfunction play prominent roles in the pathology of Parkinson's disease (PD). We have previously shown that postmortem human dopaminergic neurons from PD brains accumulate high levels of mitochondrial DNA (mtDNA) deletions. Here we addressed the question of whether alterations in a component of the mitochondrial import machinery, TOM40, might contribute to the mitochondrial dysfunction and damage in PD. For this purpose, we studied levels of TOM40, mtDNA deletions, oxidative damage, energy production, and complexes of the respiratory chain in brain homogenates as well as in single neurons, using laser capture microdissection, in transgenic mice overexpressing human wild-type α-Syn. Additionally, we used lentivirus-mediated stereotactic delivery of a component of this import machinery into mouse brain as a novel therapeutic strategy. We report here that TOM40 is significantly reduced in the brain of PD patients and in α-Syn transgenic mice. TOM40 deficits were associated with increased mtDNA deletions and oxidative DNA damage, and with decreased energy production and altered levels of complex I proteins in α-Syn transgenic mice. Lentivirus-mediated overexpression of Tom40 in α-Syn transgenic mouse brains ameliorated energy deficits as well as oxidative burden. Our results suggest that alterations in the mitochondrial protein transport machinery might contribute to mitochondrial impairment in α-synucleinopathies.

    A Stochastic Differential Equation Inventory Model

    Inventory for an item is replenished at a constant rate while simultaneously being depleted by demand that grows randomly and in relation to the inventory level. A stochastic differential equation is put forward to model this situation, with solutions derived where analytically possible. Probabilities of reaching a priori designated inventory levels from some initial level are considered. Finally, the existence of stable inventory states is investigated by solving the Fokker–Planck equation for the diffusion process at the steady state. Investigation of the stability properties of the Fokker–Planck equation reveals that a judicious choice of control strategy allows the inventory level to remain in a stable regime.
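
    To make the setup concrete, the sketch below simulates one plausible form of such an inventory SDE, dX_t = (r - theta*X_t) dt + sigma*X_t dW_t, with a constant replenishment rate r and demand proportional to the inventory level; the specific equation and all parameter values are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: Euler-Maruyama simulation of a *plausible* inventory SDE,
#   dX_t = (r - theta * X_t) dt + sigma * X_t dW_t,
# where r is a constant replenishment rate and demand grows in proportion to
# the inventory level X_t. The exact model in the paper may differ; all
# parameter values below are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

r, theta, sigma = 5.0, 0.5, 0.2   # hypothetical replenishment, demand and noise parameters
x0, T, dt = 8.0, 20.0, 1e-3       # initial inventory, horizon, step size
n_steps = int(T / dt)
n_paths = 2000

x = np.full(n_paths, x0)
target = 12.0                                # a priori designated inventory level
hit_target = np.zeros(n_paths, dtype=bool)   # did each path ever reach the designated level?

for _ in range(n_steps):
    dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)
    x = x + (r - theta * x) * dt + sigma * x * dw
    hit_target |= x >= target

print("P(reach target level) ~", hit_target.mean())
print("empirical long-run mean ~", x.mean(), " (drift balance r/theta =", r / theta, ")")
```

    Repeating the simulation over many paths gives Monte Carlo estimates of the probability of reaching a designated level, and the long-run histogram of paths can be compared against the stationary density obtained from the steady-state Fokker–Planck equation.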

    Secular Evolution and the Formation of Pseudobulges in Disk Galaxies

    We review internal processes of secular evolution in galaxy disks, concentrating on the buildup of dense central features that look like classical, merger-built bulges but that were made slowly out of disk gas. We call these pseudobulges. As an existence proof, we review how bars rearrange disk gas into outer rings, inner rings, and gas dumped into the center. In simulations, this gas reaches high densities that plausibly feed star formation. In the observations, many SB and oval galaxies show central concentrations of gas and star formation. Star formation rates imply plausible pseudobulge growth times of a few billion years. If secular processes built dense central components that masquerade as bulges, can we distinguish them from merger-built bulges? Observations show that pseudobulges retain a memory of their disky origin. They have one or more characteristics of disks: (1) flatter shapes than those of classical bulges, (2) large ratios of ordered to random velocities indicative of disk dynamics, (3) small velocity dispersions, (4) spiral structure or nuclear bars in the bulge part of the light profile, (5) nearly exponential brightness profiles, and (6) starbursts. These structures occur preferentially in barred and oval galaxies in which secular evolution should be rapid. So the cleanest examples of pseudobulges are recognizable. Thus a large variety of observational and theoretical results contribute to a new picture of galaxy evolution that complements hierarchical clustering and merging.
    Comment: 92 pages, 21 figures in 30 Postscript files; to appear in Annual Review of Astronomy and Astrophysics, Vol. 42, 2004, in press; for a version with full resolution figures, see http://chandra.as.utexas.edu/~kormendy/ar3ss.htm

    Adjusting for multiple prognostic factors in the analysis of randomised trials

    Background: When multiple prognostic factors are adjusted for in the analysis of a randomised trial, it is unclear (1) whether it is necessary to account for each of the strata formed by all combinations of the prognostic factors (stratified analysis) when randomisation has been balanced within each stratum (stratified randomisation), or whether adjusting for the main effects alone will suffice, and (2) which method of adjustment is best in terms of type I error rate and power, irrespective of the randomisation method. Methods: We used simulation to (1) determine whether a stratified analysis is necessary after stratified randomisation, and (2) compare different methods of adjustment in terms of power and type I error rate. We considered the following methods of analysis: adjusting for covariates in a regression model; adjusting for each stratum using either fixed or random effects; and Mantel-Haenszel or a stratified Cox model, depending on the outcome. Results: A stratified analysis is required after stratified randomisation to maintain correct type I error rates when (a) there are strong interactions between prognostic factors, and (b) there are approximately equal numbers of patients in each stratum. However, simulations based on real trial data found that type I error rates were unaffected by the method of analysis (stratified vs unstratified), indicating these conditions were not met in real datasets. Comparison of different analysis methods found that with small sample sizes and a binary or time-to-event outcome, most analysis methods led to either inflated type I error rates or a reduction in power; the lone exception was a stratified analysis using random effects for strata, which gave nominal type I error rates and adequate power. Conclusions: It is unlikely that a stratified analysis is necessary after stratified randomisation except in extreme scenarios. Therefore, the method of analysis (accounting for the strata, or adjusting only for the covariates) will not generally need to depend on the method of randomisation used. Most methods of analysis work well with large sample sizes; however, treating strata as random effects should be the analysis method of choice with binary or time-to-event outcomes and a small sample size.
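
    As an illustration of the comparison being made, the minimal simulation below (not the authors' code; the effect sizes, stratum structure and sample size are assumed purely for illustration) generates one trial with two binary prognostic factors, stratified randomisation and a binary outcome, then analyses it once adjusting only for the covariate main effects and once accounting for the full strata.

```python
# Hedged sketch (not the authors' code): one simulated trial with two binary
# prognostic factors, stratified randomisation and a binary outcome, analysed
# (a) adjusting for covariate main effects and (b) adjusting for the full
# strata (all factor combinations). All values are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400

f1 = rng.integers(0, 2, n)           # prognostic factor 1
f2 = rng.integers(0, 2, n)           # prognostic factor 2
stratum = f1 * 2 + f2                # 4 strata from all combinations

# stratified randomisation: balance treatment allocation within each stratum
treat = np.zeros(n, dtype=int)
for s in range(4):
    idx = np.where(stratum == s)[0]
    treat[idx] = rng.permutation(np.resize([0, 1], idx.size))

# outcome depends on both prognostic factors; here there is no true treatment effect
logit_p = -1.0 + 0.8 * f1 + 0.8 * f2
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

df = pd.DataFrame(dict(y=y, treat=treat, f1=f1, f2=f2, stratum=stratum))

main_effects = smf.logit("y ~ treat + f1 + f2", data=df).fit(disp=0)       # covariate adjustment
full_strata = smf.logit("y ~ treat + C(stratum)", data=df).fit(disp=0)     # stratified analysis

print("treatment p-value, covariate adjustment:", main_effects.pvalues["treat"])
print("treatment p-value, stratified analysis: ", full_strata.pvalues["treat"])
```

    Repeating this over many simulated trials and recording how often the treatment p-value falls below 0.05 yields the empirical type I error rate; a random-effects version of the stratified analysis would replace the fixed stratum terms with a random intercept per stratum.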

    A self-rating scale for patient-perceived side effects of inhaled corticosteroids

    BACKGROUND: Patient-reported side effect questionnaires offer a simple method for the systematic measurement of drug-related side effects. In order to measure patients' perceptions of inhaled corticosteroid (ICS)-related side effects, the 14-day retrospective Inhaled Corticosteroid Questionnaire (ICQ) was developed. In this research we aim to assess the construct validity and reliability of the ICQ and test its responsiveness to dose changes in adult asthma patients. METHODS: In a cross-sectional study, current inhaler users with asthma completed the ICQ (27 using a non-ICS inhaler; 61 on a low BDP-equivalent daily ICS dose of ≤400 μg; 62 on a mid dose of 401–800 μg; and 105 on a high dose of >800 μg). We generated 3 construct validity hypotheses: 1) a hierarchical dose-response pattern for scoring of the individual items on the ICQ, and statistically significant differences in the scores of each of the 15 ICQ domains by ICS dose group; 2) an association between ICS dose and ICQ scoring after adjusting for appropriate confounders in multiple regression; 3) greater convergence between local side effect domains than between systemic and local domains of the scale. Test-retest reliability was assessed on a randomly selected subgroup of patients (n = 73) who also completed the ICQ a second time after 7 days. In a separate longitudinal study, 61 patients with asthma completed the ICQ at baseline and after changing their daily ICS dose, at 2 and 6 months, in order to test the ICQ's responsiveness. RESULTS: All three construct validity hypotheses were well supported: 1) a statistically significant difference existed in scores for 14 domains, with the high ICS dose group scoring highest; 2) ICS dose independently predicted ICQ scoring after adjusting for confounders; 3) greater convergence existed between local ICQ domains than between local and systemic domains. The ICQ had good reproducibility: test-retest intraclass correlation coefficients were ≥0.69 for all but the 'Facial Oedema' domain. In the longitudinal study, ICQ scores for 'Voice Problems' changed significantly at 2 and 6 months from baseline, and other ICQ domains displayed trends in scoring change accordant with dose modulation at 6 months. CONCLUSION: The ICQ has good dose-related discriminative properties, is valid, reliable, and shows potential responsiveness to ICS dose change.
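
    For readers unfamiliar with the reliability statistic quoted above, the sketch below computes a test-retest intraclass correlation coefficient (a two-way random-effects, absolute-agreement ICC, often written ICC(2,1)) from two administrations of a single domain score; the patient scores are invented purely for illustration and are not the study data.

```python
# Hedged sketch: test-retest reliability of a questionnaire domain score via a
# two-way random-effects, absolute-agreement ICC (Shrout & Fleiss ICC(2,1)).
# The paper reports ICCs >= 0.69 for most domains; the scores below are made up.
import numpy as np

def icc_2_1(scores):
    """scores: (n_subjects, k_occasions) array of domain scores."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between-subjects mean square
    ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between-occasions mean square
    ss_e = ((x - grand) ** 2).sum() - (n - 1) * ms_r - (k - 1) * ms_c
    ms_e = ss_e / ((n - 1) * (k - 1))                            # residual mean square
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# illustrative data: 8 patients scoring one ICQ domain at baseline and 7 days later
baseline = [2, 5, 1, 4, 3, 6, 2, 5]
retest = [3, 5, 1, 4, 2, 6, 2, 4]
print("ICC(2,1) =", round(icc_2_1(np.column_stack([baseline, retest])), 3))
```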

    A review of elliptical and disc galaxy structure, and modern scaling laws

    A century ago, in 1911 and 1913, Plummer and then Reynolds introduced their models to describe the radial distribution of stars in `nebulae'. This article reviews the progress since then, providing both an historical perspective and a contemporary review of the stellar structure of bulges, discs and elliptical galaxies. The quantification of galaxy nuclei, such as central mass deficits and excess nuclear light, plus the structure of dark matter halos and cD galaxy envelopes, are discussed. Issues pertaining to spiral galaxies including dust, bulge-to-disc ratios, bulgeless galaxies, bars and the identification of pseudobulges are also reviewed. An array of modern scaling relations involving sizes, luminosities, surface brightnesses and stellar concentrations are presented, many of which are shown to be curved. These 'redshift zero' relations not only quantify the behavior and nature of galaxies in the Universe today, but are the modern benchmark for evolutionary studies of galaxies, whether based on observations, N-body simulations or semi-analytical modelling. For example, it is shown that some of the recently discovered compact elliptical galaxies at 1.5 < z < 2.5 may be the bulges of modern disc galaxies.
    Comment: Condensed version (due to Contract) of an invited review article to appear in "Planets, Stars and Stellar Systems" (www.springer.com/astronomy/book/978-90-481-8818-5). 500+ references incl. many somewhat forgotten, pioneer papers. Original submission to Springer: 07-June-201

    An in silico model of the ubiquitin-proteasome system that incorporates normal homeostasis and age-related decline

    BACKGROUND: The ubiquitin-proteasome system is responsible for homeostatic degradation of intact protein substrates as well as the elimination of damaged or misfolded proteins that might otherwise aggregate. During ageing there is a decline in proteasome activity and an increase in aggregated proteins. Many neurodegenerative diseases are characterised by the presence of distinctive ubiquitin-positive inclusion bodies in affected regions of the brain. These inclusions consist of insoluble, unfolded, ubiquitinated polypeptides that fail to be targeted and degraded by the proteasome. We are using a systems biology approach to try to determine the primary event in the decline in proteolytic capacity with age, and whether there is in fact a vicious cycle of inhibition, with accumulating aggregates further inhibiting proteolysis, prompting accumulation of aggregates, and so on. A stochastic model of the ubiquitin-proteasome system has been developed using the Systems Biology Markup Language (SBML). Simulations are carried out on the BASIS (Biology of Ageing e-Science Integration and Simulation) system, and the model output is compared to experimental data in which levels of ubiquitin and ubiquitinated substrates are monitored in cultured cells under various conditions. The model can be used to predict the effects of different experimental procedures, such as inhibition of the proteasome or shutting down the enzyme cascade responsible for ubiquitin conjugation. RESULTS: The model output shows good agreement with experimental data under a number of different conditions. However, our model predicts that monomeric ubiquitin pools are always depleted under conditions of proteasome inhibition, whereas experimental data show that monomeric pools were depleted in IMR-90 cells but not in ts20 cells, suggesting that cell lines vary in their ability to replenish ubiquitin pools and that ubiquitin turnover needs to be incorporated into the model. Sensitivity analysis of the model revealed which parameters have an important effect on protein turnover and aggregation kinetics. CONCLUSION: We have developed a model of the ubiquitin-proteasome system using an iterative approach of model building and validation against experimental data. Using SBML to encode the model ensures that it can be easily modified and extended as more data become available. Important aspects to be included in subsequent models are details of ubiquitin turnover, models of autophagy, the inclusion of a pool of short-lived proteins, and further details of the aggregation process.
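
    To illustrate the kind of stochastic kinetics such a model encodes, the sketch below runs a Gillespie-style simulation of a deliberately tiny ubiquitin-proteasome toy module; it is not the authors' SBML model, and the species, reactions and rate constants are illustrative assumptions only.

```python
# Hedged sketch: Gillespie-style stochastic simulation of a toy
# ubiquitin-proteasome module (free ubiquitin Ub, misfolded protein M,
# ubiquitinated substrate MUb, aggregate A). This is NOT the authors'
# SBML model; all reactions and rate constants are illustrative.
import numpy as np

rng = np.random.default_rng(42)

# state vector: [Ub, M, MUb, A]
state = np.array([500, 50, 0, 0])

def propensities(s, proteasome_activity=1.0):
    ub, m, mub, agg = s
    return np.array([
        5.0,                               # protein misfolding:        -> M
        0.01 * m * ub,                     # ubiquitination:     M + Ub -> MUb
        0.5 * mub * proteasome_activity,   # degradation:           MUb -> Ub (ubiquitin recycled)
        1e-4 * m * (m - 1),                # aggregation:         M + M -> A
    ])

# stoichiometry of each reaction on [Ub, M, MUb, A]
stoich = np.array([[0, 1, 0, 0],
                   [-1, -1, 1, 0],
                   [1, 0, -1, 0],
                   [0, -2, 0, 1]])

t, t_end = 0.0, 200.0
while t < t_end:
    props = propensities(state)           # lower proteasome_activity to mimic ageing
    total = props.sum()
    if total == 0:
        break
    t += rng.exponential(1.0 / total)     # time to next reaction
    reaction = rng.choice(4, p=props / total)
    state = state + stoich[reaction]

print("final [Ub, M, MUb, A] =", state)
```

    Lowering the proteasome_activity argument mimics the age-related decline in proteasome activity described above and lets one watch aggregate counts rise, which is the qualitative behaviour the full model is built to interrogate.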

    Safety and Immunogenicity of H5N1 Influenza Vaccine Based on Baculovirus Surface Display System of Bombyx mori

    Avian influenza virus (H5N1) has caused serious infections in human beings and has the potential to emerge as a pandemic threat in humans. Effective vaccines against H5N1 virus are needed. A recombinant Bombyx mori baculovirus, Bmgp64HA, was constructed to express the HA protein of H5N1 influenza virus displayed on the viral envelope surface. The HA protein accounted for approximately 3% of the total viral proteins in silkworm pupae infected with the recombinant virus. Using a series of separation and purification methods, pure Bmgp64HA virus was isolated from these silkworm pupae bioreactors. Aluminum hydroxide adjuvant was used to formulate an H5N1 influenza vaccine. Immunization with this vaccine at doses of 2 mg/kg and 0.67 mg/kg induced the production of neutralizing antibodies, which protected monkeys against influenza virus infection. At these doses, the vaccine induced 1:40 antibody titers in 50% and 67% of the monkeys, respectively. The results of safety evaluation indicated that the vaccine did not cause any toxicity at dosages as large as 3.2 mg/kg in cynomolgus monkeys and 1.6 mg/kg in mice. The results of dose safety evaluation indicated that the safe dose of the vaccine was higher than 0.375 mg/kg in rats and 3.2 mg/kg in cynomolgus monkeys. Our work suggests that this vaccine may be a highly effective, cheap, and safe influenza vaccine candidate for use in humans.