
    Inhibition of the Mitochondrial Permeability Transition for Cytoprotection: Direct versus Indirect Mechanisms

    Mitochondria are fascinating organelles that fulfill multiple cellular functions as diverse as energy production, fatty acid β-oxidation, reactive oxygen species (ROS) production and detoxification, and cell death regulation. The coordination of these functions relies on autonomous mitochondrial processes as well as on sustained cross-talk with other organelles and/or the cytosol. Mitochondrial functions must therefore be tightly regulated to ensure cell homeostasis. In many diseases (e.g., cancer, cardiopathies, nonalcoholic fatty liver diseases, and neurodegenerative diseases), mitochondria can receive harmful signals, become dysfunctional, and then participate in pathogenesis. They can undergo either a decrease in their bioenergetic function or a process called mitochondrial permeability transition (MPT) that can coordinate cell death execution. Many studies present evidence that protection of mitochondria limits disease progression and severity. Here, we review recent strategies to preserve mitochondrial functions via direct or indirect mechanisms of MPT inhibition. Thus, several mitochondrial proteins may be considered as targets for cytoprotective therapies.

    Soil Moisture and Porosity Affect the Abundance and Distribution of Ageratum houstonianum

    Introduction: Ageratum houstonianum is an herbaceous, drought-tolerant plant also known as Blue billygoat weed. It grows well in drained soil and shaded areas. Soil moisture and porosity are two abiotic factors that affect the abundance and distribution of A. houstonianum. Ideal growing conditions include greater soil moisture and porosity: higher porosity means a greater number of pores, which allows the soil to retain more water and thus more plant nutrients. The purpose of this research was to determine how soil moisture and porosity, along a gradient of distance from the tree, affect the abundance and distribution of A. houstonianum. Materials and Methods: The belt transect method was used to test soil moisture and porosity; three belts with four quadrats each were established. For each belt, the two quadrats closer to the tree were labeled zone 1, and the two farther quadrats were labeled zone 2. We hypothesized that abundance and distribution would increase farther from the tree. Abundance was calculated as density: the total number of individuals divided by quadrat area. Soil samples were collected to measure soil moisture and porosity. Paired two-sample t-tests and single-factor ANOVA tests were performed. Results and Conclusion: The t-tests showed differences in the relationships between abundance and moisture, abundance and porosity, and moisture and porosity. The ANOVA test compared the means of density, moisture, and porosity between zones 1 and 2 to determine whether they were statistically different. Based on the results, density decreased as distance from the tree increased. Soil moisture and porosity also decreased with distance from the tree, which rejected the hypothesis. Closer to the tree, there was an increase in moisture, density, and porosity, which led to greater abundance of A. houstonianum because the ideal conditions were met.
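    The density calculation and significance tests described above can be sketched as follows. The quadrat counts and the 1 m² quadrat area are hypothetical placeholders, not the study's data:

    ```python
    import math
    from statistics import mean, stdev

    def density(counts, quadrat_area_m2):
        """Abundance as density: individuals per unit quadrat area."""
        return [c / quadrat_area_m2 for c in counts]

    def paired_t(x, y):
        """Paired two-sample t statistic on matched quadrats."""
        d = [a - b for a, b in zip(x, y)]
        return mean(d) / (stdev(d) / math.sqrt(len(d)))

    def anova_f(*groups):
        """Single-factor ANOVA F statistic: between-group vs. within-group variance."""
        grand = mean([v for g in groups for v in g])
        ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
        ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
        df_between = len(groups) - 1
        df_within = sum(len(g) for g in groups) - len(groups)
        return (ss_between / df_between) / (ss_within / df_within)

    # Hypothetical plant counts per quadrat (assumed 1 m x 1 m quadrats):
    # zone 1 = quadrats nearer the tree, zone 2 = quadrats farther away.
    zone1 = density([18, 22, 20, 25, 19, 23], quadrat_area_m2=1.0)
    zone2 = density([10, 12, 9, 14, 11, 8], quadrat_area_m2=1.0)

    print("paired t =", round(paired_t(zone1, zone2), 2))
    print("ANOVA F =", round(anova_f(zone1, zone2), 2))
    ```

    With two groups, the one-way ANOVA F statistic is the square of the unpaired t statistic; the paired test additionally exploits the belt-wise pairing of quadrats.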

    The Ontologies of the LUISA Prototype, an Architecture Based on Semantic Web Services for Learning Resources

    This article describes the ontologies of an application designed to experiment with Semantic Web technologies, and in particular the intensive use of ontology-based Web Services. The chosen domain is online learning resources. The text presents the objectives and general architecture of the application, the ontologies shared across two case studies (one developed in an industrial setting, the other in an academic one), and the ontologies specific to each application case. It thus demonstrates the feasibility and value of shared models in the domain of learning resources. It concludes with a comparison with related projects, an analysis of the difficulties encountered, and the perspectives opened up.

    Long-term impact of hyperleukocytosis in newly diagnosed acute myeloid leukemia patients undergoing allogeneic stem cell transplantation: An analysis from the Acute Leukemia Working Party of the EBMT

    Up to 20% of acute myeloid leukemia (AML) patients present initially with hyperleukocytosis, placing them at increased risk for early mortality during induction. Yet it is unknown whether hyperleukocytosis still retains prognostic value for AML patients undergoing hematopoietic stem cell transplantation (HSCT), or whether it holds prognostic significance when modern molecular markers such as FLT3-ITD and NPM1 are accounted for. To determine whether hyperleukocytosis is an independent prognostic factor influencing outcome in transplanted AML patients, we performed a retrospective analysis using the registry of the Acute Leukemia Working Party of the European Society for Blood and Marrow Transplantation. A cohort of 357 patients with hyperleukocytosis (159 patients with white blood count [WBC] 50 K-100 K, 198 patients with WBC >= 100 K) was compared to 918 patients without hyperleukocytosis. Patients with hyperleukocytosis were younger, had an increased rate of favorable-risk cytogenetics, and were more likely to carry FLT3 and NPM1 mutations. In multivariate analysis, hyperleukocytosis was independently associated with increased relapse incidence (hazard ratio [HR] 1.55; 95% confidence interval [CI], 1.14-2.12; P = .004), decreased leukemia-free survival (HR 1.38; 95% CI, 1.07-1.78; P = .013), and inferior overall survival (HR 1.4; 95% CI, 1.07-1.84; P = .013). Hyperleukocytosis retains a significant prognostic role for AML patients undergoing HSCT.

    Dorsal-Ventral Differences in Retinal Structure in the Pigmented Royal College of Surgeons Model of Retinal Degeneration: Retinal Changes in the RCS With Age

    Retinitis pigmentosa is a family of inherited retinal degenerations associated with gradual loss of photoreceptors that ultimately leads to irreversible vision loss. The Royal College of Surgeons (RCS) rat carries a recessive mutation affecting Mer proto-oncogene tyrosine kinase (MerTK) that models autosomal recessive disease. The aim of this study was to understand the glial, microglial, and photoreceptor changes that occur in different retinal locations with advancing disease. Pigmented RCS rats (RCS-p+/LAV) and age-matched isogenic control rdy (RCS-rdy +p+/LAV) rats aged postnatal day 18 to 6 months were evaluated for in vivo retinal structure and function using optical coherence tomography and electroretinography. Retinal tissues were assessed using high-resolution immunohistochemistry to evaluate changes in photoreceptors, glia, and microglia in the dorsal and ventral retina. Photoreceptor dysfunction and death occurred from 1 month of age. There was a striking difference in photoreceptor loss between the dorsal and ventral retina, with a greater number of photoreceptors surviving in the dorsal retina, despite being adjacent to a layer of photoreceptor debris within the subretinal space. Loss of photoreceptors in the ventral retina was associated with fragmentation of the outer limiting membrane and extension of glial processes into the subretinal space, accompanied by possible adhesion and migration of mononuclear phagocytes there. Overall, these findings highlight that breakdown of the outer limiting membrane could play an important role in exacerbating photoreceptor loss in the ventral retina. Our results also highlight the value of using the RCS rat to model sectorial retinitis pigmentosa, a disease known to predominantly affect the inferior retina.

    Creating an appropriate tenure foundation for REDD+: The record to date and prospects for the future

    Attention to tenure is a fundamental step in preparation for REDD+ implementation. Unclear and conflicting tenure has been the main challenge faced by the proponents of subnational REDD+ initiatives, and accordingly, they have expended much effort to remedy the problem. This article assesses how well REDD+ has performed in laying an appropriate tenure foundation. Field research was carried out in two phases (2010-2012 and 2013-2014) in five countries (Brazil, Peru, Cameroon, Tanzania, Indonesia) at 21 subnational initiatives, 141 villages (half targeted for REDD+ interventions), and 3,754 households. Three questions are posed: 1) What was the effect of REDD+ on perceived tenure insecurity of village residents?; 2) What are the main reasons for change in the level of tenure insecurity and security from Phase 1 to Phase 2 perceived by village residents in control and intervention villages?; and 3) How do intervention village residents evaluate the impact of tenure-related interventions on community well-being? Among the notable findings are that: 1) tenure insecurity decreases slightly across the whole sample of villages, but we only find that REDD+ significantly reduces tenure insecurity in Cameroon, while actually increasing insecurity of smallholder agricultural land tenure in Brazil at the household level; 2) among the main reported reasons for increasing tenure insecurity (where it occurs) are problems with outside companies, lack of title, and competition from neighboring villagers; and 3) views on the effect of REDD+ tenure-related interventions on community well-being lean towards the positive, including for interventions that restrain access to forest. Thus, while there is little evidence that REDD+ interventions have worsened smallholder tenure insecurity (as feared by critics), there is also little evidence that the proponents' efforts to address tenure insecurity have produced results. 
    Work on tenure remains an urgent priority for safeguarding local livelihoods as well as for reducing deforestation. This will require increased attention to participatory engagement, improved reward systems, tenure policy reform, integration of national and local efforts, and "business-as-usual" interests. This research is part of CIFOR's Global Comparative Study on REDD+ (www.cifor.org/gcs). The funding partners that have supported this research include the Norwegian Agency for Development Cooperation (Norad) [grant numbers QZA-10/0468, QZA-12/0882, QZA-16/0110], the Australian Department of Foreign Affairs and Trade (DFAT) [grant numbers 46167, 63560], the European Commission (EC) [grant number DCI-ENV/2011/269-520], the International Climate Initiative (IKI) of the German Federal Ministry for the Environment, Nature Conservation, Building and Nuclear Safety (BMUB) [grant number KI II 7 - 42206-6/75], the United Kingdom Department for International Development (UKAID) [grant number TF069018], and the CGIAR Research Program on Forests, Trees and Agroforestry (CRP-FTA) [grant number TF No. 069018], with financial support from the donors contributing to the CGIAR Fund. David Solis provided a valuable service in reviewing our methods for taking into account attrition of households over time.

    CNS Involvement at Initial Diagnosis and Risk of Relapse After Allogeneic HCT for Acute Lymphoblastic Leukemia in First Complete Remission

    Outcomes of allogeneic hematopoietic cell transplantation (allo-HCT) for adult acute lymphoblastic leukemia (ALL) have improved over time. Studies have shown that total body irradiation (TBI) is the preferable type of myeloablative conditioning (MAC). However, outcomes based on central nervous system (CNS) involvement, namely CNS-positive versus CNS-negative, have not been compared. Here, we evaluated outcomes of 547 patients (CNS-positive = 96, CNS-negative = 451) who were allografted in first complete remission (CR1) between 2009 and 2019. The primary endpoint was leukemia-free survival (LFS). Median follow-up was not different between the CNS-positive and CNS-negative groups (79 versus 67.2 months, P = 0.58). The CNS-positive group was younger (median age 31.3 versus 39.7 years, P = 0.004) and was allografted more recently (median year 2012 versus 2010, P = 0.003). In both groups, MAC was the preferred approach (82.3% versus 85.6%, P = 0.41). On multivariate analysis, the CNS-positive group had a higher relapse incidence (RI) (hazard ratio [HR] = 1.58 [95% confidence interval (CI) = 1.06-2.35], P = 0.025), but no adverse effect on LFS (HR = 1.38 [95% CI = 0.99-1.92], P = 0.057) or overall survival (OS) (HR = 1.28 [95% CI = 0.89-1.85], P = 0.18). A subgroup multivariate analysis limited to CNS-positive patients showed that a TBI-based MAC regimen resulted in better LFS (HR = 0.43 [95% CI = 0.22-0.83], P = 0.01) and OS (HR = 0.44 [95% CI = 0.21-0.92], P = 0.03) and lower RI (HR = 0.35 [95% CI = 0.15-0.79], P = 0.01). A similar subgroup analysis in CNS-negative patients showed that MAC-TBI preparative regimens were also associated with a lower RI, without a benefit in LFS or OS. While a MAC-TBI allo-HCT regimen may not be suitable for all, particularly for older patients with comorbidities, this approach should be considered for patients who are deemed fit and able to tolerate it.