
    Blue intensity from a tropical conifer’s annual rings for climate reconstruction: an ecophysiological perspective

    This research was funded by the National Science Foundation of the USA (research grants AGS 12-03818 and AGS 13-03976), with additional funding from the Lamont-Doherty Earth Observatory’s Climate Center and Climate and Life initiatives.

    We developed Blue Intensity (BI) measurements from the crossdated ring sequences of Fokienia hodginsii (family Cupressaceae) from central Vietnam. BI has been used primarily as an indirect proxy for latewood (LW) density of conifers (LWBI) from high-latitude, temperature-limited boreal forests; as such, it closely approximates maximum latewood density (MXD) measurements made from soft X-ray densitometry. The less commonly used earlywood (EW) BI (EWBI) represents the minimum density of EW and is influenced by the lighter pixels from the vacuoles or lumens of cells. The correlation of our BI measurements with climate, strongest for EWBI, rivals that of total ring width (RW), and we demonstrate that EWBI can be successfully employed as an independent predictor in reconstruction models. EWBI exhibits robust spatial correlations with winter and spring land temperature, sea surface temperature (SST) over the regional ENSO domain, and the Standardized Precipitation Evapotranspiration Index (SPEI) over Indochina. To mitigate the effects of color changes at the heartwood–sapwood boundary, we also calculated ΔBI (EWBI − LWBI), which likewise exhibits a significant (p < 0.05), temporally stable response to prior autumn (Oct–Nov) rainfall and winter (December–April) dry-season temperature. We interpret this response as reflecting a potential cavitation defense: reducing lumen diameter safeguards hydraulic conductivity in the stem and prevents the xylem from imploding under negative pressure. This study has wide implications for the further use of BI across the global tropics, though it is unclear how many tropical tree species will prove suitable. It seems very likely that other wood anatomical measurements can be combined with BI and RW for climate reconstruction.
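    The key derived quantity here is ΔBI, the per-ring difference between earlywood and latewood blue intensity. Below is a minimal Python sketch of that calculation and of correlating the result with a climate target; the series, variable names, and values are hypothetical placeholders, not the study's data.

    ```python
    import numpy as np

    # Hypothetical crossdated series, one value per year (illustrative only).
    years = np.arange(1900, 2001)
    rng = np.random.default_rng(0)
    ewbi = rng.normal(60, 5, years.size)              # earlywood blue intensity
    lwbi = rng.normal(40, 5, years.size)              # latewood blue intensity
    dry_season_temp = rng.normal(24, 1, years.size)   # Dec-Apr mean temperature (degC)

    # Delta BI: subtracting LWBI removes the color trend shared by both measurements
    # across the heartwood-sapwood boundary.
    delta_bi = ewbi - lwbi

    # Correlation of the candidate proxy with the climate target.
    r = np.corrcoef(delta_bi, dry_season_temp)[0, 1]
    print(f"Pearson r(delta BI, Dec-Apr temperature) = {r:.2f}")
    ```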

    Outcome after allogeneic stem cell transplantation with haploidentical versus HLA-matched donors in patients with higher-risk MDS.

    Allogeneic hematopoietic stem cell transplantation (allo-SCT) remains the best curative option for higher-risk myelodysplastic syndrome (MDS). The presence of monosomal karyotype and/or complex karyotype abnormalities predicts inferior survival after allo-SCT in MDS patients. Haploidentical allo-SCT has been increasingly used in acute leukemia (AL), with results similar to those obtained with HLA-matched donors, but data on higher-risk MDS are sparse. We compared outcomes in 266 patients with higher-risk MDS transplanted from an HLA-matched sibling donor (MSD, n = 79), an HLA-matched unrelated donor (MUD, n = 139), or an HLA-haploidentical donor (HID, n = 48) from 2010 to 2019. Median donor age differed between the three groups (p < 0.001). Overall survival (OS) differed significantly between the three groups, with better OS observed in the MUD group (p = 0.014). This observation could be explained by higher progression-free survival with MUD (p = 0.014). The cumulative incidence of grade 2-4 acute GvHD was higher in the HID group (p = 0.051). However, in multivariable analysis, patients transplanted using an HID had mortality comparable to that of patients transplanted using a MUD (subdistribution hazard ratio [sHR]: 0.58 [0.32-1.07]; p = 0.080) or an MSD (sHR: 0.56 [0.28-1.11]; p = 0.094). MUD did not remain a significant positive predictor of survival, suggesting that beyond donor-recipient HLA matching, donor age might impact recipient outcome.
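    The multivariable comparison in this abstract is a survival regression with donor type and donor age as covariates. Below is a minimal sketch of the analogous analysis with a standard Cox proportional hazards model (the reported sHRs come from a competing-risks model, which this sketch does not reproduce); the dataset, column names, and values are hypothetical.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 266

    # Hypothetical per-patient data; MSD is the reference donor group.
    donor = rng.choice(["MSD", "MUD", "HID"], size=n)
    df = pd.DataFrame({
        "os_months": rng.exponential(30, n),            # follow-up time
        "death": rng.integers(0, 2, n),                 # 1 = died
        "donor_MUD": (donor == "MUD").astype(int),
        "donor_HID": (donor == "HID").astype(int),
        "donor_age": rng.integers(20, 65, n),
    })

    # Multivariable Cox model: hazard of death by donor type, adjusted for donor age.
    cph = CoxPHFitter().fit(df, duration_col="os_months", event_col="death")
    print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
    ```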

    Repositioning of the global epicentre of non-optimal cholesterol

    High blood cholesterol is typically considered a feature of wealthy western countries(1,2). However, dietary and behavioural determinants of blood cholesterol are changing rapidly throughout the world(3), and countries are using lipid-lowering medications at varying rates. These changes can have distinct effects on the levels of high-density lipoprotein (HDL) cholesterol and non-HDL cholesterol, which have different effects on human health(4,5). However, trends in HDL and non-HDL cholesterol levels over time have not previously been reported in a global analysis. Here we pooled 1,127 population-based studies that measured blood lipids in 102.6 million individuals aged 18 years and older to estimate trends from 1980 to 2018 in mean total, non-HDL and HDL cholesterol levels for 200 countries. Globally, there was little change in total or non-HDL cholesterol from 1980 to 2018. This was a net effect of increases in low- and middle-income countries, especially in east and southeast Asia, and decreases in high-income western countries, especially those in northwestern Europe, and in central and eastern Europe. As a result, the countries with the highest levels of non-HDL cholesterol (a marker of cardiovascular risk) changed from those in western Europe such as Belgium, Finland, Greenland, Iceland, Norway, Sweden, Switzerland and Malta in 1980 to those in Asia and the Pacific, such as Tokelau, Malaysia, the Philippines and Thailand. In 2017, high non-HDL cholesterol was responsible for an estimated 3.9 million (95% credible interval 3.7-4.2 million) worldwide deaths, half of which occurred in east, southeast and south Asia. The global repositioning of lipid-related risk, with non-optimal cholesterol shifting from a distinct feature of high-income countries in northwestern Europe, north America and Australasia to one that affects countries in east and southeast Asia and Oceania, should motivate the use of population-based policies and personal interventions to improve nutrition and enhance access to treatment throughout the world.
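    As a rough illustration of how a country-level trend in mean non-HDL cholesterol could be summarized from pooled study means, here is a toy weighted least-squares fit; the actual analysis uses a Bayesian hierarchical model, and all numbers below are hypothetical.

    ```python
    import numpy as np

    # Hypothetical study-level means of non-HDL cholesterol (mmol/L) for one country.
    study_year = np.array([1982, 1990, 1997, 2004, 2011, 2018])
    study_mean = np.array([4.1, 4.0, 3.8, 3.7, 3.5, 3.4])
    study_n    = np.array([1200, 800, 2500, 1500, 3000, 2200])   # participants per study

    # Weighted least-squares slope: change in mean non-HDL cholesterol per year,
    # weighting each study by the square root of its sample size.
    slope, intercept = np.polyfit(study_year, study_mean, deg=1, w=np.sqrt(study_n))
    print(f"trend: {slope:+.4f} mmol/L per year")
    ```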

    Temporal trends in receipt of adequate lymphadenectomy in bladder cancer 1988 to 2010.

    INTRODUCTION AND OBJECTIVE: The importance of pelvic lymphadenectomy (LND) for diagnostic and therapeutic purposes at the time of radical cystectomy (RC) for bladder cancer is well documented. Although some debate remains on the optimal number of lymph nodes to remove, 10 nodes has been proposed as constituting an adequate LND. We used data from the Surveillance, Epidemiology, and End Results (SEER) database to examine predictors of, and temporal trends in, the receipt of an adequate LND at the time of RC for bladder cancer.
    MATERIAL AND METHODS: Within the SEER database, we extracted data on all patients with nonmetastatic bladder cancer who underwent RC in the years 1988 to 2010. First, we assessed the proportion of individuals undergoing RC who received an adequate LND (≥10 nodes removed) over time. Second, we calculated odds ratios (ORs) of receiving an adequate LND using logistic regression modeling to compare study periods. Covariates included sex, race, age, region, tumor stage, urban vs. rural location, and insurance status.
    RESULTS: Among the 5,696 individuals undergoing RC during the years 1988 to 2010, 2,576 (45.2%) received an adequate LND. Over the study period, the proportion of individuals receiving an adequate LND increased from 26.4% to 61.3%. The odds of receiving an adequate LND increased over the study period; a patient undergoing RC in 2008 to 2010 was over 4-fold more likely to receive an adequate LND relative to a patient treated in 1988 to 1991 (OR = 4.63, 95% CI: 3.32-6.45). In addition to time of surgery, tumor stage had a positive association with receipt of adequate LND (OR = 1.49 for stage IV [T4, N1 or N0] vs. stage I [T1 or Tis], 95% CI: 1.22-1.82). Age, sex, marital status, and race were not significant predictors of adequate LND.
    CONCLUSION: Adequacy of pelvic LND remains an important measure of surgical quality in bladder cancer. Our data show that over the years 1988 to 2010, the likelihood of receiving an adequate LND increased substantially; however, a substantial minority of patients still do not receive an adequate LND. Further study into the factors leading to adequate LND is needed to increase the use of this important technique.
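    The odds ratios in this abstract come from a logistic regression of adequate LND on study period and the listed covariates. Below is a minimal sketch of that kind of model using statsmodels; the dataset and variable names are hypothetical and only a subset of the covariates is included.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500

    # Hypothetical patient-level data (illustrative only).
    df = pd.DataFrame({
        "adequate_lnd": rng.integers(0, 2, n),             # 1 = >=10 nodes removed
        "period": rng.choice(["1988-1991", "2008-2010"], n),
        "stage": rng.choice(["I", "II", "III", "IV"], n),
        "age": rng.integers(40, 90, n),
    })

    # Logistic regression; exponentiated coefficients are the odds ratios.
    model = smf.logit("adequate_lnd ~ C(period) + C(stage) + age", data=df).fit(disp=False)
    print(pd.concat([np.exp(model.params).rename("OR"),
                     np.exp(model.conf_int())], axis=1))
    ```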

    Hepatitis E and Allogeneic Hematopoietic Stem Cell Transplantation: A French Nationwide SFGM-TC Retrospective Study

    Usually self-limited, hepatitis E virus (HEV) infection may evolve to chronicity and cirrhosis in immunosuppressed patients. HEV infection has been described in solid-organ transplantation and hematology patients, but for allogeneic hematopoietic stem cell transplant (alloHSCT) recipients only small cohorts are available. This retrospective nationwide multicenter series aimed to describe HEV diagnostic practices in French alloHSCT centers and the course of infection in the context of alloHSCT. Twenty-nine of 37 centers participated. In case of liver function test (LFT) abnormalities, HEV testing was never performed in 24% of centers, performed occasionally in 55%, and performed systematically in 21%. Twenty-five cases of active HEV infection were diagnosed in seven centers, all because of LFT abnormalities and all by blood nucleic acid testing. HEV infection was diagnosed in three patients before alloHSCT; it did not influence transplantation planning and resolved spontaneously before or after alloHSCT. Twenty-two patients were diagnosed a median of 283 days after alloHSCT. Nine patients (41%) had spontaneous viral clearance, mostly after a decrease in immunosuppressive treatment. Thirteen patients (59%) received ribavirin, with sustained viral clearance in 11 of 12 evaluable patients. We observed three HEV recurrences but no HEV-related deaths or liver failure, nor evolution to cirrhosis.

    Outcomes of unrelated cord blood transplantation in patients with multiple myeloma: a survey on behalf of Eurocord, the Cord Blood Committee of Cellular Therapy and Immunobiology Working Party, and the Chronic Leukemia Working Party of the EBMT

    Although allogeneic stem cell transplantation is not a standard therapy for multiple myeloma, some patients can benefit from this intense therapy. There are few reports on outcomes after umbilical cord blood transplantation in multiple myeloma, and investigation of this procedure is warranted. We retrospectively analyzed 95 patients, 85 with multiple myeloma and 10 with plasma cell leukemia, who received single or double umbilical cord blood transplantation from 2001 to 2013. Median follow-up was 41 months. The majority of patients received reduced-intensity conditioning. The cumulative incidence of neutrophil engraftment was 97%±3% at 60 days, and that of grade II-IV acute graft-versus-host disease at 100 days was 41%±5%. Chronic graft-versus-host disease at two years was 22%±4%. Relapse and non-relapse mortality were 47%±5% and 29%±5% at three years, respectively. Three-year progression-free survival and overall survival were 24%±5% and 40%±5%, respectively. Anti-thymocyte globulin was associated with a decreased incidence of acute graft-versus-host disease, higher non-relapse mortality, and decreased overall and progression-free survival. Patients with high cytogenetic risk had higher relapse rates and worse overall and progression-free survival. In conclusion, umbilical cord blood transplantation is feasible for multiple myeloma patients.
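    The relapse and non-relapse mortality figures in this abstract are cumulative incidences under competing risks. Below is a minimal sketch of estimating the cumulative incidence of relapse with non-relapse mortality as a competing event, using the Aalen-Johansen estimator from lifelines; the follow-up data are hypothetical.

    ```python
    import numpy as np
    from lifelines import AalenJohansenFitter

    rng = np.random.default_rng(0)
    n = 95

    # Hypothetical follow-up (months); event codes: 0 = censored, 1 = relapse,
    # 2 = non-relapse mortality.
    durations = rng.exponential(36, n)
    events = rng.choice([0, 1, 2], size=n, p=[0.3, 0.45, 0.25])

    # Cumulative incidence of relapse, treating non-relapse mortality as a competing risk.
    ajf = AalenJohansenFitter()
    ajf.fit(durations, events, event_of_interest=1)
    print(ajf.cumulative_density_.tail())
    ```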

    On Behalf of the SFGM-TC: Retrospective Comparison of Reduced and Higher Intensity Conditioning for High-Risk Myelodysplastic Syndrome Treated With Allogeneic Stem-Cell Transplantation

    Background: Allogeneic hematopoietic stem-cell transplantation (allo-HSCT) remains the best curative option for high-risk myelodysplastic syndrome. We retrospectively compared patient outcomes after allo-HSCT according to the intensity of the conditioning regimen.
    Patients and Methods: Three conditioning regimens were compared in 427 patients allografted for high-risk myelodysplastic syndrome: reduced-intensity conditioning (RIC) with fludarabine (150-160 mg/m2) and busulfan (6.4 mg/kg); sequential FLAMSA-RIC, i.e., fludarabine, amsacrine, and cytarabine followed by RIC; and reduced-toxicity myeloablative conditioning (RTC) with fludarabine and busulfan (9.6 mg/kg or 12.8 mg/kg).
    Results: Patients in the 3 conditioning groups differed with regard to the number of treatment lines (P < .001), percentage of blasts in bone marrow (P < .001), and disease status at transplantation (P < .001). No significant differences in outcomes (overall survival, progression-free survival, nonrelapse mortality, relapse incidence, and graft-versus-host disease-free, relapse-free survival) were observed between the 3 groups. Using propensity score analysis to overcome baseline imbalances, we compared 70 patients receiving FLAMSA-RIC with 260 patients receiving RIC, and 83 patients receiving RTC with 252 patients receiving RIC. Both before and after covariate adjustment using the propensity score, the only factor influencing overall and progression-free survival was cytogenetic risk at transplantation.
    Conclusion: Overall survival appears to be similar with the 3 conditioning regimens. The only factor influencing survival is cytogenetic risk at transplantation, suggesting that new promising drugs in the conditioning regimen and/or early interventions after transplantation are needed to improve outcomes in these patients.
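    The covariate adjustment described in this abstract relies on a propensity score for conditioning intensity. Below is a minimal two-step sketch (estimate the propensity of receiving FLAMSA-RIC from baseline covariates, then adjust a Cox model for overall survival on that score); the dataset, covariates, and values are hypothetical, not the study's data.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 330

    # Hypothetical baseline and outcome data: FLAMSA-RIC (1) versus RIC (0).
    df = pd.DataFrame({
        "flamsa": rng.integers(0, 2, n),
        "n_treatment_lines": rng.integers(0, 4, n),
        "marrow_blasts": rng.uniform(0, 20, n),
        "high_risk_cytogenetics": rng.integers(0, 2, n),
        "os_months": rng.exponential(30, n),
        "death": rng.integers(0, 2, n),
    })

    # Step 1: propensity score = probability of receiving FLAMSA-RIC given baseline covariates.
    covs = ["n_treatment_lines", "marrow_blasts", "high_risk_cytogenetics"]
    df["propensity"] = LogisticRegression().fit(df[covs], df["flamsa"]).predict_proba(df[covs])[:, 1]

    # Step 2: Cox model for overall survival adjusted on the propensity score.
    cph = CoxPHFitter().fit(df[["os_months", "death", "flamsa", "propensity"]],
                            duration_col="os_months", event_col="death")
    print(cph.summary.loc["flamsa", ["exp(coef)", "p"]])
    ```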