110 research outputs found

    Immune status of recipients following bone marrow-augmented solid organ transplantation

    It has been postulated that the resident “passenger” leukocytes of hematolymphoid origin that migrate from whole organ grafts and subsequently establish systemic chimerism are essential for graft acceptance and the induction of donor-specific nonreactivity. This phenomenon was augmented by infusing 3 × 10⁸ unmodified donor bone-marrow cells into 40 patients at the time of organ transplantation. Fifteen of the first 18 analyzable patients had sequential immunological evaluation over postoperative intervals of 5 to 17 months (7 kidney recipients, two with islets; 7 liver recipients, one with islets; and one heart recipient). The evolution of changes was compared with that in 16 kidney and liver nonmarrow controls followed for 4 to 5 months. The generic immune reactivity of peripheral blood mononuclear cells (PBMC) was determined by their proliferative responses to mitogens (PHA, ConA). Alloreactivity was measured by the recipient mixed lymphocyte reaction (MLR) to donor and HLA-mismatched third-party panel cells. Based on all 3 tests, the recipients were classified as donor-specific hyporeactive, intermediate, and responsive; patients who were globally suppressed made up a fourth category. Eight (53%) of the 15 marrow-treated recipients exhibited progressive modulation of donor-specific reactivity (3 hyporeactive and 5 intermediate), while 7 remained antidonor-responsive. In the nonmarrow controls, 2 (12.5%) of the 16 patients showed donor-specific hyporeactivity, 10 (62.5%) were reactive, and 4 (25%) studied during a CMV infection had global suppression of responsiveness to all stimuli. © 1995 by Williams and Wilkins
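    For illustration only, the sketch below shows how the four response categories described in this abstract could be assigned from mitogen and MLR data. The field names and all cutoff values are hypothetical; the study's actual classification thresholds are not reported here.

```python
# Hypothetical illustration of the four-way classification described above.
# Thresholds and field names are invented for this sketch, not from the study.
from dataclasses import dataclass

@dataclass
class ImmuneProfile:
    mitogen_si: float           # proliferative response to PHA/ConA (stimulation index)
    donor_mlr_si: float         # MLR stimulation index against donor cells
    third_party_mlr_si: float   # MLR stimulation index against HLA-mismatched panel cells

def classify(p: ImmuneProfile,
             suppressed_cutoff: float = 2.0,    # hypothetical
             hypo_ratio: float = 0.25,          # hypothetical
             intermediate_ratio: float = 0.6):  # hypothetical
    """Return one of: 'globally suppressed', 'donor-specific hyporeactive',
    'intermediate', 'responsive'."""
    if p.mitogen_si < suppressed_cutoff and p.third_party_mlr_si < suppressed_cutoff:
        return "globally suppressed"            # low response to all stimuli
    ratio = p.donor_mlr_si / max(p.third_party_mlr_si, 1e-9)
    if ratio < hypo_ratio:
        return "donor-specific hyporeactive"
    if ratio < intermediate_ratio:
        return "intermediate"
    return "responsive"

print(classify(ImmuneProfile(mitogen_si=25, donor_mlr_si=3, third_party_mlr_si=20)))
```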

    Tacrolimus in pediatric renal transplantation

    Tacrolimus was used as the primary immunosuppressive agent in 69 pediatric renal transplantations between December 17, 1989, and June 30, 1995. Children undergoing concomitant or prior liver and/or intestinal transplantation were excluded from analysis. The mean recipient age was 10.3±5.0 years (range, 0.7-17.5 years). Seventeen (24.6%) children were undergoing retransplantation, and six (8.7%) had a panel reactive antibody level of 40% or higher. Thirty-nine (57%) cases were with cadaveric kidneys, and 30 (43%) were with living donors. The mean donor age was 28.0±14.7 years (range, 1.0-50.0 years), and the mean cold ischemia time for the cadaveric kidneys was 27.0±9.4 hr. The antigen match was 2.7±1.2, and the mismatch was 3.1±1.2. All patients received tacrolimus and steroids, without antibody induction, and 26% received azathioprine as well. The mean follow-up was 32±20 months. One- and 4-year actuarial patient survival rates were 100% and 95%. One- and 4-year actuarial graft survival rates were 99% and 85%. The mean serum creatinine level was 1.2±0.8 mg/dl, and the calculated creatinine clearance was 82±26 ml/min/1.73 m². The mean tacrolimus dose was 0.22±0.14 mg/kg/day, and the level was 9.5±4.8 ng/ml. The mean prednisone dose was 2.1±4.9 mg/day (0.07±0.17 mg/kg/day), and 73% of successfully transplanted children were off prednisone. Seventy-nine percent were not taking any antihypertensive medications. The mean serum cholesterol level was 158±54 mg/dl. The incidence of delayed graft function was 4.3%. The incidence of rejection was 49%, and the incidence of steroid-resistant rejection was 6%. The incidence of rejection decreased to 27% in the most recent 26 cases (January 1994 through June 1995). The incidence of new-onset diabetes was 10.1%; six of the seven affected children were able to be weaned off insulin. The incidence of cytomegalovirus disease was 13%, and that of posttransplant lymphoproliferative disorder was 10%; the incidence of posttransplant lymphoproliferative disorder in the last 40 transplants was 5% (two cases). All of the children who developed posttransplant lymphoproliferative disorder are alive and have functioning allografts. Based on these data, we believe that tacrolimus is a superior immunosuppressive agent in pediatric renal transplant patients, with excellent short- and medium-term patient and graft survival, an ability to withdraw steroids in the majority of patients, and, with more experience, a decreasing rate of rejection and viral complications.
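    The 1- and 4-year "actuarial" survival rates quoted above are typically derived from a Kaplan-Meier estimator. Below is a minimal, self-contained sketch of that calculation using invented follow-up data; it is not the study's analysis code.

```python
# Minimal Kaplan-Meier sketch for actuarial patient/graft survival of the kind
# reported above. The follow-up times and events below are invented.
def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = graft loss/death, 0 = censored.
    Returns a list of (event time, survival probability) steps."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    for i in order:
        if events[i]:
            surv *= (at_risk - 1) / at_risk   # step down at each event time
            curve.append((times[i], surv))
        at_risk -= 1                          # censored subjects leave the risk set
    return curve

def survival_at(curve, t):
    s = 1.0
    for time, surv in curve:
        if time <= t:
            s = surv
    return s

# Hypothetical cohort: months of follow-up, 1 = event (graft loss), 0 = censored
times  = [3, 8, 14, 20, 26, 30, 35, 41, 44, 48]
events = [0, 0, 1, 0, 0, 0, 1, 0, 0, 0]
curve = kaplan_meier(times, events)
print("1-year survival:", round(survival_at(curve, 12), 2))
print("4-year survival:", round(survival_at(curve, 48), 2))
```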

    Posttransplant lymphoproliferative disorders in adult and pediatric renal transplant patients receiving tacrolimus-based immunosuppression

    Between March 27, 1989, and December 31, 1997, 1316 kidney transplantations alone were performed under tacrolimus-based immunosuppression at our center. Posttransplant lymphoproliferative disorders (PTLD) developed in 25 (1.9%) cases; the incidence in adults was 1.2% (15/1217), whereas in pediatric patients it was 10.1% (10/99; P<.0001). PTLD was diagnosed 21.0±22.5 months after transplantation, 25.0±24.7 months in adults and 14.4±18.2 months in pediatric patients. Of the 4 adult cases in whom both the donor and recipient Epstein-Barr virus (EBV) serologies were known, 2 (50%) were seropositive donor → seronegative recipient. Of 7 pediatric cases in whom both the donor and recipient EBV serologies were known, 6 (86%) were EBV-seropositive donor → seronegative recipient. Acute rejection was observed before the diagnosis of PTLD in 8 (53%) of 15 adults and 3 (30%) of 10 pediatric patients. Initial treatment of PTLD included a marked decrease or cessation of immunosuppression with concomitant ganciclovir therapy; two adults and two pediatric patients required chemotherapy. With a mean follow-up of 24.9±30.1 months after transplantation, the 1- and 5-year actuarial patient and graft survival rates in adults were 93% and 86%, and 80% and 60%, respectively. Two adults died, 3.7 and 46.2 months after transplantation, of complications related to PTLD, and 10 (including the 2 deaths) lost their allograft 3.7-84.7 months after transplantation. In children, the 1- and 5-year actuarial patient and graft survival rates were 100% and 100%, and 100% and 89%, respectively. No child died; one child lost his allograft 41.3 months after transplantation. One child had presumed recurrent PTLD that responded to discontinuation of tacrolimus and reinitiation of antiviral therapy. The mean serum creatinine level in adults was 2.5±1.2 mg/dl, and in children, it was 1.3±0.6 mg/dl. Under tacrolimus-based immunosuppression, PTLD is less common after renal transplantation in adults than in children, but PTLD in children is associated with more favorable outcomes than in adults.
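    The adult versus pediatric incidence comparison can be recomputed directly from the counts quoted above. The abstract reports P<.0001 but does not name the test; the sketch below assumes Fisher's exact test for illustration.

```python
# Recomputing the adult vs. pediatric PTLD incidence comparison from the counts
# quoted above (15/1217 adults, 10/99 children). The abstract reports P < .0001
# but does not name the statistical test; Fisher's exact test is an assumption here.
from scipy.stats import fisher_exact

adult_ptld, adult_total = 15, 1217
ped_ptld, ped_total = 10, 99

print(f"Adult incidence:     {adult_ptld / adult_total:.1%}")   # ~1.2%
print(f"Pediatric incidence: {ped_ptld / ped_total:.1%}")       # ~10.1%

table = [[ped_ptld, ped_total - ped_ptld],
         [adult_ptld, adult_total - adult_ptld]]
odds_ratio, p_value = fisher_exact(table)
print(f"Odds ratio: {odds_ratio:.1f}, p = {p_value:.2e}")
```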

    Investigating the response of leaf area index to droughts in southern African vegetation using observations and model simulations

    In many regions of the world, frequent and continual dry spells are exacerbating drought conditions, which have severe impacts on vegetation biomes. Vegetation in southern Africa is among the most affected by drought. Here, we assessed the spatiotemporal characteristics of meteorological drought in southern Africa using the standardized precipitation evapotranspiration index (SPEI) over a 30-year period (1982–2011). The severity and the effects of droughts on vegetation productivity were examined at different drought timescales (1- to 24-month timescales). In this study, we characterized vegetation using the leaf area index (LAI) after evaluating its relationship with the normalized difference vegetation index (NDVI). Correlating the LAI with the SPEI, we found that the LAI responds strongly (r=0.6) to drought over the central and southeastern parts of the region, with weaker impacts (r<0.4) over parts of Madagascar, Angola, and the western parts of South Africa. Furthermore, the latitudinal distribution of LAI responses to drought indicates a similar temporal pattern but different magnitudes across timescales. The results of the study also showed that the seasonal response across different southern African biomes varies in magnitude and occurs mostly at shorter to intermediate timescales. The semi-desert biome correlates strongly (r=0.95) with drought as characterized by the SPEI at a 6-month timescale in the MAM (March–May; autumn) season, while the tropical forest biome shows the weakest response (r=0.35) at a 6-month timescale in the DJF (December–February; hot and rainy) season. In addition, we found that the spatial patterns of change in LAI and SPEI are mostly similar during extremely dry and wet years, with the highest anomaly observed in the dry year of 1991, and we found different temporal variability in global and regional responses across different biomes. We also examined how well an ensemble of state-of-the-art dynamic global vegetation models (DGVMs) simulates the LAI and its response to drought. The spatial and seasonal response of the LAI to drought is mostly overestimated in the DGVM multimodel ensemble compared to the response calculated for the observation-based data. The correlation coefficient values for the multimodel ensemble are as high as 0.76 (annual) over South Africa and 0.98 in the MAM season over the temperate grassland biome. Furthermore, the DGVM ensemble shows positive biases (3 months or longer) in the simulated spatial distribution of drought timescales and overestimates the timescales of the seasonal response. The results of this study highlight areas to target for further development of DGVMs and can be used to improve the models' capability in simulating the drought–vegetation relationship.
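    As a minimal sketch of the core analysis described here, the snippet below correlates an LAI anomaly series with SPEI at several accumulation timescales for a single grid cell. The arrays are synthetic stand-ins; in practice the SPEI would be precomputed from precipitation and potential evapotranspiration and the LAI deseasonalized beforehand.

```python
# Minimal sketch of the LAI-SPEI correlation analysis described above:
# correlate an LAI anomaly series with SPEI at several accumulation timescales
# for one grid cell. Data here are synthetic placeholders.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_months = 30 * 12                      # 1982-2011
lai_anom = rng.normal(size=n_months)    # stand-in for deseasonalized LAI anomalies

timescales = [1, 3, 6, 12, 24]          # months, as in the study
spei = {k: rng.normal(size=n_months) for k in timescales}  # stand-in SPEI series

for k in timescales:
    r, p = pearsonr(lai_anom, spei[k])
    print(f"SPEI-{k:2d} vs LAI anomalies: r = {r:+.2f} (p = {p:.2f})")
```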

    Observation-based sowing dates and cultivars significantly affect yield and irrigation for some crops in the Community Land Model (CLM5)

    Farmers around the world time the planting of their crops to optimize growing season conditions and choose varieties that grow slowly enough to take advantage of the entire growing season while minimizing the risk of late-season kill. As climate changes, these strategies will be an important component of agricultural adaptation. Thus, it is critical that the global models used to project crop productivity under future conditions are able to realistically simulate growing season timing. This is especially important for climate- and hydrosphere-coupled crop models, where the intra-annual timing of crop growth and management affects regional weather and water availability. We have improved the crop module of the Community Land Model (CLM) to allow the use of externally specified crop planting dates and maturity requirements. In this way, CLM can use alternative algorithms for future crop calendars that are potentially more accurate and/or flexible than the built-in methods. Using observation-derived planting and maturity inputs reduces bias in the mean simulated global yield of sugarcane and cotton but increases bias for corn, spring wheat, and especially rice. These inputs also reduce simulated global irrigation demand by 15 %, much of which is associated with particular regions of corn and rice cultivation. Finally, we discuss how our results suggest areas for improvement in CLM and, potentially, similar crop models.
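    To make the idea of an externally prescribed crop calendar concrete, here is a conceptual sketch (not CLM source code): plant on a specified day of year, accumulate growing degree days, and harvest once a prescribed maturity requirement is met. Function names, the base temperature, and the synthetic forcing are all assumptions.

```python
# Conceptual sketch (not CLM code) of a prescribed crop calendar:
# plant on an externally specified day of year, then accumulate growing degree
# days (GDD) and harvest once a prescribed maturity requirement is reached.
import numpy as np

def simulate_crop_calendar(daily_tmean_c, sowing_doy, gdd_to_maturity, t_base_c=8.0):
    """Return (sowing_doy, harvest_doy) given one year of daily mean temperature."""
    gdd = 0.0
    for doy in range(sowing_doy, len(daily_tmean_c) + 1):
        gdd += max(daily_tmean_c[doy - 1] - t_base_c, 0.0)   # daily GDD increment
        if gdd >= gdd_to_maturity:
            return sowing_doy, doy
    return sowing_doy, None        # crop did not reach maturity this year

# Synthetic mid-latitude temperature cycle as forcing
doys = np.arange(1, 366)
tmean = 12.0 + 12.0 * np.sin(2 * np.pi * (doys - 100) / 365)

print(simulate_crop_calendar(tmean, sowing_doy=120, gdd_to_maturity=1500))
```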

    Slowdown of the greening trend in natural vegetation with further rise in atmospheric CO₂

    Satellite data reveal widespread changes in Earth's vegetation cover. Regions intensively attended to by humans are mostly greening due to land management. Natural vegetation, on the other hand, is exhibiting patterns of both greening and browning in all continents. Factors linked to anthropogenic carbon emissions, such as CO₂ fertilization, climate change, and consequent disturbances such as fires and droughts, are hypothesized to be key drivers of changes in natural vegetation. A rigorous regional attribution at the biome level that can be scaled to a global picture of what is behind the observed changes is currently lacking. Here we analyze different datasets of decades-long satellite observations of global leaf area index (LAI, 1981–2017) as well as other proxies for vegetation changes and identify several clusters of significant long-term changes. Using process-based model simulations (Earth system and land surface models), we disentangle the effects of anthropogenic carbon emissions on LAI in a probabilistic setting applying causal counterfactual theory. The analysis prominently indicates the effects of climate change on many biomes – warming in northern ecosystems (greening) and rainfall anomalies in tropical biomes (browning). The probabilistic attribution method clearly identifies the CO₂ fertilization effect as the dominant driver in only two biomes, the temperate forests and cool grasslands, challenging the view of a dominant global-scale effect. Altogether, our analysis reveals a slowing down of greening and strengthening of browning trends, particularly in the last 2 decades. Most models substantially underestimate the emerging vegetation browning, especially in the tropical rainforests. Leaf area loss in these productive ecosystems could be an early indicator of a slowdown in the terrestrial carbon sink. Models need to account for this effect to realize plausible climate projections of the 21st century.
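    The "probabilistic setting applying causal counterfactual theory" mentioned above is commonly formulated through probabilities of necessary and sufficient causation computed from factual and counterfactual ensembles (after Pearl and Hannart et al.). The sketch below shows that general framework only; the exact formulation, event definitions, and thresholds used in the study may differ, and the ensemble counts are made up.

```python
# Sketch of probabilistic causal-counterfactual attribution: probabilities of
# necessary (PN), sufficient (PS), and necessary-and-sufficient (PNS) causation
# from factual vs. counterfactual occurrence probabilities. Illustrative only.
def causation_probabilities(p1, p0):
    """p1: probability of the observed change (e.g. browning beyond a threshold)
    in factual simulations with the forcing of interest;
    p0: the same probability in counterfactual simulations without it."""
    pn = max(0.0, 1.0 - p0 / p1) if p1 > 0 else 0.0                  # necessary
    ps = max(0.0, 1.0 - (1.0 - p1) / (1.0 - p0)) if p0 < 1 else 0.0  # sufficient
    pns = max(0.0, p1 - p0)                                          # both
    return pn, ps, pns

# Hypothetical biome: event occurs in 18/20 factual and 5/20 counterfactual runs
pn, ps, pns = causation_probabilities(p1=18 / 20, p0=5 / 20)
print(f"PN = {pn:.2f}, PS = {ps:.2f}, PNS = {pns:.2f}")
```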

    Climate-Driven Variability and Trends in Plant Productivity Over Recent Decades Based on Three Global Products

    Variability in climate exerts a strong influence on vegetation productivity (gross primary productivity; GPP), and therefore has a large impact on the land carbon sink. However, no direct observations of global GPP exist, and estimates rely on models that are constrained by observations at various spatial and temporal scales. Here, we assess the consistency of GPP from global products that extend for more than three decades: two observation-based approaches, the upscaling of FLUXNET site observations (FLUXCOM) and a remote-sensing-derived light use efficiency model (RS-LUE), and a suite of terrestrial biosphere models (TRENDYv6). At local scales, we find high correlations in annual GPP among the products, with exceptions in tropical and high northern latitudes. On longer time scales, the products agree on the direction of trends over 58% of the land, with large increases across northern latitudes driven by warming trends. Further, tropical regions exhibit the largest interannual variability in GPP, with both rainforests and savannas contributing substantially. Variability in savanna GPP is likely predominantly driven by water availability, although temperature could play a role via soil moisture-atmosphere feedbacks. There is, however, no consensus on the magnitude and driver of variability of tropical forests, which suggests that uncertainties in process representations and underlying observations remain. These results emphasize the need for more direct long-term observations of GPP, along with an extension of in situ networks in underrepresented regions (e.g., tropical forests). Such capabilities would support efforts to better validate relevant processes in models and to estimate GPP more accurately.
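    A minimal sketch of the kind of consistency check described here: compute a linear trend and a simple interannual-variability measure for annual GPP from each product at one location and test whether the products agree on the trend direction. The annual series and units below are synthetic.

```python
# Minimal sketch of product consistency checks: linear trends, trend-sign
# agreement, and interannual variability (IAV) of annual GPP from several
# products at one grid cell. The annual series here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1982, 2017)
products = {
    "FLUXCOM": 1200 + 1.5 * (years - years[0]) + rng.normal(0, 30, years.size),
    "RS-LUE":  1180 + 1.0 * (years - years[0]) + rng.normal(0, 35, years.size),
    "TRENDY":  1250 + 2.0 * (years - years[0]) + rng.normal(0, 40, years.size),
}   # annual GPP, gC m-2 yr-1 (made-up numbers)

slopes = {}
for name, gpp in products.items():
    slope = np.polyfit(years, gpp, 1)[0]          # linear trend, gC m-2 yr-2
    iav = np.std(np.diff(gpp))                    # simple IAV measure
    slopes[name] = slope
    print(f"{name:8s} trend = {slope:+.2f} gC m-2 yr-2, IAV = {iav:.1f}")

signs = np.sign(list(slopes.values()))
print("Products agree on trend direction:", bool(np.all(signs == signs[0])))
```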