
    Tacrolimus in pancreas transplantation: A multicenter analysis

    This follow-up multicenter analysis is based on 362 pancreas allograft recipients at 14 institutions who were given tacrolimus between 1 May 1994 and 15 November 1995. Three groups were studied: (1) recipients given tacrolimus initially for induction and maintenance therapy (n = 250; 215 without and 35 with a concurrent bone marrow transplant), (2) recipients who converted to tacrolimus for rescue or rejection therapy (n = 89), and (3) recipients who converted to tacrolimus for other reasons (n = 23). Of the 215 recipients without a bone marrow transplant in the induction group, 166 (77%) underwent a simultaneous pancreas-kidney transplant (SPK), 29 (14%) a pancreas transplant alone (PTA), and 20 (9%) a pancreas after previous kidney transplant (PAK). Initial antibody therapy was given to 185 (86%) recipients. All 215 received tacrolimus and prednisone; 202 (94%) also received azathioprine (AZA) and 11 (5%) mycophenolate mofetil (MMF). The most common side effects of tacrolimus were neurotoxicity in 21%, nephrotoxicity in 21%, gastrointestinal (GI) toxicity in 13%, and diabetogenicity in 13% of these recipients. No recipient in this group developed new-onset insulin-dependent diabetes mellitus. Of the 89 recipients in the rescue group, 71 (79%) had an SPK, 11 (13%) a PTA, and 7 (8%) a PAK. Before conversion, all had been on cyclosporine (CsA)-based immunosuppression; 74% of them had 2 or more previous rejection episodes. The most common side effects were nephrotoxicity in 27%, neurotoxicity in 26%, GI toxicity in 18%, and diabetogenicity in 8% of these recipients. No recipient in this group developed new-onset insulin-dependent diabetes mellitus. In the induction group, patient survival at 1 yr was 98% for SPK, 79% for PTA, and 100% for PAK recipients. According to a matched-pair analysis, pancreas graft survival for SPK recipients at 1 yr was 88% with tacrolimus vs. 73% with CsA (p = 0.002); for PTA recipients, 68% vs. 70% (p > 0.35); and for PAK recipients, 85% vs. 65% (p = 0.13). Graft loss from rejection did not differ between tacrolimus and CsA in any of the 3 pancreas recipient categories. At 1 yr, 17% of recipients had converted from tacrolimus to CsA for diabetogenicity, nephrotoxicity, or rejection; 23% had converted from AZA to MMF. The incidence of post-transplant lymphoma was < 2%. In the rescue group, patient survival rates at 1 yr were 96% for SPK, 100% for PTA, and 86% for PAK recipients (p < 0.08). Pancreas graft survival at 1 yr was 89% for SPK, 58% for PTA, and 69% for PAK recipients (p = 0.004). Graft loss from rejection was significantly lower for SPK than for PTA or PAK recipients. At 1 yr, 20% of recipients had reconverted from tacrolimus to CsA for rejection, neurotoxicity, or nephrotoxicity; 19% had converted from AZA to MMF. There were no post-transplant lymphomas in the rescue group. This follow-up multicenter analysis shows that tacrolimus after pancreas transplantation is associated with high graft survival rates when used for induction and with high graft salvage rates when used for rescue therapy. The rate of graft loss from rejection is low in all 3 pancreas recipient categories. The overall incidence of new-onset insulin-dependent diabetes mellitus is < 1%, as is the incidence of post-transplant lymphoma. Converting from tacrolimus to CsA and, in patients on tacrolimus, from AZA to MMF, is safe; interchangeable use of these drugs appears to be of immunologic benefit.
To determine the best immunosuppressive regimen after pancreas transplantation, a prospective randomized study comparing tacrolimus plus MMF vs. Neoral plus MMF is mandatory.

    Spherical and columnar, septarian, ¹⁸O-depleted calcite concretions from Middle-Upper Permian lacustrine siltstones in northern Mozambique: evidence for very early diagenesis and multiple fluids

    Calcite septarian concretions from the Permian Beaufort Group in the Maniamba Graben (NW Mozambique) allow controls on the composition and nature of diagenetic fluids to be investigated. The concretions formed in lacustrine siltstones, where they occur in spherical (1 to 70 cm in diameter) and columnar (up to 50 cm long) forms within three closely spaced, discrete beds totalling 2.5 m in thickness. Cementation began at an early stage of diagenesis and entrapped non-compacted burrows and calcified plant roots. The cylindrical concretions overgrew calcified vertical plant roots, which experienced shrinkage cracking after entrapment. Two generations of concretionary body cement and two generations of septarian crack infill are distinguished. The early generation in both cases is a low-Mn, Mg-rich calcite, whereas the later generation is a low-Mg, Mn-rich calcite. The change in chemistry is broadly consistent with a time (burial)-related transition from oxic to sub-oxic/anoxic conditions close to the sediment–water interface. Geochemical features of all types of cement were controlled by the sulphate-poor environment and by the absence of bacterial sulphate reduction. All types of cement present have δ¹³C ranging between 0‰ and −15‰ (Vienna Peedee Belemnite, V-PDB), and highly variable, highly depleted δ¹⁸O (down to 14‰ Vienna Standard Mean Ocean Water, V-SMOW). The late generation of cement is the most depleted in both ¹³C and ¹⁸O. The geochemical and isotopic patterns are best explained by interaction between surface oxic waters, pore waters and underground, ¹⁸O-depleted, reducing ice-meltwaters that accumulated in the underlying coal-bearing sediments during the Permian deglaciation. The invariant δ¹³C distribution across core-to-rim transects for each individual concretion is consistent with rapid lithification and involvement of a limited range of carbon sources derived via oxidation of buried plant material and from dissolved clastic carbonates. Syneresis of the cement during an advanced stage of lithification at early diagenesis is considered to be the cause of development of the septarian cracks. After cracking, the concretions retained a small volume of porosity, allowing infiltration of anoxic, Ba-bearing fluids and resulting in the formation of barite. The results obtained contribute to a better understanding of diagenetic processes at the shallow burial depths occurring in rift-bound, lacustrine depositional systems.
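    For reference, the δ¹³C and δ¹⁸O values quoted above use standard stable-isotope delta notation, reported in per mil (‰) relative to a reference standard (V-PDB for carbon, V-SMOW for oxygen in this abstract). This is the general definition of the notation, not a result specific to this study:

    \[ \delta = \left( \frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1 \right) \times 1000\,\text{‰}, \qquad R = {}^{13}\mathrm{C}/{}^{12}\mathrm{C} \ \text{or}\ {}^{18}\mathrm{O}/{}^{16}\mathrm{O} \]

    Lower δ values therefore indicate depletion in the heavy isotope relative to the standard, which is consistent with the abstract's reading of the strongly ¹⁸O-depleted late cements as recording an influx of isotopically light meltwater.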