
    Alloreactive T lymphocytes cultured from liver transplant biopsies: associations of HLA specificity with clinicopathological findings

    Lymphocyte cultures grown from liver allograft biopsies were shown to exhibit alloreactivity towards donor cells as measured by primed lymphocyte testing (PLT). The PLT specificity was determined in assays using HLA-typed panel cells and/or by inhibition testing with HLA-specific monoclonal antibodies. Certain cultures exhibited PLT specificity towards class I HLA antigens of the donor, whereas others were specific for class II HLA antigens or recognized mixtures of class I and class II antigens. These PLT specificity patterns were compared with clinical, histological, and laboratory findings on the liver transplant patients at the time of biopsy. Biopsies yielding class I-specific PLT cells were generally taken during the earlier posttransplant period, whereas class II-specific cells were grown from later biopsies. There was no significant correlation of PLT specificity towards class I versus class II antigens with the levels of total or direct bilirubin, serum glutamic-oxaloacetic transaminase (SGOT), or serum glutamic-pyruvic transaminase (SGPT), although a trend towards higher values was noted for biopsies presenting with a class II-specific infiltrate. However, the levels of gamma-glutamyl transpeptidase (GGTP) and alkaline phosphatase (AP) were significantly increased when biopsies yielded class II-specific rather than class I-specific PLT cells. Biopsy histology showed more damage to bile duct epithelium in association with class II PLT specificity, whereas intense but often reversible infiltrates were found in biopsies yielding class I-specific cells. The elevated GGTP and AP levels are probably related to the interaction of class II-specific T cells with bile duct epithelium, which has been shown to express induced class II HLA antigens on its cell surface.

    Weaning of immunosuppression in liver transplant recipients

    Immunosuppression has occasionally been discontinued unilaterally by noncompliant liver allograft recipients. These anecdotal observations prompted a previously reported prospective drug withdrawal program in 59 liver recipients. This prospective series has been increased to 95 patients whose weaning was begun between June 1992 and March 1996, 8.4±4.4 (SD) years after liver replacement. A further 4 1/2 years of follow-up was obtained for the 5 self-weaned patients. The prospectively weaned recipients (93 liver; 2 liver/kidney) had undergone transplantation under immunosuppression based on azathioprine (AZA, through 1979), cyclosporine (CsA, 1980-1989), or tacrolimus (TAC, 1989-1994). In patients on CsA- or TAC-based cocktails, the adjunct drugs were weaned first during the early part of the trial; since 1994, the T cell-directed drugs have been weaned first. Three of the 5 original self-weaned recipients remain well after drug-free intervals of 14 to 17 years. A fourth patient died in a vehicular accident after 11 years off immunosuppression, and the fifth underwent retransplantation because of hepatitis C infection after 9 drug-free years; neither allograft showed histopathologic evidence of rejection. Eighteen (19%) of the 95 patients in the prospective series have been drug free for 10 months to 4.8 years. In the total group, 18 (19%) have had biopsy-proven acute rejection; 7 (7%) had a presumed acute rejection without biopsy; 37 (39%) are still weaning; and 12 (13%, all well) were withdrawn from the protocol at reduced immunosuppression because of noncompliance (n=8), recurrent PBC (n=2), pregnancy (n=1), or renal failure necessitating kidney transplantation (n=1). No patient was formally diagnosed with chronic rejection, but 3 (3%) were placed back on preexisting immunosuppression or switched from CsA to TAC because of histopathologic evidence of duct injury. Two patients with normal liver function died during the trial, both from complications of prior chronic immunosuppression. No graft suffered permanent functional impairment, and only one patient developed temporary jaundice. Long-surviving liver transplant recipients are systematically overimmunosuppressed. Consequently, drug weaning, whether incomplete or complete, is an important management strategy, provided it is done slowly under careful physician surveillance. Complete weaning from CsA-based regimens has been difficult. Disease recurrence during drug withdrawal was documented in 2 of 13 patients with PBC and could be a risk with other autoimmune disorders.

    Tolerance-Inducing Strategies in Islet Transplantation

    Allogeneic islet transplantation is a promising approach for restoring normoglycemia in type 1 diabetic patients. The immunosuppressive therapies currently used to manage islet transplant recipients can themselves be detrimental to islet function and can lead to long-term complications. The induction of donor-specific tolerance eliminates the dependency on immunosuppression and allows recipients to retain immune responses to other foreign antigens. The mechanisms by which tolerance is achieved involve the deletion of donor-reactive T cells, induction of T-cell anergy, immune deviation, and generation of regulatory T cells. This review outlines the various methods used for inducing donor-specific tolerance in islet transplantation and highlights the previously unforeseen potential of tissue stromal cells in promoting islet engraftment.

    Combined liver-kidney transplantation and the effect of preformed lymphocytotoxic antibodies

    Thirty-eight sequentially placed liver and kidney allografts were evaluated with respect to patient and graft survival, and the influence of preformed lymphocytotoxic antibodies was analysed. The results suggest that the survival rate of combined liver and kidney transplantation is similar to that of liver transplantation alone. Sequentially placed kidney allografts may be protected from hyperacute rejection in the presence of donor-specific lymphocytotoxic antibodies, but not in all instances. Both patient and kidney allograft survival were lower in positive-crossmatch patients (33% and 17%, respectively) than in negative-crossmatch patients (78% and 75%). High levels of panel-reactive antibodies (>10%) also appeared to have a deleterious effect on survival, although the majority of the patients whose grafts failed also had a positive crossmatch. Although preformed lymphocytotoxic antibodies are not an absolute contraindication to combined liver-kidney transplantation, they do appear to have a deleterious effect on long-term graft survival. However, further correlation with clinical parameters is needed.

    Disease gravity and urgency of need as guidelines for liver allocation

    One thousand one hundred and twenty-eight candidates for liver transplantation were stratified into five urgency-of-need categories by the United Network for Organ Sharing (UNOS) criteria. Most patients of low-risk UNOS 1 status remained alive after 1 yr without transplantation; mortality while waiting was 3%, at a median of 229.5 days. In contrast, only 3% of those entered at the highest-risk UNOS 5 category survived without transplantation; 28% died while waiting, at a median of 5.5 days. The UNOS categories in between showed the expected gradations: at each higher level, fewer patients remained as candidates throughout the 1-yr study period, while progressively more died, at progressively earlier times, while waiting for an organ. In a separate study of posttransplantation survival during the same period, the best postoperative results were in the lowest-risk UNOS 1 and 2 patients (88% combined), and the worst were in UNOS 5 (71%). However, a relative-risk cross-analysis showed that transplantation may have yielded a net loss in 1-yr survival for the low-risk elective patients, whereas a gain in life extension was achieved in the potentially lethal UNOS categories 3, 4, and 5 (greatest for UNOS 3). These findings and conclusions are discussed in terms of the total care of patients with liver disease, and in the context of the organ allocation policies of the United States and Europe.
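    The relative-risk comparison reduces to simple arithmetic on 1-yr mortality with versus without transplantation within each urgency category. The sketch below (Python) illustrates the direction of the effect using the percentages quoted above; it is a simplification, since the published cross-analysis also accounts for time at risk, and the UNOS 5 waiting-list mortality is derived only loosely from the quoted figures.

        # Simplified relative risk of 1-yr mortality with vs. without
        # transplantation, using percentages quoted in the abstract.
        def relative_risk(mortality_tx: float, mortality_wait: float) -> float:
            """Ratio of post-transplant 1-yr mortality to waiting-list mortality."""
            return mortality_tx / mortality_wait

        # UNOS 1-2 (low risk): 12% post-transplant mortality (100% - 88%)
        # vs. 3% mortality while waiting.
        print(relative_risk(0.12, 0.03))  # 4.0 -> apparent 1-yr survival loss

        # UNOS 5 (highest risk): 29% post-transplant mortality (100% - 71%).
        # Of entrants never transplanted (3% survived + 28% died = 31%),
        # roughly 28/31 ~ 90% died: a rough waiting-list mortality estimate.
        print(relative_risk(0.29, 0.90))  # ~0.32 -> clear survival benefit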

    Pregnancy after liver transplantation under tacrolimus

    Background. The maternal and fetal risks of pregnancy after organ transplantation under tacrolimus have not been reported. These were prospectively studied in 27 pregnancies in 21 female liver recipients who were treated with tacrolimus before and throughout gestation. Methods. Twenty-seven babies were born between October 1990 and April 1996. In 15 cases, samples were obtained at or after delivery and stored (-40°C) for comparison of the tacrolimus concentration in maternal blood with that in various combinations of cord and infant venous blood, breast milk, or a section of the placenta. Results. The 21 mothers had surprisingly few serious complications of pregnancy and no mortality. Two infants, born at 23 and 24 weeks of gestation, died shortly after birth. The mean birth weight of the other 25 was 2638±781 g after a gestational period of 36.0±3.3 weeks; the mean birth weight percentile for gestational age was 50.2±26.2 (median 40). On the day of delivery, the mean tacrolimus concentrations (ng/ml) were 4.3 in the placenta versus 1.5, 0.7, and 0.5 in maternal, cord, and infant plasma, and 0.6 in the first breast milk specimens. The infants had a 36% incidence of transient perinatal hyperkalemia (K+ >7.0 meq/L) and a mild reversible renal impairment, both thought to reflect in part maternal homeostasis. One newborn had unilateral polycystic renal disease (the only anomaly). All 25 babies have had satisfactory postnatal growth and development, with a current mean weight percentile of 62±37 (median 80). Conclusions. Pregnancy in liver transplant recipients under tacrolimus was possible with a surprisingly low incidence of the hypertension, preeclampsia, and other maternal complications historically associated with such gestations. As in previous experience with other immunosuppressive regimens, preterm deliveries were common. However, prenatal growth for gestational age and postnatal infant growth for postpartum age were normal.

    Altered Patterns of Fungal Keratitis at a London Ophthalmic Referral Hospital: An Eight-Year Retrospective Observational Study

    PURPOSE: In previous studies of fungal keratitis (FK) from temperate countries, yeasts were the predominant isolates, with ocular surface disease (OSD) the leading risk factor. Since the 2005-2006 outbreak of contact lens (CL)-associated Fusarium keratitis, there may have been a rise in CL-associated filamentary FK in the United Kingdom. This retrospective case series investigated the patterns of FK from 2007 to 2014 and compared them with 1994-2006 data from the same hospital. DESIGN: Retrospective observational study. METHODS: All cases of FK presenting to Moorfields Eye Hospital between 2007 and 2014 were identified. FK was defined as either a fungal organism isolated by culture or fungal structures identified by light microscopy (LM) of scrape material, histopathology, or in vivo corneal confocal microscopy (IVCM). The main outcome measure was the number of FK cases per year. RESULTS: A total of 112 patients had confirmed FK; median age was 47.2 years. Between 2007 and 2014, there was an increase in the annual number of FK cases (Poisson regression, P = .0001). FK was confirmed by several, partly overlapping modalities: 79 cases (70.5%) by positive culture, 16 (14.3%) by LM, and 61 (54.5%) by IVCM. Seventy-eight patients (69.6%) were diagnosed with filamentary fungus alone, 28 (25%) with yeast alone, and 6 (5.4%) with mixed filamentary and yeast infections, representing an increase in the proportion of filamentary fungal infections relative to the pre-2007 data. Filamentary fungal and yeast infections were associated with CL use and OSD, respectively. CONCLUSIONS: The number of FK cases has increased, and the increase is attributable to CL-associated filamentary FK. Clinicians should be aware of these changes, which warrant epidemiologic investigations to identify modifiable risk factors.
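    The trend test reported above is a Poisson regression of annual case counts on calendar year. A minimal sketch of such a model follows (Python with statsmodels, one common choice); the annual counts are invented placeholders chosen only to total the study's 112 cases, not the actual yearly breakdown.

        # Poisson regression testing for a trend in annual FK case counts.
        # Counts are placeholders: they sum to 112, but the yearly
        # breakdown is invented for illustration.
        import numpy as np
        import statsmodels.api as sm

        years = np.arange(2007, 2015)
        cases = np.array([6, 8, 9, 12, 14, 17, 21, 25])  # hypothetical

        X = sm.add_constant(years - years[0])  # intercept + year index
        fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()

        print(fit.pvalues[1])         # p-value for the year trend
        print(np.exp(fit.params[1]))  # rate ratio per year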

    Differential Muon Tomography to Continuously Monitor Changes in the Composition of Subsurface Fluids

    Muon tomography has been used to seek hidden chambers in Egyptian pyramids and to image subsurface features in volcanoes. It seemed likely that it could also be used to image injected supercritical carbon dioxide as it is emplaced in porous geological structures used for carbon sequestration, and to check for subsequent leakage. It should work equally well in any other application involving two fluids of different densities, such as water and oil, or carbon dioxide and heavy oil in oil reservoirs. Continuous monitoring of the movement of oil and/or flood fluid during enhanced oil recovery is important for managing injection economically, and checking for leakage from geological carbon storage is essential for both safety and economic reasons. Current technology (for example, repeat 3D seismic surveys) is expensive and episodic. Muons are generated when high-energy cosmic rays, produced by supernova explosions, interact with gas molecules in the atmosphere. This innovation has produced a theoretical model of muon attenuation through the thickness of rock above and within a typical sandstone reservoir at a depth of between 1.00 and 1.25 km. Because this first simulation was focused on carbon sequestration, the innovators chose depths at which the pressure ensures that the carbon dioxide is supercritical. The work demonstrates for the first time the feasibility of using the natural cosmic-ray muon flux to generate continuous tomographic images of carbon dioxide in a storage site. The muon flux is attenuated to an extent that depends on, among other things, the density of the materials through which it passes. The density of supercritical carbon dioxide is only three quarters that of the brine it displaces in the reservoir. The first realistic simulations indicate that changes as small as 0.4% in the bulk density of the storage site could be detected (equivalent to 7% of the porosity in this specific case). The incoming muon flux is effectively constant at the surface of the Earth. The sensitivity of the method decreases with increasing depth, but it can be improved by emplacing a larger array of particle detectors at the base of the reservoir.
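    The detectability argument comes down to simple arithmetic on the bulk density. A minimal worked sketch follows (Python): the 3:4 CO2-to-brine density ratio and the 0.4% threshold come from the text above, while the grain density, porosity, and brine density are assumed illustrative values, which is why the result differs from the 7%-of-porosity figure quoted for the specific reservoir studied.

        # Bulk-density change when supercritical CO2 displaces brine.
        # Grain density, porosity, and brine density are assumptions;
        # the 3/4 density ratio and 0.4% threshold are from the text.
        rho_grain = 2.65              # g/cm^3, quartz grains (assumed)
        porosity  = 0.20              # pore fraction (assumed)
        rho_brine = 1.05              # g/cm^3 (assumed)
        rho_co2   = 0.75 * rho_brine  # CO2 at ~3/4 of brine density

        rho_bulk = (1 - porosity) * rho_grain + porosity * rho_brine

        # Pore fraction that must switch from brine to CO2 to change the
        # bulk density by the 0.4% detection threshold:
        frac = 0.004 * rho_bulk / (porosity * (rho_brine - rho_co2))
        print(f"bulk density: {rho_bulk:.2f} g/cm^3")     # ~2.33
        print(f"pore fraction at threshold: {frac:.0%}")  # ~18% here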