168 research outputs found

    Predicting Kidney Transplant Survival using Multiple Feature Representations for HLAs

    Kidney transplantation can significantly enhance living standards for people suffering from end-stage renal disease. A significant factor that affects graft survival time (the time until the transplant fails and the patient requires another transplant) is the compatibility of the Human Leukocyte Antigens (HLAs) between the donor and recipient. In this paper, we propose new biologically relevant feature representations for incorporating HLA information into machine learning-based survival analysis algorithms. We evaluate our proposed HLA feature representations on a database of over 100,000 transplants and find that they improve prediction accuracy by about 1%, modest at the patient level but potentially significant at a societal level. Accurate prediction of survival times can improve transplant outcomes, enabling better allocation of donors to recipients and reducing the number of re-transplants due to graft failure with poorly matched donors.
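
    As a minimal illustration of the kind of pipeline described above, and not the paper's actual feature representations, the sketch below encodes donor/recipient HLA typing at a single locus as a mismatch count and feeds it into a Cox proportional-hazards survival model using the lifelines library. The records, column names, and the helper locus_mismatches are illustrative assumptions.

```python
# A minimal sketch, not the paper's method: one simple HLA feature
# representation (per-locus allele-mismatch counts) fed into a Cox
# proportional-hazards survival model via the `lifelines` library.
# All column names and the toy records below are illustrative assumptions.
import pandas as pd
from lifelines import CoxPHFitter

def locus_mismatches(donor, recipient):
    """Count donor alleles at one locus that the recipient does not carry."""
    return sum(1 for allele in donor if allele not in recipient)

# Hypothetical donor/recipient typings at a single locus (HLA-A) plus outcomes.
pairs = [
    # (donor alleles, recipient alleles, survival years, graft failed?)
    (("A*01", "A*02"), ("A*01", "A*02"), 16.0, 0),
    (("A*03", "A*24"), ("A*01", "A*02"),  3.2, 1),
    (("A*02", "A*11"), ("A*02", "A*03"),  9.5, 1),
    (("A*01", "A*24"), ("A*01", "A*01"), 12.1, 0),
    (("A*23", "A*26"), ("A*02", "A*11"),  5.8, 1),
    (("A*02", "A*02"), ("A*02", "A*02"), 14.7, 0),
    (("A*31", "A*68"), ("A*01", "A*31"),  7.3, 1),
    (("A*01", "A*03"), ("A*03", "A*24"), 10.9, 0),
]

df = pd.DataFrame({
    "hla_a_mismatches": [locus_mismatches(d, r) for d, r, _, _ in pairs],
    "graft_survival_years": [t for _, _, t, _ in pairs],
    "graft_failure": [e for _, _, _, e in pairs],  # 1 = failed, 0 = censored
})

cph = CoxPHFitter()
cph.fit(df, duration_col="graft_survival_years", event_col="graft_failure")
cph.print_summary()  # hazard ratio per additional HLA-A mismatch
```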

    Low Hydrophobic Mismatch Scores Calculated for HLA-A/B/DR/DQ Loci Improve Kidney Allograft Survival

    We evaluated the impact of human leukocyte antigen (HLA) disparity (immunogenicity; IM) on long-term kidney allograft survival. The IM was quantified based on physicochemical properties of the polymorphic linear donor/recipient HLA amino acids (the Cambridge algorithm) as hydrophobic, electrostatic, and amino acid mismatch scores (HMS/AMS/EMS) or eplet mismatch (EpMM) load. High-resolution HLA-A/B/DRB1/DQB1 types were imputed to calculate HMS for primary/re-transplant recipients of deceased donor transplants. Multiple Cox regression was used to assess the association of HMS with graft survival, adjusting for other confounders. On the integer HMS 0-10 scale, the greatest survival benefit was seen between HMS 0 and 3. The Kaplan-Meier analysis showed that the HMS = 0 group had 18.1-year median graft survival, a 5-year benefit over the HMS > 0 group; the HMS ≤ 3.0 group had 16.7-year graft survival, 3.8 years better than the HMS > 3.0 group; and the HMS ≤ 7.8 group had 14.3-year graft survival, a 1.8-year improvement over the HMS > 7.8 group. Stratification based on EMS, AMS, or EpMM produced similar results. Additionally, the importance of HLA-DR with/without -DQ IM for graft survival was shown. In our simulation of 1,000 random donor/recipient pairs, 75% with HMS > 3.0 were re-matched into HMS ≤ 3.0 and the remaining 25% into HMS ≥ 7.8: after re-matching, graft survival would increase from 13.5 to 16.3 years. This approach matches recipients with low/medium-IM donors, thus preventing transplants from high-IM donors.
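
    As a toy illustration of the stratified survival comparison quoted above (not the Cambridge algorithm itself, which derives HMS from the physicochemical properties of mismatched HLA amino acids), the sketch below splits recipients at the HMS ≤ 3.0 cutoff and compares Kaplan-Meier median graft survival between the two strata with the lifelines library; the scores, survival times, and column names are assumed for illustration.

```python
# Minimal sketch, assuming precomputed hydrophobic mismatch scores (HMS).
# This is not the Cambridge algorithm; the toy data and column names are assumptions.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.DataFrame({
    "hms": [0.0, 1.2, 2.8, 3.5, 5.0, 7.9, 0.5, 4.4, 2.1, 9.3],
    "graft_survival_years": [18.0, 17.1, 15.9, 14.0, 12.5, 9.8,
                             16.4, 13.2, 15.0, 8.1],
    "graft_failure": [0, 1, 1, 1, 1, 1, 0, 1, 1, 1],  # 1 = failed, 0 = censored
})

kmf = KaplanMeierFitter()
for low_hms, group in df.groupby(df["hms"] <= 3.0):
    kmf.fit(group["graft_survival_years"], group["graft_failure"])
    print(f"HMS <= 3.0: {low_hms}, median graft survival: "
          f"{kmf.median_survival_time_:.1f} years")
```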

    Pre-transplant CD45RC expression on blood T cells differentiates patients with cancer and rejection after kidney transplantation

    Background Biological biomarkers to stratify cancer risk before kidney transplantation are lacking. Several lines of evidence indicate that tumor development and growth are associated with a tolerant immune profile. T cells expressing low levels of CD45RC preferentially secrete regulatory cytokines and include the regulatory T cell subset. In contrast, T cells expressing high levels of CD45RC have been shown to secrete proinflammatory cytokines, to drive alloreactivity, and to predict acute rejection (AR) in kidney transplant patients. In the present work, we evaluated whether the pre-transplant CD45RClow T cell subset was predictive of post-transplant cancer occurrence. Methods We performed an observational cohort study of 89 consecutive first-time kidney transplant patients whose CD45RC T cell expression was determined by flow cytometry before transplantation. Post-transplant events including cancer, AR, and death were assessed retrospectively. Results After a mean follow-up of 11.1±4.1 years, cancer occurred in 25 patients (28.1%) and was associated with a decreased pre-transplant proportion of CD4+CD45RChigh T cells, with a frequency below 51.9% conferring a 3.7-fold increased risk of post-transplant malignancy (HR 3.71 [1.24–11.1], p = 0.019). The sensitivity, specificity, negative predictive value, and positive predictive value of CD4+CD45RChigh<51.9% were 84.0%, 54.7%, 89.8%, and 42.0%, respectively. Confirming our previous results, a frequency of CD8+CD45RChigh T cells above 52.1% was associated with AR, conferring a 20-fold increased risk (HR 21.7 [2.67–176.2], p = 0.0004). The sensitivity, specificity, negative predictive value, and positive predictive value of CD8+CD45RChigh>52.1% were 94.5%, 68.0%, 34.7%, and 98.6%, respectively. The frequency of CD4+CD45RChigh T cells was positively correlated with that of CD8+CD45RChigh T cells (p<0.0001), suggesting that recipients with high AR risk display a low cancer risk. Conclusion A high frequency of CD45RChigh T cells was associated with AR, while a low frequency was associated with cancer. Thus, CD45RC expression on T cells appears to be a promising double-edged biomarker for assessing both cancer and AR risk before kidney transplantation.
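
    As a quick arithmetic illustration of how the predictive values quoted above follow from a 2x2 contingency table, the sketch below computes sensitivity, specificity, PPV, and NPV from hypothetical counts chosen to be roughly consistent with the 89-patient cohort and 25 cancers described in the abstract; the counts themselves are assumptions, not data from the paper.

```python
# Minimal sketch: sensitivity, specificity, PPV, and NPV from a 2x2 table.
# The counts below are assumed for illustration (roughly consistent with a
# cohort of 89 recipients, 25 of whom developed cancer); they are not the
# study's data. "Test positive" means biomarker below the 51.9% cutoff.
def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV, NPV from 2x2 counts."""
    sensitivity = tp / (tp + fn)   # detected cases / all cases
    specificity = tn / (tn + fp)   # correct negatives / all non-cases
    ppv = tp / (tp + fp)           # positive tests that are true cases
    npv = tn / (tn + fn)           # negative tests that are true non-cases
    return sensitivity, specificity, ppv, npv

tp, fp, fn, tn = 21, 29, 4, 35     # 21 + 29 + 4 + 35 = 89 recipients
sens, spec, ppv, npv = diagnostic_metrics(tp, fp, fn, tn)
print(f"sensitivity={sens:.1%}  specificity={spec:.1%}  "
      f"PPV={ppv:.1%}  NPV={npv:.1%}")
```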

    Role of biomarkers in early infectious complications after lung transplantation

    Background Infections and primary graft dysfunction (PGD) are devastating complications in the immediate postoperative period following lung transplantation, and reliable diagnostic tools are currently lacking. Biomarkers could improve early infection diagnosis. Methods This was a multicentre prospective observational study including all centres authorized to perform lung transplantation in Spain. The presentation of lung infection and/or PGD during the study period (first postoperative week) was determined. Biomarkers were measured on ICU admission and daily until ICU discharge or for the following 6 consecutive postoperative days. Results We included 233 patients. Median procalcitonin (PCT) levels were significantly lower in patients with no infection than in patients with infection on all follow-up days. PCT levels were similar for PGD grades 1 and 2 and increased significantly in grade 3. C-reactive protein (CRP) levels were similar in all groups, and no significant differences were observed at any study time point. In the absence of PGD grade 3, PCT levels above the median (0.50 ng/ml on admission or 1.17 ng/ml on day 1) were significantly associated with more than two- and three-fold increases in the risk of infection (adjusted odds ratio 2.37, 95% confidence interval 1.06 to 5.30, and 3.44, 95% confidence interval 1.52 to 7.78, respectively). Conclusions In the absence of severe primary graft dysfunction, procalcitonin can be useful in detecting infections during the first postoperative week. PGD grade 3 significantly increases PCT levels and interferes with the capacity of PCT as a marker of infection. PCT was superior to CRP in the diagnosis of infection during the study period.
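
    As a sketch of how an adjusted odds ratio like the one quoted above can be obtained, and not the study's actual model or data, the code below fits a logistic regression on simulated patients with statsmodels and exponentiates the coefficient of a binary "PCT above the median" indicator; the variable names and simulated effect sizes are assumptions.

```python
# Minimal sketch, not the study's model: how an adjusted odds ratio for
# "PCT above the median" is read off a logistic regression. Data are
# simulated; variable names (pct_high, pgd_grade) are assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
pct_high = rng.integers(0, 2, n)          # 1 = PCT above median on admission
pgd_grade = rng.integers(1, 3, n)         # PGD grade 1-2 (grade 3 excluded)
# Simulate infection with higher odds when PCT is elevated.
logit = -1.5 + 0.9 * pct_high + 0.2 * (pgd_grade - 1)
infection = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([pct_high, pgd_grade]))
fit = sm.Logit(infection, X).fit(disp=False)
odds_ratios = np.exp(fit.params)          # const, pct_high, pgd_grade
ci = np.exp(fit.conf_int())               # 95% confidence intervals
print("adjusted OR for high PCT:", round(odds_ratios[1], 2),
      "95% CI:", np.round(ci[1], 2))
```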

    Rapamycin Pharmacokinetic and Pharmacodynamic Relationships in Osteosarcoma: A Comparative Oncology Study in Dogs

    Signaling through the mTOR pathway contributes to growth, progression, and chemoresistance of several cancers. Accordingly, inhibitors have been developed as potentially valuable therapeutics. Their optimal development requires consideration of dose, regimen, biomarkers, and a rationale for their use in combination with other agents. Using the infrastructure of the Comparative Oncology Trials Consortium, many of these complex questions were asked within a relevant population of dogs with osteosarcoma to inform the development of mTOR inhibitors for future use in pediatric osteosarcoma patients. This prospective dose-escalation study of a parenteral formulation of rapamycin sought to define a safe, pharmacokinetically relevant, and pharmacodynamically active dose of rapamycin in dogs with appendicular osteosarcoma. Dogs were entered into dose cohorts of 3 dogs each. Dogs underwent a pre-treatment tumor biopsy and collection of baseline peripheral blood mononuclear cells (PBMCs). Dogs received a single intramuscular dose of rapamycin and underwent 48-hour whole-blood pharmacokinetic sampling. Additionally, daily intramuscular doses of rapamycin were administered for 7 days, with blood rapamycin trough levels collected on Days 8, 9, and 15. On Day 8, post-treatment tumor and PBMC samples were collected. No maximally tolerated dose of rapamycin was attained through escalation to the maximal planned dose of 0.08 mg/kg (2.5 mg/30 kg dog). Pharmacokinetic analysis revealed a dose-dependent exposure. In all cohorts, modulation of the mTOR pathway (pS6RP/S6RP) was demonstrated in tumor and PBMCs. No change in pAKT/AKT was seen in tumor samples following rapamycin therapy. Rapamycin may be safely administered to dogs and can yield therapeutic exposures. Modulation of pS6RP/S6RP in tumor tissue and PBMCs was not dependent on dose. Results from this study confirm that the dog may be included in the translational development of rapamycin and potentially other mTOR inhibitors. Ongoing studies of rapamycin in dogs will define optimal schedules for its use in cancer and evaluate the role of rapamycin in the setting of minimal residual disease.
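
    As a small illustration of the kind of pharmacokinetic summary behind the dose-exposure finding above (not the study's data or analysis plan), the sketch below computes Cmax, Tmax, and a noncompartmental AUC over the 48-hour sampling window by the trapezoidal rule; the concentration-time values are invented for illustration.

```python
# Minimal sketch with invented numbers: noncompartmental summary of a
# 48-hour whole-blood rapamycin concentration-time profile for one dog.
import numpy as np

time_h = np.array([0, 0.5, 1, 2, 4, 8, 12, 24, 36, 48])    # hours post-dose
conc = np.array([0.0, 18, 32, 41, 36, 27, 21, 12, 7, 4])   # ng/mL (assumed)

# Linear trapezoidal AUC from 0 to 48 h.
auc_0_48 = np.sum((conc[1:] + conc[:-1]) / 2 * np.diff(time_h))
cmax = conc.max()
tmax = time_h[conc.argmax()]
print(f"Cmax = {cmax} ng/mL at Tmax = {tmax} h, "
      f"AUC(0-48) = {auc_0_48:.0f} ng*h/mL")
```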

    Phosphorothioate antisense oligonucleotides induce the formation of nuclear bodies

    Antisense oligonucleotides are powerful tools for the in vivo regulation of gene expression. We have characterized the intracellular distribution of fluorescently tagged phosphorothioate oligodeoxynucleotides (PS-ONs) at high resolution under conditions in which PS-ONs have the potential to display antisense activity. Under these conditions PS-ONs predominantly localized to the cell nucleus, where they accumulated in 20-30 bright spherical foci designated phosphorothioate bodies (PS bodies), which were set against a diffuse nucleoplasmic population excluding nucleoli. PS bodies are nuclear structures that formed in cells after PS-ON delivery by transfection agents or microinjection but were observed irrespective of antisense activity or sequence. Ultrastructurally, PS bodies corresponded to electron-dense structures of 150-300 nm diameter and resembled nuclear bodies that were found with lower frequency in cells lacking PS-ONs. The environment of a living cell was required for the de novo formation of PS bodies, which occurred within minutes after the introduction of PS-ONs. PS bodies were stable entities that underwent noticeable reorganization only during mitosis. Upon exit from mitosis, PS bodies were assembled de novo from diffuse PS-ON pools in the daughter nuclei. In situ fractionation demonstrated an association of PS-ONs with the nuclear matrix. Taken together, our data provide evidence for the formation of a nuclear body in cells after the introduction of phosphorothioate oligodeoxynucleotides.

    The allometry of the smallest: superlinear scaling of microbial metabolic rates in the Atlantic Ocean

    Prokaryotic planktonic organisms are small in size but highly relevant in marine biogeochemical cycles. Because of their narrow size range (0.2 to 1 μm in diameter), the effects of cell size on their metabolism have hardly been considered and are usually not examined in field studies. Here, we show the results of size-fractionated experiments on marine microbial respiration rate along a latitudinal transect in the Atlantic Ocean. The scaling exponents obtained from the power relationship between respiration rate and size were significantly higher than one. This superlinearity was ubiquitous across the latitudinal transect, but its value was not universal, revealing a strong albeit heterogeneous effect of cell size on microbial metabolism. Our results suggest that the latitudinal differences observed are the combined result of changes in cell size and composition between functional groups within prokaryotes. Communities where the largest size fraction was dominated by prokaryotic cyanobacteria, especially Prochlorococcus, had lower allometric exponents. We hypothesize that these larger, more complex prokaryotes fall close to the evolutionary transition between prokaryotes and protists, in a range where surface area starts to constrain metabolism and are hence expected to follow a scaling closer to linearity.
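
    As a worked toy example of the scaling relationship described above (invented numbers, not the cruise data), the sketch below fits the power law R = a · V^b between respiration rate and cell size by linear regression on log-log axes; an exponent b greater than one indicates superlinear scaling.

```python
# Minimal sketch with invented numbers: estimate the allometric scaling
# exponent b in R = a * V**b from size-fractionated respiration data by
# ordinary least squares on log-transformed values.
import numpy as np

cell_size = np.array([0.01, 0.03, 0.1, 0.3, 1.0])              # e.g. cell volume
respiration = np.array([0.8e-3, 3.1e-3, 13e-3, 52e-3, 210e-3])  # arbitrary units

b, log_a = np.polyfit(np.log(cell_size), np.log(respiration), 1)
print(f"scaling exponent b = {b:.2f}  (b > 1 means superlinear scaling)")
```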

    Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition)

    In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to delete or knock down more than one autophagy-related gene. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, so not all Atg proteins can be used as specific markers for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
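
    As a toy numerical illustration of the flux-versus-steady-state distinction made above, the sketch below applies the logic of a turnover-style assay: flux is inferred from how much an autophagosome marker (for example, LC3-II normalized to a loading control) accumulates when lysosomal degradation is blocked, rather than from its steady-state level alone. The marker choice and the densitometry values are illustrative assumptions, not figures from the guidelines.

```python
# Minimal sketch, assuming marker densitometry values normalized to a loading
# control: flux is approximated as the accumulation that depends on blocking
# lysosomal degradation, not as the steady-state level alone. Numbers are
# invented for illustration.
def approx_flux(marker_with_block, marker_without_block):
    """Inhibitor-dependent accumulation of an autophagosome marker."""
    return marker_with_block - marker_without_block

# Two hypothetical conditions with the same steady-state marker level (1.8)
# but very different flux; only the blocked/unblocked comparison separates them.
induction = approx_flux(marker_with_block=4.0, marker_without_block=1.8)
late_block = approx_flux(marker_with_block=2.0, marker_without_block=1.8)
print("increased autophagy, flux ~", induction)    # large accumulation
print("blocked degradation, flux ~", late_block)   # little accumulation
```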