57 research outputs found

    Analysis of pretransplant and post-transplant HLA sensitization in kidney graft recipients: risk factors and prognostic implications.

    Solid organ transplantation has become an established alternative for organ failure in many patients. Overall kidney transplant results have improved dramatically in short- and medium-term follow-up, but improving long-term outcomes still requires a better understanding of the mechanisms of the immune response and precise, real-time immunological monitoring with adequate biomarkers. Pharmacological immunosuppression has been the main strategy for preventing and treating acute rejection (AR) episodes. Increasingly potent immunosuppressive drugs have markedly reduced the rate of rejection and of graft loss from this cause; nevertheless, despite notable advances in controlling AR, the same cannot be said of long-term management, with an estimated annual loss of 5% of kidney grafts. The evidence that circulating donor-specific anti-HLA antibodies (DSA) increase the risk of late graft loss highlights the importance of antibody-mediated rejection. Moreover, the appearance of de novo post-transplant HLA sensitization, particularly when the antibodies are donor-specific, has been associated with poorer graft survival. The main objective of this study was to evaluate the pretransplant humoral response by determining DSA with Luminex technology in our kidney transplant population and to correlate the presence of these antibodies with graft survival, comparing patients with and without antibodies. A further objective was to establish the time course of DSA appearance during the first 24 months after transplantation. The study hypothesis is that the presence of DSA in kidney transplantation adversely affects graft function and may condition graft survival. Identifying both pre- and post-transplant risk factors should shed light on immunological risk prediction, the choice of immunosuppression strategy and the management of these patients.

    This is a mixed observational study. A retrospective phase included 320 patients who received a kidney transplant at the Hospital Universitario Dr. Peset between 1996 and 2006, in whom a banked pretransplant serum sample stored at the C.V. transfusion center was analyzed before Luminex technology entered clinical practice. A prospective phase included 88 patients transplanted consecutively between 2010 and 2012 at the same hospital, in whom anti-HLA antibodies were determined by Luminex-based techniques according to a fixed protocol at 3, 6, 12 and 24 months post-transplant. The diagnoses of post-transplant biopsies performed for clinical indication were re-evaluated using the Banff 2009 classification criteria.

    Pretransplant DSA were associated with female sex, number of transfusions, retransplantation, HCV-positive status and longer time on dialysis, as well as with initial tacrolimus use and thymoglobulin induction therapy. More than half of the patients with DSA were not identified by the classical complement-dependent cytotoxicity (CDC) technique. Almost a third of the patients with DSA became antibody-negative in the post-transplant period, and these patients showed the best graft survival. Patients with DSA were biopsied more frequently during follow-up; diagnoses of acute and chronic humoral rejection were more prevalent in this group, as were glomerulitis, capillaritis and double contours of the peripheral capillary loops. Acute humoral rejection was associated with higher MFI levels. No relationship was demonstrated between DSA and renal function parameters such as serum creatinine or proteinuria during post-transplant follow-up. Pretransplant DSA were accompanied by poorer kidney graft survival and constituted an independent risk factor for the diagnosis of acute humoral rejection and for graft failure. Graft survival was worse in patients with higher MFI titers (>3000 MFI). Recipient survival curves separated from 6 years post-transplant onwards, with worse survival in patients with pretransplant DSA. DSA detected exclusively by Luminex and "low-intensity" DSA were associated with poorer graft survival, although this result lost statistical significance when censoring for recipient death.

    The prospective phase consisted predominantly of male recipients of advanced mean age, a third of whom had received transfusions; 7% were transplanted with preexisting DSA. Donors were also of advanced mean age. All patients received induction therapy (60% with thymoglobulin), and almost the entire series received tacrolimus as the main immunosuppressant. Delayed graft function affected a third of the population. AR occurred in 11% of cases and was humoral in 7% of these. Retransplantation was more frequent among patients with post-transplant DSA. All patients transplanted with DSA maintained them in the post-transplant determinations, and only one patient developed de novo DSA, from 24 months post-transplant onwards, in whom non-adherence to treatment was detected. The presence of DSA was associated with a greater number of biopsies during follow-up, with acute humoral rejection, glomerulitis, capillaritis and double contours of the peripheral capillary loops all more prevalent in this group. Recipients with post-transplant DSA had worse proteinuria from two years post-transplant onwards, but no differences were observed in graft or patient survival.

    We conclude that kidney transplant outcomes in the presence of DSA are inferior to those obtained without preexisting antibodies against the donor. Nevertheless, given the less pathogenic profile of low-intensity DSA and the observation that their post-transplant disappearance is associated with better outcomes, we believe that DSA below 3000 MFI should not be a contraindication to transplantation; these patients could benefit from desensitization programs or potent induction therapy aimed at reducing the waiting times of sensitized patients.
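
    The survival analysis described above (Kaplan-Meier curves separating by pretransplant DSA status) can be sketched in a few lines of Python with the lifelines library. This is a minimal illustration only: the data frame below is synthetic and the column names are assumptions, not the study's dataset.

        import pandas as pd
        from lifelines import KaplanMeierFitter
        from lifelines.statistics import logrank_test

        # Synthetic follow-up data; pretx_dsa = 1 if Luminex detected pretransplant DSA
        df = pd.DataFrame({
            "years":      [2.1, 8.4, 6.0, 12.3, 4.7, 10.1, 7.5, 3.2, 9.0, 11.2],
            "graft_loss": [1,   0,   1,   0,    1,   0,    1,   0,   0,   1],
            "pretx_dsa":  [1,   0,   1,   0,    1,   0,    0,   1,   0,   1],
        })

        kmf = KaplanMeierFitter()
        for label, grp in df.groupby("pretx_dsa"):
            # Fit and summarize one Kaplan-Meier curve per DSA group
            kmf.fit(grp["years"], grp["graft_loss"], label=f"pretransplant DSA = {label}")
            print(kmf.median_survival_time_)

        # Log-rank test for the separation between the two curves
        dsa = df[df["pretx_dsa"] == 1]
        no_dsa = df[df["pretx_dsa"] == 0]
        result = logrank_test(dsa["years"], no_dsa["years"],
                              event_observed_A=dsa["graft_loss"],
                              event_observed_B=no_dsa["graft_loss"])
        print(result.p_value)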

    Willingness to Use a Wearable Device Capable of Detecting and Reversing Overdose Among People Who Use Opioids in Philadelphia

    Background: The incidence of opioid-related overdose deaths has been rising for 30 years and has been further exacerbated amidst the COVID-19 pandemic. Naloxone can reverse opioid overdose, lower death rates, and enable a transition to medication for opioid use disorder. Though current formulations for community use of naloxone have been shown to be safe and effective public health interventions, they rely on bystander presence. We sought to understand the preferences and minimum necessary conditions for wearing a device capable of sensing and reversing opioid overdose among people who regularly use opioids. Methods: We conducted a combined cross-sectional survey and semi-structured interview at a respite center, shelter, and syringe exchange drop-in program in Philadelphia, Pennsylvania, USA during the COVID-19 pandemic in August and September 2020. The primary aim was to explore the proportion of participants who would use a wearable device to detect and reverse overdose. Preferences regarding designs and functionalities were collected via a questionnaire with Likert-based response options and a semi-structured interview intended to elicit feedback on prototype designs. Independent variables included demographics, opioid use habits, and previous experience with overdose. Results: A total of 97 adults with an opioid-use history of at least 3 months were interviewed. On the initial survey, a majority of participants (76%) reported a willingness to use a device capable of detecting an overdose and automatically administering a reversal agent. When reflecting on the prototype, most respondents (75.5%) reported that they would wear the device always or most of the time. Respondents indicated discreetness and comfort as important factors that increased their chance of uptake. Respondents suggested that people experiencing homelessness and those with a low tolerance for opioids would be in greatest need of the device. Conclusions: The majority of people sampled with a history of opioid use in an urban setting were interested in having access to a device capable of detecting and reversing an opioid overdose. Participants emphasized privacy and comfort as the most important factors influencing their willingness to use such a device. Trial Registration: NCT0453059
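
    As a minimal illustration of how sampling uncertainty could be attached to the willingness proportions reported above (the abstract gives point estimates only), a Wilson 95% confidence interval can be computed as below; the count of 74/97 is an assumption back-calculated from the reported 76%.

        from statsmodels.stats.proportion import proportion_confint

        willing, n = 74, 97   # assumed count: roughly 76% of 97 participants
        low, high = proportion_confint(willing, n, alpha=0.05, method="wilson")
        print(f"{willing / n:.1%} willing (95% CI {low:.1%}-{high:.1%})")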

    Impact of HLA Mismatching on Early Subclinical Inflammation in Low-Immunological-Risk Kidney Transplant Recipients

    The impact of human leukocyte antigen (HLA) mismatching on the early appearance of subclinical inflammation (SCI) in low-immunological-risk kidney transplant (KT) recipients is undetermined. We aimed to assess whether HLA mismatching (A-B-C-DR-DQ) is a risk factor for early SCI. As part of a clinical trial (Clinicaltrials.gov, number NCT02284464), a total of 105 low-immunological-risk KT patients underwent a protocol biopsy in the third month post-KT. Of these, 54 presented SCI, showing a greater number of total HLA mismatches (p = 0.008) and worse allograft function compared with the no-inflammation group (48.5 ± 13.6 vs. 60 ± 23.4 mL/min; p = 0.003). Multiple logistic regression showed that the only risk factor associated with SCI was the total HLA-mismatch score (OR 1.32, 95%CI 1.06-1.64, p = 0.013) or class II HLA mismatching (OR 1.51; 95%CI 1.04-2.19, p = 0.032) after adjusting for confounding variables (recipient age, delayed graft function, transfusion prior to KT, and tacrolimus levels). The ROC curve showed that a total of six HLA mismatches was the optimal cutoff in terms of sensitivity and specificity for predicting SCI. Finally, a significantly higher proportion of SCI was seen in patients with >6 vs. ≤6 HLA mismatches (62.3 vs. 37.7%; p = 0.008). HLA mismatching is thus an independent risk factor for early SCI, and transplant physicians should perhaps be more aware of it to reduce these early harmful lesions.
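
    The ROC-derived six-mismatch cutoff can be illustrated with the sketch below, which picks the threshold maximizing Youden's index (sensitivity + specificity − 1); the abstract does not state which optimality criterion was actually used, and the data here are synthetic.

        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        # Synthetic data: total A-B-C-DR-DQ mismatches and 3-month SCI status (1 = SCI)
        mismatches = np.array([2, 4, 5, 6, 7, 8, 3, 9, 6, 5, 1, 7])
        sci        = np.array([0, 0, 0, 1, 1, 1, 0, 1, 1, 0, 0, 1])

        fpr, tpr, thresholds = roc_curve(sci, mismatches)
        youden = tpr - fpr                      # equals sensitivity + specificity - 1
        cutoff = thresholds[np.argmax(youden)]
        print(f"AUC = {roc_auc_score(sci, mismatches):.2f}; optimal cutoff >= {cutoff}")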

    Clinical Relevance of Corticosteroid Withdrawal on Graft Histological Lesions in Low-Immunological-Risk Kidney Transplant Patients

    The impact of corticosteroid withdrawal on medium-term graft histological changes in kidney transplant (KT) recipients under standard immunosuppression is uncertain. As part of an open-label, multicenter, prospective, phase IV, 24-month clinical trial (ClinicalTrials.gov, NCT02284464) in low-immunological-risk KT recipients, 105 patients were randomized, after a protocol biopsy at 3 months, to corticosteroid continuation (CSC, n = 52) or corticosteroid withdrawal (CSW, n = 53). Both groups received tacrolimus and MMF and had another protocol biopsy at 24 months. The acute rejection rate, including subclinical inflammation (SCI), was comparable between groups (21.2 vs. 24.5%). No patients developed dnDSA. Inflammatory and chronicity scores increased from 3 to 24 months in patients with, at baseline, no inflammation (NI) or SCI, regardless of treatment. CSW patients with SCI at 3 months had a significantly increased chronicity score at 24 months. HbA1c levels were lower in CSW patients at 24 months (6.4 ± 1.2 vs. 5.7 ± 0.6%; p = 0.013), as was systolic blood pressure (134.2 ± 14.9 vs. 125.7 ± 15.3 mmHg; p = 0.016). Allograft function was comparable between groups and no patients died or lost their graft. An increase in chronicity scores at 2 years post-transplantation was observed in low-immunological-risk KT recipients with initial NI or SCI, but CSW may accelerate chronicity changes, especially in patients with early SCI. This strategy did, however, improve the cardiovascular profiles of patients.

    How do patients with systemic sclerosis experience currently provided healthcare and how should we measure its quality?

    OBJECTIVES: To gain insight into SSc patients' perspective on quality of care and to survey their preferred quality indicators. METHODS: An online questionnaire about healthcare setting, perceived quality of care (CQ index) and quality indicators was sent to 2093 patients from 13 Dutch hospitals. RESULTS: Six hundred and fifty patients (mean age 59 years, 75% women, 32% limited cutaneous SSc, 20% diffuse cutaneous SSc) completed the questionnaire. Mean time to diagnosis was 4.3 years (s.d. 6.9) and was longer in women than in men (4.8 (s.d. 7.3) vs 2.5 (s.d. 5.0) years). Treatment took place in an SSc expert centre for 58%, in a regional centre for 29%, or in both for 39% of patients. Thirteen percent of patients were not aware whether their hospital was specialized in SSc. The perceived quality of care was rated with a mean score of 3.2 (s.d. 0.5) (range 1.0-4.0). There were no relevant differences between expert and regional centres. The three prioritized process indicators were: good patient-physician interaction (80%), structural multidisciplinary collaboration (46%) and receiving treatment according to SSc guidelines (44%). Absence of disease progression (66%), organ involvement (33%) and digital ulcers (27%) were the three highest rated outcome indicators. CONCLUSION: The perceived quality of care evaluated in our study was fair to good. No differences between expert and regional centres were observed. Our prioritized process and outcome indicators can be added to those suggested by SSc experts in earlier studies and can be used to evaluate the quality of care in SSc.

    What is the optimal level of vitamin D in non-dialysis chronic kidney disease population?

    AIM: To evaluate thresholds for serum 25(OH)D concentrations in relation to death, kidney progression and hospitalization in a non-dialysis chronic kidney disease (CKD) population. METHODS: Four hundred and seventy non-dialysis stage 3-5 CKD patients participating in the OSERCE-2 study, a prospective, multicenter, cohort study, were prospectively evaluated and categorized into 3 groups according to 25(OH)D levels at enrollment (less than 20 ng/mL, between 20 and 29 ng/mL, and at or above 30 ng/mL), with 25(OH)D between 20 and 29 ng/mL taken as the reference group. Associations between 25(OH)D levels and death (primary outcome), and time to first hospitalization and renal progression (secondary outcomes), over a 3-year follow-up were assessed by Kaplan-Meier survival curves and Cox proportional hazards models. To identify the 25(OH)D levels at highest risk for outcomes, receiver operating characteristic (ROC) curves were used. RESULTS: Over 29 ± 12 months of follow-up, 46 (10%) patients died, 156 (33%) showed kidney progression, and 126 (27%) were hospitalized. After multivariate adjustment, 25(OH)D < 20 ng/mL was an independent predictor of all-cause mortality (HR = 2.33; 95%CI: 1.10-4.91; P = 0.027) and kidney progression (HR = 2.46; 95%CI: 1.63-3.71; P < 0.001), whereas the group with 25(OH)D at or above 30 ng/mL did not have a different hazard for outcomes from the reference group. Hospitalization was predicted by 25(OH)D levels (HR = 0.98; 95%CI: 0.96-1.00; P = 0.027) in the unadjusted Cox proportional hazards model, but not after multivariate adjustment. ROC curves identified the 25(OH)D levels at highest risk for death, kidney progression, and hospitalization at 17.4 ng/mL [area under the curve (AUC) = 0.60; 95%CI: 0.52-0.69; P = 0.027], 18.6 ng/mL (AUC = 0.65; 95%CI: 0.60-0.71; P < 0.001), and 19.0 ng/mL (AUC = 0.56; 95%CI: 0.50-0.62; P = 0.048), respectively. CONCLUSION: 25(OH)D < 20 ng/mL was an independent predictor of death and progression in patients with stage 3-5 CKD, with no additional benefit when patients reached the levels at or above 30 ng/mL suggested as optimal by CKD guidelines. Supported by Abbott and the Spanish Society of Nephrology.
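
    The Cox proportional hazards structure described above, with 25(OH)D categorized against the 20-29 ng/mL reference group, might look as follows in Python with lifelines. The data frame is synthetic and the covariate names are assumptions; a single age term stands in for the study's full multivariate adjustment set.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Synthetic cohort; 25(OH)D of 20-29 ng/mL is the (omitted) reference category
        df = pd.DataFrame({
            "months":    [36, 12, 29, 33, 8,  24, 36, 18, 30, 26, 15, 36],
            "death":     [0,  1,  0,  0,  1,  1,  0,  1,  1,  0,  1,  0],
            "vitd_lt20": [1,  1,  0,  0,  1,  1,  0,  0,  0,  1,  1,  0],  # 25(OH)D < 20
            "vitd_ge30": [0,  0,  1,  0,  0,  0,  1,  0,  1,  0,  0,  1],  # 25(OH)D >= 30
            "age":       [67, 72, 58, 61, 75, 70, 55, 63, 59, 66, 71, 57],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="months", event_col="death")
        cph.print_summary()   # the exp(coef) column holds the hazard ratios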

    Randomized Controlled Trial Assessing the Impact of Tacrolimus Versus Cyclosporine on the Incidence of Posttransplant Diabetes Mellitus

    Despite the high incidence of posttransplant diabetes mellitus (PTDM) among high-risk recipients, no studies have investigated its prevention by immunosuppression optimization. We conducted an open-label, multicenter, randomized trial testing whether tacrolimus-based immunosuppression with rapid steroid withdrawal (SW) within 1 week (Tac-SW) or cyclosporine A (CsA) with steroid minimization (SM) (CsA-SM) decreased the incidence of PTDM compared with tacrolimus with SM (Tac-SM). All arms received basiliximab and mycophenolate mofetil. High risk was defined by age >60 years, or age >45 years plus metabolic criteria based on body mass index, triglycerides, and high-density lipoprotein-cholesterol levels. The primary endpoint was the incidence of PTDM after 12 months. The study comprised 128 de novo renal transplant recipients without pretransplant diabetes (Tac-SW: 44, Tac-SM: 42, CsA-SM: 42). The 1-year incidence of PTDM in each arm was 37.8% for Tac-SW, 25.7% for Tac-SM, and 9.7% for CsA-SM (relative risk [RR] Tac-SW vs. CsA-SM 3.9 [1.2-12.4; P = 0.01]; RR Tac-SM vs. CsA-SM 2.7 [0.8-8.9; P = 0.1]). Antidiabetic therapy was required less commonly in the CsA-SM arm (P = 0.06); however, the acute rejection rate was higher in the CsA-SM arm (Tac-SW 11.4%, Tac-SM 4.8%, and CsA-SM 21.4% of patients; cumulative incidence P = 0.04). Graft and patient survival and graft function were similar among arms. In high-risk patients, tacrolimus-based immunosuppression with SM provides the best balance between PTDM and acute rejection incidence.
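
    The reported relative risks can be reproduced in form (not in exact values) from 2x2 counts. The sketch below uses hypothetical event counts chosen only to roughly match the reported 1-year incidences (37.8% Tac-SW vs. 9.7% CsA-SM), with a standard log-scale 95% confidence interval.

        import math

        def relative_risk(events_a, n_a, events_b, n_b):
            """Relative risk of group A vs. group B with a log-scale 95% CI."""
            p_a, p_b = events_a / n_a, events_b / n_b
            rr = p_a / p_b
            # Standard error of log(RR): sqrt(1/a - 1/n_a + 1/b - 1/n_b)
            se = math.sqrt((1 - p_a) / events_a + (1 - p_b) / events_b)
            return rr, rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)

        # Hypothetical counts roughly matching 37.8% (Tac-SW) vs. 9.7% (CsA-SM)
        print(relative_risk(14, 37, 4, 41))   # -> about (3.9, 1.4, 10.7)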

    The effect of cholecalciferol for lowering albuminuria in chronic kidney disease: a prospective controlled study

    Background. Growing evidence indicates that vitamin D receptor activation may have antiproteinuric effects. We aimed to evaluate whether vitamin D supplementation with daily cholecalciferol could reduce albuminuria in proteinuric chronic kidney disease (CKD) patients. Methods. This 6-month prospective, controlled intervention study enrolled 101 non-dialysis CKD patients with albuminuria. Patients with low 25(OH) vitamin D [25(OH)D] and high parathyroid hormone (PTH) levels (n = 50; 49%) received oral cholecalciferol (666 IU/day), whereas those without hyperparathyroidism (n = 51; 51%), independent of their vitamin D status, did not receive any cholecalciferol and were considered the control group. Results. Cholecalciferol administration led to a rise in mean 25(OH)D levels of 53.0 ± 41.6% (P < 0.001). The urinary albumin-to-creatinine ratio (uACR) decreased from (geometric mean with 95% confidence interval) 284 (189–425) to 167 (105–266) mg/g at 6 months (P < 0.001) in the cholecalciferol group, with no change in the control group. The reduction in uACR was observed in the absence of significant changes in other factors that could affect proteinuria, such as weight, blood pressure (BP) or antihypertensive treatment. Six-month changes in 25(OH)D levels were significantly and inversely associated with changes in uACR (Pearson's R = −0.519; P = 0.036), after adjustment for age, sex, body mass index, BP, glomerular filtration rate and antiproteinuric treatment. Mean PTH decreased by 13.8 ± 20.3% (P = 0.039) only in treated patients, with a mild rise in phosphate and calcium–phosphate product [7.0 ± 14.7% (P = 0.002) and 7.2 ± 15.2% (P = 0.003), respectively]. Conclusions. In addition to improving hyperparathyroidism, vitamin D supplementation with daily cholecalciferol had a beneficial effect in decreasing albuminuria, with potential effects on delaying the progression of CKD.
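
    The geometric-mean uACR summaries quoted above (e.g. 284 (189–425) mg/g) follow from log-transforming the values, averaging, and exponentiating back; a minimal sketch with synthetic values:

        import numpy as np
        from scipy import stats

        uacr = np.array([120, 250, 310, 480, 95, 700, 260, 150])  # mg/g, synthetic
        log_uacr = np.log(uacr)
        gm = np.exp(log_uacr.mean())                # geometric mean
        lo, hi = np.exp(stats.t.interval(0.95, len(uacr) - 1,
                                         loc=log_uacr.mean(),
                                         scale=stats.sem(log_uacr)))
        print(f"geometric mean {gm:.0f} mg/g (95% CI {lo:.0f}-{hi:.0f})")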