8 research outputs found

    Common Variants in the Glycerol Kinase Gene Reduce Tuberculosis Drug Efficacy

    Despite the administration of multiple drugs that are highly effective in vitro, tuberculosis (TB) treatment requires prolonged drug administration and is confounded by the emergence of drug-resistant strains. To understand the mechanisms that limit antibiotic efficacy, we performed a comprehensive genetic study to identify Mycobacterium tuberculosis genes that alter the rate of bacterial clearance in drug-treated mice. Several functionally distinct bacterial genes were found to alter bacterial clearance, and prominent among these was the glpK gene that encodes the glycerol-3-kinase enzyme that is necessary for glycerol catabolism. Growth on glycerol generally increased the sensitivity of M. tuberculosis to antibiotics in vitro, and glpK-deficient bacteria persisted during antibiotic treatment in vivo, particularly during exposure to pyrazinamide-containing regimens. Frameshift mutations in a hypervariable homopolymeric region of the glpK gene were found to be a specific marker of multidrug resistance in clinical M. tuberculosis isolates, and these loss-of-function alleles were also enriched in extensively drug-resistant clones. These data indicate that frequently observed variation in the glpK coding sequence produces a drug-tolerant phenotype that can reduce antibiotic efficacy and may contribute to the evolution of resistance. IMPORTANCE: TB control is limited in part by the length of antibiotic treatment needed to prevent recurrent disease. To probe mechanisms underlying survival under antibiotic pressure, we performed a genetic screen for M. tuberculosis mutants with altered susceptibility to treatment using the mouse model of TB. We identified multiple genes involved in a range of functions that alter sensitivity to antibiotics. In particular, we found that glycerol catabolism mutants were less susceptible to treatment and that common variation in a homopolymeric region of the glpK gene was associated with drug resistance in clinical isolates.
These studies indicate that reversible high-frequency variation in carbon metabolic pathways can produce phenotypically drug-tolerant clones and may have a role in the development of resistance.
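The frameshift mechanism described above (insertion or deletion within a homopolymeric tract) can be illustrated with a minimal sketch. The sequences and the tract position below are hypothetical, chosen only to show how a one-base slippage event shifts the reading frame; they are not the actual glpK coordinates.

```python
def homopolymer_run(seq, base, start):
    """Length of the run of `base` beginning at index `start` (0-based)."""
    n = 0
    while start + n < len(seq) and seq[start + n] == base:
        n += 1
    return n

# Hypothetical example: a 7-C homopolymer whose expansion to 8 C's
# shifts the downstream reading frame (an indel not divisible by 3).
reference = "ATGCCCCCCCGAT"   # run of 7 C's starting at index 3
variant   = "ATGCCCCCCCCGAT"  # one inserted C

ref_len = homopolymer_run(reference, "C", 3)
var_len = homopolymer_run(variant, "C", 3)
frameshift = (var_len - ref_len) % 3 != 0
```

Because slipped-strand mispairing in such tracts is frequent and reversible, the same logic explains why the resulting drug-tolerant phenotype can switch on and off at high frequency.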

    Impact of Induction Immunosuppressants on T Lymphocyte Subsets after Kidney Transplantation: A Prospective Observational Study with Focus on Anti-Thymocyte Globulin and Basiliximab Induction Therapies

    Induction immunosuppressive therapy for kidney transplant recipients (KTRs) primarily includes interleukin-2 receptor antagonists, such as basiliximab (BXM), or lymphocyte-depleting agents, such as anti-thymocyte globulin (ATG). This study aimed to investigate their effects on T cell dynamics during the early post-transplantation period. This prospective observational study included 157 KTRs. Peripheral blood samples were collected from each patient within 5 days before and 4 and 12 weeks after transplantation. Flow cytometric analysis was performed to assess various T cell subsets, whose changes were then analyzed. In the ATG group, CD4+ T cell expression decreased significantly compared with that in the BXM group, whereas CD4+CD161+ and CD4+CD25+CD127low T cell expression levels increased significantly. In the CD8+ T cell subset, a decrease in CD8+CD28nullCD57+ and CD8+CCR7+ T cell expression was observed in the ATG group. However, among patients diagnosed with biopsy-proven acute rejection, T cell subset expression did not differ significantly relative to non-rejection cases. In conclusion, ATG induction therapy resulted in more pronounced changes in T lymphocyte subsets than BXM induction, with increased CD4+CD161+ and CD4+CD25+CD127low T cells and an early decrease in CD8+CD28nullCD57+ and CD8+CCR7+ T cells, some of which are associated with acute rejection.

    Changes in Fatigue Recovery and Muscle Damage Enzymes after Deep-Sea Water Thalassotherapy

    The purpose of this study was to verify the effect of deep-sea water thalassotherapy (DSWTT) on recovery from fatigue and muscle damage. The same exercise program was conducted in both regular water and deep-sea water to isolate the characteristics of deep-sea water through fatigue recovery and muscle damage enzymes. A total of 30 male college students were studied: 10 in the control group (CG), 10 in the water exercise group (WEG), and 10 in the deep-sea water exercise group (DSWEG). The DSWTT treatment consists of three components (preheating, treatment, and cooling), and the DSWTT program includes stretching and massage of the upper body, lower body, back, and whole body for a total of 25 min in a deep-sea water tank. After the DSWTT program, blood tests were conducted to measure fatigue-related parameters, including glucose, lactate, ammonia, and lactate dehydrogenase (LDH), and muscle damage enzymes, including creatine kinase (CK) and aspartate aminotransferase (AST). The results revealed a main effect of fatigue (p < 0.001) and a significant interaction (p < 0.001) for lactate, ammonia, and LDH levels, whereas the glucose level remained unchanged. Post hoc tests showed a significant decrease in these parameters in DSWEG compared with CG and WEG (p < 0.01). Muscle damage enzymes (CK and AST) likewise showed a main effect (p < 0.001) and a significant interaction (p < 0.001), and post hoc tests showed a significant decrease in DSWEG compared with CG and WEG (p < 0.01). In conclusion, the DSWTT program applied in this study had significant effects on recovery from muscle fatigue and muscle damage. When the DSWTT program is applied in hot springs, it can have a positive effect on recovery from muscle fatigue and muscle damage and can contribute to improving national health and quality of life.
Further studies are needed to investigate DSWTT programs with various research subjects, program temperatures, exercise times, and frequencies of treatment and exercise.

    Combined Analysis of HLA Class II Eplet Mismatch and Tacrolimus Levels for the Prediction of De Novo Donor Specific Antibody Development in Kidney Transplant Recipients

    We investigated whether HLA class II eplet mismatch was related to de novo donor-specific antibody (dnDSA) development and analyzed its combined impact with tacrolimus levels on kidney transplantation outcomes. A total of 347 kidney transplants were included. HLAMatchmaker was used for the single molecular eplet, total eplet, and antibody (Ab)-verified eplet mismatch analyses, as well as the Ab-verified single molecular analysis, to identify HLA-DR/DQ molecular thresholds for the risk of dnDSA development. A time-weighted tacrolimus trough level (TAC-C0) of 5 ng/mL and a TAC-C0 time-weighted coefficient of variation (TWCV) of 20% were applied to assess combined effects on dnDSA development. A high level of mismatch for the single molecular eplet (DQ ≥ 10), total eplet (DQ ≥ 12), Ab-verified eplet (DQ ≥ 4), and Ab-verified single molecular eplet (DQ ≥ 4) significantly correlated with HLA class II dnDSA development. Class II dnDSA developed mostly in patients with low TAC-C0 and high eplet mismatch. In the multivariable analyses, the combination of low TAC-C0 and high eplet mismatch showed the highest hazard ratio for the development of dnDSA. No significant combined effect on dnDSA development was observed according to TWCV. In conclusion, determination of HLA class II eplet mismatch may improve risk stratification for dnDSA development, especially in conjunction with tacrolimus trough levels.
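The time-weighted coefficient of variation applied to TAC-C0 above can be sketched as follows. This is a minimal illustration that assumes each trough level is weighted by the number of days until the next measurement; the study's exact weighting scheme is not given here, and the sample values are hypothetical.

```python
def time_weighted_cv(days, levels):
    """Time-weighted coefficient of variation (%) of trough levels.

    Assumption: each level is weighted by the interval (in days) until
    the next measurement; the final level, which has no following
    interval, is dropped from the calculation.
    """
    weights = [t2 - t1 for t1, t2 in zip(days, days[1:])]
    vals = levels[:len(weights)]
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, vals)) / total
    var = sum(w * (v - mean) ** 2 for w, v in zip(weights, vals)) / total
    return 100.0 * var ** 0.5 / mean

# Hypothetical TAC-C0 samples (ng/mL) drawn on days 0, 7, 14, and 21
cv = time_weighted_cv([0, 7, 14, 21], [6.0, 8.0, 4.0, 5.0])
```

With equally spaced visits this reduces to the ordinary coefficient of variation; the time weighting matters when sampling intervals are irregular, as they typically are after transplantation.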

    Impact of delayed graft function on clinical outcomes in highly sensitized patients after deceased-donor kidney transplantation

    Background: We investigated whether the development of delayed graft function (DGF) in pre-sensitized patients affects clinical outcomes after deceased-donor kidney transplantation (DDKT). Methods: The study included 709 kidney transplant recipients (KTRs) from three transplant centers. We divided the KTRs into four subgroups (highly sensitized DGF, highly sensitized non-DGF, low-sensitized DGF, and low-sensitized non-DGF) according to a panel reactive antibody level cutoff of 50% and the development of DGF. We compared post-transplant clinical outcomes among the four subgroups. Results: The incidence of biopsy-proven acute rejection (BPAR) was higher in the two highly sensitized subgroups than in the low-sensitized subgroups, and it tended to be higher in the highly sensitized DGF subgroup than in the highly sensitized non-DGF subgroup. In addition, the highly sensitized DGF subgroup showed the highest risk for BPAR (hazard ratio, 3.051; P=0.005) and independently predicted BPAR. Allograft function was lower in the two DGF subgroups than in the non-DGF subgroups until one month after transplantation but was similar thereafter. Death-censored graft loss rates and patient mortality tended to be low when DGF developed, but these differences did not reach statistical significance. Conclusions: The development of DGF in highly sensitized patients increases the risk for BPAR after DDKT compared with patients without DGF, suggesting the need for strict monitoring and management of such cases.

    Efficacy and safety of controlled-release oxycodone/naloxone versus controlled-release oxycodone in Korean patients with cancer-related pain: a randomized controlled trial

    Background: Controlled-release oxycodone/naloxone (OXN-CR) maintains the opioid-induced analgesic effect through oxycodone while reducing the occurrence rate of opioid-induced constipation through naloxone. The present study was designed to assess the non-inferiority of OXN-CR to controlled-release oxycodone (OX-CR) for the control of cancer-related pain in Korean patients. Methods: In this randomized, open-label, parallel-group, phase IV study, we enrolled patients aged 20 years or older with moderate to severe cancer-related pain [numeric rating scale (NRS) pain score ≥4] from seven Korean oncology/hematology centers. Patients in the intention-to-treat (ITT) population were randomized (1:1) to the OXN-CR or OX-CR group. OXN-CR was administered starting at 20 mg/10 mg per day and up-titrated to a maximum of 80 mg/40 mg per day for 4 weeks, and OX-CR was administered starting at 20 mg/day and up-titrated to a maximum of 80 mg/day for 4 weeks. The primary efficacy endpoint was the change in NRS pain score from baseline to week 4, with a non-inferiority margin of −1.5. Secondary endpoints included analgesic rescue medication intake, patient-reported change in bowel habits, laxative intake, quality of life (QoL), and safety assessments. Results: Of the ITT population comprising 128 patients, 7 with missing primary efficacy data and 4 who violated the eligibility criteria were excluded from the efficacy analysis. At week 4, the mean change in NRS pain scores did not differ significantly between the OXN-CR group (n = 58) and the OX-CR group (n = 59) (−1.586 vs. −1.559, P = 0.948). The lower limit of the one-sided 95% confidence interval for the difference (−0.776 to 0.830) exceeded the non-inferiority margin (P < 0.001). The OXN-CR and OX-CR groups also did not differ significantly in analgesic rescue medication intake, change in bowel habits, laxative intake, QoL, or safety assessments.
Conclusions: OXN-CR was non-inferior to OX-CR in terms of pain reduction after 4 weeks of treatment and had a similar safety profile. Studies in larger populations of Korean patients with cancer-related pain are needed to further investigate the effectiveness of OXN-CR for long-term pain control and constipation alleviation. Trial registration: ClinicalTrials.gov NCT01313780, registered March 8, 201
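The non-inferiority criterion used above (the lower confidence bound of the between-group difference must exceed the −1.5 margin) can be sketched as follows. The standard error below is a hypothetical value chosen for illustration; only the mean changes and the margin come from the trial report.

```python
def noninferiority(diff, se, margin=-1.5, z_one_sided=1.645):
    """Return (is_non_inferior, lower_bound).

    Non-inferiority holds when the one-sided 95% lower confidence
    bound of the mean difference (test minus control) exceeds the
    pre-specified margin.
    """
    lower = diff - z_one_sided * se
    return lower > margin, lower

# Mean NRS changes from the trial (OXN-CR −1.586 vs. OX-CR −1.559);
# the standard error 0.45 is hypothetical, for illustration only.
ok, lb = noninferiority(-1.586 - (-1.559), 0.45)
```

Note that a near-zero difference alone does not establish non-inferiority; the test depends on the whole lower bound clearing the margin, which is why the trial reports both the interval and the margin.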

    DataSheet_1_Combined impact of the inter and intra-patient variability of tacrolimus blood level on allograft outcomes in kidney transplantation.docx

    Introduction: Tacrolimus (TAC) has been widely used as an immunosuppressant after kidney transplantation (KT); however, the combined effects of the intra-patient variability (IPV) and inter-patient variability of the TAC trough level (C0) in blood remain controversial. This study aimed to determine the combined impact of TAC-IPV and TAC inter-patient variability on allograft outcomes of KT. Methods: In total, 1,080 immunologically low-risk patients who were not sensitized to donor human leukocyte antigen (HLA) were enrolled. TAC-IPV was calculated using the time-weighted coefficient of variation (TWCV) of TAC-C0, and values > 30% were classified as high IPV. The concentration-to-dose ratio (CDR) was used to assess TAC inter-patient variability, and patients with a low CDR were classified as rapid metabolizers (RM). Results: The incidences of death-censored graft loss (DCGL), biopsy-proven acute rejection (BPAR), and overall graft loss were highest in the high-IPV/RM group. In addition, high IPV/RM was identified as an independent risk factor for DCGL. The hazard ratio of high IPV/RM for DCGL and the incidence of active antibody-mediated rejection were considerably increased in the panel reactive antibody (PRA)-positive subgroup. Discussion: High IPV combined with RM (inter-patient variability) was closely related to adverse allograft outcomes; hence, more attention must be given to pre-transplant PRA-positive patients.