    CXCR4 Physically Associates with the T Cell Receptor to Signal in T Cells

    Summary: SDF-1α (CXCL12) signaling via its receptor, CXCR4, stimulates T cell chemotaxis and gene expression. The ZAP-70 tyrosine kinase critically mediates SDF-1α-dependent migration and prolonged ERK mitogen-activated protein (MAP) kinase activation in T cells. However, the molecular mechanism by which CXCR4 or other G protein-coupled receptors activate ZAP-70 has not been characterized. Here we show that SDF-1α stimulates the physical association of CXCR4 and the T cell receptor (TCR) and utilizes the ZAP-70-binding ITAM domains of the TCR for signal transduction. This pathway is responsible for several of the effects of SDF-1α on T cells, including prolonged ERK MAP kinase activity, increased intracellular calcium ion concentrations, robust AP-1 transcriptional activity, and SDF-1α costimulation of cytokine secretion. These results suggest new paradigms for understanding the effects of SDF-1α and other chemokines on immunity.

    Dietary lithium intake, graft failure and mortality in kidney transplant recipients

    BACKGROUND &amp; AIMS: Long-term high dose lithium therapy in bipolar disorder is known to adversely affect kidney function. However, recent animal studies revealed that low amounts of lithium are beneficial for the kidney when it is damaged by exposure to nephrotoxic compounds, inflammation, or oxidative stress. This study aimed to investigate whether urinary lithium excretion, reflecting dietary lithium intake, is associated with adverse long-term kidney graft outcomes and patient survival.METHODS: Urinary lithium concentration was measured using inductively coupled plasma-mass-spectrometry in 642 stable kidney transplant recipients. Graft failure was defined as start of dialysis or re-transplantation, and kidney function decline was defined as doubling of serum creatinine.RESULTS: Median [interquartile range] urinary lithium excretion was 3.03 [2.31-4.01] μmol/24 h. Urinary lithium excretion was associated with energy, plant protein and water intake. During a median follow-up of 5.3 [4.5-6.0] years, 79 (12%) KTR developed graft failure and 127 (20%) KTR developed kidney function decline. Higher urinary lithium excretion was associated with lower risk of graft failure (hazard ratio [95% confidence interval]: 0.54 [0.38-0.79] per log2 μmol/24 h) and kidney function decline (HR [95% CI]: 0.73 [0.54-0.99] per log2 μmol/24 h). These associations remained independent of adjustment for potential confounders and in sensitivity analyses. There was significant effect modification by use of proliferation inhibitors (P = 0.05) and baseline eGFR (P &lt; 0.001), with higher urinary lithium excretion being more protective in KTR not using proliferation inhibitors and in KTR with lower baseline eGFR. Furthermore, higher urinary lithium excretion was associated with reduced risk of all-cause mortality (HR [95% CI]: 0.64 [0.49-0.83]; P = 0.001).CONCLUSION: Dietary lithium intake may be a potentially modifiable-yet rather overlooked-risk factor for adverse long-term kidney graft outcomes and patient survival.</p

    A novel nonsense mutation of EXT1 gene in an Argentinian patient with Multiple Hereditary Exostoses

    Multiple hereditary exostoses (MHE), also known as multiple osteochondromatosis, is an autosomal-dominant O-linked glycosylation disorder recently classified as EXT1/EXT2-CDG in the congenital disorders of glycosylation (CDG) nomenclature. MHE is characterized by the presence of multiple cartilage-capped tumors, called "osteochondromas," which usually develop in the juxta-epiphyseal regions of the long bones. The prevalence of MHE is estimated at 1:50,000 in the general population [1,2]. The Online Mendelian Inheritance in Man (OMIM) database classifies it as either 133700 or 133701, according to whether the mutations occur in the EXT1 or the EXT2 gene. These genes are located at 8q24 and 11p11-11p12, respectively, and they encode the copolymerases responsible for heparan sulfate biosynthesis. EXT1 and EXT2 are tumor suppressor genes of the EXT gene family. The EXT1 gene contains eleven exons with a coding region of 2238 base pairs (bp), and the EXT2 gene contains sixteen exons with a coding region of 2154 bp [3-6]. These genes encode two glycosyltransferases involved in heparan sulfate biosynthesis, exostosin-1 (EXT1) (EC 2.4.1.224) and exostosin-2 (EXT2) (EC 2.4.1.225), whose impairment leads to the formation of exostoses [4,7-9]. Inactivating mutations (nonsense, frameshift, and splice-site mutations) in the EXT1 and EXT2 genes represent the majority of mutations that cause MHE. An overview of the reported variants is provided by the online Multiple Osteochondroma Mutation Database [10].
    The most important complication of MHE is the malignant transformation of osteochondroma to chondrosarcoma, which is estimated to occur in 0.5% to 5% of patients [6]. Chondrosarcomas arise de novo (primary) or as a result of a preexisting cartilage lesion (secondary). The biological aggressiveness of chondrosarcomas can be predicted by means of a histological grading system (grade I to grade III) based on three parameters: cellularity, degree of nuclear atypia, and mitotic activity [11,12].
    In our case report, we investigated the clinical, radiographic, and genetic aspects of a patient with MHE with a severe phenotype and malignant transformation to chondrosarcoma.

    A broad spectrum of genomic changes in Latinamerican patients with EXT1/EXT2-CDG

    Multiple osteochondromatosis (MO), or EXT1/EXT2-CDG, is an autosomal dominant O-linked glycosylation disorder characterized by the formation of multiple cartilage-capped tumors (osteochondromas). In contrast, solitary osteochondroma (SO) is a non-hereditary condition. EXT1 and EXT2 are tumor suppressor genes that encode glycosyltransferases involved in heparan sulfate elongation. We present the clinical and molecular analysis of 33 unrelated Latin American patients (27 MO and 6 SO). Sixty-three percent of all MO cases presented a severe phenotype, and two (7%) underwent malignant transformation to chondrosarcoma. We found the mutant allele in 78% of MO patients. Ten mutations were novel. The disease-causing mutations remained unknown in 22% of the MO patients and in all SO patients. No second mutational hit was detected in the DNA of the secondary chondrosarcoma from a patient who carried a nonsense EXT1 mutation. Neither the EXT1 nor the EXT2 protein could be detected in this sample. This is the first Latin American research program on EXT1/EXT2-CDG.

    Low selenium intake is associated with risk of all-cause mortality in kidney transplant recipients

    Background: Deficiency of the essential trace element selenium is common in kidney transplant recipients (KTR), potentially hampering antioxidant and anti-inflammatory defence. Whether this impacts the long-term outcomes of KTR remains unknown. We investigated the association of urinary selenium excretion, a biomarker of selenium intake, with all-cause mortality, as well as its dietary determinants.
    Methods: In this cohort study, outpatient KTR with a functioning graft for longer than 1 year were recruited (2008–11). Baseline 24-h urinary selenium excretion was measured by mass spectrometry. Diet was assessed by a 177-item food frequency questionnaire, and protein intake was calculated by the Maroni equation. Multivariable linear and Cox regression analyses were performed.
    Results: In 693 KTR (43% men, 52 ± 12 years), baseline urinary selenium excretion was 18.8 (interquartile range 15.1–23.4) μg/24 h. During a median follow-up of 8 years, 229 (33%) KTR died. KTR in the first tertile of urinary selenium excretion, compared with those in the third, had over a 2-fold risk of all-cause mortality [hazard ratio 2.36 (95% confidence interval 1.70–3.28); P < .001], independent of multiple potential confounders including time since transplantation and plasma albumin concentration. The most important dietary determinant of urinary selenium excretion was protein intake (standardized β 0.49, P < .001).
    Conclusions: Relatively low selenium intake is associated with a higher risk of all-cause mortality in KTR. Dietary protein intake is its most important determinant. Further research is required to evaluate the potential benefit of accounting for selenium intake in the care of KTR, particularly among those with low protein intake.
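
    Protein intake in this study was calculated with the Maroni equation, which estimates nitrogen intake from 24-h urinary urea nitrogen plus a fixed allowance for non-urea nitrogen losses, then converts nitrogen to protein (1 g nitrogen ≈ 6.25 g protein). A minimal sketch of the standard formula, with variable names of my choosing:

        def maroni_protein_intake(uun_g_per_day: float, weight_kg: float) -> float:
            """Estimate dietary protein intake (g/day) via the Maroni equation.

            Protein is ~16% nitrogen (hence the factor 6.25); 0.031 g N/kg/day
            approximates non-urea nitrogen losses.
            """
            return 6.25 * (uun_g_per_day + 0.031 * weight_kg)

        # Example: 10 g/day urinary urea nitrogen, 75 kg body weight
        print(maroni_protein_intake(10.0, 75.0))  # ~77 g protein/day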

    Volcanic fertilization of Balinese rice paddies

    Since the advent of high-yielding "Green Revolution" rice agriculture in the 1970s, Balinese farmers have been advised to supply all the potassium and phosphate needed by rice plants via chemical fertilizers. This policy neglects the contribution of minerals leached from the volcanic soil and transported via irrigation systems. We documented frequent deposition of volcanic ash in rice-producing watersheds. Concentrations of phosphorus in rivers were between 1 and 4 mg l⁻¹ PO₄, increasing downstream. We measured extractable potassium and phosphate levels in the soils of unfertilized Balinese rice paddies, and found them to be indistinguishable from those in fertilized paddies, and sufficient for high grain yields. Field experiments varying phosphorus applications to rice fields from 0 to 100 kg superphosphate per hectare (7-26 kg P ha⁻¹) demonstrated small increases in harvest yields only with the smallest additions. Direct measurements of PO₄ in irrigation waters indicate that most of the added phosphate flows out of the paddies and into the river systems, accumulating to very high levels before reaching the coast.
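
    For a sense of the riverine loads implied by these concentrations, a back-of-the-envelope flux calculation follows; the discharge figure is an arbitrary assumption for illustration, not a measurement from the study.

        # Convert a river PO4 concentration into a phosphorus mass flux.
        # The discharge is an assumed round number, not study data.
        M_P, M_PO4 = 30.97, 94.97            # molar masses, g/mol
        p_per_po4 = M_P / M_PO4              # ~0.33 g P per g PO4

        conc_po4_mg_per_l = 2.0              # within the reported 1-4 mg/l range
        discharge_l_per_s = 1000.0           # assumed small-river discharge (1 m3/s)

        flux_g_per_s = conc_po4_mg_per_l * 1e-3 * p_per_po4 * discharge_l_per_s
        print(f"{flux_g_per_s * 86400 / 1000:.0f} kg P/day")  # ~56 at these values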