    Plasma Lead Concentration and Risk of Late Kidney Allograft Failure: Findings From the TransplantLines Biobank and Cohort Studies

    Rationale &amp; Objective: Heavy metals are known to induce kidney damage, and recent studies have linked minor exposures to cadmium and arsenic with increased risk of kidney allograft failure, yet the potential association of lead with late graft failure in kidney transplant recipients (KTRs) remains unknown. Study Design: Prospective cohort study in The Netherlands. Setting &amp; Participants: We studied outpatient KTRs (n = 670) with a functioning graft for ≥1 year recruited at a university setting (2008-2011) and followed for a median of 4.9 (interquartile range, 3.4-5.5) years. Additionally, patients with chronic kidney disease (n = 46) enrolled in the ongoing TransplantLines Cohort and Biobank Study (2016-2017, ClinicalTrials.gov identifier NCT03272841) were studied at admission for transplant and at 3, 6, 12, and 24 months after transplant. Exposure: Plasma lead concentration was log2-transformed to estimate the association with outcomes per doubling of plasma lead concentration and also considered categorically as tertiles of lead distribution. Outcome: Kidney graft failure (restart of dialysis or repeat transplant) with the competing event of death with a functioning graft. Analytical Approach: Multivariable-adjusted cause-specific hazards models in which follow-up of KTRs who died with a functioning graft was censored. Results: Median baseline plasma lead concentration was 0.31 (interquartile range, 0.22-0.45) μg/L among all KTRs. During follow-up, 78 (12%) KTRs experienced graft failure. Higher plasma lead concentration was associated with increased risk of graft failure (hazard ratio, 1.59 [95% CI, 1.14-2.21] per doubling; P = 0.006) independent of age, sex, transplant characteristics, estimated glomerular filtration rate, proteinuria, smoking status, alcohol intake, and plasma concentrations of cadmium and arsenic. These findings remained materially unchanged after additional adjustment for dietary intake and were consistent with those of analyses examining lead categorically. In serial measurements, plasma lead concentration was significantly higher at admission for transplant than at 3 months after transplant (P = 0.001), after which it remained stable over 2 years of follow-up (P = 0.2). Limitations: Observational study design. Conclusions: Pretransplant plasma lead concentrations, which decrease after transplant, are associated with increased risk of late kidney allograft failure. These findings warrant further studies to evaluate whether preventive or therapeutic interventions to decrease plasma lead concentration may represent novel risk-management strategies to decrease the rate of kidney allograft failure.</p

    Low circulating concentrations of very long chain saturated fatty acids are associated with high risk of mortality in kidney transplant recipients

    Kidney transplant recipients (KTR) are at increased risk of mortality, particularly from infectious diseases, due to lifelong immunosuppression. Although very long chain saturated fatty acids (VLSFA) have been identified as crucial for phagocytosis and clearance of infections, their association with mortality in immunocompromised patient groups has not been studied. In this prospective cohort study we included 680 outpatient KTR with a functioning graft for ≥1 year and 193 healthy controls. Plasma VLSFA (arachidic acid (C20:0), behenic acid (C22:0) and lignoceric acid (C24:0)) were measured by gas chromatography coupled with a flame ionization detector. Cox regression analysis was used to prospectively study the associations of VLSFA with all-cause and cause-specific mortality. All studied VLSFA were significantly lower in KTR compared to healthy controls (all p < 0.001). During a median (interquartile range) follow-up of 5.6 (5.2–6.3) years, 146 (21%) KTR died, of whom 41 (28%) died of infectious diseases. In KTR, C22:0 was inversely associated with risk of all-cause mortality, with a HR (95% CI) per 1-SD increment of 0.79 (0.64–0.99), independent of adjustment for potential confounders. All studied VLSFA were particularly strongly associated with mortality from infectious causes, with respective HRs for C20:0, C22:0 and C24:0 of 0.53 (0.35–0.82), 0.48 (0.30–0.75), and 0.51 (0.33–0.80), independent of potential confounders. VLSFA are inversely associated with infectious disease mortality in KTR after adjustment for potential confounders, including HDL cholesterol. Further studies are needed to assess the effect of VLSFA-containing foods on the risk of infectious diseases in immunocompromised patient groups.
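    The "per 1-SD increment" hazard ratios can be obtained by standardizing each fatty acid concentration before fitting the Cox model, so that exp(coef) is read per standard-deviation increase. A brief sketch of that step under the same assumptions as above (Python, lifelines, hypothetical file and column names):

```python
# Sketch: hazard ratio per 1-SD increment of a VLSFA (here C22:0, behenic acid).
# File and column names are hypothetical; covariates assumed numerically coded.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("vlsfa_cohort.csv")  # hypothetical dataset

# Standardize so a one-unit change equals one standard deviation.
df["c22_0_z"] = (df["c22_0"] - df["c22_0"].mean()) / df["c22_0"].std()

cols = ["c22_0_z", "age", "sex", "hdl_cholesterol", "years_followup", "died"]
cph = CoxPHFitter()
cph.fit(df[cols], duration_col="years_followup", event_col="died")

# Values below 1 indicate a lower mortality hazard per SD increase in C22:0.
print(cph.hazard_ratios_["c22_0_z"])
```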

    Bone Mineral Density and Aortic Calcification: Evidence for a Bone-Vascular Axis After Kidney Transplantation

    BACKGROUND: Chronic kidney disease-mineral and bone disorder (CKD-MBD) and vascular calcification are often seen in kidney transplant recipients (KTR). This study addressed the bone-vascular axis hypothesis, which posits shared pathophysiological mechanisms driving both bone loss and vascular calcification and which would be supported by an association between lower bone mineral density (BMD) and a higher burden of vascular calcification. METHODS: KTR referred for a dual-energy X-ray absorptiometry procedure within 6 months after transplantation were included in a cross-sectional study (2004-2014). Areal BMD was measured at the proximal femur, and abdominal aortic calcification (AAC) was quantified (8-point score) from lateral single-energy images of the lumbar spine. Patients were divided into 3 AAC categories (negative-AAC: AAC 0; low-AAC: AAC 1-3; and high-AAC: AAC 4-8). Multivariable-adjusted multinomial logistic regression models were used to study the association between BMD and AAC. RESULTS: We included 678 KTR (51 ± 13 years old, 58% males); 366 (54%) had BMD disorders, and 266 (39%) had detectable calcification. High-AAC was observed in 9%, 11%, and 25% of KTR with normal BMD, osteopenia, and osteoporosis, respectively (P < 0.001). Higher BMD (T-score, continuous) was associated with a lower risk of high-AAC (odds ratio 0.61, 95% confidence interval 0.42-0.88; P = 0.008), independent of age, sex, body mass index, estimated glomerular filtration rate, and immunosuppressive therapy. KTR with normal BMD were less likely to have high-AAC (odds ratio 0.24, 95% confidence interval 0.08-0.72; P = 0.01). CONCLUSIONS: BMD disorders are highly prevalent in KTR. The independent inverse association between BMD and AAC may provide evidence for the existence, and highlights the clinical and epidemiological relevance, of a bone-vascular axis after kidney transplantation.
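    The association between BMD and the three AAC categories was modeled with multinomial logistic regression; a minimal sketch of such a model in Python with statsmodels follows. The file and column names are hypothetical, and negative-AAC (the lowest category) is taken as the reference outcome.

```python
# Sketch: multinomial logistic regression of AAC category on femoral BMD T-score.
# AAC coding assumed: 0 = negative (AAC 0), 1 = low (AAC 1-3), 2 = high (AAC 4-8).
# File and column names are hypothetical; covariates assumed numerically coded.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("ktr_dexa.csv")  # hypothetical dataset

X = sm.add_constant(df[["bmd_t_score", "age", "sex", "bmi", "egfr"]])
y = df["aac_category"]  # category 0 is the reference outcome

res = sm.MNLogit(y, X).fit()

# Exponentiated coefficients: odds of low- or high-AAC versus negative-AAC
# per one-unit increase in each covariate (e.g. per T-score unit for BMD).
print(np.exp(res.params))
```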

    A proposal for the training of cytotechnologists from the CPLP Cytotechnology SIG (SIG-Citotecnologia)

    SIGs are Special Interest Groups that seek to promote integration and the exchange of experiences among Telehealth Units and Centers that are members of the Telemedicine University Network (RUTE), a program of the National Education and Research Network (RNP/MCTI). The group's goal is interaction among cytotechnology professionals from Portuguese-speaking countries in order to promote training, discuss joint projects, and publish teaching materials in the field. The objective of this document was to define the competencies to be acquired by cytotechnology students (basic and/or advanced training) so that training institutions have guidance for designing and updating educational programs aimed at the field of cytotechnology.

    Consumption of fruits and vegetables and cardiovascular mortality in renal transplant recipients:A prospective cohort study

    Background: It currently remains understudied whether low consumption of fruits and vegetables after kidney transplantation may be a modifiable cardiovascular risk factor. We aimed to investigate the associations between consumption of fruits and vegetables and cardiovascular mortality in renal transplant recipients (RTRs). Methods: Consumption of fruits and vegetables was assessed in an extensively phenotyped cohort of RTRs. Multivariable-adjusted Cox proportional hazards regression analyses were performed to assess the risk of cardiovascular mortality. Results: We included 400 RTRs (age 52 ± 12 years, 54% males). At a median follow-up of 7.2 years, 23% of RTRs had died (53% of deaths were due to cardiovascular causes). Overall, fruit consumption was not associated with cardiovascular mortality [hazard ratio (HR) 0.82 (95% confidence interval (CI) 0.60-1.14); P = 0.24], whereas vegetable consumption was inversely associated with cardiovascular mortality [HR 0.49 (95% CI 0.34-0.71); P < 0.001]. Fruit consumption was, however, inversely associated with cardiovascular mortality in RTRs with an eGFR >45 mL/min/1.73 m² [HR 0.56 (95% CI 0.35-0.92); P = 0.02] or the absence of proteinuria [HR 0.62 (95% CI 0.41-0.92); P = 0.02]. Conclusions: In RTRs, a relatively higher vegetable consumption is independently and strongly associated with lower cardiovascular mortality. A relatively higher fruit consumption is also associated with lower cardiovascular mortality, although particularly in RTRs with an eGFR >45 mL/min/1.73 m² or an absence of proteinuria. Further studies seem warranted to investigate whether increasing consumption of fruits and vegetables may open opportunities for potential interventional pathways to decrease the burden of cardiovascular mortality in RTRs. Funding: Dutch Kidney Foundation (C00.187).
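    The subgroup findings for fruit consumption amount to refitting the Cox model within strata defined by eGFR or proteinuria. A small sketch of that setup, again with lifelines and hypothetical file and column names:

```python
# Sketch: Cox model for fruit consumption, refit in the eGFR > 45 mL/min/1.73 m2 subgroup.
# File and column names are hypothetical; covariates assumed numerically coded.
import pandas as pd
from lifelines import CoxPHFitter


def cv_mortality_hr(data: pd.DataFrame, exposure: str) -> float:
    """Hazard ratio for `exposure` from a Cox model for cardiovascular death."""
    cols = [exposure, "age", "sex", "years_followup", "cv_death"]
    cph = CoxPHFitter()
    cph.fit(data[cols], duration_col="years_followup", event_col="cv_death")
    return cph.hazard_ratios_[exposure]


df = pd.read_csv("rtr_diet_cohort.csv")  # hypothetical dataset

print(cv_mortality_hr(df, "fruit_g_per_day"))                   # whole cohort
print(cv_mortality_hr(df[df["egfr"] > 45], "fruit_g_per_day"))  # eGFR > 45 subgroup
```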

    Supine sleep and positional sleep apnea after acute ischemic stroke and intracerebral hemorrhage

    OBJECTIVE: Obstructive sleep apnea is frequent during the acute phase of stroke, and it is associated with poorer outcomes. A well-established relationship between supine sleep and obstructive sleep apnea severity exists in non-stroke patients. This study investigated the frequency of supine sleep and positional obstructive sleep apnea in patients with ischemic or hemorrhagic stroke. METHODS: Patients who suffered their first acute stroke, either ischemic or hemorrhagic, underwent full polysomnography, including continuous monitoring of sleep position, during the first night after symptom onset. Obstructive sleep apnea severity was measured using the apnea-hypopnea index, and stroke severity was measured with the NIHSS. RESULTS: We prospectively studied 66 stroke patients. The mean age was 57.6 ± 11.5 years, and the mean body mass index was 26.5 ± 4.9. Obstructive sleep apnea (apnea-hypopnea index ≥5) was present in 78.8% of patients, and the mean apnea-hypopnea index was 29.7 ± 26.6. The majority of subjects (66.7%) spent the entire sleep time in the supine position, and positional obstructive sleep apnea was clearly present in the other 23.1% of cases. A positive correlation was observed between the NIHSS and sleep time in the supine position (rs = 0.5; p < 0.001). CONCLUSIONS: Prolonged supine positioning during sleep was highly frequent after stroke, and it was related to stroke severity. Positional sleep apnea was observed in one quarter of stroke patients, a proportion that was likely underestimated during the acute phase of stroke. Adequate positioning of patients during sleep in the acute phase of stroke may decrease obstructive respiratory events, regardless of the stroke subtype. Funding: Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP), Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq), Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES).
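    The apnea-hypopnea index is the number of apneas plus hypopneas per hour of sleep, and the NIHSS-supine-time relationship is a Spearman rank correlation. The sketch below computes both from a hypothetical per-patient summary table; the positional-OSA criterion used (supine AHI at least twice the non-supine AHI) is a commonly used definition assumed here, since the abstract does not state which criterion the authors applied.

```python
# Sketch: per-position AHI, a common positional-OSA criterion, and the NIHSS correlation.
# File and column names are hypothetical.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("stroke_psg.csv")  # hypothetical per-patient summary

# AHI = (apneas + hypopneas) per hour of sleep, computed per body position.
df["ahi_supine"] = df["events_supine"] / df["sleep_hours_supine"]

# Positional OSA can only be assessed in patients who slept in both positions.
both = df["sleep_hours_nonsupine"] > 0
df.loc[both, "ahi_nonsupine"] = (
    df.loc[both, "events_nonsupine"] / df.loc[both, "sleep_hours_nonsupine"]
)

# Assumed criterion: OSA present overall and supine AHI at least twice the non-supine AHI.
df["positional_osa"] = both & (df["ahi_supine"] >= 2 * df["ahi_nonsupine"]) & (df["ahi_supine"] >= 5)

# Spearman rank correlation between stroke severity (NIHSS) and supine sleep time.
rho, p = spearmanr(df["nihss"], df["supine_sleep_hours"])
print(f"r_s = {rho:.2f}, p = {p:.3g}")
```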