
    In black South Africans from rural and urban communities, the 4G/5G PAI-1 polymorphism influences PAI-1 activity, but not plasma clot lysis time

    Data on genetic and environmental factors influencing PAI-1 levels and their consequent effect on clot lysis in black African populations are limited. We identified polymorphisms in the promoter region of the PAI-1 gene and determined their influence on PAI-1 activity (PAI-1act) and plasma clot lysis time (CLT). We also describe gene-environment interactions and the effect of urbanisation. Data from 2010 apparently healthy urban and rural black participants from the South African arm of the PURE study were cross-sectionally analysed. The 5G allele frequency of the 4G/5G polymorphism was 0.85. PAI-1act increased across genotypes in the urban subgroup (p = 0.009) but not significantly in the rural subgroup, while CLT did not differ across genotypes. Significant interaction terms were found between the 4G/5G polymorphism and BMI, waist circumference and triglycerides in determining PAI-1act, and between the 4G/5G polymorphism and fibrinogen and fibrinogen gamma prime in determining CLT. The C428T and G429A polymorphisms did not show direct relationships with PAI-1act or CLT, but they did influence the association of other environmental factors with PAI-1act and CLT. Several of these interactions differed significantly between rural and urban subgroups, particularly in individuals harbouring the mutant alleles. In conclusion, although the 4G/5G polymorphism significantly affected PAI-1act, it contributed less than 1% to the PAI-1act variance. (Central) obesity was the biggest contributor to PAI-1act variance (12.5%). Urbanisation significantly influenced the effect of the 4G/5G polymorphism on PAI-1act, as well as gene-environment interactions for the C428T and G429A genotypes in determining PAI-1act and CLT.
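
    The gene-environment interaction analysis summarized above (for example, 4G/5G genotype by BMI in determining PAI-1act) can be illustrated with a regression model that includes product terms. The sketch below uses simulated data and hypothetical column names (genotype_4g5g, bmi, pai1_act); it is not the authors' PURE analysis or covariate set.

```python
# Hedged sketch: genotype-by-BMI interaction on PAI-1 activity, simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "genotype_4g5g": rng.choice(["5G/5G", "4G/5G", "4G/4G"], size=n, p=[0.72, 0.26, 0.02]),
    "bmi": rng.normal(27, 5, size=n),
    "urban": rng.integers(0, 2, size=n),
})
# Simulate PAI-1 activity with a small genotype effect and a genotype-by-BMI interaction
geno_code = df["genotype_4g5g"].map({"5G/5G": 0, "4G/5G": 1, "4G/4G": 2})
df["pai1_act"] = 5 + 0.3 * geno_code + 0.4 * df["bmi"] + 0.1 * geno_code * df["bmi"] + rng.normal(0, 3, n)

# The '*' term expands to main effects plus the product term, mirroring the
# "interaction between the 4G/5G polymorphism and BMI" idea in the abstract.
model = smf.ols("pai1_act ~ C(genotype_4g5g) * bmi + urban", data=df).fit()
print(model.summary().tables[1])
```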

    Factors associated with requesting and receiving euthanasia: a nationwide mortality follow-back study with a focus on patients with psychiatric disorders, dementia, or an accumulation of health problems related to old age

    Background: Recently, euthanasia and assisted suicide (EAS) in patients with psychiatric disorders, dementia, or an accumulation of health problems has taken a prominent place in the public debate. However, little is known about this practice. The purpose of this study was threefold: to estimate the frequency of requesting and receiving EAS among people with (also) a psychiatric disorder, dementia, or an accumulation of health problems; to explore reasons for physicians to grant or refuse a request; and to describe differences in characteristics, including the presence of psychiatric disorders, dementia, and accumulation of health problems, between patients who did and did not request EAS and between patients whose request was or was not granted. Methods: A nationwide cross-sectional survey study was performed. A stratified sample of death certificates of patients who died between 1 August and 1 December 2015 was drawn from the central death registry of Statistics Netherlands. Questionnaires were sent to the certifying physician (n = 9351, response 78%). Only deceased patients aged ≥ 17 years who died a non-sudden death were included in the analyses (n = 5361). Results: The frequency of euthanasia requests among deceased people who died non-suddenly and with (also) a psychiatric disorder (11.4%), dementia (2.1%), or an accumulation of health problems (8.0%) varied. Factors positively associated with requesting euthanasia were age (< 80 years), ethnicity (Dutch/Western), cause of death (cancer), attending physician (general practitioner), and involvement of a pain specialist or psychiatrist. Cause of death (neurological disorders, another cause) and attending physician (general practitioner) were also positively associated with receiving euthanasia. Psychiatric disorders, dementia, and/or an accumulation of health problems were negatively associated with both requesting and receiving euthanasia. Conclusions: EAS in deceased patients with psychiatric disorders, dementia, and/or an accumulation of health problems is relatively rare. This can partly be explained by the belief that the due care criteria cannot be met. Another explanation is that patients with these conditions are less likely to request EAS.
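
    The association estimates reported above are of the kind produced by a multivariable logistic regression of requesting EAS on patient and physician characteristics. The sketch below is a simplified, unweighted illustration on simulated data with hypothetical variable names; the actual study used a stratified certificate sample and its own model specification.

```python
# Hedged sketch: logistic regression for "factors associated with requesting EAS",
# on simulated data with hypothetical names -- not the authors' model or data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "age_under_80": rng.integers(0, 2, n),
    "cancer_death": rng.integers(0, 2, n),
    "gp_attending": rng.integers(0, 2, n),
    "psych_dementia_or_multimorbidity": rng.integers(0, 2, n),
})
# Simulate a request indicator so the model has something to recover
logit = (-2 + 0.8 * df["age_under_80"] + 1.0 * df["cancer_death"]
         + 0.6 * df["gp_attending"] - 0.7 * df["psych_dementia_or_multimorbidity"])
df["requested_eas"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit(
    "requested_eas ~ age_under_80 + cancer_death + gp_attending + psych_dementia_or_multimorbidity",
    data=df,
).fit(disp=False)
print(np.exp(model.params))  # odds ratios
```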

    Magnesium intake and vascular structure and function: the Hoorn Study

    PURPOSE: Circulating and dietary magnesium have been shown to be inversely associated with the prevalence of cardiovascular disease (CVD) and mortality in both high- and low-risk populations. We aimed to examine the association between dietary magnesium intake and several measures of vascular structure and function in a prospective cohort. METHODS: We included 789 participants from the vascular screening sub-cohort of the Hoorn Study, a population-based, prospective cohort study. Baseline dietary magnesium intake was estimated with a validated food frequency questionnaire and categorised into energy-adjusted magnesium intake tertiles. Several measurements of vascular structure and function were performed at baseline, and most measurements were repeated after 8 years of follow-up (n = 432). Multivariable linear and logistic regression was performed to study the cross-sectional and longitudinal associations of magnesium intake with intima-media thickness (IMT), augmentation index (Aix), pulse wave velocity (PWV), flow-mediated dilatation (FMD), and peripheral arterial disease (PAD). RESULTS: Mean absolute magnesium intake was 328 ± 83 mg/day, and prior CVD and type 2 diabetes mellitus (DM2) were present in 55% and 41% of the participants, respectively. Multivariable regression analyses did not demonstrate associations between magnesium intake and any of the vascular outcomes. In fully adjusted cross-sectional analyses, participants in the highest compared with the lowest magnesium intake tertile had a PWV of −0.21 m/s (95% confidence interval −1.95, 1.52) and an FMD of −0.03% (−0.89, 0.83), and in longitudinal analyses an IMT of 0.01 mm (−0.03, 0.06), an Aix of 0.70% (−1.69, 3.07), and an odds ratio of 0.84 (0.23, 3.11) for PAD. CONCLUSION: We did not find associations between dietary magnesium intake and multiple markers of vascular structure and function, in either cross-sectional or longitudinal analyses. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s00394-021-02667-0.
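
    The tertile-based exposure analysis described above can be illustrated as follows: split energy-adjusted magnesium intake into tertiles and regress a vascular outcome on the tertile indicators plus covariates. The sketch uses simulated data and hypothetical column names (mg_intake_energy_adj, pwv), not the Hoorn Study data or the authors' adjustment set.

```python
# Hedged sketch: intake tertiles as the exposure in a linear regression, simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 789
df = pd.DataFrame({
    "mg_intake_energy_adj": rng.normal(328, 83, n),  # mg/day
    "age": rng.normal(65, 8, n),
    "sex": rng.integers(0, 2, n),
})
df["pwv"] = 9 + 0.05 * df["age"] + rng.normal(0, 2, n)  # m/s, unrelated to magnesium here

# Energy-adjusted intake split into tertiles; the lowest tertile is the reference
df["mg_tertile"] = pd.qcut(df["mg_intake_energy_adj"], 3, labels=["T1", "T2", "T3"])

model = smf.ols("pwv ~ C(mg_tertile, Treatment('T1')) + age + sex", data=df).fit()
print(model.params)  # coefficients for T2 and T3 are the adjusted differences vs T1
```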

    Coronary Artery Calcification in Hemodialysis and Peritoneal Dialysis

    Background: Vascular calcification is seen in most patients on dialysis and is strongly associated with cardiovascular mortality. Vascular calcification is promoted by phosphate, which generally reaches higher levels in hemodialysis than in peritoneal dialysis. However, whether vascular calcification develops less in peritoneal dialysis than in hemodialysis is currently unknown. Therefore, we compared coronary artery calcification (CAC), its progression, and calcification biomarkers between patients on hemodialysis and peritoneal dialysis. Methods: We measured CAC in 134 patients who had been treated exclusively with hemodialysis (n = 94) or peritoneal dialysis (n = 40) and were transplantation candidates. In 57 of them (34 on hemodialysis and 23 on peritoneal dialysis), we also measured CAC progression annually for up to 3 years, as well as the inactive species of desphospho-uncarboxylated matrix Gla protein (dp-ucMGP), fetuin-A, and osteoprotegerin. We compared CAC cross-sectionally with Tobit regression. CAC progression was compared in two ways: with linear mixed models as the difference in square-root-transformed volume score per year (CAC SQRV), and with Tobit mixed models. We adjusted for potential confounders. Results: In the cross-sectional cohort, CAC volume scores were 92 mm³ in hemodialysis and 492 mm³ in peritoneal dialysis (adjusted difference 436 mm³; 95% CI -47 to 919; p = 0.08). In the longitudinal cohort, peritoneal dialysis was associated with significantly more CAC progression defined as CAC SQRV (adjusted difference 1.20; 95% CI 0.09 to 2.31; p = 0.03), but not with Tobit mixed models (adjusted difference in CAC score increase per year 106 mm³; 95% CI -140 to 352; p = 0.40). Peritoneal dialysis was associated with higher osteoprotegerin (adjusted p = 0.02) but not with dp-ucMGP or fetuin-A. Conclusions: Peritoneal dialysis is not associated with less CAC or CAC progression than hemodialysis, and perhaps even with more progression. This indicates that vascular calcification does not develop less in peritoneal dialysis than in hemodialysis.
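
    The CAC SQRV progression comparison described above (a linear mixed model on the square-root-transformed volume score) can be sketched as below. The data are simulated, the column names are hypothetical, and no confounder adjustment is included; the Tobit models are not reproduced here.

```python
# Hedged sketch: mixed model on sqrt-transformed CAC volume, simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
patients = 57
rows = []
for pid in range(patients):
    modality = "PD" if pid < 23 else "HD"
    base = rng.gamma(2, 100)                                     # baseline CAC volume, mm^3
    slope = rng.normal(2.0 if modality == "PD" else 1.2, 0.8)    # sqrt-scale change per year
    for year in range(4):                                        # baseline + 3 annual scans
        vol = max(0.0, (np.sqrt(base) + slope * year) ** 2 + rng.normal(0, 20))
        rows.append({"pid": pid, "modality": modality, "year": year,
                     "cac_sqrt": np.sqrt(vol)})
df = pd.DataFrame(rows)

# Random intercept and slope per patient; the year:modality coefficient is the
# between-modality difference in sqrt-volume change per year
m = smf.mixedlm("cac_sqrt ~ year * modality", df, groups=df["pid"],
                re_formula="~year").fit()
print(m.summary())
```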

    Treatment with high dose of erythropoiesis-stimulating agents and mortality: analysis with a sequential Cox approach and a marginal structural model

    Anemia-correction trials indicated higher mortality rates in chronic kidney disease patients assigned to higher hemoglobin targets. The safety of the high erythropoiesis-stimulating agent (ESA) doses that these patients received has therefore been questioned. However, no trial that directly compares treatment with different ESA doses has been published. We thus aimed to estimate the effect of a high ESA dose on mortality in an observational cohort of dialysis patients. The Netherlands Cooperative Study on the Adequacy of Dialysis is a Dutch cohort study of incident dialysis patients in which ESA dose, comorbidities, and laboratory parameters were collected every 6 months. Mortality in patients with a high ESA dose (above the median of 6000 units/week) was compared with that in patients with no or a low ESA dose using Cox regression analyses. To handle time-dependent confounding, a sequential Cox approach conditional on baseline covariates was used, with inverse probability of censoring weights (IPCW) for dependent censoring. Analyses were repeated with a marginal structural model (MSM) with inverse probability of treatment weights and IPCW. The hazard ratio (HR) for high ESA dose was 1.20 (95%CI 0.83-1.73) with the sequential Cox approach and 1.54 (95%CI 1.08-2.18) with the MSM. Truncation of weights in the MSM did not affect the estimates. For comparison, conventional Cox analyses indicated a baseline-adjusted HR of 1.66 (95%CI 1.20-2.31). Patients treated with a high ESA dose have a 1.2- to 1.5-fold increased risk of mortality. Our analyses support guidelines advising a conservative ESA dosing regimen that carefully weighs the patient's benefits and risks.
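
    The marginal structural model analysis described above can be illustrated, in simplified form, as inverse probability of treatment weighting followed by a weighted Cox model. The sketch below uses simulated single-record data and hypothetical column names; it omits the time-updated treatment, the IPCW for dependent censoring, and the sequential Cox approach used in the actual study.

```python
# Hedged sketch: stabilized IPTW plus a weighted Cox model, simulated data only.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 1500
df = pd.DataFrame({
    "age": rng.normal(65, 12, n),
    "comorbidity": rng.integers(0, 2, n),
})
# Treatment (high ESA dose) depends on confounders; survival depends on both
p_treat = 1 / (1 + np.exp(-(-1 + 0.02 * (df["age"] - 65) + 0.8 * df["comorbidity"])))
df["high_esa"] = rng.binomial(1, p_treat)
hazard = 0.05 * np.exp(0.03 * (df["age"] - 65) + 0.5 * df["comorbidity"] + 0.3 * df["high_esa"])
df["time"] = rng.exponential(1 / hazard)
df["event"] = (df["time"] < 5).astype(int)
df["time"] = df["time"].clip(upper=5)

# Stabilized inverse probability of treatment weights from a propensity score model
ps = LogisticRegression().fit(df[["age", "comorbidity"]], df["high_esa"]) \
                         .predict_proba(df[["age", "comorbidity"]])[:, 1]
p_marg = df["high_esa"].mean()
df["iptw"] = np.where(df["high_esa"] == 1, p_marg / ps, (1 - p_marg) / (1 - ps))

# Weighted Cox model of treatment only; confounding is handled by the weights
cph = CoxPHFitter()
cph.fit(df[["time", "event", "high_esa", "iptw"]], duration_col="time",
        event_col="event", weights_col="iptw", robust=True)
print(cph.summary[["coef", "exp(coef)"]])
```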

    Erythropoiesis-stimulating agent resistance and mortality in hemodialysis and peritoneal dialysis patients

    Responsiveness to erythropoiesis-stimulating agents (ESAs) varies widely among dialysis patients. ESA resistance has been associated with mortality in hemodialysis (HD) patients, but in peritoneal dialysis (PD) patients data are limited. We therefore assessed the relation between ESA resistance and mortality in both HD and PD patients. NECOSAD is a Dutch multi-center prospective cohort study of incident dialysis patients who started dialysis between January 1997 and January 2007. ESA resistance was defined as a hemoglobin level < 11 g/dL with an above-median ESA dose (i.e. 8,000 units/week in HD and 4,000 units/week in PD patients). Unadjusted and adjusted Cox regression analyses for all-cause 5-year mortality were performed for HD and PD patients separately. 1013 HD and 461 PD patients were included in the analysis. ESA-resistant HD patients had an adjusted hazard ratio of 1.37 (95% CI 1.04-1.80) and ESA-resistant PD patients had an adjusted hazard ratio of 2.41 (1.27-4.57) compared with patients with a good response. ESA resistance, as defined by categories of ESA dose and hemoglobin, is associated with increased mortality in both HD and PD patients. The effects of ESA resistance, ESA dose, and hemoglobin are closely related, and the exact mechanism remains unclear. Our results strengthen the need to investigate and treat causes of ESA resistance not only in HD but also in PD patients.
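
    The two-way definition of ESA resistance quoted above (hemoglobin < 11 g/dL combined with an above-median ESA dose) and its relation to survival can be sketched as follows, on simulated data with hypothetical column names and without the study's adjustment set.

```python
# Hedged sketch: classify ESA resistance, then fit an unadjusted Cox model; simulated data only.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 1000
df = pd.DataFrame({
    "hb_g_dl": rng.normal(11.5, 1.2, n),
    "esa_units_week": rng.gamma(2, 4000, n),
})
median_dose = df["esa_units_week"].median()
df["esa_resistant"] = ((df["hb_g_dl"] < 11) & (df["esa_units_week"] > median_dose)).astype(int)

# Simulated 5-year follow-up with a higher hazard for the resistant group
hazard = 0.08 * np.exp(0.4 * df["esa_resistant"])
t = rng.exponential(1 / hazard)
df["death"] = (t < 5).astype(int)
df["time_years"] = np.minimum(t, 5)

cph = CoxPHFitter()
cph.fit(df[["time_years", "death", "esa_resistant"]],
        duration_col="time_years", event_col="death")
print(cph.summary[["coef", "exp(coef)"]])
```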

    Estimating residual kidney function in dialysis patients without urine collection

    Residual kidney function contributes substantially to solute clearance in dialysis patients but cannot be assessed without urine collection. We used serum filtration markers to develop dialysis-specific equations that estimate urinary urea clearance without the need for urine collection. In our development cohort, we measured 24-hour urine clearances under close supervision in 44 patients and validated these equations in 826 patients from the Netherlands Cooperative Study on the Adequacy of Dialysis. In the development and validation cohorts, median urinary urea clearance was 2.6 and 2.4 ml/min, respectively. During the 24-hour visit in the development cohort, serum β-trace protein concentrations remained in steady state, but the concentrations of all other markers increased. In the validation cohort, bias (median measured minus estimated clearance) was low for all equations. Precision was significantly better for the β-trace protein and β2-microglobulin equations, and accuracy was significantly greater for the β-trace protein, β2-microglobulin, and cystatin C equations, compared with the urea-plus-creatinine equation. Areas under the receiver operating characteristic curve for detecting a measured urinary urea clearance of 2 ml/min or more from the equation-estimated urinary urea clearance were 0.821, 0.850, and 0.796 for the β-trace protein, β2-microglobulin, and cystatin C equations, respectively; significantly greater than the 0.663 for the urea-plus-creatinine equation. Thus, residual kidney function can be estimated in dialysis patients without urine collection.
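
    The discrimination analysis reported above (how well an equation-estimated clearance detects a measured urinary urea clearance of 2 ml/min or more) amounts to an ROC AUC computed with the estimate as the score. A minimal sketch on simulated values, assuming hypothetical variable names:

```python
# Hedged sketch: ROC AUC for detecting measured clearance >= 2 ml/min, simulated values only.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
n = 826
measured = rng.gamma(2.0, 1.3, n)                 # measured urinary urea clearance, ml/min
estimated = measured + rng.normal(0, 1.0, n)      # equation-based estimate with error

detected = (measured >= 2.0).astype(int)          # target: measured clearance >= 2 ml/min
auc = roc_auc_score(detected, estimated)          # estimated clearance used as the score
print(f"AUC for detecting clearance >= 2 ml/min: {auc:.3f}")
```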

    The association between dialysis modality and the risk for dialysis technique and non-dialysis technique-related infections

    Infections are a major cause of morbidity and mortality among dialysis patients. Dialysis modality has been hypothesized to be a potential immunomodulatory factor. The objective of this study was to determine the influence of the first dialysis modality on the risk of infections on dialysis. Our study was conducted using the Netherlands Cooperative Study on the Adequacy of Dialysis (NECOSAD) cohort of incident dialysis patients. Medical records of all patients from two tertiary care university hospitals and three regional hospitals were reviewed using pre-specified criteria. Information about infections was collected from the start of dialysis until death, modality switch, study withdrawal, kidney transplantation, or the end of the study. Age-standardized incidence rates for infections were calculated. Poisson regression analysis was used to calculate adjusted incidence rate ratios (IRRs). In total, 452 patients were included, of whom 285 started with haemodialysis (HD) and 167 with peritoneal dialysis (PD). The median follow-up time on the first dialysis modality was similar for HD and PD, 1.8 and 2.0 dialysis years, respectively. During the first 6 months, the age-standardized infection incidence rate was higher in HD than in PD patients (P = 0.02). Overall, PD patients had a higher infection risk [adjusted IRR: 1.65, 95% confidence interval (CI): 1.34-2.03], which could be attributed to a 4-fold increased risk of dialysis technique-related infections. The risk of non-dialysis technique-related infections was lower in PD patients (adjusted IRR: 0.56, 95% CI: 0.40-0.79). Overall, PD patients carry a higher risk of infections. Interestingly, the risk of non-dialysis technique-related infections was higher in HD patients. The links between dialysis modality and the immune system are expected to explain this difference, but future studies are needed to test this assumption.
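
    The adjusted incidence rate ratios reported above come from Poisson regression with person-time as an offset. The sketch below illustrates that setup on simulated data with hypothetical column names; it is not the NECOSAD analysis or its covariate set.

```python
# Hedged sketch: Poisson regression with a person-time offset for an adjusted IRR, simulated data only.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 452
df = pd.DataFrame({
    "pd_modality": (np.arange(n) < 167).astype(int),   # 1 = peritoneal dialysis, 0 = haemodialysis
    "age": rng.normal(60, 15, n),
    "years_at_risk": rng.uniform(0.5, 4.0, n),
})
rate = 0.6 * np.exp(0.5 * df["pd_modality"] + 0.01 * (df["age"] - 60))
df["infections"] = rng.poisson(rate * df["years_at_risk"])

# log(person-time) enters as an offset so coefficients are log incidence rate ratios
model = smf.glm("infections ~ pd_modality + age", data=df,
                family=sm.families.Poisson(),
                offset=np.log(df["years_at_risk"])).fit()
print(np.exp(model.params["pd_modality"]))  # adjusted IRR for PD vs HD
```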