
    Association between aortic calcification, cardiovascular events, and mortality in kidney and pancreas-kidney transplant recipients

    BACKGROUND: Cardiovascular (CV) disease is the leading cause of death in kidney and simultaneous pancreas-kidney (SPK) transplant recipients. Assessing abdominal aortic calcification (AAC), using lateral spine x-rays and the Kauppila 24-point AAC (0-24) score, may identify transplant recipients at higher CV risk. METHODS: Between 2000 and 2015, 413 kidney and 213 SPK first transplant recipients were scored for AAC at the time of transplant and then followed for CV events (coronary heart, cerebrovascular, or peripheral vascular disease), graft loss, and all-cause mortality. RESULTS: The mean age was 44 ± 12 years (SD), with 275 (44%) having AAC (26% moderate: 1-7; 18% high: ≄8). After a median of 65 months (IQR 29-107 months), 46 recipients experienced CV events, 59 died, and 80 suffered graft loss. For each point increase in AAC, the unadjusted hazard ratios (HR) for CV events and mortality were 1.11 (95% CI 1.07-1.15) and 1.11 (1.08-1.15). These were similar after adjusting for age, gender, smoking, transplant type, dialysis vintage, and diabetes: aHR 1.07 (95% CI 1.02-1.12) and 1.09 (1.04-1.13). For recipients with high versus no AAC, the unadjusted and fully adjusted HRs were 5.90 (2.90-12.02) and 3.51 (1.54-8.00) for CV events, and 5.39 (3.00-9.68) and 3.38 (1.71-6.70) for death; for graft loss, the HR was 1.30 (0.75-2.28) unadjusted and 1.94 (1.04-3.27) in age- and smoking-history-adjusted analyses. CONCLUSION: Kidney and SPK transplant recipients with high AAC have a 3-fold higher risk of CV events and mortality, and poorer graft outcomes, than recipients without AAC. AAC scoring may be useful for assessing CV risk and targeting risk-lowering strategies.
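A per-point hazard ratio from a Cox model compounds multiplicatively across score points under the proportional-hazards assumption. A minimal, illustrative sketch of that reading (note that the directly estimated high-vs-no-AAC HRs reported above are larger than this implies, so the per-point effect need not be log-linear across the whole 0-24 scale):

```python
def compounded_hr(per_point_hr: float, points: int) -> float:
    """HR implied by compounding a per-point hazard ratio over `points`
    score points, assuming each point multiplies the hazard by the
    same factor (the proportional-hazards, log-linear reading)."""
    return per_point_hr ** points

# Adjusted per-point HR for CV events reported in the abstract
ahr_cv = 1.07
# Implied HR for a recipient 8 points higher on the AAC scale
# (8 is the study's cutoff for "high" AAC)
print(round(compounded_hr(ahr_cv, 8), 2))
```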

    Effects of erythropoietin therapy on the lipid profile in end-stage renal failure

    To evaluate the effects of erythropoietin (EPO) therapy on the lipid profile in end-stage renal failure, we undertook a prospective study in patients on both hemodialysis (HD) and continuous ambulatory peritoneal dialysis (CAPD). One hundred and twelve patients (81 HD, 31 CAPD) were enrolled into the study. Lipid parameters [that is, total cholesterol and the LDL and HDL subfractions, triglycerides, lipoprotein (a), apoproteins A and B], full blood count, iron studies, B12, folate, blood urea, aluminium and serum parathyroid hormone were measured prior to commencement of EPO therapy. Ninety-five patients were reassessed 5.2 ± 0.3 (mean ± SEM) months later and 53 patients underwent a further assessment 13.1 ± 0.6 months after the commencement of EPO, giving an overall follow-up of 10.0 ± 0.6 months in 95 patients. As expected, EPO treatment was associated with an increase in hemoglobin (7.7 ± 0.1 vs. 9.9 ± 0.2 g/dl; P < 0.001) and a decrease in ferritin (687 ± 99 vs. 399 ± 69 ”g/liter; P < 0.01). A significant fall in total cholesterol occurred (5.8 ± 0.1 vs. 5.4 ± 0.2 mmol/liter; P < 0.05) in association with a fall in apoprotein B (1.15 ± 0.04 vs. 1.04 ± 0.06; P < 0.05) and serum triglycerides (2.26 ± 0.14 vs. 1.99 ± 0.21; P < 0.05) during the course of the study. Other lipid parameters did not change, although there was a trend towards improvement. These changes correlated with the increase in Hb (P < 0.001 in each case), and with the reduction in ferritin for total cholesterol (P < 0.02), LDL cholesterol (P < 0.03), and to a lesser extent apoprotein B (P < 0.07). No difference was observed between patients on maintenance HD and CAPD, and similar trends were observed in male and female patients. Improvements in the lipid profile occurred independently of the time on dialysis prior to the commencement of EPO.
We conclude that EPO treatment is associated with alterations in the lipid profile which may suggest a long-term improvement in the vascular morbidity of chronic renal failure. The causes of the improved lipids are not addressed by this study and may reflect either a direct effect of EPO or a secondary benefit of therapy.

    Behavioral profiling of multiple pairs of rats selectively bred for high and low alcohol intake using the MCSF test

    Genetic aspects of alcoholism have been modeled using rats selectively bred for extremes of alcohol preference and voluntary alcohol intake. These lines show similar alcohol drinking phenotypes but have different genetic and environmental backgrounds and may therefore display diverse behavioral traits as seen in human alcoholics. The multivariate concentric square fieldℱ (MCSF) test is designed to provoke exploration and behaviors associated with risk assessment, risk taking and shelter seeking in a novel environment. The aim was to use the MCSF to characterize behavioral profiles in rat lines from selective breeding programs in the United States (P/NP, HAD1/LAD1, HAD2/LAD2), Italy (sP/sNP) and Finland (AA/ANA). The open field and elevated plus maze tests were used as reference tests. There were substantial differences within some of the pairs of selectively bred rat lines as well as between all alcohol-preferring rats. The most pronounced differences within the pairs of lines were between AA and ANA rats and between sP and sNP rats followed by intermediate differences between P and NP rats and minor differences comparing HAD and LAD rats. Among all preferring lines, P, HAD1 and HAD2 rats shared similar behavioral profiles, while AA and sP rats were quite different from each other and the others. No single trait appeared to form a common 'pathway' associated with a high alcohol drinking phenotype among all of the alcohol-preferring lines of rats. The marked behavioral differences found in the different alcohol-preferring lines may mimic the heterogeneity observed among human alcoholic subtypes

    Outcomes of cinacalcet withdrawal in Australian dialysis patients

    Background: Secondary hyperparathyroidism (SHPT) in chronic kidney disease is associated with cardiovascular and bone pathology. Measures to achieve parathyroid hormone (PTH) target values and control the biochemical abnormalities associated with SHPT require complex therapies, and severe SHPT often requires parathyroidectomy or the calcimimetic cinacalcet. In Australia, cinacalcet was publicly funded for dialysis patients from 2009 to 2015, when funding was withdrawn following publication of the EVOLVE study, which resulted in most patients on cinacalcet ceasing therapy. We examined the clinical and biochemical outcomes associated with this change at Australian renal centres. Methods: We conducted a retrospective study of dialysis patients who ceased cinacalcet after August 2015 in 11 Australian units. Clinical outcomes and changes in biochemical parameters were assessed over 24 and 12 months, respectively, from cessation of cinacalcet. Results: 228 patients were included (17.7% of all dialysis patients from the units). Patients were aged 63 ± 15 years, with 182 patients on haemodialysis and 46 on peritoneal dialysis. Over 24 months following cessation of cinacalcet, we observed 26 parathyroidectomies, 3 episodes of calciphylaxis, 8 fractures and 50 deaths. Seven patients recommenced cinacalcet, meeting criteria under a special access scheme. Biochemical changes from baseline to 12 months after cessation included an increase in serum PTH from 54 (IQR 27-90) pmol/L to 85 (IQR 41-139) pmol/L.

    Werther

    The first and final installments of Johann Wolfgang Goethe's novel Werther, serialized in Malumat. No continuation of the serial has been found; the serialization remained unfinished.

    Safety of intravenous ferric carboxymaltose versus oral iron in patients with nondialysis-dependent CKD: an analysis of the 1-year FIND-CKD trial.

    Background: The evidence base regarding the safety of intravenous (IV) iron therapy in patients with chronic kidney disease (CKD) is incomplete and largely based on small studies of relatively short duration. Methods: FIND-CKD (ClinicalTrials.gov number NCT00994318) was a 1-year, open-label, multicenter, prospective study of patients with nondialysis-dependent CKD, anemia and iron deficiency randomized (1:1:2) to IV ferric carboxymaltose (FCM), targeting higher (400-600 ”g/L) or lower (100-200 ”g/L) ferritin, or oral iron. A post hoc analysis of adverse event rates per 100 patient-years was performed to assess the safety of FCM versus oral iron over an extended period. Results: The safety population included 616 patients. The incidence of one or more adverse events was 91.0, 100.0 and 105.0 per 100 patient-years in the high ferritin FCM, low ferritin FCM and oral iron groups, respectively. The incidence of adverse events with a suspected relation to study drug was 15.9, 17.8 and 36.7 per 100 patient-years in the three groups; for serious adverse events, the incidence was 28.2, 27.9 and 24.3 per 100 patient-years. The incidence of cardiac disorders and infections was similar between groups. At least one ferritin level ≄800 ”g/L occurred in 26.6% of high ferritin FCM patients, with no associated increase in adverse events. No patient with ferritin ≄800 ”g/L discontinued the study drug due to adverse events. Estimated glomerular filtration rate remained stable in all groups. Conclusions: These results further support the conclusion that correction of iron deficiency anemia with IV FCM is safe in patients with nondialysis-dependent CKD
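The incidence figures above are crude rates per 100 patient-years: total events divided by total follow-up time, scaled by 100. A minimal sketch of the calculation (the event count and follow-up total below are hypothetical, not figures from FIND-CKD):

```python
def rate_per_100py(n_events: int, total_patient_years: float) -> float:
    """Crude incidence rate expressed per 100 patient-years:
    events divided by the summed follow-up time, scaled by 100."""
    return 100.0 * n_events / total_patient_years

# Hypothetical example: 137 adverse events observed over 150
# patient-years of follow-up (e.g. 150 patients for 1 year each).
print(round(rate_per_100py(137, 150.0), 1))
```

This normalization is what lets groups with different follow-up durations, such as patients who discontinue early, be compared on one scale.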

    Mushroom Clouds for Vitamin D?


    Chronic kidney disease-related sarcopenia as a prognostic indicator in elderly haemodialysis patients

    Abstract Background The mortality of dialysis patients greatly exceeds that of the general population, and identifying predictive factors for mortality may provide opportunities for earlier intervention. This study assessed the influence of sarcopenia on mortality in patients on haemodialysis. Methods This prospective, observational study enrolled 77 haemodialysis patients aged 60 years and over, of whom 33 (43%) were female, from two community dialysis centres. Baseline demographic and laboratory data were collected, and sarcopenia was diagnosed using grip strength, muscle mass by bioimpedance analysis (BIA) and muscle function by timed up-and-go, according to European Working Group on Sarcopenia in Older People criteria. Nutritional status was assessed using a subjective nutritional assessment score comprising functional changes in weight, appetite, gastrointestinal symptoms and energy. A comorbidity score (maximum 7 points) was derived from the presence or absence of hypertension, ischaemic heart disease, vascular disease (cerebrovascular disease, peripheral vascular disease, and abdominal aortic aneurysm), diabetes mellitus, respiratory disease, a history of malignancy and psychiatric disease. Outcomes over six years were linked to the Australian and New Zealand Dialysis and Transplant Registry. Results The median participant age was 71 years (range 60-87). Probable or confirmed sarcopenia was present in 55.9%, and severe sarcopenia with reduced functional testing in 11.7%. Over six years, 50 of the 77 patients (65%) died, principally from cardiovascular events, dialysis withdrawal and infection. There were no significant survival differences between patients with no, probable, confirmed, or severe sarcopenia, or between tertiles of the nutritional assessment score. After adjustment for age, dialysis vintage, mean arterial pressure (MAP) and the total comorbidity score, no sarcopenia category predicted mortality.
However, the total comorbidity score [Hazard Ratio (HR) 1.27, Confidence Interval (CI) 1.02-1.58, P = 0.03] and MAP (HR 0.96, CI 0.94-0.99, P < 0.01) predicted mortality. Conclusion Sarcopenia is highly prevalent in elderly haemodialysis patients but is not an independent predictor of mortality. Haemodialysis patients have multiple competing risks for mortality which, in this study, was predicted by a lower MAP and a higher total comorbidity score. Trial registration Recruitment commenced December 2011. The study was registered 10.01.2012 with the Australian New Zealand Clinical Trials Registry (ACTRN12612000048886)

    Improving bone mineral density screening by using digital X-radiogrammetry combined with mammography

    Fracture risk evaluation of postmenopausal women is suboptimal, but most women undergo screening mammography. Digital X-radiogrammetry (DXR) determines bone mineral density (BMD) at the metacarpal shaft and can be performed on mammography equipment. This study examined correlations between DXR and dual-energy X-ray absorptiometry (DXA) in women undergoing mammography, to identify optimal DXR thresholds for triage to osteoporosis screening by central DXA. Postmenopausal women over age 50 years, recruited from Westmead Hospital's Breast Cancer Institute, underwent mammography, DXR and DXA. Agreement was determined using the area under the receiver operating characteristic curve (AUC ROC) and Lin's concordance correlation coefficient. Optimal DXR T-scores to exclude osteoporosis by DXA were determined using Youden's method. Of 200 women aged 64 ± 7 years (mean ± standard deviation [SD]), 82% had been diagnosed with breast cancer and 37% reported prior fracture. DXA T-scores were ≀ −1 at the spine, hip or forearm in 77.5% and accorded with DXR T-scores in 77%. For DXR and DXA T-scores ≀ −2.5, the AUC ROC was 0.87 (95% confidence interval [CI], 0.81–0.94) at the 1/3 radius, and 0.74 (95% CI, 0.64–0.84) for hip or spine. DXR T-scores > −1.98 provided a negative predictive value of 94% (range 88%–98%) for osteoporosis by central DXA. In response to a questionnaire, radiography staff responded that DXR added 5 minutes to patient throughput with minimal workflow impact. In the mammography setting, triaging women with a screening DXR T-score < −1.98 for DXA evaluation would capture a significant proportion of at-risk women who may not otherwise be identified and improve current low rates of osteoporosis screening. © 2022 The Authors. JBMR Plus published by Wiley Periodicals LLC on behalf of the American Society for Bone and Mineral Research
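Youden's method, used above to pick the DXR cutoff, selects the threshold maximizing J = sensitivity + specificity − 1. A self-contained sketch on invented toy data (the T-scores and labels below are illustrative only; lower DXR T-scores are treated as test-positive, matching the study's osteoporosis triage direction):

```python
def youden_threshold(scores, labels):
    """Return (threshold, J) maximizing Youden's J = sens + spec - 1.

    `scores` are continuous measurements (here, DXR T-scores, where
    LOWER values indicate more likely osteoporosis), `labels` are 1
    for disease-positive. A case is classified positive if its score
    is <= the candidate threshold.
    """
    pos = sum(labels)
    neg = len(labels) - pos
    best_j, best_t = -1.0, None
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s <= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s <= t and y == 0)
        sens = tp / pos            # true-positive rate at this cutoff
        spec = (neg - fp) / neg    # true-negative rate at this cutoff
        j = sens + spec - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

# Toy data: osteoporotic patients (label 1) tend to have lower T-scores.
scores = [-3.1, -2.8, -2.2, -1.9, -1.5, -0.9, -0.4, 0.2]
labels = [1, 1, 1, 1, 0, 0, 0, 0]
t, j = youden_threshold(scores, labels)
print(t, round(j, 2))  # on this separable toy data: -1.9 1.0
```

On real, overlapping distributions J peaks below 1, and the maximizing cutoff trades sensitivity against specificity, which is how a single triage threshold such as −1.98 is arrived at.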

    Individualized Therapy to Prevent Bone Mineral Density Loss after Kidney and Kidney-Pancreas Transplantation

    Background and objectives: Most patients who undergo kidney or kidney-pancreas transplantation have renal osteodystrophy, and immediately after transplantation bone mineral density (BMD) commonly falls. Together, these abnormalities predispose to an increased fracture incidence. Bisphosphonate or calcitriol therapy can preserve BMD after transplantation, but although bisphosphonates may be more effective, they pose potential risks for adynamic bone