52 research outputs found

    Ca2+-Mg2+-dependent ATP-ase activity in hemodialyzed children. Effect of a hemodialysis session

    In the course of chronic kidney disease (CKD) the intracellular erythrocyte calcium (Cai2+) level increases along with the progression of the disease. The decreased activity of Ca2+-Mg2+-dependent ATP-ase (PMCA) and its endogenous modulators calmodulin (CALM), calpain (CANP), and calpastatin (CAST) are all responsible for disturbed calcium metabolism. The aim of the study was to analyze the activity of PMCA, CALM, and the CANP-CAST system in the red blood cells (RBCs) of hemodialyzed (HD) children and to estimate the impact of a single HD session on the aforementioned disturbances. Eighteen patients on maintenance HD and 30 healthy subjects were included in the study. CALM, Cai2+ levels and basal PMCA (bPMCA), PMCA, CANP, and CAST activities were determined in RBCs before HD, after HD, and before the next HD session. Prior to the HD session, the level of Cai2+ and the CAST activity were significantly higher, whereas bPMCA, PMCA, and CANP activities and the CALM level were significantly lower than in controls. After the HD session, the Cai2+ concentration and the CAST activity significantly decreased compared with the basal values, whereas the other parameters significantly increased, although they did not reach the levels of healthy children. The values observed prior to both HD sessions were similar. Cai2+ homeostasis is severely disturbed in HD children, which may be caused by the reduction in the PMCA activity, CALM deficiency, and CANP-CAST system disturbances. A single HD session improved these disturbances, but the effect is transient.

    Change in albuminuria as a surrogate endpoint for progression of kidney disease: a meta-analysis of treatment effects in randomised clinical trials

    Background Change in albuminuria has strong biological plausibility as a surrogate endpoint for progression of chronic kidney disease, but empirical evidence to support its validity is lacking. We aimed to determine the association between treatment effects on early changes in albuminuria and treatment effects on clinical endpoints, to inform the use of albuminuria as a surrogate endpoint in future randomised controlled trials. Methods In this meta-analysis, we searched PubMed for publications in English from Jan 1, 1946, to Dec 15, 2016, using search terms including “chronic kidney disease”, “chronic renal insufficiency”, “albuminuria”, “proteinuria”, and “randomized controlled trial”; key inclusion criteria were quantifiable measurements of albuminuria or proteinuria at baseline and within 12 months of follow-up and information on the incidence of end-stage kidney disease. We requested use of individual patient data from the authors of eligible studies. For all studies whose authors agreed to participate and that had sufficient data, we estimated treatment effects on 6-month change in albuminuria and the composite clinical endpoint of treated end-stage kidney disease, estimated glomerular filtration rate of less than 15 mL/min per 1·73 m2, or doubling of serum creatinine. We used a Bayesian mixed-effects meta-regression analysis to relate the treatment effects on albuminuria to those on the clinical endpoint across studies and developed a prediction model for the treatment effect on the clinical endpoint on the basis of the treatment effect on albuminuria. Findings We identified 41 eligible treatment comparisons from randomised trials (referred to as studies) that provided sufficient patient-level data on 29 979 participants (21 206 [71%] with diabetes). Over a median follow-up of 3·4 years (IQR 2·3–4·2), 3935 (13%) participants reached the composite clinical endpoint. 
Across all studies, with a meta-regression slope of 0·89 (95% Bayesian credible interval [BCI] 0·13–1·70), each 30% decrease in geometric mean albuminuria by the treatment relative to the control was associated with an average 27% lower hazard for the clinical endpoint (95% BCI 5–45%; median R2 0·47, 95% BCI 0·02–0·96). The association strengthened after restricting analyses to patients with baseline albuminuria of more than 30 mg/g (ie, 3·4 mg/mmol; R2 0·72, 0·05–0·99). For future trials, the model predicts that treatments that decrease the geometric mean albuminuria to 0·7 (ie, a 30% decrease in albuminuria) relative to the control will provide an average hazard ratio (HR) for the clinical endpoint of 0·68, and 95% of sufficiently large studies would have HRs between 0·47 and 0·95. Interpretation Our results support a role for change in albuminuria as a surrogate endpoint for the progression of chronic kidney disease, particularly in patients with high baseline albuminuria; for patients with low baseline levels of albuminuria this association is less certain.
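The prediction model described above relates treatment effects on the two endpoints on a log scale. A minimal sketch of how such a prediction is computed, assuming a simple log-linear form with the reported slope of 0·89 (the intercept below is back-calculated from the reported HR of 0·68 for a 30% albuminuria reduction, and is therefore an assumption rather than a reported parameter):

```python
import math

# Reported meta-regression slope relating log(albuminuria ratio) to log(HR).
SLOPE = 0.89
# Assumed intercept, back-calculated so that a 30% albuminuria decrease
# (ratio 0.7) reproduces the reported predicted HR of 0.68.
INTERCEPT = math.log(0.68) - SLOPE * math.log(0.70)

def predicted_hr(albuminuria_ratio: float) -> float:
    """Predicted hazard ratio for the composite clinical endpoint, given the
    treatment:control ratio of geometric mean albuminuria."""
    return math.exp(INTERCEPT + SLOPE * math.log(albuminuria_ratio))

# Slope alone implies exp(0.89 * ln 0.7) ~ 0.73, i.e. the "27% lower hazard
# per 30% albuminuria decrease" quoted in the abstract.
print(round(math.exp(SLOPE * math.log(0.70)), 2))  # → 0.73
print(round(predicted_hr(0.70), 2))                # → 0.68
```

The slope-only calculation recovers the abstract's average association (27% lower hazard), while the full model with the assumed intercept reproduces the predicted trial-level HR of 0·68.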

    Pharmacology and therapeutic implications of current drugs for type 2 diabetes mellitus

    Type 2 diabetes mellitus (T2DM) is a global epidemic that poses a major challenge to health-care systems. Improving metabolic control to approach normal glycaemia (where practical) greatly benefits long-term prognoses and justifies early, effective, sustained and safety-conscious intervention. Improvements in the understanding of the complex pathogenesis of T2DM have underpinned the development of glucose-lowering therapies with complementary mechanisms of action, which have expanded treatment options and facilitated individualized management strategies. Over the past decade, several new classes of glucose-lowering agents have been licensed, including glucagon-like peptide 1 receptor (GLP-1R) agonists, dipeptidyl peptidase 4 (DPP-4) inhibitors and sodium/glucose cotransporter 2 (SGLT2) inhibitors. These agents can be used individually or in combination with well-established treatments such as biguanides, sulfonylureas and thiazolidinediones. Although novel agents have potential advantages, including a low risk of hypoglycaemia and help with weight control, long-term safety has yet to be established. In this Review, we assess the pharmacokinetics, pharmacodynamics and safety profiles, including cardiovascular safety, of currently available therapies for management of hyperglycaemia in patients with T2DM within the context of disease pathogenesis and natural history. In addition, we briefly describe treatment algorithms for patients with T2DM and lessons from present therapies to inform the development of future therapies.

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
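As a rough sanity check, the crude (unadjusted) odds ratios for checklist use can be recomputed directly from the counts given in the abstract. The published ORs are adjusted for patient and disease factors, so the crude values will only approximate them:

```python
def odds_ratio(events_a: int, total_a: int, events_b: int, total_b: int) -> float:
    """Crude odds ratio for group A relative to group B."""
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# Checklist use before emergency laparotomy, by HDI tier (counts from the abstract):
# high HDI 2455/2741, middle HDI 753/1242, low HDI 363/860.
or_middle_vs_high = odds_ratio(753, 1242, 2455, 2741)
or_low_vs_high = odds_ratio(363, 860, 2455, 2741)

print(round(or_middle_vs_high, 2))  # → 0.18
print(round(or_low_vs_high, 2))     # → 0.09
```

The crude values (≈0.18 and ≈0.09) sit close to the reported adjusted ORs of 0.17 and 0.08, which is consistent with the risk adjustment shifting the estimates only slightly.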

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.