
    Benchmarking Deep Learning Architectures for Predicting Readmission to the ICU and Describing Patients-at-Risk

    Objective: To compare different deep learning architectures for predicting the risk of readmission within 30 days of discharge from the intensive care unit (ICU). The interpretability of attention-based models is leveraged to describe patients at risk. Methods: Several deep learning architectures making use of attention mechanisms, recurrent layers, neural ordinary differential equations (ODEs), and medical concept embeddings with time-aware attention were trained using publicly available electronic medical record data (MIMIC-III) associated with 45,298 ICU stays for 33,150 patients. Bayesian inference was used to compute the posterior over the weights of an attention-based model. Odds ratios associated with an increased risk of readmission were computed for static variables. Diagnoses, procedures, medications, and vital signs were ranked according to the associated risk of readmission. Results: A recurrent neural network, with time dynamics of code embeddings computed by neural ODEs, achieved the highest average precision of 0.331 (AUROC: 0.739, F1-score: 0.372). Predictive accuracy was comparable across neural network architectures. Groups of patients at risk included those with infectious complications, those with chronic or progressive conditions, and those for whom standard medical care was not suitable. Conclusions: Attention-based networks may be preferable to recurrent networks when an interpretable model is required, at only a marginal cost in predictive accuracy.
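    The three metrics reported above (average precision, AUROC, F1-score) can be computed for any binary readmission classifier with scikit-learn. The sketch below is illustrative only, using hypothetical labels and scores rather than the authors' MIMIC-III pipeline.

    ```python
    # Illustrative only: evaluating a 30-day ICU readmission classifier with the
    # three metrics reported above. y_true / y_score are hypothetical placeholders,
    # not the authors' MIMIC-III data.
    import numpy as np
    from sklearn.metrics import average_precision_score, roc_auc_score, f1_score

    rng = np.random.default_rng(0)
    y_true = rng.integers(0, 2, size=1000)                # 1 = readmitted within 30 days
    y_score = y_true * 0.3 + rng.random(1000) * 0.7       # hypothetical model probabilities

    ap = average_precision_score(y_true, y_score)         # area under the PR curve
    auroc = roc_auc_score(y_true, y_score)                # area under the ROC curve
    f1 = f1_score(y_true, (y_score >= 0.5).astype(int))   # F1 at a 0.5 threshold

    print(f"AP={ap:.3f}  AUROC={auroc:.3f}  F1={f1:.3f}")
    ```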

    Variability in estimated glomerular filtration rate and the risk of major clinical outcomes in diabetes: Post hoc analysis from the ADVANCE trial

    There are limited data on whether estimated glomerular filtration rate (eGFR) variability modifies the risk of future clinical outcomes in type 2 diabetes (T2D). We assessed the association between 20-month eGFR variability and the risk of major clinical outcomes in T2D among 8241 participants in the ADVANCE trial. Variability in eGFR (coefficient of variation [CVeGFR]) was calculated from three serum creatinine measurements over 20 months. Participants were classified into three groups by thirds of CVeGFR: low (<6.4), moderate (6.4 to 12.1) and high (>12.1). The primary outcome was the composite of major macrovascular events, new or worsening nephropathy and all-cause mortality. Cox regression models were used to estimate hazard ratios (HRs). Over a median follow-up of 2.9 years following the 20-month period, 932 (11.3%) primary outcomes were recorded. Compared with low variability, greater 20-month eGFR variability was independently associated with higher risk of the primary outcome (HR for moderate and high variability: 1.07, 95% CI: 0.91-1.27 and 1.22, 95% CI: 1.03-1.45, respectively), with evidence of a positive linear trend (p = .015). These data indicate that eGFR variability predicts the risk of major clinical outcomes in T2D.
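    A minimal sketch of how the variability metric could be derived, assuming the standard definition CV = (SD / mean) x 100 over the three eGFR values per participant; the eGFR values are placeholders, and the group boundaries are taken from the abstract.

    ```python
    # Illustrative only: coefficient of variation of eGFR (CV-eGFR) from three
    # measurements per participant, then grouping by thirds. Data are placeholders.
    import numpy as np

    egfr = np.array([
        [78.0, 74.0, 81.0],   # participant 1: three eGFR values over 20 months
        [55.0, 48.0, 61.0],
        [92.0, 90.0, 91.0],
    ])

    cv = egfr.std(axis=1, ddof=1) / egfr.mean(axis=1) * 100  # CV as a percentage

    # Tertile boundaries from the abstract (moderate band: 6.4 to 12.1)
    groups = np.digitize(cv, bins=[6.4, 12.1])  # 0 = low, 1 = moderate, 2 = high
    for c, g in zip(cv, groups):
        print(f"CV-eGFR = {c:5.1f}%  ->  {['low', 'moderate', 'high'][g]} variability")
    ```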

    Dialysis catheter management practices in Australia and New Zealand

    Aim: Dialysis catheter-associated infections (CAI) are a serious and costly burden on patients and the health-care system. Many approaches to minimizing catheter use and infection prophylaxis are available, and the practice patterns in Australia and New Zealand are not known. We aimed to describe dialysis catheter management practices in dialysis units in Australia and New Zealand. Methods: An online survey comprising 52 questions was completed by representatives of dialysis units in both countries. Results: Of 64 contacted units, 48 (75%) responded (Australia 43, New Zealand 5), representing 79% of the dialysis population in both countries. Nephrologists (including trainees) inserted non-tunnelled catheters at 60% and tunnelled catheters at 31% of units. Prophylactic antibiotics were given with catheter insertion at 21% of units. Heparin was the most common locking solution for both non-tunnelled (77%) and tunnelled catheters (69%), with antimicrobial locks being predominant only in New Zealand (80%). Eight different combinations of exit site dressing were in use, with an antibiotic patch being most common (35%). All units in New Zealand and 84% of those in Australia undertook CAI surveillance. However, only 51% of those units were able to provide a figure for their most recent rate of catheter-associated bacteraemia per 1000 catheter days. Conclusion: There is wide variation in current dialysis catheter management practice, and CAI surveillance is suboptimal. Increased attention to the scope and quality of CAI surveillance is warranted, and further evidence to guide infection prevention is required.
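    The bacteraemia figure the survey asks units to report is a simple exposure-adjusted rate. A minimal sketch of the calculation, with hypothetical counts rather than survey data:

    ```python
    # Illustrative only: catheter-associated bacteraemia per 1000 catheter-days.
    # The counts below are hypothetical, not taken from the survey.
    def bacteraemia_rate_per_1000(events: int, catheter_days: int) -> float:
        """Exposure-adjusted infection rate: events / catheter-days x 1000."""
        return events / catheter_days * 1000

    # e.g. 4 bacteraemia episodes over 2500 cumulative catheter-days
    print(f"{bacteraemia_rate_per_1000(4, 2500):.2f} per 1000 catheter-days")  # 1.60
    ```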

    Arteriovenous access practices in Australian and New Zealand dialysis units

    Background: The creation and maintenance of dialysis vascular access is associated with significant morbidity. Structured management pathways can reduce this morbidity, yet practice patterns in Australia and New Zealand are not known. We aimed to describe the arteriovenous access practices in dialysis units in Australia and New Zealand. Methods: An online survey comprising 51 questions was completed by representatives of dialysis units in both countries. In addition to descriptive analysis, responses were compared between units inside and outside of major cities. Results: Of 64 contacted units, 48 (75%) responded (Australia 43, New Zealand 5), representing 38% of dialysis units in Australia and New Zealand. While 94% of units provided pre-dialysis education, only 60% reported a structured pre-dialysis pathway and 69% had a dedicated vascular access nurse. Most units routinely monitored fistula/graft function using flow rate measurement (73%) or recirculation studies (63%). A minority used routine ultrasound (35%). Thrombectomy, fistuloplasty and peritoneal dialysis catheter insertion were rarely performed by nephrologists (4%, 4% and 17% of units, respectively). Units outside of a major city were less likely to have access to a local vascular access surgeon (6/13 (46%) vs 35/35 (100%), P < 0.001). There were no other significant differences between units on the basis of location. Conclusion: Much variation exists in unit management of arteriovenous access. Structured pre-dialysis pathways and dedicated vascular access nurses may be underutilised in Australia and New Zealand. The use of regular access blood flow measurement and ultrasound is common in both countries despite a lack of data supporting their effectiveness. There is room for practice improvement, and further evidence is needed to ensure optimal arteriovenous access care.
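    The abstract does not name the statistical test behind the P < 0.001 comparison of surgeon access; Fisher's exact test is the usual choice for a sparse 2x2 table like this one, and the sketch below reproduces the comparison under that assumption.

    ```python
    # Illustrative only: comparing access to a local vascular access surgeon
    # between units outside (6/13) and inside (35/35) major cities. The abstract
    # does not state which test was used; Fisher's exact test is assumed here.
    from scipy.stats import fisher_exact

    #                 has surgeon, no surgeon
    table = [[6, 7],    # outside major city: 6 of 13
             [35, 0]]   # inside major city: 35 of 35
    odds_ratio, p_value = fisher_exact(table)
    print(f"P = {p_value:.5f}")  # well below 0.001, consistent with the abstract
    ```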

    Current practice in dialysis central venous catheter management: multi-disciplinary renal team perspectives

    Aim: To explore current practices related to the insertion, management and removal of dialysis central venous catheters (CVCs) used in patients with chronic kidney disease requiring haemodialysis. Methods: This qualitative descriptive study involved semi-structured interviews with surgeons, interventional radiologists, renal physicians, dialysis nurses, renal access nurses and renal researchers involved in the care of patients with chronic kidney disease requiring haemodialysis. Data were collected from staff at eight hospitals in six states and territories of Australia. Thirty-eight face-to-face interviews were conducted. A modified five-step qualitative content analysis approach was used to analyse the data. Results: Improved visualization technology and its use by interventional radiologists have steered insertions to specialist teams in specialist locations; this is thought to have decreased risk and improved patient outcomes. Nurses were identified as the professional group responsible for maintaining catheter access integrity, preventing access failure and reducing access-related complications. While best practice was considered important, justifications for variations in practice related to local patient and environmental challenges were identified. Conclusion: The interdisciplinary team is central to the insertion, maintenance and removal of dialysis CVCs and to the education of patients regarding them. Clinicians temper research-based decision-making about central dialysis access catheter management with knowledge of individual, environmental and patient factors. Strategies to ensure guidelines are appropriately translated for use in a wide variety of settings are necessary for patient safety.

    Buttonhole cannulation and clinical outcomes in a home hemodialysis cohort and systematic review

    Background and objectives: The relative merits of buttonhole (or blunt needle) versus rope ladder (or sharp needle) cannulation for hemodialysis vascular access are unclear. Design, setting, participants, & measurements: Clinical outcomes by cannulation method were reviewed in 90 consecutive home hemodialysis patients. Initially, patients were trained in rope ladder cannulation. From 2004 on, all incident patients were started on buttonhole cannulation, and prevalent patients were converted to this cannulation method. Coprimary outcomes were arteriovenous fistula-attributable systemic infections and a composite of arteriovenous fistula loss or requirement for surgical intervention. Secondary outcomes were total arteriovenous fistula-related infections and staff time requirements. Additionally, a systematic review evaluating infections by cannulation method was performed. Results: Seventeen systemic arteriovenous fistula-attributable infections were documented in 90 patients who were followed for 3765 arteriovenous fistula-months. Compared with rope ladder, buttonhole was not associated with a significantly higher rate of systemic arteriovenous fistula-attributable infections (incidence rate ratio, 2.71; 95% confidence interval, 0.66 to 11.09; P=0.17). However, use of buttonhole was associated with a significantly higher rate of total arteriovenous fistula infections (incidence rate ratio, 3.85; 95% confidence interval, 1.66 to 12.77; P=0.03). Initial and ongoing staff time requirements were significantly higher with buttonhole cannulation. Arteriovenous fistula loss or requirement for surgical intervention was not different between cannulation methods. A systematic review found increased arteriovenous fistula-related infections with buttonhole compared with rope ladder in four randomized trials (relative risk, 3.34; 95% confidence interval, 0.91 to 12.20), seven observational studies comparing before with after changes (relative risk, 3.15; 95% confidence interval, 1.90 to 5.21), and three observational studies comparing units with different cannulation methods (relative risk, 3.27; 95% confidence interval, 1.44 to 7.43). Conclusion: Buttonhole cannulation was associated with higher rates of infectious events, increased staff support requirements, and no reduction in surgical arteriovenous fistula interventions compared with rope ladder in home hemodialysis patients. A systematic review of the published literature found that buttonhole is associated with a higher risk of arteriovenous fistula-related infections.
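    A minimal sketch of how an incidence rate ratio and its confidence interval are typically derived from event counts and exposure time (a log-scale Wald interval); the counts below are hypothetical placeholders, as the abstract reports only the combined follow-up.

    ```python
    # Illustrative only: incidence rate ratio (IRR) with a Wald 95% CI on the log
    # scale, the standard approach for comparing infection rates per unit exposure.
    # Event counts and follow-up times below are hypothetical placeholders.
    import math

    def irr_with_ci(events_a, time_a, events_b, time_b, z=1.96):
        """IRR of group A vs group B with a log-scale Wald confidence interval."""
        irr = (events_a / time_a) / (events_b / time_b)
        se_log = math.sqrt(1 / events_a + 1 / events_b)   # SE of log(IRR)
        lo = math.exp(math.log(irr) - z * se_log)
        hi = math.exp(math.log(irr) + z * se_log)
        return irr, lo, hi

    # e.g. 14 infections over 2800 fistula-months (buttonhole) vs
    #       3 infections over 965 fistula-months (rope ladder)
    irr, lo, hi = irr_with_ci(14, 2800, 3, 965)
    print(f"IRR = {irr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
    ```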

    Estimating the population-level impacts of improved uptake of SGLT2 inhibitors in patients with chronic kidney disease: a cross-sectional observational study using routinely collected Australian primary care data

    Background: Sodium glucose co-transporter 2 (SGLT2) inhibitors reduce the risk of kidney failure and death in patients with chronic kidney disease (CKD) but are underused. We evaluated the number of patients with CKD in Australia who would be eligible for treatment and estimated the number of cardiorenal and kidney failure events that could be averted with improved uptake of SGLT2 inhibitors. Methods: This cross-sectional observational study leveraged nationally representative primary care data from 392 Australian general practices (MedicineInsight) between 1 January 2020 and 31 December 2021. We identified patients who would have met the inclusion criteria of key SGLT2 inhibitor trials and applied these data to age- and sex-stratified estimates of CKD prevalence for the Australian population (using national census data), estimating the number of preventable events using trial event rates. Key outcomes included cardiorenal events (CKD progression, kidney failure, or death due to cardiovascular or kidney disease) and kidney failure. Findings: In MedicineInsight, 44.2% of adults with CKD would have met CKD eligibility criteria for an SGLT2 inhibitor; baseline use was 4.1%. Applying these data to the Australian population, 230,246 patients with CKD would have been eligible for treatment with an SGLT2 inhibitor. Optimal implementation of SGLT2 inhibitors (75% uptake) could reduce cardiorenal and kidney failure events annually in Australia by 3644 (95% CI 3526–3764) and 1312 (95% CI 1242–1385), respectively. Interpretation: Improved uptake of SGLT2 inhibitors for patients with CKD in Australia has the potential to prevent large numbers of patients from experiencing CKD progression or dying due to cardiovascular or kidney disease. Identifying strategies to increase the uptake of SGLT2 inhibitors is critical to realising the population-level benefits of this drug class. Funding: University of New South Wales Scientia Program and Boehringer Ingelheim Eli Lilly Alliance.
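    The population-level estimate follows a straightforward scaling logic: eligible patients x uptake gap x annual absolute risk reduction from the trials. A minimal sketch of that arithmetic, where the eligibility and uptake figures come from the abstract but the risk-reduction input is a hypothetical placeholder, since the abstract does not report the underlying trial event rates:

    ```python
    # Illustrative only: the scaling logic behind a population-impact estimate.
    # n_eligible and the uptake figures come from the abstract; the absolute
    # annual risk reduction (ARR) is a HYPOTHETICAL placeholder.
    n_eligible = 230_246        # Australians with CKD eligible for an SGLT2 inhibitor
    uptake_baseline = 0.041     # 4.1% already treated
    uptake_target = 0.75        # "optimal implementation" scenario

    arr_cardiorenal = 0.022     # hypothetical absolute risk reduction per year

    newly_treated = n_eligible * (uptake_target - uptake_baseline)
    events_averted = newly_treated * arr_cardiorenal
    print(f"{newly_treated:,.0f} newly treated; ~{events_averted:,.0f} cardiorenal events averted/year")
    ```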