RocScholar (Rochester Regional Health)
    3788 research outputs found

    Association Between Rate of Hypernatremia Correction and Mortality: A Retrospective Cohort Study Across a Regional Health System

    No full text
    Background: Rate of correction in severe hypernatremia remains controversial. Although data increasingly supports rapid correction, hypernatremia is still often treated similarly to hyponatremia with a maximum rate of correction of 8-12 mmol/L per day due to concerns of neurological complications. This retrospective cohort study investigated the association between the rate of correction in hypernatremia and mortality. A secondary objective was to evaluate whether any adverse neurological outcomes were attributable to rapid correction. Methods: A retrospective cohort study of patients with severe hypernatremia (serum sodium ≥ 155 mmol/L) was conducted across a health system in the United States between January and December 2023. Rates of correction were calculated using the time between peak serum sodium values and first eunatremic (serum sodium ≤ 145 mmol/L) or last known values. Patients were categorized by their hypernatremia correction rates into slow (≤ 8 mmol/L/day) or rapid (\u3e 8 mmol/L/day) correction groups. Mortality was compared between the two groups using Fisher\u27s exact test and survival analysis for 90-day and one-year intervals. Multivariate Cox regression analysis was performed to evaluate for association between the rate of correction and mortality. Results: Among 150 included patients, 33 underwent rapid correction. The slow correction group had higher Charlson Comorbidity Indices compared to the rapid correction group. No significant differences in 90-day (43% vs 33%, p=0.42) and one-year mortality rates (63% vs 52%, p=0.23) were observed between the slow and rapid correction groups. Subsequent chart review revealed no documented adverse neurological outcomes attributable to rapid correction. Multivariate analysis did not identify a significant association between correction rate and mortality (hazard ratio 1.00, p=0.27). 
Conclusion: These findings add to the growing evidence challenging traditional concerns about rapid correction of hypernatremia in adults, suggesting that rapid correction rates exceeding 8 mmol/L/day do not increase mortality or cause adverse neurological events. These results support reconsidering rigid correction limits and highlight the need for further research on individualized treatment strategies
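The correction rate described in the methods is simple arithmetic: the drop from peak serum sodium to the first eunatremic (or last known) value, divided by the elapsed time in days, then classified against the traditional 8 mmol/L/day limit. A minimal sketch with hypothetical values, not study data:

```python
# Correction rate = (peak Na - final Na) / elapsed days,
# classified against the traditional 8 mmol/L/day limit.
def correction_rate(peak_na, final_na, hours_elapsed):
    """Return (rate in mmol/L/day, 'slow' or 'rapid')."""
    days = hours_elapsed / 24
    rate = (peak_na - final_na) / days
    return rate, ("rapid" if rate > 8 else "slow")

# Hypothetical patient: sodium falls from 160 to 145 mmol/L over 36 hours.
rate, group = correction_rate(160, 145, 36)
print(rate, group)  # 10.0 rapid
```

A patient corrected from 155 to 145 mmol/L over two days would instead fall in the slow group at 5 mmol/L/day.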

    Spot Urine Protein/Creatinine Ratio as an Alternative to 24-Hour Urine Collection for Measuring Proteinuria in Patients With Multiple Myeloma: A Prospective Study

    No full text
    Measurement of 24-hour urine protein electrophoresis (PEP) and immunofixation (IFE) is part of the standard diagnostic evaluation and monitoring of patients with suspected or diagnosed monoclonal plasma cell disorders, providing a baseline assessment of renal dysfunction or nephrotic syndrome. Measurement of 24-hour urine protein can, however, be time consuming and cumbersome, and many centers have moved toward random urine protein measurements. Evidence for the correlation between the spot urine protein creatinine ratio (SUPC) and 24-hour urine protein measurements is scarce. We therefore carried out a prospective study of 40 multiple myeloma (MM) patients to demonstrate this correlation. Our results suggest a good correlation between SUPC and 24-hour urine protein measurements, indicating that SUPC testing is a reliable and easier alternative to 24-hour urine collection. These findings should now be confirmed in larger patient populations, including early-stage plasma cell disorders, light chain amyloidosis, and symptomatic MM.
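The correlation this study reports is typically quantified as a Pearson coefficient between paired measurements from the two methods. A minimal sketch with made-up paired values (not the study's data):

```python
import numpy as np

# Hypothetical paired measurements for six patients:
# spot urine protein/creatinine ratio vs 24-hour urine protein (g/day).
supc = np.array([0.2, 0.5, 1.1, 2.0, 3.4, 4.8])
urine_24h = np.array([0.25, 0.6, 1.0, 2.2, 3.1, 5.0])

# Pearson correlation coefficient between the two methods.
r = np.corrcoef(supc, urine_24h)[0, 1]
print(round(r, 3))  # close to 1 for well-agreeing methods
```

A coefficient near 1, as suggested by the abstract's "good correlation," is what would justify substituting the spot ratio for the full 24-hour collection.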

    Noninterruptive tool to support provider malnutrition documentation and minimize documentation queries

    No full text
    Objectives: Determine whether an electronic documentation tool can reduce documentation queries for malnutrition without impacting diagnostic coding. Materials and methods: Malnutrition documentation queries and diagnosis coding proportions were compared between 2 groups of 600 malnourished adults discharged from internal medicine services before and after this electronic malnutrition documentation tool was promoted. Results: Documentation queries for malnutrition were observed in 300 (50%) of the preintervention discharges and 112 (19%) of the postintervention discharges (P < .001). A diagnosis code for malnutrition was observed in 99% of both groups. In a logistic regression accounting for clustering by provider, the odds ratio of a query postdeployment vs predeployment was 0.21 (95% CI, 0.16-0.29). In 88 (79%) of the 112 postintervention discharges queried for malnutrition, the tool was not used as recommended. Conclusions: We have demonstrated that introducing and promoting this electronic documentation tool can reduce querying for malnutrition while preserving diagnostic coding.
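The reported counts (300 of 600 discharges queried preintervention, 112 of 600 postintervention) let a reader reconstruct the unadjusted odds ratio from the 2x2 table; it lands near the cluster-adjusted 0.21 the authors report. A sketch of that arithmetic (the Wald confidence interval here is a textbook illustration, not the study's provider-clustered model):

```python
import math

# 2x2 table from the reported counts:
#                    queried   not queried
# postintervention      112          488
# preintervention       300          300
a, b = 112, 488   # post: queried, not queried
c, d = 300, 300   # pre:  queried, not queried

odds_ratio = (a * d) / (b * c)                 # unadjusted OR ~ 0.23
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(round(odds_ratio, 2), round(lo, 2), round(hi, 2))
```

The unadjusted estimate of about 0.23 sits close to the reported adjusted 0.21, which is expected when clustering shifts the estimate only modestly.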

    Influence of Obesity Class on Clinical Outcomes in Alcoholic Hepatitis: A National Cohort Study of Mortality, Complications, and Resource Use

    No full text
    Background & aims: Alcoholic hepatitis (AH) is a severe manifestation of alcoholic liver disease with high morbidity and mortality. This study used the 2016-2020 National Readmission Database to investigate how obesity influences AH outcomes. Methods: Adult hospitalizations were categorized as those without obesity, Class 1 obesity (BMI 30-34.9), Class 2 obesity (BMI 35-39.9), or Class 3 obesity (BMI ≥ 40). We compared mortality, complications, and resource utilization across these groups using regression models. Results: Among 82 367 AH admissions, 4.09% had Class 1 obesity, 2.73% had Class 2 obesity, and 4.02% had Class 3 obesity. After adjusting for confounders, Class 3 obesity was associated with higher odds of mortality (odds ratio [OR] = 1.74; 95% CI: 1.40-2.17; p < 0.01), septic shock (OR = 2.27; 95% CI: 1.60-3.22; p < 0.01), hepatic encephalopathy (OR = 2.53; 95% CI: 1.15-5.56; p = 0.02), and intensive care unit (ICU) admission (OR = 1.93; 95% CI: 1.57-2.36; p < 0.01). All obesity classes had increased associations with hepatorenal syndrome. No significant differences emerged for spontaneous bacterial peritonitis or variceal bleeding. Resource utilization rose with increasing obesity severity, with Class 3 obesity having a 1.84-day longer adjusted length of stay (p < 0.01) and an additional $20 174 in total hospitalization charges (p < 0.01) compared with hospitalizations without obesity. Conclusions: Class 3 obesity conferred the greatest burden of mortality, complications, and healthcare costs among hospitalizations with AH. Further research is warranted to clarify the intricate interplay between obesity and AH.

    Advances in Non-Invasive Screening Methods for Gastrointestinal Cancers: How Continued Innovation Has Revolutionized Early Cancer Detection

    No full text
    Early detection is important in reducing mortality from gastrointestinal (GI) cancers, but conventional screening methods can be invasive and expensive. Non-invasive diagnostic approaches hold promise for more efficient surveillance. For gastric cancer, oral rinse tests assessing the oral microbiome and circulating tumor DNA (ctDNA) analysis have shown improved specificity, while stool-based assays and FDA-approved blood-based tests such as Shield are revolutionizing colorectal cancer screening. Pancreatic cancer detection benefits from liquid biopsy technologies targeting KRAS mutations, exosomal markers, and volatile organic compound (VOC) breath analysis. Hepatocellular carcinoma (HCC) surveillance is evolving with ctDNA methylation panels combined with AI-driven radiological assessments. These innovations address long-standing challenges in early GI cancer diagnosis by increasing sensitivity and patient comfort. This review highlights the most recent advances in non-invasive GI cancer screening, offering a hopeful future for early detection and paving the way for personalized interventions.

    Informatics-driven unsupervised learning of comorbidity clusters for COVID-19 reinfection risk: A finite mixture modeling approach

    No full text
    Purpose: This study applied an informatics-focused, unsupervised learning framework (finite mixture modeling) to determine whether distinct clusters of coexisting conditions among patients with coronavirus disease 2019 (COVID-19) are associated with multiple (reinfection) versus single infections. Methods: We analyzed 42,974 patient records containing COVID-19 diagnoses using a machine learning classification algorithm to identify comorbidity profiles. Of nearly 850 recorded conditions, 29 were retained because they occurred in at least 5% of the sample. We then compared patients with single versus multiple COVID-19 diagnoses within each profile. Results: Three comorbidity profiles emerged. The first profile (Minimal Comorbidity) was the largest (67% of the sample) and was characterized by few additional conditions. Patients classified into this profile were also 20-30 years younger, on average, than members of the other profiles. The second profile (Elevated Select Comorbidity) consisted of 24% of the sample and was characterized by moderate-risk factors such as hypertension, hyperlipidemia, and acute respiratory failure. The third profile (High Comorbidity Burden) comprised 9% of the sample and was characterized by conditions related to the cardiovascular, renal, endocrine, and respiratory systems. Among the high-burden group, 30% experienced reinfection, versus only 9% in the minimal group. Overall, patients with more extensive cardiometabolic or pulmonary conditions were more likely to experience repeated infection. Conclusions: By identifying and characterizing comorbidity clusters, this informatics-based approach offers deeper insight into COVID-19 reinfection dynamics. The findings may support targeted prevention, data-driven resource allocation, and precision medicine strategies by highlighting subgroups at elevated risk.
Moreover, the unsupervised modeling framework is potentially adaptable to other multifactorial conditions, underscoring its broader utility in medical informatics.
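The finite mixture idea above, soft-clustering patients by binary comorbidity indicators, can be sketched as a small EM fit of a Bernoulli (latent class) mixture. The abstract does not state the exact model specification, so this is a generic two-class illustration on synthetic 0/1 condition flags, not the study's code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary comorbidity matrix: 200 "low-burden" patients with
# rare condition flags, 100 "high-burden" patients with frequent flags.
low = rng.random((200, 6)) < 0.1
high = rng.random((100, 6)) < 0.8
X = np.vstack([low, high]).astype(float)

# EM for a 2-component Bernoulli (latent class) mixture.
K, (n, d) = 2, X.shape
pi = np.full(K, 1 / K)                           # mixing weights
theta = np.array([np.full(d, 0.3), np.full(d, 0.7)])  # per-class flag probabilities
for _ in range(50):
    # E-step: responsibility of each class for each patient.
    log_p = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T + np.log(pi)
    log_p -= log_p.max(axis=1, keepdims=True)
    resp = np.exp(log_p)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update weights and probabilities from responsibilities.
    pi = resp.mean(axis=0)
    theta = (resp.T @ X) / resp.sum(axis=0)[:, None]
    theta = theta.clip(1e-6, 1 - 1e-6)

labels = resp.argmax(axis=1)  # hard profile assignment per patient
```

On well-separated synthetic data like this, the recovered profiles track the planted low- and high-burden groups; the study's three-profile solution follows the same mechanics with K = 3 and 29 indicators.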

    A second look at secondary hypogammaglobulinemia

    No full text
    Hypogammaglobulinemia is defined as a reduced immunoglobulin level, which can be either primary, due to inborn errors of immunity, or acquired in the setting of poor antibody production or increased antibody loss. Secondary hypogammaglobulinemia (SHG) should be considered in patients with a history of immunosuppressive therapy, transplant, protein loss syndromes, certain autoimmune conditions, and malignancies, as it can be associated with increased infectious risk. Appropriate history and lab-based screening in these populations can identify SHG, allowing treatment and close monitoring as appropriate. Ideally, treatment focuses on control of the underlying condition or removal of iatrogenic causes of SHG. However, in many cases, treatment of the underlying condition does not reverse SHG, or immunosuppressive therapy cannot be discontinued without significant risk to the patient. For these patients, strategies for risk mitigation against infectious complications include vaccination, antibiotic prophylaxis, and immunoglobulin replacement therapy. This report aims to summarize the existing and emerging data in the evaluation and management of SHG and highlight areas that require further investigation.

    RFK Jr. and the Kidney

    No full text
    RFK Jr. and the Kidney. Dr. Marvin Grieff, Chief, Nephrology Department; Dr. Arjun Sekar, Nephrology Department. Objectives: (1) Understand the role of dietary sodium and potassium in the management of hypertension; (2) understand the influence of kidney disease on dietary recommendations for sodium, potassium, and protein intake; (3) understand the role of medications to control potassium in the setting of kidney disease.

    Sex disparities in outcomes of transcatheter aortic valve implantation- a multi-year propensity-matched nationwide study

    No full text
    Transcatheter Aortic Valve Implantation (TAVI) has revolutionized the management of severe aortic stenosis (AS), but the impact of sex on TAVI outcomes remains unclear. In this study, we examined differences between men and women in the post-procedural outcomes of TAVI, including healthcare burden and readmission rates. The Nationwide Readmissions Database (2016-2020) was utilized to identify hospitalizations for TAVI. A propensity score matching (PSM) model was used to match males and females. Outcomes were examined using Pearson's chi-squared test. Among 320,324 hospitalizations for TAVI, 142,054 (44.3%) procedures were performed in women. After propensity matching (N = 165,894, with 82,947 hospitalizations in each group), women had higher in-hospital mortality (2.48% vs 2.11%, p = 0.001), stroke (2.14% vs 1.49%, p < 0.001), post-procedural bleeding (2.34% vs 1.72%, p < 0.001), vascular complications (1.2% vs 0.7%, p < 0.001), pericardial complications (1.13% vs 0.60%, p < 0.001), acute respiratory failure (ARF) (5.10% vs 4.63%, p < 0.001), need for transfusion (7% vs 5.56%, p < 0.001), need for vasopressors (2.48% vs 2.11%, p < 0.001), and major adverse cardiac and cerebrovascular events (MACCE) (7.53% vs 6.85%, p < 0.001). Meanwhile, women had a modestly lower incidence of acute kidney injury (AKI) (10.17% vs 11.88%, p < 0.001), sudden cardiac arrest (SCA) (0.96% vs 1.06%, p = 0.042), cardiogenic shock (1.69% vs 2.05%, p < 0.001), and mechanical circulatory support (MCS) requirement (0.69% vs 0.84%, p < 0.001). With regard to readmissions, men had higher readmission rates at 30 days (16.07% vs 14.75%, p < 0.001) and 90 days (23.8% vs 21.9%, p < 0.001). No significant difference was observed in 180-day readmission rates between men and women after TAVI.
Notably, procedure-related mortality decreased for both sexes from 2016 to 2020, accompanied by faster recovery times and reduced hospitalization costs (p-trend < 0.001). In conclusion, women had higher mortality and post-procedural complication rates, while men had higher rates of readmission, cardiogenic shock, AKI, and need for mechanical circulatory support. While procedure-related mortality and resource utilization for TAVI improved from 2016 to 2020, irrespective of sex, our findings highlight that significant disparities exist in TAVI outcomes.
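Pearson's chi-squared comparisons like those above can be approximately reconstructed from the reported percentages and the matched group size (82,947 per arm). A sketch using the in-hospital mortality figures (counts back-calculated from the rounded percentages, so this is an approximation, not the study's raw data):

```python
n = 82947                                  # matched hospitalizations per sex
deaths_women = round(0.0248 * n)           # 2.48% reported
deaths_men = round(0.0211 * n)             # 2.11% reported

# Pearson's chi-squared for the 2x2 table of death vs survival by sex.
a, b = deaths_women, n - deaths_women
c, d = deaths_men, n - deaths_men
total = 2 * n
chi2 = total * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
print(round(chi2, 1))
```

The statistic lands around 25, far beyond the 3.84 critical value for significance at the 0.05 level (1 degree of freedom), consistent with the reported p = 0.001.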

    The Role of Artificial Intelligence in Cardiovascular Disease Risk Prediction: An Updated Review on Current Understanding and Future Research

    No full text
    Cardiovascular disease (CVD) continues to be the leading cause of mortality worldwide, underscoring the critical need for effective prevention and management strategies. The ability to predict cardiovascular risk accurately and cost-effectively is central to improving patient outcomes and reducing the global burden of CVD. While useful, traditional tools used for risk assessment are often limited in their scope and fail to adequately account for atypical presentations and complex patient profiles. These limitations highlight the necessity for more advanced approaches, particularly integrating artificial intelligence (AI) into cardiovascular risk prediction. Our review explores the transformative role of AI in enhancing the accuracy, efficiency, and accessibility of cardiovascular risk prediction models. The implementation of AI-driven risk assessment tools has shown promising results, not only in improving CVD mortality rates but also in enhancing quality of life (QOL) markers and reducing healthcare costs. Machine learning (ML) algorithms have predicted 2-year survival rates after myocardial infarction (MI) with improved accuracy compared to traditional models. Deep learning (DL) models have forecasted hypertension risk with 91.7% accuracy based on electronic health records. Furthermore, AI-driven electrocardiography (ECG) analysis has demonstrated high precision in identifying left ventricular systolic dysfunction, even with noisy single-lead data from wearable devices. These tools enable more personalized treatment strategies, foster greater patient engagement, and support informed decision-making by healthcare providers. Unfortunately, the widespread adoption of AI in CVD risk assessment remains a challenge, largely due to a lack of education and acceptance among healthcare professionals. To overcome these barriers, it is crucial to promote broader education on the benefits and applications of AI in cardiovascular risk prediction.
By fostering a greater understanding and acceptance of these technologies, we can accelerate their integration into clinical practice, ultimately aiming to mitigate the global impact of CVD.

    517 full texts
    3,591 metadata records
    Updated in last 30 days.
    RocScholar (Rochester Regional Health) is based in the United States.