
    Effect of changes over time in the performance of a customized SAPS-II model on the quality of care assessment

    Purpose: The aim of our study was to explore, using an innovative method, the effect of temporal changes in the mortality prediction performance of an existing model on the quality of care assessment. The prognostic model (rSAPS-II) was a recalibrated Simplified Acute Physiology Score-II model developed for very elderly Intensive Care Unit (ICU) patients. Methods: The study population comprised all 12,143 consecutive patients aged 80 years and older admitted between January 2004 and July 2009 to one of the ICUs of 21 Dutch hospitals. The prospective dataset was split into 30 equally sized consecutive subsets. Per subset, we measured the model's discrimination [area under the curve (AUC)], accuracy (Brier score), and standardized mortality ratio (SMR), both without and after repeated recalibration. Each performance measure was assessed for stability over time, and per hospital the SMR was calculated both without and after repeated recalibration for the year 2009. Results: For all subsets, the AUCs were stable, but the Brier scores and SMRs were not. The SMR showed a downward trend, reaching levels significantly below 1. Repeated recalibration rendered it stable again. The proportions of hospitals with SMR > 1 and SMR < 1 changed from 15 versus 85% to 35 versus 65%. Conclusions: Variability over time may markedly differ among performance measures, and infrequent model recalibration can result in improper assessment of the quality of care in many hospitals. We stress the importance of timely recalibration and repeated validation of prognostic models over time.
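    As a minimal illustration of the performance measures named above, the sketch below computes the AUC, Brier score, and SMR on consecutive subsets, with an optional simple logistic recalibration step. The variable names and the logit-based recalibration are illustrative assumptions, not the study's actual rSAPS-II procedure.

```python
# Hypothetical sketch: per-subset AUC, Brier score and SMR,
# without and with a simple logistic recalibration of predicted risks.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, brier_score_loss

def subset_performance(y, p, n_subsets=30, recalibrate=False):
    """y: observed mortality (0/1); p: predicted probabilities from the prognostic model."""
    results = []
    for ys, ps in zip(np.array_split(np.asarray(y), n_subsets),
                      np.array_split(np.asarray(p), n_subsets)):
        if recalibrate:
            # refit intercept and slope on the logit scale (assumed recalibration form)
            logit = np.log(ps / (1 - ps)).reshape(-1, 1)
            lr = LogisticRegression().fit(logit, ys)
            ps = lr.predict_proba(logit)[:, 1]
        results.append({
            "AUC": roc_auc_score(ys, ps),       # discrimination
            "Brier": brier_score_loss(ys, ps),  # accuracy
            "SMR": ys.sum() / ps.sum(),         # observed / expected deaths
        })
    return results
```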

    Markers of cerebral damage during delirium in elderly patients with hip fracture

    BACKGROUND: S100B protein and Neuron Specific Enolase (NSE) can increase due to brain cell damage and/or increased permeability of the blood-brain barrier. Elevation of these proteins has been shown after various neurological diseases with cognitive dysfunction. Delirium is characterized by transient cognitive deficits and is an important risk factor for dementia. The aim of this study was to compare the levels of S100B and NSE of patients before, during and after delirium with those of patients without delirium, and to investigate possible associations with different subtypes of delirium. METHODS: The study population comprised patients aged 65 years or older acutely admitted after hip fracture. Delirium was diagnosed with the Confusion Assessment Method and the subtype with the Delirium Symptom Interview. In up to four serum samples per patient, S100B and NSE levels were determined by electrochemiluminescence immunoassay. RESULTS: Of 120 included patients with a mean age of 83.9 years, 62 experienced delirium. Delirious patients more frequently had pre-existing cognitive impairment (67% vs. 18%, p<0.001). Comparing the first samples during delirium to samples of non-delirious patients, a difference was observed in S100B (median 0.16 versus 0.10 μg/L, p<0.001), but not in NSE (median 11.7 versus 11.7 ng/L, p=0.97). Delirious state (before, during, after) (p<0.001), day of blood withdrawal (p<0.001), pre- or postoperative status (p=0.001) and type of fracture (p=0.036) were all associated with S100B level. The highest S100B levels were found 'during' delirium. S100B levels 'before' and 'after' delirium were still higher than those from 'non-delirious' patients. No significant difference in S100B (p=0.43) or NSE levels (p=0.41) was seen between the hyperactive, hypoactive and mixed subtypes of delirium. CONCLUSIONS: Delirium was associated with increased levels of S100B, which could indicate cerebral damage either due to delirium or leading to delirium. The possible association between higher levels of S100B during delirium and the higher risk of developing dementia after delirium is an interesting field for future research. More studies are needed to elucidate the role of S100B in the pathophysiological pathway leading to delirium and to investigate its potential as a biomarker for delirium.
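    The group comparisons above are reported as medians with p-values; the abstract does not name the statistical test, so the rank-based Mann-Whitney U test and the variable names in the sketch below are assumptions about one plausible way to reproduce such a comparison.

```python
# Hypothetical sketch: compare biomarker levels (e.g. S100B) between delirious
# and non-delirious patients using a rank-based test, reporting group medians.
import numpy as np
from scipy.stats import mannwhitneyu

def compare_biomarker(levels_delirium, levels_control):
    """Return group medians and the two-sided Mann-Whitney p-value."""
    stat, p = mannwhitneyu(levels_delirium, levels_control, alternative="two-sided")
    return {
        "median_delirium": float(np.median(levels_delirium)),
        "median_control": float(np.median(levels_control)),
        "p_value": p,
    }
```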

    Differential Changes in QTc Duration during In-Hospital Haloperidol Use

    Aims: To evaluate changes in QT duration during low-dose haloperidol use, and to determine associations between clinical variables and potentially dangerous QT prolongation. Methods: In a retrospective cohort study in a tertiary university teaching hospital in The Netherlands, all 1788 patients receiving haloperidol between 2005 and 2007 were studied; 97 were suitable for final analysis. Rate-corrected QT duration (QTc) was measured before, during and after haloperidol use. Clinical variables before haloperidol use and at the time of each ECG recording were retrieved from hospital charts. Mixed model analysis was used to estimate changes in QT duration. Risk factors for potentially dangerous QT prolongation were estimated by logistic regression analysis. Results: Patients with a normal before-haloperidol QTc duration (male <= 430 ms, female <= 450 ms) had a significant increase in QTc duration of 23 ms during haloperidol use; in 23% of these patients QTc rose to abnormal levels (male >= 450 ms, female >= 470 ms). In contrast, a significant decrease occurred in patients with a borderline (male 430-450 ms, female 450-470 ms) or abnormal before-haloperidol QTc duration (15 ms and 46 ms, respectively); abnormal levels were reached by 23% of patients in the borderline group and only 9% of patients in the abnormal group. Potentially dangerous QTc prolongation was independently associated with surgery before haloperidol use (OR(adj) 34.9, p = 0.009) and before-haloperidol QTc duration (OR(adj) 0.94, p = 0.004). Conclusion: QTc duration during haloperidol use changes differentially, increasing in patients with a normal before-haloperidol QTc duration, but decreasing in patients with a prolonged before-haloperidol QTc duration. A shorter before-haloperidol QTc duration and surgery before haloperidol use predict potentially dangerous QTc prolongation.
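    The rate-corrected QT values discussed above are derived from the measured QT interval and heart rate; the abstract does not name the correction formula, so the widely used Bazett correction in the sketch below is an assumption. The sex-specific thresholds are the ones quoted in the abstract.

```python
# Hypothetical sketch: Bazett rate correction, QTc = QT / sqrt(RR),
# with the RR interval (seconds) derived from heart rate (beats per minute).
import math

def qtc_bazett(qt_ms: float, heart_rate_bpm: float) -> float:
    """Return the rate-corrected QT interval in milliseconds."""
    rr_seconds = 60.0 / heart_rate_bpm
    return qt_ms / math.sqrt(rr_seconds)

def qtc_category(qtc_ms: float, male: bool) -> str:
    """Classify QTc using the sex-specific cut-offs quoted in the abstract."""
    normal_max, abnormal_min = (430, 450) if male else (450, 470)
    if qtc_ms <= normal_max:
        return "normal"
    return "borderline" if qtc_ms < abnormal_min else "abnormal"
```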

    Decision making in interhospital transport of critically ill patients: national questionnaire survey among critical care physicians

    Objective: This study assessed the relative importance of clinical and transport-related factors in physicians' decision-making regarding the interhospital transport of critically ill patients. Methods: The medical heads of all 95 ICUs in The Netherlands were surveyed with a questionnaire using 16 case vignettes to evaluate preferences for transportability; 78 physicians (82%) participated. The vignettes varied in eight factors related to severity of illness and transport conditions. The relative weight of each factor level was calculated by conjoint analysis and expressed as a beta coefficient. The reference value (beta = 0) was defined as the optimal conditions for critical care transport; a negative beta indicated a preference against transportability. Results: The type of escorting personnel (paramedic only: beta = -3.1) and transport facilities (standard ambulance: beta = -1.21) had the greatest negative effect on the preference for transportability. Determinants reflecting severity of illness were of relatively minor importance (dose of noradrenaline: beta = -0.6, arterial oxygenation: beta = -0.8, level of PEEP: beta = -0.6). Age, cardiac arrhythmia, and the indication for transport had no significant effect. Conclusions: Escorting personnel and transport facilities were considered most important by intensive care physicians in determining transportability for interhospital transport. When these factors are optimal, even severely critically ill patients are considered able to undergo transport. Further clinical research should tailor transport conditions to optimize the use of expensive resources in these inevitable road trips.
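    Conjoint analysis of vignette ratings, as described above, amounts to regressing stated transportability preferences on dummy-coded vignette factor levels so that each level receives a part-worth (beta) relative to the optimal reference conditions. The column names and the ordinary-least-squares fit below are illustrative assumptions, not the study's exact estimation procedure.

```python
# Hypothetical sketch: estimate per-level part-worths (betas) from vignette
# ratings by dummy-coding each factor level against an optimal reference level.
import pandas as pd
import statsmodels.api as sm

def conjoint_part_worths(vignettes: pd.DataFrame, rating_col: str = "transportability"):
    """vignettes: one row per physician x vignette; factor columns are categorical."""
    factors = vignettes.drop(columns=[rating_col])
    X = pd.get_dummies(factors, drop_first=True, dtype=float)  # dropped level acts as beta = 0 reference
    model = sm.OLS(vignettes[rating_col], sm.add_constant(X)).fit()
    return model.params  # negative values indicate a preference against transport
```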

    The Promise and Challenge of Therapeutic MicroRNA Silencing in Diabetes and Metabolic Diseases

    MicroRNAs (miRNAs) are small, non-coding RNA molecules that regulate gene expression. They have a long evolutionary history and are found in plants, viruses, and animals. Although initially discovered in 1993 in Caenorhabditis elegans, they were not appreciated as widespread and abundant gene regulators until the early 2000s. Studies in the last decade have found that miRNAs confer phenotypic robustness in the face of environmental perturbation, may serve as diagnostic and prognostic indicators of disease, underlie the pathobiology of a wide array of complex disorders, and represent compelling therapeutic targets. Pre-clinical studies in animal models have demonstrated that pharmacologic manipulation of miRNAs, mostly in the liver, can modulate metabolic phenotypes and even reverse the course of insulin resistance and diabetes. There is cautious optimism in the field about miRNA-based therapies for diabetes, several of which are already in various stages of clinical trials. This review highlights both the promise and the most pressing challenges of therapeutic miRNA silencing in diabetes and related conditions.

    Mining geriatric assessment data for in-patient fall prediction models and high-risk subgroups

    Background: Hospital in-patient falls constitute a prominent problem in terms of costs and consequences. Geriatric institutions are most often affected, and common screening tools cannot predict in-patient falls consistently. Our objectives were to derive comprehensible fall risk classification models from a large data set of geriatric in-patients' assessment data and to evaluate their predictive performance (aim #1), and to identify high-risk subgroups from the data (aim #2). Methods: A data set of n = 5,176 single in-patient episodes covering 1.5 years of admissions to a geriatric hospital was extracted from the hospital's database and matched with fall incident reports (n = 493). A classification tree model was induced using the C4.5 algorithm, as well as a logistic regression model, and their predictive performance was evaluated. Furthermore, high-risk subgroups were identified from extracted classification rules with a support of more than 100 instances. Results: The classification tree model showed an overall classification accuracy of 66%, with a sensitivity of 55.4%, a specificity of 67.1%, and positive and negative predictive values of 15% and 93.5%, respectively. Five high-risk groups were identified, defined by high age, low Barthel index, cognitive impairment, multi-medication and co-morbidity. Conclusions: Our results show that a little more than half of the fallers may be identified correctly by our model, but the positive predictive value is too low to be applicable. Non-fallers, on the other hand, may be sorted out quite well with the model. The high-risk subgroups and the risk factors identified (age, low ADL score, cognitive impairment, institutionalization, polypharmacy and co-morbidity) reflect domain knowledge and may be used to screen certain subgroups of patients with a high risk of falling. Classification models derived from a large data set using data mining methods can compete with current dedicated fall risk screening tools, yet lack diagnostic precision. High-risk subgroups may be identified automatically from existing geriatric assessment data, especially when combined with domain knowledge in a hybrid classification model. Further work is necessary to validate our approach in a controlled prospective setting.
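    The evaluation reported above (accuracy, sensitivity, specificity, positive and negative predictive values) can all be read off a confusion matrix. The sketch below uses scikit-learn's CART-style tree as a stand-in for the C4.5 algorithm named in the abstract; the minimum leaf size mirrors the rule-support threshold of 100 instances, and all feature and variable names are assumptions.

```python
# Hypothetical sketch: fit a classification tree on assessment data and report
# the confusion-matrix metrics quoted in the abstract. scikit-learn implements
# CART, used here only as a stand-in for C4.5.
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix

def fall_risk_metrics(X_train, y_train, X_test, y_test):
    tree = DecisionTreeClassifier(min_samples_leaf=100)  # rules supported by >= 100 episodes
    tree.fit(X_train, y_train)
    tn, fp, fn, tp = confusion_matrix(y_test, tree.predict(X_test)).ravel()
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "sensitivity": tp / (tp + fn),   # share of fallers correctly flagged
        "specificity": tn / (tn + fp),   # share of non-fallers correctly cleared
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }
```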

    Hypothermia predicts mortality in critically ill elderly patients with sepsis

    Background: Advanced age is one of the factors that increase mortality in intensive care. Sepsis and multi-organ failure are likely to further increase mortality in elderly patients. We compared the characteristics and outcomes of septic elderly patients (> 65 years) with those of younger patients (≤ 65 years) and identified factors during the first 24 hours of presentation that could predict mortality in elderly patients. Methods: This study was conducted in a Level III intensive care unit with a case mix of medical and surgical patients, excluding cardiac and neurosurgical patients. We performed a retrospective review of all septic patients admitted to our ICU between July 2004 and May 2007. In addition to demographics and co-morbidities, physiological and laboratory variables were analysed to identify early predictors of mortality in elderly patients with sepsis. Results: Of 175 patients admitted with sepsis, 108 were older than 65 years. Elderly patients differed from younger patients with regard to sex, temperature (37.2°C vs. 37.8°C, p < 0.01), heart rate, systolic blood pressure, pH, HCO3, potassium, urea, creatinine, APACHE III and SAPS II. ICU and hospital mortality were significantly higher in elderly patients (10.6% vs. 23.14%, p = 0.04, and 19.4% vs. 35.1%, p = 0.02, respectively). Elderly patients who died in hospital showed significant differences in pH, HCO3, mean blood pressure, potassium, albumin, number of failed organs, lactate, APACHE III and SAPS II compared with elderly patients who survived, while mean age and co-morbidities were comparable. Logistic regression analysis identified temperature (OR [per degree centigrade decrease] 0.51; 95% CI 0.306-0.854; p = 0.010) and SAPS II (OR [per point increase] 1.12; 95% CI 1.016-1.235; p = 0.02) during the first 24 hours of admission as independent predictors of increased hospital mortality in elderly patients. Conclusions: Mortality in elderly patients with sepsis is higher than in younger patients. Temperature (hypothermia) and SAPS II score during the first 24 hours of presentation independently predict hospital mortality.
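    The independent predictors above are reported as per-unit odds ratios with 95% confidence intervals from a multivariable logistic regression. The sketch below shows one plausible way to obtain such estimates; the data-frame column names are assumptions, not the study's actual variable coding.

```python
# Hypothetical sketch: logistic regression of hospital mortality on
# first-24-hour variables, reporting per-unit odds ratios with 95% CIs.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def mortality_odds_ratios(df: pd.DataFrame, predictors=("temperature", "saps2")):
    """df must contain a 0/1 'hospital_mortality' column and the predictor columns."""
    X = sm.add_constant(df[list(predictors)])
    fit = sm.Logit(df["hospital_mortality"], X).fit(disp=0)
    ci = fit.conf_int()  # columns 0 and 1 hold the lower and upper bounds on the log-odds scale
    out = pd.DataFrame({
        "OR": np.exp(fit.params),
        "CI_low": np.exp(ci[0]),
        "CI_high": np.exp(ci[1]),
        "p": fit.pvalues,
    })
    return out.drop(index="const")
```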