
    Optimizing dialysis dose in the context of frailty: an exploratory study

    Introduction: Frailty is a multicausal syndrome characterized by decreased strength, endurance, and physiological function, which makes the individual vulnerable and dependent and increases their mortality. The syndrome is more prevalent among older individuals and chronic kidney disease patients, particularly those on dialysis. Dialysis dose is currently standardized for hemodialysis (HD) patients regardless of age and functional status. However, it has been postulated that the dialysis dose required in older patients, especially frail ones, should be lower, since a standard dose could increase their degree of frailty. The purpose of this study was therefore to evaluate whether the dialysis dose (Kt/V) correlates with the degree of frailty in a population of adult HD patients.
    Materials and methods: A cross-sectional study of 82 HD patients in Barranquilla (Colombia) and Lobos (Argentina) was conducted. Socio-demographic and laboratory data, as well as dialysis doses (Kt/V), were recorded, and scales of frailty, physical activity, gait speed, and grip strength were applied. These data were then analyzed by Spearman's correlation and logistic regression.
    Results: Clinical Frailty Scale (CFS), social isolation, physical activity, gait speed, and grip strength tests were outside the reference ranges in the studied group. No significant correlation was found between dialysis dose and any of the above functional tests. However, a significant inverse correlation between physical activity and CFS was documented (score −1.41, CI −2.1 to −0.7).
    Conclusion: No significant correlation was documented between the Kt/V value and the various parameters of frailty status, but frailty status correlated significantly and inversely with physical activity in this group. Frailty in hemodialysis patients was significantly more pronounced in older individuals, although younger individuals were not exempt from it.
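    As a minimal sketch of the analysis the abstract describes, the snippet below computes single-pool Kt/V and correlates it with frailty scores via Spearman's correlation. The abstract does not state which Kt/V formula the study used; the second-generation Daugirdas equation is assumed here, and all patient values are invented for illustration.

    ```python
    # Hypothetical sketch: Kt/V via the second-generation Daugirdas
    # equation (an assumption; the study may have used another formula),
    # correlated with Clinical Frailty Scale (CFS) scores.
    import math
    from scipy.stats import spearmanr

    def single_pool_ktv(pre_bun, post_bun, hours, uf_litres, post_weight_kg):
        # Kt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF/W, with R = post/pre BUN.
        r = post_bun / pre_bun
        return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_litres / post_weight_kg

    # Invented per-patient inputs: (pre-BUN, post-BUN, session hours,
    # ultrafiltration volume in L, post-dialysis weight in kg).
    ktv = [single_pool_ktv(*p) for p in [
        (70, 25, 4.0, 2.5, 68),
        (65, 30, 3.5, 1.8, 80),
        (80, 24, 4.0, 3.0, 72),
        (60, 28, 3.0, 1.5, 90),
        (75, 22, 4.5, 2.0, 65),
    ]]
    cfs = [5, 4, 3, 6, 4]  # invented CFS scores (scale 1-9)

    rho, p = spearmanr(ktv, cfs)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
    ```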

    The Athena X-ray Integral Field Unit: a consolidated design for the system requirement review of the preliminary definition phase

    Instrumentation

    Osmotic diuresis in chronic kidney disease: its significance and clinical utility

    Introduction: The kidneys help maintain plasma osmolality within the normal range by achieving an adequate daily osmolar urine excretion (DOUE). An equation has been described for estimating the expected daily urine volume necessary to excrete the osmolar load required to keep serum osmolality in the normal range. From this equation, a difference between the real and expected daily osmolar diuresis (DOD) can be obtained; this difference is normally zero (± 500 mL). A positive DOD difference indicates reduced urine-concentrating capability, while a negative DOD difference indicates reduced urine-diluting capability. We therefore investigated how DOUE and the DOD difference change across the different stages of CKD.
    Materials and methods: 61 patients with CKD (stages I–V) secondary to glomerulopathies were studied. Creatinine clearance (CrCl), DOUE, and the difference between real and expected DOD were obtained for each patient. Correlations (Spearman) between CrCl and DOUE, and between CrCl and the real-expected DOD difference, were also computed.
    Results: The Spearman correlation between CrCl and DOUE was positive and significant (Spearman's ρ = 0.63, p < 0.0001). In addition, the CKD patients unable to achieve the minimal required DOUE (600 mOsm/day) were mostly those with CrCl < 40 mL/min. The Spearman correlation between CrCl and the real-expected DOD difference was negative and significant (Spearman's ρ = −0.4, p < 0.0013). An abnormal DOD difference (> 500 mL) was found in CKD patients with CrCl < 80 mL/min/1.73 m².
    Conclusion: Daily osmolar urine excretion and the difference between real and expected daily osmolar diuresis are simple, meaningful clinical parameters that can be used to easily evaluate urine concentration-dilution capability (tubular function) in CKD patients.
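    The abstract does not reproduce the equation itself. As a minimal sketch, the snippet below uses a common formulation (assumed here, not taken from the paper) in which the expected diuresis is the daily osmolar load divided by a normal serum osmolality; the reference osmolality and the example inputs are illustrative.

    ```python
    # Hypothetical sketch of DOUE and the real-vs-expected DOD difference.
    # The expected-diuresis formulation and the 290 mOsm/kg reference are
    # assumptions; only the 600 mOsm/day minimum and the +/- 500 mL normal
    # band come from the abstract.

    NORMAL_SERUM_OSM = 290.0   # mOsm/kg, assumed reference value
    MIN_DOUE = 600.0           # mOsm/day, minimal required excretion

    def doue(urine_osm, urine_volume_l):
        # Daily osmolar urine excretion = urine osmolality x daily volume.
        return urine_osm * urine_volume_l

    def dod_difference(urine_osm, urine_volume_l):
        # Real minus expected daily diuresis, in litres. The expected
        # volume excretes the same osmolar load at normal serum osmolality.
        # > +0.5 L suggests impaired concentration; < -0.5 L impaired dilution.
        expected_volume = doue(urine_osm, urine_volume_l) / NORMAL_SERUM_OSM
        return urine_volume_l - expected_volume

    # Example: 1.8 L/day of urine at 350 mOsm/kg.
    load = doue(350, 1.8)            # 630 mOsm/day -> above the 600 minimum
    diff = dod_difference(350, 1.8)  # 1.8 - 630/290 = -0.37 L -> within +/- 0.5 L
    print(f"DOUE = {load:.0f} mOsm/day, DOD difference = {diff:+.2f} L")
    ```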

    Terminal cancer: duration and prediction of survival time

    Abstract: The duration of the terminal period of cancer allows us to determine its prevalence, which is necessary for planning palliative care services. Clinical prediction of survival influences access to palliative care and the healthcare approach to be adopted. The objective of this study was to determine the duration of the terminal period, the ability of healthcare professionals to predict this terminal period, and the factors that can improve prognostic accuracy. On the island of Mallorca, Spain, we followed 200 cancer patients from the inception of the terminal period. Twenty-one symptoms, quality of life, prognosis, and duration of survival were measured. A predictive survival model was built using Cox regression. Median duration was 59 days (95% confidence interval (CI) = 49-69 days); mean duration was 99 days. The oncologists were accurate in their predictions (within ±1/3 of the actual duration) in 25.7% of cases, the nurses in 21.5%, and the family physicians in 21.7%. Errors of overestimation occurred 2.86-4.14 times more frequently than underestimation. In the final model, in addition to clinical prognosis (P = 0.0094), asthenia (P = 0.0257) and the Hebrew Rehabilitation Centre for Aged Quality of Life (HRCA-QL) Index (P = 0.0002) were shown to be independent predictors of survival. In this study, the estimated duration of the terminal period was greater than that reported in a series of palliative care programmes, and survival was overestimated. Oncologists could estimate prognosis more accurately if they also took into account asthenia and the HRCA-QL Index.
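    As a minimal sketch of the Cox proportional-hazards modelling the abstract describes, the snippet below fits such a model with the lifelines library. The column names and all values are invented, standing in for the three predictors retained in the study's final model (clinical prognosis, asthenia, and the HRCA-QL Index); this is not the study's dataset or exact specification.

    ```python
    # Hypothetical sketch: Cox proportional-hazards survival model with
    # illustrative data shaped like the study's final predictors.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "survival_days":      [30, 59, 99, 120, 45, 210, 15, 75],
        "death_observed":     [1, 1, 1, 1, 1, 0, 1, 1],     # 0 = censored
        "clinical_prognosis": [40, 60, 90, 150, 30, 180, 20, 70],  # clinician estimate, days
        "asthenia":           [1, 1, 0, 0, 1, 0, 1, 0],     # present / absent
        "hrca_ql_index":      [4, 6, 8, 9, 3, 10, 2, 7],    # quality-of-life score
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="survival_days", event_col="death_observed")
    cph.print_summary()   # hazard ratios and p-values per predictor
    ```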

    Surgical Staff Radiation Protection During Fluoroscopy-Guided Urologic Interventions

    INTRODUCTION: Over the past 20 years, the use of fluoroscopy to guide urologic surgical interventions has grown steadily. Urologists and other operating room (OR) staff are therefore exposed to X-radiation increasingly frequently in their daily practice, raising questions about the risks they face and the actions needed to reduce them.
    OBJECTIVE: To evaluate X-ray dose exposure in the members of the surgical team and to determine urologists' radioprotection knowledge and practices.
    MATERIALS AND METHODS: A prospective bicenter study was conducted within the AFUF (French urology resident association) and in association with the French Nuclear Safety Authority / Institute for Radiological Protection and Nuclear Safety (ASN/IRSN). Radiation exposure was measured on 12 operators using dosimeters (seven per operator), in staff-occupied locations in the OR using ionization chambers, and on anthropomorphic phantoms. A survey was used to gather information on the radiation knowledge and safety practices of AFUF members.
    RESULTS: Annual whole-body radiation doses were low (0.1-0.8 millisieverts [mSv], mostly around 0.3 mSv), as were equivalent doses for the fingers (0.7-15 mSv, mostly around 2.5 mSv) and for the lens of the eye (0.3-2.3 mSv, mostly around 0.7 mSv). In percutaneous nephrolithotomy, extremity doses were lower with the patient in dorsal decubitus than in ventral decubitus. Pulsed fluoroscopy reduced radiation dose exposure by a factor of 3 compared with continuous fluoroscopy, with no loss of image quality. Radiation safety practices were poor: only 15% of urologists wore dosimeters, and only 5% had been trained in the handling of X-ray generators.
    CONCLUSION: In the present study, radiation exposure for urologists was low, but so was knowledge of radiation safety and optimization practices. This lack of training in radiation safety and dose reduction, combined with novel techniques involving long fluoroscopy-guided interventions, could result in unnecessarily high exposure for patients and OR personnel.