226 research outputs found
New perspectives in pediatric dialysis technologies: the case for neonates and infants with acute kidney injury
Advancements in pediatric dialysis generally rely on adaptation of technology originally developed for adults. However, in the last decade, particular attention has been paid to neonatal extracorporeal therapies for acute kidney care, an area in which technology has made giant strides. Peritoneal dialysis (PD) is the kidney replacement therapy (KRT) of choice in the youngest age group because of its simplicity and effectiveness. However, extracorporeal blood purification provides more rapid clearance of solutes and faster fluid removal. Hemodialysis (HD) and continuous KRT (CKRT) are thus the most used dialysis modalities for pediatric acute kidney injury (AKI) in developed countries. The use of extracorporeal dialysis in small children entails a series of clinical and technical challenges that have discouraged the use of CKRT in this population. A revolution in the management of AKI in newborns has recently begun with the development of new CKRT machines for small infants. These devices have a small extracorporeal volume that potentially obviates the need to prime lines and dialyzer with blood, allows better volume control, and permits the use of small-sized catheters without compromising blood flow. Thanks to these new dedicated devices, we are witnessing a true "scientific revolution" in the management of neonates and infants who require acute kidney support.
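The abstract's point about small extracorporeal volumes avoiding blood priming can be sketched numerically. A minimal illustration, assuming the commonly cited clinical rule of thumb (not stated in the abstract) that a blood prime is considered when the circuit exceeds roughly 10% of the patient's estimated blood volume (~80 mL/kg in infants); the function name and thresholds are illustrative:

```python
# Hypothetical sketch: why a miniaturized CKRT circuit can avoid blood priming.
# Assumptions (rule of thumb, not from the abstract): blood prime needed when
# the circuit exceeds ~10% of estimated blood volume (~80 mL/kg in infants).

def needs_blood_prime(weight_kg: float, circuit_volume_ml: float,
                      blood_volume_ml_per_kg: float = 80.0,
                      threshold: float = 0.10) -> bool:
    """Return True if the circuit volume exceeds the threshold fraction
    of the patient's estimated total blood volume."""
    estimated_blood_volume = weight_kg * blood_volume_ml_per_kg
    return circuit_volume_ml / estimated_blood_volume > threshold

# A 3 kg neonate (~240 mL blood volume): an adult-sized circuit (~100 mL)
# would require a blood prime; a miniaturized circuit (~10 mL) would not.
print(needs_blood_prime(3.0, 100.0))  # True
print(needs_blood_prime(3.0, 10.0))   # False
```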
Validation and Performance Comparison of Two Scoring Systems Created Specifically to Predict the Risk of Deep Sternal Wound Infection after Bilateral Internal Thoracic Artery Grafting
Background: The Gatti and the bilateral internal mammary artery (BIMA) scores were created to predict the risk of deep sternal wound infection (DSWI) after bilateral internal thoracic artery (BITA) grafting. Methods: Both scores were evaluated retrospectively in two consecutive series of patients undergoing isolated multi-vessel coronary surgical procedures, i.e., the Trieste (n = 1,122; BITA use, 52.1%; rate of DSWI, 5.7%) and the Besançon cohort (n = 721; BITA use, 100%; rate of DSWI, 2.5%). Baseline patient characteristics were compared between the two validation samples. For each score, the accuracy of prediction and predictive power were assessed by the area under the receiver-operating characteristic curve (AUC) and the Goodman-Kruskal gamma coefficient, respectively. Results: There were significant differences between the two series in terms of age, gender, New York Heart Association functional class, chronic lung disease, left ventricular function, surgical priority, and the surgical techniques used. In the Trieste series, accuracy of prediction of the Gatti score for DSWI was higher than that of the BIMA score (AUC, 0.729 vs. 0.620, p = 0.0033). The difference was not significant, however, in the Besançon series (AUC, 0.845 vs. 0.853, p = 0.880) and when only BITA patients of the Trieste series were considered for analysis (AUC, 0.738 vs. 0.665, p = 0.157). In both series, predictive power was at least moderate for the Gatti score and low for the BIMA score. Conclusions: The Gatti and the BIMA scores seem to be useful for pre-operative evaluation of the risk of DSWI after BITA grafting. Further validation studies should be performed.
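The two validation metrics the abstract names can be computed directly from case/non-case score pairs. An illustrative sketch (not the authors' code, and on toy data): AUC as the probability that a randomly chosen DSWI case outscores a randomly chosen non-case (ties count half), and the Goodman-Kruskal gamma from concordant and discordant pairs:

```python
# Illustrative sketch of the two discrimination metrics used in the abstract.
from itertools import product

def auc(scores_pos, scores_neg):
    """Probability that a random positive outscores a random negative
    (ties count as half a win) -- equivalent to the ROC AUC."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p, n in product(scores_pos, scores_neg))
    return wins / (len(scores_pos) * len(scores_neg))

def gamma(scores_pos, scores_neg):
    """Goodman-Kruskal gamma: (concordant - discordant) / (concordant + discordant),
    ignoring tied pairs."""
    concordant = sum(1 for p, n in product(scores_pos, scores_neg) if p > n)
    discordant = sum(1 for p, n in product(scores_pos, scores_neg) if p < n)
    return (concordant - discordant) / (concordant + discordant)

# Toy data: risk-score values among infected (pos) and non-infected (neg) patients.
pos = [4, 5, 3, 6]
neg = [1, 2, 3, 2]
print(round(auc(pos, neg), 3))    # 0.969
print(round(gamma(pos, neg), 3))  # 1.0
```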
Dynamic Collection Scheduling Using Remote Asset Monitoring: Case Study in the UK Charity Sector
Remote sensing technology is now coming onto the market in the waste collection sector. It allows waste and recycling receptacles to report their fill levels at regular intervals, so that collection schedules can be optimized dynamically to better meet true servicing needs, reducing transport costs and ensuring that visits to clients are made in a timely fashion. This paper describes a real-life logistics problem faced by a leading UK charity that services its textile and book donation banks and its high street stores using a common fleet of vehicles with various carrying capacities. Use of a common fleet gives rise to a vehicle routing problem in which visits to stores occur on fixed days of the week with time window constraints, while visits to banks (fitted with remote fill-monitoring technology) must be made in a timely fashion so that the banks do not become full before collection. A tabu search algorithm was developed to provide vehicle routes for the next day of operation on the basis of profit maximization. A longer look-ahead period was not considered because donation rates to banks are highly variable. The algorithm included parameters specifying the minimum fill level (e.g., 50%) required to allow a visit to a bank and a penalty function used to encourage visits to banks that are becoming full. The results showed that the algorithm significantly reduced visits to banks and increased profit by up to 2.4%, with the best performance obtained when donation rates were more variable.
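The visit-eligibility logic the abstract describes (a minimum fill threshold plus a penalty that grows as a bank nears capacity) can be sketched as follows. This is a hedged illustration only: the function name, the penalty shape, and the weights are invented here, not taken from the authors' tabu search implementation:

```python
# Hypothetical sketch of fill-threshold visit eligibility with a penalty term
# that rises steeply as a bank approaches full (shape and weights invented).

def visit_priority(fill_level: float, min_fill: float = 0.5,
                   penalty_weight: float = 100.0):
    """Return None if the bank is below the visit threshold; otherwise a
    priority value that grows without bound as fill_level approaches 1.0,
    encouraging timely visits to banks that are becoming full."""
    if fill_level < min_fill:
        return None  # not yet worth a visit
    return penalty_weight * (fill_level - min_fill) / (1.0 - fill_level + 1e-9)

for level in (0.3, 0.5, 0.8, 0.95):
    print(level, visit_priority(level))
```

A routing heuristic could then rank candidate bank visits by this priority when building the next day's routes.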
Prostate Cancer Treatment-Related Toxicity: Comparison between 3D-Conformal Radiation Therapy (3D-CRT) and Volumetric Modulated Arc Therapy (VMAT) Techniques
Objective: This paper illustrates the results of a mono-institutional registry trial, aimed to test whether gastrointestinal (GI) and genitourinary (GU) toxicity rates were lower in localized prostate cancer patients treated with image-guided volumetric modulated arc therapy (IG-VMAT) compared to those treated with IG-3D conformal radiation therapy (IG-3DCRT). Materials and Methods: Histologically proven prostate cancer patients with organ-confined disease, treated between October 2008 and September 2014 with moderately hypofractionated radiotherapy, were reviewed. Fiducial markers were placed in the prostate gland under transrectal ultrasound guidance. The prescribed total dose was 70 Gy in 28 fractions. The mean and median dose volume constraints for bladder and rectum as well as total volume of treatment were analyzed as potentially prognostic factors influencing toxicity. The Kaplan–Meier method was applied to calculate survival. Results: Overall, 83 consecutive patients were included. Forty-two (50.6%) patients were treated with 3D-CRT and 41 (49.4%) with the VMAT technique. The median follow-up for toxicity was 77.26 months for the whole cohort. VMAT allowed for a dose reduction to the rectum and bladder for the large majority of the considered parameters; nonetheless, the only parameter correlated with a clinical outcome on rectum dose volume histogram (DVH) analysis was a rectal dose limit of V66 > 8.5% for severe (grade ≥ 2) late GI toxicity (p = 0.045). Rates of grade ≥ 2 toxicities were low across the whole IGRT-treated cohort. Conclusions: This study shows that moderate hypofractionation is feasible and safe in patients with intermediate- and high-risk prostate cancer.
Daily IGRT may decrease acute and late toxicity to organs at risk and improve clinical benefit and disease control, cutting down the risk of geographically missing the PTV. The adoption of VMAT yielded promising results in terms of OAR sparing and a reduction in toxicity that, possibly owing to the small sample, did not reach statistical significance.
The Smc5/6 complex is required for dissolution of DNA-mediated sister chromatid linkages
Mitotic chromosome segregation requires the removal of physical connections between sister chromatids. In addition to cohesin and topological entrapments, sister chromatid separation can be prevented by the presence of chromosome junctions or ongoing DNA replication; we collectively refer to these as DNA-mediated linkages. Although structures of this type have been documented in different DNA replication and repair mutants, no essential mechanism ensuring their timely removal before mitosis was previously known. Here, we show that the dissolution of these connections is an active process that requires the Smc5/6 complex, together with Mms21, its associated SUMO ligase. Failure to remove DNA-mediated linkages causes gross chromosome missegregation in anaphase. Moreover, we show that Smc5/6 is capable of dissolving them in metaphase-arrested cells, thus restoring chromosome resolution and segregation. We propose that Smc5/6 has an essential role in the removal of DNA-mediated linkages to prevent chromosome missegregation and aneuploidy.
"You have to get wet to learn how to swim" applied to bridging the gap between research into personnel scheduling and its implementation in practice
Personnel scheduling problems have attracted research interest for several decades. They have changed considerably over time, accommodating a variety of constraints related to legal and organisational requirements, part-time staff, flexible working hours, staff preferences, etc. This has led to a myriad of approaches for solving personnel scheduling problems, including optimisation, meta-heuristics, artificial intelligence, decision support, and hybrids of these approaches. However, this does not imply that the research has had a large impact on practice, or that state-of-the-art models and algorithms are widely used in organisations. One can find a reasonably large number of software packages that aim to assist in personnel scheduling. A classification of this software based on its purpose is proposed, accompanied by a discussion of the level of support that the software offers to schedulers. A general conclusion is that the available software, with some exceptions, does not benefit from the wealth of developed models and methods. The remainder of the paper provides insights into some characteristics of real-world scheduling problems that, in the author's opinion, have not yet been given due attention in the personnel scheduling research community and which could contribute to bringing research results into practice. The concluding remarks are that, in order to bridge the gap that still exists between research into personnel scheduling and practice, we need to engage more with schedulers in practice and also with software developers; one may say we need to get wet if we want to learn how to swim.
Sources and Sinks of Greenhouse Gases from European Grasslands and Mitigation Options: The ‘GreenGrass’ Project
Adapting the management of grasslands may be used to enhance carbon sequestration in soil, but could also increase N2O and CH4 emissions. In support of the European post-Kyoto policy, the European 'GreenGrass' project (EC FP5, EVK2-CT2001-00105) has three main objectives: i) to reduce the large uncertainties in the estimates of CO2, N2O and CH4 fluxes to and from grassland plots under different climatic conditions and assess their global warming potential; ii) to measure net greenhouse gas (GHG) fluxes under different management regimes that reflect potential mitigation options; iii) to construct a model of the controlling processes to quantify the net fluxes and to evaluate mitigation scenarios by up-scaling to a European level.
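Assessing the combined global warming potential of CO2, N2O and CH4 fluxes typically means aggregating them to CO2 equivalents. A minimal sketch of that bookkeeping, assuming IPCC-style 100-year GWP factors (25 for CH4, 298 for N2O); these conventions and the toy flux numbers are illustrative, not figures from the project:

```python
# Illustrative sketch: aggregating per-gas grassland fluxes to CO2 equivalents.
# GWP100 factors below are IPCC-style conventions (assumed, not from the project).
GWP100 = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

def co2_equivalent(fluxes_kg: dict) -> float:
    """Convert per-gas fluxes (kg/ha/yr; negative = net uptake) into a
    single CO2-equivalent balance."""
    return sum(GWP100[gas] * kg for gas, kg in fluxes_kg.items())

# A plot sequestering CO2 can still be a net GHG source once CH4 and N2O
# emissions are weighted by their warming potential.
print(co2_equivalent({"CO2": -500.0, "CH4": 10.0, "N2O": 2.0}))  # 346.0
```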
"Delirium Day": A nationwide point prevalence study of delirium in older hospitalized patients using an easy standardized diagnostic tool
Background: To date, delirium prevalence in adult acute hospital populations has been estimated generally from pooled findings of single-center studies and/or among specific patient populations. Furthermore, the number of participants in these studies has not exceeded a few hundred. To overcome these limitations, we have determined, in a multicenter study, the prevalence of delirium over a single day among a large population of patients admitted to acute and rehabilitation hospital wards in Italy. Methods: This is a point prevalence study (called "Delirium Day") including 1867 older patients (aged 65 years or more) across 108 acute and 12 rehabilitation wards in Italian hospitals. Delirium was assessed on the same day in all patients using the 4AT, a validated and briefly administered tool which does not require training. We also collected data regarding motoric subtypes of delirium, functional and nutritional status, dementia, comorbidity, medications, feeding tubes, peripheral venous and urinary catheters, and physical restraints. Results: The mean sample age was 82.0 ± 7.5 years (58% female). Overall, 429 patients (22.9%) had delirium. Hypoactive was the commonest subtype (132/344 patients, 38.5%), followed by mixed, hyperactive, and nonmotoric delirium. The prevalence was highest in Neurology (28.5%) and Geriatrics (24.7%), lowest in Rehabilitation (14.0%), and intermediate in Orthopedic (20.6%) and Internal Medicine wards (21.4%).
In a multivariable logistic regression, age (odds ratio [OR] 1.03, 95% confidence interval [CI] 1.01-1.05), Activities of Daily Living dependence (OR 1.19, 95% CI 1.12-1.27), dementia (OR 3.25, 95% CI 2.41-4.38), malnutrition (OR 2.01, 95% CI 1.29-3.14), and use of antipsychotics (OR 2.03, 95% CI 1.45-2.82), feeding tubes (OR 2.51, 95% CI 1.11-5.66), peripheral venous catheters (OR 1.41, 95% CI 1.06-1.87), urinary catheters (OR 1.73, 95% CI 1.30-2.29), and physical restraints (OR 1.84, 95% CI 1.40-2.40) were associated with delirium. Admission to Neurology wards was also associated with delirium (OR 2.00, 95% CI 1.29-3.14), while admission to other settings was not. Conclusions: Delirium occurred in more than one out of five patients in acute and rehabilitation hospital wards. Prevalence was highest in Neurology and lowest in Rehabilitation divisions. The "Delirium Day" project might become a useful method to assess delirium across hospital settings and a benchmarking platform for future surveys.
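The odds ratios above come from a multivariable logistic model, where each coefficient beta maps to OR = exp(beta) and a patient's predicted risk combines the coefficients on the log-odds scale. A minimal sketch of that relationship: the betas below are back-derived from three of the abstract's ORs for illustration, and the intercept is invented:

```python
# Sketch of how reported ORs relate to logistic-regression coefficients.
# Betas are back-derived from the abstract's ORs; the intercept is hypothetical.
import math

odds_ratios = {"dementia": 3.25, "malnutrition": 2.01, "antipsychotics": 2.03}
betas = {k: math.log(v) for k, v in odds_ratios.items()}  # OR = exp(beta)
intercept = -2.0  # invented baseline log-odds, for illustration only

def delirium_probability(features: dict) -> float:
    """Predicted probability from the log-odds of the present risk factors."""
    log_odds = intercept + sum(betas[k] for k, present in features.items()
                               if present)
    return 1.0 / (1.0 + math.exp(-log_odds))

p = delirium_probability({"dementia": True, "malnutrition": False,
                          "antipsychotics": True})
print(round(p, 3))
```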
Association of kidney disease measures with risk of renal function worsening in patients with type 1 diabetes
Background: Albuminuria has classically been considered a marker of kidney damage progression in diabetic patients and is routinely assessed to monitor kidney function. However, the role of a mild GFR reduction in the development of stage ≥3 CKD has been less explored in type 1 diabetes mellitus (T1DM) patients. The aim of the present study was to evaluate the prognostic role of kidney disease measures, namely albuminuria and reduced GFR, for the development of stage ≥3 CKD in a large cohort of patients affected by T1DM. Methods: A total of 4284 patients affected by T1DM and followed up at 76 diabetes centers participating in the Italian Association of Clinical Diabetologists (Associazione Medici Diabetologi, AMD) initiative constitute the study population. Urinary albumin excretion (ACR) and estimated GFR (eGFR) were retrieved and analyzed. The incidence of stage ≥3 CKD (eGFR < 60 mL/min/1.73 m2) or an eGFR reduction > 30% from baseline was evaluated. Results: The mean estimated GFR was 98 ± 17 mL/min/1.73 m2 and the proportion of patients with albuminuria was 15.3% (n = 654) at baseline. About 8% (n = 337) of patients developed one of the two renal endpoints during the 4-year follow-up period. Age, albuminuria (micro or macro), and baseline eGFR < 90 mL/min/1.73 m2 were independent risk factors for stage ≥3 CKD and renal function worsening. When compared to patients with eGFR > 90 mL/min/1.73 m2 and normoalbuminuria, those with albuminuria at baseline had a 1.69-fold greater risk of reaching stage 3 CKD, while patients with a mild eGFR reduction (i.e., eGFR between 90 and 60 mL/min/1.73 m2) showed a 3.81-fold greater risk, which rose to 8.24-fold for patients with both albuminuria and mild eGFR reduction at baseline. Conclusions: Albuminuria and eGFR reduction represent independent risk factors for incident stage ≥3 CKD in T1DM patients. The simultaneous occurrence of reduced eGFR and albuminuria has a synergistic effect on renal function worsening.
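The composite renal endpoint the abstract defines is a simple disjunction of two conditions, which can be stated directly in code. A minimal sketch (function name and example values are illustrative):

```python
# Sketch of the composite renal endpoint from the abstract: incident stage >=3
# CKD (eGFR < 60 mL/min/1.73 m2) OR an eGFR decline of more than 30% from
# baseline. eGFR values are in mL/min/1.73 m2.

def reached_renal_endpoint(baseline_egfr: float, followup_egfr: float) -> bool:
    stage3 = followup_egfr < 60.0
    large_decline = (baseline_egfr - followup_egfr) / baseline_egfr > 0.30
    return stage3 or large_decline

print(reached_renal_endpoint(98.0, 65.0))  # True: >30% decline despite eGFR >= 60
print(reached_renal_endpoint(98.0, 75.0))  # False: neither condition met
print(reached_renal_endpoint(70.0, 55.0))  # True: stage >= 3 CKD
```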