
    Trends in Characteristics of Patients Listed for Liver Transplantation Will Lead to Higher Rates of Waitlist Removal Due to Clinical Deterioration

    BACKGROUND: Changes in the epidemiology of end-stage liver disease may lead to increased risk of dropout from the liver transplant waitlist. Anticipating the future of liver transplant waitlist characteristics is vital when considering organ allocation policy. METHODS: We performed a discrete event simulation to forecast patient characteristics and rate of waitlist dropout. Estimates were simulated from 2015 to 2025. The model was informed by data from the Organ Procurement and Transplant Network, 2003 to 2014. National data are estimated along with forecasts for 2 regions. RESULTS: Nonalcoholic steatohepatitis will increase from 18% of waitlist additions to 22% by 2025. Hepatitis C will fall from 30% to 21%. Listings over age 60 years will increase from 36% to 48%. The hazard of dropout will increase from 41% to 46% nationally. Wait times for transplant for patients listed with a Model for End-Stage Liver Disease (MELD) between 22 and 27 will double. Region 5, which transplants at relatively higher MELD scores, will experience an increase from 53% to 64% waitlist dropout. Region 11, which transplants at lower MELD scores, will have an increase in waitlist dropout from 30% to 44%. CONCLUSIONS: The liver transplant waitlist size will remain static over the next decade due to patient dropout. Liver transplant candidates will be older, more likely to have nonalcoholic steatohepatitis and will wait for transplantation longer even when listed at a competitive MELD score. There will continue to be significant heterogeneity among transplant regions where some patients will be more likely to drop out of the waitlist than receive a transplant
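
    As a rough illustration of the forecasting approach, the sketch below runs a heavily simplified, time-stepped waitlist simulation in Python. All parameters (listing rate, annual transplant and dropout probabilities, horizon) are hypothetical placeholders, not the OPTN-derived estimates used in the study, and the yearly time step is a simplification of a true discrete event model.

```python
import random

# Minimal, illustrative simulation of a transplant waitlist.
# All parameters are hypothetical placeholders, not the OPTN-derived
# estimates used in the study.

ARRIVALS_PER_YEAR = 1000          # hypothetical listing rate
ANNUAL_TRANSPLANT_PROB = 0.35     # hypothetical chance of transplant per year on the list
ANNUAL_DROPOUT_PROB = 0.20        # hypothetical chance of removal for deterioration/death
HORIZON_YEARS = 10

def simulate(seed=0):
    rng = random.Random(seed)
    waitlist = []                  # each entry is years waited so far
    transplanted = dropped = 0
    for _ in range(HORIZON_YEARS):
        waitlist.extend([0.0] * ARRIVALS_PER_YEAR)   # new listings this year
        remaining = []
        for wait in waitlist:
            u = rng.random()
            if u < ANNUAL_TRANSPLANT_PROB:
                transplanted += 1
            elif u < ANNUAL_TRANSPLANT_PROB + ANNUAL_DROPOUT_PROB:
                dropped += 1
            else:
                remaining.append(wait + 1.0)
        waitlist = remaining
    return transplanted, dropped, len(waitlist)

if __name__ == "__main__":
    tx, dropout, still_waiting = simulate()
    print(f"transplanted={tx} dropped={dropout} still waiting={still_waiting}")
```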

    Declining liver graft quality threatens the future of liver transplantation in the United States

    National liver transplantation (LT) volume has declined since 2006, in part because of worsening donor organ quality. Trends that degrade organ quality are expected to continue over the next 2 decades. We used the United Network for Organ Sharing (UNOS) database to inform a 20-year discrete event simulation estimating LT volume from 2010 to 2030. Data to inform the model were obtained from deceased organ donors between 2000 and 2009. If donor liver utilization practices remain constant, utilization will fall from 78% to 44% by 2030, resulting in 2230 fewer LTs. If transplant centers increase their risk tolerance for marginal grafts, utilization would decrease to 48%. The institution of "opt-out" organ donation policies to increase the donor pool would still result in 1380 to 1866 fewer transplants. Ex vivo perfusion techniques that increase the use of marginal donor livers may stabilize LT volume. Otherwise, the number of LTs in the United States will decrease substantially over the next 15 years. In conclusion, the transplant community will need to accept inferior grafts and potentially worse posttransplant outcomes and/or develop new strategies for increasing organ donation and utilization in order to maintain the number of LTs at the current level
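
    The volume projections above amount to donors multiplied by the utilization rate. The short sketch below makes that scenario arithmetic explicit; the donor count is a hypothetical placeholder, and only the 78%, 44% and 48% utilization figures come from the abstract.

```python
# Illustrative scenario arithmetic: transplant volume = donors x utilization.
# The donor count is a hypothetical placeholder; only the utilization rates
# (78% baseline, 44% constant-practice, 48% higher risk tolerance) come from
# the abstract.

donors_2030 = 8000  # hypothetical number of deceased-donor livers offered in 2030

scenarios = {
    "baseline (2010-style) utilization": 0.78,
    "constant practices, 2030": 0.44,
    "higher risk tolerance, 2030": 0.48,
}

baseline = donors_2030 * scenarios["baseline (2010-style) utilization"]
for name, rate in scenarios.items():
    volume = donors_2030 * rate
    print(f"{name}: {volume:.0f} transplants ({volume - baseline:+.0f} vs baseline)")
```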

    Knowledge, Awareness and Practice with Antimicrobial Stewardship Programmes among Healthcare Providers in a Ghanaian Tertiary Hospital

    Antimicrobial resistance (AMR) is a significant problem in global health today, particularly in low- and middle-income countries (LMICs), where antimicrobial stewardship (AMS) programmes are yet to be successfully implemented. We established a partnership between AMR pharmacists from a UK NHS hospital and Ho Teaching Hospital, with the aim of enhancing antimicrobial stewardship knowledge and practice among healthcare providers through an educational intervention. We employed a mixed-methods approach that included a survey of knowledge and awareness before and after training, followed by qualitative interviews with healthcare providers conducted six months after delivery of the training. The study was carried out in two phases at Ho Teaching Hospital with healthcare professionals, including pharmacists, medical doctors, nurses and medical laboratory scientists. Ethical approval was obtained prior to data collection. In the first phase, we surveyed 50 healthcare providers, including nurses (33%), pharmacists (29%) and biomedical scientists (23%). Of these, 58% had engaged in continuous professional development on AMR/AMS, and more than 95% demonstrated good knowledge of the general use of antibiotics. A total of 18 participants, comprising four medical doctors, five pharmacists, four nurses, two midwives and three biomedical scientists, were interviewed in the second phase and demonstrated greater awareness of AMS practices, particularly the role of education for patients as well as healthcare professionals. We found that knowledge of and practice with AMS were markedly improved six months after the training session. The practice of AMS in LMICs remains limited; however, through AMR-focused training, we demonstrated improved AMS skills and practice among healthcare providers at Ho Teaching Hospital. There is a need for continuous AMR training sessions for healthcare professionals in resource-limited settings

    Non-invasive detection of ischemic vascular damage in a pig model of liver donation after circulatory death

    Background and Aims: Liver graft quality is evaluated by visual inspection prior to transplantation, a process highly dependent on the surgeon's experience. We present an objective, noninvasive, quantitative way of assessing liver quality in real time using Raman spectroscopy, a laser-based tool for analyzing biomolecular composition. Approach and Results: A porcine model of donation after circulatory death (DCD) with normothermic regional perfusion (NRP) allowed assessment of liver quality premortem, during warm ischemia (WI) and post-NRP. Ten percent of the circulating blood volume was removed in half of the experiments to simulate blood recovery for DCD heart removal. Left median lobe biopsies were obtained before circulatory arrest, after 45 minutes of WI, and after 2 hours of NRP, and were analyzed using spontaneous Raman spectroscopy, stimulated Raman spectroscopy (SRS), and staining. Measurements were also taken in situ at these time points from the left median and right lateral lobes using a handheld Raman spectrometer. Raman microspectroscopy detected congestion during WI by measuring the intrinsic Raman signal of hemoglobin in red blood cells (RBCs), eliminating the need for exogenous labels. Critically, this microvascular damage was not observed during WI when 10% of the circulating blood was removed before cardiac arrest. Two hours of NRP effectively cleared RBCs from congested livers. Intact RBCs were visualized rapidly at high resolution using SRS. The optical properties of ischemic livers, measured with the handheld Raman spectrometer, were significantly different from those of preischemic and post-NRP livers. Conclusions: Raman spectroscopy is an effective tool for detecting microvascular damage, and it could assist the decision to use marginal livers for transplantation. Reducing the volume of circulating blood before circulatory arrest in DCD may help reduce microvascular damage
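
    One way congestion could be quantified from spectra like these is to integrate a hemoglobin-associated Raman band above a local baseline. The sketch below illustrates that idea on a synthetic spectrum; the band window, the crude baseline handling and the synthetic data are illustrative assumptions, not the processing pipeline used in the study.

```python
import numpy as np

# Illustrative quantification of a hemoglobin-associated Raman band.
# The band window (~1540-1640 cm^-1), the linear baseline and the synthetic
# spectrum are assumptions for illustration, not the study's actual pipeline.

def band_area(wavenumbers, intensities, lo=1540.0, hi=1640.0):
    """Integrate the baseline-corrected intensity over [lo, hi] cm^-1."""
    mask = (wavenumbers >= lo) & (wavenumbers <= hi)
    x, y = wavenumbers[mask], intensities[mask]
    baseline = np.linspace(y[0], y[-1], y.size)   # crude linear baseline
    return np.trapz(np.clip(y - baseline, 0, None), x)

# Synthetic example spectrum: flat background plus a Gaussian "hemoglobin" band.
wn = np.linspace(800, 1800, 2000)
spectrum = 0.05 * np.ones_like(wn) + 1.0 * np.exp(-((wn - 1585) ** 2) / (2 * 15 ** 2))

print(f"band area: {band_area(wn, spectrum):.1f} (higher suggests more RBC congestion)")
```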

    Towards improved cover glasses for photovoltaic devices

    For the solar energy industry to increase its competitiveness, there is a global drive to lower the cost of solar-generated electricity. Photovoltaic (PV) module assembly is material-demanding, and the cover glass constitutes a significant proportion of the cost. Currently, 3 mm thick glass is the predominant cover material for PV modules, accounting for 10-25% of the total cost. Here we review the state of the art of cover glasses for PV modules and present our recent results on improving the glass. These improvements were demonstrated in terms of mechanical, chemical and optical properties by optimizing the glass composition, including the addition of novel dopants, to produce cover glasses that can provide: (i) enhanced UV protection of polymeric PV module components, potentially increasing module service lifetimes; (ii) re-emission of a proportion of the absorbed UV photon energy as visible photons capable of being absorbed by the solar cells, thereby increasing PV module efficiencies; and (iii) successful laboratory-scale proof-of-concept demonstrations, with increases of 1-6% in Isc and 1-8% in Ipm. Improvements in both the chemical and crack resistance of the cover glass were also achieved through modest chemical reformulation, highlighting what may be achievable within existing manufacturing technology constraints
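
    At the maximum power point, module power is approximately Ipm × Vpm, so if the voltage is unchanged a 1-8% gain in Ipm maps directly onto a 1-8% gain in power. The small calculation below illustrates this; the baseline current and voltage are hypothetical values, and only the 1-8% gain range comes from the abstract.

```python
# Illustrative estimate of module power gain from an Ipm improvement.
# P_mp = I_mp * V_mp; with V_mp unchanged, the relative power gain equals the
# relative current gain. The baseline values below are hypothetical.

v_mp = 33.0   # hypothetical maximum-power-point voltage (V)
i_mp = 9.0    # hypothetical maximum-power-point current (A)

for gain in (0.01, 0.08):   # 1% and 8% Ipm gains, the range reported in the abstract
    p_base = v_mp * i_mp
    p_new = v_mp * i_mp * (1 + gain)
    print(f"Ipm +{gain:.0%}: {p_base:.1f} W -> {p_new:.1f} W (+{p_new - p_base:.1f} W)")
```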

    Characteristics and properties of nano-LiCoO2 synthesized by pre-organized single source precursors: Li-ion diffusivity, electrochemistry and biological assessment

    Background: LiCoO2 is one of the most widely used cathode materials in Li-ion batteries. Its conventional synthesis requires high temperatures (>800 °C) and long heating times (>24 h) to obtain the micron-scale rhombohedral layered high-temperature phase of LiCoO2 (HT-LCO). Nanoscale HT-LCO is of interest for improving battery performance, as the lithium ion (Li+) pathway is expected to be shorter in nanoparticles than in micron-sized ones. Since batteries typically get recycled, exposure to nanoparticles during this process needs to be evaluated. Results: Several new single source precursors containing lithium (Li+) and cobalt (Co2+) ions, based on alkoxides and aryloxides, have been structurally characterized and were thermally transformed into nanoscale HT-LCO at 450 °C within a few hours. The size of the nanoparticles depends on the precursor, determining the electrochemical performance. The Li-ion diffusion coefficients of our LiCoO2 nanoparticles improved by at least a factor of 10 compared to the commercial material, while showing good reversibility upon charging and discharging. The hazard of occupational exposure to nanoparticles during battery recycling was investigated with an in vitro multicellular lung model. Conclusions: Our heterobimetallic single source precursors dramatically reduce the production temperature and time for HT-LCO. The obtained LiCoO2 nanoparticles have faster kinetics for Li+ insertion/extraction compared to microparticles. Overall, nano-sized LiCoO2 particles indicate a lower cytotoxic and (pro-)inflammogenic potential in vitro compared to their micron-sized counterparts. However, nanoparticles aggregate in air and behave partially like microparticles
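
    The kinetic argument for nanoparticles, a shorter Li+ diffusion path, can be made concrete with the characteristic diffusion time tau ≈ L²/D. The sketch below compares a micron-sized and a nano-sized particle; the radii and the diffusion coefficient are illustrative assumptions, not the values measured in the study.

```python
# Characteristic solid-state diffusion time, tau ~ L^2 / D, illustrating why
# shrinking the particle size speeds up Li+ insertion/extraction.
# The particle radii and diffusion coefficient are illustrative assumptions,
# not the values measured in the study.

D = 1e-11  # cm^2/s, assumed chemical diffusion coefficient of Li+ in LiCoO2

for label, radius_cm in (("10 um particle", 10e-4), ("100 nm particle", 100e-7)):
    tau = radius_cm ** 2 / D   # seconds to diffuse across the particle radius
    print(f"{label}: tau ~ {tau:.3g} s")
```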

    Intraperitoneal drain placement and outcomes after elective colorectal surgery: international matched, prospective, cohort study

    Despite current guidelines, intraperitoneal drain placement after elective colorectal surgery remains widespread. Drains were not associated with earlier detection of intraperitoneal collections, but were associated with prolonged hospital stay and an increased risk of surgical-site infections. Background: Many surgeons routinely place intraperitoneal drains after elective colorectal surgery. However, enhanced recovery after surgery guidelines recommend against their routine use owing to a lack of clear clinical benefit. This study aimed to describe international variation in intraperitoneal drain placement and the safety of this practice. Methods: COMPASS (COMPlicAted intra-abdominal collectionS after colorectal Surgery) was a prospective, international, cohort study which enrolled consecutive adults undergoing elective colorectal surgery (February to March 2020). The primary outcome was the rate of intraperitoneal drain placement. Secondary outcomes included: rate of and time to diagnosis of postoperative intraperitoneal collections; rate of surgical-site infections (SSIs); time to discharge; and 30-day major postoperative complications (Clavien-Dindo grade III or higher). After propensity score matching, multivariable logistic regression and Cox proportional hazards regression were used to estimate the independent association of the secondary outcomes with drain placement. Results: Overall, 1805 patients from 22 countries were included (798 women, 44.2 per cent; median age 67.0 years). The drain insertion rate was 51.9 per cent (937 patients). After matching, drains were not associated with reduced rates (odds ratio (OR) 1.33, 95 per cent c.i. 0.79 to 2.23; P = 0.287) or earlier detection (hazard ratio (HR) 0.87, 0.33 to 2.31; P = 0.780) of collections. Although not associated with worse major postoperative complications (OR 1.09, 0.68 to 1.75; P = 0.709), drains were associated with delayed hospital discharge (HR 0.58, 0.52 to 0.66; P < 0.001) and an increased risk of SSIs (OR 2.47, 1.50 to 4.05; P < 0.001). Conclusion: Intraperitoneal drain placement after elective colorectal surgery is not associated with earlier detection of postoperative collections, but prolongs hospital stay and increases SSI risk
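
    The analysis pattern described (propensity score matching followed by regression on the matched cohort) can be sketched as follows on synthetic data. The covariates, the 1:1 greedy nearest-neighbour matching rule and the effect sizes are simplifying assumptions for illustration, not the COMPASS protocol or its results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Minimal sketch: 1:1 nearest-neighbour propensity score matching, then a
# logistic regression on the matched cohort to estimate an odds ratio.
# Data, covariates and matching rule are simplifying assumptions, not the
# COMPASS study protocol.

rng = np.random.default_rng(0)
n = 2000
age = rng.normal(65, 10, n)
open_surgery = rng.integers(0, 2, n)
X = np.column_stack([age, open_surgery])

# Treatment (drain placement) depends on covariates; outcome (SSI) on both.
drain = rng.binomial(1, 1 / (1 + np.exp(-(-3 + 0.04 * age + 0.5 * open_surgery))))
ssi = rng.binomial(1, 1 / (1 + np.exp(-(-4 + 0.03 * age + 0.8 * drain))))

# 1. Propensity score: P(drain | covariates).
ps = LogisticRegression(max_iter=1000).fit(X, drain).predict_proba(X)[:, 1]

# 2. Greedy 1:1 nearest-neighbour matching on the propensity score.
treated = np.where(drain == 1)[0]
controls = list(np.where(drain == 0)[0])
pairs = []
for t in treated:
    if not controls:
        break
    j = min(controls, key=lambda c: abs(ps[c] - ps[t]))
    pairs.append((t, j))
    controls.remove(j)
idx = np.array([i for pair in pairs for i in pair])

# 3. Odds ratio for drain -> SSI in the matched cohort.
fit = LogisticRegression(max_iter=1000).fit(drain[idx].reshape(-1, 1), ssi[idx])
print(f"matched-cohort odds ratio: {np.exp(fit.coef_[0, 0]):.2f}")
```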