
    From Physics to Fixtures to Food: Current and Potential LED Efficacy

    Light-emitting diodes (LEDs) have enabled a historic increase in the conversion of electric energy to photons, but this is approaching a physical limit. The theoretical maximum efficiency occurs when all input energy is converted to energy in photosynthetic photons. Blue LEDs can be 93% efficient, phosphor-converted “whites” 76% efficient, and red LEDs 81% efficient. These improvements open new opportunities for horticultural lighting. Here we review (1) fundamental physics and efficiency of LEDs, (2) the current efficacy of LEDs, (3) the effect of spectral quality on crop yield, and (4) the potential efficacy of horticultural fixtures. Advances in the conversion of photons to yield can be achieved by optimization of spectral effects on plant morphology, which vary among species. Conversely, spectral effects on photosynthesis are remarkably similar across species, but the conventional definition of photosynthetic photons (400–700 nm) may need to be modified. The upper limit of LED fixture efficacy is determined by the LED package efficacy multiplied by four factors inherent to all fixtures: current droop, thermal droop, driver (power supply) inefficiencies, and optical losses. With current LED technology, the calculations indicate efficacy limits of 3.4 µmol J−1 for white + red fixtures, and 4.1 µmol J−1 for blue + red fixtures. Adding optical protection from water and high humidity reduces these values by ~10%. We describe tradeoffs between peak efficacy and cost
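The fixture-efficacy limit described above is the LED package efficacy multiplied by four loss factors. A minimal sketch of that calculation, where the individual factor values and the 5.1 µmol J−1 package efficacy are illustrative assumptions chosen to land near the quoted blue + red limit, not figures taken from the review:

```python
# Sketch of the upper-limit fixture-efficacy calculation: package efficacy
# times four loss factors (current droop, thermal droop, driver losses,
# optical losses). All factor values below are illustrative assumptions.

def fixture_efficacy(package_efficacy_umol_per_j,
                     current_droop=0.95,
                     thermal_droop=0.95,
                     driver_efficiency=0.94,
                     optical_transmission=0.95,
                     waterproof_optics=False):
    """Upper-limit fixture efficacy (umol/J) given package efficacy and losses."""
    eff = (package_efficacy_umol_per_j * current_droop * thermal_droop
           * driver_efficiency * optical_transmission)
    if waterproof_optics:
        # the review notes protection from water/high humidity costs ~10%
        eff *= 0.90
    return eff

# Hypothetical blue+red package efficacy of 5.1 umol/J:
print(round(fixture_efficacy(5.1), 2))
```

With these assumed factors the four losses compound to roughly a 20% reduction, which is why the fixture limit sits well below the package efficacy.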

    Protocol for a mixed-methods exploratory investigation of care following intensive care discharge: the REFLECT study

    © Author(s) 2019. Re-use permitted under CC BY. Published by BMJ. INTRODUCTION: A substantial number of patients discharged from intensive care units (ICUs) subsequently die without leaving hospital. It is unclear how many of these deaths are preventable. Ward-based management following discharge from ICU is an area that patients and healthcare staff are concerned about. The primary aim of REFLECT (Recovery Following Intensive Care Treatment) is to develop an intervention plan to reduce in-hospital mortality rates in patients who have been discharged from ICU. METHODS AND ANALYSIS: REFLECT is a multicentre mixed-methods exploratory study examining ward care delivery to adult patients discharged from ICU. The study will be made up of four substudies. Medical notes of patients who were discharged from ICU and subsequently died will be examined using a retrospective case records review (RCRR) technique. Patients and their relatives will be interviewed about their post-ICU care, including relatives of patients who died in hospital following ICU discharge. Staff involved in the care of patients post-ICU discharge will be interviewed about the care of this patient group. The medical records of patients who survived their post-ICU stay will also be reviewed using the RCRR technique. The analyses of the substudies will be both descriptive and use a modified grounded theory approach to identify emerging themes. The evidence generated in these four substudies will form the basis of the intervention development, which will take place through stakeholder and clinical expert meetings. ETHICS AND DISSEMINATION: Ethical approval has been obtained through the Wales Research and Ethics Committee 4 (17/WA/0107). We aim to disseminate the findings through international conferences, international peer-reviewed journals and social media. TRIAL REGISTRATION NUMBER: ISRCTN14658054.

    A randomised controlled feasibility trial for an educational school-based mental health intervention: study protocol

    Background: With the burden of mental illness estimated to be costing the English economy alone around £22.5 billion a year [1], coupled with growing evidence that many mental disorders have their origins in adolescence, there is increasing pressure for schools to address the emotional well-being of their students, alongside the stigma and discrimination of mental illness. A number of prior educational interventions have been developed and evaluated for this purpose, but inconsistency of findings, reporting standards, and methodologies have led the majority of reviewers to conclude that the evidence for the efficacy of these programmes remains inconclusive. Methods/Design: A cluster randomised controlled trial design has been employed to enable a feasibility study of 'SchoolSpace', an intervention in 7 UK secondary schools addressing stigma of mental illness, mental health literacy, and promotion of mental health. A central aspect of the intervention involves students in the experimental condition interacting with a young person with lived experience of mental illness, a stigma reducing technique designed to facilitate students' engagement in the project. The primary outcome is the level of stigma related to mental illness. Secondary outcomes include mental health literacy, resilience to mental illness, and emotional well-being. Outcomes will be measured pre and post intervention, as well as at 6 month follow-up. Discussion: The proposed intervention presents the potential for increased engagement due to its combination of education and contact with a young person with lived experience of mental illness. Contact as a technique to reduce discrimination has been evaluated previously in research with adults, but has been employed in only a minority of research trials investigating the impact on youth. Prior to this study, the effect of contact on mental health literacy, resilience, and emotional well-being has not been evaluated to the authors' knowledge. 
If efficacious, the intervention could provide a reliable and cost-effective method to reduce stigma in young people, whilst increasing mental health literacy and emotional well-being. Trial registration: ISRCTN: ISRCTN0740602

    Effects of rare kidney diseases on kidney failure: a longitudinal analysis of the UK National Registry of Rare Kidney Diseases (RaDaR) cohort

    © 2024 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY 4.0 license. Background: Individuals with rare kidney diseases account for 5–10% of people with chronic kidney disease, but constitute more than 25% of patients receiving kidney replacement therapy. The National Registry of Rare Kidney Diseases (RaDaR) gathers longitudinal data from patients with these conditions, which we used to study disease progression and outcomes of death and kidney failure. Methods: People aged 0–96 years living with 28 types of rare kidney diseases were recruited from 108 UK renal care facilities. The primary outcomes were cumulative incidence of mortality and kidney failure in individuals with rare kidney diseases, which were calculated and compared with that of unselected patients with chronic kidney disease. Cumulative incidence and Kaplan–Meier survival estimates were calculated for the following outcomes: median age at kidney failure; median age at death; time from start of dialysis to death; and time from diagnosis to estimated glomerular filtration rate (eGFR) thresholds, allowing calculation of time from last eGFR of 75 mL/min per 1·73 m² or more to first eGFR of less than 30 mL/min per 1·73 m² (the therapeutic trial window). Findings: Between Jan 18, 2010, and July 25, 2022, 27 285 participants were recruited to RaDaR. Median follow-up time from diagnosis was 9·6 years (IQR 5·9–16·7). RaDaR participants had significantly higher 5-year cumulative incidence of kidney failure than 2·81 million UK patients with all-cause chronic kidney disease (28% vs 1%; p<0·0001), but better survival rates (standardised mortality ratio 0·42 [95% CI 0·32–0·52]; p<0·0001). Median age at kidney failure, median age at death, time from start of dialysis to death, time from diagnosis to eGFR thresholds, and therapeutic trial window all varied substantially between rare diseases. 
Interpretation: Patients with rare kidney diseases differ from the general population of individuals with chronic kidney disease: they have higher 5-year rates of kidney failure but higher survival than other patients with chronic kidney disease stages 3–5, and so are over-represented in the cohort of patients requiring kidney replacement therapy. Addressing unmet therapeutic need for patients with rare kidney diseases could have a large beneficial effect on long-term kidney replacement therapy demand. Funding: RaDaR is funded by the Medical Research Council, Kidney Research UK, Kidney Care UK, and the Polycystic Kidney Disease Charity
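The survival outcomes above (median age at kidney failure, time to eGFR thresholds) are estimated with Kaplan–Meier curves. As a minimal sketch of the estimator itself, not the RaDaR analysis code, with invented toy event times and censoring flags:

```python
# Minimal Kaplan-Meier survival estimator. At each distinct event time t,
# survival is multiplied by (1 - deaths_at_t / number_at_risk_at_t);
# censored observations count toward the risk set but trigger no step.

def kaplan_meier(times, events):
    """Return (time, survival) steps. events[i] = 1 means the event occurred
    at times[i]; 0 means the observation was censored then."""
    data = list(zip(times, events))
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        at_risk = sum(1 for tt, _ in data if tt >= t)
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
    return curve

# Toy data: 8 follow-up times in years, two of them censored
print(kaplan_meier([2, 3, 3, 5, 8, 8, 9, 12], [1, 1, 0, 1, 1, 1, 0, 1]))
```

Medians such as "median age at kidney failure" are then read off as the first time the survival curve drops to 0.5 or below.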

    Risk of adverse outcomes in patients with underlying respiratory conditions admitted to hospital with COVID-19: a national, multicentre prospective cohort study using the ISARIC WHO Clinical Characterisation Protocol UK

    Background Studies of patients admitted to hospital with COVID-19 have found varying mortality outcomes associated with underlying respiratory conditions and inhaled corticosteroid use. Using data from a national, multicentre, prospective cohort, we aimed to characterise people with COVID-19 admitted to hospital with underlying respiratory disease, assess the level of care received, measure in-hospital mortality, and examine the effect of inhaled corticosteroid use. Methods We analysed data from the International Severe Acute Respiratory and emerging Infection Consortium (ISARIC) WHO Clinical Characterisation Protocol UK (CCP-UK) study. All patients admitted to hospital with COVID-19 across England, Scotland, and Wales between Jan 17 and Aug 3, 2020, were eligible for inclusion in this analysis. Patients with asthma, chronic pulmonary disease, or both, were identified and stratified by age (<16 years, 16–49 years, and ≥50 years). In-hospital mortality was measured by use of multilevel Cox proportional hazards, adjusting for demographics, comorbidities, and medications (inhaled corticosteroids, short-acting β-agonists [SABAs], and long-acting β-agonists [LABAs]). Patients with asthma who were taking an inhaled corticosteroid plus LABA plus another maintenance asthma medication were considered to have severe asthma. Findings 75 463 patients from 258 participating health-care facilities were included in this analysis: 860 patients younger than 16 years (74 [8·6%] with asthma), 8950 patients aged 16–49 years (1867 [20·9%] with asthma), and 65 653 patients aged 50 years and older (5918 [9·0%] with asthma, 10 266 [15·6%] with chronic pulmonary disease, and 2071 [3·2%] with both asthma and chronic pulmonary disease). 
Patients with asthma were significantly more likely than those without asthma to receive critical care (patients aged 16–49 years: adjusted odds ratio [OR] 1·20 [95% CI 1·05–1·37]; p=0·0080; patients aged ≥50 years: adjusted OR 1·17 [1·08–1·27]; p<0·0001), and patients aged 50 years and older with chronic pulmonary disease (with or without asthma) were significantly less likely than those without a respiratory condition to receive critical care (adjusted OR 0·66 [0·60–0·72] for those without asthma and 0·74 [0·62–0·87] for those with asthma; p<0·0001 for both). In patients aged 16–49 years, only those with severe asthma had a significant increase in mortality compared to those with no asthma (adjusted hazard ratio [HR] 1·17 [95% CI 0·73–1·86] for those on no asthma therapy, 0·99 [0·61–1·58] for those on SABAs only, 0·94 [0·62–1·43] for those on inhaled corticosteroids only, 1·02 [0·67–1·54] for those on inhaled corticosteroids plus LABAs, and 1·96 [1·25–3·08] for those with severe asthma). Among patients aged 50 years and older, those with chronic pulmonary disease had a significantly increased mortality risk, regardless of inhaled corticosteroid use, compared to patients without an underlying respiratory condition (adjusted HR 1·16 [95% CI 1·12–1·22] for those not on inhaled corticosteroids, and 1·10 [1·04–1·16] for those on inhaled corticosteroids; p<0·0001). Patients aged 50 years and older with severe asthma also had an increased mortality risk compared to those not on asthma therapy (adjusted HR 1·24 [95% CI 1·04–1·49]). In patients aged 50 years and older, inhaled corticosteroid use within 2 weeks of hospital admission was associated with decreased mortality in those with asthma, compared to those without an underlying respiratory condition (adjusted HR 0·86 [95% CI 0·80−0·92]). Interpretation Underlying respiratory conditions are common in patients admitted to hospital with COVID-19. 
Regardless of the severity of symptoms at admission and comorbidities, patients with asthma were more likely, and those with chronic pulmonary disease less likely, to receive critical care than patients without an underlying respiratory condition. In patients aged 16 years and older, severe asthma was associated with increased mortality compared to non-severe asthma. In patients aged 50 years and older, inhaled corticosteroid use in those with asthma was associated with lower mortality than in patients without an underlying respiratory condition; patients with chronic pulmonary disease had significantly increased mortality compared to those with no underlying respiratory condition, regardless of inhaled corticosteroid use. Our results suggest that the use of inhaled corticosteroids, within 2 weeks of admission, improves survival for patients aged 50 years and older with asthma, but not for those with chronic pulmonary disease

    Development and validation of the ISARIC 4C Deterioration model for adults hospitalised with COVID-19: a prospective cohort study.

    BACKGROUND: Prognostic models to predict the risk of clinical deterioration in acute COVID-19 cases are urgently required to inform clinical management decisions. METHODS: We developed and validated a multivariable logistic regression model for in-hospital clinical deterioration (defined as any requirement of ventilatory support or critical care, or death) among consecutively hospitalised adults with highly suspected or confirmed COVID-19 who were prospectively recruited to the International Severe Acute Respiratory and Emerging Infections Consortium Coronavirus Clinical Characterisation Consortium (ISARIC4C) study across 260 hospitals in England, Scotland, and Wales. Candidate predictors that were specified a priori were considered for inclusion in the model on the basis of previous prognostic scores and emerging literature describing routinely measured biomarkers associated with COVID-19 prognosis. We used internal-external cross-validation to evaluate discrimination, calibration, and clinical utility across eight National Health Service (NHS) regions in the development cohort. We further validated the final model in held-out data from an additional NHS region (London). FINDINGS: 74 944 participants (recruited between Feb 6 and Aug 26, 2020) were included, of whom 31 924 (43·2%) of 73 948 with available outcomes met the composite clinical deterioration outcome. In internal-external cross-validation in the development cohort of 66 705 participants, the selected model (comprising 11 predictors routinely measured at the point of hospital admission) showed consistent discrimination, calibration, and clinical utility across all eight NHS regions. In held-out data from London (n=8239), the model showed a similarly consistent performance (C-statistic 0·77 [95% CI 0·76 to 0·78]; calibration-in-the-large 0·00 [-0·05 to 0·05]); calibration slope 0·96 [0·91 to 1·01]), and greater net benefit than any other reproducible prognostic model. 
INTERPRETATION: The 4C Deterioration model has strong potential for clinical utility and generalisability to predict clinical deterioration and inform decision making among adults hospitalised with COVID-19. FUNDING: National Institute for Health Research (NIHR), UK Medical Research Council, Wellcome Trust, Department for International Development, Bill & Melinda Gates Foundation, EU Platform for European Preparedness Against (Re-)emerging Epidemics, NIHR Health Protection Research Unit (HPRU) in Emerging and Zoonotic Infections at University of Liverpool, NIHR HPRU in Respiratory Infections at Imperial College London
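Discrimination in the held-out validation above is reported as a C-statistic of 0·77: the probability that a randomly chosen patient who deteriorated was assigned a higher predicted risk than one who did not. A minimal pairwise implementation of that statistic (equivalent to the AUC for a binary outcome), with invented toy predictions:

```python
# C-statistic (concordance) for binary outcomes: the fraction of
# (event, non-event) pairs in which the event patient has the higher
# predicted risk; ties count one half.

def c_statistic(risks, outcomes):
    """Concordance between predicted risks and binary outcomes (1 = event)."""
    event_risks = [r for r, y in zip(risks, outcomes) if y == 1]
    nonevent_risks = [r for r, y in zip(risks, outcomes) if y == 0]
    pairs = concordant = 0.0
    for re in event_risks:
        for rn in nonevent_risks:
            pairs += 1
            if re > rn:
                concordant += 1
            elif re == rn:
                concordant += 0.5
    return concordant / pairs

risks = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]   # toy model predictions
outcomes = [1, 1, 0, 1, 0, 0]            # toy observed deterioration
print(c_statistic(risks, outcomes))
```

The O(n²) pairwise loop is fine for a sketch; production implementations rank the predictions instead to get the same value in O(n log n).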

    Importance of patient bed pathways and length of stay differences in predicting COVID-19 hospital bed occupancy in England.

    Background: Predicting bed occupancy for hospitalised patients with COVID-19 requires understanding of length of stay (LoS) in particular bed types. LoS can vary depending on the patient’s “bed pathway” - the sequence of transfers of individual patients between bed types during a hospital stay. In this study, we characterise these pathways, and their impact on predicted hospital bed occupancy. Methods: We obtained data from University College Hospital (UCH) and the ISARIC4C COVID-19 Clinical Information Network (CO-CIN) on hospitalised patients with COVID-19 who required care in general ward or critical care (CC) beds to determine possible bed pathways and LoS. We developed a discrete-time model to examine the implications of using either bed pathways or only average LoS by bed type to forecast bed occupancy. We compared model-predicted bed occupancy to publicly available bed occupancy data on COVID-19 in England between March and August 2020. Results: In both the UCH and CO-CIN datasets, 82% of hospitalised patients with COVID-19 only received care in general ward beds. We identified four other bed pathways, present in both datasets: “Ward, CC, Ward”, “Ward, CC”, “CC” and “CC, Ward”. Mean LoS varied by bed type, pathway, and dataset, between 1.78 and 13.53 days. For UCH, we found that using bed pathways improved the accuracy of bed occupancy predictions, while only using an average LoS for each bed type underestimated true bed occupancy. However, using the CO-CIN LoS dataset we were not able to replicate past data on bed occupancy in England, suggesting regional LoS heterogeneities. Conclusions: We identified five bed pathways, with substantial variation in LoS by bed type, pathway, and geography. This might be caused by local differences in patient characteristics, clinical care strategies, or resource availability, and suggests that national LoS averages may not be appropriate for local forecasts of bed occupancy for COVID-19. 
Trial registration: The ISARIC WHO CCP-UK study ISRCTN66726260 was retrospectively registered on 21/04/2020 and designated an Urgent Public Health Research Study by NIHR.
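The pathway-based forecast described above can be sketched as a discrete-time accumulation: each admission follows one of the five bed pathways (a sequence of bed-type stages, each with a length of stay), and expected occupancy per bed type per day is summed over admissions. The pathway probabilities and LoS values below are invented for illustration; the study reports only that mean LoS ranged between 1.78 and 13.53 days, not these numbers:

```python
# Discrete-time expected-occupancy sketch driven by bed pathways.
# A pathway is (probability, [(bed_type, los_days), ...]); stages are
# occupied back-to-back from the day of admission.

from collections import defaultdict

# Illustrative values only (five pathways as in the study's structure)
PATHWAYS = [
    (0.82, [("Ward", 7)]),
    (0.06, [("Ward", 3), ("CC", 8), ("Ward", 5)]),
    (0.05, [("Ward", 3), ("CC", 10)]),
    (0.04, [("CC", 9)]),
    (0.03, [("CC", 9), ("Ward", 6)]),
]

def expected_occupancy(daily_admissions):
    """Expected occupancy keyed by (day, bed_type), for a list giving the
    number of admissions on each day."""
    occ = defaultdict(float)
    for day, n in enumerate(daily_admissions):
        for prob, stages in PATHWAYS:
            start = day
            for bed_type, los in stages:
                for d in range(start, start + los):
                    occ[(d, bed_type)] += n * prob
                start += los
    return occ

# Two weeks of a constant 10 admissions/day:
occ = expected_occupancy([10] * 14)
print(occ[(13, "Ward")], occ[(13, "CC")])
```

Collapsing the pathways to a single average LoS per bed type discards the Ward→CC→Ward back-transfers, which is the mechanism the study identifies for the underestimate in the averaged model.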

    Use of anticoagulants and antiplatelet agents in stable outpatients with coronary artery disease and atrial fibrillation. International CLARIFY registry


    Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition)

    In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to delete or knock down more than one autophagy-related gene. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, so not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.