Investigating the use of visible and near infrared spectroscopy to predict sensory and texture attributes of beef M. longissimus thoracis et lumborum
The aim of this study was to calibrate chemometric models to predict beef M. longissimus thoracis et lumborum (LTL) sensory and textural values using visible-near infrared (VISNIR) spectroscopy. Spectra were collected on the cut surface of LTL steaks both on-line and off-line. Cooked LTL steaks were analysed by a trained beef sensory panel and underwent Warner-Bratzler shear force (WBSF) analysis. The best coefficients of determination of cross validation (R2CV) in the current study were for textural traits (WBSF = 0.22; stringiness = 0.22; crumbly texture = 0.41; all three models calibrated using 48 h post-mortem spectra) and some sensory flavour traits (fatty mouthfeel = 0.23; fatty after-effect = 0.28; both calibrated using 49 h post-mortem spectra). The results of this experiment indicate that VISNIR spectroscopy has potential to predict a range of sensory traits (particularly textural traits) with an acceptable level of accuracy at specific post-mortem times. This work is funded by the BreedQuality project (11/SF/311), which is supported by the Irish Department of Agriculture, Food and the Marine (DAFM) under the National Development Plan 2007–2013.
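For readers unfamiliar with how an R2CV value of this kind is obtained, the sketch below shows a cross-validated chemometric calibration. Partial least squares regression is assumed here only because it is a common choice for VISNIR spectra; the abstract does not name the algorithm, and the arrays are random placeholders.

```python
# A minimal sketch of a cross-validated chemometric calibration, assuming a partial
# least squares (PLS) regression model. The spectra and reference trait below are
# random placeholders, not the study's data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 200))   # 120 steaks x 200 VISNIR wavelengths (placeholder)
y = rng.normal(size=120)          # e.g. WBSF or a panel texture score (placeholder)

pls = PLSRegression(n_components=10)
y_cv = cross_val_predict(pls, X, y, cv=10)   # predictions from 10-fold cross validation
r2_cv = r2_score(y, y_cv)                    # coefficient of determination of cross validation (R2CV)
print(f"R2CV = {r2_cv:.2f}")
```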
Implementation and evaluation of a nurse-centered computerized potassium regulation protocol in the intensive care unit - a before and after analysis
Background: Potassium disorders can cause major complications and must be avoided in critically ill patients. Regulation of potassium in the intensive care unit (ICU) requires potassium administration with frequent blood potassium measurements and subsequent adjustments of the amount of potassium administered. The use of a potassium replacement protocol can improve potassium regulation. For safety and efficiency, computerized protocols appear to be superior to paper protocols. The aim of this study was to evaluate whether a computerized potassium regulation protocol in the ICU improved potassium regulation. Methods: In our surgical ICU (12 beds) and cardiothoracic ICU (14 beds) at a tertiary academic center, we implemented a nurse-centered computerized potassium protocol integrated with the pre-existing glucose control program GRIP (Glucose Regulation in Intensive Care patients). Before implementation of the computerized protocol, potassium replacement was physician-driven. Potassium was delivered continuously either by central venous catheter or by gastric, duodenal or jejunal tube. After every potassium measurement, nurses received a recommendation for the potassium administration rate and the time to the next measurement. In this before-after study we evaluated potassium regulation with GRIP. The attitude of the nursing staff towards potassium regulation with computer support was measured with questionnaires. Results: The patient cohort consisted of 775 patients before and 1435 patients after the implementation of computerized potassium control. The numbers of patients with hypokalemia (<3.5 mmol/L) and hyperkalemia (>5.0 mmol/L) were recorded, as well as the time course of potassium levels after ICU admission, and the incidence of hypokalemia and hyperkalemia was calculated. Median potassium levels were similar in both study periods, but the level of potassium control improved: the incidence of hypokalemia decreased from 2.4% to 1.7% (P < 0.001) and hyperkalemia from 7.4% to 4.8% (P < 0.001). Nurses indicated that they considered computerized potassium control an improvement over previous practice. Conclusions: Computerized potassium control, integrated with the nurse-centered GRIP program for glucose regulation, is effective and reduces the prevalence of hypo- and hyperkalemia in the ICU compared with physician-driven potassium regulation.
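To illustrate the kind of recommendation step described above (a measured potassium level mapped to an administration rate and a re-check time), here is a minimal rule-based sketch. The thresholds, rates and intervals are hypothetical placeholders; the abstract does not give the actual GRIP rules.

```python
# Illustrative sketch of a rule-based potassium recommendation step, in the spirit of
# the protocol described above. The thresholds, rates, and intervals are hypothetical;
# the abstract does not specify the actual GRIP rules.
from dataclasses import dataclass

@dataclass
class Advice:
    rate_mmol_per_h: float   # recommended continuous potassium administration rate
    next_check_h: int        # recommended time to the next blood potassium measurement

def recommend(potassium_mmol_l: float) -> Advice:
    """Map a measured blood potassium level to an administration rate and re-check time."""
    if potassium_mmol_l < 3.0:      # marked hypokalemia: replace faster, re-check soon
        return Advice(rate_mmol_per_h=10.0, next_check_h=2)
    if potassium_mmol_l < 3.5:      # mild hypokalemia
        return Advice(rate_mmol_per_h=6.0, next_check_h=4)
    if potassium_mmol_l <= 5.0:     # target range: maintenance dosing
        return Advice(rate_mmol_per_h=3.0, next_check_h=6)
    return Advice(rate_mmol_per_h=0.0, next_check_h=2)   # hyperkalemia: stop and re-check

print(recommend(3.2))   # Advice(rate_mmol_per_h=6.0, next_check_h=4)
```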
Consensus guidelines on analgesia and sedation in dying intensive care unit patients
BACKGROUND: Intensivists must provide enough analgesia and sedation to ensure dying patients receive good palliative care. However, if it is perceived that too much is given, they risk prosecution for committing euthanasia. The goal of this study was to develop consensus guidelines on analgesia and sedation in dying intensive care unit patients that help distinguish palliative care from euthanasia. METHODS: Using the Delphi technique, panelists rated levels of agreement with statements describing how analgesics and sedatives should be given to dying ICU patients and how palliative care should be distinguished from euthanasia. Participants were drawn from 3 panels: 1) Canadian Academic Adult Intensive Care Fellowship program directors and Intensive Care division chiefs (N = 9); 2) Deputy chief provincial coroners (N = 5); 3) Validation panel of Intensivists attending the Canadian Critical Care Trials Group meeting (N = 12). RESULTS: After three Delphi rounds, consensus was achieved on 16 statements encompassing the role of palliative care in the intensive care unit, the management of pain and suffering, current areas of controversy, and ways of improving palliative care in the ICU. CONCLUSION: Consensus guidelines were developed to guide the administration of analgesics and sedatives to dying ICU patients and to help distinguish palliative care from euthanasia.
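As an illustration of how agreement on Delphi statements can be summarised between rounds, the sketch below applies a simple consensus rule. The 7-point scale, the 80% threshold and the ratings are hypothetical; the abstract does not report the exact rule used in this study.

```python
# Minimal sketch of summarising Delphi-round agreement. The scale, threshold and
# ratings are hypothetical placeholders, not the study's actual consensus rule.
import numpy as np

def consensus(ratings: np.ndarray, agree_min: int = 5, threshold: float = 0.80) -> bool:
    """A statement reaches consensus if at least `threshold` of panelists rate it
    `agree_min` or higher on a 7-point Likert scale."""
    return np.mean(ratings >= agree_min) >= threshold

# rows = statements, columns = panelists (illustrative ratings for one Delphi round)
round_ratings = np.array([
    [6, 7, 5, 6, 7, 6, 5, 6, 7],   # statement 1
    [3, 4, 5, 2, 6, 4, 3, 5, 4],   # statement 2
])
for i, ratings in enumerate(round_ratings, start=1):
    print(f"statement {i}: consensus = {consensus(ratings)}")
```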
Long-term reductions in tinnitus severity
BACKGROUND: This study was undertaken to assess long-term changes in tinnitus severity exhibited by patients who completed a comprehensive tinnitus management program; to identify factors that contributed to changes in tinnitus severity within this population; to contribute to the development and refinement of effective assessment and management procedures for tinnitus. METHODS: Detailed questionnaires were mailed to 300 consecutive patients prior to their initial appointment at the Oregon Health & Science University Tinnitus Clinic. All patients were then evaluated and treated within a comprehensive tinnitus management program. Follow-up questionnaires were mailed to the same 300 patients 6 to 36 months after their initial tinnitus clinic appointment. RESULTS: One hundred ninety patients (133 males, 57 females; mean age 57 years) returned follow-up questionnaires 6 to 36 months (mean = 22 months) after their initial tinnitus clinic appointment. This group of patients exhibited significant long-term reductions in self-rated tinnitus loudness, Tinnitus Severity Index scores, tinnitus-related anxiety and prevalence of current depression. Patients who improved their sleep patterns or Beck Depression Inventory scores exhibited greater reductions of tinnitus severity scores than patients who continued to experience insomnia and depression at follow-up. CONCLUSIONS: Individualized tinnitus management programs that were designed for each patient contributed to overall reductions in tinnitus severity exhibited on follow-up questionnaires. Identification and treatment of patients experiencing anxiety, insomnia or depression are vital components of an effective tinnitus management program. Utilization of acoustic therapy also contributed to improvements exhibited by these patients.
Clinical practice guidelines for the diagnosis and surveillance of BAP1 tumour predisposition syndrome
BRCA1-associated protein-1 (BAP1) is a recognised tumour suppressor gene. Germline BAP1 pathogenic/likely pathogenic variants are associated with predisposition to multiple tumours, including uveal melanoma, malignant pleural and peritoneal mesothelioma, renal cell carcinoma and specific non-malignant neoplasms of the skin, as part of the autosomal dominant BAP1-tumour predisposition syndrome. The overall lifetime risk for BAP1 carriers to develop at least one BAP1-associated tumour is up to 85%, although due to ascertainment bias, current estimates of risk are likely to be overestimated. As for many rare cancer predisposition syndromes, there is limited scientific evidence to support the utility of surveillance and, therefore, management recommendations for BAP1 carriers are based on expert opinion. To date, European recommendations for BAP1 carriers have not been published but are necessary due to the emerging phenotype of this recently described syndrome and the increased identification of BAP1 carriers via large gene panels or tumour sequencing. To address this, the Clinical Guideline Working Group of the CanGene-CanVar project in the United Kingdom invited European collaborators to develop guidelines to harmonize surveillance programmes within Europe. Recommendations with respect to BAP1 testing and surveillance were agreed following a literature review and a Delphi survey completed by a core group and an extended expert group of 34 European specialists including geneticists, ophthalmologists, oncologists, dermatologists and pathologists. It is recognised that these pragmatic, largely expert opinion-based recommendations will evolve over time as further data from research collaborations inform the phenotypic spectrum and surveillance outcomes.
Genetic mechanisms of critical illness in COVID-19.
Host-mediated lung inflammation is present [1], and drives mortality [2], in the critical illness caused by coronavirus disease 2019 (COVID-19). Host genetic variants associated with critical illness may identify mechanistic targets for therapeutic development [3]. Here we report the results of the GenOMICC (Genetics Of Mortality In Critical Care) genome-wide association study in 2,244 critically ill patients with COVID-19 from 208 UK intensive care units. We have identified and replicated the following new genome-wide significant associations: on chromosome 12q24.13 (rs10735079, P = 1.65 × 10⁻⁸) in a gene cluster that encodes antiviral restriction enzyme activators (OAS1, OAS2 and OAS3); on chromosome 19p13.2 (rs74956615, P = 2.3 × 10⁻⁸) near the gene that encodes tyrosine kinase 2 (TYK2); on chromosome 19p13.3 (rs2109069, P = 3.98 × 10⁻¹²) within the gene that encodes dipeptidyl peptidase 9 (DPP9); and on chromosome 21q22.1 (rs2236757, P = 4.99 × 10⁻⁸) in the interferon receptor gene IFNAR2. We identified potential targets for repurposing of licensed medications: using Mendelian randomization, we found evidence that low expression of IFNAR2, or high expression of TYK2, is associated with life-threatening disease; and transcriptome-wide association in lung tissue revealed that high expression of the monocyte-macrophage chemotactic receptor CCR2 is associated with severe COVID-19. Our results identify robust genetic signals relating to key host antiviral defence mechanisms and mediators of inflammatory organ damage in COVID-19. Both mechanisms may be amenable to targeted treatment with existing drugs. However, large-scale randomized clinical trials will be essential before any change to clinical practice.
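The Mendelian randomization step mentioned above relates genetically predicted expression to disease risk. The sketch below shows the basic single-instrument (Wald ratio) estimator with made-up numbers; it illustrates the technique only and is not the study's pipeline or data.

```python
# Minimal sketch of a single-instrument (Wald ratio) Mendelian randomization estimate
# of the kind used to link gene expression (e.g. IFNAR2, TYK2) to disease risk.
# The effect sizes and standard errors below are made-up placeholders.
def wald_ratio(beta_gx: float, beta_gy: float, se_gy: float):
    """Causal effect of exposure X on outcome Y using one genetic instrument G.
    beta_gx: SNP effect on exposure (e.g. expression); beta_gy: SNP effect on outcome
    (log-odds). Returns (estimate, first-order standard error)."""
    est = beta_gy / beta_gx
    se = se_gy / abs(beta_gx)      # first-order (delta-method) approximation
    return est, se

est, se = wald_ratio(beta_gx=0.30, beta_gy=-0.12, se_gy=0.03)
z = est / se
print(f"MR estimate = {est:.2f} (SE {se:.2f}), z = {z:.1f}")
```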
Common, low-frequency, rare, and ultra-rare coding variants contribute to COVID-19 severity
The combined impact of common and rare exonic variants in COVID-19 host genetics is currently insufficiently understood. Here, common and rare variants from whole-exome sequencing data of about 4000 SARS-CoV-2-positive individuals were used to define an interpretable machine-learning model for predicting COVID-19 severity. First, variants were converted into separate sets of Boolean features, depending on the absence or presence of variants in each gene. An ensemble of LASSO logistic regression models was used to identify the most informative Boolean features with respect to the genetic bases of severity. The Boolean features selected by these logistic models were combined into an Integrated PolyGenic Score that offers a synthetic and interpretable index for describing the contribution of host genetics to COVID-19 severity, as demonstrated through testing in several independent cohorts. Selected features belong to ultra-rare, rare, low-frequency, and common variants, including those in linkage disequilibrium with known GWAS loci. Notably, around one quarter of the selected genes are sex-specific. Pathway analysis of the selected genes associated with COVID-19 severity reflected the multi-organ nature of the disease. The proposed model might provide useful information for developing diagnostics and therapeutics, while also being able to guide bedside disease management.
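A simplified sketch of the described pipeline follows: gene-level Boolean features, an ensemble of LASSO-penalised logistic regressions fitted to bootstrap resamples, and a combined score built from consistently selected features. The data, gene count, hyperparameters and scoring rule are placeholders, not the study's implementation.

```python
# Simplified sketch of an ensemble of LASSO logistic regressions over gene-level
# Boolean features, with consistently selected genes combined into a single score.
# All data and settings below are illustrative placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_samples, n_genes = 500, 300
X = rng.integers(0, 2, size=(n_samples, n_genes)).astype(float)  # 1 = qualifying variant present in gene
y = rng.integers(0, 2, size=n_samples)                           # 1 = severe COVID-19 (placeholder labels)

n_models = 50
coefs = np.zeros((n_models, n_genes))
for b in range(n_models):
    idx = rng.choice(n_samples, size=n_samples, replace=True)    # bootstrap resample
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    model.fit(X[idx], y[idx])
    coefs[b] = model.coef_[0]

selection_freq = np.mean(coefs != 0, axis=0)      # how often each gene is retained by the LASSO
selected = selection_freq >= 0.8                  # keep consistently selected genes
weights = np.sign(np.median(coefs[:, selected], axis=0))
score = X[:, selected] @ weights                  # a simple integrated polygenic-style score per individual
print(f"{selected.sum()} genes selected; score range {score.min():.0f} to {score.max():.0f}")
```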
Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study
Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world.
Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231.
Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001).
Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in low- and middle-income countries (LMICs) are needed to assess measures aiming to reduce this preventable complication.
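The Methods above describe Bayesian multilevel logistic regression with risk adjustment. The sketch below shows one plausible form of such a model (hospital-level random intercepts plus an HDI-group effect) on simulated data; it is illustrative only and does not reproduce the study's specification, priors or covariates.

```python
# Minimal sketch of a Bayesian multilevel logistic regression for SSI risk, with
# hospital random intercepts and an HDI-group effect. Data, priors and structure
# are illustrative placeholders.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_patients, n_hospitals = 2000, 60
hospital = rng.integers(0, n_hospitals, size=n_patients)       # hospital index per patient
hdi_group = rng.integers(0, 3, size=n_patients)                # 0 = high, 1 = middle, 2 = low HDI
ssi = rng.binomial(1, 0.12, size=n_patients)                   # placeholder 30-day SSI outcomes

with pm.Model() as model:
    intercept = pm.Normal("intercept", 0.0, 2.0)
    beta_hdi = pm.Normal("beta_hdi", 0.0, 1.0, shape=3)        # log-odds shift per HDI group
    sigma_hosp = pm.HalfNormal("sigma_hosp", 1.0)
    u_hosp = pm.Normal("u_hosp", 0.0, sigma_hosp, shape=n_hospitals)  # hospital random intercepts

    logit_p = intercept + beta_hdi[hdi_group] + u_hosp[hospital]
    pm.Bernoulli("ssi", logit_p=logit_p, observed=ssi)

    idata = pm.sample(1000, tune=1000, chains=2, target_accept=0.9)

# Adjusted odds ratio for low- vs high-HDI countries (posterior summary)
or_low_vs_high = np.exp(idata.posterior["beta_hdi"][..., 2] - idata.posterior["beta_hdi"][..., 0])
print(float(or_low_vs_high.mean()))
```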
Duration of androgen deprivation therapy with postoperative radiotherapy for prostate cancer: a comparison of long-course versus short-course androgen deprivation therapy in the RADICALS-HD randomised trial
Background
Previous evidence supports androgen deprivation therapy (ADT) with primary radiotherapy as initial treatment for intermediate-risk and high-risk localised prostate cancer. However, the use and optimal duration of ADT with postoperative radiotherapy after radical prostatectomy remain uncertain.
Methods
RADICALS-HD was a randomised controlled trial of ADT duration within the RADICALS protocol. Here, we report on the comparison of short-course versus long-course ADT. Key eligibility criteria were an indication for radiotherapy after previous radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to add 6 months of ADT (short-course ADT) or 24 months of ADT (long-course ADT) to radiotherapy, using subcutaneous gonadotrophin-releasing hormone analogue (monthly in the short-course ADT group and 3-monthly in the long-course ADT group), daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as metastasis arising from prostate cancer or death from any cause. The comparison had more than 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 75% to 81% (hazard ratio [HR] 0·72). Standard time-to-event analyses were used. Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047.
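As a quick check on the design figures above, the target hazard ratio can be related to the stated survival improvement under a proportional-hazards assumption; the worked equation below is a back-of-the-envelope illustration, not the trial's actual power calculation.

```latex
% Under proportional hazards, S_1(t) = S_0(t)^{HR}, so the hazard ratio implied by an
% improvement in 10-year metastasis-free survival from 75% to 81% is
\[
  \mathrm{HR} = \frac{\ln S_1(10)}{\ln S_0(10)}
              = \frac{\ln 0.81}{\ln 0.75}
              \approx \frac{-0.211}{-0.288}
              \approx 0.73 ,
\]
% which is close to the 0.72 quoted in the power calculation; the small difference
% reflects the trial's exact design assumptions.
```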
Findings
Between Jan 30, 2008, and July 7, 2015, 1523 patients (median age 65 years, IQR 60–69) were randomly assigned to receive short-course ADT (n=761) or long-course ADT (n=762) in addition to postoperative radiotherapy at 138 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 8·9 years (7·0–10·0), 313 metastasis-free survival events were reported overall (174 in the short-course ADT group and 139 in the long-course ADT group; HR 0·773 [95% CI 0·612–0·975]; p=0·029). 10-year metastasis-free survival was 71·9% (95% CI 67·6–75·7) in the short-course ADT group and 78·1% (74·2–81·5) in the long-course ADT group. Toxicity of grade 3 or higher was reported for 105 (14%) of 753 participants in the short-course ADT group and 142 (19%) of 757 participants in the long-course ADT group (p=0·025), with no treatment-related deaths.
Interpretation
Compared with adding 6 months of ADT, adding 24 months of ADT improved metastasis-free survival in people receiving postoperative radiotherapy. For individuals who can accept the additional duration of adverse effects, long-course ADT should be offered with postoperative radiotherapy.
Funding
Cancer Research UK, UK Research and Innovation (formerly Medical Research Council), and Canadian Cancer Society