48 research outputs found

    Aspirin in primary prevention of cardiovascular disease and cancer: a systematic review of the balance of evidence from reviews of randomized trials

    Background: Aspirin has been recommended for primary prevention of cardiovascular disease (CVD) and cancer, but overall benefits are unclear. We aimed to use novel methods to re-evaluate the balance of benefits and harms of aspirin using evidence from randomised controlled trials, systematic reviews and meta-analyses. Methods and Findings: Data sources included ten electronic bibliographic databases, contact with experts, and scrutiny of reference lists of included studies. Searches were undertaken in September 2012 and restricted to publications since 2008. Of 2,572 potentially relevant papers, 27 met the inclusion criteria. Meta-analysis of control arms to estimate event rates, modelling of all-cause mortality and L'Abbé plots to estimate heterogeneity were undertaken. Absolute benefits and harms were low: 60-84 major CVD events and 34-36 colorectal cancer deaths per 100,000 person-years were averted, whereas 46-49 major bleeds and 68-117 gastrointestinal bleeds were incurred. Reductions in all-cause mortality were minor and uncertain (Hazard Ratio 0.96, 95% CI: 0.90-1.02 at 20 years; Relative Risk [RR] 0.94, 95% CI: 0.88-1.00 at 8 years); there was a non-significant change in total CVD (RR 0.85, 95% CI: 0.69-1.06), and the change in total cancer mortality ranged from 0.76 (95% CI: 0.66-0.88) to 0.93 (95% CI: 0.84-1.03) depending on follow-up time and studies included. Risks were increased by 37% for gastrointestinal bleeds (RR 1.37, 95% CI: 1.15-1.62), 54%-66% for major bleeds (Rate Ratio from IPD analysis 1.54, 95% CI: 1.30-1.82, and RR 1.62, 95% CI: 1.31-2.00), and 32%-38% for haemorrhagic stroke (Rate Ratio from IPD analysis 1.32, 95% CI: 1.00-1.74; RR 1.38, 95% CI: 1.01-1.82). Conclusions: Findings indicate small absolute effects of aspirin relative to the burden of these diseases. When aspirin is used for primary prevention of CVD, the absolute harms exceed the benefits. Estimates of cancer benefit rely on selective retrospective re-analysis of RCTs, and more information is needed.
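The absolute figures in this abstract follow directly from control-arm event rates and relative risks: events averted per 100,000 person-years = baseline rate × (1 − RR). A minimal sketch of that arithmetic, where the 400-per-100,000 baseline CVD event rate is an assumed figure for illustration and only the RR point estimate of 0.85 comes from the abstract:

```python
def events_averted_per_100k(control_rate_per_100k, rr):
    """Absolute events averted per 100,000 person-years, given the
    control-arm event rate and the relative risk under treatment."""
    return control_rate_per_100k * (1.0 - rr)

# Hypothetical baseline of 400 major CVD events per 100,000
# person-years combined with RR = 0.85 (total CVD point estimate
# from the abstract) yields 60 events averted per 100,000
# person-years -- the lower end of the 60-84 range reported.
print(events_averted_per_100k(400, 0.85))
```

This is why the review can report small absolute effects despite relative risk reductions of 15% or more: the absolute benefit scales with the (low) baseline event rate.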

    Financial feasibility of end-user designed rainwater harvesting and greywater reuse systems for high water use households

    © 2017, The Author(s). Water availability pressures, competing end-uses and sewers at capacity are all drivers for change in urban water management. Rainwater harvesting (RWH) and greywater reuse (GWR) systems constitute alternatives to reduce drinking water usage and, in the case of RWH, reduce roof runoff entering sewers. Despite the increasing popularity of installations in commercial buildings, RWH and GWR technologies at a household scale have proved less popular across a range of global contexts. For systems designed from the top-down, this is often due to the lack of a favourable cost-benefit balance (where subsidies are unavailable), though few studies have focused on performing full capital and operational financial assessments, particularly in high water consumption households. Using a bottom-up design approach, based on a questionnaire survey of 35 households in a residential complex in Bucaramanga, Colombia, this article considers the initial financial feasibility of three RWH and GWR system configurations proposed for high water use households (equivalent to >203 L per capita per day). A full capital and operational financial assessment was performed at a more detailed level for the most viable design using historic rainfall data. For the selected configuration (‘Alt 2’), the estimated potable water saving was 44% (equivalent to 131 m3/year), with a rate of return on investment of 6.5% and an estimated payback period of 23 years. As an initial end-user-driven design exercise, these results are promising and constitute a starting point for facilitating such approaches to urban water management at the household scale.
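The payback-period figure comes from a simple capital-versus-net-annual-saving calculation. A hedged sketch of that calculation: only the 131 m3/year potable-water saving is taken from the abstract; the tariff, capital cost and maintenance cost below are assumed figures chosen purely so the example reproduces a payback of roughly 23 years.

```python
def simple_payback_years(capital_cost, annual_saving, annual_opex=0.0):
    """Simple (undiscounted) payback: years for cumulative net
    savings to repay the initial capital outlay."""
    net_annual = annual_saving - annual_opex
    if net_annual <= 0:
        raise ValueError("system never pays back")
    return capital_cost / net_annual

# Assumed inputs for illustration (not from the study):
tariff = 2.0              # water tariff, currency units per m3
saving = 131 * tariff     # annual saving from the 131 m3/year saved
capital = 4000.0          # assumed installed cost of the RWH/GWR system
opex = 88.0               # assumed annual maintenance cost

print(simple_payback_years(capital, saving, opex))  # ~23 years
```

A discounted (NPV-based) payback would be longer; the abstract's 6.5% rate of return suggests the study used a fuller operational cash-flow model than this undiscounted sketch.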

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013), with the proportion of operations lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
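The AUC used to validate the risk score has a direct probabilistic reading: it is the probability that a randomly chosen long (> 90 min) operation receives a higher score than a randomly chosen short one. A minimal sketch of that computation on hypothetical scores (the score values below are invented for illustration, not taken from the study):

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the probability that a randomly chosen positive case
    (long operation) outscores a randomly chosen negative case
    (short operation); ties count as 0.5."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores: higher should mean more likely > 90 min.
long_ops  = [7, 5, 6, 4, 8]   # scores of operations that ran long
short_ops = [2, 3, 5, 1, 4]   # scores of operations that finished early
print(roc_auc(long_ops, short_ops))  # -> 0.92
```

An AUC of 0.708, as reported, means a long operation outscores a short one about 71% of the time; useful discrimination for list planning, but well short of the perfect 1.0.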

    Genetic mechanisms of critical illness in COVID-19.

    Host-mediated lung inflammation is present¹, and drives mortality², in the critical illness caused by coronavirus disease 2019 (COVID-19). Host genetic variants associated with critical illness may identify mechanistic targets for therapeutic development³. Here we report the results of the GenOMICC (Genetics Of Mortality In Critical Care) genome-wide association study in 2,244 critically ill patients with COVID-19 from 208 UK intensive care units. We have identified and replicated the following new genome-wide significant associations: on chromosome 12q24.13 (rs10735079, P = 1.65 × 10⁻⁸) in a gene cluster that encodes antiviral restriction enzyme activators (OAS1, OAS2 and OAS3); on chromosome 19p13.2 (rs74956615, P = 2.3 × 10⁻⁸) near the gene that encodes tyrosine kinase 2 (TYK2); on chromosome 19p13.3 (rs2109069, P = 3.98 × 10⁻¹²) within the gene that encodes dipeptidyl peptidase 9 (DPP9); and on chromosome 21q22.1 (rs2236757, P = 4.99 × 10⁻⁸) in the interferon receptor gene IFNAR2. We identified potential targets for repurposing of licensed medications: using Mendelian randomization, we found evidence that low expression of IFNAR2, or high expression of TYK2, is associated with life-threatening disease; and transcriptome-wide association in lung tissue revealed that high expression of the monocyte-macrophage chemotactic receptor CCR2 is associated with severe COVID-19. Our results identify robust genetic signals relating to key host antiviral defence mechanisms and mediators of inflammatory organ damage in COVID-19. Both mechanisms may be amenable to targeted treatment with existing drugs. However, large-scale randomized clinical trials will be essential before any change to clinical practice.
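"Genome-wide significant" conventionally means P < 5 × 10⁻⁸, a Bonferroni-style correction for roughly one million independent common-variant tests. A small sketch checking the four reported associations against that conventional threshold (the SNP labels and P values are taken from the abstract; the threshold is the standard convention, not stated in the abstract itself):

```python
GWAS_THRESHOLD = 5e-8  # conventional genome-wide significance level

def genome_wide_significant(p_value, threshold=GWAS_THRESHOLD):
    """True if a GWAS association passes the conventional threshold."""
    return p_value < threshold

# The four replicated associations reported in the abstract:
hits = {
    "rs10735079 (OAS cluster, 12q24.13)": 1.65e-8,
    "rs74956615 (TYK2, 19p13.2)":         2.3e-8,
    "rs2109069 (DPP9, 19p13.3)":          3.98e-12,
    "rs2236757 (IFNAR2, 21q22.1)":        4.99e-8,
}
for snp, p in hits.items():
    print(snp, genome_wide_significant(p))  # all True
```

Note how close rs2236757 (4.99 × 10⁻⁸) sits to the threshold, which is why replication in an independent cohort, as the study performed, matters.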

    Burnout among surgeons before and during the SARS-CoV-2 pandemic: an international survey

    Background: The SARS-CoV-2 pandemic has had many significant impacts within the surgical realm, and surgeons have been obligated to reconsider almost every aspect of daily clinical practice. Methods: This is a cross-sectional study reported in compliance with the CHERRIES guidelines and conducted through an online platform from June 14th to July 15th, 2020. The primary outcome was the burden of burnout during the pandemic, indicated by the validated Shirom-Melamed Burnout Measure. Results: Nine hundred and fifty-four surgeons completed the survey. The median length of practice was 10 years; 78.2% of respondents were male, with a median age of 37 years; 39.5% were consultants, 68.9% were general surgeons, and 55.7% were affiliated with an academic institution. Overall, there was a significant increase in the mean burnout score during the pandemic; longer years of practice and older age were significantly associated with less burnout. There were significant reductions in the median number of outpatient visits, operated cases, on-call hours, emergency visits, and research work; accordingly, 48.2% of respondents felt that the training resources were insufficient. The majority (81.3%) of respondents reported that their hospitals were involved in the management of COVID-19; 66.5% felt their roles had been minimized, 41% were asked to assist in non-surgical medical practices, and 37.6% of respondents were included in COVID-19 management. Conclusions: There was significant burnout among trainees. Almost all aspects of clinical and research activities were affected, with a significant reduction in the volume of research, outpatient clinic visits, surgical procedures, on-call hours, and emergency cases, hindering training. Trial registration: The study was registered on clinicaltrials.gov as "NCT04433286" on 16/06/2020.