23 research outputs found

    The impact of surgical delay on resectability of colorectal cancer: An international prospective cohort study

    AIM: The SARS-CoV-2 pandemic has provided a unique opportunity to explore the impact of surgical delays on cancer resectability. This study aimed to compare resectability for colorectal cancer patients undergoing delayed versus non-delayed surgery. METHODS: This was an international prospective cohort study of consecutive colorectal cancer patients with a decision for curative surgery (January-April 2020). Surgical delay was defined as an operation taking place more than 4 weeks after the treatment decision, in a patient who did not receive neoadjuvant therapy. A subgroup analysis explored the effects of delay in elective patients only. The impact of longer delays was explored in a sensitivity analysis. The primary outcome was complete resection, defined as curative resection with an R0 margin. RESULTS: Overall, 5453 patients from 304 hospitals in 47 countries were included, of whom 6.6% (358/5453) did not receive their planned operation. Of the 4304 operated patients without neoadjuvant therapy, 40.5% (1744/4304) were delayed beyond 4 weeks. Delayed patients were more likely to be older, male and more comorbid, and to have a higher body mass index, rectal cancer and early-stage disease. Delayed patients had higher unadjusted rates of complete resection (93.7% vs. 91.9%, P = 0.032) and lower rates of emergency surgery (4.5% vs. 22.5%, P < 0.001). After adjustment, delay was not associated with a lower rate of complete resection (OR 1.18, 95% CI 0.90-1.55, P = 0.224), a finding that was consistent in elective patients only (OR 0.94, 95% CI 0.69-1.27, P = 0.672). Longer delays were not associated with poorer outcomes. CONCLUSION: One in 15 colorectal cancer patients did not receive their planned operation during the first wave of COVID-19. Surgical delay did not appear to compromise resectability, raising the hypothesis that any reduction in long-term survival attributable to delays is likely to be due to micro-metastatic disease.
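    The adjusted comparison reported above amounts to a multivariable regression of complete (R0) resection on delay status with case-mix covariates; the abstract does not state the exact model specification. A minimal sketch of that kind of adjustment, using simulated data and hypothetical column names (r0_resection, delayed, age, male, bmi, rectal), is:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the analysis dataset: one row per operated patient.
np.random.seed(0)
n = 500
df = pd.DataFrame({
    "r0_resection": np.random.binomial(1, 0.92, n),  # 1 = complete (R0) resection
    "delayed": np.random.binomial(1, 0.40, n),       # 1 = surgery >4 weeks after decision
    "age": np.random.normal(68, 10, n),
    "male": np.random.binomial(1, 0.55, n),
    "bmi": np.random.normal(27, 4, n),
    "rectal": np.random.binomial(1, 0.30, n),
})

# Logistic regression: odds of complete resection, adjusted for case mix.
model = smf.logit("r0_resection ~ delayed + age + male + bmi + rectal", data=df).fit()

# Adjusted odds ratio and 95% CI for delay (cf. the reported OR 1.18, 95% CI 0.90-1.55).
odds_ratio = np.exp(model.params["delayed"])
ci_low, ci_high = np.exp(model.conf_int().loc["delayed"])
print(f"Adjusted OR for delay: {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```

    The reported adjusted odds ratio corresponds to exponentiating the coefficient on the delay indicator, as in the last lines of the sketch; the study itself may additionally have accounted for clustering by hospital.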

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that covers a variety of research fields, such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium, consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article(s). The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performance. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server, located at https://relishdb.ict.griffith.edu.au, is freely available for downloading the annotation data and for blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new, powerful techniques for title- and title/abstract-based search engines for relevant articles in biomedical research. Peer reviewed.
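    Two of the baseline methods named above, Term Frequency-Inverse Document Frequency and Okapi Best Matching 25, are standard ranking functions. A hedged sketch of TF-IDF cosine ranking of candidate abstracts against a seed abstract (the texts below are illustrative placeholders, not RELISH records) is:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative seed and candidate abstracts (placeholders, not RELISH data).
seed = "Benchmarking document similarity methods for biomedical literature search."
candidates = [
    "A gold-standard benchmark for evaluating literature recommendation systems.",
    "Measuring gene expression in zebrafish embryos under hypoxia.",
    "Improving related-article retrieval in PubMed with hybrid ranking.",
]

# Build TF-IDF vectors over the corpus, then score candidates by cosine similarity to the seed.
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([seed] + candidates)
scores = cosine_similarity(matrix[0], matrix[1:]).ravel()

# Rank candidates by similarity to the seed article, highest first.
for rank, idx in enumerate(scores.argsort()[::-1], start=1):
    print(f"{rank}. score={scores[idx]:.3f}  {candidates[idx]}")
```

    Okapi BM25 adds term-frequency saturation and document-length normalization on top of this kind of weighting, which may partly explain why the three baselines retrieve overlapping but distinct sets of relevant articles.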

    XVI International Congress of Control Electronics and Telecommunications: "Techno-scientific considerations for a post-pandemic world intensive in knowledge, innovation and sustainable local development"

    This title, suggested by the impacts of the Covid-19 situation around the world, which in Colombia have unfortunately been very critical, assumes that social, political and economic tensions, and above all scientific and technological ones, must necessarily be overcome. Initially, this presupposes that Colombian society has the capacity to recover its initial state once the disturbance caused by the catastrophic pandemic has ceased, and to improve on that previous state of affairs, since many local problems were, and still are, poorly resolved, only partially resolved or unresolved. In other words, a proven, existing social resilience, the product of the prolonged Colombian social conflict partially overcome by a successful peace process, will have to be redesigned and strengthened from local technoscience. As Markus Brunnermeier, the German economist and professor of economics at Princeton University, puts it in his book The Resilient Society: the question is not to foresee everything but to be able to react, and to learn to recover quickly. Bogotá

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Abstract Background Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries. Results In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable to more than 90 per cent of patients, except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low–middle-income countries.
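    At its core, the phase 3 co-prioritization reduces to aggregating stakeholder ratings per intervention and ranking the interventions. A minimal sketch with invented intervention names and acceptability proportions (not the study's data) is:

```python
from statistics import mean

# Hypothetical acceptability ratings per intervention, expressed as proportions of
# respondents who rated the intervention acceptable (values are invented).
ratings = {
    "introduce recycling": [0.95, 0.97, 0.93],
    "reduce anaesthetic gas use": [0.92, 0.90, 0.94],
    "re-sterilize 'single-use' consumables": [0.86, 0.84, 0.88],
}

# Rank interventions by mean acceptability, highest first.
ranked = sorted(ratings.items(), key=lambda item: mean(item[1]), reverse=True)
for rank, (intervention, scores) in enumerate(ranked, start=1):
    print(f"{rank}. {intervention}: {mean(scores):.0%} rated acceptable")
```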

    Gaia data release 1. Pre-processing and source list creation

    Context. The first data release from the Gaia mission contains accurate positions and magnitudes for more than a billion sources, and proper motions and parallaxes for the majority of the 2.5 million Hipparcos and Tycho-2 stars. Aims. We describe three essential elements of the initial data treatment leading to this catalogue: the image analysis, the construction of a source list, and the near real-time monitoring of the payload health. We also discuss some weak points that set limitations for the attainable precision at the present stage of the mission. Methods. Image parameters for point sources are derived from one-dimensional scans, using a maximum likelihood method, under the assumption of a line spread function constant in time, and a complete modelling of bias and background. These conditions are, however, not completely fulfilled. The Gaia source list is built starting from a large ground-based catalogue, but even so a significant number of new entries have been added, and a large number have been removed. The autonomous onboard star image detection will pick up many spurious images, especially around bright sources, and such unwanted detections must be identified. Another key step of the source list creation consists in arranging the more than 10^10 individual detections into spatially isolated groups that can be analysed individually. Results. Complete software systems have been built for the Gaia initial data treatment, which manage approximately 50 million focal plane transits daily, giving transit times and fluxes for 500 million individual CCD images to the astrometric and photometric processing chains. The software also carries out a successful and detailed daily monitoring of Gaia health.
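    The step of arranging detections into spatially isolated groups is, in essence, a spatial clustering problem. A toy sketch of one way to do it, linking nearby detections with a k-d tree and taking connected components (flat two-dimensional coordinates and an arbitrary linking radius are assumed; the actual pipeline handles more than 10^10 detections on the sphere), is:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

# Illustrative detection positions (degrees on a small flat patch of sky):
# three tight clumps of 50 detections each.
rng = np.random.default_rng(0)
detections = np.vstack([
    rng.normal(loc=centre, scale=0.001, size=(50, 2))
    for centre in [(10.00, 20.00), (10.05, 20.02), (9.98, 19.95)]
])

# Link detections closer than a matching radius, then take connected components
# of the link graph as spatially isolated groups that can be analysed independently.
radius = 0.005  # degrees, illustrative only
tree = cKDTree(detections)
pairs = np.array(list(tree.query_pairs(r=radius)))
n = len(detections)
links = coo_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])), shape=(n, n))
n_groups, labels = connected_components(links, directed=False)
print(f"{n} detections grouped into {n_groups} isolated groups")
```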

    Cardiorespiratory fitness cutoff points for early detection of present and future cardiovascular risk in children: A 2-year follow-up study

    On behalf of the UP&DOWN Study Group. [Objective]: To examine the association between cardiorespiratory fitness (CRF) at baseline and cardiovascular disease (CVD) risk in 6- to 10-year-olds (cross-sectional) and 2 years later (8- to 12-year-olds [longitudinal]), and whether changes with age in CRF are associated with CVD risk in children aged 8 to 12 years. [Patients and Methods]: Spanish primary schoolchildren (n=236) aged 6 to 10 years participated at baseline. Of the 23 participating primary schools, 22% (n=5) were private schools and 78% (n=18) were public schools. The dropout rate at 2-year follow-up was 9.7% (n=23). The 20-m shuttle run test was used to estimate CRF. The CVD risk score was computed as the mean of 5 CVD risk factor standardized scores: sum of 2 skinfolds, systolic blood pressure, insulin/glucose, triglycerides, and total cholesterol/high-density lipoprotein cholesterol. [Results]: At baseline, CRF was inversely associated with single CVD risk factors (all P < 
) and was able to detect present CVD risk (0.85; P < .001) and to predict CVD risk 2 years later (P = .004). Persistently low CRF, or a decline in CRF from ages 6-10 to 8-12 years, was associated with increased CVD risk at age 8 to 12 years (P < .001). [Conclusion]: During childhood, CRF is a strong predictor of CVD risk and should be monitored to identify children with potential CVD risk. This work was supported by grant DEP 2010-21662-C04-00 (DEP 2010-21662-C04-01, DEP 2010-21662-C04-02, DEP 2010-21662-C04-03, DEP 2010-21662-C04-04) from the National Plan for Research, Development and Innovation (R+D+i), MICINN, and by grant FPU15/05337 from the Spanish Ministry of Education. Peer reviewed.
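    The clustered CVD risk score described in the methods is the mean of five z-standardized risk factor scores. A minimal sketch of that computation, with hypothetical column names and made-up values, is:

```python
import pandas as pd

# Hypothetical measurements for a handful of children (column names and values are illustrative).
df = pd.DataFrame({
    "skinfolds_sum": [22.0, 35.5, 18.2, 41.0],   # sum of 2 skinfolds, mm
    "systolic_bp": [102, 115, 98, 120],          # mmHg
    "insulin_glucose": [1.8, 2.6, 1.5, 3.1],     # insulin/glucose
    "triglycerides": [55, 80, 48, 95],           # mg/dL
    "tc_hdl": [2.9, 3.8, 2.7, 4.2],              # total cholesterol / HDL cholesterol
})

# z-standardize each risk factor, then average across the five factors per child.
z_scores = (df - df.mean()) / df.std(ddof=0)
df["cvd_risk_score"] = z_scores.mean(axis=1)
print(df["cvd_risk_score"].round(2))
```

    All five factors listed are oriented so that higher values indicate higher risk, which is why a simple mean of the z-scores can serve as the composite.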

    Abstracts from Hydrocephalus 2016

    International audience.
