11 research outputs found

    Burnout among surgeons before and during the SARS-CoV-2 pandemic: an international survey

    Get PDF
    Background: The SARS-CoV-2 pandemic has had many significant impacts within the surgical realm, and surgeons have been obliged to reconsider almost every aspect of daily clinical practice. Methods: This is a cross-sectional study reported in compliance with the CHERRIES guidelines and conducted through an online platform from June 14th to July 15th, 2020. The primary outcome was the burden of burnout during the pandemic, indicated by the validated Shirom-Melamed Burnout Measure. Results: Nine hundred fifty-four surgeons completed the survey. The median length of practice was 10 years; 78.2% of those included were male, with a median age of 37 years; 39.5% were consultants, 68.9% were general surgeons, and 55.7% were affiliated with an academic institution. Overall, there was a significant increase in the mean burnout score during the pandemic; longer practice and older age were significantly associated with less burnout. There were significant reductions in the median number of outpatient visits, operated cases, on-call hours, emergency visits, and research work, and 48.2% of respondents felt that training resources were insufficient. The majority (81.3%) of respondents reported that their hospitals were involved in the management of COVID-19; 66.5% felt their roles had been minimized, 41% were asked to assist in non-surgical medical practices, and 37.6% were directly involved in COVID-19 management. Conclusions: There was significant burnout among trainees. Almost all clinical and research activities were affected, with significant reductions in the volume of research, outpatient clinic visits, surgical procedures, on-call hours, and emergency cases hindering training. Trial registration: The study was registered on clinicaltrials.gov "NCT04433286" on 16/06/2020.

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Get PDF
    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain measured on a numerical analogue scale from 0 to 100% and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468) compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.

    ULTRA LOW POWER APPLICATION SPECIFIC INSTRUCTION-SET PROCESSOR DESIGN: for a cardiac beat detector algorithm

    Full text link
    High efficiency and low power consumption are among the main topics in embedded systems today. For complex applications, off-the-shelf processor cores might not provide the desired goals in terms of power consumption. By optimizing the processor for the application, or a set of applications, one can improve the computing power by introducing special-purpose hardware units. The execution cycle count of the application would in this case be reduced significantly, and the resulting processor would consume less power. This thesis investigates how software and hardware development can be optimized for ultra low power consumption. A cardiac beat detector algorithm is implemented in ANSI C and optimized for low power consumption using several software power optimization techniques. The resulting application is mapped on a basic processor architecture provided by Target Compiler Technologies. This processor is optimized further for ultra low power consumption by applying application-specific hardware and by using several hardware power optimization techniques. A general processor and the optimized processor have been mapped on a chip, using a 90 nm low power TSMC process. Information about power dissipation is extracted through netlist simulation, and the results of the two processors have been compared. The optimized processor consumes 55% less average power, and the duty cycle of the processor, i.e., the time in which the processor executes its task with respect to the time budget available, has been reduced from 14% to 2.8%. The reduction in the total execution cycle count is 81%. The possibilities of applying power gating, or voltage and frequency scaling, are discussed, and it is concluded that further reduction in power consumption is possible by applying these power optimization techniques. For a given case, the average leakage power dissipation is estimated to be reduced by 97.2%.
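    The relationship between duty cycle and average power reported above can be illustrated with a simple two-state model. This is a back-of-the-envelope sketch with hypothetical active and idle power figures (not taken from the thesis), showing how cutting the duty cycle from 14% to 2.8% shrinks the active contribution to average power:

```python
def avg_power(p_active_uw, p_idle_uw, duty_cycle):
    """Average power (in uW) for a processor that is active for
    `duty_cycle` of its time budget and idle for the remainder."""
    return duty_cycle * p_active_uw + (1.0 - duty_cycle) * p_idle_uw

# Hypothetical figures for illustration: 1000 uW active, 10 uW idle.
baseline  = avg_power(p_active_uw=1000.0, p_idle_uw=10.0, duty_cycle=0.14)
optimized = avg_power(p_active_uw=1000.0, p_idle_uw=10.0, duty_cycle=0.028)
print(f"baseline ~{baseline:.1f} uW, optimized ~{optimized:.1f} uW")
```

    Under this model the idle-state power dominates at low duty cycles, which is why the thesis goes on to consider power gating (cutting leakage in idle) for further savings.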

    Low-power robust beat detection in ambulatory cardiac monitoring

    Full text link
    With new advances in ambulatory monitoring, new challenges appear due to degradation in signal quality and limitations in hardware requirements. Existing signal analysis methods should be re-evaluated in order to adapt to the restrictive requirements of these new applications. With this motivation, we chose a robust beat detection algorithm and optimized it further to run on an embedded platform within a cardiac monitoring sensor node. The algorithm was designed in floating point in Matlab and evaluated in order to study its performance under a wide range of conditions. The initial PC version of the algorithm obtained good performance under a wide variety of conditions (Se=99.65% and +P=99.79% on the MIT/BIH arrhythmia database, and Se=99.88%, +P=99.93% on our own database with ambulatory data). In this study, the algorithm is adapted and further optimized to work in real time on an embedded digital processor, while keeping this performance without degradation. The run-time memory usage of the application was 150 KB, with an execution time of 1.5 million cycles and an average power consumption of 494 μW for a 3-second ECG segment sampled at 198 Hz. Implementing the algorithm on a general-purpose processor places significant limits on performance in terms of power consumption. We propose possible specifications for an application-optimized processor for more efficient ECG analysis.
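    The Se and +P figures quoted above are the standard beat-detection metrics: sensitivity (fraction of true beats detected) and positive predictivity (fraction of detections that are true beats). A minimal sketch of their computation, using hypothetical beat counts chosen only for illustration (the abstract does not report raw counts):

```python
def detection_metrics(tp, fn, fp):
    """Beat-detection quality metrics from matched annotations.

    Se (sensitivity)          = TP / (TP + FN)
    +P (positive predictivity) = TP / (TP + FP)
    """
    se = tp / (tp + fn)
    pp = tp / (tp + fp)
    return se, pp

# Hypothetical counts for illustration:
se, pp = detection_metrics(tp=99_650, fn=350, fp=210)
print(f"Se = {se:.2%}, +P = {pp:.2%}")
```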

    Ultra low power application specific instruction-set processor design for a cardiac beat detector algorithm

    Full text link
    High efficiency and low power consumption are among the main topics in embedded systems today. For complex applications, off-the-shelf processor cores might not provide the desired goals in terms of power consumption. By optimizing the processor for the application, one can improve the computing power by introducing special-purpose hardware units. In this paper, we present a case study with a possible design methodology for an ultra low power application-specific instruction-set processor. A cardiac beat detector algorithm based on the Continuous Wavelet Transform is implemented in the C language. This application is further optimized using several software power optimization techniques. The resulting application is mapped on a basic processor architecture provided by Target Compiler Technologies, and the processor is further optimized for ultra low power consumption by applying application-specific hardware and by using several hardware optimization techniques. The optimized processor is compared with the unoptimized version, resulting in a 55% reduction in power consumption. The reduction in the total execution cycle count is 81%. Power gating, and dynamic voltage and frequency scaling, are investigated for further power optimization. For a given case, the reduction in the already optimized power consumption is estimated to be 62% and 35%, respectively.
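    Why dynamic voltage and frequency scaling (DVFS) pays off so well follows from the classic CMOS dynamic-power model, P = C_eff · Vdd² · f: lowering frequency alone scales power linearly, while lowering the supply voltage as well compounds quadratically. A sketch with hypothetical operating points (the paper's actual voltages and frequencies are not given in the abstract):

```python
def dynamic_power(c_eff_farads, vdd_volts, freq_hz):
    """Classic CMOS switching-power model: P = C_eff * Vdd^2 * f (watts)."""
    return c_eff_farads * vdd_volts ** 2 * freq_hz

# Hypothetical operating points for illustration:
p_nominal = dynamic_power(1e-9, 1.2, 100e6)  # 1 nF effective, 1.2 V, 100 MHz
p_scaled  = dynamic_power(1e-9, 0.9, 50e6)   # 0.9 V, 50 MHz
print(f"scaled/nominal power ratio = {p_scaled / p_nominal:.3f}")
```

    At these made-up operating points, halving the frequency while dropping Vdd from 1.2 V to 0.9 V cuts dynamic power to roughly 28% of nominal, which is the kind of headroom the 81% cycle-count reduction opens up for DVFS.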

    Global Impact of the COVID-19 Pandemic on Cerebral Venous Thrombosis and Mortality

    Full text link
    BACKGROUND AND PURPOSE: Recent studies suggested an increased incidence of cerebral venous thrombosis (CVT) during the coronavirus disease 2019 (COVID-19) pandemic. We evaluated the volume of CVT hospitalization and in-hospital mortality during the 1st year of the COVID-19 pandemic compared to the preceding year. METHODS: We conducted a cross-sectional retrospective study of 171 stroke centers from 49 countries. We recorded COVID-19 admission volumes, CVT hospitalization, and CVT in-hospital mortality from January 1, 2019, to May 31, 2021. CVT diagnoses were identified by International Classification of Disease-10 (ICD-10) codes or stroke databases. We additionally sought to compare the same metrics in the first 5 months of 2021 with the corresponding months in 2019 and 2020 (ClinicalTrials.gov Identifier: NCT04934020). RESULTS: There were 2,313 CVT admissions across the 1-year pre-pandemic (2019) and pandemic year (2020); no differences in CVT volume or CVT mortality were observed. During the first 5 months of 2021, there was an increase in CVT volumes compared to 2019 (27.5%; 95% confidence interval [CI], 24.2 to 32.0; P CONCLUSIONS: During the 1st year of the COVID-19 pandemic, CVT hospitalization volume and CVT in-hospital mortality did not change compared to the prior year. COVID-19 diagnosis was associated with higher CVT in-hospital mortality. During the first 5 months of 2021, there was an increase in CVT hospitalization volume and in CVT-related mortality, partially attributable to vaccine-induced immune thrombotic thrombocytopenia (VITT).