
    The Bionic Radiologist: avoiding blurry pictures and providing greater insights

    Radiology images and reports have long been digitalized. However, the potential of the more than 3.6 billion radiology examinations performed annually worldwide has largely gone unused in the effort to digitally transform health care. The Bionic Radiologist is a concept that combines humanity and digitalization for better integration of radiology into health care. At a practical level, the concept targets three goals: (1) testing decisions made scientifically on the basis of disease probabilities and patient preferences; (2) image analysis performed consistently at any time and at any site; and (3) treatment suggestions that are closely linked to imaging results and seamlessly integrated with other information. The Bionic Radiologist will thus help avoid missed care opportunities, provide continuous learning in the work process, and allow more time for radiologists' primary roles: interacting with patients and referring physicians. To achieve that potential, many implementation barriers at both the individual and institutional levels have to be overcome, including reluctance to delegate decision making, a possible decrease in image interpretation knowledge, and the perception that patient safety and trust are at stake. Implementation will be easier with uncertainty quantification for suggestions, shared decision making, changes in organizational culture and leadership style, expertise maintained through continuous learning systems for training, and role development of the involved experts. With the support of the Bionic Radiologist, disparities are reduced and care is delivered in a humane and personalized fashion.
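
    As an illustration of goal (1), the sketch below (Python) works through a toy expected-utility comparison for deciding whether to order an imaging test from a pretest disease probability and patient-assigned utilities. The two-outcome model, the probabilities, and all utility values are assumptions made for this example, not figures from the article.

    # Illustrative sketch only: a toy expected-utility comparison for deciding
    # whether to order an imaging test, given a pretest disease probability and
    # patient-assigned utilities. All numbers and the simple two-outcome model
    # are assumptions for demonstration, not values from the article.

    def expected_utility_no_test(p_disease: float, u_untreated_disease: float,
                                 u_healthy_no_test: float) -> float:
        """Expected utility of skipping the test (disease, if present, goes untreated)."""
        return p_disease * u_untreated_disease + (1 - p_disease) * u_healthy_no_test

    def expected_utility_test(p_disease: float, sensitivity: float, specificity: float,
                              u_treated_disease: float, u_untreated_disease: float,
                              u_healthy_after_false_positive: float,
                              u_healthy_true_negative: float) -> float:
        """Expected utility of testing, assuming positives are treated and negatives are not."""
        true_positive = p_disease * sensitivity * u_treated_disease
        false_negative = p_disease * (1 - sensitivity) * u_untreated_disease
        false_positive = (1 - p_disease) * (1 - specificity) * u_healthy_after_false_positive
        true_negative = (1 - p_disease) * specificity * u_healthy_true_negative
        return true_positive + false_negative + false_positive + true_negative

    if __name__ == "__main__":
        p = 0.15  # assumed pretest probability of disease
        eu_test = expected_utility_test(p, sensitivity=0.9, specificity=0.85,
                                        u_treated_disease=0.8, u_untreated_disease=0.3,
                                        u_healthy_after_false_positive=0.9,
                                        u_healthy_true_negative=1.0)
        eu_no_test = expected_utility_no_test(p, u_untreated_disease=0.3,
                                              u_healthy_no_test=1.0)
        print(f"EU(test) = {eu_test:.3f}, EU(no test) = {eu_no_test:.3f}")
        print("Order imaging" if eu_test > eu_no_test else "Skip imaging")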

    A framework for utility data integration in the UK

    In this paper we investigate various factors that prevent utility knowledge from being fully exploited and suggest that integration techniques can be applied to improve the quality of utility records. The paper proposes a framework that supports knowledge and data integration at two levels: the schema level and the data level. Schema-level integration ensures that a single, integrated geospatial data set is available for utility enquiries. Data-level integration improves utility data quality by reducing inconsistency, duplication, and conflicts. Moreover, the framework is designed to preserve the autonomy and distribution of utility data. The ultimate aim of the research is to produce an integrated representation of underground utility infrastructure in order to gain more accurate knowledge of the buried services. It is hoped that this approach will enable us to understand the various problems associated with utility data and to suggest potential techniques for resolving them.
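
    To make the idea of data-level integration concrete, the following sketch (Python) merges utility records from two hypothetical sources and flags duplicates and attribute conflicts. The field names, matching rule, and sample records are invented for illustration and are not the framework described in the paper.

    # Minimal sketch of data-level integration for utility records: merging two
    # source datasets, flagging duplicates and attribute conflicts. Field names
    # (asset_id, material, depth_m) and the matching rule are hypothetical; the
    # paper's actual framework is not reproduced here.

    from collections import defaultdict

    source_a = [
        {"asset_id": "W-101", "type": "water_main", "material": "PVC", "depth_m": 1.2},
        {"asset_id": "G-220", "type": "gas_pipe", "material": "steel", "depth_m": 0.9},
    ]
    source_b = [
        {"asset_id": "W-101", "type": "water_main", "material": "cast_iron", "depth_m": 1.2},
        {"asset_id": "E-330", "type": "electric_cable", "material": "copper", "depth_m": 0.6},
    ]

    def integrate(*sources):
        """Group records by asset_id, keep one merged record per asset, and report conflicts."""
        grouped = defaultdict(list)
        for source in sources:
            for record in source:
                grouped[record["asset_id"]].append(record)

        merged, conflicts = [], []
        for asset_id, records in grouped.items():
            combined = dict(records[0])
            for other in records[1:]:
                for field, value in other.items():
                    if combined.get(field) != value:
                        conflicts.append((asset_id, field, combined.get(field), value))
            merged.append(combined)
        return merged, conflicts

    records, issues = integrate(source_a, source_b)
    print(f"{len(records)} integrated records")
    for asset_id, field, left, right in issues:
        print(f"conflict on {asset_id}.{field}: {left!r} vs {right!r}")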

    Report from GI-Dagstuhl Seminar 16394: Software Performance Engineering in the DevOps World

    This report documents the program and the outcomes of GI-Dagstuhl Seminar 16394, "Software Performance Engineering in the DevOps World". The seminar addressed the problem of performance-aware DevOps. Both DevOps and performance engineering have been growing trends over the past one to two years, in no small part due to the rising importance of identifying performance anomalies in the operations (Ops) of cloud and big data systems and feeding these insights back to development (Dev). So far, however, the research community has treated software engineering, performance engineering, and cloud computing largely as separate research areas. We aimed to identify opportunities for cross-community collaboration and to set the path for long-lasting collaborations towards performance-aware DevOps. The main goal of the seminar was to bring together young researchers (PhD students in a later stage of their PhD, as well as postdocs and junior professors) in the areas of (i) software engineering, (ii) performance engineering, and (iii) cloud computing and big data to present their current research projects, exchange experience and expertise, discuss research challenges, and develop ideas for future collaborations.

    Cloud Cost Optimization: A Comprehensive Review of Strategies and Case Studies

    Cloud computing has revolutionized the way organizations manage their IT infrastructure, but it has also introduced new challenges, such as managing cloud costs. This paper explores various techniques for cloud cost optimization, including analysis of cloud pricing and strategies for resource allocation. Real-world case studies of these techniques are presented, along with a discussion of their effectiveness and key takeaways. The analysis conducted in this paper shows that organizations can achieve significant cost savings by adopting cloud cost optimization techniques. Future research directions are also proposed to advance the state of the art in this field.
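
    One recurring resource-allocation decision that cost-optimization strategies of this kind address is choosing between on-demand and reserved (committed) pricing. The sketch below (Python) compares the two for a few utilization levels; the rates and utilization figures are made-up values for illustration, not data from the paper's case studies.

    # Toy illustration of one cost-optimization decision: whether a reserved
    # (committed) instance beats on-demand pricing for a given utilization level.
    # The prices and utilization figures are assumed, not taken from the paper.

    HOURS_PER_MONTH = 730

    def monthly_cost_on_demand(hourly_rate: float, utilization: float) -> float:
        """On-demand: pay only for the hours the instance actually runs."""
        return hourly_rate * HOURS_PER_MONTH * utilization

    def monthly_cost_reserved(effective_hourly_rate: float) -> float:
        """Reserved/committed: pay for every hour regardless of utilization."""
        return effective_hourly_rate * HOURS_PER_MONTH

    if __name__ == "__main__":
        on_demand_rate = 0.10   # assumed $/hour on demand
        reserved_rate = 0.062   # assumed effective $/hour with a 1-year commitment
        for utilization in (0.3, 0.6, 0.9):
            od = monthly_cost_on_demand(on_demand_rate, utilization)
            rs = monthly_cost_reserved(reserved_rate)
            better = "reserved" if rs < od else "on-demand"
            print(f"utilization {utilization:.0%}: on-demand ${od:.2f}, "
                  f"reserved ${rs:.2f} -> {better}")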

    Adaptive model-driven user interface development systems

    Adaptive user interfaces (UIs) were introduced to address some of the usability problems that plague many software applications. Model-driven engineering has formed the basis for most of the systems targeting the development of such UIs. An overview of these systems is presented, and a set of criteria is established to evaluate the strengths and shortcomings of the state of the art, which is categorized under architectures, techniques, and tools. A summary of the evaluation is presented in tables that visually illustrate how each system fulfils each criterion. The evaluation identified several gaps in the existing art and highlighted promising areas for improvement.
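
    As a rough illustration of the model-driven idea behind such systems, the sketch below (Python) describes a UI as data and applies simple adaptation rules before rendering. The model structure, rules, and contexts are hypothetical and do not correspond to any of the surveyed systems.

    # Hypothetical sketch of the model-driven idea behind adaptive UIs: the UI is
    # described as data (an abstract model), and adaptation rules rewrite that
    # model for the current context before it is rendered. The model structure,
    # rules, and contexts below are invented for illustration.

    from dataclasses import dataclass, field

    @dataclass
    class Widget:
        kind: str            # e.g. "table", "list", "button"
        label: str
        visible: bool = True

    @dataclass
    class UIModel:
        widgets: list[Widget] = field(default_factory=list)

    def adapt_for_context(model: UIModel, context: dict) -> UIModel:
        """Apply simple adaptation rules: small screens get lists instead of tables,
        and novice users do not see advanced controls."""
        adapted = UIModel()
        for w in model.widgets:
            w = Widget(w.kind, w.label, w.visible)
            if context.get("screen") == "small" and w.kind == "table":
                w.kind = "list"
            if context.get("expertise") == "novice" and "advanced" in w.label.lower():
                w.visible = False
            adapted.widgets.append(w)
        return adapted

    base = UIModel([Widget("table", "Results"), Widget("button", "Advanced filters")])
    print(adapt_for_context(base, {"screen": "small", "expertise": "novice"}))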

    On the automated analysis of preterm infant sleep states from electrocardiography