Electronic health record: integrating evidence-based information at the point of clinical decision making
The authors created two tools to achieve the goals of providing physicians with a way to review alternative diagnoses and improving access to relevant evidence-based library resources without disrupting established workflows. The “diagnostic decision support tool” lifted terms from standard, coded fields in the electronic health record and sent them to Isabel, which produced a list of possible diagnoses. The physicians chose their diagnoses and were presented with the “knowledge page,” a collection of evidence-based library resources. Each resource was automatically populated with search results based on the chosen diagnosis. Physicians responded positively to the “knowledge page.”
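The workflow described here is essentially a three-stage pipeline: lift coded terms from the record, query a differential-diagnosis engine, and assemble a page of pre-populated literature searches. A minimal sketch of that flow follows; query_isabel, LIBRARY_RESOURCES, and the EHR field names are hypothetical stand-ins, since the abstract does not document the actual interfaces.

    # Hypothetical sketch of the pipeline described above: EHR terms -> Isabel ->
    # diagnosis list -> "knowledge page" of pre-populated library searches.
    # query_isabel() and the resource URL templates are illustrative, not real APIs.
    from urllib.parse import quote_plus

    LIBRARY_RESOURCES = {  # resource name -> search URL template (illustrative)
        "PubMed": "https://pubmed.ncbi.nlm.nih.gov/?term={q}",
    }

    def extract_coded_terms(record: dict) -> list[str]:
        """Lift terms from standard, coded EHR fields (field names assumed)."""
        return [record.get("chief_complaint", ""), *record.get("problem_list", [])]

    def query_isabel(terms: list[str]) -> list[str]:
        """Placeholder for the Isabel differential-diagnosis call."""
        raise NotImplementedError("Isabel's interface is not documented in the abstract")

    def knowledge_page(diagnosis: str) -> dict[str, str]:
        """Populate each library resource with a search for the chosen diagnosis."""
        return {name: url.format(q=quote_plus(diagnosis))
                for name, url in LIBRARY_RESOURCES.items()}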
Use of an Interactive Electronic Whiteboard to Teach Clinical Cardiology Decision Analysis to Medical Students
We used innovative state-of-the-art computer and collaboration technologies to teach first-year medical students an analytic methodology to solve difficult clinical cardiology problems and make informed medical decisions. Clinical examples included the decision to administer thrombolytic therapy considering the risk of hemorrhagic stroke, and activity recommendations for athletes at risk for sudden death. Students received instruction on the decision-analytic approach, which integrates pathophysiology, treatment efficacy, diagnostic test interpretation, health outcomes, patient preferences, and cost-effectiveness into a decision-analytic model. The traditional environment of a small group and blackboard was significantly enhanced by using an electronic whiteboard, the Xerox LiveBoard™. The LiveBoard features an 80486-based personal computer, a large (3’×4’) display, and wireless pens for input. It allowed the integration of decision-analytic software, statistical software, digital slides, and additional media. We developed TIDAL (Team Interactive Decision Analysis in the Large-screen environment), a software package to interactively construct decision trees, calculate expected utilities, and perform one- and two-way sensitivity analyses using pen and gesture inputs. The LiveBoard also allowed the novel incorporation of Gambler, a utility assessment program obtained from the New England Medical Center. Gambler was used to obtain utilities for outcomes such as non-disabling hemorrhagic stroke. The interactive nature of the LiveBoard allowed real-time decision model development by the class, followed by instantaneous calculation of expected utilities and sensitivity analyses. The multimedia aspect and interactivity were conducive to extensive class participation. Ten out of eleven students wanted decision-analytic software available for use during their clinical years, and all students would recommend the course to next year's students. We plan to experiment with the electronic collaboration features of this technology and allow groups separated by time or space to collaborate on decisions and explore the models created.
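The core operations TIDAL supports, folding back a decision tree to expected utilities and varying one probability at a time, are standard decision-analytic computations. The sketch below illustrates them with a thrombolysis-style tree; the structure and all probabilities and utilities are illustrative, not taken from the course materials or from TIDAL itself.

    # Minimal decision-analysis sketch: expected utility by folding back a tree.
    # Tree shape and all numbers are illustrative placeholders.
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        name: str
        utility: float | None = None  # set on leaf (outcome) nodes
        branches: list[tuple[float, "Node"]] = field(default_factory=list)  # (prob, child)

    def expected_utility(node: Node) -> float:
        """Fold back the tree: a leaf returns its utility; a chance node
        returns the probability-weighted mean of its children."""
        if node.utility is not None:
            return node.utility
        return sum(p * expected_utility(child) for p, child in node.branches)

    stroke = Node("hemorrhagic stroke", utility=0.30)
    survive = Node("survive, treated MI", utility=0.90)
    treat = Node("thrombolysis", branches=[(0.01, stroke), (0.99, survive)])
    no_treat = Node("no thrombolysis", branches=[(1.00, Node("untreated MI", utility=0.85))])

    # One-way sensitivity analysis over the stroke risk.
    for risk in (0.005, 0.01, 0.05):
        treat.branches = [(risk, stroke), (1 - risk, survive)]
        print(f"P(stroke)={risk:.3f}: EU(treat)={expected_utility(treat):.3f} "
              f"vs EU(no treat)={expected_utility(no_treat):.3f}")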
LSST: from Science Drivers to Reference Design and Anticipated Data Products
(Abridged) We describe here the most ambitious survey currently planned in
the optical, the Large Synoptic Survey Telescope (LSST). A vast array of
science will be enabled by a single wide-deep-fast sky survey, and LSST will
have unique survey capability in the faint time domain. The LSST design is
driven by four main science themes: probing dark energy and dark matter, taking
an inventory of the Solar System, exploring the transient optical sky, and
mapping the Milky Way. LSST will be a wide-field ground-based system sited at
Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ∼24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320–1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ∼ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world.
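As a plausibility check on the quoted depths: coadding N comparable visits deepens the point-source limit by roughly 2.5 log10(√N) = 1.25 log10(N) magnitudes. The sketch below applies that standard rule of thumb; the per-band visit count is a rough assumption, not a number from the paper.

    import math

    def coadded_depth(single_visit_depth: float, n_visits: int) -> float:
        """Depth gain from stacking N comparable exposures: 1.25*log10(N) mag
        (background-limited point sources; a standard rule of thumb)."""
        return single_visit_depth + 1.25 * math.log10(n_visits)

    # ~800 visits summed over six bands -> very roughly 800/6 per band (assumption).
    print(coadded_depth(24.5, 800 // 6))  # ≈ 27.2, consistent with the quoted r ∼ 27.5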
A framework for the development of a global standardised marine taxon reference image database (SMarTaR-ID) to support image-based analyses
Video and image data are regularly used in the field of benthic ecology to document biodiversity. However, their use is subject to a number of challenges, principally the identification of taxa within the images without associated physical specimens. The challenge of applying traditional taxonomic keys to the identification of fauna from images has led to the development of personal, group, or institution level reference image catalogues of operational taxonomic units (OTUs) or morphospecies. Lack of standardisation among these reference catalogues has led to problems with observer bias and the inability to combine datasets across studies. In addition, lack of a common reference standard is stifling efforts in the application of artificial intelligence to taxon identification. Using the North Atlantic deep sea as a case study, we propose a database structure to facilitate standardisation of morphospecies image catalogues between research groups and support future use in multiple front-end applications. We also propose a framework for coordination of international efforts to develop reference guides for the identification of marine species from images. The proposed structure maps to the Darwin Core standard to allow integration with existing databases. We suggest a management framework where high-level taxonomic groups are curated by a regional team, consisting of both end users and taxonomic experts. We identify a mechanism by which overall quality of data within a common reference guide could be raised over the next decade. Finally, we discuss the role of a common reference standard in advancing marine ecology and supporting sustainable use of this ecosystem.
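Because the proposed structure maps to the Darwin Core standard, each morphospecies entry can be expressed with standard term names. A minimal sketch of what one catalogue record might look like follows; the term selection and all values are illustrative, and the paper's actual schema may differ.

    # Illustrative morphospecies catalogue entry mapped to Darwin Core term names.
    # Term choices and values are examples only; the paper's schema may differ.
    otu_record = {
        "occurrenceID": "SMarTaR-ID:example:0001",            # hypothetical identifier
        "scientificName": "Kolga hyalina",
        "taxonRank": "species",
        "identificationQualifier": "morphospecies",           # flags image-based OTU status
        "identifiedBy": "regional curation team",
        "associatedMedia": "https://example.org/img/0001.jpg",  # reference image (placeholder)
        "minimumDepthInMeters": 2400,
        "maximumDepthInMeters": 2600,
    }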
Impact of jet-production data on the next-to-next-to-leading-order determination of HERAPDF2.0 parton distributions
The HERAPDF2.0 ensemble of parton distribution functions (PDFs) was introduced in 2015. The final stage is presented: a next-to-next-to-leading-order (NNLO) analysis of the HERA data on inclusive deep inelastic ep scattering together with jet data as published by the H1 and ZEUS collaborations. A perturbative QCD fit, determining α_s(M_Z²) and the PDFs simultaneously, was performed with the result α_s(M_Z²) = 0.1156 ± 0.0011 (exp) +0.0001/−0.0002 (model + parameterisation) ± 0.0029 (scale). The PDF sets of HERAPDF2.0Jets NNLO were determined with separate fits using two fixed values of α_s(M_Z²), 0.1155 and 0.118, since the latter value was already chosen for the published HERAPDF2.0 NNLO analysis based on HERA inclusive DIS data only. The different sets of PDFs are presented, evaluated and compared. The consistency of the PDFs determined with and without the jet data demonstrates the consistency of HERA inclusive and jet-production cross-section data. The inclusion of the jet data reduced the uncertainty on the gluon PDF. Predictions based on the PDFs of HERAPDF2.0Jets NNLO give an excellent description of the jet-production data used as input.
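For readers who want to work with these PDFs, sets in the HERAPDF2.0 family are distributed through LHAPDF. A minimal access sketch follows; the grid name below is an existing HERAPDF2.0 NNLO set used as a placeholder, and the published name of the HERAPDF2.0Jets NNLO grids should be checked in the LHAPDF set catalogue.

    # Sketch of evaluating an HERAPDF2.0-family set with the LHAPDF Python bindings.
    # "HERAPDF20_NNLO_EIG" is a placeholder set name; look up the HERAPDF2.0Jets
    # NNLO grid name in the LHAPDF catalogue before running.
    import lhapdf

    pdf = lhapdf.mkPDF("HERAPDF20_NNLO_EIG", 0)  # member 0 = central fit

    x, q2 = 1e-3, 100.0          # momentum fraction and scale Q² in GeV²
    xg = pdf.xfxQ2(21, x, q2)    # x * g(x, Q²); PDG ID 21 = gluon
    print(f"x*g(x={x}, Q2={q2} GeV^2) = {xg:.4f}")
    print(f"alpha_s(Q2={q2}) = {pdf.alphasQ2(q2):.4f}")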
Semi-Automated Image Analysis for the Assessment of Megafaunal Densities at the Arctic Deep-Sea Observatory HAUSGARTEN
Megafauna play an important role in benthic ecosystem function and are sensitive indicators of environmental change. Non-invasive monitoring of benthic communities can be accomplished by seafloor imaging. However, manual quantification of megafauna in images is labor-intensive and therefore, this organism size class is often neglected in ecosystem studies. Automated image analysis has been proposed as a possible approach to such analysis, but the heterogeneity of megafaunal communities poses a non-trivial challenge for such automated techniques. Here, the potential of a generalized object detection architecture, referred to as iSIS (intelligent Screening of underwater Image Sequences), for the quantification of a heterogeneous group of megafauna taxa is investigated. The iSIS system is tuned for a particular image sequence (i.e. a transect) using a small subset of the images, in which megafauna taxa positions were previously marked by an expert. To investigate the potential of iSIS and compare its results with those obtained from human experts, a group of eight different taxa from one camera transect of seafloor images taken at the Arctic deep-sea observatory HAUSGARTEN is used. The results show that inter- and intra-observer agreements of human experts exhibit considerable variation between the species, with a similar degree of variation apparent in the automatically derived results obtained by iSIS. Whilst some taxa (e.g. Bathycrinus stalks, Kolga hyalina, small white sea anemone) were well detected by iSIS (i.e. overall Sensitivity: 87%, overall Positive Predictive Value: 67%), some taxa such as the small sea cucumber Elpidia heckeri remain challenging, for both human observers and iSIS.
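The two detection metrics quoted above are simple ratios of true-positive, false-positive, and false-negative counts. A quick reference sketch follows; the counts are invented solely to reproduce the quoted percentages.

    def sensitivity(tp: int, fn: int) -> float:
        """Fraction of expert-marked animals the detector found: TP / (TP + FN)."""
        return tp / (tp + fn)

    def positive_predictive_value(tp: int, fp: int) -> float:
        """Fraction of detections that were real animals: TP / (TP + FP)."""
        return tp / (tp + fp)

    # Illustrative counts only; the paper reports 87% sensitivity, 67% PPV overall.
    tp, fp, fn = 87, 43, 13
    print(f"sensitivity = {sensitivity(tp, fn):.2f}")                # 0.87
    print(f"PPV         = {positive_predictive_value(tp, fp):.2f}")  # 0.67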
Understanding, diagnosing, and treating Myalgic Encephalomyelitis/Chronic Fatigue Syndrome - State of the art: Report of the 2nd international meeting at the Charité Fatigue Center.
Myalgic Encephalomyelitis/Chronic Fatigue Syndrome (ME/CFS) is a devastating disease affecting millions of people worldwide. Due to the coronavirus disease 2019 (COVID-19) pandemic, we are facing a significant increase in ME/CFS prevalence. On May 11th and 12th, 2023, the second international ME/CFS conference of the Charité Fatigue Center was held in Berlin, Germany, focusing on pathomechanisms, diagnosis, and treatment. During the two-day conference, more than 100 researchers from various research fields met on-site and over 700 attendees participated online to discuss the state of the art and novel findings in this field. Key topics from the conference included the role of the immune system, dysfunction of the endothelial and autonomic nervous systems, and viral reactivation. Furthermore, there were presentations on innovative diagnostic measures and assessments for this complex disease, cutting-edge treatment approaches, and clinical studies. Despite the increased public attention due to the COVID-19 pandemic, the subsequent rise of Long COVID cases, and the growth of funding opportunities to unravel the pathomechanisms underlying ME/CFS, this severe disease remains highly under-researched. Adequately funded future research efforts are needed to further explore the disease etiology and to identify diagnostic markers and targeted therapies.