931 research outputs found

    Building realistic potential patient queries for medical information retrieval evaluation

    Get PDF
    To evaluate and improve medical information retrieval, benchmarking data sets need to be created. Few benchmarks have focused on patients' information needs, so additional benchmarks are needed to enable research into effective retrieval methods. In this paper we describe the manual creation of patient queries and investigate their automatic generation. This work is conducted in the framework of a medical evaluation campaign, which aims to evaluate and improve technologies that help patients and laypeople access eHealth data. To this end, the campaign is composed of different tasks, including a medical information retrieval (IR) task. Within this IR task, a web crawl of medically related documents and a set of patient queries are provided to participants. The queries are built to represent the potential information needs patients may have while reading their medical reports. We start by describing typical types of patients' information needs. We then describe how these queries were manually generated from medical reports for the first two years of the eHealth campaign, and we explore techniques that would enable us to automate the query generation process. This process is particularly challenging, as it requires an understanding of both the patients' information needs and the electronic health records. We describe various approaches to automatically generating potential patient queries from medical reports and outline our future development and evaluation phase.
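    The abstract does not include an implementation, but a minimal sketch of one simple way such queries could be generated automatically (matching disorder terms in a report against a lexicon and slotting them into question templates) might look as follows; the lexicon, templates, and function names below are hypothetical illustrations, not the campaign's actual method.

```python
import re

# Hypothetical question templates reflecting typical patient information
# needs (definition, treatment, severity); not the campaign's templates.
QUERY_TEMPLATES = [
    "what is {term}",
    "treatment options for {term}",
    "is {term} serious",
]

# Toy disorder lexicon; a real system would use a terminology resource
# (e.g. UMLS) or a trained named-entity recogniser instead.
DISORDER_LEXICON = {"atrial fibrillation", "pneumonia", "hypertension"}


def generate_candidate_queries(report_text: str) -> list:
    """Return templated patient queries for disorder terms found in a report."""
    text = report_text.lower()
    queries = []
    for term in sorted(DISORDER_LEXICON):
        if re.search(r"\b" + re.escape(term) + r"\b", text):
            queries.extend(t.format(term=term) for t in QUERY_TEMPLATES)
    return queries


if __name__ == "__main__":
    report = "Patient admitted with pneumonia; known history of hypertension."
    for query in generate_candidate_queries(report):
        print(query)
```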

    Finite Temperature Wave-Function Renormalization, A Comparative Analysis

    Get PDF
    We compare two competing theories regarding finite temperature wave-function corrections for the process $H \to e^+ e^-$ and for $n + \nu \to p + e^-$ and related processes of interest for primordial nucleosynthesis. Although the two methods are distinct (as shown in $H \to e^+ e^-$), they yield the same finite temperature correction for all $n \to p$ and $p \to n$ processes. Both methods yield an increase in the He/H ratio of 0.01% due to finite temperature renormalization rather than a decrease of 0.16% as previously predicted. Comment: 12 pages, 3 figures. LaTeX
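    For context, the "related processes" are the standard weak reactions that interconvert neutrons and protons during primordial nucleosynthesis; the abstract lists only the first explicitly:

    $$ n + \nu_e \leftrightarrow p + e^-, \qquad n + e^+ \leftrightarrow p + \bar{\nu}_e, \qquad n \leftrightarrow p + e^- + \bar{\nu}_e . $$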

    Overview of the ShARe/CLEF eHealth evaluation lab 2013

    Get PDF
    Discharge summaries and other free-text reports in healthcare transfer information between working shifts and geographic locations. Patients are likely to have difficulties in understanding their content because of their medical jargon, non-standard abbreviations, and ward-specific idioms. This paper reports on an evaluation lab that aims to support the continuum of care by developing methods and resources that make clinical reports in English easier to understand for patients and help them find information related to their condition. This ShARe/CLEF eHealth 2013 lab offered student mentoring and shared tasks: identification and normalisation of disorders (1a and 1b) and normalisation of abbreviations and acronyms (2) in clinical reports with respect to terminology standards in healthcare, as well as information retrieval (3) to address questions patients may have when reading clinical reports. The focus on patients' information needs, as opposed to the specialised information needs of physicians and other healthcare workers, was the main feature distinguishing the lab from previous shared tasks. De-identified clinical reports for the three tasks were from US intensive care and originated from the MIMIC II database. Other text documents for Task 3 were from the Internet and originated from the Khresmoi project. Task 1 annotations originated from the ShARe annotations. For Tasks 2 and 3, new annotations, queries, and relevance assessments were created. 64, 56, and 55 people registered their interest in Tasks 1, 2, and 3, respectively. 34 unique teams (3 members per team on average) participated, with 22, 17, 5, and 9 teams in Tasks 1a, 1b, 2, and 3, respectively. The teams were from Australia, China, France, India, Ireland, the Republic of Korea, Spain, the UK, and the USA. Some teams developed and used additional annotations, but this strategy contributed to system performance only in Task 2. The best systems achieved an F1 score of 0.75 in Task 1a, accuracies of 0.59 and 0.72 in Tasks 1b and 2, and a precision at 10 of 0.52 in Task 3. The results demonstrate the substantial community interest and the capability of these systems to make clinical reports easier to understand for patients. The organisers have made the data and tools available for future research and development.
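    As a reference point for the reported scores, the measures above have their standard definitions; the toy computation below (with made-up counts and rankings, not the lab's actual judgements) shows how an F1 score and a precision at 10 are obtained.

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 = harmonic mean of precision and recall."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)


def precision_at_10(ranked_doc_ids: list, relevant_doc_ids: set) -> float:
    """Fraction of the top-10 retrieved documents judged relevant."""
    top10 = ranked_doc_ids[:10]
    return sum(doc in relevant_doc_ids for doc in top10) / 10


# Toy numbers (illustrative only, not the lab's actual results):
print(f1_score(tp=75, fp=25, fn=25))                              # -> 0.75
print(precision_at_10([f"doc{i}" for i in range(10)],
                      {"doc0", "doc2", "doc4", "doc6", "doc8"}))  # -> 0.5
```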

    Magnetic fields in protoplanetary disks

    Full text link
    Magnetic fields likely play a key role in the dynamics and evolution of protoplanetary disks. They have the potential to transport angular momentum efficiently, either through MHD turbulence or via the magnetocentrifugal acceleration of outflows from the disk surface, and magnetically driven mixing has implications for disk chemistry and the evolution of the grain population. However, the weak ionisation of protoplanetary disks means that magnetic fields may not be able to couple effectively to the matter. I present calculations of the ionisation equilibrium and magnetic diffusivity as a function of height from the disk midplane at radii of 1 and 5 AU. Dust grains tend to suppress magnetic coupling by soaking up electrons and ions from the gas phase and reducing the conductivity of the gas by many orders of magnitude. However, once grains have grown to a few microns in size, their effect starts to wane and magnetic fields can begin to couple to the gas even at the disk midplane. Because ions are generally decoupled from the magnetic field by neutral collisions while electrons are not, the Hall effect tends to dominate the diffusion of the magnetic field when it is able to partially couple to the gas. For a standard population of 0.1 micron grains, the active surface layers have a combined column of about 2 g/cm^2 at 1 AU; by the time grains have aggregated to 3 microns, the active surface density is 80 g/cm^2. In the absence of grains, X-rays maintain magnetic coupling to 10% of the disk material at 1 AU (150 g/cm^2). At 5 AU the entire disk thickness becomes active once grains have aggregated to 1 micron in size. Comment: 11 pages, 11 figures, aastex.cls. Accepted for publication in Astrophysics & Space Science. v3 corrects the bibliography.
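    The statement that the Hall effect dominates when ions are collisionally decoupled from the field but electrons are not can be phrased in terms of the usual Hall parameter for a charged species $j$ (standard notation, not quoted from this paper):

    $$ \beta_j = \frac{\omega_j}{\nu_{jn}} = \frac{|Z_j| e B}{m_j c}\,\frac{1}{\nu_{jn}} , $$

    where $\omega_j$ is the gyrofrequency and $\nu_{jn}$ the collision frequency with neutrals. Ohmic (resistive) diffusion dominates when $\beta_e \ll 1$, the Hall term when $\beta_i \ll 1 \ll \beta_e$, and ambipolar diffusion when $\beta_i \gg 1$.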

    The luminosities of protostars in the spitzer c2d and gould belt legacy clouds

    Get PDF
    Journal Article. Published version available online in The Astronomical Journal, Volume 145, Number 4, Article 94; doi: 10.1088/0004-6256/145/4/94. Motivated by the long-standing "luminosity problem" in low-mass star formation, whereby protostars are underluminous compared to theoretical expectations, we identify 230 protostars in 18 molecular clouds observed by two Spitzer Space Telescope Legacy surveys of nearby star-forming regions. We compile complete spectral energy distributions, calculate L_bol for each source, and study the protostellar luminosity distribution. This distribution extends over three orders of magnitude, from 0.01 L_⊙ to 69 L_⊙, and has a mean and median of 4.3 L_⊙ and 1.3 L_⊙, respectively. The distributions are very similar for Class 0 and Class I sources except for an excess of low-luminosity (L_bol ≲ 0.5 L_⊙) Class I sources compared to Class 0. 100 of the 230 protostars (43%) lack any available data in the far-infrared and submillimeter (70 μm < λ < 850 μm) and have L_bol underestimated by factors of 2.5 on average, and up to factors of 8-10 in extreme cases. Correcting these underestimates for each source individually once additional data become available will likely increase both the mean and median of the sample by 35%-40%. We discuss and compare our results to several recent theoretical studies of protostellar luminosities and show that our new results do not invalidate the conclusions of any of these studies. As these studies demonstrate that there is more than one plausible accretion scenario that can match observations, future attention is clearly needed. The better statistics provided by our increased data set should aid such future work. © 2013. The American Astronomical Society. All rights reserved. Funding: National Science Foundation; National Aeronautics and Space Administration; Jet Propulsion Laboratory, California Institute of Technology.
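    The L_bol values above come from integrating each source's spectral energy distribution over frequency, L_bol = 4 pi d^2 * integral of S_nu d nu; the sketch below shows that step on a toy SED (the fluxes and the 250 pc distance are illustrative only, not values from the survey).

```python
import numpy as np

PC_IN_CM = 3.086e18        # 1 parsec in cm
LSUN_IN_ERG_S = 3.828e33   # solar luminosity in erg/s
JY_IN_CGS = 1e-23          # 1 Jy in erg s^-1 cm^-2 Hz^-1


def bolometric_luminosity(wavelengths_um, flux_jy, distance_pc):
    """L_bol = 4 pi d^2 * integral of S_nu over nu, from an observed SED."""
    nu = 2.998e14 / np.asarray(wavelengths_um, dtype=float)  # Hz (c / lambda)
    s_nu = np.asarray(flux_jy, dtype=float) * JY_IN_CGS      # erg s^-1 cm^-2 Hz^-1
    order = np.argsort(nu)                                   # integrate in increasing nu
    integral = np.trapz(s_nu[order], nu[order])              # erg s^-1 cm^-2
    d_cm = distance_pc * PC_IN_CM
    return 4.0 * np.pi * d_cm**2 * integral / LSUN_IN_ERG_S  # in L_sun


# Toy SED from 2.2 to 850 microns (fluxes and distance illustrative only):
wavelengths = [2.2, 24, 70, 160, 850]   # microns
fluxes = [0.01, 0.5, 3.0, 2.0, 0.3]     # Jy
print(bolometric_luminosity(wavelengths, fluxes, distance_pc=250))
```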

    Environment-Induced Decoherence and the Transition From Quantum to Classical

    Get PDF
    We study the dynamics of open quantum systems, paying special attention to those aspects of their evolution which are relevant to the transition from quantum to classical. We begin with a discussion of the conditional dynamics of simple systems. The resulting models are straightforward but suffice to illustrate the basic physical ideas behind quantum measurements and decoherence. To discuss decoherence and environment-induced superselection (einselection) in a more general setting, we sketch perturbative as well as exact derivations of several master equations valid for various systems. Using these equations, we study einselection, employing the general strategy of the predictability sieve. Assumptions that are usually made in discussions of decoherence are critically reexamined along with the "standard lore" to which they lead. Restoration of quantum-classical correspondence in systems that are classically chaotic is discussed. It is shown that the dynamical second law can be traced to the same phenomena that allow for the restoration of the correspondence principle in decohering chaotic systems (where it is otherwise lost on a very short time-scale). Quantum error correction is discussed as an example of an anti-decoherence strategy. Implications of decoherence and einselection for the interpretation of quantum theory are briefly pointed out. Comment: 80 pages, 7 figures included. Lectures given by both authors at the 72nd Les Houches Summer School on "Coherent Matter Waves", July-August 1999.
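    A standard textbook example of a master equation of the kind referred to above is the Lindblad form (given here for orientation only; it is not claimed to be the specific equation derived in these lectures):

    $$ \dot{\rho} = -\frac{i}{\hbar}\,[H, \rho] + \sum_k \left( L_k \rho L_k^\dagger - \tfrac{1}{2}\{ L_k^\dagger L_k, \rho \} \right) , $$

    where the operators $L_k$ encode the coupling of the system to its environment.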
