
    Damage to insula abolishes cognitive distortions during simulated gambling.

    This is the accepted version of an article originally published in PNAS. The version of record is available at http://www.pnas.org/content/early/2014/04/02/1322295111.

    Gambling is a naturalistic example of risky decision-making. During gambling, players typically display an array of cognitive biases that create a distorted expectancy of winning. This study investigated brain regions underpinning gambling-related cognitive distortions, contrasting patients with focal brain lesions to the ventromedial prefrontal cortex (vmPFC), insula, or amygdala ("target patients") against healthy comparison participants and lesion comparison patients (i.e., with lesions that spare the target regions). A slot machine task was used to deliver near-miss outcomes (i.e., nonwins that fall spatially close to a jackpot), and a roulette game was used to examine the gambler's fallacy (color decisions following outcome runs). Comparison groups displayed a heightened motivation to play following near misses (compared with full misses), and manifested a classic gambler's fallacy effect. Both effects were also observed in patients with vmPFC and amygdala damage, but were absent in patients with insula damage. Our findings indicate that the distorted cognitive processing of near-miss outcomes and event sequences may be ordinarily supported by the recruitment of the insula. Interventions to reduce insula reactivity could show promise in the treatment of disordered gambling.

    LC was supported by a grant from the Medical Research Council (UK) (G1100554). BS was supported by a PhD studentship from the Medical Research Council. AB and DT, as well as the lesion patient research, were supported by grants from the National Institutes of Health, namely the National Institute of Neurological Disorders and Stroke [P01 NS19632], and by the National Institute on Drug Abuse [R01 DA023051, R01 DA022549].

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320–1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world.

    Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overvie
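    The step from a single-visit depth of r ~ 24.5 to a coadded map at r ~ 27.5 follows the standard stacking scaling: for N visits of comparable depth, background noise averages down as √N, so the limiting magnitude deepens by 2.5·log10(√N). A minimal sketch of that idealized relation (the function name is ours, not from the LSST pipeline, and real coadds deviate from this due to varying conditions and systematics):

```python
import math

def coadded_depth(m_single, n_visits):
    """Idealized coadded 5-sigma point-source depth.

    Noise in a stack of n_visits equal-depth images averages down as
    sqrt(n_visits), so the limiting magnitude deepens by
    2.5 * log10(sqrt(n_visits)).
    """
    return m_single + 2.5 * math.log10(math.sqrt(n_visits))

# e.g. stacking 100 visits at a single-visit depth of 24.5 mag
print(round(coadded_depth(24.5, 100), 1))  # -> 27.0
```

Under this scaling, a 3-magnitude gain over a 24.5 single visit corresponds to a few hundred visits, consistent in order of magnitude with the ~800 visits quoted across all six bands.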

    The effectiveness, acceptability and cost-effectiveness of psychosocial interventions for maltreated children and adolescents: an evidence synthesis.

    BACKGROUND: Child maltreatment is a substantial social problem that affects large numbers of children and young people in the UK, resulting in a range of significant short- and long-term psychosocial problems. OBJECTIVES: To synthesise evidence of the effectiveness, cost-effectiveness and acceptability of interventions addressing the adverse consequences of child maltreatment. STUDY DESIGN: For effectiveness, we included any controlled study. Other study designs were considered for economic decision modelling. For acceptability, we included any study that asked participants for their views. PARTICIPANTS: Children and young people up to 24 years 11 months, who had experienced maltreatment before the age of 17 years 11 months. INTERVENTIONS: Any psychosocial intervention provided in any setting aiming to address the consequences of maltreatment. MAIN OUTCOME MEASURES: Psychological distress [particularly post-traumatic stress disorder (PTSD), depression and anxiety, and self-harm], behaviour, social functioning, quality of life and acceptability. METHODS: Young Persons and Professional Advisory Groups guided the project, which was conducted in accordance with Cochrane Collaboration and NHS Centre for Reviews and Dissemination guidance. Departures from the published protocol were recorded and explained. Meta-analyses and cost-effectiveness analyses of available data were undertaken where possible. RESULTS: We identified 198 effectiveness studies (including 62 randomised trials); six economic evaluations (five using trial data and one decision-analytic model); and 73 studies investigating treatment acceptability. Pooled data on cognitive-behavioural therapy (CBT) for sexual abuse suggested post-treatment reductions in PTSD [standardised mean difference (SMD) -0.44 (95% CI -4.43 to -1.53)], depression [mean difference -2.83 (95% CI -4.53 to -1.13)] and anxiety [SMD -0.23 (95% CI -0.42 to -0.03)].
No differences were observed for post-treatment sexualised behaviour, externalising behaviour, behaviour management skills of parents, or parental support to the child. Findings from attachment-focused interventions suggested improvements in secure attachment [odds ratio 0.14 (95% CI 0.03 to 0.70)] and reductions in disorganised behaviour [SMD 0.23 (95% CI 0.13 to 0.42)], but no differences in avoidant attachment or externalising behaviour. Few studies addressed the role of caregivers, or the impact of the therapist-child relationship. Economic evaluations suffered methodological limitations and provided conflicting results. As a result, decision-analytic modelling was not possible, but cost-effectiveness analysis using effectiveness data from meta-analyses was undertaken for the most promising intervention: CBT for sexual abuse. Analyses of the cost-effectiveness of CBT were limited by the lack of cost data beyond the cost of CBT itself. CONCLUSIONS: It is not possible to draw firm conclusions about which interventions are effective for children with different maltreatment profiles, which are of no benefit or are harmful, and which factors encourage people to seek therapy, accept the offer of therapy and actively engage with therapy. Little is known about the cost-effectiveness of alternative interventions. LIMITATIONS: Studies were largely conducted outside the UK. The heterogeneity of outcomes and measures seriously limited the ability to conduct meta-analyses. FUTURE WORK: Studies are needed that assess the effectiveness of interventions within a UK context, which address the wider effects of maltreatment, as well as specific clinical outcomes. STUDY REGISTRATION: This study is registered as PROSPERO CRD42013003889. FUNDING: The National Institute for Health Research Health Technology Assessment programme.
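    The pooled effect sizes above are standardised mean differences, i.e. the between-group difference in means divided by a pooled standard deviation. A minimal sketch of that metric (Cohen's d; variable names are illustrative, and this omits the small-sample Hedges' g correction often applied in meta-analyses):

```python
import math

def standardised_mean_difference(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Cohen's d: (treatment mean - control mean) / pooled SD."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# e.g. a treatment group scoring 2 points lower on a symptom scale (SD 2)
print(standardised_mean_difference(10, 2, 50, 12, 2, 50))  # -> -1.0
```

Dividing by the pooled SD is what lets the review combine trials that measured PTSD, depression or anxiety on different instruments.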

    The Long-Baseline Neutrino Experiment: Exploring Fundamental Symmetries of the Universe

    The preponderance of matter over antimatter in the early Universe, the dynamics of the supernova bursts that produced the heavy elements necessary for life and whether protons eventually decay: these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our Universe, its current state and its eventual fate. The Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed plan for a world-class experiment dedicated to addressing these questions. LBNE is conceived around three central components: (1) a new, high-intensity neutrino source generated from a megawatt-class proton accelerator at Fermi National Accelerator Laboratory, (2) a near neutrino detector just downstream of the source, and (3) a massive liquid argon time-projection chamber deployed as a far detector deep underground at the Sanford Underground Research Facility. This facility, located at the site of the former Homestake Mine in Lead, South Dakota, is approximately 1,300 km from the neutrino source at Fermilab, a distance (baseline) that delivers optimal sensitivity to neutrino charge-parity symmetry violation and mass ordering effects. This ambitious yet cost-effective design incorporates scalability and flexibility and can accommodate a variety of upgrades and contributions. With its exceptional combination of experimental configuration, technical capabilities, and potential for transformative discoveries, LBNE promises to be a vital facility for the field of particle physics worldwide, providing physicists from around the globe with opportunities to collaborate in a twenty to thirty year program of exciting science. In this document we provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess.

    Comment: Major update of previous version. This is the reference document for the LBNE science program and current status. Chapters 1, 3, and 9 provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess. 288 pages, 116 figures.

    ACVIM consensus statement: Guidelines for the identification, evaluation, and management of systemic hypertension in dogs and cats

    An update to the 2007 American College of Veterinary Internal Medicine (ACVIM) consensus statement on the identification, evaluation, and management of systemic hypertension in dogs and cats was presented at the 2017 ACVIM Forum in National Harbor, MD. The updated consensus statement is presented here. The consensus statement aims to provide guidance on appropriate diagnosis and treatment of hypertension in dogs and cats.

    The emerging role of magnetic resonance imaging and multidetector computed tomography in the diagnosis of dilated cardiomyopathy

    Magnetic resonance imaging and multidetector computed tomography are new imaging methods that have much to offer clinicians caring for patients with dilated cardiomyopathy. In this article we briefly describe the clinical, pathophysiological and histological aspects of dilated cardiomyopathy. Then we discuss in detail the use of both imaging methods for measurement of chamber size, global and regional function, for myocardial tissue characterisation, including myocardial viability assessment, and determination of arrhythmogenic substrate, and their emerging role in cardiac resynchronisation therapy.

    The Winchcombe meteorite, a unique and pristine witness from the outer solar system.

    Direct links between carbonaceous chondrites and their parent bodies in the solar system are rare. The Winchcombe meteorite is the most accurately recorded carbonaceous chondrite fall. Its pre-atmospheric orbit and cosmic-ray exposure age confirm that it arrived on Earth shortly after ejection from a primitive asteroid. Recovered only hours after falling, the composition of the Winchcombe meteorite is largely unmodified by the terrestrial environment. It contains abundant hydrated silicates formed during fluid-rock reactions, and carbon- and nitrogen-bearing organic matter including soluble protein amino acids. The near-pristine hydrogen isotopic composition of the Winchcombe meteorite is comparable to the terrestrial hydrosphere, providing further evidence that volatile-rich carbonaceous asteroids played an important role in the origin of Earth's water.

    Antimalarial drug targets in Plasmodium falciparum predicted by stage-specific metabolic network analysis


    A systematic review of the psychometric properties of self-report research utilization measures used in healthcare

    BACKGROUND: In healthcare, a gap exists between what is known from research and what is practiced. Understanding this gap depends upon our ability to robustly measure research utilization. OBJECTIVES: The objectives of this systematic review were: to identify self-report measures of research utilization used in healthcare, and to assess the psychometric properties (acceptability, reliability, and validity) of these measures. METHODS: We conducted a systematic review of literature reporting use or development of self-report research utilization measures. Our search included: multiple databases, ancestry searches, and a hand search. Acceptability was assessed by examining time to complete the measure and missing data rates. Our approach to reliability and validity assessment followed that outlined in the Standards for Educational and Psychological Testing. RESULTS: Of 42,770 titles screened, 97 original studies (108 articles) were included in this review. The 97 studies reported on the use or development of 60 unique self-report research utilization measures. Seven of the measures were assessed in more than one study. Study samples consisted of healthcare providers (92 studies) and healthcare decision makers (5 studies). No studies reported data on acceptability of the measures. Reliability was reported in 32 (33%) of the studies, representing 13 of the 60 measures. Internal consistency (Cronbach's alpha) reliability was reported in 31 studies; values exceeded 0.70 in 29 studies. Test-retest reliability was reported in 3 studies with Pearson's r coefficients > 0.80. No validity information was reported for 12 of the 60 measures. The remaining 48 measures were classified into a three-level validity hierarchy according to the number of validity sources reported in 50% or more of the studies using the measure. Level one measures (n = 6) reported evidence from any three (out of four possible) Standards validity sources (which, in the case of single-item measures, was all applicable validity sources). Level two measures (n = 16) had evidence from any two validity sources, and level three measures (n = 26) from only one validity source. CONCLUSIONS: This review reveals significant underdevelopment in the measurement of research utilization. Substantial methodological advances with respect to construct clarity, use of research utilization and related theory, use of measurement theory, and psychometric assessment are required. Also needed are improved reporting practices and the adoption of a more contemporary view of validity (i.e., the Standards) in future research utilization measurement studies.
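    The internal-consistency statistic reported above, Cronbach's alpha, has a simple closed form. A minimal sketch of the standard formula (rows are respondents, columns are scale items; an illustration of the statistic itself, not code from any of the reviewed studies):

```python
import numpy as np

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                                # number of items
    item_variances = items.var(axis=0, ddof=1).sum()  # per-item sample variances
    total_variance = items.sum(axis=1).var(ddof=1)    # variance of summed scores
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# three respondents answering two perfectly consistent items
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # -> 1.0
```

Values above the conventional 0.70 threshold cited in the review indicate that the items covary strongly enough to be summed into a single scale score.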