
    Do patients with schizophrenia exhibit aberrant salience?

    BACKGROUND: It has been suggested that some psychotic symptoms reflect ‘aberrant salience’, related to dysfunctional reward learning. To test this hypothesis we investigated whether patients with schizophrenia showed impaired learning of task-relevant stimulus-reinforcement associations in the presence of distracting task-irrelevant cues. METHODS: We tested 20 medicated patients with schizophrenia and 17 controls on a reaction time game, the Salience Attribution Test. In this game, participants made a speeded response to earn money in the presence of conditioned stimuli (CSs). Each CS comprised two visual dimensions, colour and form. Probability of reinforcement varied over one of these dimensions (task-relevant), but not the other (task-irrelevant). Measures of adaptive and aberrant motivational salience were calculated on the basis of latency and subjective reinforcement probability rating differences over the task-relevant and task-irrelevant dimensions, respectively. RESULTS: Participants rated reinforcement significantly more likely and responded significantly faster on high-probability reinforced relative to low-probability reinforced trials, representing adaptive motivational salience. Patients exhibited reduced adaptive salience relative to controls, but the two groups did not differ in terms of aberrant salience. Patients with delusions exhibited significantly greater aberrant salience than those without delusions, and aberrant salience also correlated with negative symptoms. In the controls, aberrant salience correlated significantly with ‘introvertive anhedonia’ schizotypy. CONCLUSIONS: These data support the hypothesis that aberrant salience is related to the presence of delusions in medicated patients with schizophrenia, but are also suggestive of a link with negative symptoms. The relationship between aberrant salience and psychotic symptoms warrants further investigation in unmedicated patients.
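
The adaptive and aberrant salience measures described above are simple contrasts: differences in reaction time and reinforcement-probability ratings across the task-relevant and task-irrelevant stimulus dimensions, respectively. The Python sketch below illustrates one way such scores could be computed from trial-level data; the column names, level labels ("high"/"low", "A"/"B") and toy numbers are assumptions for illustration, not the authors' analysis code.

```python
# Illustrative sketch only (not the authors' analysis code): adaptive and
# aberrant salience scores from hypothetical trial-level Salience Attribution
# Test data. Column names, level labels and the data layout are assumptions.
import pandas as pd

def salience_scores(trials: pd.DataFrame) -> dict:
    """Adaptive salience: RT speeding / rating increase for the high- vs the
    low-probability level of the task-relevant dimension.
    Aberrant salience: absolute RT / rating difference between levels of the
    task-irrelevant dimension, which does not predict reinforcement."""
    rel = trials.groupby("relevant_level")[["rt", "rating"]].mean()
    irr = trials.groupby("irrelevant_level")[["rt", "rating"]].mean()
    return {
        "adaptive_rt": rel.loc["low", "rt"] - rel.loc["high", "rt"],
        "adaptive_rating": rel.loc["high", "rating"] - rel.loc["low", "rating"],
        "aberrant_rt": abs(irr.loc["A", "rt"] - irr.loc["B", "rt"]),
        "aberrant_rating": abs(irr.loc["A", "rating"] - irr.loc["B", "rating"]),
    }

# Toy example: RTs in ms, subjective reinforcement-probability ratings in %
trials = pd.DataFrame({
    "relevant_level":   ["high", "high", "low", "low"] * 2,
    "irrelevant_level": ["A", "B"] * 4,
    "rt":     [420, 430, 470, 480, 425, 435, 465, 475],
    "rating": [70, 68, 35, 33, 72, 66, 37, 31],
})
print(salience_scores(trials))
```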

    Population-based screen-detected type 2 diabetes mellitus is associated with less need for insulin therapy after 10 years

    INTRODUCTION: With increased duration of type 2 diabetes, most people have a growing need for glucose-lowering medication and may eventually require insulin. There is presumptive evidence that early detection (eg, by population-based screening) and treatment of hyperglycemia postpone the indication for insulin treatment. A treatment legacy effect of population-based screening for type 2 diabetes of about 3 years has been estimated. Therefore, we aimed to compare insulin prescription and glycemic control in people with screen-detected type 2 diabetes after 10 years with data from people diagnosed with type 2 diabetes 7 years (treatment legacy effect) and 10 years earlier during care-as-usual. RESEARCH DESIGN AND METHODS: Three cohorts were compared: one screen-detected cohort with 10 years' diabetes duration (Anglo-Danish-Dutch study of Intensive Treatment in People with Screen-Detected Diabetes in Primary care (ADDITION-NL): n=391) and two care-as-usual cohorts, one with 7 years' diabetes duration (Groningen Initiative to Analyze Type 2 Diabetes Treatment (GIANTT) and Zwolle Outpatient Diabetes project Integrating Available Care (ZODIAC): n=4473) and one with 10 years' diabetes duration (GIANTT and ZODIAC: n=2660). Insulin prescription (primary outcome) and hemoglobin A1c (HbA1c) of people with a known diabetes duration of 7 or 10 years at the index year 2014 were compared using regression analyses. RESULTS: Insulin was prescribed in 10.5% (10-year screen detection), 14.7% (7-year care-as-usual) and 19.0% (10-year care-as-usual) of people. People in the 7-year and 10-year care-as-usual groups had 1.5-fold (95% CI 1.0 to 2.1) and 1.8-fold (95% CI 1.3 to 2.7) higher adjusted odds of being prescribed insulin than those after screen detection. Lower HbA1c values were found 10 years after screen detection (mean 50.1 mmol/mol (6.7%)) than 7 and 10 years after care-as-usual (51.8 mmol/mol (6.9%) and 52.8 mmol/mol (7.0%); adjusted mean differences 1.6 mmol/mol (95% CI 0.6 to 2.6), i.e. 0.1% (95% CI 0.1 to 0.2), and 1.8 mmol/mol (95% CI 0.7 to 2.9), i.e. 0.2% (95% CI 0.1 to 0.3), respectively). CONCLUSION: Population-based screen-detected type 2 diabetes is associated with less need for insulin after 10 years compared with diabetes diagnosed during care-as-usual. Glycemic control was better after screen detection but on average good in all groups.
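
The adjusted odds ratios reported above come from regression analyses comparing the cohorts. As a rough illustration of that kind of comparison (not the study's actual model or covariate set), the sketch below fits a logistic regression of insulin prescription on a cohort indicator plus a covariate and exponentiates the coefficients to obtain adjusted odds ratios with 95% CIs; the synthetic data and variable names are assumptions.

```python
# Hedged illustration of an "adjusted odds ratio" comparison, on synthetic data.
# The cohort labels, covariates and event rates are assumptions, not study data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "cohort": rng.choice(["screen", "usual7", "usual10"], size=n),
    "age": rng.normal(65, 8, size=n),
})
# Synthetic outcome: care-as-usual cohorts get insulin somewhat more often
base_rate = {"screen": 0.10, "usual7": 0.15, "usual10": 0.19}
df["insulin"] = rng.binomial(1, df["cohort"].map(base_rate))

# Logistic regression with the screen-detected cohort as reference group
model = smf.logit(
    "insulin ~ C(cohort, Treatment(reference='screen')) + age", data=df
).fit(disp=False)

print(np.exp(model.params))      # adjusted odds ratios vs screen detection
print(np.exp(model.conf_int()))  # 95% confidence intervals
```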

    The CARE accelerator R&D programme in Europe

    CARE, an ambitious and coordinated programme of accelerator research and development oriented towards high energy physics projects, was launched in January 2004 by the main European laboratories and the European Commission. The project aims at improving existing infrastructures dedicated to future projects such as linear colliders, upgrades of hadron colliders and high intensity proton drivers. We describe the CARE R&D plans, mostly devoted to advancing the performance of superconducting technology, both in the fields of RF cavities for electron or proton acceleration and of high field magnets, as well as to developing high intensity electron and proton injectors. We highlight some results and progress obtained so far.

    Use of mental health services among disaster survivors: predisposing factors

    BACKGROUND: Given the high prevalence of mental health problems after disasters, it is important to study health services utilization. This study examines predictors of mental health services (MHS) utilization among survivors of a man-made disaster in the Netherlands (May 2000). METHODS: Electronic records of survivors (n = 339; aged 18 years and older) registered with a mental health service were linked with general practice based electronic medical records (EMRs) of survivors and data obtained in surveys. EMR data were available from 16 months pre-disaster until 3 years post-disaster. Symptoms and diagnoses in the EMRs were coded according to the International Classification of Primary Care (ICPC). Surveys were carried out 2–3 weeks and 18 months post-disaster, and included validated questionnaires on psychological distress, post-traumatic stress reactions and social functioning. Demographic and disaster-related variables were available. Predisposing factors for MHS utilization 0–18 months and 18–36 months post-disaster were examined using multiple logistic regression models. RESULTS: In multiple logistic models adjusting for demographic and disaster-related variables, MHS utilization was predicted by demographic variables (young age, immigrant status, public health insurance, unemployment), disaster-related exposure (relocation and injuries), self-reported psychological problems and pre- and post-disaster physician-diagnosed health problems (chronic diseases, musculoskeletal problems). After controlling for all health variables, disaster intrusions and avoidance reactions (OR 2.86; CI 1.48–5.53), hostility (OR 2.04; CI 1.28–3.25), pre-disaster chronic diseases (OR 1.82; CI 1.25–2.65), injuries as a result of the disaster (OR 1.80; CI 1.13–2.86), social functioning problems (OR 1.61; CI 1.05–2.44) and younger age (OR 0.98; CI 0.96–0.99) predicted MHS utilization within 18 months post-disaster. Furthermore, disaster intrusions and avoidance reactions (OR 2.29; CI 1.04–5.07) and hostility (OR 3.77; CI 1.51–9.40) predicted MHS utilization 18–36 months post-disaster. CONCLUSION: Several demographic and disaster-related variables and self-reported and physician-diagnosed health problems predicted post-disaster MHS use. The most important predictors of post-disaster MHS utilization were disaster intrusions and avoidance reactions and symptoms of hostility (which can be identified as symptoms of PTSD), and pre-disaster chronic diseases.
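
The odds ratios and confidence intervals quoted above follow directly from the fitted logistic-regression coefficients: OR = exp(β) and 95% CI = exp(β ± 1.96·SE). A minimal worked example is below; the coefficient and standard error are hypothetical values chosen only so that they approximately reproduce the reported OR of 2.86 (CI 1.48–5.53), not numbers taken from the study.

```python
# Converting a log-odds coefficient and its standard error into the kind of
# OR (95% CI) reported above. b and se are hypothetical illustrative values.
import math

b, se = 1.05, 0.336                       # hypothetical log-odds estimate and SE
or_point = math.exp(b)                    # odds ratio
ci_low = math.exp(b - 1.96 * se)          # lower 95% confidence limit
ci_high = math.exp(b + 1.96 * se)         # upper 95% confidence limit
print(f"OR = {or_point:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```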

    Cerebral activations related to ballistic, stepwise interrupted and gradually modulated movements in Parkinson patients

    Patients with Parkinson's disease (PD) experience impaired initiation and inhibition of movements, such as difficulty starting or stopping walking. At the single-joint level this is accompanied by reduced inhibition of antagonist muscle activity. While normal basal ganglia (BG) contributions to motor control include selecting appropriate muscles by inhibiting others, it is unclear how PD-related changes in BG function cause impaired movement initiation and inhibition at the single-joint level. To further elucidate these changes we studied 4 right-hand movement tasks with fMRI, dissociating activations related to abrupt movement initiation, inhibition and gradual movement modulation. Initiation and inhibition were inferred from ballistic and stepwise interrupted movement, respectively, while smooth wrist circumduction enabled the assessment of gradually modulated movement. Task-related activations were compared between PD patients (N = 12) and healthy subjects (N = 18). In healthy subjects, movement initiation was characterized by antero-ventral striatum, substantia nigra (SN) and premotor activations, while inhibition was dominated by subthalamic nucleus (STN) and pallidal activations, in line with the known role of these areas in simple movement. Gradual movement mainly involved the antero-dorsal putamen and pallidum. Compared to healthy subjects, patients showed reduced striatal/SN and increased pallidal activation for initiation, whereas for inhibition STN activation was reduced and striatal-thalamo-cortical activation increased. For gradual movement, patients showed reduced pallidal and increased thalamo-cortical activation. We conclude that PD-related changes during movement initiation fit the (rather static) model of alterations in direct and indirect BG pathways. Reduced STN activation and regionally increased cortical activation in PD during inhibition and gradual movement modulation are better explained by a dynamic model that also takes into account enhanced responsiveness to external stimuli in this disease and the effects of hyper-fluctuating cortical inputs to the striatum, and the STN in particular.

    Mobile element insertions in rare diseases: a comparative benchmark and reanalysis of 60,000 exome samples

    Mobile element insertions (MEIs) are a known cause of genetic disease but have been underexplored due to technical limitations of genetic testing methods. Various bioinformatic tools have been developed to identify MEIs in Next Generation Sequencing data. However, most tools have been developed specifically for genome sequencing (GS) data rather than exome sequencing (ES) data, which remains more widely used for routine diagnostic testing. In this study, we benchmarked six MEI detection tools (ERVcaller, MELT, Mobster, SCRAMble, TEMP2 and xTea) on ES data and on GS data from publicly available genomic samples (HG002, NA12878). For all tools we evaluated the sensitivity and precision of different filtering strategies. Results showed substantial differences in tool performance between ES and GS data. MELT performed best with ES data, and its combination with SCRAMble substantially increased the detection rate of MEIs. By applying both tools to 10,890 ES samples from Solve-RD and 52,624 samples from Radboudumc, we were able to diagnose 10 patients who had until now remained undiagnosed by conventional ES analysis. Our study shows that MELT and SCRAMble can be used reliably to identify clinically relevant MEIs in ES data. This may lead to an additional diagnosis for 1 in 3000 to 4000 patients in routine clinical ES.
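
The abstract notes that combining MELT with SCRAMble substantially increased the MEI detection rate. The generic sketch below shows one way two per-sample call sets could be merged into a union call set by collapsing nearby calls of the same element family; the tab-separated input format, file names and 50 bp merge window are illustrative assumptions, not the pipeline used in the paper.

```python
# Generic sketch of combining MEI calls from two tools (e.g. MELT and SCRAMble)
# into a union call set. Input format (TSV: chrom, pos, element family) and the
# 50 bp merge window are illustrative assumptions, not the paper's pipeline.
import csv
from collections import defaultdict

WINDOW = 50  # bp; same-family calls closer than this are treated as one event

def load_calls(path: str, tool: str):
    """Yield (chrom, pos, family, tool) tuples from a hypothetical TSV file."""
    with open(path, newline="") as fh:
        for chrom, pos, family in csv.reader(fh, delimiter="\t"):
            yield chrom, int(pos), family, tool

def merge_callsets(*callsets):
    by_key = defaultdict(list)                 # (chrom, family) -> [(pos, tool)]
    for calls in callsets:
        for chrom, pos, family, tool in calls:
            by_key[(chrom, family)].append((pos, tool))

    merged = []
    for (chrom, family), hits in by_key.items():
        hits.sort()
        cluster = [hits[0]]
        for pos, tool in hits[1:]:
            if pos - cluster[-1][0] <= WINDOW:   # extend the current cluster
                cluster.append((pos, tool))
            else:                                # close it and start a new one
                merged.append((chrom, cluster[0][0], family,
                               sorted({t for _, t in cluster})))
                cluster = [(pos, tool)]
        merged.append((chrom, cluster[0][0], family,
                       sorted({t for _, t in cluster})))
    return merged

# Usage with hypothetical per-sample output files:
# merged = merge_callsets(load_calls("melt.tsv", "MELT"),
#                         load_calls("scramble.tsv", "SCRAMble"))
```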

    Amplified melt and flow of the Greenland ice sheet driven by late-summer cyclonic rainfall

    Intense rainfall events significantly affect Alpine and Alaskan glaciers through enhanced melting, ice-flow acceleration and subglacial sediment erosion, yet their impact on the Greenland ice sheet has not been assessed. Here we present measurements of ice velocity, subglacial water pressure and meteorological variables from the western margin of the Greenland ice sheet during a week of warm, wet cyclonic weather in late August and early September 2011. We find that extreme surface runoff from melt and rainfall led to a widespread acceleration in ice flow that extended 140 km into the ice-sheet interior. We suggest that the late-season timing was critical in promoting rapid runoff across an extensive bare ice surface that overwhelmed a subglacial hydrological system in transition to a less-efficient winter mode. Reanalysis data reveal that similar cyclonic weather conditions prevailed across southern and western Greenland during this time, and we observe a corresponding ice-flow response at all land- and marine-terminating glaciers in these regions for which data are available. Given that the advection of warm, moist air masses and rainfall over Greenland is expected to become more frequent in the coming decades, our findings portend a previously unforeseen vulnerability of the Greenland ice sheet to climate change.