254 research outputs found

    Understanding jumping to conclusions in patients with persecutory delusions: working memory and intolerance of uncertainty

    Background. Persecutory delusions are a key psychotic experience. A reasoning style known as ‘jumping to conclusions’ (JTC) – limited information gathering before reaching certainty in decision making – has been identified as a contributory factor in the occurrence of delusions. The cognitive processes that underpin JTC need to be determined in order to develop effective interventions for delusions. In the current study two alternative perspectives were tested: that JTC partially results from impairment in information-processing capabilities, and that JTC is a motivated strategy to avoid uncertainty. Method. A group of 123 patients with persistent persecutory delusions completed assessments of JTC (the 60:40 beads task), IQ, working memory, intolerance of uncertainty, and psychiatric symptoms. Patients showing JTC were compared with patients not showing JTC. Results. A total of 30 (24%) patients with delusions showed JTC. There were no differences in overall psychopathology between patients who did and did not jump to conclusions. Patients who jumped to conclusions had poorer working memory performance, lower IQ, lower intolerance of uncertainty and lower levels of worry. Working memory and worry independently predicted the presence of JTC. Conclusions. Hasty decision making in patients with delusions may partly arise from difficulties in keeping information in mind. Interventions for JTC are likely to benefit from addressing working memory performance, while in vivo techniques for patients with delusions will benefit from limiting the demands on working memory. The study provides little evidence for a contribution to JTC from top-down motivational beliefs about uncertainty.
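    The 60:40 beads task mentioned above can be made concrete with a Bayesian update: beads are drawn from one of two hidden jars with complementary 60:40 colour ratios, and the posterior certainty after each draw shows how quickly a few draws accumulate evidence. A minimal sketch (the function name and parameters are illustrative, not from the study):

```python
# Posterior probability that the beads come from the mostly-majority-colour
# jar (60:40) rather than its mirror image (40:60), given a sequence of
# draws. Illustrative sketch of the Bayesian update behind the beads task.

def beads_posterior(draws, p=0.6, prior=0.5):
    """draws: sequence of 1 (majority colour) / 0 (minority colour)."""
    like_a = like_b = 1.0
    for d in draws:
        like_a *= p if d else (1 - p)       # jar A: 60:40 ratio
        like_b *= (1 - p) if d else p       # jar B: 40:60 ratio
    return prior * like_a / (prior * like_a + (1 - prior) * like_b)

# Two majority-colour beads in a row already give ~69% certainty:
print(round(beads_posterior([1, 1]), 2))  # 0.69
```

    With equal priors, just two same-colour draws imply roughly 69% certainty, which illustrates why very limited information gathering can already feel decisive.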

    The role of interpersonal contingency and self-focused attention in the development of trust in clinical paranoia: a virtual reality study

    Aims: Research into interpersonal processes involved in paranoia remains limited. This study aimed to assess the feasibility of using interactive virtual reality in a clinical sample with psychosis and persecutory delusions. The study aimed to replicate an experiment which found that healthy individuals high in paranoia showed a hypersensitivity to contingent behaviour which increased their perceived trust towards the avatar. A further aim was to investigate the impact of self-focused attention on the perception of interpersonal contingency and trust. Method: Eighteen male participants with psychosis and paranoia completed the virtual reality exercise. Participants entered a virtual flat and interviewed a virtual flatmate whose non-verbal responses were either high or low in contingency in relation to the participant. Trust towards the avatar was measured by self-report and by behaviour towards the virtual flatmate, operationalised as interpersonal distance. Focus of attention, affect and immersion in the virtual reality scenario were assessed. Results: Overall, participants enjoyed and were immersed in the interactive virtual reality environment. Interpersonal distance was predicted by severity of persecutory delusions and negative affect. Exploratory graphic analyses showed no evidence of hypersensitivity to avatar contingency or of a moderating effect of self-focused attention. Persecutory delusion severity was associated with other-focused attention, which, in turn, unexpectedly predicted higher self-focused attention. Conclusions: Interactive virtual reality is a safe and feasible research tool for individuals with clinical paranoia. Severity of persecutory delusions, rather than the environmental manipulation, predicted trust. However, the lack of power in the current study prevents clear conclusions about the impact of interpersonal contingency on trust in clinical paranoia from being drawn. Replication is required with a larger sample and a more ambiguous scenario.

    Commissioning and Calibrating the CMS Silicon Strip Tracker

    The data acquisition system for the CMS Silicon Strip Tracker (SST) is based around a custom analogue front-end ASIC, an analogue optical link system and an off-detector VME board that performs digitization, zero-suppression and data formatting. A complex procedure is required to optimally configure, calibrate and synchronize the 10⁷ channels of the SST readout system. We present an overview of this procedure, which will be used to commission and calibrate the SST during the integration, start-up and operational phases of the experiment. Recent experiences from the CMS Magnet Test Cosmic Challenge and system tests at the Tracker Integration Facility are also reported.
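    The zero-suppression step mentioned above can be illustrated with a toy sketch: subtract a per-strip pedestal from the raw ADC count and keep only strips whose signal clears a noise threshold, emitting sparse (index, value) pairs. This is a simplified illustration under assumed names and thresholds, not the actual CMS FED algorithm (which also handles common-mode noise and cluster-level thresholds):

```python
# Toy zero-suppression: pedestal subtraction followed by a per-strip
# n-sigma noise cut. Only strips above threshold are kept, so the
# output is a sparse list of (strip index, signal) pairs.

def zero_suppress(adc, pedestals, noise, n_sigma=5.0):
    out = []
    for i, (raw, ped, sig) in enumerate(zip(adc, pedestals, noise)):
        signal = raw - ped                 # remove the strip's baseline
        if signal > n_sigma * sig:         # keep only significant hits
            out.append((i, signal))
    return out

# Strip 1 carries a 150-count signal over pedestal; the rest are noise:
print(zero_suppress([102, 250, 99, 101], [100] * 4, [2.0] * 4))  # [(1, 150)]
```

    The payoff is data reduction: with ~10⁷ channels and few genuine hits per event, storing only above-threshold strips shrinks the event size by orders of magnitude.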

    Monitoring the CMS strip tracker readout system

    The CMS Silicon Strip Tracker at the LHC comprises a sensitive area of approximately 200 m² and 10 million readout channels. Its data acquisition system is based around a custom analogue front-end chip. Both the control and the readout of the front-end electronics are performed by off-detector VME boards in the counting room, which digitise the raw event data and perform zero-suppression and formatting. The data acquisition system uses the CMS online software framework to configure, control and monitor the hardware components and steer the data acquisition. The first data analysis is performed online within the official CMS reconstruction framework, which provides many services, such as distributed analysis, access to geometry and conditions data, and a Data Quality Monitoring tool based on the online physics reconstruction. The data acquisition monitoring of the Strip Tracker uses both the data acquisition and the reconstruction software frameworks in order to provide real-time feedback to shifters on the operational state of the detector, archive data for later analysis, and possibly trigger automatic recovery actions in case of errors. Here we review the proposed architecture of the monitoring system and describe its software components, which are already in place, the various monitoring streams available, and our experiences of operating and monitoring a large-scale system.

    A temperate former West Antarctic ice sheet suggested by an extensive zone of bed channels

    Several recent studies predict that the West Antarctic Ice Sheet will become increasingly unstable under warmer conditions. Insights on such change can be assisted through investigations of the subglacial landscape, which contains imprints of former ice-sheet behavior. Here, we present radio-echo sounding data and satellite imagery revealing a series of ancient large sub-parallel subglacial bed channels preserved in the region between the Möller and Foundation Ice Streams, West Antarctica. We suggest that these newly recognized channels were formed by significant meltwater routed along the ice-sheet bed. The volume of water required is likely substantial and can most easily be explained by water generated at the ice surface. The Greenland Ice Sheet today exemplifies how significant seasonal surface melt can be transferred to the bed via englacial routing. For West Antarctica, the Pliocene (2.6–5.3 Ma) represents the most recent sustained period when temperatures could have been high enough to generate surface melt comparable to that of present-day Greenland. We propose, therefore, that a temperate ice sheet covered this location during Pliocene warm periods.

    Data acquisition software for the CMS strip tracker

    The CMS silicon strip tracker, providing a sensitive area of approximately 200 m² and comprising 10 million readout channels, has recently been completed at the tracker integration facility at CERN. The strip tracker community is currently working to develop and integrate the online and offline software frameworks, known as XDAQ and CMSSW respectively, for the purposes of data acquisition and detector commissioning and monitoring. Recent developments have seen the integration of many new services and tools within the online data acquisition system, such as event building, online distributed analysis, an online monitoring framework, and data storage management. We review the various software components that comprise the strip tracker data acquisition system and the software architectures used for stand-alone and global data-taking modes. Our experiences in commissioning and operating one of the largest silicon micro-strip tracking systems ever built are also reviewed.

    Slepian functions and their use in signal estimation and spectral analysis

    It is a well-known fact that mathematical functions that are timelimited (or spacelimited) cannot be simultaneously bandlimited (in frequency). Yet the finite precision of measurement and computation unavoidably bandlimits our observation and modeling of scientific data, and we often only have access to, or are only interested in, a study area that is temporally or spatially bounded. In the geosciences we may be interested in spectrally modeling a time series defined only on a certain interval, or we may want to characterize a specific geographical area observed using an effectively bandlimited measurement device. It is clear that analyzing and representing scientific data of this kind will be facilitated if a basis of functions can be found that are "spatiospectrally" concentrated, i.e. "localized" in both domains at the same time. Here, we give a theoretical overview of one particular approach to this "concentration" problem, as originally proposed for time series by Slepian and coworkers in the 1960s. We show how this framework leads to practical algorithms and statistically performant methods for the analysis of signals and their power spectra in one and two dimensions, and on the surface of a sphere. Comment: Submitted to the Handbook of Geomathematics, edited by Willi Freeden, Zuhair M. Nashed and Thomas Sonar, and to be published by Springer Verlag.

    Scalar and vector Slepian functions, spherical signal estimation and spectral analysis

    It is a well-known fact that mathematical functions that are timelimited (or spacelimited) cannot be simultaneously bandlimited (in frequency). Yet the finite precision of measurement and computation unavoidably bandlimits our observation and modeling of scientific data, and we often only have access to, or are only interested in, a study area that is temporally or spatially bounded. In the geosciences we may be interested in spectrally modeling a time series defined only on a certain interval, or we may want to characterize a specific geographical area observed using an effectively bandlimited measurement device. It is clear that analyzing and representing scientific data of this kind will be facilitated if a basis of functions can be found that are "spatiospectrally" concentrated, i.e. "localized" in both domains at the same time. Here, we give a theoretical overview of one particular approach to this "concentration" problem, as originally proposed for time series by Slepian and coworkers in the 1960s. We show how this framework leads to practical algorithms and statistically performant methods for the analysis of signals and their power spectra in one and two dimensions, and, particularly for applications in the geosciences, for scalar and vectorial signals defined on the surface of a unit sphere. Comment: Submitted to the 2nd Edition of the Handbook of Geomathematics, edited by Willi Freeden, Zuhair M. Nashed and Thomas Sonar, and to be published by Springer Verlag. This is a slightly modified but expanded version of the paper arxiv:0909.5368 that appeared in the 1st Edition of the Handbook, when it was called: Slepian functions and their use in signal estimation and spectral analysis.
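    The discrete version of the concentration problem described in these two abstracts has a compact numerical form: the maximally time-concentrated bandlimited sequences (discrete prolate spheroidal, or Slepian, sequences) are the eigenvectors of a sinc kernel matrix, and the eigenvalues are their concentration ratios. A minimal sketch with illustrative parameters (function name and values are mine, not the authors'):

```python
import numpy as np

def slepian_tapers(N, W, K):
    """First K length-N Slepian sequences for bandwidth |f| <= W,
    with their concentration ratios (eigenvalues of the sinc kernel)."""
    m = np.arange(N)
    diff = m[:, None] - m[None, :]
    # Kernel D[m, n] = sin(2*pi*W*(m-n)) / (pi*(m-n)), diagonal 2W:
    # its eigenvalues give the fraction of energy inside the band.
    with np.errstate(divide="ignore", invalid="ignore"):
        A = np.sin(2 * np.pi * W * diff) / (np.pi * diff)
    A[diff == 0] = 2 * W
    vals, vecs = np.linalg.eigh(A)
    order = np.argsort(vals)[::-1][:K]   # most concentrated first
    return vals[order], vecs[:, order].T

# With N=64 and W=0.08, the Shannon number 2NW ~ 10, so the first
# few eigenvalues are essentially 1 (nearly perfectly bandlimited):
lam, tapers = slepian_tapers(N=64, W=0.08, K=4)
print(np.round(lam, 6))
```

    The eigenvalue spectrum makes the "concentration" idea quantitative: roughly 2NW tapers are well concentrated, after which the eigenvalues plunge toward zero.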

    The cost effectiveness of REACH-HF and home-based cardiac rehabilitation compared with the usual medical care for heart failure with reduced ejection fraction: a decision model-based analysis

    This is the final version. Available from Sage Publications via the DOI in this record. Background: The REACH-HF (Rehabilitation EnAblement in CHronic Heart Failure) trial found that the REACH-HF home-based cardiac rehabilitation intervention resulted in a clinically meaningful improvement in disease-specific health-related quality of life in patients with heart failure with reduced ejection fraction (HFrEF). The aim of this study was to assess the long-term cost-effectiveness of adding the REACH-HF intervention or home-based cardiac rehabilitation to usual care, compared with usual care alone, in patients with HFrEF. Design and methods: A Markov model was developed using a patient lifetime horizon, integrating evidence from the REACH-HF trial, a systematic review/meta-analysis of randomised trials, estimates of mortality and hospital admission, and UK costs at 2015/2016 prices. Taking a UK National Health and Personal Social Services perspective, we report the incremental cost per quality-adjusted life-year (QALY) gained, assessing uncertainty using probabilistic and deterministic sensitivity analyses. Results: In the base case analysis, the REACH-HF intervention was associated with a per-patient mean QALY gain of 0.23 and an increased mean cost of £400 compared with usual care, resulting in a cost per QALY gained of £1720. Probabilistic sensitivity analysis indicated a 78% probability that REACH-HF is cost-effective versus usual care at a threshold of £20,000 per QALY gained. Results were similar for home-based cardiac rehabilitation versus usual care. Sensitivity analyses indicate the findings to be robust to changes in model assumptions and parameters. Conclusions: Our cost-utility analyses indicate that the REACH-HF intervention and home-based cardiac rehabilitation programmes are likely to be cost-effective treatment options versus usual care alone in patients with HFrEF. National Institute for Health Research (NIHR).
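    The headline figure in this abstract is an incremental cost-effectiveness ratio (ICER): incremental cost divided by incremental QALYs. A quick check from the rounded figures quoted above (the paper's £1720 comes from unrounded model outputs, so the rounded inputs reproduce it only approximately):

```python
# ICER = incremental cost / incremental QALYs, from the abstract's
# rounded base-case values. Values are per patient, in GBP.

delta_cost = 400.0    # mean incremental cost vs usual care
delta_qaly = 0.23     # mean QALY gain vs usual care

icer = delta_cost / delta_qaly
print(round(icer))            # 1739, close to the reported 1720
print(icer < 20000)           # True: below the GBP 20,000/QALY threshold
```

    Being far below the conventional £20,000-per-QALY willingness-to-pay threshold is what drives the 78% probability of cost-effectiveness reported in the probabilistic sensitivity analysis.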