
    The Effects of a Single Transcranial Direct Current Stimulation Session on Impulsivity and Risk Among a Sample of Adult Recreational Cannabis Users

    Individuals with substance use disorders exhibit risk-taking behaviors, potentially leading to negative consequences and difficulty maintaining recovery. Non-invasive brain stimulation techniques such as transcranial direct current stimulation (tDCS) have yielded mixed effects on risk-taking among healthy controls. Given the importance of risk-taking behaviors among substance-using samples, this study aimed to examine the effects of tDCS on risk-taking among a sample of adults using cannabis. Using a double-blind design, 27 cannabis users [M(SD) age = 32.48 (1.99), 41% female] were randomized to receive one session of active or sham tDCS over the bilateral dorsolateral prefrontal cortex (dlPFC). Stimulation parameters closely followed prior studies, with anodal stimulation of the right dlPFC and cathodal stimulation of the left dlPFC. Risk-taking, assessed via a modified Cambridge Gambling Task, was measured before and during tDCS. Delay and probability discounting tasks were administered before and after stimulation. No significant effects of stimulation on risk-taking behavior were found. However, participants chose the less risky option on ∼86% of trials before stimulation, which potentially contributed to ceiling effects. These results contradict one prior study showing increased risk-taking among cannabis users following tDCS. There was a significant increase in delay discounting of a $1000 delayed reward during stimulation for the sham group only, but no significant effects for probability discounting. The current study adds to a conflicting and inconclusive literature on tDCS and cognition among substance-using samples. In conclusion, the results suggest that a single session of dlPFC tDCS using an established stimulation protocol is ineffective for altering risk-taking, although ceiling effects at baseline may also have prevented behavior change following tDCS.
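    The delay discounting measure referenced above is commonly quantified by fitting a hyperbolic model, V = A / (1 + kD), to a participant's indifference points. The sketch below shows that general fitting step; the abstract does not specify the study's analysis, so the delays, indifference points, and use of scipy's curve_fit are illustrative assumptions rather than details from the paper.

```python
# Minimal sketch of fitting Mazur's hyperbolic delay-discounting model
# V = A / (1 + k*D) to indifference points. The task values and responses
# below are hypothetical placeholders, not data from the study.
import numpy as np
from scipy.optimize import curve_fit

A = 1000.0                                    # delayed reward amount ($1000, as in the abstract)
delays = np.array([1, 7, 30, 90, 365])        # delays in days (hypothetical task values)
indiff = np.array([950, 800, 600, 400, 250])  # indifference points (hypothetical responses)

def hyperbolic(D, k):
    """Subjective value of the delayed reward at delay D."""
    return A / (1.0 + k * D)

(k_hat,), _ = curve_fit(hyperbolic, delays, indiff, p0=[0.01])
print(f"Estimated discounting rate k = {k_hat:.4f} per day")
# A larger k means steeper discounting, i.e. a stronger preference for immediate rewards.
```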

    Delay Discounting as a Transdiagnostic Process in Psychiatric Disorders: A Meta-analysis

    Importance: Delay discounting is a behavioral economic index of impulsive preferences for smaller-immediate or larger-delayed rewards that is argued to be a transdiagnostic process across health conditions. Studies suggest some psychiatric disorders are associated with differences in discounting compared with controls, but null findings have also been reported. Objective: To conduct a meta-analysis of the published literature on delay discounting in people with psychiatric disorders. Data Sources: PubMed, MEDLINE, PsycInfo, Embase, and Web of Science databases were searched through December 10, 2018. The psychiatric keywords used were based on DSM-IV or DSM-5 diagnostic categories. Collected data were analyzed from December 10, 2018, through June 1, 2019. Study Selection: Following a preregistered Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol, 2 independent raters reviewed titles, abstracts, and full-text articles. English-language articles comparing monetary delay discounting between participants with psychiatric disorders and controls were included. Data Extraction and Synthesis: Hedges g effect sizes were computed and random-effects models were used for all analyses. Heterogeneity statistics, one-study-removed analyses, and publication bias indices were also examined. Main Outcomes and Measures: Categorical comparisons of delay discounting between a psychiatric group and a control group. Results: The sample included 57 effect sizes from 43 studies across 8 diagnostic categories. Significantly steeper discounting for individuals with a psychiatric disorder compared with controls was observed for major depressive disorder (Hedges g = 0.37; P = .002; k = 7), schizophrenia (Hedges g = 0.46; P = .004; k = 12), borderline personality disorder (Hedges g = 0.60; P < .001; k = 8), bipolar disorder (Hedges g = 0.68; P < .001; k = 4), bulimia nervosa (Hedges g = 0.41; P = .001; k = 4), and binge-eating disorder (Hedges g = 0.34; P = .001; k = 7). In contrast, anorexia nervosa exhibited statistically significantly shallower discounting (Hedges g = −0.30; P < .001; k = 10). Modest evidence of publication bias was indicated by a statistically significant Egger test for schizophrenia and at the aggregate level across studies. Conclusions and Relevance: Results of this study appear to provide empirical support for delay discounting as a transdiagnostic process across most of the psychiatric disorders examined; the literature search also revealed limited studies in some disorders, notably posttraumatic stress disorder, which is a priority area for research.
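    For readers unfamiliar with the effect-size machinery mentioned above, the following sketch shows how a Hedges g value and a DerSimonian–Laird random-effects pooled estimate are typically computed. It is a minimal illustration of the general method, not the authors' analysis code, and the inputs (group means, SDs, and sample sizes) are hypothetical.

```python
# Hedges g for a single study and a DerSimonian-Laird random-effects pooled
# estimate; a generic sketch with made-up inputs, not the study's data.
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Bias-corrected standardized mean difference (group 1 minus group 2) and its variance."""
    s_pooled = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    j = 1.0 - 3.0 / (4.0 * (n1 + n2 - 2) - 1.0)    # small-sample correction factor
    g = j * d
    var_g = (n1 + n2) / (n1 * n2) + g**2 / (2.0 * (n1 + n2))
    return g, var_g

def random_effects(gs, vs):
    """DerSimonian-Laird random-effects pooled effect and its standard error."""
    gs, vs = np.asarray(gs), np.asarray(vs)
    w = 1.0 / vs
    fixed = np.sum(w * gs) / np.sum(w)
    q = np.sum(w * (gs - fixed)**2)                 # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(gs) - 1)) / c)        # between-study variance estimate
    w_star = 1.0 / (vs + tau2)
    pooled = np.sum(w_star * gs) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se

# Example: pool three hypothetical study effects.
effects = [hedges_g(1.2, 0.9, 30, 0.8, 0.8, 32),
           hedges_g(1.5, 1.1, 25, 1.0, 1.0, 27),
           hedges_g(1.1, 0.7, 40, 0.9, 0.8, 38)]
pooled, se = random_effects([g for g, _ in effects], [v for _, v in effects])
print(f"Pooled Hedges g = {pooled:.2f} (SE {se:.2f})")
```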

    Factors affecting continuation of clean intermittent catheterisation in people with multiple sclerosis: results of the COSMOS mixed-methods study

    Background: Clean intermittent catheterisation (CIC) is often recommended for people with multiple sclerosis (MS). Objective: To determine the variables that affect continuation or discontinuation of the use of CIC. Methods: A three-part mixed-methods study (prospective longitudinal cohort (n = 56), longitudinal qualitative interviews (n = 20) and retrospective survey (n = 456)) was undertaken to identify the variables that influenced CIC continuation/discontinuation. The potential explanatory variables investigated in each study were the individual’s age, gender, social circumstances, number of urinary tract infections, bladder symptoms, presence of co-morbidity, stage of multiple sclerosis and years since diagnosis, as well as CIC teaching method and intensity. Results: For some people with MS, the prospect of undertaking CIC is difficult to accept, and it may take time before they begin using CIC. Ongoing support from clinicians, support at home and a perceived improvement in symptoms such as nocturia were positive predictors of continuation. In many cases, the development of a urinary tract infection during the early stages of CIC use had a significant detrimental impact on continuation. Conclusion: Procedures for reducing the incidence of urinary tract infection during the learning period (i.e. when being taught and becoming competent) should be considered, as well as the development of a tool to aid identification of a person’s readiness to try CIC.

    Dissolved organic carbon transformations and microbial community response to variations in recharge waters in a shallow carbonate aquifer

    In carbonate aquifers, dissolved organic carbon from the surface drives heterotrophic metabolism, generating CO2 in the subsurface. Although this has been proposed as a mechanism for enhanced dissolution at the water table, respiration rates and their controlling factors have not been widely evaluated. This study investigates the composition and concentration of dissolved organic carbon (DOC) reaching the water table from different recharge pathways on a subtropical carbonate island, using a combination of DOC concentration measurements and fluorescence and absorption characterisation. In addition, direct measurements of the microbial response to the differing water types were made. Interactions of rainfall with the vegetation, via throughfall and stemflow, increase the concentration of DOC. The highest DOC concentrations are associated with stemflow, overland recharge and dissolution hole waters, which interact with bark lignin and exhibit strong terrestrially derived characteristics. The groundwater samples exhibit the lowest concentrations of DOC and are composed of refractory humic-like organic matter. The heterotrophic response appears to be controlled by the concentration of DOC in the sample. The terrestrially sourced humic-like matter in the stemflow and dissolution hole samples was highly labile, thus increasing the amount of biologically produced CO2 available to drive dissolution. Based on the calculated respiration rates, microbial activity could enhance carbonate dissolution, increasing porosity generation by a maximum of 1% kyr−1 at the top of the freshwater lens.
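    The link between respiration and porosity generation quoted above follows from the dissolution stoichiometry CO2 + H2O + CaCO3 → Ca2+ + 2HCO3−, in which each mole of respired CO2 can dissolve at most one mole of calcite. The back-of-envelope sketch below converts a respiration rate into a porosity-generation rate under that assumption; the respiration rate used is a hypothetical placeholder, not a value reported in the study.

```python
# Back-of-envelope conversion from a heterotrophic respiration rate to a rate of
# porosity generation, assuming 1 mol of respired CO2 dissolves at most 1 mol of
# calcite. The respiration rate is a hypothetical placeholder.
MOLAR_MASS_CALCITE = 100.09e-3                        # kg/mol
DENSITY_CALCITE = 2710.0                              # kg/m^3
MOLAR_VOLUME = MOLAR_MASS_CALCITE / DENSITY_CALCITE   # m^3 of calcite per mol

respiration_rate = 0.1   # mol CO2 per m^3 of aquifer per year (hypothetical)
years = 1000.0           # express the result per kyr

# Fractional rock volume removed per kyr, i.e. the porosity increase. This is an
# upper bound, since not all respired CO2 will be consumed by dissolution.
porosity_gain = respiration_rate * years * MOLAR_VOLUME
print(f"Porosity generation: {porosity_gain * 100:.2f}% per kyr")
```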

    Antarctic ice sheet sensitivity to atmospheric CO2 variations in the early to mid-Miocene

    Geological records from the Antarctic margin offer direct evidence of environmental variability at high southern latitudes and provide insight regarding ice sheet sensitivity to past climate change. The early to mid-Miocene (23-14 Mya) is a compelling interval to study as global temperatures and atmospheric CO2 concentrations were similar to those projected for coming centuries. Importantly, this time interval includes the Miocene Climatic Optimum, a period of global warmth during which average surface temperatures were 3-4 °C higher than today. Miocene sediments in the ANDRILL-2A drill core from the Western Ross Sea, Antarctica, indicate that the Antarctic ice sheet (AIS) was highly variable through this key time interval. A multiproxy dataset derived from the core identifies four distinct environmental motifs based on changes in sedimentary facies, fossil assemblages, geochemistry, and paleotemperature. Four major disconformities in the drill core coincide with regional seismic discontinuities and reflect transient expansion of grounded ice across the Ross Sea. They correlate with major positive shifts in benthic oxygen isotope records and generally coincide with intervals when atmospheric CO2 concentrations were at or below preindustrial levels (∼280 ppm). Five intervals reflect ice sheet minima and air temperatures warm enough for substantial ice mass loss during episodes of high (∼500 ppm) atmospheric CO2. These new drill core data and associated ice sheet modeling experiments indicate that polar climate and the AIS were highly sensitive to relatively small changes in atmospheric CO2 during the early to mid-Miocene.

    Increasing frailty is associated with higher prevalence and reduced recognition of delirium in older hospitalised inpatients: results of a multi-centre study

    Purpose: Delirium is a neuropsychiatric disorder delineated by an acute change in cognition, attention, and consciousness. It is common, particularly in older adults, but poorly recognised. Frailty is the accumulation of deficits conferring an increased risk of adverse outcomes. We set out to determine how the severity of frailty, as measured using the Clinical Frailty Scale (CFS), affected delirium rates and recognition in hospitalised older people in the United Kingdom. Methods: Adults over 65 years were included in an observational multi-centre audit across UK hospitals, comprising two prospective rounds and one retrospective note review. CFS score, delirium status, and 30-day outcomes were recorded. Results: The overall prevalence of delirium was 16.3% (n = 483). Patients with delirium were more frail than patients without delirium (median CFS 6 vs 4). The risk of delirium was greater with increasing frailty [OR 2.9 (1.8–4.6) in CFS 4 vs 1–3; OR 12.4 (6.2–24.5) in CFS 8 vs 1–3]. Higher CFS was associated with reduced recognition of delirium (OR 0.7 (0.3–1.9) in CFS 4 compared with 0.2 (0.1–0.7) in CFS 8). These risks were independent of age and dementia. Conclusion: We have demonstrated an incremental increase in the risk of delirium with increasing frailty. This has important clinical implications, suggesting that frailty may provide a more nuanced measure of vulnerability to delirium and poor outcomes. However, the most frail patients are the least likely to have their delirium diagnosed, and there remains a significant lack of research into the underlying pathophysiology of both of these common geriatric syndromes.
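    Adjusted odds ratios of the kind reported above (delirium risk by CFS category, independent of age and dementia) are typically obtained from a logistic regression in which the exponentiated coefficients are read as ORs. The sketch below illustrates that general approach on simulated data; the data, effect sizes, and use of statsmodels are assumptions for illustration only, not the study's analysis.

```python
# Illustrative logistic regression: delirium ~ frailty category + age + dementia,
# with exponentiated coefficients interpreted as adjusted odds ratios.
# The simulated data are purely hypothetical, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
cfs = rng.integers(1, 9, size=n)          # Clinical Frailty Scale score, 1-8
age = rng.normal(80, 7, size=n)
dementia = rng.binomial(1, 0.25, size=n)

# Simulate delirium risk rising with frailty, age, and dementia.
logit_p = -6.0 + 0.4 * cfs + 0.03 * age + 0.8 * dementia
delirium = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

df = pd.DataFrame({
    "delirium": delirium,
    "cfs_group": pd.cut(cfs, bins=[0, 3, 4, 5, 6, 7, 8],
                        labels=["1-3", "4", "5", "6", "7", "8"]),
    "age": age,
    "dementia": dementia,
})

model = smf.logit("delirium ~ C(cfs_group) + age + dementia", data=df).fit(disp=False)
print(np.exp(model.params))   # adjusted odds ratios relative to the CFS 1-3 reference group
```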

    Food intake is influenced by sensory sensitivity

    Wide availability of highly palatable foods is often blamed for the rising incidence of obesity. As palatability is largely determined by the sensory properties of food, this study investigated how sensitivity to these properties affects how much we eat. Forty females were classified as either high or low in sensory sensitivity based on their scores on a self-report measure of sensory processing (the Adult Sensory Profile), and their intake of chocolate during the experiment was measured. Food intake was significantly higher for high-sensitivity compared to low-sensitivity individuals. Furthermore, individual scores of sensory sensitivity were positively correlated with self-reported emotional eating. These data could indicate that individuals who are more sensitive to the sensory properties of food have a heightened perception of palatability, which, in turn, leads to greater food intake.

    Effects of intentional movement preparation on response times to symbolic and imitative cues

    Speeded responses to an external cue are slower when the cue interrupts preparation to perform the same or a similar action in a self-paced manner. To explore the mechanism underlying this ‘cost of intention’, we examined whether the size of the cost is influenced by the nature of the external cue. Specifically, we assessed whether the cost of intention differs for movements made in response to an imitative cue (an on-screen hand movement) compared to those made in response to a symbolic cue. Consistent with previous reports, externally cued responses were significantly slower on trials where participants were preparing to perform an internally driven movement later in the trial. Also as predicted, simple response times to the imitative cue were faster than those made to the symbolic cue. Critically, the cost of intention was similar for each cue type, suggesting that preparing an intentional action influenced responses to the symbolic and imitative cues to a similar degree. These findings suggest that the nature of the external cue does not influence the response time delay associated with concurrent intentional preparation. Together with previous findings, the results of the current study shed further light on the potential mechanisms underlying the cost of intention.