
    Risk assessment and decision making about in-labour transfer from rural maternity care: a social judgment and signal detection analysis

    Background: The importance of respecting women's wishes to give birth close to their local community is supported by policy in many developed countries. However, persistent concerns about the quality and safety of maternity care in rural communities have been expressed. Safe childbirth in rural communities depends on good risk assessment and decision making as to whether and when the transfer of a woman in labour to an obstetric-led unit is required. This is a difficult decision. Wide variation in transfer rates between rural maternity units has been reported, suggesting that different decision-making criteria may be involved; furthermore, rural midwives and family doctors report feeling isolated in making these decisions and that staff in urban centres do not understand the difficulties they face. In order to develop more evidence-based decision-making strategies, a greater understanding of the way in which maternity care providers currently make decisions is required. This study aimed to examine how midwives working in urban and rural settings and obstetricians make intrapartum transfer decisions, and to describe sources of variation in decision making. Methods: The study was conducted in three stages. 1. Twenty midwives and four obstetricians described factors influencing transfer decisions. 2. Vignettes depicting an intrapartum scenario were developed based on stage one data. 3. The vignettes were presented to 122 midwives and 12 obstetricians, who were asked to assess the level of risk in each case and decide whether or not to transfer. Social judgment analysis was used to identify the factors and factor weights used in assessment. Signal detection analysis was used to identify participants' ability to distinguish high- and low-risk cases and their personal decision thresholds. Results: When reviewing the same case information in vignettes, midwives in different settings and obstetricians made very similar risk assessments. Despite this, a wide range of transfer decisions was still made, suggesting that the main source of variation in decision making and transfer rates lies not in the assessment but in the personal decision thresholds of clinicians. Conclusions: Currently, health care practice focuses on supporting or improving decision making through skills training and clinical guidelines. However, these methods alone are unlikely to be effective in improving the consistency of decision making.
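
    The abstract above applies signal detection analysis to separate how well clinicians discriminate high- from low-risk cases (sensitivity) from where they set their transfer threshold (criterion). The following is a minimal illustrative sketch of that idea in Python, not the study's code: the response counts, the log-linear correction and the example numbers are assumptions made up for illustration.

    # Illustrative signal detection sketch (assumed data, not the study's analysis).
    # Each clinician's vignette responses are summarised as hits, misses, false
    # alarms and correct rejections against the vignettes' intended risk level.
    from scipy.stats import norm

    def sdt_indices(hits, misses, false_alarms, correct_rejections):
        """Return (d_prime, criterion), using a log-linear correction for extreme rates."""
        hit_rate = (hits + 0.5) / (hits + misses + 1)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
        d_prime = z_hit - z_fa              # ability to tell high-risk from low-risk cases
        criterion = -0.5 * (z_hit + z_fa)   # personal decision threshold for transferring
        return d_prime, criterion

    # Hypothetical clinician: transferred 18 of 20 high-risk and 6 of 20 low-risk vignettes.
    print(sdt_indices(hits=18, misses=2, false_alarms=6, correct_rejections=14))

    On this reading, two clinicians can have near-identical d_prime values (similar risk assessments) yet very different criterion values, which matches the pattern of varying transfer decisions described in the abstract.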

    The Vigilance Decrement in Executive Function Is Attenuated When Individual Chronotypes Perform at Their Optimal Time of Day

    Time of day modulates our cognitive functions, especially those related to executive control, such as the ability to inhibit inappropriate responses. However, the impact of individual differences in time-of-day preferences (i.e. morning vs. evening chronotype) had not been considered by most studies. It was also unclear whether the vigilance decrement (impaired performance with time on task) depends on both time of day and chronotype. In this study, morning-type and evening-type participants performed a task measuring vigilance and response inhibition (the Sustained Attention to Response Task, SART) in morning and evening sessions. The results showed that the vigilance decrement in inhibitory performance was accentuated at non-optimal as compared to optimal times of day. In the morning-type group, inhibition performance decreased linearly with time on task only in the evening session, whereas in the morning session it remained more accurate and stable over time. In contrast, inhibition performance in the evening-type group showed a linear vigilance decrement in the morning session, whereas in the evening session the vigilance decrement was attenuated, following a quadratic trend. Our findings imply that the negative effects of time on task on executive control can be prevented by scheduling cognitive tasks at the optimal time of day according to individuals' specific circadian profiles. Therefore, time-of-day and chronotype influences should be considered in research and clinical studies, as well as in real-world situations demanding executive control for response inhibition. This work was supported by the Spanish Ministerio de Ciencia e Innovación (Ramón y Cajal programme: RYC-2007-00296 and PLAN NACIONAL de I+D+i: PSI2010-15399) and Junta de Andalucía (SEJ-3054).
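
    The abstract contrasts a linear vigilance decrement with an attenuated, quadratic trend. A minimal sketch of how such trends can be compared is shown below; the block-wise accuracy values, function names and use of simple polynomial fits are assumptions for illustration only, not the study's analysis.

    # Illustrative trend comparison (hypothetical data, not the study's code):
    # fit linear and quadratic trends to no-go accuracy across time-on-task blocks.
    import numpy as np

    def fit_trend(block_accuracy, degree):
        """Least-squares polynomial fit of accuracy against block number; returns (coeffs, R^2)."""
        blocks = np.arange(1, len(block_accuracy) + 1)
        coeffs = np.polyfit(blocks, block_accuracy, degree)
        predicted = np.polyval(coeffs, blocks)
        ss_res = np.sum((block_accuracy - predicted) ** 2)
        ss_tot = np.sum((block_accuracy - block_accuracy.mean()) ** 2)
        return coeffs, 1 - ss_res / ss_tot

    # Hypothetical no-go accuracy per block for a session at a non-optimal time of day.
    accuracy = np.array([0.92, 0.88, 0.85, 0.80, 0.76, 0.73])
    print("linear fit:", fit_trend(accuracy, 1))
    print("quadratic fit:", fit_trend(accuracy, 2))

    In this framing, a steep linear component corresponds to the decrement reported at non-optimal times of day, while a flatter curve with a quadratic component corresponds to the attenuated decrement at the optimal time.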

    Probe-caught spontaneous and deliberate mind wandering in relation to self-reported inattentive, hyperactive and impulsive traits in adults.

    Research has revealed a positive relationship between types of mind wandering and ADHD at clinical and subclinical levels. However, this work did not consider the relationship between mind wandering and the core symptoms of ADHD: inattention, hyperactivity and impulsivity. Given that the DSM-5 attributes mind wandering to inattention only, and that only inattention is thought to result from impairment of the executive function linked to mind wandering, the present research sought to examine this relationship in 80 undiagnosed adults. Using both standard and easy versions of the Sustained Attention to Response Task (SART), we measured both spontaneous and deliberate mind wandering. We found that spontaneous mind wandering was related to self-reported inattentive traits when the task was cognitively more challenging (the standard SART). However, hyperactive and impulsive traits were related to spontaneous mind wandering independent of task difficulty. The results suggest that inattentive traits are not uniquely related to mind wandering; indeed, adults with hyperactive/impulsive traits were more likely to experience mind wandering, suggesting that mind wandering might not be a useful diagnostic criterion for inattention.
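
    As a rough illustration of the kind of relationship the abstract tests, the sketch below correlates probe-caught spontaneous mind-wandering rates with trait scores. The data frame layout, column names and numbers are invented for illustration; they are not the study's data or analysis code.

    # Illustrative correlation sketch (hypothetical data, not the study's analysis).
    import pandas as pd
    from scipy.stats import pearsonr

    # One row per participant: proportion of probes answered "spontaneous mind
    # wandering" during the standard SART, plus self-reported trait scores.
    df = pd.DataFrame({
        "spont_mw_standard": [0.30, 0.15, 0.45, 0.22, 0.38, 0.10, 0.28],
        "inattention":       [18, 10, 24, 14, 21, 8, 16],
        "hyper_impulsive":   [12, 9, 20, 11, 17, 7, 13],
    })

    for trait in ["inattention", "hyper_impulsive"]:
        r, p = pearsonr(df["spont_mw_standard"], df[trait])
        print(f"spontaneous MW (standard SART) vs {trait}: r = {r:.2f}, p = {p:.3f}")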

    Neural Correlates of Ongoing Conscious Experience: Both Task-Unrelatedness and Stimulus-Independence Are Related to Default Network Activity

    The default mode network (DMN) is a set of brain regions that consistently shows higher activity at rest compared to tasks requiring sustained focused attention toward externally presented stimuli. The cognitive processes that the DMN possibly underlies remain a matter of debate. It has alternately been proposed that DMN activity reflects unfocused attention toward external stimuli or the occurrence of internally generated thoughts. The present study aimed at clarifying this issue by investigating the neural correlates of the various kinds of conscious experiences that can occur during task performance. Four classes of conscious experiences (i.e., being fully focused on the task, distractions by irrelevant sensations/perceptions, interfering thoughts related to the appraisal of the task, and mind-wandering) that varied along two dimensions (“task-relatedness” and “stimulus-dependency”) were sampled using thought-probes while the participants performed a go/no-go task. Analyses performed on the intervals preceding each probe according to the reported subjective experience revealed that both dimensions are relevant to explain activity in several regions of the DMN, namely the medial prefrontal cortex, posterior cingulate cortex/precuneus, and posterior inferior parietal lobe. Notably, an additive effect of the two dimensions was demonstrated for midline DMN regions. On the other hand, lateral temporal regions (also part of the DMN) were specifically related to stimulus-independent reports. These results suggest that midline DMN regions underlie cognitive processes that are active during both internal thoughts and external unfocused attention. They also strengthen the view that the DMN can be fractionated into different subcomponents and reveal the necessity to consider both the stimulus-dependent and the task-related dimensions of conscious experiences when studying the possible functional roles of the DMN.
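
    The abstract's key analytic move is coding four classes of reports along two binary dimensions and testing an additive (main-effects) model. The sketch below illustrates that coding and model on made-up numbers; the class labels, the mapping onto the dimensions, the signal values and the use of ordinary least squares are all illustrative assumptions, not the fMRI pipeline used in the study.

    # Illustrative additive-model sketch (hypothetical data, not the study's code).
    import pandas as pd
    import statsmodels.formula.api as smf

    # Assumed mapping of the four report classes onto the two binary dimensions.
    classes = {
        "on_task":              dict(task_related=1, stimulus_dependent=1),
        "external_distraction": dict(task_related=0, stimulus_dependent=1),
        "task_appraisal":       dict(task_related=1, stimulus_dependent=0),
        "mind_wandering":       dict(task_related=0, stimulus_dependent=0),
    }

    # Hypothetical per-interval signal estimates from one midline DMN region.
    rows = [
        ("on_task", -0.4), ("on_task", -0.3),
        ("external_distraction", 0.1), ("external_distraction", 0.2),
        ("task_appraisal", 0.2), ("task_appraisal", 0.1),
        ("mind_wandering", 0.6), ("mind_wandering", 0.5),
    ]
    df = pd.DataFrame([{**classes[name], "signal": signal} for name, signal in rows])

    # Additive model: task-relatedness and stimulus-dependency contribute independently.
    model = smf.ols("signal ~ task_related + stimulus_dependent", data=df).fit()
    print(model.params)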

    The evolution of lung cancer and impact of subclonal selection in TRACERx

    Lung cancer is the leading cause of cancer-associated mortality worldwide [1]. Here we analysed 1,644 tumour regions sampled at surgery or during follow-up from the first 421 patients with non-small cell lung cancer prospectively enrolled into the TRACERx study. This project aims to decipher lung cancer evolution and address the primary study endpoint: determining the relationship between intratumour heterogeneity and clinical outcome. In lung adenocarcinoma, mutations in 22 out of 40 common cancer genes were under significant subclonal selection, including classical tumour initiators such as TP53 and KRAS. We defined evolutionary dependencies between drivers, mutational processes and whole genome doubling (WGD) events. Despite patients having a history of smoking, 8% of lung adenocarcinomas lacked evidence of tobacco-induced mutagenesis. These tumours also had similar detection rates for EGFR mutations and for RET, ROS1, ALK and MET oncogenic isoforms compared with tumours in never-smokers, which suggests that they have a similar aetiology and pathogenesis. Large subclonal expansions were associated with positive subclonal selection. Patients with tumours harbouring recent subclonal expansions, on the terminus of a phylogenetic branch, had significantly shorter disease-free survival. Subclonal WGD was detected in 19% of tumours, and 10% of tumours harboured multiple subclonal WGDs in parallel. Subclonal, but not truncal, WGD was associated with shorter disease-free survival. Copy number heterogeneity was associated with extrathoracic relapse within 1 year after surgery. These data demonstrate the importance of clonal expansion, WGD and copy number instability in determining the timing and patterns of relapse in non-small cell lung cancer and provide a comprehensive clinical cancer evolutionary data resource.
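
    One of the abstract's survival findings is that subclonal, but not truncal, WGD is associated with shorter disease-free survival. The sketch below shows how such a group comparison could be set up with Kaplan-Meier curves and a log-rank test using the lifelines library; the patient table, column names and numbers are invented for illustration, and the study's actual statistical models are not reproduced here.

    # Illustrative disease-free survival comparison (hypothetical data,
    # not the TRACERx analysis code).
    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # One row per patient: follow-up in months, relapse indicator, WGD timing group.
    df = pd.DataFrame({
        "dfs_months": [12, 30, 8, 26, 40, 15, 9, 35, 22, 48],
        "relapsed":   [1, 0, 1, 1, 0, 1, 1, 0, 1, 0],
        "wgd":        ["subclonal", "truncal", "subclonal", "truncal", "truncal",
                       "subclonal", "subclonal", "truncal", "subclonal", "truncal"],
    })

    subclonal = df[df["wgd"] == "subclonal"]
    truncal = df[df["wgd"] == "truncal"]

    km = KaplanMeierFitter()
    for label, group in (("subclonal WGD", subclonal), ("truncal WGD", truncal)):
        km.fit(group["dfs_months"], event_observed=group["relapsed"], label=label)
        print(label, "median DFS:", km.median_survival_time_)

    result = logrank_test(subclonal["dfs_months"], truncal["dfs_months"],
                          event_observed_A=subclonal["relapsed"],
                          event_observed_B=truncal["relapsed"])
    print("log-rank p-value:", result.p_value)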

    Strategies for the Use of Fallback Foods in Apes

    Researchers have suggested that fallback foods (FBFs) shape primate food processing adaptations, whereas preferred foods drive harvesting adaptations, and that the dietary importance of FBFs is central in determining the expression of a variety of traits. We examine these hypotheses in extant apes. First, we compare the nature and dietary importance of FBFs used by each taxon. FBF importance appears greatest in gorillas, followed by chimpanzees and siamangs, and least in orangutans and gibbons (bonobos are difficult to place). Next, we compare 20 traits among taxa to assess whether the relative expression of traits expected for consumption of FBFs matches their observed dietary importance. Trait manifestation generally conforms to predictions based on the dietary importance of FBFs. However, some departures from predictions exist, particularly for orangutans, which express relatively more of the food harvesting and processing traits predicted for consuming large amounts of FBFs than expected from the observed dietary importance. This is probably due to the chemical, mechanical, and phenological properties of the apes' main FBFs, in particular the high importance of figs for chimpanzees and hylobatids, compared to the use of bark and leaves (plus figs in at least some Sumatran populations) by orangutans. This may have permitted more specialized harvesting adaptations in chimpanzees and hylobatids, and required enhanced processing adaptations in orangutans. Possible intercontinental differences in the availability and quality of preferred foods and FBFs may also be important. Our analysis supports previous hypotheses suggesting a critical influence of the dietary importance and quality of FBFs on ape ecology and, consequently, evolution.