142 research outputs found
Modular morals: Mapping the organization of the moral brain
Is morality the product of multiple domain-specific psychological mechanisms, or one domain-general mechanism? Previous research suggests that morality consists of a range of solutions to the problems of cooperation recurrent in human social life. This theory of ‘morality as cooperation’ suggests that there are (at least) seven specific moral domains: family values, group loyalty, reciprocity, heroism, deference, fairness and property rights. However, it is unclear how these types of morality are implemented at the neuroanatomical level. The possibilities are that morality is (1) the product of multiple distinct domain-specific adaptations for cooperation, (2) the product of a single domain-general adaptation which learns a range of moral rules, or (3) the product of some combination of domain-specific and domain-general adaptations. To distinguish between these possibilities, we first conducted an anatomical likelihood estimation meta-analysis of previous studies investigating the relationship between these seven moral domains and neuroanatomy. This meta-analysis provided evidence for a combination of specific and general adaptations. Next, we investigated the relationship between the seven types of morality – as measured by the Morality as Cooperation Questionnaire (Relevance) – and grey matter volume in a large neuroimaging (n = 607) sample. No associations between moral values and grey matter volume survived whole-brain exploratory testing. We conclude that whatever combination of mechanisms is responsible for morality, either it is not neuroanatomically localised, or else its localisation is not manifested in grey matter volume. Future research should employ phylogenetically informed a priori predictions, as well as alternative measures of morality and of brain function.
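The whole-brain exploratory test described above is, in essence, a mass-univariate analysis: one association test per voxel, corrected for the number of voxels. Below is a minimal sketch of that logic in Python, using random placeholder data and illustrative dimensions; the study itself presumably used a dedicated voxel-based morphometry pipeline and its own correction procedure, so nothing here should be read as its actual method.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n_subjects, n_voxels = 607, 20_000              # placeholder dimensions
gm = rng.normal(size=(n_subjects, n_voxels))    # stand-in for grey matter volume maps
score = rng.normal(size=n_subjects)             # stand-in for one moral-domain score

# Voxel-wise Pearson correlation between the questionnaire score and GM volume.
score_z = (score - score.mean()) / score.std()
gm_z = (gm - gm.mean(axis=0)) / gm.std(axis=0)
r = score_z @ gm_z / n_subjects

# Convert to two-sided p-values and correct across all voxels
# (FDR here, as one example of a whole-brain correction).
t = r * np.sqrt((n_subjects - 2) / (1 - r**2))
p = 2 * stats.t.sf(np.abs(t), df=n_subjects - 2)
survived = multipletests(p, alpha=0.05, method="fdr_bh")[0]
print(f"{survived.sum()} voxels survive whole-brain correction")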
Patterns of stressful life events and polygenic scores for five mental disorders and neuroticism among adults with depression
The dominant (‘general’) version of the diathesis-stress theory of depression views stressors and genetic vulnerability as independent risks. In the Australian Genetics of Depression Study (N = 14,146; 75% female), we tested whether polygenic scores (PGS) for major depression, bipolar disorder, schizophrenia, anxiety, ADHD, and neuroticism were associated with reported exposure to 32 childhood, past-year, lifetime, and accumulated stressful life events (SLEs). In false discovery rate-corrected models, the clearest PGS-SLE relationships were for the ADHD- and depression-PGSs, and to a lesser extent, the anxiety- and schizophrenia-PGSs. We describe the associations for childhood and accumulated SLEs, and the 2–3 strongest past-year/lifetime SLE associations. Higher ADHD-PGS was associated with all childhood SLEs (emotional abuse, emotional neglect, physical neglect; ORs = 1.09–1.14; p’s < 1.3 × 10−5), more accumulated SLEs, and reported exposure to sudden violent death (OR = 1.23; p = 3.6 × 10−5), legal troubles (OR = 1.15; p = 0.003), and sudden accidental death (OR = 1.14; p = 0.006). Higher depression-PGS was associated with all childhood SLEs (ORs = 1.07–1.12; p’s < 0.013), more accumulated SLEs, and severe human suffering (OR = 1.17; p = 0.003), assault with a weapon (OR = 1.12; p = 0.003), and living in unpleasant surroundings (OR = 1.11; p = 0.001). Higher anxiety-PGS was associated with childhood emotional abuse (OR = 1.08; p = 1.6 × 10−4), more accumulated SLEs, and serious accident (OR = 1.23; p = 0.004), physical assault (OR = 1.08; p = 2.2 × 10−4), and transportation accident (OR = 1.07; p = 0.001). Higher schizophrenia-PGS was associated with all childhood SLEs (ORs = 1.12–1.19; p’s < 9.3 × 10−8), more accumulated SLEs, and severe human suffering (OR = 1.16; p = 0.003). Higher neuroticism-PGS was associated with living in unpleasant surroundings (OR = 1.09; p = 0.007) and major financial troubles (OR = 1.06; p = 0.014). A reversed pattern was seen for the bipolar-PGS, with lower odds of reported physical assault (OR = 0.95; p = 0.014), major financial troubles (OR = 0.93; p = 0.004), and living in unpleasant surroundings (OR = 0.92; p = 0.007). Genetic risk for several mental disorders influences reported exposure to SLEs among adults with moderately severe, recurrent depression. Our findings emphasise that stressors and diatheses are inter-dependent and challenge diagnosis and subtyping (e.g., reactive/endogenous) based on life events.
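As a rough illustration of the analysis pattern reported above (one logistic regression per PGS-SLE pair, followed by false discovery rate correction across tests), here is a sketch in Python. The column names, covariates, and input file are assumptions for illustration, not the study's actual variables or model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

df = pd.read_csv("agds_phenotypes.csv")  # hypothetical input file

pgs_cols = ["pgs_depression", "pgs_bipolar", "pgs_scz",
            "pgs_anxiety", "pgs_adhd", "pgs_neuroticism"]
sle_cols = ["emotional_abuse", "physical_assault", "financial_troubles"]  # illustrative subset

rows = []
for sle in sle_cols:
    for pgs in pgs_cols:
        # Logistic regression of reported SLE exposure on one standardized PGS,
        # adjusting for age, sex and ancestry principal components.
        fit = smf.logit(f"{sle} ~ {pgs} + age + sex + pc1 + pc2 + pc3",
                        data=df).fit(disp=False)
        rows.append({"sle": sle, "pgs": pgs,
                     "OR": np.exp(fit.params[pgs]),
                     "p": fit.pvalues[pgs]})

res = pd.DataFrame(rows)
res["p_fdr"] = multipletests(res["p"], method="fdr_bh")[1]  # BH-corrected p-values
print(res.sort_values("p_fdr"))
```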
Noticing Future Me: Reducing Egocentrism Through Mental Imagery.
People drastically overestimate how often others attend to them or notice their unusual features, a phenomenon termed the spotlight effect. Despite the prevalence of this egocentric bias, little is known about how to reduce the tendency to see oneself as the object of others' attention. Here, we tested the hypothesis that a basic property of mental imagery, the visual perspective from which an event is viewed, may alleviate a future-oriented variant of the spotlight effect. The results of three experiments supported this prediction. Experiment 1 revealed a reduction in egocentric spotlighting when participants imagined an event in the far compared with the near future. Experiments 2 and 3 demonstrated reduced spotlighting and feelings of embarrassment when participants viewed an impending event from a third-person (vs. first-person) vantage point. Simple changes in one's visual perspective may be sufficient to diminish the illusion of personal salience.
Caloric restriction augments radiation efficacy in breast cancer.
Dietary modification such as caloric restriction (CR) has been shown to decrease tumor initiation and progression. We sought to determine if nutrient restriction could be used as a novel therapeutic intervention to enhance cytotoxic therapies such as radiation (IR) and alter the molecular profile of triple-negative breast cancer (TNBC), which displays a poor prognosis. In two murine models of TNBC, significant tumor regression is noted with IR or diet modification, and a greater regression is observed when combining diet modification with IR. Two methods of diet modification were compared, and it was found that a daily 30% reduction in total calories provided more significant tumor regression than alternate-day feeding. At the molecular level, tumors treated with CR and IR showed less proliferation and more apoptosis. cDNA array analysis demonstrated that the IGF-1R pathway plays a key role in achieving this physiologic response, and multiple members of the IGF-1R pathway, including IGF-1R, IRS, PIK3ca and mTOR, were found to be downregulated. The innovative use of CR as a novel therapeutic option has the potential to change the biology of tumors and enhance the opportunity for clinical benefit in the treatment of patients with TNBC.
Personalized Nutrition as a Key Contributor to Improving Radiation Response in Breast Cancer
Understanding metabolic and immune regulation inherent to patient populations is key to improving the radiation response for our patients. To date, radiation therapy regimens are prescribed based on tumor type and stage. Patient populations noted to have a poor response to radiation, such as those of African American descent, those who have obesity or metabolic syndrome, or senior adult oncology patients, should be considered for concurrent therapies with radiation that will improve response. Here, we explore these populations of breast cancer patients, who frequently display radiation resistance and increased mortality rates, and identify the molecular underpinnings that are, in part, responsible for the radiation response and that result in an immune-suppressive tumor microenvironment. The resulting immune phenotype is discussed to understand how antitumor immunity could be improved. Correcting nutrient deficiencies observed in these populations should be considered as a means to improve the therapeutic index of radiation therapy.
Turning I into me: Imagining your future self.
A widely endorsed belief is that perceivers imagine their present selves using a different representational format than when imagining their future selves (i.e., near future = first-person; distant future = third-person). But is this really the case? Responding to the paucity of work on this topic, here we considered how temporal distance influences the extent to which individuals direct their attention outward or inward during a brief imaginary episode. Using a non-verbal measure of visual perspective taking (i.e., a letter-drawing task), our results confirmed the hypothesized relation between temporal distance and conceptions of the self. Whereas simulations of an event in the near future were dominated by a first-person representation of the self, this switched to a third-person depiction when the event was located in the distant future. Critically, this switch in vantage point was restricted to self-related simulations. The theoretical and practical implications of these findings are considered.
Multi-ancestry genome-wide association study of major depression aids locus discovery, fine mapping, gene prioritization and causal inference
Most genome-wide association studies (GWAS) of major depression (MD) have been conducted in samples of European ancestry. Here we report a multi-ancestry GWAS of MD, adding data from 21 cohorts with 88,316 MD cases and 902,757 controls to previously reported data. This analysis used a range of measures to define MD and included samples of African (36% of effective sample size), East Asian (26%) and South Asian (6%) ancestry and Hispanic/Latin American participants (32%). The multi-ancestry GWAS identified 53 significantly associated novel loci. For loci from GWAS in European ancestry samples, fewer than expected were transferable to other ancestry groups. Fine mapping benefited from additional sample diversity. A transcriptome-wide association study identified 205 significantly associated novel genes. These findings suggest that, for MD, increasing ancestral and global diversity in genetic studies may be particularly important to ensure discovery of core genes and inform about transferability of findings.
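For readers unfamiliar with how ancestry-specific GWAS results are combined, a standard building block is inverse-variance-weighted fixed-effects meta-analysis of per-variant effect estimates. The sketch below shows only that generic step with made-up numbers; it is not the multi-ancestry method actually used in the study.

```python
import numpy as np

def ivw_meta(betas, ses):
    """Fixed-effect, inverse-variance-weighted meta-analysis of one variant's
    effect estimates across ancestry-specific GWAS (schematic only)."""
    betas, ses = np.asarray(betas, float), np.asarray(ses, float)
    w = 1.0 / ses ** 2
    beta_meta = np.sum(w * betas) / np.sum(w)
    se_meta = np.sqrt(1.0 / np.sum(w))
    return beta_meta, se_meta, beta_meta / se_meta  # effect, SE, z-score

# Hypothetical per-ancestry log-odds ratios and standard errors for one variant.
beta, se, z = ivw_meta([0.031, 0.024, 0.040, 0.018],
                       [0.008, 0.012, 0.015, 0.020])
print(f"meta beta = {beta:.3f}, SE = {se:.3f}, z = {z:.2f}")
```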
SARS-CoV-2 Viremia is Associated with COVID-19 Severity and Predicts Clinical Outcomes
Background: SARS-CoV-2 viral RNA (vRNA) is detected in the bloodstream of some patients with COVID-19 (“RNAemia”) but it is not clear whether this RNAemia reflects viremia (i.e., virus particles) and how RNAemia/viremia is related to host immune responses and outcomes.
Methods: SARS-CoV-2 vRNA was quantified by ultra-sensitive RT-PCR in plasma samples (0.5-1.0 ml) from observational cohorts of 51 COVID-19 patients including 9 outpatients, 19 hospitalized (non-ICU), and 23 ICU patients, and vRNA levels were compared with cross-sectional indices of COVID-19 severity and prospective clinical outcomes. We used multiple imaging methods to visualize virions in pelleted plasma.
Results: SARS-CoV-2 vRNA was detected in plasma of 100%, 52.6% and 11.1% of ICU, non-ICU, and outpatients, respectively. Virions were detected in plasma pellets by electron tomography and immunostaining. Plasma vRNA levels were significantly higher in ICU patients than in non-ICU patients, and higher in non-ICU patients than in outpatients; a plasma vRNA level above 6,000 copies/ml was strongly associated with mortality (HR: 10.7). Levels of vRNA were significantly associated with several inflammatory biomarkers (p<0.01) but not with plasma neutralizing antibody titers (p=0.8).
Conclusions: Visualization of virus particles in plasma indicates that SARS-CoV-2 RNAemia is due, at least in part, to viremia. The levels of SARS-CoV-2 RNAemia quantified by ultrasensitive RT-PCR correlate strongly with disease severity, patient outcome and specific inflammatory biomarkers, but not with neutralizing antibody titers.
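A minimal sketch of the survival analysis implied by the mortality hazard ratio above, assuming a table with follow-up time, a death indicator, and plasma vRNA levels. The variable names and input file are hypothetical, and the study's actual model and covariates may differ.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("covid_viremia_cohort.csv")  # hypothetical input file

# Binary flag for plasma vRNA above the 6,000 copies/ml threshold cited above.
df["high_vrna"] = (df["plasma_vrna_copies_ml"] > 6000).astype(int)

cph = CoxPHFitter()
cph.fit(df[["followup_days", "died", "high_vrna"]],
        duration_col="followup_days", event_col="died")
cph.print_summary()  # the hazard ratio for high_vrna is exp(coef)
```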
Factors influencing terrestriality in primates of the Americas and Madagascar
Among mammals, the order Primates is exceptional in having a high taxonomic richness in which the taxa are arboreal, semiterrestrial, or terrestrial. Although habitual terrestriality is pervasive among the apes and African and Asian monkeys (catarrhines), it is largely absent among monkeys of the Americas (platyrrhines), as well as galagos, lemurs, and lorises (strepsirrhines), which are mostly arboreal. Numerous ecological drivers and species-specific factors are suggested to set the conditions for an evolutionary shift from arboreality to terrestriality, and current environmental conditions may provide scenarios analogous to those transitional periods. Therefore, we investigated predominantly arboreal, diurnal primate genera from the Americas and Madagascar that lack fully terrestrial taxa, to determine whether ecological drivers (habitat canopy cover, predation risk, maximum temperature, precipitation, primate species richness, human population density, and distance to roads) or species-specific traits (body mass, group size, and degree of frugivory) associate with increased terrestriality. We collated 150,961 observation hours across 2,227 months from 47 species at 20 sites in Madagascar and 48 sites in the Americas. Multiple factors were associated with ground use in these otherwise arboreal species, including increased temperature, a decrease in canopy cover, a dietary shift away from frugivory, and larger group size. These factors mostly explain intraspecific differences in terrestriality. As humanity modifies habitats and causes climate change, our results suggest that species already inhabiting hot, sparsely canopied sites, and exhibiting more generalized diets, are more likely to shift toward greater ground use.
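To make the multi-predictor analysis concrete, here is a deliberately simplified sketch: a logistic regression of whether an observation record was on the ground on the ecological and species-specific predictors named above. The variable names and input file are assumptions, and the study's own models (for example, accounting for site, species, and phylogeny) would be more elaborate.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("terrestriality_observations.csv")  # hypothetical input file

# ground_use: 1 if the observation record was on the ground, 0 if arboreal.
fit = smf.logit(
    "ground_use ~ max_temperature + canopy_cover + frugivory"
    " + group_size + body_mass + human_density + distance_to_roads",
    data=df
).fit(disp=False)
print(fit.summary())
```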
The impact of surgical delay on resectability of colorectal cancer: An international prospective cohort study
AIM: The SARS-CoV-2 pandemic has provided a unique opportunity to explore the impact of surgical delays on cancer resectability. This study aimed to compare resectability for colorectal cancer patients undergoing delayed versus non-delayed surgery. METHODS: This was an international prospective cohort study of consecutive colorectal cancer patients with a decision for curative surgery (January-April 2020). Surgical delay was defined as an operation taking place more than 4 weeks after treatment decision, in a patient who did not receive neoadjuvant therapy. A subgroup analysis explored the effects of delay in elective patients only. The impact of longer delays was explored in a sensitivity analysis. The primary outcome was complete resection, defined as curative resection with an R0 margin. RESULTS: Overall, 5453 patients from 304 hospitals in 47 countries were included, of whom 6.6% (358/5453) did not receive their planned operation. Of the 4304 operated patients without neoadjuvant therapy, 40.5% (1744/4304) were delayed beyond 4 weeks. Delayed patients were more likely to be older, male, and more comorbid, to have a higher body mass index, and to have rectal cancer and early-stage disease. Delayed patients had higher unadjusted rates of complete resection (93.7% vs. 91.9%, P = 0.032) and lower rates of emergency surgery (4.5% vs. 22.5%, P < 0.001). After adjustment, delay was not associated with a lower rate of complete resection (OR 1.18, 95% CI 0.90-1.55, P = 0.224), which was consistent in elective patients only (OR 0.94, 95% CI 0.69-1.27, P = 0.672). Longer delays were not associated with poorer outcomes. CONCLUSION: One in 15 colorectal cancer patients did not receive their planned operation during the first wave of COVID-19. Surgical delay did not appear to compromise resectability, raising the hypothesis that any reduction in long-term survival attributable to delays is likely to be due to micro-metastatic disease.
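A sketch of the kind of adjusted comparison reported above (odds of complete R0 resection for delayed versus non-delayed patients, adjusting for baseline differences). The covariates, column names, and input file here are illustrative assumptions; the study's own adjustment set and hierarchical structure (e.g., hospital-level effects) may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("colorectal_delay_cohort.csv")  # hypothetical input file

# Logistic regression of complete (R0) resection on delay status plus covariates.
fit = smf.logit(
    "r0_resection ~ delayed + age + sex + bmi + asa_grade"
    " + rectal_cancer + early_stage",
    data=df
).fit(disp=False)

or_delay = np.exp(fit.params["delayed"])
ci_low, ci_high = np.exp(fit.conf_int().loc["delayed"])
print(f"Adjusted OR for complete resection (delayed vs. not): "
      f"{or_delay:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```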