285 research outputs found

    Boundaries of Semantic Distraction: Dominance and Lexicality Act at Retrieval

    Three experiments investigated memory for semantic information with the goal of determining boundary conditions for the manifestation of semantic auditory distraction. Irrelevant speech disrupted the free recall of semantic category-exemplars to an equal degree regardless of whether the speech coincided with the presentation or test phase of the task (Experiment 1), and did so regardless of whether it comprised random words or coherent sentences (Experiment 2). The effects of background speech were greater when the irrelevant speech was semantically related to the to-be-remembered material, but only when the irrelevant words were high in output dominance (Experiment 3). The implications of these findings for the processing of task material and the processing of background speech are discussed.

    Five views of a secret: does cognition change during middle adulthood?

    This study examined five aspects of change (or stability) in cognitive abilities in middle adulthood across a 12-year period. Data come from the Interdisciplinary Study on Adult Development. The sample consisted of N = 346 adults (43.8 years on average, 48.6% female). In total, 11 cognitive tests were administered to assess fluid and crystallized intelligence, memory, and processing speed. In a first series of analyses, strong measurement invariance was established. Subsequently, structural stability, differential stability, stability of divergence, absolute stability, and the generality of changes were examined. Factor covariances were shown to be equal across time, implying structural stability. Stability coefficients were around .90 for fluid and crystallized intelligence, and speed, indicating high, yet not perfect differential stability. The coefficient for memory was .58. Only in processing speed did the variance increase across time, indicating heterogeneity in interindividual development. Significant mean-level changes emerged, with an increase in crystallized intelligence and decline in the other three abilities. A number of correlations among changes in cognitive abilities were significant, implying that cognitive changes across abilities are interrelated.

    Interaction between CRHR1 and BDNF Genes Increases the Risk of Recurrent Major Depressive Disorder in Chinese Population

    BACKGROUND: An important etiological hypothesis about depression is that stress has neurotoxic effects that damage hippocampal cells. Corticotropin-releasing hormone (CRH) regulates brain-derived neurotrophic factor (BDNF) expression by influencing cAMP and Ca2+ signaling pathways during this process. The aim of this study was to examine the single and combined effects of the CRH receptor 1 (CRHR1) and BDNF genes in recurrent major depressive disorder (MDD). METHODOLOGY/PRINCIPAL FINDINGS: The sample consisted of 181 patients with recurrent MDD and 186 healthy controls. Whether interactions between genetic variations in the CRHR1 and BDNF genes might be associated with increased susceptibility to recurrent MDD was studied using a gene-based association analysis of single-nucleotide polymorphisms (SNPs). CRHR1 gene variants (rs1876828, rs242939 and rs242941) and the BDNF gene variant (rs6265) were identified in the samples of patients diagnosed with recurrent MDD and matched controls. An allelic association between CRHR1 rs242939 and recurrent MDD was found in our sample (allelic: p = 0.018; genotypic: p = 0.022), with an odds ratio of 0.454 (95% CI 0.266-0.775). A global test of these four haplotypes showed a significant difference between the recurrent MDD group and the control group (χ2 = 13.117, df = 3, p = 0.016). Furthermore, BDNF and CRHR1 interactions were found in the significant 2-locus, gene-gene interaction models (p = 0.05) using a generalized multifactor dimensionality reduction (GMDR) method. CONCLUSION: Our results suggest that an interaction between the CRHR1 and BDNF genes confers susceptibility to recurrent MDD.
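    The odds ratio and 95% confidence interval reported above follow the standard Wald construction on the log-odds scale. A minimal sketch of that calculation, using hypothetical 2×2 cell counts for illustration (the abstract does not report the underlying contingency table):

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio and Wald 95% CI from a 2x2 table:
        a = cases with allele,    b = controls with allele,
        c = cases without allele, d = controls without allele."""
        or_ = (a * d) / (b * c)
        # Standard error of log(OR) is sqrt of summed reciprocal cell counts.
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts, for illustration only.
    print(odds_ratio_ci(30, 120, 80, 145))
    ```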

    Effects, equity, and cost of school-based and community-wide treatment strategies for soil-transmitted helminths in Kenya: a cluster-randomised controlled trial

    Background School-based deworming programmes can reduce morbidity attributable to soil-transmitted helminths in children but do not interrupt transmission in the wider community. We assessed the effects of alternative mass treatment strategies on community soil-transmitted helminth infection. Methods In this cluster-randomised controlled trial, 120 community units (clusters) serving 150 000 households in Kenya were randomly assigned (1:1:1) to receive albendazole through annual school-based treatment targeting 2–14 year olds or annual or biannual community-wide treatment targeting all ages. The primary outcome was community hookworm prevalence, assessed at 12 and 24 months through repeat cross-sectional surveys. Secondary outcomes were Ascaris lumbricoides and Trichuris trichiura prevalence, infection intensity of each soil-transmitted helminth species, and treatment coverage and costs. Analysis was by intention to treat. This trial is registered with ClinicalTrials.gov, number NCT02397772. Findings After 24 months, prevalence of hookworm changed from 18·6% (95% CI 13·9–23·2) to 13·8% (10·5–17·0) in the annual school-based treatment group, 17·9% (13·7–22·1) to 8·0% (6·0–10·1) in the annual community-wide treatment group, and 20·6% (15·8–25·5) to 6·2% (4·9–7·5) in the biannual community-wide treatment group. Relative to annual school-based treatment, the risk ratio for annual community-wide treatment was 0·59 (95% CI 0·42–0·83; p<0·001) and for biannual community-wide treatment was 0·46 (0·33–0·63; p<0·001). More modest reductions in risk were observed after 12 months. Risk ratios were similar across demographic and socioeconomic subgroups after 24 months. No adverse events related to albendazole were reported. 
Interpretation Community-wide treatment was more effective in reducing hookworm prevalence and intensity than school-based treatment, with little additional benefit of treating every 6 months, and was shown to be remarkably equitable in coverage and effects. Funding Bill & Melinda Gates Foundation, the Joint Global Health Trials Scheme of the Medical Research Council, the UK Department for International Development, the Wellcome Trust, and the Children’s Investment Fund Foundation.
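    The risk ratios reported above compare prevalence between treatment groups. A minimal sketch of a crude risk ratio with a Wald 95% CI on the log scale, using hypothetical group sizes (note that a cluster-randomised design such as this trial's requires cluster-adjusted standard errors, which this simple sketch ignores):

    ```python
    import math

    def risk_ratio_ci(cases_t, n_t, cases_c, n_c, z=1.96):
        """Crude risk ratio (treatment vs comparator) with a
        Wald 95% CI computed on the log-risk-ratio scale."""
        rr = (cases_t / n_t) / (cases_c / n_c)
        # SE of log(RR) for independent binomial samples.
        se = math.sqrt(1/cases_t - 1/n_t + 1/cases_c - 1/n_c)
        lo = math.exp(math.log(rr) - z * se)
        hi = math.exp(math.log(rr) + z * se)
        return rr, lo, hi

    # Hypothetical denominators, for illustration only.
    print(risk_ratio_ci(80, 1000, 138, 1000))
    ```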

    Effects of spines and thorns on Australian arid zone herbivores of different body masses

    We investigated the effects of thorns and spines on the feeding of 5 herbivore species in arid Australia. The herbivores were the rabbit (Oryctolagus cuniculus), euro kangaroo (Macropus robustus), red kangaroo (Macropus rufus), sheep (Ovis aries), and cattle (Bos taurus). Five woody plants without spines or thorns and 6 woody plants with thorns were included in the study. Spines and thorns were not found to affect the herbivores' rates of feeding (items ingested/min), but they did reduce the biomass ingested per item (g dry mass/item). The reduction in biomass ingested occurred in two ways: at a given diameter, twigs with spines and thorns had less mass than those of undefended plants, and the herbivores consumed twigs of smaller diameter on plants with spines and thorns. The relative importance of these two mechanisms varied with herbivore body mass: reduced twig mass was more important for small herbivores, while large herbivores selected smaller diameters. The effectiveness of spines and thorns as anti-herbivore defenses did not vary with the evolutionary history of the herbivores (i.e. native vs. introduced). Spines and thorns mainly affected the herbivores' selection of maximum twig sizes (reducing diameter and mass), but the minimum twig sizes selected were also reduced.

    An Incomplete TCA Cycle Increases Survival of Salmonella Typhimurium during Infection of Resting and Activated Murine Macrophages

    In comparison to the comprehensive analyses performed on virulence gene expression, regulation and action, the intracellular metabolism of Salmonella during infection is a relatively under-studied area. We investigated the role of the tricarboxylic acid (TCA) cycle in the intracellular replication of Salmonella Typhimurium in resting and activated macrophages, epithelial cells, and during infection of mice. We constructed deletion mutations of 5 TCA cycle genes in S. Typhimurium, including gltA, mdh, sdhCDAB, sucAB, and sucCD. We found that the mutants exhibited increased net intracellular replication in resting and activated murine macrophages compared to the wild-type. In contrast, an epithelial cell infection model showed that the S. Typhimurium ΔsucCD and ΔgltA strains had reduced net intracellular replication compared to the wild-type. The glyoxylate shunt was not responsible for the increased net replication of the TCA cycle mutants within resting macrophages. We also confirmed that, in a murine infection model, the S. Typhimurium ΔsucAB and ΔsucCD strains are attenuated for virulence. Our results suggest that disruption of the TCA cycle increases the ability of S. Typhimurium to survive within resting and activated murine macrophages. In contrast, epithelial cells are non-phagocytic and, unlike macrophages, cannot mount an oxidative and nitrosative defence response against pathogens; our results show that in HeLa cells the S. Typhimurium TCA cycle mutant strains show reduced or unchanged intracellular levels compared to the wild-type. The attenuation of the S. Typhimurium ΔsucAB and ΔsucCD mutants in mice, compared to their increased net intracellular replication in resting and activated macrophages, suggests that Salmonella may encounter environments within the host where a complete TCA cycle is advantageous.

    Predicting change in quality of life from age 79 to 90 in the Lothian Birth Cohort 1921

    Purpose: Quality of life (QoL) decreases in very old age, and is strongly related to health outcomes and mortality. Understanding the predictors of QoL and change in QoL amongst the oldest old may suggest potential targets for intervention. This study investigated change in QoL from age 79 to 90 years in a group of older adults in Scotland, and identified potential predictors of that change. Method: Participants were members of the Lothian Birth Cohort 1921 who attended clinic visits at age 79 (n = 554) and 90 (n = 129). Measures at both time points included QoL (WHOQOL-BREF: four domains and two single items), anxiety and depression, objective health, functional ability, self-rated health, loneliness, and personality. Results: Mean QoL declined from age 79 to 90. Participants returning at 90 had scored significantly higher at 79 on most QoL measures, and exhibited better objective health and functional ability, and lower anxiety and depression, than non-returners. Hierarchical multiple regression models accounted for 20.3–56.3% of the variance in QoL at age 90. Baseline QoL was the strongest predictor of domain scores (20.3–35.6% variance explained), suggesting that individual differences in QoL judgements remain largely stable. Additional predictors varied by QoL domain and included self-rated health, loneliness, and functional and mood decline between age 79 and 90 years. Conclusions: This study has identified potential targets for interventions to improve QoL in the oldest old. Further research should address causal pathways between QoL and functional and mood decline, perceived health, and loneliness.