
    The relationship between cognitive ability and chess skill: a comprehensive meta-analysis

    Why are some people more skilled in complex domains than other people? Here, we conducted a meta-analysis to evaluate the relationship between cognitive ability and skill in chess. Chess skill correlated positively and significantly with fluid reasoning (Gf) (r̄ = 0.24), comprehension-knowledge (Gc) (r̄ = 0.22), short-term memory (Gsm) (r̄ = 0.25), and processing speed (Gs) (r̄ = 0.24); the meta-analytic average of the correlations was r̄ = 0.24. Moreover, the correlation between Gf and chess skill was moderated by age (r̄ = 0.32 for youth samples vs. r̄ = 0.11 for adult samples) and by skill level (r̄ = 0.32 for unranked samples vs. r̄ = 0.14 for ranked samples). Interestingly, chess skill correlated more strongly with numerical ability (r̄ = 0.35) than with verbal ability (r̄ = 0.19) or visuospatial ability (r̄ = 0.13). The results suggest that cognitive ability contributes meaningfully to individual differences in chess skill, particularly in young chess players and/or at lower levels of skill.
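
    The r̄ values above are meta-analytic average correlations pooled across studies. As a minimal sketch of what such pooling can look like (the abstract does not state the exact meta-analytic model used, so the Fisher-z, sample-size-weighted approach and the study values below are only illustrative assumptions):

```python
# Minimal sketch: pooling study correlations into a mean correlation (r-bar)
# via Fisher's r-to-z transform with sample-size weights. The paper's actual
# meta-analytic model may differ; all values here are hypothetical.
import numpy as np

def pooled_correlation(rs, ns):
    """Sample-size-weighted mean correlation across studies."""
    rs, ns = np.asarray(rs, float), np.asarray(ns, float)
    zs = np.arctanh(rs)                   # Fisher r-to-z transform
    ws = ns - 3                           # inverse sampling variance of z
    z_bar = np.sum(ws * zs) / np.sum(ws)  # weighted mean in the z metric
    return np.tanh(z_bar)                 # back-transform to the r metric

# Hypothetical studies relating fluid reasoning (Gf) to chess skill
print(round(pooled_correlation([0.30, 0.18, 0.26], [45, 120, 80]), 2))
```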

    Checking the academic selection argument. Chess players outperform non-chess players in cognitive skills related to intelligence: a meta-analysis

    Substantial research in the psychology of expertise has shown that experts in several fields (e.g., science, mathematics) perform better than non-experts on standardized tests of intelligence. This evidence suggests that intelligence plays an important role in the acquisition of expertise. However, a counterargument is that the difference between experts and non-experts is not due to individuals' traits but to academic selection processes. For instance, in science, high scores on standardized tests (e.g., the SAT and then the GRE) are needed to be admitted to a university program for training. Thus, the “academic selection process” hypothesis is that expert vs. non-expert differences in cognitive ability reflect ability-related differences in access to training opportunities. To test this hypothesis, we focused on a domain in which there are no selection processes based on test scores: chess. This meta-analysis revealed that chess players outperformed non-chess players in intelligence-related skills (d̄ = 0.49). Therefore, this outcome does not corroborate the academic selection process argument and, consequently, supports the idea that access to training alone cannot explain expert performance.
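
    The d̄ reported above is a pooled standardized mean difference (Cohen's d) between chess players and non-chess players. A minimal sketch of how a single study's d can be computed follows; the pooling step is not shown, and the group scores are hypothetical, not values from the meta-analysis:

```python
# Minimal sketch: Cohen's d for one chess-players vs. non-players comparison,
# using the pooled standard deviation. Scores below are hypothetical.
import numpy as np

def cohens_d(group1, group2):
    """Standardized mean difference between two independent groups."""
    g1, g2 = np.asarray(group1, float), np.asarray(group2, float)
    n1, n2 = len(g1), len(g2)
    pooled_var = ((n1 - 1) * g1.var(ddof=1) + (n2 - 1) * g2.var(ddof=1)) / (n1 + n2 - 2)
    return (g1.mean() - g2.mean()) / np.sqrt(pooled_var)

# Hypothetical test scores for chess players and non-players
players = [104, 112, 99, 118, 107, 110]
non_players = [98, 103, 95, 108, 101, 97]
print(round(cohens_d(players, non_players), 2))
```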

    Mindset Premise Study


    Deliberate Practice and Sports Performance: Meta-analysis

    Open Data: Macnamara, Moreau, & Hambrick (in press). Perspectives on Psychological Science

    Replication and Extension: Seminal DP Study


    Cognitive abilities of bilinguals

    Open data collected from first-semester and last-semester Conference Interpretation, Translation & Interpretation, and Translation M.A. students at the Monterey Institute of International Studies (now the Middlebury Institute of International Studies at Monterey) and from bilingual and second-language-learning students at Princeton University. This project was funded by the National Science Foundation (BCS-1250638).

    The genetic influence of spatial reasoning: A meta-analysis


    Individual Responses versus Aggregate Group-Level Results: Examining the Strength of Evidence for Growth Mindset Interventions on Academic Performance

    Mindset theory assumes that students’ beliefs about their intelligence—whether these are fixed or can grow—affect students’ academic performance. Based on this assumption, mindset theorists have developed growth mindset interventions to teach students that their intelligence or another attribute can be developed, with the goal of improving academic outcomes. Though many papers have reported benefits from growth mindset interventions, others have reported no effects or even detrimental effects. Recently, proponents of mindset theory have called for a “heterogeneity revolution” to understand when growth mindset interventions are effective and when—and for whom—they are not. We sought to examine the whole picture of heterogeneity of treatment effects, including benefits, lack of impacts, and potential detriments of growth mindset interventions on academic performance. We used a recently proposed approach that considers persons as effect sizes; this approach can reveal individual-level heterogeneity often lost in aggregate data analyses. Across three papers, we find that this approach reveals substantial individual-level heterogeneity unobservable at the group level, with many students and teachers exhibiting mindset and performance outcomes that run counter to the authors’ claims. Understanding and reporting heterogeneity, including benefits, null effects, and detriments, will lead to better guidance for educators and policymakers considering the role of growth mindset interventions in schools.

    Reconsidering the Use of the Mindset Assessment Profile in Educational Contexts

    The Mindset Assessment Profile is a popular questionnaire purportedly designed to measure mindset—an individual’s belief in whether intelligence is malleable or stable. Despite its widespread use, the questionnaire appears to assess an individual’s need for cognition and goal orientation more than mindset. We assessed the reliability, construct validity, and factor structure of the Mindset Assessment Profile in a sample of 992 undergraduates. The reliability of the Mindset Assessment Profile was questionable (α = .63) and significantly lower than the reliability of the Implicit Theories of Intelligence Questionnaire (α = .94), an established measure of mindset. The Mindset Assessment Profile also lacked convergent and discriminant validity. Overall scores on the Mindset Assessment Profile correlated significantly more strongly with need for cognition than with mindset. Item-level analyses supported this finding: most items correlated weakly or not at all with mindset and correlated significantly more strongly with need for cognition and learning goal orientation. Exploratory factor analysis indicated that three factors underlie scores on the Mindset Assessment Profile: need for cognition, mindset, and performance goal orientation. Based on its questionable reliability and poor construct validity, we do not recommend that researchers and educators use the Mindset Assessment Profile to measure mindset.
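
    The α values reported above are Cronbach's alpha, an internal-consistency estimate of scale reliability. The sketch below shows the standard computation; the responses are randomly generated placeholders, not data from this study:

```python
# Minimal sketch: Cronbach's alpha for a k-item scale, the reliability
# statistic compared in the abstract (.63 vs. .94). Responses are random
# placeholders, not the study's data.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) matrix of item scores."""
    items = np.asarray(items, float)
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

rng = np.random.default_rng(0)
responses = rng.integers(1, 7, size=(100, 8))       # 100 respondents, 8 Likert items
print(round(cronbach_alpha(responses), 2))
```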