27 research outputs found

    Online Evidence Charts to Help Students Systematically Evaluate Theories and Evidence

    Get PDF
    To achieve intellectual autonomy, university students should learn how to critically evaluate hypotheses and theories using evidence from the research literature. Typically this occurs in the context of writing an essay, or in planning the introduction and conclusion sections of a laboratory project. To be successful, a student must distill relevant evidence from the research literature, evaluate evidence quality, and evaluate hypotheses or theories in light of the evidence. To help students achieve these goals, we have created a web-based “evidence-charting” tool (available at www.evidencechart.org). The main feature of the website is an interactive chart, providing students a structure to list the evidence (from research articles or experiments), list the theories, and enter their evaluation of how the evidence supports or undermines each theory/hypothesis. The chart also elicits from students their reasoning about why the evidence supports or undermines each hypothesis, and invites them to consider how someone with an opposing view might respond. The online chart provides a summary view of the evidence the student has indicated to be most important, and discussion tools to elaborate on this information. Upon completing a chart, the student is well positioned to write their essay or report, and the instructor has an at-a-glance view to provide formative feedback indicating whether the student has successfully reviewed the literature and understands the evidence and theories. These benefits are being evaluated in the context of introductory and advanced psychology classes.
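    A minimal sketch of the kind of data structure that could underlie such a chart (the class and field names below are hypothetical, not taken from the actual evidencechart.org implementation):

        # Hypothetical sketch of an evidence chart's core data structure (not the
        # actual evidencechart.org implementation). Requires Python 3.9+.
        from dataclasses import dataclass, field

        @dataclass
        class Cell:
            rating: str         # "supports", "undermines", or "neutral"
            reasoning: str      # why the evidence bears on the theory this way
            rebuttal: str = ""  # how someone with an opposing view might respond

        @dataclass
        class EvidenceChart:
            evidence: list[str] = field(default_factory=list)  # rows: findings
            theories: list[str] = field(default_factory=list)  # columns: theories
            cells: dict[tuple[int, int], Cell] = field(default_factory=dict)

            def rate(self, ev: int, th: int, cell: Cell) -> None:
                self.cells[(ev, th)] = cell

            def summary(self):
                # At-a-glance view for the instructor: every filled-in judgment.
                return [(self.evidence[i], self.theories[j], c.rating)
                        for (i, j), c in self.cells.items()]

    Keying cells by (evidence, theory) index pairs keeps the chart sparse, so students can fill in judgments incrementally rather than rating every combination at once.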

    Elastic Analysis Procedures: An Incurable (but Preventable) Problem in the Fertility Effect Literature: Comment on Gildersleeve, Haselton, & Fales (2014)

    Get PDF
    Gildersleeve, Haselton, and Fales (2014) presented a meta-analysis of the effects of fertility on mate preferences in women. Research in this area has categorized fertility using a great variety of methods, chiefly based on self-reported cycle length and time since last menses. We argue that this literature is particularly prone to hidden experimenter degrees of freedom. Studies vary greatly in the duration and timing of windows used to define fertile versus nonfertile phases, criteria for excluding subjects, and the choice of what moderator variables to include, as well as other variables. These issues raise the concern that many or perhaps all results may have been created by exploitation of unacknowledged degrees of freedom ("p-hacking"). Gildersleeve et al. sought to dismiss such concerns, but we contend that their arguments rest upon statistical and logical errors. The possibility that positive results in this literature may have been created, or at least greatly amplified, by p-hacking receives additional support from the fact that recent attempts at exact replication of fertility results have mostly failed. Our concerns are also supported by findings of another recent review of the literature (Wood, Kressel, Joshi, & Louie, 2014). We conclude on a positive note, arguing that if fertility-effect researchers take advantage of the rapidly emerging opportunities for study preregistration, the validity of this literature can be rapidly clarified.
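    The statistical concern can be made concrete with a small Monte Carlo sketch (illustrative only, not a reconstruction of any cited study's analysis): on pure-noise data, trying several plausible fertile-window definitions and reporting whichever reaches p < .05 inflates the nominal 5% false-positive rate.

        # Monte Carlo sketch: flexible "fertile window" definitions inflate
        # false positives. The data are pure noise, so every significant
        # result is a false positive by construction.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_sims, n_women = 10_000, 60
        # Several plausible ways to split cycle days 1-28 into fertile/nonfertile
        windows = [(10, 15), (9, 14), (11, 17), (12, 16), (8, 15)]

        false_pos = 0
        for _ in range(n_sims):
            day = rng.integers(1, 29, n_women)   # self-reported cycle day
            pref = rng.normal(size=n_women)      # preference score: pure noise
            sig = False
            for lo, hi in windows:               # shop across window definitions
                fertile = (day >= lo) & (day <= hi)
                if fertile.sum() >= 5 and (~fertile).sum() >= 5:
                    _, p = stats.ttest_ind(pref[fertile], pref[~fertile])
                    sig |= p < 0.05
            false_pos += sig

        print(f"False-positive rate with window shopping: {false_pos / n_sims:.3f}")
        # Typically well above the nominal 0.05: the "elastic analysis" problem.

    Preregistering a single window definition before data collection removes this flexibility, which is the remedy the authors advocate.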

    Manipulations of Choice Familiarity in Multiple-Choice Testing Support a Retrieval Practice Account of the Testing Effect

    Get PDF
    We performed 4 experiments assessing the learning that occurs when taking a test. Our experiments used multiple-choice tests because the processes deployed during testing can be manipulated by varying the nature of the choice alternatives. Previous research revealed that a multiple-choice test that includes "none of the above" (NOTA) produces better performance on a subsequent test only when the correct answer is something other than NOTA.

    A consensus-based transparency checklist

    Get PDF
    We present a consensus-based checklist to improve and document the transparency of research reports in social and behavioural research. An accompanying online application allows users to complete the form and generate a report that they can submit with their manuscript or post to a public repository.

    Methods Video for Spring 2016 UCSD Replication Attempt on Vohs, Mead & Goode (2006, Experiment 1, Science, 314(5802), 1154-1156)

    No full text
    This video shows the procedures used in a 2016 replication attempt carried out at UCSD by Hal Pashler and colleagues.

    Raw Data for Spring 2016 UCSD Replication Attempt on Vohs, Mead & Goode (2006, Experiment 1, Science, 314(5802), 1154-1156)

    No full text
    Dataset provided in both CSV and XLS formats. Contains 24 fields × 180 records (subjects). Variables are:

    1. Subject #

    2. Condition (A = neutral prime; B = Monopoly prime; C = money prime)

    3. Total time in seconds (total time before the subject sought help, if they did; 600 if they never sought help). For subjects with outcome I or S, the value provided is the time it took them to contact the experimenter.

    4. Outcome (H = asked for a hint; I = provided an incorrect solution; S = solved the puzzle correctly on their own; T = never asked for a hint and remained in the room for the entire 10 minutes)

    5.–24. PANAS responses
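    A brief sketch of how one might load and summarize this dataset with pandas (the file name and column headers below are assumptions; adjust them to the actual CSV):

        # Sketch: load the replication dataset and summarize it by priming
        # condition. File name and column headers are hypothetical.
        import pandas as pd

        df = pd.read_csv("vohs_replication_2016.csv")  # expected: 180 records x 24 fields

        condition = {"A": "neutral prime", "B": "Monopoly prime", "C": "money prime"}
        outcome = {"H": "asked for hint", "I": "incorrect solution",
                   "S": "solved alone", "T": "never asked (full 10 min)"}
        df["condition"] = df["Condition"].map(condition)
        df["outcome"] = df["Outcome"].map(outcome)

        # Field 3: seconds before seeking help (600 = never sought help).
        # Per the codebook, for outcomes I and S this is time-to-contact instead.
        print(df.groupby("condition")["TotalTimeSec"].mean())

        # Fields 5-24 are the 20 PANAS items (0-indexed columns 4..23).
        panas = df.iloc[:, 4:24]
        print(panas.mean())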

    Raw Data for Tran and Pashler "Learning to Exploit a Hidden Predictor in Skill Acquisition: Tight Linkage to Conscious Awareness"

    No full text
    The spreadsheet contains the data for Experiments 1 and 2 in "Learning to Exploit a Hidden Predictor in Skill Acquisition: Tight Linkage to Conscious Awareness."

    Data

    No full text