14 research outputs found

    Aligning the Goals of Learning Analytics with its Research Scholarship: An Open Peer Commentary Approach

    To promote cross-community dialogue on matters of significance within the field of learning analytics (LA), we as editors-in-chief of the Journal of Learning Analytics (JLA) have introduced a section for papers that are open to peer commentary. An invitation to submit proposals for commentaries on the paper was released, and 12 of these proposals were accepted. The 26 authors of the accepted commentaries are based in Europe, North America, and Australia. They range in experience from PhD students and early-career researchers to some of the longest-standing, most senior members of the learning analytics community. This paper brings those commentaries together, and we recommend reading it as a companion piece to the original paper by Motz et al. (2023), which also appears in this issue.

    Creative destruction in science

    Drawing on the concept of a gale of creative destruction in a capitalist economy, we argue that initiatives to assess the robustness of findings in the organizational literature should aim to simultaneously test competing ideas operating in the same theoretical space. In other words, replication efforts should seek not just to support or question the original findings, but also to replace them with revised, stronger theories with greater explanatory power. Achieving this will typically require adding new measures, conditions, and subject populations to research designs in order to carry out conceptual tests of multiple theories in addition to directly replicating the original findings. To illustrate the value of the creative destruction approach for theory pruning in organizational scholarship, we describe recent replication initiatives re-examining culture and work morality, working parents' reasoning about day care options, and gender discrimination in hiring decisions.

    Significance statement: It is becoming increasingly clear that many, if not most, published research findings across scientific fields are not readily replicable when the same method is repeated. Although extremely valuable, failed replications risk leaving a theoretical void: they reduce confidence that the original theoretical prediction is true without replacing it with positive evidence in favor of an alternative theory. We introduce the creative destruction approach to replication, which combines theory pruning methods from the field of management with emerging best practices from the open science movement, with the aim of making replications as generative as possible. In effect, we advocate for a Replication 2.0 movement in which the goal shifts from checking the reliability of past findings to actively engaging in competitive theory testing and theory building.

    Scientific transparency statement: The materials, code, and data for this article are posted publicly on the Open Science Framework, with links provided in the article.

    Texas Teacher Roles Inventory (TEX-TRI)


    Psychology faculty overestimate the magnitude of Cohen’s d effect sizes by half a standard deviation

    In this experiment, we recruited 261 psychology faculty to determine the extent to which they were able to visually estimate the overlap of two distributions given a Cohen's d effect size and, vice versa, to estimate d given two distributions of varying overlap. In a pre-test, participants in both conditions overestimated effect sizes by half a standard deviation on average. No significant differences in estimation accuracy by psychology sub-field were found, but having taught statistics coursework was a significant predictor of better performance. After a short training session, participants improved substantially on both tasks on the post-test, with a ~63% reduction in absolute error and negligible overall bias (~98% bias reduction). Furthermore, post-test performance indicates that learning transferred across answering modes. Teachers of statistics might find it beneficial to include a short exercise (less than 10 minutes) requiring the visual estimation of effect sizes in statistics coursework to better train future psychology researchers.
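    The two tasks above hinge on the mapping between Cohen's d and the visual overlap of two distributions. For two equal-variance normal distributions whose means differ by d standard deviations, that mapping has a closed form: the overlapping coefficient is 2 * Phi(-|d| / 2), where Phi is the standard normal CDF. The short Python sketch below illustrates the conversion in both directions; it is our illustration of this standard formula, not code from the study, and the function names are hypothetical.

        # Sketch of the d <-> overlap conversion (standard normal-theory result,
        # not study materials): for N(0, 1) vs. N(d, 1), OVL = 2 * Phi(-|d| / 2).
        from scipy.stats import norm

        def overlap_from_d(d: float) -> float:
            """Proportion of overlap between N(0, 1) and N(d, 1)."""
            return 2 * norm.cdf(-abs(d) / 2)

        def d_from_overlap(ovl: float) -> float:
            """Inverse task: the Cohen's d implied by an observed overlap."""
            return -2 * norm.ppf(ovl / 2)

        for d in (0.2, 0.5, 0.8):  # Cohen's small / medium / large benchmarks
            print(f"d = {d}: overlap = {overlap_from_d(d):.0%}")

    Running the loop shows that even a "medium" effect of d = 0.5 corresponds to roughly 80% overlap between the two distributions, which helps explain why untrained visual estimates can err by half a standard deviation.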

    Schuetze & Yan - Cohen's d Perceptual Estimation


    Muenks et al. - Cost Separate from Value?
