362 research outputs found

    Student behavior and test security in online conceptual assessment

    Historically, the implementation of research-based assessments (RBAs) has been a driver of education change within physics and helped motivate adoption of interactive engagement pedagogies. Until recently, RBAs were given to students exclusively on paper and in-class; however, this approach has important drawbacks including decentralized data collection and the need to sacrifice class time. Recently, some RBAs have been moved to online platforms to address these limitations. Yet, online RBAs present new concerns such as student participation rates, test security, and students' use of outside resources. Here, we report on a pilot study addressing these concerns. We gave two upper-division RBAs to courses at five institutions; the RBAs were hosted online and featured embedded JavaScript code which collected information on students' behaviors (e.g., copying text, printing). With these data, we examine the prevalence of these behaviors, and their correlation with students' scores, to determine if online and paper-based RBAs are comparable. We find that browser loss of focus is the most common online behavior, while copying and printing events were rarer. We found no statistically significant correlation between any of these online behaviors and students' scores. We also found that participation rates for our upper-division population went up when the RBA was given online. These results indicate that, for our upper-division population, scores on online administrations of these RBAs were comparable to in-class versions. Comment: 6 pages, 0 figures, submitted to the 2019 Physics Education Research Conference
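
    A behavior-tracking setup like the one described in this abstract can be sketched in a few lines of browser-side code. The sketch below is only an illustration under assumptions: the copy, print, and focus-loss events are standard browser events, but the telemetry endpoint (/rba/telemetry) and payload shape are hypothetical, not details of the study's actual instrumentation.

```typescript
// Illustrative sketch only: the endpoint and payload shape are assumed,
// not taken from the study described above.
type BehaviorEvent = { kind: 'copy' | 'print' | 'blur'; at: number };

const events: BehaviorEvent[] = [];

function logEvent(kind: BehaviorEvent['kind']): void {
  events.push({ kind, at: Date.now() });
}

// Student copies question text.
document.addEventListener('copy', () => logEvent('copy'));

// Student prints the page.
window.addEventListener('beforeprint', () => logEvent('print'));

// Browser loses focus, e.g., the student switches to another tab or window.
window.addEventListener('blur', () => logEvent('blur'));

// When the page is hidden (tab closed or navigated away), send the event log
// so it can be stored alongside the assessment responses.
window.addEventListener('pagehide', () => {
  navigator.sendBeacon('/rba/telemetry', JSON.stringify(events));
});
```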

    The paradigm of patient must evolve: Why a false sense of limited capacity can subvert all attempts at patient involvement

    This essay reviews the role of paradigms in molding the thoughts of a scientific field and looks rigorously at what two key terms mean – empowered and engaged – and how their interaction points to a new way forward, requiring a re-examination of our “paradigm of patient.” Five years ago, the Institute of Medicine’s Best Care at Lower Cost declared that patient-clinician partnerships are a cornerstone of a learning health system, a declaration that’s foundational to the era of involvement. How can we engineer that era correctly if our conception of “patient” is out of date? And how can we validate whether our model works? In the past eight years, the author has spoken at or participated in over 500 events in sixteen countries, and although declaring himself “just a patient,” he has observed persistent cultural patterns that make one thing clear: there is a need to change our understanding of the role of the patient in achieving the best possible care.

    Alternative model for the administration and analysis of research-based assessments

    Research-based assessments represent a valuable tool for both instructors and researchers interested in improving undergraduate physics education. However, the historical model for disseminating and propagating conceptual and attitudinal assessments developed by the physics education research (PER) community has not resulted in widespread adoption of these assessments within the broader community of physics instructors. Within this historical model, assessment developers create high-quality, validated assessments, make them available for a wide range of instructors to use, and provide minimal (if any) support to assist with administration or analysis of the results. Here, we present and discuss an alternative model for assessment dissemination, which is characterized by centralized data collection and analysis. This model provides a greater degree of support for both researchers and instructors in order to more explicitly support adoption of research-based assessments. Specifically, we describe our experiences developing a centralized, automated system for an attitudinal assessment we previously created to examine students' epistemologies and expectations about experimental physics. This system provides a proof-of-concept that we use to discuss the advantages associated with centralized administration and data collection for research-based assessments in PER. We also discuss the challenges that we encountered while developing, maintaining, and automating this system. Ultimately, we argue that centralized administration and data collection for standardized assessments is a viable and potentially advantageous alternative to the default model characterized by decentralized administration and analysis. Moreover, with the help of online administration and automation, this model can support the long-term sustainability of centralized assessment systems. Comment: 7 pages, 1 figure, accepted in Phys. Rev. PER
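
    As a rough illustration of the centralized data-collection idea in this abstract, the sketch below shows a minimal server that accepts assessment submissions at a single endpoint and appends them to one central store. The route, port, and JSON-lines storage are assumptions for illustration, not details of the system the authors built.

```typescript
// Minimal sketch of centralized collection (assumed route and storage):
// every course submits to one place, so scoring and reporting can be
// automated from a single dataset rather than per-instructor spreadsheets.
import { createServer } from 'node:http';
import { appendFile } from 'node:fs/promises';

const server = createServer(async (req, res) => {
  if (req.method === 'POST' && req.url === '/assessment/responses') {
    let body = '';
    for await (const chunk of req) body += chunk;      // collect the JSON payload
    await appendFile('responses.jsonl', body + '\n');  // append to the central store
    res.writeHead(204).end();
    return;
  }
  res.writeHead(404).end();
});

server.listen(8080);
```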

    GDL today: Reaching a viable alternative to IDL

    At the ADASS XXVII session, we report on the progress made by GDL, the free clone of the proprietary IDL software. We argue that GDL can replace IDL for everyday use. Comment: 4 pages. Contributed paper at the ADASS XXVII conference, held in Santiago de Chile, Chile, October 2017. Proceedings to be published in ASP Conf. Ser. 522, 641, Ballester, P. et al., Eds., 201

    Spartan Daily, October 7, 2014

    Volume 143, Issue 17