4 research outputs found

    Computer Versus Paper-Does It Make Any Difference in Test Performance?

    No full text
    Construct: In this study, we examine the differences in test performance between the paper-based and the computer-based version of the formative Berlin Progress Test. To our knowledge, it is the first such study that controls for students' prior performance. Background: Computer-based tests allow a more efficient examination procedure for test administration and review. Although university staff benefit considerably from computer-based tests, the question arises whether computer-based tests influence students' test performance. Approach: A total of 266 German students from the 9th and 10th semester of medicine (comparable with the 4th year of North American medical school) participated in the study (paper = 132, computer = 134). The test format was allocated in a randomized matched-pair design in which students were first sorted according to their prior test results. The organizational procedure, the examination conditions, the room and seating arrangements, as well as the order of questions and answers, were identical in both groups. Results: The sociodemographic variables and pretest scores of both groups were comparable. The test results of the paper and computer versions did not differ. Both groups remained within the allotted time, but students using the computer version (particularly the high performers) needed significantly less time to complete the test. In addition, we found significant differences in guessing behavior: low performers using the computer version guessed significantly more often than low-performing students using the paper-and-pencil version. Conclusions: Participants in computer-based tests are not at a disadvantage in terms of their test results. The computer-based test required less processing time. The longer processing time for the paper-and-pencil version might be due to the time needed to write down each answer and to check that it was transferred correctly. It remains unclear why students using the computer version (particularly low-performing students) guess at a higher rate; further studies are necessary to understand this finding.
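    The matched-pair allocation described in the Approach section can be illustrated with a minimal sketch. The data fields and the exact pairing rule shown here are assumptions for illustration, not details taken from the study.

```python
import random

def matched_pair_allocation(students, seed=42):
    """Sort students by prior test score, pair neighbours, and randomly assign
    one member of each pair to the paper version and the other to the computer
    version. Illustrative sketch only; field names are assumed."""
    rng = random.Random(seed)
    ranked = sorted(students, key=lambda s: s["prior_score"], reverse=True)
    allocation = {}
    for i in range(0, len(ranked) - 1, 2):
        first, second = ranked[i], ranked[i + 1]
        formats = ["paper", "computer"]
        rng.shuffle(formats)  # random assignment within the matched pair
        allocation[first["id"]] = formats[0]
        allocation[second["id"]] = formats[1]
    return allocation

# Hypothetical example
students = [{"id": i, "prior_score": s} for i, s in [(1, 78), (2, 91), (3, 64), (4, 85)]]
print(matched_pair_allocation(students))
```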

    Comparison of the evaluation of formative assessment at two medical faculties with different conditions of undergraduate training, assessment and feedback

    No full text
    Introduction: Both formative and summative assessments have their place in medical curricula: formative assessment to accompany the learning process and summative assessment to ensure that minimum standards are achieved. Depending on the conditions of undergraduate training, assessment and feedback, students place more or less importance on formative assessment, and thus the fulfilment of its function may be questionable. This study describes how the low-stakes formative Berlin Progress Test (BPT) is embedded at two medical faculties with partially different framework conditions, and what effects these have on students' test-taking effort and on their evaluation of the test, especially the perceived benefits and (intangible) costs, such as non-participation in contemporaneous activities and emotional impairments. Methods: The proportion of non-serious BPT participants at two medical faculties (total sample: N_F1 = 1,410, N_F2 = 1,176) in the winter term 2015/16 was determined both from the number of unanswered questions on the test itself and in a survey using a standardized instrument (N_F1 = 415, N_F2 = 234). In addition, the survey contained open questions about perceived benefits and perceived costs, which were analyzed with qualitative and quantitative methods. Results: The BPT is generally better accepted at Faculty 2. This can be seen in the higher proportion of serious test takers, the lower perceived costs and the higher reported benefit, as well as the higher proportion of constructive comments. Faculty 2 students better understood the principle of formative testing and used the BPT results as feedback on their own knowledge progress, as motivation to learn, and to reduce exam anxiety. Discussion: When medical faculties integrate formative assessments into the curriculum, they have to provide a framework in which these assessments are perceived as an important part of the curriculum. Otherwise, it is questionable whether they can fulfil their function of accompanying the learning process.

    Publication activity in medical education research: A descriptive analysis of submissions to the GMS Zeitschrift für Medizinische Ausbildung in 2007-2015

    No full text
    Objectives: The significance of medical education research has increased internationally. Against this background, we investigated whether and, if so, how the quantity and quality of scientific papers reviewed and/or published by the GMS Zeitschrift für Medizinische Ausbildung (GMS Z Med Ausbild) changed. Methods: The quantity and ratio of original papers, project reports and reviews submitted to or published in GMS Z Med Ausbild were analysed. Published scientific articles were examined with regard to the quality features of study type and mode of data collection, as well as the background (university affiliation) of the last authors. The citation frequency within the first five years after PubMed listing was compared with that of BMC Medical Education in the corresponding period. Results: The number of submitted scientific manuscripts increased steadily. Most of the submissions and publications are original papers. Among the publications, exploratory studies and prospective data collection are most common; no shift over time was observed. Of the 39 universities represented by the last authors, one accounts for 16% and four account for 36% of the published works. The citation frequency of articles published in GMS Z Med Ausbild developed similarly to that of BMC Medical Education. Conclusion: The rising number of submissions indicates an increasing significance of medical education research in German-speaking countries. The development of the number of citations reflects the growing appreciation of GMS Z Med Ausbild, which is also indicated by the increasing number of online accesses. Our finding that study type and mode of data collection did not change has to be interpreted with caution since, among other things, the choice and correct application of adequate methods are also crucial to a scientific work's quality; these aspects, however, were not investigated in this paper.

    Institutional strategies related to test-taking behavior in low stakes assessment

    No full text
    Low-stakes assessment, in which students' performance is not graded, has received increasing attention in educational systems in recent years. It is used in formative assessments to guide the learning process as well as in large-scale assessments to monitor educational programs. Yet such assessments suffer from high variation in students' test-taking effort. We aimed to identify institutional strategies related to serious test-taking behavior in low-stakes assessment in order to provide medical schools with practical recommendations on how test-taking effort might be increased. First, we identified strategies already used by medical schools to increase serious test-taking behavior on the low-stakes Berlin Progress Test (BPT). Strategies that could be assigned to the self-determination theory of Ryan and Deci were chosen for analysis. We conducted the study at nine medical schools in Germany and Austria with a total of 108,140 observations in an established low-stakes assessment. A generalized linear mixed-effects model was used to assess the association between institutional strategies and the odds that students take the BPT seriously. Overall, two institutional strategies were positively related to more serious test-taking behavior: discussing low test performance with the mentor and consequences for not participating. Giving students a choice was negatively related to serious test-taking behavior. This effect was larger at medical schools that presented the BPT as an evaluation than at medical schools that presented it as an assessment.
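    As a rough sketch of the kind of model described above (a logistic mixed-effects model with a random intercept per medical school), the following uses simulated data and the Bayesian binomial mixed GLM from statsmodels as a stand-in; all variable names and the simulated effect sizes are assumptions, not data from the study.

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Simulated stand-in data: one row per student observation (all names/values assumed)
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "school": rng.integers(0, 9, n),             # nine medical schools (random effect)
    "mentor_discussion": rng.integers(0, 2, n),  # strategy: low scores discussed with mentor
    "consequences": rng.integers(0, 2, n),       # strategy: consequences for not participating
    "choice": rng.integers(0, 2, n),             # strategy: participation left to students' choice
})
# Simulated effects follow the direction reported in the abstract (signs only)
logit = -0.5 + 0.6 * df.mentor_discussion + 0.5 * df.consequences - 0.4 * df.choice
df["serious"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Logistic mixed model: fixed effects for the strategies, random intercept per school
model = BinomialBayesMixedGLM.from_formula(
    "serious ~ mentor_discussion + consequences + choice",
    {"school": "0 + C(school)"},
    df,
)
result = model.fit_vb()  # variational Bayes fit
print(result.summary())
```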